Sample records for basic theoretical assumptions

  1. Misleading Theoretical Assumptions in Hypertext/Hypermedia Research.

    ERIC Educational Resources Information Center

    Tergan, Sigmar-Olaf

    1997-01-01

    Reviews basic theoretical assumptions of research on learning with hypertext/hypermedia. Focuses on whether the results of research on hypertext/hypermedia-based learning support these assumptions. Results of empirical studies and theoretical analysis reveal that many research approaches have been misled by inappropriate theoretical assumptions on…

  2. Nuclear Reactions in Micro/Nano-Scale Metal Particles

    NASA Astrophysics Data System (ADS)

    Kim, Y. E.

    2013-03-01

    Low-energy nuclear reactions in micro/nano-scale metal particles are described based on the theory of Bose-Einstein condensation nuclear fusion (BECNF). The BECNF theory is based on a single basic assumption capable of explaining the observed LENR phenomena: deuterons in metals undergo Bose-Einstein condensation. The BECNF theory is also a quantitative, predictive physical theory. Experimental tests of the basic assumption and theoretical predictions are proposed. Potential application to energy generation by ignition at low temperatures is described. The generalized BECNF theory is used to carry out theoretical analyses of recently reported experimental results for the hydrogen-nickel system.

  3. Sampling Assumptions in Inductive Generalization

    ERIC Educational Resources Information Center

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  4. Riddles of masculinity: gender, bisexuality, and thirdness.

    PubMed

    Fogel, Gerald I

    2006-01-01

    Clinical examples are used to illuminate several riddles of masculinity (ambiguities, enigmas, and paradoxes in relation to gender, bisexuality, and thirdness) frequently seen in male patients. Basic psychoanalytic assumptions about male psychology are examined in the light of advances in female psychology, using ideas from feminist and gender studies as well as important and now widely accepted trends in contemporary psychoanalytic theory. By reexamining basic assumptions about heterosexual men, as has been done with ideas concerning women and homosexual men, complexity and nuance come to the fore to aid the clinician in treating the complex characterological pictures seen in men today. In a context of rapid historical and theoretical change, the use of persistent gender stereotypes and unnecessarily limiting theoretical formulations, though often unintended, may mask subtle countertransference and theoretical blind spots, and limit optimal clinical effectiveness.

  5. School, Cultural Diversity, Multiculturalism, and Contact

    ERIC Educational Resources Information Center

    Pagani, Camilla; Robustelli, Francesco; Martinelli, Cristina

    2011-01-01

    The basic assumption of this paper is that school's potential to improve cross-cultural relations, as well as interpersonal relations in general, is enormous. This assumption is supported by a number of theoretical considerations and by the analysis of data we obtained from a study we conducted on the attitudes toward diversity and…

  6. Social Studies Curriculum Guidelines.

    ERIC Educational Resources Information Center

    Manson, Gary; And Others

    These guidelines, which set standards for social studies programs K-12, can be used to update existing programs or may serve as a baseline for further innovation. The first section, "A Basic Rationale for Social Studies Education," identifies the theoretical assumptions basic to the guidelines as knowledge, thinking, valuing, social participation,…

  7. On the Basis of the Basic Variety.

    ERIC Educational Resources Information Center

    Schwartz, Bonnie D.

    1997-01-01

    Considers the interplay between source and target language in relation to two points made by Klein and Perdue: (1) the argument that the analysis of the target language should not be used as the model for analyzing interlanguage data; and (2) the theoretical claim that under the technical assumptions of minimalism, the Basic Variety is a "perfect"…

  8. Theory and interpretation in qualitative studies from general practice: Why and how?

    PubMed

    Malterud, Kirsti

    2016-03-01

    In this article, I want to promote theoretical awareness and commitment among qualitative researchers in general practice and suggest adequate and feasible theoretical approaches. I discuss different theoretical aspects of qualitative research and present the basic foundations of the interpretative paradigm. Associations between paradigms, philosophies, methodologies and methods are examined and different strategies for theoretical commitment presented. Finally, I discuss the impact of theory on interpretation and the development of general practice knowledge. A scientific theory is a consistent and soundly based set of assumptions about a specific aspect of the world, predicting or explaining a phenomenon. Qualitative research is situated in an interpretative paradigm where notions about particular human experiences in context are recognized from different subject positions. Basic theoretical features from the philosophy of science explain why and how this is different from positivism. Reflexivity, including theoretical awareness and consistency, demonstrates interpretative assumptions, accounting for situated knowledge. Different types of theoretical commitment in qualitative analysis are presented, emphasizing substantive theories to sharpen the interpretative focus. Such approaches are clearly within reach for a general practice researcher contributing to clinical practice by doing more than summarizing what the participants talked about, without trying to become a philosopher. Qualitative studies from general practice deserve stronger theoretical awareness and commitment than what is currently established. Persistent attention to and respect for the distinctive domain of knowledge and practice where the research deliveries are targeted is necessary to choose adequate theoretical endeavours. © 2015 the Nordic Societies of Public Health.

  9. Is Tissue the Issue? A Critique of SOMPA's Models and Tests.

    ERIC Educational Resources Information Center

    Goodman, Joan F.

    1979-01-01

    A critical view of the underlying theoretical rationale of the System of Multicultural Pluralistic Assessment (SOMPA) model for student assessment is presented. The critique is extensive and questions the basic assumptions of the model. (JKS)

  10. Three regularities of recognition memory: the role of bias.

    PubMed

    Hilford, Andrew; Maloney, Laurence T; Glanzer, Murray; Kim, Kisok

    2015-12-01

    A basic assumption of Signal Detection Theory is that decisions are made on the basis of likelihood ratios. In a preceding paper, Glanzer, Hilford, and Maloney (Psychonomic Bulletin & Review, 16, 431-455, 2009) showed that the likelihood ratio assumption implies that three regularities will occur in recognition memory: (1) the Mirror Effect, (2) the Variance Effect, and (3) the normalized Receiver Operating Characteristic (z-ROC) Length Effect. The paper offered formal proofs and computational demonstrations that decisions based on likelihood ratios produce the three regularities. A survey of data based on group ROCs from 36 studies validated the likelihood ratio assumption by showing that its three implied regularities are ubiquitous. The study noted, however, that bias, another basic factor in Signal Detection Theory, can obscure the Mirror Effect. In this paper we examine how bias affects the regularities at the theoretical level. The theoretical analysis shows: (1) how bias obscures the Mirror Effect, not the other two regularities, and (2) four ways to counter that obscuring. We then report the results of five experiments that support the theoretical analysis. The analyses and the experimental results also demonstrate: (1) that the three regularities govern individual, as well as group, performance, (2) that alternative explanations of the regularities are ruled out, and (3) that Signal Detection Theory, correctly applied, gives a simple and unified explanation of recognition memory data.
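
    To make the likelihood-ratio decision rule concrete, here is a minimal Python sketch (not the authors' code; the equal-variance Gaussian model and all parameter values are illustrative assumptions). It shows how likelihood-ratio decisions with a neutral bias produce the Mirror Effect described above.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Equal-variance SDT sketch: new-item strength ~ N(0, 1),
    # old-item strength ~ N(d', 1); respond "old" iff the likelihood
    # ratio f_old(x) / f_new(x) exceeds beta.

    def hit_fa_rates(d_prime, beta=1.0):
        # LR(x) = exp(d' * x - d'**2 / 2) is monotone in x, so the LR
        # criterion is equivalent to a criterion on x itself:
        # x > d'/2 + ln(beta)/d'.
        criterion = d_prime / 2.0 + np.log(beta) / d_prime
        hit = 1.0 - norm.cdf(criterion, loc=d_prime)
        false_alarm = 1.0 - norm.cdf(criterion, loc=0.0)
        return hit, false_alarm

    for label, dp in [("weak encoding", 1.0), ("strong encoding", 2.0)]:
        h, f = hit_fa_rates(dp)
        print(f"{label}: hit = {h:.3f}, false alarm = {f:.3f}")

    # Mirror Effect: the strong condition shows a higher hit rate AND a
    # lower false-alarm rate. Shifting beta away from 1 (bias) moves both
    # rates and can obscure this ordering, as the paper analyzes.
    ```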

  11. The Case for a Hierarchical Cosmology

    ERIC Educational Resources Information Center

    Vaucouleurs, G. de

    1970-01-01

    The development of modern theoretical cosmology is presented and some questionable assumptions of orthodox cosmology are pointed out. Suggests that recent observations indicate that hierarchical clustering is a basic factor in cosmology. The implications of hierarchical models of the universe are considered. Bibliography. (LC)

  12. Some Remarks on the Theory of Political Education. German Studies Notes.

    ERIC Educational Resources Information Center

    Holtmann, Antonius

    This theoretical discussion explores pedagogical assumptions of political education in West Germany. Three major methodological orientations are discussed: the normative-ontological, empirical-analytical, and dialectical-historical. The author recounts the aims, methods, and basic presuppositions of each of these approaches. Topics discussed…

  13. Calculation of Temperature Rise in Calorimetry.

    ERIC Educational Resources Information Center

    Canagaratna, Sebastian G.; Witt, Jerry

    1988-01-01

    Gives a simple but fuller account of the basis for accurately calculating temperature rise in calorimetry. Points out some misconceptions regarding these calculations. Describes two basic methods, the extrapolation to zero time and the equal area method. Discusses the theoretical basis of each and their underlying assumptions. (CW)

  14. Network Analysis in Comparative Social Sciences

    ERIC Educational Resources Information Center

    Vera, Eugenia Roldan; Schupp, Thomas

    2006-01-01

    This essay describes the pertinence of Social Network Analysis (SNA) for the social sciences in general, and discusses its methodological and conceptual implications for comparative research in particular. The authors first present a basic summary of the theoretical and methodological assumptions of SNA, followed by a succinct overview of its…

  15. Moral Development in Higher Education

    ERIC Educational Resources Information Center

    Liddell, Debora L.; Cooper, Diane L.

    2012-01-01

    In this article, the authors lay out the basic foundational concepts and assumptions that will guide the reader through the chapters to come as the chapter authors explore "how" moral growth can be facilitated through various initiatives on the college campus. This article presents a brief review of the theoretical frameworks that provide the…

  16. What Are We Looking For?--Pro Critical Realism in Text Interpretation

    ERIC Educational Resources Information Center

    Siljander, Pauli

    2011-01-01

    A visible role in the theoretical discourses on education has been played in the last couple of decades by the constructivist epistemologies, which have questioned the basic assumptions of realist epistemologies. The increased popularity of interpretative approaches especially has put the realist epistemologies on the defensive. Basing itself on…

  17. The Effective Elementary School Principal: Theoretical Bases, Research Findings and Practical Implications.

    ERIC Educational Resources Information Center

    Burnett, I. Emett, Jr.; Pankake, Anita M.

    Although much of the current school reform movement relies on the basic assumption of effective elementary school administration, insufficient effort has been made to synthesize key concepts found in organizational theory and management studies with relevant effective schools research findings. This paper attempts such a synthesis to help develop…

  18. Basic principles of respiratory function monitoring in ventilated newborns: A review.

    PubMed

    Schmalisch, Gerd

    2016-09-01

    Respiratory monitoring during mechanical ventilation provides a real-time picture of patient-ventilator interaction and is a prerequisite for lung-protective ventilation. Nowadays, measurements of airflow, tidal volume and applied pressures are standard in neonatal ventilators. The measurement of lung volume during mechanical ventilation by tracer gas washout techniques is still under development. The clinical use of capnography, although well established in adults, has not been embraced by neonatologists because of technical and methodological problems in very small infants. While the ventilatory parameters are well defined, the calculation of other physiological parameters is based upon specific assumptions that are difficult to verify. Incomplete knowledge of the theoretical background of these calculations and their limitations can lead to incorrect interpretations with clinical consequences. Therefore, the aim of this review was to describe the basic principles and the underlying assumptions of currently used methods for respiratory function monitoring in ventilated newborns and to highlight methodological limitations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Lectures on Dark Matter Physics

    NASA Astrophysics Data System (ADS)

    Lisanti, Mariangela

    Rotation curve measurements from the 1970s provided the first strong indication that a significant fraction of matter in the Universe is non-baryonic. In the intervening years, a tremendous amount of progress has been made on both the theoretical and experimental fronts in the search for this missing matter, which we now know constitutes nearly 85% of the Universe's matter density. This series of lectures provides an introduction to the basics of dark matter physics. They are geared for the advanced undergraduate or graduate student interested in pursuing research in high-energy physics. The primary goal is to build an understanding of how observations constrain the assumptions that can be made about the astro- and particle physics properties of dark matter. The lectures begin by delineating the basic assumptions that can be inferred about dark matter from rotation curves. A detailed discussion of thermal dark matter follows, motivating Weakly Interacting Massive Particles, as well as lighter-mass alternatives. As an application of these concepts, the phenomenology of direct and indirect detection experiments is discussed in detail.

  20. Study of photon emission by electron capture during solar nuclei acceleration, 1: Temperature-dependent cross section for charge changing processes

    NASA Technical Reports Server (NTRS)

    Perez-Peraza, J.; Alvarez, M.; Laville, A.; Gallegos, A.

    1985-01-01

    The study of charge-changing cross sections of fast ions colliding with matter provides the fundamental basis for the analysis of the charge states produced in such interactions. Given the high degree of complexity of the phenomena, there is no theoretical treatment able to give a comprehensive description. In fact, the processes involved depend strongly on the basic parameters of the projectile, such as velocity, charge state, and atomic number, and on the target parameters, namely its physical state (molecular, atomic or ionized matter) and density. The target velocity may also influence the process, through the temperature of the traversed medium. In addition, multiple electron transfer in single collisions complicates the phenomena further. In simplified cases, such as protons moving through atomic hydrogen, considerable agreement has been obtained between theory and experiment; in general, however, the available theoretical approaches have only limited validity in restricted regions of the basic parameters. Since most measurements of charge-changing cross sections are performed in atomic matter at ambient temperature, models are commonly based on the assumption of targets at rest; at astrophysical scales, however, temperature displays a wide range in atomic and ionized matter. Therefore, due to the lack of experimental data, an attempt is made here to quantify temperature-dependent cross sections on the basis of somewhat arbitrary, but physically reasonable, assumptions.

  1. Science Awareness and Science Literacy through the Basic Physics Course: Physics with a bit of Metaphysics?

    NASA Astrophysics Data System (ADS)

    Rusli, Aloysius

    2016-08-01

    Until the 1980s, it was well known and common practice in Indonesian Basic Physics courses to present physics through its effective technicalities: the ideally elastic spring, the pulley and moving blocks, the thermodynamics of ideal engine models, theoretical electrostatics and electrodynamics with model capacitors and inductors, wave behavior and its various superpositions, hopefully closing with a description of modern physics. A different approach was then also experimented with, using the Hobson and Moore texts, stressing the alternative aim of fostering awareness, not just mastery, of science and the scientific method. This is hypothesized to be more in line with the changed attitude of the so-called Millennials cohort, who are less attentive if not interested, and are more used to multitasking, which suits their shorter span of attention. The upside is increased awareness of science and the scientific method. The downside is that they get less experience of the scientific method itself, which bases itself intensely on critical observation, analytic thinking to set up conclusions or hypotheses, and checking the consistency of those hypotheses with measured data. Another aspect is the recognition that the human person encompasses both the reasoning capacity and the mental-spiritual-cultural capacity. This is considered essential as the world grows ever smaller due to increased communication capacity, causing strong interactions and nonlinear effects, and showing that value systems become more challenging and challenged by physics/science and its cosmology, which is successfully based on the scientific method. Students should therefore be made aware of the common basis of these two capacities: their assumptions, the reasoning capacity, and the consistency assumption. This shows that the limits of science are set by its basic quantifiable assumptions, and the limits of the mental-spiritual-cultural aspects of life by their basic metaphysical (non-quantifiable) assumptions. Bridging these two human aspects of life can lead to a "why" of science and a "meaning" of life. A progress report on these efforts is presented, consisting essentially of the results indicated by an extended format of the usual weekly reporting used previously in Basic Physics lectures.

  2. MHD processes in the outer heliosphere

    NASA Technical Reports Server (NTRS)

    Burlaga, L. F.

    1984-01-01

    The magnetic field measurements from Voyager and the magnetohydrodynamic (MHD) processes in the outer heliosphere are reviewed. A bibliography of the experimental and theoretical work concerning magnetic fields and plasmas observed in the outer heliosphere is given. Emphasis in this review is on basic concepts and dynamical processes involving the magnetic field. The theory that serves to explain and unify the interplanetary magnetic field and plasma observations is magnetohydrodynamics. Basic physical processes and observations that relate directly to solutions of the MHD equations are emphasized, but obtaining solutions of this complex system of equations involves various assumptions and approximations. The spatial and temporal complexity of the outer heliosphere and some approaches for dealing with this complexity are discussed.

  3. A comparison between EGS4 and MCNP computer modeling of an in vivo X-ray fluorescence system.

    PubMed

    Al-Ghorabie, F H; Natto, S S; Al-Lyhiani, S H

    2001-03-01

    The Monte Carlo computer codes EGS4 and MCNP were used to develop a theoretical model of a 180 degrees geometry in vivo X-ray fluorescence system for the measurement of platinum concentration in head and neck tumors. The model included specification of the photon source, collimators, phantoms and detector. Theoretical results were compared and evaluated against X-ray fluorescence data obtained experimentally from an existing system developed by the Swansea In Vivo Analysis and Cancer Research Group. The EGS4 results agreed well with the MCNP results. However, agreement between the measured spectral shape obtained using the experimental X-ray fluorescence system and the simulated spectral shape obtained using the two Monte Carlo codes was relatively poor. The main reason for the disagreement arises from a basic assumption that both codes share in their calculations: each assumes a "free" electron model for Compton interactions. This assumption underestimates the results and invalidates comparisons between predicted and experimental spectra.

  4. The Equations of Oceanic Motions

    NASA Astrophysics Data System (ADS)

    Müller, Peter

    2006-10-01

    Modeling and prediction of oceanographic phenomena and climate is based on the integration of dynamic equations. The Equations of Oceanic Motions derives and systematically classifies the most common dynamic equations used in physical oceanography, from large scale thermohaline circulations to those governing small scale motions and turbulence. After establishing the basic dynamical equations that describe all oceanic motions, Müller then derives approximate equations, emphasizing the assumptions made and physical processes eliminated. He distinguishes between geometric, thermodynamic and dynamic approximations and between the acoustic, gravity, vortical and temperature-salinity modes of motion. Basic concepts and formulae of equilibrium thermodynamics, vector and tensor calculus, curvilinear coordinate systems, and the kinematics of fluid motion and wave propagation are covered in appendices. Providing the basic theoretical background for graduate students and researchers of physical oceanography and climate science, this book will serve as both a comprehensive text and an essential reference.

  5. [Medical errors from positions of mutual relations of patient-lawyer-doctor].

    PubMed

    Radysh, Ia F; Tsema, Ie V; Mehed', V P

    2013-01-01

    The basic theoretical and practical aspects of the problem of malpractice in the Ukrainian health protection system are presented in the article. The essence of the term "malpractice" is expounded through specific examples. The types of malpractice, the conditions under which it arises, and the kinds of responsibility that follow from it are considered. Special attention is paid to the legal, mental, and ethical dimensions of the problem from the standpoint of protecting the rights of both patient and medical worker. The necessity of classifying malpractice as intentional or unintentional, and as possible or impermissible, is grounded.

  6. Self, College Experiences, and Society: Rethinking the Theoretical Foundations of Student Development Theory

    ERIC Educational Resources Information Center

    Winkle-Wagner, Rachelle

    2012-01-01

    This article examines the psychological theoretical foundations of college student development theory and the theoretical assumptions of this framework. A complementary, sociological perspective and the theoretical assumptions of this approach are offered. The potential limitations of the overuse of each perspective are considered. The conclusion…

  7. Ecological Footprint in relation to Climate Change Strategy in Cities

    NASA Astrophysics Data System (ADS)

    Belčáková, Ingrid; Diviaková, Andrea; Belaňová, Eliška

    2017-10-01

    The ecological footprint determines how many natural resources are consumed by an individual, city, region, state or all inhabitants of our planet in order to meet their requirements and needs. It includes all activities, from food consumption, housing and transport to waste production, and allows us to compare particular activities and their impacts on the environment and natural resources. The ecological footprint is an important tool for making the concept of sustainable development more accessible, using simplifications that provide the public with basic information on the situation of our planet. Calculations exist today for global (worldwide), national and local ecological footprints. In our research in cities, we concentrated on calculating a city's ecological footprint. The article outlines theoretical assumptions and practical results concerning the consequences of climate change in the cities of Bratislava and Nitra (Slovakia), describes the potential for mitigating adverse impacts of climate change, and provides information for the general and professional public on the theoretical assumptions behind ecological footprint calculation. The intention is to present an innovation of the ecological footprint calculation that takes into consideration the ecological stability of a city (with a specific focus on the micro-climate functions of green areas). Present possibilities to reduce the ecological footprint are also presented.

  8. Plant uptake of elements in soil and pore water: field observations versus model assumptions.

    PubMed

    Raguž, Veronika; Jarsjö, Jerker; Grolander, Sara; Lindborg, Regina; Avila, Rodolfo

    2013-09-15

    Contaminant concentrations in various edible plant parts transfer hazardous substances from polluted areas to animals and humans. Thus, the accurate prediction of plant uptake of elements is of significant importance. The processes involved contain many interacting factors and are, as such, complex. In contrast, the most common way to currently quantify element transfer from soils into plants is relatively simple, using an empirical soil-to-plant transfer factor (TF). This practice is based on theoretical assumptions that have been previously shown to not generally be valid. Using field data on concentrations of 61 basic elements in spring barley, soil and pore water at four agricultural sites in mid-eastern Sweden, we quantify element-specific TFs. Our aim is to investigate to which extent observed element-specific uptake is consistent with TF model assumptions and to which extent TF's can be used to predict observed differences in concentrations between different plant parts (root, stem and ear). Results show that for most elements, plant-ear concentrations are not linearly related to bulk soil concentrations, which is congruent with previous studies. This behaviour violates a basic TF model assumption of linearity. However, substantially better linear correlations are found when weighted average element concentrations in whole plants are used for TF estimation. The highest number of linearly-behaving elements was found when relating average plant concentrations to soil pore-water concentrations. In contrast to other elements, essential elements (micronutrients and macronutrients) exhibited relatively small differences in concentration between different plant parts. Generally, the TF model was shown to work reasonably well for micronutrients, whereas it did not for macronutrients. The results also suggest that plant uptake of elements from sources other than the soil compartment (e.g. from air) may be non-negligible. Copyright © 2013 Elsevier Ltd. All rights reserved.
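
    The TF model that these field data test can be stated in a few lines. Below is a minimal Python sketch (illustrative only; the concentrations are invented, not the study's data) of how a soil-to-plant transfer factor is computed and how its linearity assumption can be checked.

    ```python
    import numpy as np

    # Invented example data: element concentration in soil and in the
    # plant (weighted whole-plant average), e.g. in mg/kg, at four sites.
    c_soil = np.array([5.0, 12.0, 20.0, 35.0])
    c_plant = np.array([0.9, 2.1, 3.8, 6.5])

    # TF model assumption: C_plant = TF * C_soil (linear, through the origin).
    tf = c_plant / c_soil
    print("site-specific TFs:", np.round(tf, 3))

    # A roughly constant TF and a high linear correlation support the
    # assumption; strongly varying TFs violate it, as the paper reports
    # for most elements when only plant-ear concentrations are used.
    r = np.corrcoef(c_soil, c_plant)[0, 1]
    print(f"linear correlation r = {r:.3f}")
    ```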

  9. Rationality.

    PubMed

    Shafir, Eldar; LeBoeuf, Robyn A

    2002-01-01

    This chapter reviews selected findings in research on reasoning, judgment, and choice and considers the systematic ways in which people violate basic requirements of the corresponding normative analyses. Recent objections to the empirical findings are then considered; these objections question the findings' relevance to assumptions about rationality. These objections address the adequacy of the tasks used in the aforementioned research and the appropriateness of the critical interpretation of participants' responses, as well as the justifiability of some of the theoretical assumptions made by experimenters. The objections are each found not to seriously impinge on the general conclusion that people often violate tenets of rationality in inadvisable ways. In the process, relevant psychological constructs, ranging from cognitive ability and need for cognition, to dual process theories and the role of incentives, are discussed. It is proposed that the rationality critique is compelling and rightfully gaining influence in the social sciences in general.

  10. Zipf's word frequency law in natural language: a critical review and future directions.

    PubMed

    Piantadosi, Steven T

    2014-10-01

    The frequency distribution of words has been a key object of study in statistical linguistics for the past 70 years. This distribution approximately follows a simple mathematical form known as Zipf's law. This article first shows that human language has a highly complex, reliable structure in the frequency distribution over and above this classic law, although prior data visualization methods have obscured this fact. A number of empirical phenomena related to word frequencies are then reviewed. These facts are chosen to be informative about the mechanisms giving rise to Zipf's law and are then used to evaluate many of the theoretical explanations of Zipf's law in language. No prior account straightforwardly explains all the basic facts or is supported with independent evaluation of its underlying assumptions. To make progress at understanding why language obeys Zipf's law, studies must seek evidence beyond the law itself, testing assumptions and evaluating novel predictions with new, independent data.
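
    For readers unfamiliar with the law itself, the following minimal Python sketch (not from the article; the toy corpus is invented) fits the Zipf exponent alpha in f(r) ∝ 1/r**alpha from word counts. On large natural-language corpora the fitted alpha is close to 1.

    ```python
    from collections import Counter
    import numpy as np

    # Toy corpus (invented): count word frequencies and rank them.
    text = ("the quick brown fox jumps over the lazy dog the fox and "
            "the dog ran over the hill and the fox slept").split()
    counts = np.array(sorted(Counter(text).values(), reverse=True), dtype=float)
    ranks = np.arange(1, len(counts) + 1, dtype=float)

    # Zipf's law predicts log f = log C - alpha * log r, a straight line
    # in log-log space; fit the slope by least squares.
    slope, intercept = np.polyfit(np.log(ranks), np.log(counts), 1)
    print(f"fitted Zipf exponent alpha = {-slope:.2f}")
    # On a corpus this tiny the estimate is noisy; large corpora give
    # alpha near 1.
    ```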

  11. Neopositivism and the DSM psychiatric classification. An epistemological history. Part 1: Theoretical comparison.

    PubMed

    Aragona, Massimiliano

    2013-06-01

    Recent research suggests that the DSM psychiatric classification is in a paradigmatic crisis and that the DSM-5 will be unable to overcome it. One possible reason is that the DSM is based on a neopositivist epistemology which is inadequate for the present-day needs of psychopathology. However, in which sense is the DSM a neopositivist system? This paper will explore the theoretical similarities between the DSM structure and the neopositivist basic assumptions. It is shown that the DSM has the following neopositivist features: (a) a sharp distinction between scientific and non-scientific diagnoses; (b) the exclusion of the latter as nonsensical; (c) the faith in the existence of a purely observable basis (the description of reliable symptoms); (d) the introduction of the operative diagnostic criteria as rules of correspondence linking the observational level to the diagnostic concept.

  12. On the Worthwhileness of Theoretical Activities

    ERIC Educational Resources Information Center

    Hand, Michael

    2009-01-01

    R.S. Peters' arguments for the worthwhileness of theoretical activities are intended to justify education per se, on the assumption that education is necessarily a matter of initiating people into theoretical activities. If we give up this assumption, we can ask whether Peters' arguments might serve instead to justify the academic curriculum over…

  13. Study on low intensity aeration oxygenation model and optimization for shallow water

    NASA Astrophysics Data System (ADS)

    Chen, Xiao; Ding, Zhibin; Ding, Jian; Wang, Yi

    2018-02-01

    Aeration/oxygenation is an effective measure to improve the self-purification capacity in shallow water treatment, but high energy consumption, high noise and expensive management restrain the development and application of this process. Based on two-film theory, a theoretical model of the three-dimensional partial differential equation of aeration in shallow water is established. In order to simplify the equation, basic assumptions of gas-liquid mass transfer in the vertical direction and concentration diffusion in the horizontal direction are proposed, based on engineering practice, and are tested against simulation results for gas holdup obtained by simulating the gas-liquid two-phase flow in an aeration tank under low-intensity conditions. Based on these assumptions and the theory of shallow permeability, the model of three-dimensional partial differential equations is simplified and a calculation model of low-intensity aeration oxygenation is obtained. The model is verified by comparison with aeration experiments. The conclusions are as follows: (1) the calculation model of gas-liquid mass transfer in the vertical direction and concentration diffusion in the horizontal direction reflects the aeration process well; (2) under low-intensity conditions, long-term aeration and oxygenation is theoretically feasible for enhancing the self-purification capacity of water bodies; (3) for the same total aeration intensity, the effect of multipoint distributed aeration on the diffusion of oxygen concentration in the horizontal direction is pronounced; (4) in shallow water treatment, reducing the size of aeration equipment through miniaturization, arraying, low intensity and mobility, in order to overcome problems of high energy consumption, large size and noise, can provide a good reference.
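
    For a well-mixed volume, the two-film theory underlying this model reduces to the standard oxygen-transfer law dC/dt = kLa·(Cs − C). The Python sketch below integrates this law analytically (all parameter values are assumed for illustration, not taken from the paper).

    ```python
    import numpy as np

    # Two-film oxygen transfer: dC/dt = kLa * (Cs - C), with the
    # closed-form solution C(t) = Cs - (Cs - C0) * exp(-kLa * t).
    kLa = 0.15   # volumetric mass-transfer coefficient, 1/h (assumed)
    Cs = 9.1     # saturation dissolved-oxygen concentration, mg/L (assumed)
    C0 = 2.0     # initial dissolved-oxygen concentration, mg/L (assumed)

    t = np.linspace(0.0, 24.0, 7)            # hours
    C = Cs - (Cs - C0) * np.exp(-kLa * t)    # analytic solution
    for ti, ci in zip(t, C):
        print(f"t = {ti:5.1f} h: DO = {ci:.2f} mg/L")
    # Low-intensity aeration corresponds to a small kLa: oxygen still
    # approaches saturation, only over a longer time horizon.
    ```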

  14. A theoretical approach to measuring pilot workload

    NASA Technical Reports Server (NTRS)

    Kantowitz, B. H.

    1984-01-01

    Theoretical assumptions used by researchers in the area of attention were studied, with emphasis upon errors and inconsistent assumptions made by some researchers. Two GAT experiments, two laboratory studies and one field experiment were conducted.

  15. Dynamics of an HIV-1 infection model with cell mediated immunity

    NASA Astrophysics Data System (ADS)

    Yu, Pei; Huang, Jianing; Jiang, Jiao

    2014-10-01

    In this paper, we study the dynamics of an improved mathematical model on HIV-1 virus with cell mediated immunity. This new 5-dimensional model is based on the combination of a basic 3-dimensional HIV-1 model and a 4-dimensional immunity response model, which more realistically describes dynamics between the uninfected cells, infected cells, virus, the CTL response cells and CTL effector cells. Our 5-dimensional model may be reduced to the 4-dimensional model by applying a quasi-steady state assumption on the variable of virus. However, it is shown in this paper that virus is necessary to be involved in the modeling, and that a quasi-steady state assumption should be applied carefully, which may miss some important dynamical behavior of the system. Detailed bifurcation analysis is given to show that the system has three equilibrium solutions, namely the infection-free equilibrium, the infectious equilibrium without CTL, and the infectious equilibrium with CTL, and a series of bifurcations including two transcritical bifurcations and one or two possible Hopf bifurcations occur from these three equilibria as the basic reproduction number is varied. The mathematical methods applied in this paper include characteristic equations, Routh-Hurwitz condition, fluctuation lemma, Lyapunov function and computation of normal forms. Numerical simulation is also presented to demonstrate the applicability of the theoretical predictions.
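
    A minimal Python sketch of this class of model is given below. The interaction terms follow a common CTL-response formulation and all parameter values are assumed for illustration; they are not the paper's equations. Note how the virus equation is kept explicit rather than eliminated by the quasi-steady state substitution v ≈ k·y/u that the paper cautions about.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # x: uninfected cells, y: infected cells, v: free virus,
    # w: CTL response cells, z: CTL effector cells.
    def hiv(t, state, lam=1.0, d=0.1, beta=0.5, a=0.2, p=1.0,
            k=10.0, u=5.0, c=0.1, q=0.5, b=0.01, h=0.1):
        x, y, v, w, z = state
        dx = lam - d * x - beta * x * v            # supply, death, infection
        dy = beta * x * v - a * y - p * y * z      # infection, death, CTL killing
        dv = k * y - u * v                         # virus production, clearance
        dw = c * x * y * w - c * q * y * w - b * w # CTL response dynamics
        dz = c * q * y * w - h * z                 # differentiation to effectors
        return [dx, dy, dv, dw, dz]

    sol = solve_ivp(hiv, (0.0, 200.0), [10.0, 0.1, 0.1, 0.1, 0.0],
                    rtol=1e-8, atol=1e-10)
    print("state at t = 200:", np.round(sol.y[:, -1], 4))
    # The quasi-steady state reduction would replace dv with v = k*y/u,
    # collapsing the system to 4 dimensions; the paper shows this
    # shortcut can miss important dynamical behavior.
    ```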

  16. Why we do what we do: a theoretical evaluation of the integrated practice model for forensic nursing science.

    PubMed

    Valentine, Julie L

    2014-01-01

    An evaluation of the Integrated Practice Model for Forensic Nursing Science is presented, utilizing the evaluation methods outlined by Meleis. A brief review of nursing theory basics and evaluation methods by Meleis is provided to enhance understanding of the ensuing theoretical evaluation and critique. The Integrated Practice Model for Forensic Nursing Science, created by forensic nursing pioneer Virginia Lynch, captures the theories, assumptions, concepts, and propositions inherent in forensic nursing practice and science. The historical background of the theory is explored, as Lynch's model launched the role development of forensic nursing practice as both a nursing and forensic science specialty. It is derived from a combination of nursing, sociological, and philosophical theories to reflect the grounding of forensic nursing in the nursing, legal, psychological, and scientific communities. As Lynch's model is the first inception of forensic nursing theory, it is representative of a conceptual framework, although the title implies a practice theory. The clarity and consistency displayed in the theory's structural components of assumptions, concepts, and propositions are analyzed. The model is described and evaluated. A summary of the strengths and limitations of the model is compiled, followed by application to practice, education, and research, with suggestions for ongoing theory development.

  17. Hypersonic aerodynamic characteristics of a family of power-law, wing body configurations

    NASA Technical Reports Server (NTRS)

    Townsend, J. C.

    1973-01-01

    The configurations analyzed are half-axisymmetric, power-law bodies surmounted by thin, flat wings. The wing planform matches the body shock-wave shape. Analytic solutions of the hypersonic small disturbance equations form a basis for calculating the longitudinal aerodynamic characteristics. Boundary-layer displacement effects on the body and the wing upper surface are approximated. Skin friction is estimated by using compressible, laminar boundary-layer solutions. Good agreement was obtained with available experimental data for which the basic theoretical assumptions were satisfied. The method is used to estimate the effects of power-law, fineness ratio, and Mach number variations at full-scale conditions. The computer program is included.

  18. Zipf’s word frequency law in natural language: A critical review and future directions

    PubMed Central

    2014-01-01

    The frequency distribution of words has been a key object of study in statistical linguistics for the past 70 years. This distribution approximately follows a simple mathematical form known as Zipf's law. This article first shows that human language has a highly complex, reliable structure in the frequency distribution over and above this classic law, although prior data visualization methods have obscured this fact. A number of empirical phenomena related to word frequencies are then reviewed. These facts are chosen to be informative about the mechanisms giving rise to Zipf's law and are then used to evaluate many of the theoretical explanations of Zipf's law in language. No prior account straightforwardly explains all the basic facts or is supported with independent evaluation of its underlying assumptions. To make progress at understanding why language obeys Zipf's law, studies must seek evidence beyond the law itself, testing assumptions and evaluating novel predictions with new, independent data. PMID:24664880

  19. Testing electrostatic equilibrium in the ionosphere by detailed comparison of ground magnetic deflection and incoherent scatter radar.

    NASA Astrophysics Data System (ADS)

    Cosgrove, R. B.; Schultz, A.; Imamura, N.

    2016-12-01

    Although electrostatic equilibrium is always assumed in the ionosphere, there is no good theoretical or experimental justification for the assumption. In fact, recent theoretical investigations suggest that the electrostatic assumption may be grossly in error. If true, many commonly used modeling methods are placed in doubt. For example, the accepted method for calculating ionospheric conductance, field line integration, may be invalid. In this talk we briefly outline the theoretical research that places the electrostatic assumption in doubt, and then describe how comparison of ground magnetic field data with incoherent scatter radar (ISR) data can be used to test the electrostatic assumption in the ionosphere. We describe a recent experiment conducted for this purpose, in which an array of magnetometers was temporarily installed under the Poker Flat AMISR.

  20. Quantitative Methodology: A Guide for Emerging Physical Education and Adapted Physical Education Researchers

    ERIC Educational Resources Information Center

    Haegele, Justin A.; Hodge, Samuel R.

    2015-01-01

    Emerging professionals, particularly senior-level undergraduate and graduate students in kinesiology who have an interest in physical education for individuals with and without disabilities, should understand the basic assumptions of the quantitative research paradigm. Knowledge of basic assumptions is critical for conducting, analyzing, and…

  1. What neuropsychology tells us about human tool use? The four constraints theory (4CT): mechanics, space, time, and effort.

    PubMed

    Osiurak, François

    2014-06-01

    Our understanding of human tool use comes mainly from neuropsychology, particularly from patients with apraxia or action disorganization syndrome. However, there is no integrative, theoretical framework explaining what these neuropsychological syndromes tell us about the cognitive/neural bases of human tool use. The goal of the present article is to fill this gap by providing a theoretical framework for the study of human tool use: the Four Constraints Theory (4CT). This theory rests on two basic assumptions. First, everyday tool use activities can be formalized as multiple problem situations consisting of four distinct constraints (mechanics, space, time, and effort). Second, each of these constraints can be solved by means of a specific process (technical reasoning, semantic reasoning, working memory, and simulation-based decision-making, respectively). Besides presenting neuropsychological evidence for 4CT, this article addresses epistemological, theoretical and methodological issues that I will attempt to resolve. The article also discusses how 4CT diverges from current cognitive models on several widespread hypotheses (e.g., the notion of routine, the direct and automatic activation of tool knowledge, simulation-based tool knowledge).

  2. The theoretical limit to plant productivity.

    PubMed

    DeLucia, Evan H; Gomez-Casanovas, Nuria; Greenberg, Jonathan A; Hudiburg, Tara W; Kantola, Ilsa B; Long, Stephen P; Miller, Adam D; Ort, Donald R; Parton, William J

    2014-08-19

    Human population and economic growth are accelerating the demand for plant biomass to provide food, fuel, and fiber. The annual increment of biomass to meet these needs is quantified as net primary production (NPP). Here we show that an underlying assumption in some current models may lead to underestimates of the potential production from managed landscapes, particularly of bioenergy crops that have low nitrogen requirements. Using a simple light-use efficiency model and the theoretical maximum efficiency with which plant canopies convert solar radiation to biomass, we provide an upper-envelope NPP unconstrained by resource limitations. This theoretical maximum NPP approached 200 tC ha^-1 yr^-1 at point locations, roughly 2 orders of magnitude higher than most current managed or natural ecosystems. Recalculating the upper envelope estimate of NPP limited by available water reduced it by half or more in 91% of the land area globally. While the high conversion efficiencies observed in some extant plants indicate great potential to increase crop yields without changes to the basic mechanism of photosynthesis, particularly for crops with low nitrogen requirements, realizing such high yields will require improvements in water use efficiency.
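
    The scale of such an upper envelope can be reproduced with back-of-envelope arithmetic. The Python sketch below uses assumed, illustrative values (not the paper's exact inputs) for a simple light-use efficiency calculation.

    ```python
    # All values below are assumptions for illustration only.
    solar = 7.5e3           # annual solar radiation, MJ m^-2 yr^-1 (sunny site)
    epsilon_max = 0.06      # theoretical max solar-to-biomass conversion efficiency
    energy_density = 17.5   # energy content of dry biomass, MJ kg^-1
    carbon_fraction = 0.45  # carbon per unit dry biomass, kg C per kg

    biomass = solar * epsilon_max / energy_density  # kg dry biomass m^-2 yr^-1
    npp = biomass * carbon_fraction * 10.0          # tC ha^-1 yr^-1 (1 kg/m^2 = 10 t/ha)
    print(f"upper-envelope NPP ~ {npp:.0f} tC ha^-1 yr^-1")
    # This lands at the same order of magnitude as the paper's ~200
    # tC/ha/yr point maximum; water limitation cuts the envelope by half
    # or more over most of the global land area.
    ```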

  3. Brain Modules, Personality Layers, Planes of Being, Spiral Structures, and the Equally Implausible Distinction between TCI-R "Temperament" and "Character" Scales: A Reply to Cloninger.

    PubMed

    Farmer, Richard F; Goldberg, Lewis R

    2008-09-01

    In this reply we address comments by Cloninger (this issue) related to our report (Farmer & Goldberg, this issue) on the psychometric properties of the revised Temperament and Character Inventory (TCI-R) and a short inventory derivative, the TCI-140. Even though Cloninger's psychobiological model has undergone substantial theoretical modifications, the relevance of these changes for the evaluation and use of the TCI-R remains unclear. Aspects of TCI-R assessment also appear to be theoretically and empirically incongruent with Cloninger's assertion that TCI-R personality domains are non-linear and dynamic in nature. Several other core assumptions from the psychobiological model, including this most recent iteration, are non-falsifiable, inconsistently supported, or have no apparent empirical basis. Although researchers using the TCI and TCI-R have frequently accepted the temperament/character distinction and associated theoretical ramifications, for example, we find little overall support for the differentiation of TCI-R domains into these two basic categories. The implications of these observations for TCI-R assessment are briefly discussed.

  4. Didactics and History of Mathematics: Knowledge and Self-Knowledge

    ERIC Educational Resources Information Center

    Fried, Michael N.

    2007-01-01

    The basic assumption of this paper is that mathematics and history of mathematics are both forms of knowledge and, therefore, represent different ways of knowing. This was also the basic assumption of Fried (2001) who maintained that these ways of knowing imply different conceptual and methodological commitments, which, in turn, lead to a conflict…

  5. The Discrepancy-Induced Source Comprehension (D-ISC) Model: Basic Assumptions and Preliminary Evidence

    ERIC Educational Resources Information Center

    Braasch, Jason L. G.; Bråten, Ivar

    2017-01-01

    Despite the importance of source attention and evaluation for learning from texts, little is known about the particular conditions that encourage sourcing during reading. In this article, basic assumptions of the discrepancy-induced source comprehension (D-ISC) model are presented, which describes the moment-by-moment cognitive processes that…

  6. Principles of Bobath neuro-developmental therapy in cerebral palsy.

    PubMed

    Klimont, L

    2001-01-01

    The purpose of this article is to present the basics of Bobath Neurodevelopment Therapy (NDT) for the rehabilitation of patients with cerebral palsy, based on the fundamentals of neurophysiology.
    Two factors are continually stressed in therapy: first, postural tension, whose quality provides the foundation for the development of motor coordination, both normal and pathological, and plays a role in shaping the mechanism of the normal postural reflex; and secondly, the impact of damage to the central nervous system on the process of its growth and development.
    The practical application of the theoretical assumptions includes the use of inhibition, facilitation, and stimulation by key points of control, preparatory to evoking more nearly normal motor responses.

  7. Knowledge Discovery from Relations

    ERIC Educational Resources Information Center

    Guo, Zhen

    2010-01-01

    A basic and classical assumption in the machine learning research area is the "randomness assumption" (also known as the i.i.d. assumption), which states that data are assumed to be independently and identically generated by some known or unknown distribution. This assumption, which is the foundation of most existing approaches in the literature, simplifies…

  8. Teaching Critical Literacy across the Curriculum in Multimedia America.

    ERIC Educational Resources Information Center

    Semali, Ladislaus M.

    The teaching of media texts as a form of textual construction is embedded in the assumption that audiences bring individual preexisting dispositions even though the media may contribute to their shaping of basic attitudes, beliefs, values, and behavior. As summed up by D. Lusted, at the core of such textual construction are basic assumptions that…

  9. The generalized van der Waals theory of pure fluids and mixtures: Annual report for September 1985 to November 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandler, S.I.

    1986-01-01

    The objective of the work is to use the generalized van der Waals theory, as derived earlier ("The Generalized van der Waals Partition Function I. Basic Theory" by S.I. Sandler, Fluid Phase Equilibria 19, 233 (1985)) to: (1) understand the molecular level assumptions inherent in current thermodynamic models; (2) use theory and computer simulation studies to test these assumptions; and (3) develop new, improved thermodynamic models based on better molecular level assumptions. From such a fundamental study, thermodynamic models will be developed that will be applicable to mixtures of molecules of widely different size and functionality, as occurs in the processing of heavy oils, coal liquids and other synthetic fuels. An important aspect of our work is to reduce our fundamental theoretical developments to engineering practice through extensive testing and evaluation with experimental data on real mixtures. During the first year of this project important progress was made in the areas specified in the original proposal, as well as several subsidiary areas identified as the work progressed. Some of this work has been written up and submitted for publication. Manuscripts acknowledging DOE support, together with a very brief description, are listed herein.

  10. Whose drag is it anyway? Drag kings and monarchy in the UK.

    PubMed

    Willox, Annabelle

    2002-01-01

    This chapter will show that the term "drag" in drag queen has a different meaning, history and value from the term "drag" in drag king. By exposing this basic yet fundamental difference, this paper will expose the problems inherent in the assumption of parity between the two forms of drag. An exposition of how camp has been used to comprehend and theorise drag queens will facilitate an understanding of the parasitic interrelationship between camp and drag queen performances, while a critique of "Towards a Butch-Femme Aesthetic," by Sue Ellen Case, will point out the problematic assumptions made about camp when it is attributed to a cultural location different from that of the drag queen. By interrogating the historical, cultural and theoretical similarities and differences between drag kings, butches, drag queens and femmes, this paper will expose the flawed assumption that camp can be attributed to all of the above without proviso, and hence expose why drag has a fundamentally different contextual meaning for kings and queens. This chapter will conclude by examining the work of both Judith Halberstam and Biddy Martin and the practical examples of drag king and queen performances provided at the UK drag contest held at The Fridge in Brixton, London on 23 June 1999.

  11. Small Molecule Docking from Theoretical Structural Models

    NASA Astrophysics Data System (ADS)

    Novoa, Eva Maria; de Pouplana, Lluis Ribas; Orozco, Modesto

    Structural approaches to rational drug design rely on the basic assumption that pharmacological activity requires, as a necessary but not sufficient condition, the binding of a drug to one or several cellular targets, proteins in most cases. The traditional paradigm assumes that drugs that interact only with a single cellular target are specific and accordingly have few secondary effects, while promiscuous molecules are more likely to generate undesirable side effects. However, current examples indicate that efficient drugs are often able to interact with several biological targets [1], and in fact some dirty drugs, such as chlorpromazine, dextromethorphan, and ibogaine, exhibit desired pharmacological properties [2]. These considerations highlight the tremendous difficulty of designing small molecules that both have satisfactory ADME properties and the ability to interact with a limited set of target proteins with high affinity, while avoiding undesirable interactions with other proteins. In this complex and challenging scenario, computer simulations emerge as the basic tool to guide medicinal chemists during the drug discovery process.

  12. Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.

    PubMed

    Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian

    2011-01-01

    Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
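
    The two basic assumptions, crisp input probabilities and independence, can be made concrete with a minimal fault-tree sketch in Python (event names and probabilities are invented for illustration):

    ```python
    # Invented basic events with crisp failure probabilities.
    p_pump_fails = 0.01
    p_valve_fails = 0.02
    p_sensor_fails = 0.05

    # AND gate (top event requires all inputs to fail), assuming independence:
    p_and = p_pump_fails * p_valve_fails

    # OR gate (any single failure suffices), assuming independence:
    p_or = 1.0 - (1.0 - p_pump_fails) * (1.0 - p_sensor_fails)

    print(f"P(AND gate) = {p_and:.6f}")
    print(f"P(OR gate)  = {p_or:.6f}")
    # The paper replaces these crisp values with fuzzy numbers or
    # evidence-theory intervals, and relaxes the independence assumption
    # through a dependency coefficient.
    ```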

  13. Intonation and compensation of fretted string instruments

    NASA Astrophysics Data System (ADS)

    Varieschi, Gabriele; Gower, Christina

    2011-04-01

    We discuss theoretical and physical models that are useful for analyzing the intonation of musical instruments such as guitars and mandolins and can be used to improve the tuning on these instruments. The placement of frets on the fingerboard is designed according to mathematical rules and the assumption of an ideal string. The analysis becomes more complicated when we include the effects of deformation of the string and inharmonicity due to other string characteristics. As a consequence, perfect intonation of all the notes on the instrument cannot be achieved, but complex compensation procedures can be introduced to minimize the problem. To test the validity of these procedures, we performed extensive measurements using standard monochord sonometers and other acoustical devices, confirming the correctness of our theoretical models. These experimental activities can be integrated into acoustics courses and laboratories and can become a more advanced version of basic experiments with monochords and sonometers. This work was supported by a grant from the Frank R. Seaver College of Science and Engineering, Loyola Marymount University.
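
    The ideal-string fret placement rule mentioned above is simple to state: in twelve-tone equal temperament the vibrating length for fret n is L/2^(n/12). A minimal Python sketch (the scale length is an assumed, typical value):

    ```python
    # Assumed scale length of 650 mm (typical for a classical guitar).
    L = 650.0

    for n in range(1, 13):
        d = L * (1.0 - 2.0 ** (-n / 12.0))   # distance from nut to fret n
        print(f"fret {n:2d}: {d:7.2f} mm from the nut")
    # Fret 12 lands exactly at L/2 (one octave). Real strings deviate
    # from the ideal-string assumption through stiffness and deformation,
    # which is what the compensation procedures discussed above correct.
    ```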

  14. Evolutionary ethnobiology and cultural evolution: opportunities for research and dialog.

    PubMed

    Santoro, Flávia Rosa; Nascimento, André Luiz Borba; Soldati, Gustavo Taboada; Ferreira Júnior, Washington Soares; Albuquerque, Ulysses Paulino

    2018-01-09

    The interest in theoretical frameworks that improve our understanding of social-ecological systems is growing within the field of ethnobiology. Several evolutionary questions may underlie the relationships between people and the natural resources that are investigated in this field. A new branch of research, known as evolutionary ethnobiology (EE), focuses on these questions and has recently been formally conceptualized. The field of cultural evolution (CE) has significantly contributed to the development of this new field, and it has introduced the Darwinian concepts of variation, competition, and heredity to studies that focus on the dynamics of local knowledge. In this article, we introduce CE as an important theoretical framework for evolutionary ethnobiological research. We present the basic concepts and assumptions of CE, along with the adjustments that are necessary for its application in EE. We discuss different ethnobiological studies in the context of this new framework and the new opportunities for research that exist in this area. We also propose a dialog that includes our findings in the context of cultural evolution.

  15. [Memorandum IV: Theoretical and Normative Grounding of Health Services Research].

    PubMed

    Baumann, W; Farin, E; Menzel-Begemann, A; Meyer, T

    2016-05-01

    With its memoranda and other initiatives, the German Network for Health Services Research [Deutsches Netzwerk Versorgungsforschung e.V. (DNVF)] has been fostering the methodological quality of health services research studies for years. Compared to the standards of empirical research, however, questions concerning the role and function of theories, theoretical approaches and scientific principles have received little attention in their own right. The DNVF therefore set up a working group in 2013, commissioned to prepare a memorandum on "theories in health care research". The memorandum presented here primarily challenges scholars in health services research to pay more attention to the theoretical arsenal and the background assumptions of the research process. Its foundation in the philosophy of science, its reference to normative principles and the theoretical bases of the research process are addressed. Moreover, the memorandum calls for advancing theorizing in health services research, for strengthening non-empirical approaches, research on basic principles and studies grounded in the normative sciences, and for incorporating these relevant disciplines into health services research. Research structures and the funding of health services research need more room for theoretical reflection and for self-observation of their own, multidisciplinary research processes. © Georg Thieme Verlag KG Stuttgart · New York.

  16. The Influence of Theoretical Tools on Teachers' Orientation to Notice and Classroom Practice: A Case Study

    ERIC Educational Resources Information Center

    Mellone, Maria

    2011-01-01

    Assumptions about the construction and the transmission of knowledge and about the nature of mathematics always underlie any teaching practice, even if often unconsciously. I examine the conjecture that theoretical tools suitably chosen can help the teacher to make such assumptions explicit and to support the teacher's reflection on his/her…

  17. Preliminary Investigation of an Underwater Ramjet Powered by Compressed Air

    NASA Technical Reports Server (NTRS)

    Mottard, Elmo J.; Shoemaker, Charles J.

    1961-01-01

    Part I contains the results of a preliminary experimental investigation of a particular design of an underwater ramjet or hydroduct powered by compressed air. The hydroduct is a propulsion device in which the energy of an expanding gas imparts additional momentum to a stream of water through mixing. The hydroduct model had a fineness ratio of 5.9, a maximum diameter of 3.2 inches, and a ratio of inlet area to frontal area of 0.32. The model was towed at a depth of 1 inch at forward speeds between 20 and 60 feet per second for airflow rates from 0.1 to 0.3 pound per second. Longitudinal force and pressures at the inlet and in the mixing chamber were determined. The hydroduct produced a positive thrust-minus-drag force at every test speed. The force and pressure coefficients were functions primarily of the ratio of weight airflow to free-stream velocity. The maximum propulsive efficiency based on the net internal thrust and an isothermal expansion of the air was approximately 53 percent at a thrust coefficient of 0.10. The performance of the test model may have been influenced by choking of the exit flow. Part II is a theoretical development of an underwater ramjet using air as "fuel." The basic assumption of the theoretical analysis is that a mixture of water and air can be treated as a compressible gas. More information on the properties of air-water mixtures is required to confirm this assumption or to suggest another approach. A method is suggested from which a more complete theoretical development, with the effects of choking included, may be obtained. An exploratory computation, in which this suggested method was used, indicated that the effect of choked flow on the thrust coefficient was minor.

  18. Rumor spreading model with the different attitudes towards rumors

    NASA Astrophysics Data System (ADS)

    Hu, Yuhan; Pan, Qiuhui; Hou, Wenbing; He, Mingfeng

    2018-07-01

    Rumor spreading has a profound influence on people's well-being and social stability, and many factors influence it. In this paper, we introduce the assumption that the common mass holds three attitudes towards rumors: liking rumor spreading, disliking rumor spreading, and being hesitant (or neutral) about rumor spreading. Based on this assumption, a Susceptible-Hesitating-Affected-Resistant (SHAR) model is established that accounts for individuals' different attitudes towards rumor spreading. We analyze the local and global stability of the rumor-free and rumor-existence equilibria and calculate the basic reproduction number of the model. With numerical simulations, we illustrate the effect of parameter changes on rumor spreading and analyze the parameter sensitivity of the model. The numerical simulations support the conclusions of the theoretical analysis: people holding different attitudes towards rumors may play different roles in the process of rumor spreading. Surprisingly, we find that people who hesitate to spread rumors have a positive effect on the spread of rumors.
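
    A minimal numerical sketch of a SHAR-type compartment model is given below. The flows and parameter values are illustrative assumptions based on standard rumor-spreading dynamics, not the authors' exact system: susceptibles contact spreaders and split among the three attitudes, hesitators either start spreading or turn resistant, and spreaders eventually stifle.

        # Hedged SHAR sketch (illustrative rates, not the paper's equations).
        # S: susceptible, H: hesitating, A: affected (spreading), R: resistant.
        beta, p_like, p_hes = 0.5, 0.3, 0.4   # contact rate; attitude split (rest dislike)
        alpha, delta, gamma = 0.2, 0.1, 0.15  # H->A, H->R and A->R rates

        def step(s, h, a, r, dt=0.1):
            contacts = beta * s * a           # mass-action contacts with spreaders
            ds = -contacts
            dh = p_hes * contacts - (alpha + delta) * h
            da = p_like * contacts + alpha * h - gamma * a
            dr = (1 - p_like - p_hes) * contacts + delta * h + gamma * a
            return s + ds * dt, h + dh * dt, a + da * dt, r + dr * dt

        s, h, a, r = 0.99, 0.0, 0.01, 0.0     # population fractions; they sum to 1
        for _ in range(2000):
            s, h, a, r = step(s, h, a, r)
        print(f"final sizes: S={s:.3f} H={h:.3f} A={a:.3f} R={r:.3f}")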

  19. Psychotherapy research needs theory. Outline for an epistemology of the clinical exchange.

    PubMed

    Salvatore, Sergio

    2011-09-01

    This paper provides an analysis of a basic assumption grounding clinical research: the ontological autonomy of psychotherapy, based on the idea that the clinical exchange is sufficiently distinct from other social exchanges (e.g. between teacher and pupils, between buyer and seller, or interaction during dinner). A criticism of this assumption is discussed together with the proposal of a different epistemological interpretation, based on the distinction between communicative dynamics and the process of psychotherapy: psychotherapy is a goal-oriented process based on the general dynamics of human communication. Theoretical and methodological implications are drawn from this view: it allows further sources of knowledge to be integrated within clinical research (i.e. those coming from other domains of analysis of human communication), and it enables a more abstract definition of the psychotherapy process to be developed, leading to innovative views of classical critical issues, like the specific-nonspecific debate. The final part of the paper is devoted to presenting a model of human communication--the Semiotic Dialogical Dialectic Theory--which is meant as the framework for the analysis of psychotherapy.

  20. Helping Students to Recognize and Evaluate an Assumption in Quantitative Reasoning: A Basic Critical-Thinking Activity with Marbles and Electronic Balance

    ERIC Educational Resources Information Center

    Slisko, Josip; Cruz, Adrian Corona

    2013-01-01

    There is a general agreement that critical thinking is an important element of 21st century skills. Although critical thinking is a very complex and controversial conception, many would accept that recognition and evaluation of assumptions is a basic critical-thinking process. When students use simple mathematical model to reason quantitatively…

  1. An Introduction to the Problem of the Existence of Classical and Quantum Information

    NASA Astrophysics Data System (ADS)

    Rocchi, Paolo; Gianfagna, Leonida

    2006-01-01

    Quantum computing has prompted fresh reflection on the nature of information; notably, a number of theorists have set out to re-examine the critical elements of Shannon's work, which currently stands as the most popular reference in the quantum territory. The present paper follows this vein and highlights how the prerequisites of information theory, which should spell out the precise hypotheses of the theory, remain rather obscure, so that the problem of the existence of information is still open. This work puts forward a theoretical scheme that derives the existence of elementary information items. These results clarify basic assumptions in information engineering. We then present evidence that information is not an absolute quantity and close with a discussion of the relativity of information.

  2. Categorial Compositionality II: Universal Constructions and a General Theory of (Quasi-)Systematicity in Human Cognition

    PubMed Central

    Phillips, Steven; Wilson, William H.

    2011-01-01

    A complete theory of cognitive architecture (i.e., the basic processes and modes of composition that together constitute cognitive behaviour) must explain the systematicity property—why our cognitive capacities are organized into particular groups of capacities, rather than some other, arbitrary collection. The classical account supposes: (1) syntactically compositional representations; and (2) processes that are sensitive to—compatible with—their structure. Classical compositionality, however, does not explain why these two components must be compatible; they are only compatible by the ad hoc assumption (convention) of employing the same mode of (concatenative) compositionality (e.g., prefix/postfix, where a relation symbol is always prepended/appended to the symbols for the related entities). Architectures employing mixed modes do not support systematicity. Recently, we proposed an alternative explanation without ad hoc assumptions, using category theory. Here, we extend our explanation to domains that are quasi-systematic (e.g., aspects of most languages), where the domain includes some but not all possible combinations of constituents. The central category-theoretic construct is an adjunction involving pullbacks, where the primary focus is on the relationship between processes modelled as functors, rather than the representations. A functor is a structure-preserving map (or construction, for our purposes). An adjunction guarantees that the only pairings of functors are the systematic ones. Thus, (quasi-)systematicity is a necessary consequence of a categorial cognitive architecture whose basic processes are functors that participate in adjunctions. PMID:21857816

  3. A Minimalist Analysis of English Topicalization: A Phase-Based Cartographic Complementizer Phrase (CP) Perspective.

    PubMed

    Tanaka, Hiroyoshi

    Under the basic tenet that syntactic derivation offers an optimal solution to both the phonological realization and the semantic interpretation of linguistic expressions, the recent minimalist framework of syntactic theory claims that the basic unit of derivation is a syntactic propositional element, which is called a phase. In this analysis, syntactic derivation is assumed to proceed at phasal projections, which include Complementizer Phrases (CP). However, some empirical problems have been pointed out concerning the failure of multiple occurrences of discourse-related elements in the CP domain. This problem can easily be overcome if the alternative approach within the recent minimalist perspective, Cartographic CP analysis, is adopted, but this raises a theoretical issue about the tension between phasality and the four functional projections assumed in that analysis (Force Phrase (ForceP), Finite Phrase (FinP), Topic Phrase (TopP) and Focus Phrase (FocP)). This paper argues that a hybrid analysis combining these two influential approaches can be proposed under the reasonable assumption that the syntactically requisite projections (i.e., ForceP and FinP) are phases and independently constitute a phasehood with the relevant heads in the derivation. This enables us to capture various syntactic properties of the Topicalization construction in English. Our proposed analysis, coupled with some additional assumptions and observations from recent minimalist studies, can be extended to cover peculiar properties of temporal/conditional adverbials and imperatives.

  4. Causality and headache triggers

    PubMed Central

    Turner, Dana P.; Smitherman, Todd A.; Martin, Vincent T.; Penzien, Donald B.; Houle, Timothy T.

    2013-01-01

    Objective The objective of this study was to explore the conditions necessary to assign causal status to headache triggers. Background The term “headache trigger” is commonly used to label any stimulus that is assumed to cause headaches. However, the assumptions required for determining if a given stimulus in fact has a causal-type relationship in eliciting headaches have not been explicated. Methods A synthesis and application of Rubin’s Causal Model is applied to the context of headache causes. From this application the conditions necessary to infer that one event (trigger) causes another (headache) are outlined using basic assumptions and examples from relevant literature. Results Although many conditions must be satisfied for a causal attribution, three basic assumptions are identified for determining causality in headache triggers: 1) constancy of the sufferer; 2) constancy of the trigger effect; and 3) constancy of the trigger presentation. A valid evaluation of a potential trigger’s effect can only be undertaken once these three basic assumptions are satisfied during formal or informal studies of headache triggers. Conclusions Evaluating these assumptions is extremely difficult or infeasible in clinical practice, and satisfying them during natural experimentation is unlikely. Researchers, practitioners, and headache sufferers are encouraged to avoid natural experimentation to determine the causal effects of headache triggers. Instead, formal experimental designs or retrospective diary studies using advanced statistical modeling techniques provide the best approaches to satisfy the required assumptions and inform causal statements about headache triggers. PMID:23534872

  5. On the Origins of the Linear Free Energy Relationships: Exploring the Nature of the Off-Diagonal Coupling Elements in SN2 Reactions

    PubMed Central

    Rosta, Edina; Warshel, Arieh

    2012-01-01

    Understanding the relationship between the adiabatic free energy profiles of chemical reactions and the underlying diabatic states is central to the description of chemical reactivity. The diabatic states form the theoretical basis of Linear Free Energy Relationships (LFERs) and thus play a major role in physical organic chemistry and related fields. However, the theoretical justification for some of the implicit LFER assumptions has not been fully established by quantum mechanical studies. This study follows our earlier works1,2 and uses the ab initio frozen density functional theory (FDFT) method3 to evaluate both the diabatic and adiabatic free energy surfaces and to determine the corresponding off-diagonal coupling matrix elements for a series of SN2 reactions. It is found that the off-diagonal coupling matrix elements are almost the same regardless of the nucleophile and the leaving group but change upon changing the central group. It is also found that the off-diagonal elements are basically the same in the gas phase and in solution, even when the solvent is explicitly included in the ab initio calculations. Furthermore, our study establishes that the FDFT diabatic profiles are parabolic to a good approximation, thus providing first-principles support for the origin of LFERs. These findings further support the basic approximation of the EVB treatment. PMID:23329895
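
    For orientation, the EVB picture that these findings support builds the adiabatic ground state from two diabatic free-energy functions g1, g2 and the off-diagonal coupling H12 via the standard two-state expression (generic notation, not the authors' exact symbols):

        E_g(x) \;=\; \frac{g_1(x) + g_2(x)}{2}
        \;-\; \sqrt{\left(\frac{g_1(x) - g_2(x)}{2}\right)^{2} + H_{12}^{2}(x)}

    With parabolic diabats of similar curvature and an approximately constant H12, shifts in the reaction free energy move the diabatic crossing approximately linearly (for small driving forces), which is the usual route from this expression to an LFER.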

  6. Teaching Critical Thinking by Examining Assumptions

    ERIC Educational Resources Information Center

    Yanchar, Stephen C.; Slife, Brent D.

    2004-01-01

    We describe how instructors can integrate the critical thinking skill of examining theoretical assumptions (e.g., determinism and materialism) and implications into psychology courses. In this instructional approach, students formulate questions that help them identify assumptions and implications, use those questions to identify and examine the…

  7. The Applied Behavior Analysis Research Paradigm and Single-Subject Designs in Adapted Physical Activity Research.

    PubMed

    Haegele, Justin A; Hodge, Samuel Russell

    2015-10-01

    There are basic philosophical and paradigmatic assumptions that guide scholarly research endeavors, including the methods used and the types of questions asked. Through this article, kinesiology faculty and students with interests in adapted physical activity are encouraged to understand the basic assumptions of applied behavior analysis (ABA) methodology for conducting, analyzing, and presenting research of high quality in this paradigm. The purposes of this viewpoint paper are to present information fundamental to understanding the assumptions undergirding research methodology in ABA, describe key aspects of single-subject research designs, and discuss common research designs and data-analysis strategies used in single-subject studies.

  8. The Robustness of the Studentized Range Statistic to Violations of the Normality and Homogeneity of Variance Assumptions.

    ERIC Educational Resources Information Center

    Ramseyer, Gary C.; Tcheng, Tse-Kia

    The present study was directed at determining the extent to which the Type I Error rate is affected by violations in the basic assumptions of the q statistic. Monte Carlo methods were employed, and a variety of departures from the assumptions were examined. (Author)

  9. Students' perspectives on basic nursing care education.

    PubMed

    Huisman-de Waal, Getty; Feo, Rebecca; Vermeulen, Hester; Heinen, Maud

    2018-02-05

    The aim of the study is to explore the perspectives of nursing students on their education concerning basic nursing care, learned either during theoretical education or clinical placement, with a specific focus on nutrition and communication. Basic care activities lie at the core of nursing, but are ill-informed by evidence and often poorly delivered. Nursing students' education on basic care might be lacking, and the question remains how they learn to deliver basic care in clinical practice. Descriptive study, using an online questionnaire. Nursing students at the vocational and bachelor level of six nursing schools in the Netherlands were invited to complete an online questionnaire regarding their perception of basic nursing care education in general (both theoretical education and clinical placement), and specifically in relation to nutrition and communication. Nursing students (n=226 bachelor students, n=30 vocational students) completed the questionnaire. Most students reported that they learned more about basic nursing care during clinical placement than during theoretical education. Vocational students also reported learning more about basic nursing care in both theoretical education and clinical practice than bachelor students. In terms of nutrition, low numbers of students from both education levels reported learning about nutrition protocols and guidelines during theoretical education. In terms of communication, vocational students indicated that they learned more about different aspects of communication during clinical practice than theoretical education, and were also more likely to learn about communication (in both theoretical education and clinical practice) than were bachelor students. Basic nursing care seems to be largely invisible in nursing education, especially at the bachelor level and during theoretical education. Improved basic nursing care will enhance nurse sensitive outcomes and patient satisfaction and will contribute to lower healthcare costs. This study shows that there is scope within current nurse education in the Netherlands to focus more systematically and explicitly on basic nursing care. This article is protected by copyright. All rights reserved.

  10. Assessing theoretical uncertainties in fission barriers of superheavy nuclei

    DOE PAGES

    Agbemava, S. E.; Afanasjev, A. V.; Ray, D.; ...

    2017-05-26

    Here, theoretical uncertainties in the predictions of inner fission barrier heights in superheavy elements have been investigated in a systematic way for a set of state-of-the-art covariant energy density functionals which represent the major classes of functionals used in covariant density functional theory. They differ in basic model assumptions and fitting protocols. Both systematic and statistical uncertainties have been quantified, and the former turn out to be larger. Systematic uncertainties are substantial in superheavy elements, and their behavior as a function of proton and neutron numbers contains a large random component. Benchmarking the functionals to the experimental data on fission barriers in the actinides makes it possible to reduce the systematic theoretical uncertainties for the inner fission barriers of unknown superheavy elements. However, even then they on average increase on moving away from the region where the benchmarking was performed. In addition, a comparison with the results of non-relativistic approaches is performed in order to define the full systematic theoretical uncertainties over the state-of-the-art models. Even for the models benchmarked in the actinides, the difference in the inner fission barrier height of some superheavy elements reaches 5-6 MeV. This uncertainty in the barrier heights translates into huge (many tens of orders of magnitude) uncertainties in the spontaneous fission half-lives.
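
    The systematic-versus-statistical distinction above can be sketched in a few lines; the barrier heights below are invented for illustration only, and the spread-across-models measure is one simple convention for quantifying a systematic uncertainty.

        import numpy as np

        # Hypothetical inner fission barrier heights (MeV) for one superheavy
        # nucleus as predicted by several covariant functionals (invented values).
        predictions = {"F1": 7.1, "F2": 9.4, "F3": 6.2, "F4": 11.0, "F5": 8.3}

        heights = np.array(list(predictions.values()))
        print(f"mean barrier           : {heights.mean():.1f} MeV")
        print(f"systematic uncertainty : {heights.max() - heights.min():.1f} MeV (spread across models)")
        # A statistical uncertainty would instead come from propagating each
        # functional's parameter uncertainties; the abstract reports it as
        # the smaller of the two.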

  11. Disease Extinction Versus Persistence in Discrete-Time Epidemic Models.

    PubMed

    van den Driessche, P; Yakubu, Abdul-Aziz

    2018-04-12

    We focus on discrete-time infectious disease models in populations that are governed by constant, geometric, Beverton-Holt or Ricker demographic equations, and give a method for computing the basic reproduction number, R_0. When R_0 < 1 and the demographic population dynamics are asymptotically constant or under geometric growth (non-oscillatory), we prove global asymptotic stability of the disease-free equilibrium of the disease models. Under the same demographic assumptions, when R_0 > 1, we prove uniform persistence of the disease. We apply our theoretical results to specific discrete-time epidemic models formulated for SEIR infections, cholera in humans and anthrax in animals. Our simulations show that a unique endemic equilibrium of each of the three specific disease models is asymptotically stable whenever R_0 > 1.
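
    A minimal discrete-time sketch of the threshold behaviour described above, using an SIS update with a constant population and an exponential escape probability; the model form and parameters are illustrative, not the authors' systems. Near the disease-free state each infective causes about beta new cases per step and remains infectious for 1/gamma steps, so R0 is roughly beta/gamma.

        import math

        def simulate_sis(beta, gamma, N=10_000, I0=10, steps=300):
            """Discrete-time SIS: new infections via an exponential escape
            probability, recovery at rate gamma (illustrative model)."""
            I = float(I0)
            for _ in range(steps):
                S = N - I
                new_inf = S * (1.0 - math.exp(-beta * I / N))  # expected new cases
                I = (1.0 - gamma) * I + new_inf
            return I

        gamma = 0.2
        for beta in (0.1, 0.3):   # R0 = 0.5 (extinction) vs R0 = 1.5 (persistence)
            print(f"R0 = {beta / gamma:.1f}: I after 300 steps = {simulate_sis(beta, gamma):8.1f}")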

  12. Responsibility and age-related dementia.

    PubMed

    Frantik, Petr

    2018-05-01

    This article identifies the assumption of responsibility as a basic need of human beings and applies the concept specifically to older people with dementia or Alzheimer's disease. It suggests a two-level concept of responsibility, based on the approach of discourse ethicist Karl-Otto Apel, as a promising approach to recognizing human diversity while at the same time respecting people's equal rights to participate in discourse. This concept can serve as a theoretical starting point for the construction of individually adapted types of responsibility. Furthermore, the article describes practical ideas (primarily the practice of doll therapy) that can enable people with dementia or Alzheimer's disease to assume responsibility. Direct communication and a reflective, sensitive consideration of each individual case are identified as important prerequisites for the inclusion of elderly people with dementia. © 2018 John Wiley & Sons Ltd.

  13. The implantation of life on Mars - Feasibility and motivation

    NASA Technical Reports Server (NTRS)

    Haynes, Robert H.; Mckay, Christopher P.

    1992-01-01

    Scientific concepts are reviewed regarding the potential formation and development of a life-bearing environment on Mars, and a potential ecopoiesis scenario is given. The development of the earth's biosphere is defined, and the major assumptions related to the formation of Martian life are listed. Three basic phases are described for the life-implantation concept which include determining whether sufficient quantities of volatiles are available, engineering the warming of the planet, and implanting microbial communities if necessary. Warming the planet theoretically releases liquid H2O and produces a thick CO2 atmosphere, and the implantation of biological communities is only necessary if no indigenous microbes emerge. It is concluded that a feasibility study is required to assess the possibilities of implanting life on Mars more concretely.

  14. Dynamics of an epidemic model with quarantine on scale-free networks

    NASA Astrophysics Data System (ADS)

    Kang, Huiyan; Liu, Kaihui; Fu, Xinchu

    2017-12-01

    Quarantine strategies are frequently used to control or reduce the transmission risks of epidemic diseases such as SARS, tuberculosis and cholera. In this paper, we formulate a susceptible-exposed-infected-quarantined-recovered model on a scale-free network that incorporates the births and deaths of individuals. Considering that infectivity is related to the degrees of infectious nodes, we introduce the quarantine rate as a function of degree and quantify the basic reproduction number, which is shown to depend on parameters such as the quarantine rate, the infectivity and the network structure. A theoretical result further indicates that network heterogeneity and higher infectivity raise the disease transmission risk, while quarantine measures contribute to the prevention of epidemic spreading. Meanwhile, the contact assumption between susceptibles and infectives may affect disease transmission. Furthermore, by constructing appropriate Lyapunov functions, we prove that the basic reproduction number serves as a threshold value for the global stability of the disease-free and endemic equilibria and for the uniform persistence of the disease on the network. Finally, some numerical simulations are presented to confirm and complement our analytical results.
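
    One way to see the competing effects described above is a mean-field threshold computation. The formula below, with R0 proportional to the degree-weighted sum of k^2 P(k) over a degree-dependent removal rate, is a plausible annealed-network form chosen for illustration, not the paper's exact expression; the degree profile of the quarantine rate is likewise an assumption.

        import numpy as np

        ks = np.arange(3, 301, dtype=float)
        pk = ks ** -2.5
        pk /= pk.sum()                  # truncated power-law degree distribution
        k_mean = (ks * pk).sum()

        beta, gamma = 0.05, 0.2         # transmission and recovery rates

        def r0(delta0):
            """Illustrative mean-field R0 with a degree-dependent quarantine
            rate delta(k); quarantine is assumed more aggressive for hubs."""
            delta_k = delta0 * ks / ks.max()
            return beta / k_mean * np.sum(ks ** 2 * pk / (gamma + delta_k))

        for d0 in (0.0, 0.2, 0.5):
            print(f"quarantine strength {d0:.1f}: R0 = {r0(d0):.2f}")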

  15. [The Basic-Symptom Concept and its Influence on Current International Research on the Prediction of Psychoses].

    PubMed

    Schultze-Lutter, F

    2016-12-01

    The early detection of psychoses has become increasingly relevant in research and clinical practice. Alongside the ultra-high risk (UHR) approach, which targets an immediate risk of developing frank psychosis, the basic symptom approach, which targets the earliest possible detection of the developing disorder, is being used increasingly worldwide. The present review gives an introduction to the development and basic assumptions of the basic symptom concept, summarizes the results of studies on the specificity of basic symptoms for psychoses in different age groups as well as studies of their psychosis-predictive value, and gives an outlook on future developments. Moreover, a brief introduction is given to the first recent imaging studies, which support one of the main assumptions of the basic symptom concept, i.e., that basic symptoms are the most immediate phenomenological expression of the cerebral aberrations underlying the development of psychosis. From this it is concluded that basic symptoms may provide important information for future neurobiological research on the etiopathology of psychoses. © Georg Thieme Verlag KG Stuttgart · New York.

  16. Re-emerging conceptual integration: commentary on Berkowitz's "on the consideration of automatic as well as controlled psychological processes in aggression".

    PubMed

    Pahlavan, Farzaneh

    2008-01-01

    In recent decades, researchers in various areas of psychology have challenged the claims of a single mode of information processing and developed dual-process models of social behavior. Although these theories differ on a number of dimensions, they all share the basic assumption that two different modes of information processing operate in decision making and coping behavior. In essence, the common distinction in these perspectives is between controlled vs. automatic, conscious vs. unconscious, and affective vs. cognitive modes of processing. The purpose of Berkowitz's article is to go beyond the notion of automatic processes and to use classic notions of conditioning and displacement to explain aggressive behavior. I assert that an explanatory framework for the psychology of aggression must be anchored not only in new but also in classic theoretical paradigms. However, progress in psychology does not rest solely on the accumulation of theoretical insights; it demands a large body of empirical facts, with attention to incongruities, discordances, and conceptual clarifications. Copyright 2008 Wiley-Liss, Inc.

  17. The Vocational Turn in Adult Literacy Education and the Impact of the International Adult Literacy Survey

    NASA Astrophysics Data System (ADS)

    Druine, Nathalie; Wildemeersch, Danny

    2000-09-01

    The authors critically examine some of the underlying epistemological and theoretical assumptions of the IALS. In doing so, they distinguish between two basic orientations towards literacy. The first, the standard approach (of which IALS is an example), subscribes to the possibility of measuring literacy as abstract, cognitive skills, and endorses the claim that there is an important relationship between literacy skills and economic success in the so-called 'knowledge society.' The second, a socio-cultural approach, insists on the contextual and power-related character of people's literacy practices. The authors further illustrate that the assumptions of the IALS are rooted in a neo-liberal ideology that forces all members of society to adjust to the exigencies of the globalised economy. In the current, contingent conditions of the risk society, however, it does not seem very wise to limit the learning of adults to enhancing labour-market competencies. Adult education should relate to the concrete literacy practices people already have in their lives. It should make its learners co-responsible actors in their own learning process and participants in a democratic debate on defining the kind of society people want to build.

  18. Numerical distance effect size is a poor metric of approximate number system acuity.

    PubMed

    Chesney, Dana

    2018-04-12

    Individual differences in the ability to compare and evaluate nonsymbolic numerical magnitudes-approximate number system (ANS) acuity-are emerging as an important predictor in many research areas. Unfortunately, recent empirical studies have called into question whether a historically common ANS-acuity metric-the size of the numerical distance effect (NDE size)-is an effective measure of ANS acuity. NDE size has been shown to frequently yield divergent results from other ANS-acuity metrics. Given these concerns and the measure's past popularity, it behooves us to question whether the use of NDE size as an ANS-acuity metric is theoretically supported. This study seeks to address this gap in the literature by using modeling to test the basic assumption underpinning use of NDE size as an ANS-acuity metric: that larger NDE size indicates poorer ANS acuity. This assumption did not hold up under test. Results demonstrate that the theoretically ideal relationship between NDE size and ANS acuity is not linear, but rather resembles an inverted J-shaped distribution, with the inflection points varying based on precise NDE task methodology. Thus, depending on specific methodology and the distribution of ANS acuity in the tested population, positive, negative, or null correlations between NDE size and ANS acuity could be predicted. Moreover, peak NDE sizes would be found for near-average ANS acuities on common NDE tasks. This indicates that NDE size has limited and inconsistent utility as an ANS-acuity metric. Past results should be interpreted on a case-by-case basis, considering both specifics of the NDE task and expected ANS acuity of the sampled population.
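
    The modeling argument can be reproduced in miniature. The sketch below assumes Gaussian magnitude representations with scalar variability (Weber fraction w) and scores NDE size as the accuracy gap between far and close pairs; these modeling choices are illustrative, not the study's exact task parameters. Both very poor and very good acuities compress the gap (floor at chance, ceiling at perfect accuracy), so NDE size peaks at intermediate acuity rather than growing monotonically.

        from math import sqrt
        from statistics import NormalDist

        phi = NormalDist().cdf

        def p_correct(n1, n2, w):
            """P(correct) for an n1-vs-n2 comparison under Gaussian
            representations with scalar variability (SD = w * n)."""
            return phi(abs(n1 - n2) / (w * sqrt(n1 * n1 + n2 * n2)))

        close_pairs = [(9, 10), (11, 12)]     # small numerical distance
        far_pairs = [(5, 10), (6, 12)]        # large numerical distance

        print("  w    NDE size (far acc - close acc)")
        for w in (0.05, 0.15, 0.30, 0.60, 1.20):   # better to worse ANS acuity
            close = sum(p_correct(a, b, w) for a, b in close_pairs) / len(close_pairs)
            far = sum(p_correct(a, b, w) for a, b in far_pairs) / len(far_pairs)
            print(f"{w:5.2f}   {far - close:+.3f}")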

  19. Can Basic Research on Children and Families Be Useful for the Policy Process?

    ERIC Educational Resources Information Center

    Moore, Kristin A.

    Based on the assumption that basic science is the crucial building block for technological and biomedical progress, this paper examines the relevance for public policy of basic demographic and behavioral sciences research on children and families. The characteristics of basic research as they apply to policy making are explored. First, basic…

  20. Validation of the underlying assumptions of the quality-adjusted life-years outcome: results from the ECHOUTCOME European project.

    PubMed

    Beresniak, Ariel; Medina-Lara, Antonieta; Auray, Jean Paul; De Wever, Alain; Praet, Jean-Claude; Tarricone, Rosanna; Torbica, Aleksandra; Dupont, Danielle; Lamure, Michel; Duru, Gerard

    2015-01-01

    Quality-adjusted life-years (QALYs) have been used since the 1980s as a standard health outcome measure for conducting cost-utility analyses, which are often inadequately labeled as 'cost-effectiveness analyses'. This synthetic outcome, which combines the quantity of life lived with its quality expressed as a preference score, is currently recommended as reference case by some health technology assessment (HTA) agencies. While critics of the QALY approach have expressed concerns about equity and ethical issues, surprisingly, very few have tested the basic methodological assumptions supporting the QALY equation so as to establish its scientific validity. The main objective of the ECHOUTCOME European project was to test the validity of the underlying assumptions of the QALY outcome and its relevance in health decision making. An experiment has been conducted with 1,361 subjects from Belgium, France, Italy, and the UK. The subjects were asked to express their preferences regarding various hypothetical health states derived from combining different health states with time durations in order to compare observed utility values of the couples (health state, time) and calculated utility values using the QALY formula. Observed and calculated utility values of the couples (health state, time) were significantly different, confirming that preferences expressed by the respondents were not consistent with the QALY theoretical assumptions. This European study contributes to establishing that the QALY multiplicative model is an invalid measure. This explains why costs/QALY estimates may vary greatly, leading to inconsistent recommendations relevant to providing access to innovative medicines and health technologies. HTA agencies should consider other more robust methodological approaches to guide reimbursement decisions.
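
    For orientation, the construct under test is the standard multiplicative QALY form, which combines a preference weight V(h) for health state h with the time t spent in it (generic notation; validity of the aggregate requires, among other conditions, that preferences over duration be linear and independent of the health state):

        \text{QALYs} \;=\; \sum_{i} V(h_i)\, t_i,
        \qquad \text{which presupposes} \qquad
        U(h, t) \;=\; V(h) \times t .

    The experiment's comparison of directly elicited U(h, t) values against this product form is what the reported inconsistency refers to.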

  1. Shaping the use of psychotropic medicines in nursing homes: A qualitative study on organisational culture.

    PubMed

    Sawan, Mouna; Jeon, Yun-Hee; Chen, Timothy F

    2018-04-01

    Psychotropic medicines have limited efficacy in the management of behavioural and psychological disturbances, yet they are commonly used in nursing homes. Organisational culture is an important consideration influencing use of psychotropic medicines. Schein's theory elucidates that organisational culture is underpinned by basic assumptions, which are the taken-for-granted beliefs driving organisational members' behaviour and practices. By exploring the basic assumptions of culture we are able to find explanations for why psychotropic medicines are prescribed contrary to standards. A qualitative study guided by Schein's theory was conducted using semi-structured interviews with 40 staff representing a broad range of roles from eight nursing homes. Findings from the study suggest that two basic assumptions influenced the use of psychotropic medicines: locus of control and necessity for efficiency or comprehensiveness. Locus of control pertained to whether staff believed they could control decisions when facing negative work experiences. Necessity for efficiency or comprehensiveness concerned how much time and effort was spent on a given task. Participants arrived at decisions to use psychotropic medicines that were inconsistent with ideal standards when they believed they were helpless to do the right thing by the resident and it was necessary to restrict time on a given task. Basic assumptions tended to provide the rationale for staff to use psychotropic medicines when it was not compatible with standards. Organisational culture is an important factor that should be addressed to optimise psychotropic medicine use. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. Physical context for theoretical approaches to sediment transport magnitude-frequency analysis in alluvial channels

    NASA Astrophysics Data System (ADS)

    Sholtes, Joel; Werbylo, Kevin; Bledsoe, Brian

    2014-10-01

    Theoretical approaches to magnitude-frequency analysis (MFA) of sediment transport in channels couple continuous flow probability density functions (PDFs) with power law flow-sediment transport relations (rating curves) to produce closed-form equations relating MFA metrics such as the effective discharge, Qeff, and fraction of sediment transported by discharges greater than Qeff, f+, to statistical moments of the flow PDF and rating curve parameters. These approaches have proven useful in understanding the theoretical drivers behind the magnitude and frequency of sediment transport. However, some of their basic assumptions and findings may not apply to natural rivers and streams with more complex flow-sediment transport relationships or management and design scenarios, which have finite time horizons. We use simple numerical experiments to test the validity of theoretical MFA approaches in predicting the magnitude and frequency of sediment transport. Median values of Qeff and f+ generated from repeated, synthetic, finite flow series diverge from those produced with theoretical approaches using the same underlying flow PDF. The closed-form relation for f+ is a monotonically increasing function of flow variance. However, using finite flow series, we find that f+ increases with flow variance to a threshold that increases with flow record length. By introducing a sediment entrainment threshold, we present a physical mechanism for the observed diverging relationship between Qeff and flow variance in fine and coarse-bed channels. Our work shows that through complex and threshold-driven relationships sediment transport mode, channel morphology, flow variance, and flow record length all interact to influence estimates of what flow frequencies are most responsible for transporting sediment in alluvial channels.
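
    A minimal version of the numerical experiment described above, assuming a lognormal flow PDF and a power-law rating curve Qs = a*Q**b: for that combination the closed-form effective discharge is the mode of Q**b * f(Q), i.e. Qeff = exp(mu + (b - 1) * sigma**2). Parameter values are illustrative; the point is that medians from short synthetic records drift from the closed form.

        import numpy as np

        rng = np.random.default_rng(42)
        mu, sigma, b = 1.0, 0.8, 1.5          # lognormal flow parameters; rating exponent

        q_eff_theory = np.exp(mu + (b - 1.0) * sigma ** 2)   # mode of Q**b * f(Q)

        def empirical_qeff(n_days):
            """Effective discharge from one finite synthetic flow series:
            the log-spaced bin carrying the most total transport."""
            q = rng.lognormal(mu, sigma, n_days)
            load = q ** b                      # rating Qs = a*Q**b (the constant a cancels)
            bins = np.logspace(np.log10(q.min()), np.log10(q.max()), 40)
            idx = np.digitize(q, bins)
            totals = np.bincount(idx, weights=load, minlength=len(bins) + 1)
            return bins[min(totals.argmax(), len(bins) - 1)]

        for n_days in (365, 36_500):           # 1-year vs 100-year records
            med = np.median([empirical_qeff(n_days) for _ in range(200)])
            print(f"{n_days:6d} days: median Qeff = {med:5.2f} (closed form {q_eff_theory:.2f})")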

  3. Statistical Issues for Calculating Reentry Hazards

    NASA Technical Reports Server (NTRS)

    Matney, Mark; Bacon, John

    2016-01-01

    A number of statistical tools have been developed over the years for assessing the risk posed by reentering objects to human populations. These tools make use of the characteristics (e.g., mass, shape, size) of debris that are predicted by aerothermal models to survive reentry. This information, combined with information on the expected ground path of the reentry, is used to compute the probability that one or more of the surviving debris fragments might hit a person on the ground and cause one or more casualties. The statistical portion of this analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and on how to use that information to arrive at realistic risk numbers. This inevitably involves assumptions that simplify the problem and make it tractable, but it is often difficult to test the accuracy and applicability of these assumptions. This paper builds on previous IAASS work to re-examine many of these theoretical assumptions, including the mathematical basis for the hazard calculations, and to outline the conditions under which the simplifying assumptions hold. The study also employs empirical and theoretical information to test these assumptions and makes recommendations on how to improve the accuracy of these calculations in the future.
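
    The core of the casualty computation can be shown in a few lines, assuming fragments with known casualty areas and a uniform population density under the footprint; treating potential impacts as a Poisson process then gives the probability of at least one casualty from the casualty expectation. All numbers are illustrative.

        import math

        # Illustrative inputs: casualty areas (m^2) of fragments predicted to
        # survive reentry, and population density under the footprint.
        casualty_areas = [0.8, 0.5, 1.2, 0.3]   # per-fragment casualty area, m^2
        pop_density = 15e-6                     # ~15 people per km^2, in people/m^2

        expected_casualties = pop_density * sum(casualty_areas)
        p_one_or_more = 1.0 - math.exp(-expected_casualties)  # Poisson assumption

        print(f"E[casualties]   = {expected_casualties:.2e}")
        print(f"P(>=1 casualty) = {p_one_or_more:.2e}")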

  4. Simulation of the temperature field in a swirl pulverized coal boiler

    NASA Astrophysics Data System (ADS)

    Lv, Wei; Wu, Weifeng; Chen, Chen; Chen, Weifeng; Qi, Guoli; Zhang, Songsong

    2018-02-01

    In order to meet the goals of energy saving, emission reduction and efficient energy utilization, a 58 MW swirl pulverized coal boiler is taken as the research object and a three-dimensional model of the furnace is established. Following CFD principles, basic assumptions and boundary conditions are selected, the temperature field in the furnace is solved numerically for six operating conditions, and the temperature distribution in the furnace is analyzed. The calculated temperature for condition 1 is in good agreement with the experimental data, with an error of less than 10%; these results provide a theoretical basis for the subsequent calculations. Comparison of the results for the six conditions shows that condition 3 is the best operating condition for the pulverized coal boiler.

  5. Artificial Intelligence: Underlying Assumptions and Basic Objectives.

    ERIC Educational Resources Information Center

    Cercone, Nick; McCalla, Gordon

    1984-01-01

    Presents perspectives on methodological assumptions underlying research efforts in artificial intelligence (AI) and charts activities, motivations, methods, and current status of research in each of the major AI subareas: natural language understanding; computer vision; expert systems; search, problem solving, planning; theorem proving and logic…

  6. Teaching Practices: Reexamining Assumptions.

    ERIC Educational Resources Information Center

    Spodek, Bernard, Ed.

    This publication contains eight papers, selected from papers presented at the Bicentennial Conference on Early Childhood Education, that discuss different aspects of teaching practices. The first two chapters reexamine basic assumptions underlying the organization of curriculum experiences for young children. Chapter 3 discusses the need to…

  7. The psychiatric interview: validity, structure, and subjectivity.

    PubMed

    Nordgaard, Julie; Sass, Louis A; Parnas, Josef

    2013-06-01

    There is a glaring gap in the psychiatric literature concerning the nature of psychiatric symptoms and signs, and a corresponding lack of epistemological discussion of psycho-diagnostic interviewing. Contemporary clinical neuroscience relies heavily on fully structured interviews that are historically rooted in logical positivism and behaviorism. These theoretical approaches decisively marked the so-called "operational revolution in psychiatry" that led to the creation of DSM-III. This paper attempts to examine the theoretical assumptions that underlie the use of a fully structured psychiatric interview. We address the ontological status of pathological experience; the notions of symptom, sign, prototype and Gestalt; and the necessary second-person processes involved in converting the patient's experience (originally lived in the first-person perspective) into an "objective" (third-person), actionable format used for classification, treatment, and research. Our central thesis is that psychiatry targets the phenomena of consciousness, which, unlike somatic symptoms and signs, cannot be grasped on the analogy with material thing-like objects. We claim that in order to perform faithful distinctions in this particular domain, we need a more adequate approach, that is, an approach guided by phenomenologically informed considerations. Our theoretical discussion draws upon clinical examples derived from structured and semi-structured interviews. We conclude that the fully structured interview is neither theoretically adequate nor practically valid for obtaining psycho-diagnostic information. Failure to address these basic issues may have contributed to the current state of malaise in the study of psychopathology.

  8. Humans display a reduced set of consistent behavioral phenotypes in dyadic games.

    PubMed

    Poncela-Casasnovas, Julia; Gutiérrez-Roig, Mario; Gracia-Lázaro, Carlos; Vicens, Julian; Gómez-Gardeñes, Jesús; Perelló, Josep; Moreno, Yamir; Duch, Jordi; Sánchez, Angel

    2016-08-01

    Socially relevant situations that involve strategic interactions are widespread among animals and humans alike. To study these situations, theoretical and experimental research has adopted a game theoretical perspective, generating valuable insights about human behavior. However, most of the results reported so far have been obtained from a population perspective and considered one specific conflicting situation at a time. This makes it difficult to extract conclusions about the consistency of individuals' behavior when facing different situations and to define a comprehensive classification of the strategies underlying the observed behaviors. We present the results of a lab-in-the-field experiment in which subjects face four different dyadic games, with the aim of establishing general behavioral rules dictating individuals' actions. By analyzing our data with an unsupervised clustering algorithm, we find that all the subjects conform, with a large degree of consistency, to a limited number of behavioral phenotypes (envious, optimist, pessimist, and trustful), with only a small fraction of undefined subjects. We also discuss the possible connections to existing interpretations based on a priori theoretical approaches. Our findings provide a relevant contribution to the experimental and theoretical efforts toward the identification of basic behavioral phenotypes in a wider set of contexts without aprioristic assumptions regarding the rules or strategies behind actions. From this perspective, our work contributes to a fact-based approach to the study of human behavior in strategic situations, which could be applied to simulating societies, policy-making scenario building, and even a variety of business applications.
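
    The analysis style described above can be sketched with synthetic data: summarize each subject as a vector of cooperation frequencies across the four games and cluster the vectors. The use of plain k-means and the invented profiles below are illustrative assumptions, not the study's actual pipeline or data.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic subjects: rows are cooperation rates in four dyadic games,
        # drawn around three invented behavioral profiles.
        profiles = np.array([[0.9, 0.8, 0.9, 0.8],
                             [0.2, 0.3, 0.2, 0.3],
                             [0.8, 0.2, 0.7, 0.3]])
        data = np.clip(np.vstack([p + 0.1 * rng.standard_normal((40, 4))
                                  for p in profiles]), 0, 1)

        def kmeans(x, k, iters=50):
            """Plain Lloyd's algorithm, enough for the sketch."""
            cents = x[rng.choice(len(x), k, replace=False)]
            for _ in range(iters):
                labels = ((x[:, None, :] - cents[None]) ** 2).sum(-1).argmin(axis=1)
                cents = np.array([x[labels == j].mean(axis=0) if (labels == j).any()
                                  else cents[j] for j in range(k)])
            return labels, cents

        labels, cents = kmeans(data, k=3)
        print(np.round(cents, 2))    # recovered profiles, close to the three above
        print(np.bincount(labels))   # cluster sizes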

  9. Non-Normality and Testing that a Correlation Equals Zero

    ERIC Educational Resources Information Center

    Levy, Kenneth J.

    1977-01-01

    The importance of the assumption of normality for testing that a bivariate normal correlation equals zero is examined. Both empirical and theoretical evidence suggest that such tests are robust with respect to violation of the normality assumption. (Author/JKS)

  10. Power, Revisited

    ERIC Educational Resources Information Center

    Roscigno, Vincent J.

    2011-01-01

    Power is a core theoretical construct in the field with amazing utility across substantive areas, levels of analysis and methodologies. Yet, its use along with associated assumptions--assumptions surrounding constraint vs. action and specifically organizational structure and rationality--remain problematic. In this article, and following an…

  11. How biological background assumptions influence scientific risk evaluation of stacked genetically modified plants: an analysis of research hypotheses and argumentations.

    PubMed

    Rocca, Elena; Andersen, Fredrik

    2017-08-14

    Scientific risk evaluations are constructed by specific evidence, value judgements and biological background assumptions. The latter are the framework-setting suppositions we apply in order to understand some new phenomenon. That background assumptions co-determine choice of methodology, data interpretation, and choice of relevant evidence is an uncontroversial claim in modern basic science. Furthermore, it is commonly accepted that, unless explicated, disagreements in background assumptions can lead to misunderstanding as well as miscommunication. Here, we extend the discussion on background assumptions from basic science to the debate over genetically modified (GM) plants risk assessment. In this realm, while the different political, social and economic values are often mentioned, the identity and role of background assumptions at play are rarely examined. We use an example from the debate over risk assessment of stacked genetically modified plants (GM stacks), obtained by applying conventional breeding techniques to GM plants. There are two main regulatory practices of GM stacks: (i) regulate as conventional hybrids and (ii) regulate as new GM plants. We analyzed eight papers representative of these positions and found that, in all cases, additional premises are needed to reach the stated conclusions. We suggest that these premises play the role of biological background assumptions and argue that the most effective way toward a unified framework for risk analysis and regulation of GM stacks is by explicating and examining the biological background assumptions of each position. Once explicated, it is possible to either evaluate which background assumptions best reflect contemporary biological knowledge, or to apply Douglas' 'inductive risk' argument.

  12. Statistical Issues for Uncontrolled Reentry Hazards

    NASA Technical Reports Server (NTRS)

    Matney, Mark

    2008-01-01

    A number of statistical tools have been developed over the years for assessing the risk posed by reentering objects to human populations. These tools make use of the characteristics (e.g., mass, shape, size) of debris that are predicted by aerothermal models to survive reentry. The statistical tools use this information to compute the probability that one or more of the surviving debris might hit a person on the ground and cause one or more casualties. The statistical portion of the analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and on how to use that information to arrive at realistic risk numbers. This inevitably involves assumptions that simplify the problem and make it tractable, but it is often difficult to test the accuracy and applicability of these assumptions. This paper looks at a number of these theoretical assumptions, examining the mathematical basis for the hazard calculations and outlining the conditions under which the simplifying assumptions hold. In addition, this paper outlines some new tools for assessing ground hazard risk in useful ways. This study also makes use of a database of known uncontrolled reentry locations measured by the United States Department of Defense. By using data from objects that were in orbit more than 30 days before reentry, sufficient time is allowed for the orbital parameters to be randomized in the way the models assume. The predicted ground footprint distributions of these objects are based on the theory that their orbits behave basically like simple Kepler orbits. However, there are a number of factors - including the effects of gravitational harmonics, the effects of the Earth's equatorial bulge on the atmosphere, and the rotation of the Earth and atmosphere - that could cause them to diverge from simple Kepler orbit behavior and change the ground footprints. The measured latitude and longitude distributions of these objects provide data that can be directly compared with the predicted distributions, providing a fundamental empirical test of the model assumptions.
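
    The kind of empirical check described can be made concrete for the latitude distribution. For a circular orbit of inclination i, the fraction of time spent near latitude phi has the standard closed form f(phi) = cos(phi) / (pi * sqrt(sin(i)^2 - sin(phi)^2)), which a sampled Kepler-like ground track (uniform argument of latitude) should reproduce; measured reentry latitudes can then be compared against the same curve. The inclination below is illustrative.

        import numpy as np

        rng = np.random.default_rng(1)
        inc = np.radians(51.6)                 # illustrative inclination

        # Sampled ground track: uniform argument of latitude on a circular orbit.
        u = rng.uniform(0.0, 2.0 * np.pi, 1_000_000)
        lat_deg = np.degrees(np.arcsin(np.sin(inc) * np.sin(u)))

        edges = np.arange(-55.0, 56.0, 10.0)
        hist, _ = np.histogram(lat_deg, bins=edges, density=True)   # per degree

        centers = 0.5 * (edges[:-1] + edges[1:])
        phi = np.radians(centers)
        analytic = np.cos(phi) / (np.pi * np.sqrt(np.sin(inc) ** 2 - np.sin(phi) ** 2))
        analytic *= np.pi / 180.0              # convert per-radian to per-degree

        for c, a, h in zip(centers, analytic, hist):
            print(f"lat {c:+5.1f} deg: analytic {a:.4f}   sampled {h:.4f}")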

  13. Laser-induced breakdown spectroscopy (LIBS), part I: review of basic diagnostics and plasma-particle interactions: still-challenging issues within the analytical plasma community.

    PubMed

    Hahn, David W; Omenetto, Nicoló

    2010-12-01

    Laser-induced breakdown spectroscopy (LIBS) has become a very popular analytical method in the last decade in view of some of its unique features such as applicability to any type of sample, practically no sample preparation, remote sensing capability, and speed of analysis. The technique has a remarkably wide applicability in many fields, and the number of applications is still growing. From an analytical point of view, the quantitative aspects of LIBS may be considered its Achilles' heel, first due to the complex nature of the laser-sample interaction processes, which depend upon both the laser characteristics and the sample material properties, and second due to the plasma-particle interaction processes, which are space and time dependent. Together, these may cause undesirable matrix effects. Ways of alleviating these problems rely upon the description of the plasma excitation-ionization processes through the use of classical equilibrium relations and therefore on the assumption that the laser-induced plasma is in local thermodynamic equilibrium (LTE). Even in this case, the transient nature of the plasma and its spatial inhomogeneity need to be considered and overcome in order to justify the theoretical assumptions made. This first article focuses on the basic diagnostics aspects and presents a review of the past and recent LIBS literature pertinent to this topic. Previous research on non-laser-based plasma literature, and the resulting knowledge, is also emphasized. The aim is, on one hand, to make the readers aware of such knowledge and on the other hand to trigger the interest of the LIBS community, as well as the larger analytical plasma community, in attempting some diagnostic approaches that have not yet been fully exploited in LIBS.
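
    One standard LTE diagnostic alluded to here is the Boltzmann plot: under LTE, ln(I * lambda / (g * A)) is linear in the upper-level energy E_k with slope -1/(k_B * T). The sketch below recovers a known temperature from synthetic line data; all line parameters are invented for illustration.

        import numpy as np

        KB_EV = 8.617333e-5                  # Boltzmann constant, eV/K
        T_TRUE = 11_000.0                    # synthetic plasma temperature, K
        rng = np.random.default_rng(3)

        # Synthetic emission lines: upper-level energies (eV), g*A products
        # and wavelengths (nm) -- invented values.
        E_k = np.array([3.1, 3.8, 4.4, 5.1, 5.8])
        gA = np.array([2.0e8, 5.5e8, 1.1e8, 3.0e8, 7.0e7])
        lam = np.array([510.0, 480.0, 460.0, 430.0, 410.0])

        # LTE line intensities, I ~ (g*A/lambda) * exp(-E_k/kT), with 3% noise.
        I = gA / lam * np.exp(-E_k / (KB_EV * T_TRUE)) * rng.normal(1.0, 0.03, E_k.size)

        slope, _ = np.polyfit(E_k, np.log(I * lam / gA), 1)   # Boltzmann-plot fit
        print(f"fitted T = {-1.0 / (KB_EV * slope):,.0f} K (true {T_TRUE:,.0f} K)")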

  14. Belief Structures about People Held by Selected Graduate Students.

    ERIC Educational Resources Information Center

    Dole, Arthur A.; And Others

    Wrightsman has established that assumptions about human nature distinguish religious, occupational, political, gender, and other groups, and that they predict behavior in structured situations. Hjelle and Ziegler proposed a set of nine basic bipolar assumptions about the nature of people: freedom-determinism; rationality-irrationality;…

  15. A utility-theoretic model for QALYs and willingness to pay.

    PubMed

    Klose, Thomas

    2003-01-01

    Despite the widespread use of quality-adjusted life years (QALYs) in economic evaluation studies, their utility-theoretic foundation remains unclear. A model for preferences over health, money, and time is presented in this paper. Under the usual assumptions of the original QALY model, an additive separable representation of the utilities in different periods exists. In contrast to the usual assumption that QALY weights depend solely on aspects of health-related quality of life, wealth-standardized QALY weights might vary with the wealth level in the presented extension of the original QALY model, resulting in an inconsistent measurement of QALYs. Further assumptions are presented to make the measurement of QALYs consistent with lifetime preferences over health and money. Even under these strict assumptions, QALYs and WTP (which can also be defined in this utility-theoretic model) are not equivalent preference-based measures of the effects of health technologies at the individual level. The results suggest that the individual WTP per QALY can depend on the magnitude of the QALY gain as well as on the disease burden when health influences the marginal utility of wealth. Further research seems indicated on this structural aspect of preferences over health and wealth and on quantifying its impact. Copyright 2002 John Wiley & Sons, Ltd.

  16. Bearing witness: an existential position in caring.

    PubMed

    Arman, Maria

    2007-12-01

    A basic assumption of the study is that perceiving a person's deepest needs and desires, the wish to be on hand for another person, and the attempt to do so have, in an ontological sense, the power to bear witness to goodness and eternity. The study was based on the theoretical foundation of a caring science view of suffering, as well as the ethics of the philosopher Lévinas. The aim was to explore and clinically validate nuances of witnessing as a caring act. A Socratic dialogue was performed and an interpretive (hermeneutic) method was employed in this study. The Socratic dialogue with four nurses in palliative care focused on and analysed one clinical example of witnessing in palliative care. The findings rest on the assumptions jointly formulated by the participating nurses: To be a witness you have to be with the patient and refer back to him or her what you have seen, but also to act in accordance with what you have perceived. In the moment you witness, a window is opened onto the unknown; you become vulnerable as a caregiver and require courage. Being a witness encompasses existential and spiritual aspects; being a fellow human being, having a heart-to-heart relationship, is a wilful act on the part of the nurse. Our theoretical discussion focuses on the language of the body, courage as a bridge to an existential encounter, and the alleviation of patients' suffering through caregivers' witnessing. A concluding aspect is that being a witness may bring a new understanding of life in the face of death and suffering. The existential position of being a witness requires courage of the caregiver because of its transformative prospect, but may ultimately enrich both parties' inner life of shared meaning.

  17. Thin Skin, Deep Damage: Addressing the Wounded Writer in the Basic Writing Course

    ERIC Educational Resources Information Center

    Boone, Stephanie D.

    2010-01-01

    How do institutions and their writing faculties see basic writers? What assumptions about these writers drive writing curricula, pedagogies and assessments? How do writing programs enable or frustrate these writers? How might course design facilitate the outcomes we envision? This article argues that, in order to teach basic writers to enter…

  18. Writing Partners: Service Learning as a Route to Authority for Basic Writers

    ERIC Educational Resources Information Center

    Gabor, Catherine

    2009-01-01

    This article looks at best practices in basic writing instruction in terms of non-traditional audiences and writerly authority. Much conventional wisdom discourages participation in service-learning projects for basic writers because of the assumption that their writing is not yet ready to "go public." Countering this line of thinking, the author…

  19. Introduction to the Application of Web-Based Surveys.

    ERIC Educational Resources Information Center

    Timmerman, Annemarie

    This paper discusses some basic assumptions and issues concerning web-based surveys. Discussion includes: assumptions regarding cost and ease of use; disadvantages of web-based surveys, concerning the inability to compensate for four common errors of survey research: coverage error, sampling error, measurement error and nonresponse error; and…

  1. Categorization: The View from Animal Cognition

    PubMed Central

    Smith, J. David; Zakrzewski, Alexandria C.; Johnson, Jennifer M.; Valleau, Jeanette C.; Church, Barbara A.

    2016-01-01

    Exemplar, prototype, and rule theory have organized much of the enormous literature on categorization. From this theoretical foundation have arisen the two primary debates in the literature—the prototype-exemplar debate and the single system-multiple systems debate. We review these theories and debates. Then, we examine the contribution that animal-cognition studies have made to them. Animals have been crucial behavioral ambassadors to the literature on categorization. They reveal the roots of human categorization, the basic assumptions of vertebrates entering category tasks, the surprising weakness of exemplar memory as a category-learning strategy. They show that a unitary exemplar theory of categorization is insufficient to explain human and animal categorization. They show that a multiple-systems theoretical account—encompassing exemplars, prototypes, and rules—will be required for a complete explanation. They show the value of a fitness perspective in understanding categorization, and the value of giving categorization an evolutionary depth and phylogenetic breadth. They raise important questions about the internal similarity structure of natural kinds and categories. They demonstrate strong continuities with humans in categorization, but discontinuities, too. Categorization’s great debates are resolving themselves, and to these resolutions animals have made crucial contributions. PMID:27314392

  2. On the nameless love and infinite sexualities: John Henry Mackay, Magnus Hirschfeld and the origins of the sexual emancipation movement.

    PubMed

    Bauer, J Edgar

    2005-01-01

    Two prominent representatives of the sexual emancipation movement in Germany, John Henry Mackay (1864-1933) and Magnus Hirschfeld (1868-1935), launched significant attacks on sexual binarism and its combinatories. Although Mackay defended the nameless love against seminal Christian and subsequent secularised misconstructions of its nature, he was unable to overcome the fundamental scheme of binomic sexuality. Hirschfeld, however, resolved the theoretical issue through his doctrine of sexual intermediaries (Zwischenstufenlehre), which purports that, without exception, all human beings are intersexual variants, i.e. unique composites of different proportions of masculinity and femininity. Since these proportions vary from one sexual layer of description to another in the same individual and can alter or be altered in time, it is sensu stricto not possible to postulate discrete sexual categories. Hirschfeld's doctrine implies a radical deconstruction not only of binomic sexuality but of its supplementation through a third sex. It offers a meta-theoretical framework for rethinking sexual difference beyond the fictional schemes and categorial closures of Western traditions of sexual identity. His assumption of potentially infinite sexualities anticipates some of the basic tenets forwarded by the philosophical and political agendas of queer studies.

  3. An Information Theoretic Investigation Of Complex Adaptive Supply Networks With Organizational Topologies

    DTIC Science & Technology

    2016-12-22

    assumptions of behavior. This research proposes an information theoretic methodology to discover such complex network structures and dynamics while overcoming the difficulties historically associated with their study. Indeed, this was the first application of an information theoretic methodology as a tool...

  4. Symbolic interactionism as a theoretical perspective for multiple method research.

    PubMed

    Benzies, K M; Allen, M N

    2001-02-01

    Qualitative and quantitative research rely on different epistemological assumptions about the nature of knowledge. However, the majority of nurse researchers who use multiple method designs do not address the problem of differing theoretical perspectives. Traditionally, symbolic interactionism has been viewed as one perspective underpinning qualitative research, but it is also the basis for quantitative studies. Rooted in social psychology, symbolic interactionism has a rich intellectual heritage that spans more than a century. Underlying symbolic interactionism is the major assumption that individuals act on the basis of the meaning that things have for them. The purpose of this paper is to present symbolic interactionism as a theoretical perspective for multiple method designs with the aim of expanding the dialogue about new methodologies. Symbolic interactionism can serve as a theoretical perspective for conceptually clear and soundly implemented multiple method research that will expand the understanding of human health behaviour.

  5. Electrostatic plasma simulation by Particle-In-Cell method using ANACONDA package

    NASA Astrophysics Data System (ADS)

    Blandón, J. S.; Grisales, J. P.; Riascos, H.

    2017-06-01

    Electrostatic plasma is the most representative and basic case in the field of plasma physics. One of its main characteristics is its ideal behavior, since it is assumed to be in a state of thermal equilibrium. Through this assumption, it is possible to study various complex phenomena such as plasma oscillations, waves, instabilities, or damping. Likewise, computational simulation of this specific plasma is the first step toward analyzing physical mechanisms in plasmas that are not at equilibrium, and hence not ideal. The Particle-In-Cell (PIC) method is widely used for such cases because of its precision. This work presents an implementation of the PIC method to simulate electrostatic plasma in Python, using the ANACONDA package. The code has been corroborated by comparison with previous theoretical results for three specific phenomena in cold plasmas: oscillations, the two-stream instability (TSI), and Landau damping (LD). Finally, parameters and results are discussed.
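
    A minimal 1D electrostatic PIC loop in Python is sketched below as a companion to the abstract; the grid size, particle count, and two-stream initial condition are illustrative assumptions, not the authors' code:

    ```python
    import numpy as np

    # Minimal 1D electrostatic PIC sketch in normalized units (plasma
    # frequency = 1). Nearest-grid-point weighting and a spectral Poisson
    # solve are used for brevity; all parameters are assumptions.
    ng, L, npart, dt, steps = 64, 2 * np.pi, 10000, 0.1, 200
    dx = L / ng

    rng = np.random.default_rng(0)
    x = rng.uniform(0, L, npart)                        # particle positions
    v = np.where(np.arange(npart) % 2 == 0, 1.0, -1.0)  # two counter-streams
    v = v + 0.01 * rng.standard_normal(npart)           # small thermal spread

    k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)           # angular wavenumbers

    for _ in range(steps):
        # deposit electron density on the grid; ions are a fixed background
        idx = (x / dx).astype(int) % ng
        n_e = np.bincount(idx, minlength=ng) * (ng / npart)
        rho = 1.0 - n_e                                 # net charge density
        # solve d2(phi)/dx2 = -rho spectrally, then E = -d(phi)/dx
        rho_k = np.fft.rfft(rho)
        E_k = np.zeros_like(rho_k)
        E_k[1:] = -1j * rho_k[1:] / k[1:]               # E_k = -i * rho_k / k
        E = np.fft.irfft(E_k, n=ng)
        # push particles: electrons have charge-to-mass ratio -1
        v += -E[idx] * dt
        x = (x + v * dt) % L

    print("mean kinetic energy after run:", 0.5 * np.mean(v**2))
    ```

    Growth of the velocity spread over the run reflects the two-stream instability that the authors use as one of their validation cases.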

  6. Unsteady flow model for circulation-control airfoils

    NASA Technical Reports Server (NTRS)

    Rao, B. M.

    1979-01-01

    An analysis and a numerical lifting surface method are developed for predicting the unsteady airloads on two-dimensional circulation control airfoils in incompressible flow. The analysis and the computer program are validated by correlating the computed unsteady airloads with test data and also with other theoretical solutions. Additionally, a mathematical model for predicting the bending-torsion flutter of a two-dimensional airfoil (a reference section of a wing or rotor blade) and a computer program using an iterative scheme are developed. The flutter program has a provision for using the CC airfoil airloads program or the Theodorsen hard flap solution to compute the unsteady lift and moment used in the flutter equations. The adopted mathematical model and the iterative scheme are used to perform a flutter analysis of a typical CC rotor blade reference section. The program seems to work well within the basic assumption of incompressible flow.

  7. Tandem-pulsed acousto-optics: an analytical framework of modulated high-contrast speckle patterns.

    PubMed

    Resink, S G; Steenbergen, W

    2015-06-07

    Recently we presented acousto-optic (AO) probing of scattering media using addition or subtraction of speckle patterns due to tandem nanosecond pulses. Here we present a theoretical framework for ideal (polarized, noise-free) speckle patterns with unity contrast that links ultrasound-induced optical phase modulation, the fraction of light that is tagged by ultrasound, speckle contrast, mean square difference of speckle patterns and the contrast of the summation of speckle patterns acquired at different ultrasound phases. We derive the important relations from basic assumptions and definitions, and then validate them with simulations. For ultrasound-generated phase modulation angles below 0.7 rad (assuming uniform modulation), we are now able to relate speckle pattern statistics to the acousto-optic phase modulation. Hence our theory allows quantifying speckle observations in terms of ultrasonically tagged fractions of light for near-unity-contrast speckle patterns.
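
    The link between the tagged fraction and the contrast of summed patterns can be illustrated numerically; the tagged fraction, the opposite-phase tandem pulses, and the circular-Gaussian speckle statistics below are modeling assumptions, not the authors' exact framework:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000                      # independent speckle grains (pixels)

    def field(scale):
        # circular Gaussian field -> fully developed, unity-contrast speckle
        return scale * (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

    f = 0.3                          # assumed ultrasonically tagged fraction
    E_untagged, E_tagged = field(np.sqrt(1 - f)), field(np.sqrt(f))

    def intensity(phase):
        # the tagged field component acquires the ultrasound phase
        return np.abs(E_untagged + np.exp(1j * phase) * E_tagged) ** 2

    I_sum = intensity(0.0) + intensity(np.pi)  # tandem pulses, opposite phases
    contrast = I_sum.std() / I_sum.mean()
    # for this toy model the expected contrast is sqrt((1-f)^2 + f^2)
    print(f"simulated {contrast:.3f} vs predicted {np.hypot(1 - f, f):.3f}")
    ```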

  8. Kinetics of carbon clustering in detonation of high explosives: Does theory match experiment?

    NASA Astrophysics Data System (ADS)

    Velizhanin, Kirill; Watkins, Erik; Dattelbaum, Dana; Gustavsen, Richard; Aslam, Tariq; Podlesak, David; Firestone, Millicent; Huber, Rachel; Ringstrand, Bryan; Willey, Trevor; Bagge-Hansen, Michael; Hodgin, Ralph; Lauderbach, Lisa; van Buuren, Tony; Sinclair, Nicholas; Rigg, Paulo; Seifert, Soenke; Gog, Thomas

    2017-06-01

    Chemical reactions in the detonation of carbon-rich high explosives yield carbon clusters as major constituents of the products. Efforts to model carbon clustering as a diffusion-limited irreversible coagulation of carbon clusters go back to the seminal paper by Shaw and Johnson. However, the first direct experimental observations of the kinetics of clustering yielded cluster growth one to two orders of magnitude slower than theoretical predictions. Multiple efforts were undertaken to test and revise the basic assumptions of the model in order to achieve better agreement with experiment. We discuss our very recent direct experimental observations of carbon clustering dynamics and demonstrate that these new results are in much better agreement with the modified Shaw-Johnson model. The implications of this improved agreement for our present understanding of detonation carbon clustering, and possible ways to bring theory and experiment into even closer agreement, are discussed.
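
    The clustering models at issue descend from Smoluchowski-type coagulation; a constant-kernel sketch conveys the structure (the actual Shaw-Johnson kernel, rate constants, and detonation conditions are not reproduced, and K, kmax, and the monomer-only start are assumptions):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    K, kmax = 1.0, 200                      # assumed rate constant, max size

    def smoluchowski(t, n):
        # dn_k/dt = gain from pairs merging into size k, minus loss of k
        dn = np.zeros_like(n)
        total = n.sum()
        for k in range(kmax):
            gain = 0.5 * K * sum(n[i] * n[k - 1 - i] for i in range(k))
            dn[k] = gain - K * n[k] * total
        return dn

    n0 = np.zeros(kmax)
    n0[0] = 1.0                             # start from monomers only
    sol = solve_ivp(smoluchowski, (0.0, 10.0), n0, t_eval=[0, 1, 5, 10])
    sizes = np.arange(1, kmax + 1)
    for t, n in zip(sol.t, sol.y.T):
        print(f"t = {t:4.1f}  mean cluster size = {(sizes * n).sum() / n.sum():5.2f}")
    ```

    For the constant kernel, the mean cluster size grows linearly in time; the experimental point of the abstract is that observed growth is far slower than early models of this family predicted.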

  9. On computing stress in polymer systems involving multi-body potentials from molecular dynamics simulation

    NASA Astrophysics Data System (ADS)

    Fu, Yao; Song, Jeong-Hoon

    2014-08-01

    The Hardy stress definition has been restricted to pair potentials and embedded-atom method potentials due to the basic assumptions in the derivation of a symmetric microscopic stress tensor. The force decomposition required in the Hardy stress expression becomes obscure for multi-body potentials. In this work, we demonstrate the invariance of the Hardy stress expression for a polymer system modeled with multi-body interatomic potentials including up to four-atom interactions, by applying central force decomposition of the atomic force. The balance of momentum has been demonstrated to be valid theoretically and tested under various numerical simulation conditions. The validity of momentum conservation justifies the extension of the Hardy stress expression to multi-body potential systems. The computed Hardy stress has been observed to converge to the virial stress of the system with increasing spatial averaging volume. This work provides a feasible and reliable linkage between the atomistic and continuum scales for multi-body potential systems.
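
    As a pointer to the quantity the Hardy stress converges to, here is a sketch of the configurational virial stress for a plain pair (Lennard-Jones) potential; the cluster geometry and averaging volume are invented, and the multi-body force decomposition that is the paper's real subject is not reproduced:

    ```python
    import numpy as np

    def lj_force(rij):
        # force on atom i due to j, with U = 4 (r^-12 - r^-6) in reduced units
        r = np.linalg.norm(rij)
        return 24.0 * (2.0 * r**-13 - r**-7) * rij / r

    pos = np.array([[0.00, 0.00, 0.0],      # assumed small cluster
                    [1.12, 0.00, 0.0],
                    [0.56, 0.97, 0.0]])
    V = 10.0                                # assumed averaging volume

    stress = np.zeros((3, 3))
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            rij = pos[i] - pos[j]
            stress += np.outer(rij, lj_force(rij))  # configurational part
    stress /= V                             # kinetic part omitted (static case)
    print(np.round(stress, 4))
    ```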

  10. Density functional computational studies on the glucose and glycine Maillard reaction: Formation of the Amadori rearrangement products

    NASA Astrophysics Data System (ADS)

    Jalbout, Abraham F.; Roy, Amlan K.; Shipar, Abul Haider; Ahmed, M. Samsuddin

    Theoretical energy changes of various intermediates leading to the formation of the Amadori rearrangement products (ARPs) under different mechanistic assumptions have been calculated by using open-chain glucose (O-Glu)/closed-chain glucose (A-Glu and B-Glu) and glycine (Gly) as a model for the Maillard reaction. Density functional theory (DFT) computations have been applied to the proposed mechanisms under different pH conditions. Thus, the possibility of the formation of different compounds and the electronic energy changes for different steps in the proposed mechanisms have been evaluated. B-Glu has been found to be more efficient than A-Glu, and A-Glu more efficient than O-Glu in the reaction. The reaction under basic conditions is the most favorable for the formation of ARPs. Other reaction pathways have been computed and discussed in this work.

  11. Evaluating scaling models in biology using hierarchical Bayesian approaches

    PubMed Central

    Price, Charles A; Ogle, Kiona; White, Ethan P; Weitz, Joshua S

    2009-01-01

    Theoretical models for allometric relationships between organismal form and function are typically tested by comparing a single predicted relationship with empirical data. Several prominent models, however, predict more than one allometric relationship, and comparisons among alternative models have not taken this into account. Here we evaluate several different scaling models of plant morphology within a hierarchical Bayesian framework that simultaneously fits multiple scaling relationships to three large allometric datasets. The scaling models include: inflexible universal models derived from biophysical assumptions (e.g. elastic similarity or fractal networks), a flexible variation of a fractal network model, and a highly flexible model constrained only by basic algebraic relationships. We demonstrate that variation in intraspecific allometric scaling exponents is inconsistent with the universal models, and that more flexible approaches that allow for biological variability at the species level outperform universal models, even when accounting for relative increases in model complexity. PMID:19453621
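
    Since the argument turns on comparing fitted scaling exponents with fixed theoretical ones, a toy version of the species-level fits is sketched below; the data are simulated, and the population-level prior that a hierarchical Bayesian model would add is only described in the comments:

    ```python
    import numpy as np

    # Per-species allometric fits, log(y) = log(a) + b*log(x). A hierarchical
    # Bayesian model would place a population-level distribution over the b's
    # and partially pool them; here we show only the separate fits (all data
    # and the "true" exponents are invented for illustration).
    rng = np.random.default_rng(1)
    fitted = []
    for true_b in (0.67, 0.75, 0.81):            # assumed interspecific spread
        x = rng.uniform(1, 100, 50)              # e.g. stem diameter
        y = 2.0 * x**true_b * rng.lognormal(0.0, 0.1, 50)
        b_hat = np.polyfit(np.log(x), np.log(y), 1)[0]
        fitted.append(b_hat)
    print("fitted exponents:", np.round(fitted, 3))
    # A universal model predicts a single b for all species (e.g. 3/4); the
    # spread among the fitted b's is the variability flexible models allow.
    ```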

  12. Evolution of basic equations for nearshore wave field

    PubMed Central

    ISOBE, Masahiko

    2013-01-01

    In this paper, a systematic, overall view of theories for periodic waves of permanent form, such as Stokes and cnoidal waves, is described first with their validity ranges. To deal with random waves, a method for estimating directional spectra is given. Then, various wave equations are introduced according to the assumptions included in their derivations. The mild-slope equation is derived for combined refraction and diffraction of linear periodic waves. Various parabolic approximations and time-dependent forms are proposed to include randomness and nonlinearity of waves as well as to simplify numerical calculation. Boussinesq equations are the equations developed for calculating nonlinear wave transformations in shallow water. Nonlinear mild-slope equations are derived as a set of wave equations to predict transformation of nonlinear random waves in the nearshore region. Finally, wave equations are classified systematically for a clear theoretical understanding and appropriate selection for specific applications. PMID:23318680
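
    For reference, the mild-slope equation mentioned above is commonly written (in a standard form, not quoted from this paper) as

    $$\nabla \cdot \left( c\, c_g \, \nabla \hat{\phi} \right) + k^{2} c\, c_g\, \hat{\phi} = 0,$$

    where $\hat{\phi}$ is the complex wave amplitude, $c$ the phase speed, $c_g$ the group velocity, and $k$ the local wavenumber given by the linear dispersion relation $\omega^{2} = gk \tanh(kh)$.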

  13. A generating function approach to HIV transmission with dynamic contact rates

    DOE PAGES

    Romero-Severson, Ethan O.; Meadors, Grant D.; Volz, Erik M.

    2014-04-24

    The basic reproduction number, R0, is often defined as the average number of infections generated by a newly infected individual in a fully susceptible population. The interpretation, meaning, and derivation of R0 are controversial. However, in the context of mean field models, R0 demarcates the epidemic threshold below which the infected population approaches zero in the limit of time. In this manner, R0 has been proposed as a method for understanding the relative impact of public health interventions with respect to disease elimination from a theoretical perspective. The use of R0 is made more complex by both the strong dependency of R0 on the model form and the stochastic nature of transmission. A common assumption in models of HIV transmission that have closed-form expressions for R0 is that a single individual's behavior is constant over time. For this research, we derive expressions for both R0 and the probability of an epidemic in a finite population under the assumption that people periodically change their sexual behavior over time. We illustrate the use of generating functions as a general framework to model the effects of potentially complex assumptions on the number of transmissions generated by a newly infected person in a susceptible population. In conclusion, we find that the relationship between the probability of an epidemic and R0 is not straightforward, but that as the rate of change in sexual behavior increases, both R0 and the probability of an epidemic decrease.
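
    To illustrate the generating-function machinery in miniature: in a branching-process approximation, the epidemic probability follows from the fixed point of the offspring probability generating function (PGF). The negative binomial offspring distribution below is an assumption chosen for illustration, not the paper's dynamic-contact-rate model:

    ```python
    # Branching-process sketch: if G(s) is the PGF of the number of
    # transmissions from one newly infected person, the extinction
    # probability q solves q = G(q), and P(epidemic) = 1 - q.
    R0, k = 1.5, 0.5   # assumed mean (R0) and dispersion (k)

    def G(s):
        # PGF of a negative binomial with mean R0 and dispersion k
        return (1.0 + (R0 / k) * (1.0 - s)) ** (-k)

    q = 0.0
    for _ in range(500):         # fixed-point iteration; q < 1 when R0 > 1
        q = G(q)
    print(f"R0 = {R0}, P(epidemic) = {1.0 - q:.3f}")
    ```

    Consistent with the abstract, the epidemic probability is not a simple function of R0 alone: holding R0 fixed while lowering the dispersion k also lowers 1 - q.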

  15. Linking parasite populations in hosts to parasite populations in space through Taylor's law and the negative binomial distribution

    PubMed Central

    Poulin, Robert; Lagrue, Clément

    2017-01-01

    The spatial distribution of individuals of any species is a basic concern of ecology. The spatial distribution of parasites matters to control and conservation of parasites that affect human and nonhuman populations. This paper develops a quantitative theory to predict the spatial distribution of parasites based on the distribution of parasites in hosts and the spatial distribution of hosts. Four models are tested against observations of metazoan hosts and their parasites in littoral zones of four lakes in Otago, New Zealand. These models differ in two dichotomous assumptions, constituting a 2 × 2 theoretical design. One assumption specifies whether the variance function of the number of parasites per host individual is described by Taylor's law (TL) or the negative binomial distribution (NBD). The other assumption specifies whether the numbers of parasite individuals within each host in a square meter of habitat are independent or perfectly correlated among host individuals. We find empirically that the variance–mean relationship of the numbers of parasites per square meter is very well described by TL but is not well described by NBD. Two models that posit perfect correlation of the parasite loads of hosts in a square meter of habitat approximate observations much better than two models that posit independence of parasite loads of hosts in a square meter, regardless of whether the variance–mean relationship of parasites per host individual obeys TL or NBD. We infer that high local interhost correlations in parasite load strongly influence the spatial distribution of parasites. Local hotspots could influence control and conservation of parasites. PMID:27994156
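
    Schematically, the 2 × 2 design pairs two variance functions for parasites per host,

    $$\text{TL: } \operatorname{Var}(X) = a\,\mu^{b}, \qquad \text{NBD: } \operatorname{Var}(X) = \mu + \frac{\mu^{2}}{k},$$

    with two aggregation rules for the $H$ host individuals in a square meter: independence, $\operatorname{Var}\!\left(\sum_{i} X_{i}\right) = H \operatorname{Var}(X)$, versus perfect correlation, $\operatorname{Var}\!\left(\sum_{i} X_{i}\right) = H^{2} \operatorname{Var}(X)$. This is a standard rendering of those assumptions, not the paper's notation.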

  16. High Tech Educators Network Evaluation.

    ERIC Educational Resources Information Center

    O'Shea, Dan

    A process evaluation was conducted to assess the High Tech Educators Network's (HTEN's) activities. Four basic components to the evaluation approach were documentation review, program logic model, written survey, and participant interviews. The model mapped the basic goals and objectives, assumptions, activities, outcome expectations, and…

  17. Statistical Issues for Calculating Reentry Hazards

    NASA Technical Reports Server (NTRS)

    Bacon, John B.; Matney, Mark

    2016-01-01

    A number of statistical tools have been developed over the years for assessing the risk posed by reentering objects to human populations. These tools make use of the characteristics (e.g., mass, shape, size) of debris that are predicted by aerothermal models to survive reentry. This information, combined with information on the expected ground path of the reentry, is used to compute the probability that one or more of the surviving debris might hit a person on the ground and cause one or more casualties. The statistical portion of this analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and how to use that information to arrive at realistic risk numbers. This inevitably involves assumptions that simplify the problem and make it tractable, but it is often difficult to test the accuracy and applicability of these assumptions. This paper builds on previous IAASS work to re-examine one of these theoretical assumptions. This study employs empirical and theoretical information to test the assumption of a fully random decay along the argument of latitude of the final orbit, and makes recommendations on how to improve the accuracy of this calculation in the future.
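
    The randomness assumption in question has a concrete consequence that is easy to sketch: if the argument of latitude of the final orbit is uniformly distributed, the dwell-time density over ground latitude follows in closed form. The inclination below is an assumed example value:

    ```python
    import numpy as np

    # For a circular orbit with inclination i and uniformly random argument
    # of latitude u, sin(lat) = sin(i) * sin(u), which gives the density
    # f(lat) = cos(lat) / (pi * sqrt(sin(i)^2 - sin(lat)^2)) for |lat| < i.
    inc = np.radians(51.6)                       # assumed inclination
    lat = np.radians(np.linspace(-51.5, 51.5, 7))
    f = np.cos(lat) / (np.pi * np.sqrt(np.sin(inc)**2 - np.sin(lat)**2))
    for deg, dens in zip(np.degrees(lat), f):
        print(f"lat {deg:6.1f} deg : relative dwell density {dens:.3f}")
    # The density peaks near +/- i, which is why, under this assumption,
    # ground risk concentrates at latitudes near the orbital inclination.
    ```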

  18. Problematising Mathematics Education

    ERIC Educational Resources Information Center

    Begg, Andy

    2015-01-01

    We assume many things when considering our practice, but our assumptions limit what we do. In this theoretical/philosophical paper I consider some assumptions that relate to our work. My purpose is to stimulate a debate, a search for alternatives, and to help us improve mathematics education by influencing our future curriculum documents and…

  19. Testing a pollen-parent fecundity distribution model on seed-parent fecundity distributions in bee-pollinated forage legume polycrosses

    USDA-ARS?s Scientific Manuscript database

    Random mating (i.e., panmixis) is a fundamental assumption in quantitative genetics. In outcrossing bee-pollinated perennial forage legume polycrosses, mating is assumed by default to follow theoretical random mating. This assumption informs breeders of expected inbreeding estimates based on polycro...

  20. The Metatheoretical Assumptions of Literacy Engagement: A Preliminary Centennial History

    ERIC Educational Resources Information Center

    Hruby, George G.; Burns, Leslie D.; Botzakis, Stergios; Groenke, Susan L.; Hall, Leigh A.; Laughter, Judson; Allington, Richard L.

    2016-01-01

    In this review of literacy education research in North America over the past century, the authors examined the historical succession of theoretical frameworks on students' active participation in their own literacy learning, and in particular the metatheoretical assumptions that justify those frameworks. The authors used "motivation" and…

  1. On Cognitive Constraints and Learning Progressions: The Case of "Structure of Matter"

    ERIC Educational Resources Information Center

    Talanquer, Vicente

    2009-01-01

    Based on the analysis of available research on students' alternative conceptions about the particulate nature of matter, we identified basic implicit assumptions that seem to constrain students' ideas and reasoning on this topic at various learning stages. Although many of these assumptions are interrelated, some of them seem to change or…

  2. Rationality as the Basic Assumption in Explaining Japanese (or Any Other) Business Culture.

    ERIC Educational Resources Information Center

    Koike, Shohei

    Economic analysis, with its explicit assumption that people are rational, is applied to the Japanese and American business cultures to illustrate how the approach is useful for understanding cultural differences. Specifically, differences in cooperative behavior among Japanese and American workers are examined. Economic analysis goes beyond simple…

  3. Standardization of Selected Semantic Differential Scales with Secondary School Children.

    ERIC Educational Resources Information Center

    Evans, G. T.

    A basic assumption of this study is that the meaning continuum registered by an adjective pair remains relatively constant over a large universe of concepts and over subjects within a relatively homogeneous population. An attempt was made to validate this assumption by showing the invariance of the factor structure across different types of…

  4. What's Love Got to Do with It? Rethinking Common Sense Assumptions

    ERIC Educational Resources Information Center

    Trachman, Matthew; Bluestone, Cheryl

    2005-01-01

    One of the most basic tasks in introductory social science classes is to get students to reexamine their common sense assumptions concerning human behavior. This article introduces a shared assignment developed for a learning community that paired an introductory sociology and psychology class. The assignment challenges students to rethink the…

  5. Signal Detection with Criterion Noise: Applications to Recognition Memory

    ERIC Educational Resources Information Center

    Benjamin, Aaron S.; Diaz, Michael; Wee, Serena

    2009-01-01

    A tacit but fundamental assumption of the theory of signal detection is that criterion placement is a noise-free process. This article challenges that assumption on theoretical and empirical grounds and presents the noisy decision theory of signal detection (ND-TSD). Generalized equations for the isosensitivity function and for measures of…

  6. A theoretical approach to artificial intelligence systems in medicine.

    PubMed

    Spyropoulos, B; Papagounos, G

    1995-10-01

    The various theoretical models of disease, the nosology which is accepted by the medical community, and the prevalent logic of diagnosis determine both the medical approach as well as the development of the relevant technology, including the structure and function of the A.I. systems involved. A.I. systems in medicine, in addition to the specific parameters which enable them to reach a diagnostic and/or therapeutic proposal, implicitly entail theoretical assumptions and socio-cultural attitudes which prejudice the orientation and the final outcome of the procedure. The various models (causal, probabilistic, case-based, etc.) are critically examined and their ethical and methodological limitations are brought to light. The lack of a self-consistent theoretical framework in medicine, the multi-faceted character of the human organism, as well as the non-explicit nature of the theoretical assumptions involved in A.I. systems restrict them to the role of decision-supporting "instruments" rather than decision-making "devices". This supporting role and, especially, the important function which A.I. systems should have in the structure, the methods, and the content of medical education underscore the need for further research in the theoretical aspects and the actual development of such systems.

  7. Computation in generalised probabilistic theories

    NASA Astrophysics Data System (ADS)

    Lee, Ciarán M.; Barrett, Jonathan

    2015-08-01

    From the general difficulty of simulating quantum systems using classical systems, and in particular the existence of an efficient quantum algorithm for factoring, it is likely that quantum computation is intrinsically more powerful than classical computation. At present, the best upper bound known for the power of quantum computation is that BQP ⊆ AWPP, where AWPP is a classical complexity class (known to be included in PP, hence PSPACE). This work investigates limits on computational power that are imposed by simple physical, or information theoretic, principles. To this end, we define a circuit-based model of computation in a class of operationally-defined theories more general than quantum theory, and ask: what is the minimal set of physical assumptions under which the above inclusions still hold? We show that given only an assumption of tomographic locality (roughly, that multipartite states and transformations can be characterized by local measurements), efficient computations are contained in AWPP. This inclusion still holds even without assuming a basic notion of causality (where the notion is, roughly, that probabilities for outcomes cannot depend on future measurement choices). Following Aaronson, we extend the computational model by allowing post-selection on measurement outcomes. Aaronson showed that the corresponding quantum complexity class, PostBQP, is equal to PP. Given only the assumption of tomographic locality, the inclusion in PP still holds for post-selected computation in general theories. Hence in a world with post-selection, quantum theory is optimal for computation in the space of all operational theories. We then consider whether one can obtain relativized complexity results for general theories. It is not obvious how to define a sensible notion of a computational oracle in the general framework that reduces to the standard notion in the quantum case. Nevertheless, it is possible to define computation relative to a 'classical oracle'. Then, we show there exists a classical oracle relative to which efficient computation in any theory satisfying the causality assumption does not include NP.

  8. Using plot experiments to test the validity of mass balance models employed to estimate soil redistribution rates from 137Cs and 210Pb(ex) measurements.

    PubMed

    Porto, Paolo; Walling, Des E

    2012-10-01

    Information on rates of soil loss from agricultural land is a key requirement for assessing both on-site soil degradation and potential off-site sediment problems. Many models and prediction procedures have been developed to estimate rates of soil loss and soil redistribution as a function of the local topography, hydrometeorology, soil type and land management, but empirical data remain essential for validating and calibrating such models and prediction procedures. Direct measurements using erosion plots are, however, costly and the results obtained relate to a small enclosed area, which may not be representative of the wider landscape. In recent years, the use of fallout radionuclides and more particularly caesium-137 ((137)Cs) and excess lead-210 ((210)Pb(ex)) has been shown to provide a very effective means of documenting rates of soil loss and soil and sediment redistribution in the landscape. Several of the assumptions associated with the theoretical conversion models used with such measurements remain essentially unvalidated. This contribution describes the results of a measurement programme involving five experimental plots located in southern Italy, aimed at validating several of the basic assumptions commonly associated with the use of mass balance models for estimating rates of soil redistribution on cultivated land from (137)Cs and (210)Pb(ex) measurements. Overall, the results confirm the general validity of these assumptions and the importance of taking account of the fate of fresh fallout. However, further work is required to validate the conversion models employed in using fallout radionuclide measurements to document soil redistribution in the landscape and this could usefully direct attention to different environments and to the validation of the final estimates of soil redistribution rate as well as the assumptions of the models employed. Copyright © 2012 Elsevier Ltd. All rights reserved.
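
    For orientation, the simplest conversion models of this kind reduce to a schematic mass balance of the form

    $$\frac{dA(t)}{dt} = I(t) - \left(\lambda + \frac{R}{d}\right) A(t),$$

    where $A$ is the areal radionuclide inventory (Bq m$^{-2}$), $I(t)$ the fallout input, $\lambda$ the radioactive decay constant, $R$ the soil erosion rate, and $d$ the cumulative mass depth of the plough layer. This is a generic sketch of the model family, not the specific formulations tested in the paper.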

  9. Optimum runway orientation relative to crosswinds

    NASA Technical Reports Server (NTRS)

    Falls, L. W.; Brown, S. C.

    1972-01-01

    Specific magnitudes of crosswinds may exist that could be constraints to the success of an aircraft mission such as the landing of the proposed space shuttle. A method is required to determine the orientation or azimuth of the proposed runway which will minimize the probability of certain critical crosswinds. Two procedures for obtaining the optimum runway orientation relative to minimizing a specified crosswind speed are described and illustrated with examples. The empirical procedure requires only hand calculations on an ordinary wind rose. The theoretical method utilizes wind statistics computed after the bivariate normal elliptical distribution is applied to a data sample of component winds. This method requires only the assumption that the wind components are bivariate normally distributed. This assumption seems to be reasonable. Studies are currently in progress for testing wind components for bivariate normality for various stations. The close agreement between the theoretical and empirical results for the example chosen substantiates the bivariate normal assumption.
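
    The theoretical method lends itself to a short sketch: if the wind components are bivariate normal, the crosswind for any runway azimuth is a one-dimensional normal projection, so the exceedance probability can simply be scanned over azimuth. All wind statistics below are invented for illustration:

    ```python
    import numpy as np
    from scipy.stats import norm

    # Crosswind c = projection of the wind vector onto the unit vector
    # perpendicular to the runway; for bivariate normal components (u, v)
    # this is normal with mean a.mu and variance a.Cov.a.
    mu = np.array([2.0, 1.0])                  # mean components (m/s), assumed
    cov = np.array([[9.0, 2.0],
                    [2.0, 16.0]])              # component covariance, assumed
    c0 = 7.7                                   # crosswind limit (m/s), assumed

    best = None
    for theta in np.radians(np.arange(0.0, 180.0, 1.0)):
        a = np.array([-np.sin(theta), np.cos(theta)])  # normal to runway axis
        m, s = a @ mu, np.sqrt(a @ cov @ a)
        p = norm.sf(c0, m, s) + norm.cdf(-c0, m, s)    # P(|crosswind| > c0)
        if best is None or p < best[1]:
            best = (np.degrees(theta), p)
    print(f"optimum runway azimuth ~ {best[0]:.0f} deg, P(exceedance) = {best[1]:.4f}")
    ```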

  10. Intellectualizing Adult Basic Literacy Education: A Case Study

    ERIC Educational Resources Information Center

    Bradbury, Kelly S.

    2012-01-01

    At a time when accusations of American ignorance and anti-intellectualism are ubiquitous, this article challenges problematic assumptions about intellectualism that overlook the work of adult basic literacy programs and proposes an expanded view of intellectualism. It is important to recognize and to challenge narrow views of intellectualism…

  11. Adult Literacy Programs: Guidelines for Effectiveness.

    ERIC Educational Resources Information Center

    Lord, Jerome E.

    This report is a summary of information from both research and experience about the assumptions and practices that guide successful basic skills programs. The 31 guidelines are basic to building a solid foundation on which effective instructional programs for adults can be developed. The first six guidelines address some important characteristics…

  12. Language Performance Assessment: Current Trends in Theory and Research

    ERIC Educational Resources Information Center

    El-Koumy, Abdel-Salam Abdel-Khalek

    2004-01-01

    The purpose of this paper is to review the theoretical and empirical literature relevant to language performance assessment. Following a definition of performance assessment, this paper considers: (1) theoretical assumptions underlying performance assessment; (2) purposes of performance assessment; (3) performance assessment procedures; (4) merits…

  13. Teachers' Perspectives on Principal Mistreatment

    ERIC Educational Resources Information Center

    Blase, Joseph; Blase, Jo

    2006-01-01

    Although there is some important scholarly work on the problem of workplace mistreatment/abuse, theoretical or empirical work on abusive school principals is nonexistent. Symbolic interactionism was the theoretical structure for the present study. This perspective on social research is founded on three primary assumptions: (1) individuals act…

  14. Variation is the universal: making cultural evolution work in developmental psychology.

    PubMed

    Kline, Michelle Ann; Shamsudheen, Rubeena; Broesch, Tanya

    2018-04-05

    Culture is a human universal, yet it is a source of variation in human psychology, behaviour and development. Developmental researchers are now expanding the geographical scope of research to include populations beyond relatively wealthy Western communities. However, culture and context still play a secondary role in the theoretical grounding of developmental psychology research, far too often. In this paper, we highlight four false assumptions that are common in psychology, and that detract from the quality of both standard and cross-cultural research in development. These assumptions are: (i) the universality assumption, that empirical uniformity is evidence for universality, while any variation is evidence for culturally derived variation; (ii) the Western centrality assumption, that Western populations represent a normal and/or healthy standard against which development in all societies can be compared; (iii) the deficit assumption, that population-level differences in developmental timing or outcomes are necessarily due to something lacking among non-Western populations; and (iv) the equivalency assumption, that using identical research methods will necessarily produce equivalent and externally valid data, across disparate cultural contexts. For each assumption, we draw on cultural evolutionary theory to critique and replace the assumption with a theoretically grounded approach to culture in development. We support these suggestions with positive examples drawn from research in development. Finally, we conclude with a call for researchers to take reasonable steps towards more fully incorporating culture and context into studies of development, by expanding their participant pools in strategic ways. This will lead to a more inclusive and therefore more accurate description of human development. This article is part of the theme issue 'Bridging cultural gaps: interdisciplinary studies in human cultural evolution'. © 2018 The Author(s).

  15. Basic lubrication equations

    NASA Technical Reports Server (NTRS)

    Hamrock, B. J.; Dowson, D.

    1981-01-01

    Lubricants, usually Newtonian fluids, are assumed to experience laminar flow. The basic equations used to describe the flow are the Navier-Stokes equations of motion. The study of hydrodynamic lubrication is, from a mathematical standpoint, the application of a reduced form of these Navier-Stokes equations in association with the continuity equation. The Reynolds equation can also be derived from first principles, provided of course that the same basic assumptions are adopted in each case. Both methods are used in deriving the Reynolds equation, and the assumptions inherent in reducing the Navier-Stokes equations are specified. Because the Reynolds equation contains viscosity and density terms, and these properties depend on temperature and pressure, it is often necessary to couple the Reynolds equation with the energy equation. The lubricant properties and the energy equation are presented. Film thickness, a parameter of the Reynolds equation, is a function of the elastic behavior of the bearing surface. The governing elasticity equation is therefore presented.
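
    One common form of the Reynolds equation obtained by this reduction (standard notation, not quoted from the report) is

    $$\frac{\partial}{\partial x}\!\left(\frac{\rho h^{3}}{12\eta}\,\frac{\partial p}{\partial x}\right) + \frac{\partial}{\partial y}\!\left(\frac{\rho h^{3}}{12\eta}\,\frac{\partial p}{\partial y}\right) = \frac{U}{2}\,\frac{\partial (\rho h)}{\partial x} + \frac{\partial (\rho h)}{\partial t},$$

    where $p$ is the film pressure, $h$ the film thickness, $\eta$ the viscosity, $\rho$ the density, and $U$ the sum of the surface velocities in the $x$ direction; the viscosity and density terms are what couple it to the energy equation.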

  16. Finite area combustor theoretical rocket performance

    NASA Technical Reports Server (NTRS)

    Gordon, Sanford; Mcbride, Bonnie J.

    1988-01-01

    Previous to this report, the computer program of NASA SP-273 and NASA TM-86885 was capable of calculating theoretical rocket performance based only on the assumption of an infinite area combustion chamber (IAC). An option was added to this program which now also permits the calculation of rocket performance based on the assumption of a finite area combustion chamber (FAC). In the FAC model, the combustion process in the cylindrical chamber is assumed to be adiabatic, but nonisentropic. This results in a stagnation pressure drop from the injector face to the end of the chamber and a lower calculated performance for the FAC model than the IAC model.

  17. Program Evaluation Theory and Practice: A Comprehensive Guide

    ERIC Educational Resources Information Center

    Mertens, Donna M.; Wilson, Amy T.

    2012-01-01

    This engaging text takes an evenhanded approach to major theoretical paradigms in evaluation and builds a bridge from them to evaluation practice. Featuring helpful checklists, procedural steps, provocative questions that invite readers to explore their own theoretical assumptions, and practical exercises, the book provides concrete guidance for…

  18. Radiologic technology educators and andragogy.

    PubMed

    Galbraith, M W; Simon-Galbraith, J A

    1984-01-01

    Radiologic technology educators are in constant contact with adult learners. However, the theoretical framework that radiologic educators use to guide their instruction may not be appropriate for adults. This article examines the assumptions of the standard instructional theory and the most modern approach to adult education--andragogy. It also shows how these assumptions affect the adult learner in a radiologic education setting.

  19. [A reflection about organizational culture according to psychoanalysis' view].

    PubMed

    Cardoso, Maria Lúcia Alves Pereira

    2008-01-01

    This article aims at offering a reflection on the universal presuppositions of human culture proposed by Freud, as a prop for analyzing the presuppositions of organizational culture according to Schein. In an article published in 1984, the latter claims that in order to decipher organizational culture one cannot rely upon (visible) artifacts or (perceptible) values, but should take a deeper plunge and identify the basic assumptions underlying organizational culture. Such presuppositions spread into the field of study concerning the individual's inner self, within the sphere of Psychoanalysis. We have therefore examined Freud's basic assumptions of human culture in order to ascertain their conformity with the paradigms of organizational culture as proposed by Schein.

  20. Thermodynamic Properties of Low-Density {}^{132}Xe Gas in the Temperature Range 165-275 K

    NASA Astrophysics Data System (ADS)

    Akour, Abdulrahman

    2018-01-01

    The method of the static fluctuation approximation was used to calculate selected thermodynamic properties (internal energy, entropy, heat capacity, and pressure) of xenon in a particularly low temperature range (165-270 K) under different conditions. This integrated microscopic study starts from a single basic assumption as its main input: the local field operator is replaced with its mean value, and a closed set of nonlinear equations is then solved numerically by an iterative method, with the Hartree-Fock B2-type dispersion potential taken as the most appropriate potential for xenon. The results are in very good agreement with those of an ideal gas.

  1. What Is This Substance? What Makes It Different? Mapping Progression in Students' Assumptions about Chemical Identity

    ERIC Educational Resources Information Center

    Ngai, Courtney; Sevian, Hannah; Talanquer, Vicente

    2014-01-01

    Given the diversity of materials in our surroundings, one should expect scientifically literate citizens to have a basic understanding of the core ideas and practices used to analyze chemical substances. In this article, we use the term 'chemical identity' to encapsulate the assumptions, knowledge, and practices upon which chemical…

  2. An entropic framework for modeling economies

    NASA Astrophysics Data System (ADS)

    Caticha, Ariel; Golan, Amos

    2014-08-01

    We develop an information-theoretic framework for economic modeling. This framework is based on principles of entropic inference that are designed for reasoning on the basis of incomplete information. We take the point of view of an external observer who has access to limited information about broad macroscopic economic features. We view this framework as complementary to more traditional methods. The economy is modeled as a collection of agents about whom we make no assumptions of rationality (in the sense of maximizing utility or profit). States of statistical equilibrium are introduced as those macrostates that maximize entropy subject to the relevant information codified into constraints. The basic assumption is that this information refers to supply and demand and is expressed in the form of the expected values of certain quantities (such as inputs, resources, goods, production functions, utility functions and budgets). The notion of economic entropy is introduced. It provides a measure of the uniformity of the distribution of goods and resources. It captures both the welfare state of the economy as well as the characteristics of the market (say, monopolistic, concentrated or competitive). Prices, which turn out to be the Lagrange multipliers, are endogenously generated by the economy. Further studies include the equilibrium between two economies and the conditions for stability. As an example, the case of the nonlinear economy that arises from linear production and utility functions is treated in some detail.
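
    The role of the Lagrange multipliers can be made concrete with a one-constraint toy model; the state space, the observed mean demand, and the reading of the multiplier as a price are illustrative assumptions in the spirit of the abstract, not the authors' specification:

    ```python
    import numpy as np
    from scipy.optimize import brentq

    # Maximum-entropy toy "market": states are holdings of one good, the
    # constraint is an observed mean demand, and the Lagrange multiplier
    # solving the constraint plays the price-like role described above.
    goods = np.arange(0, 11)          # feasible holdings, 0..10 units
    mean_demand = 3.0                 # macro-level information (constraint)

    def mean_at(lam):
        p = np.exp(-lam * goods)
        p /= p.sum()
        return p @ goods

    lam = brentq(lambda l: mean_at(l) - mean_demand, -5.0, 5.0)
    p = np.exp(-lam * goods); p /= p.sum()
    entropy = -(p * np.log(p)).sum()  # "economic entropy" of the allocation
    print(f"price-like multiplier = {lam:.4f}, entropy = {entropy:.4f}")
    ```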

  3. Human body surface area: a theoretical approach.

    PubMed

    Wang, Jianfeng; Hihara, Eiji

    2004-04-01

    Knowledge of the human body surface area has important applications in medical practice, garment design, and other engineering sizing. Therefore, it is not surprising that several expressions correlating body surface area with direct measurements of body mass and length have been reported in the literature. In the present study, based on the assumption that the exterior shape of the human body is the result of convex and concave deformations from a basic cylinder, we derive a theoretical equation minimizing body surface area (BSA) at a fixed volume (V): BSA = (9π·V·L)^0.5, where L is the reference length of the body. Assuming a body density value of 1,000 kg·m^-3, the equation becomes BSA = (BM·BH/35.37)^0.5, where BSA is in square meters, BM is the body mass in kilograms, and BH is the body height in meters. BSA values calculated by means of this equation fall within ±7% of the values obtained by means of the equations available in the literature, in the range of BSA from children to adults. It is also suggested that the above equation, which is obtained by minimizing the outer body surface at a fixed volume, implies a fundamental relation set by the geometrical constraints governing the growth and the development of the human body.
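
    A worked example of the closed-form result (using only the equation quoted above) checks the claimed agreement:

    ```python
    import math

    # BSA = sqrt(BM * BH / 35.37): BM in kg, BH in m, BSA in m^2.
    def bsa(mass_kg: float, height_m: float) -> float:
        return math.sqrt(mass_kg * height_m / 35.37)

    print(f"{bsa(70, 1.75):.2f} m^2")   # ~1.86 m^2 for a 70 kg, 1.75 m adult
    # Du Bois-type empirical formulas give roughly 1.85 m^2 for the same
    # inputs, i.e. within the +/-7% agreement reported in the abstract.
    ```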

  4. A theoretical basis for the analysis of redundant software subject to coincident errors

    NASA Technical Reports Server (NTRS)

    Eckhardt, D. E., Jr.; Lee, L. D.

    1985-01-01

    Fundamental to the development of redundant software techniques, such as fault-tolerant software, is an understanding of the impact of multiple joint occurrences of coincident errors. A theoretical basis for the study of redundant software is developed which provides a probabilistic framework for empirically evaluating the effectiveness of the general (N-Version) strategy when component versions are subject to coincident errors, and permits an analytical study of the effects of these errors. The basic assumptions of the model are: (1) independently designed software components are chosen in a random sample; and (2) in the user environment, the system is required to execute on a stationary input series. The intensity of coincident errors has a central role in the model. This function describes the propensity to introduce design faults in such a way that software components fail together when executing in the user environment. The model is used to give conditions under which an N-Version system is a better strategy for reducing system failure probability than relying on a single version of software. A condition which limits the effectiveness of a fault-tolerant strategy is studied, and it is asked whether system failure probability varies monotonically with increasing N or whether an optimal choice of N exists.
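
    The effect the model captures can be sketched with a small Monte Carlo experiment; the shared "hard input" mechanism below is one assumed way to induce coincident errors, standing in for the paper's intensity function:

    ```python
    import numpy as np

    # Majority-vote N-version failure probability when failures are
    # correlated through input difficulty: on "hard" inputs every version
    # is more likely to fail, so versions tend to fail together.
    rng = np.random.default_rng(0)
    trials = 200_000

    def p_system_failure(N, p_easy=0.01, p_hard=0.5, hard_frac=0.05):
        hard = rng.random(trials) < hard_frac       # difficulty per input
        p = np.where(hard, p_hard, p_easy)          # shared failure intensity
        fails = rng.random((N, trials)) < p         # correlated via shared p
        return (fails.sum(axis=0) > N // 2).mean()  # majority wrong

    for N in (1, 3, 5, 7):
        print(f"N = {N}: P(system failure) ~ {p_system_failure(N):.4f}")
    # With independent versions the failure probability would fall rapidly
    # in N; the shared component puts a floor under it, illustrating how
    # coincident errors can limit the benefit of adding versions.
    ```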

  5. A socio-relational framework of sex differences in the expression of emotion.

    PubMed

    Vigil, Jacob Miguel

    2009-10-01

    Despite a staggering body of research demonstrating sex differences in expressed emotion, very few theoretical models (evolutionary or non-evolutionary) offer a critical examination of the adaptive nature of such differences. From the perspective of a socio-relational framework, emotive behaviors evolved to promote the attraction and aversion of different types of relationships by advertising the two most parsimonious properties of reciprocity potential, or perceived attractiveness as a prospective social partner. These are the individual's (a) perceived capacity, or ability to provide expedient resources or to inflict immediate harm onto others, and (b) perceived trustworthiness, or probability of actually reciprocating altruism (Vigil 2007). Depending on the unique social demands and relational constraints under which each sex evolved, individuals should be disposed to advertise "capacity" and "trustworthiness" cues through selective displays of dominant versus submissive and masculine versus feminine emotive behaviors, respectively. In this article, I introduce the basic theoretical assumptions and hypotheses of the framework, and show how the models provide a solid scaffold with which to begin to interpret common sex differences in the emotional development literature. I conclude by describing how the framework can be used to predict condition-based and situation-based variation in affect and other forms of expressive behaviors.

  6. Genital Measures: Comments on Their Role in Understanding Human Sexuality

    ERIC Educational Resources Information Center

    Geer, James H.

    1976-01-01

    This paper discusses the use of genital measures in the study of both applied and basic work in human sexuality. Some of the advantages of psychophysiological measures are considered along with cautions concerning unwarranted assumptions. Some of the advances that are possible in both applied and basic work are examined. (Author)

  7. What proportion of prescription items dispensed in community pharmacies are eligible for the New Medicine Service?

    PubMed

    Wells, Katharine M; Boyd, Matthew J; Thornley, Tracey; Boardman, Helen F

    2014-03-07

    The payment structure for the New Medicine Service (NMS) in England is based on the assumption that 0.5% of prescription items dispensed in community pharmacies are eligible for the service. This assumption is based on a theoretical calculation. This study aimed to find out the actual proportion of prescription items eligible for the NMS dispensed in community pharmacies in order to compare this with the theoretical assumption. The study also aimed to investigate whether the proportion of prescription items eligible for the NMS is affected by pharmacies' proximity to GP practices. The study collected data from eight pharmacies in Nottingham belonging to the same large chain of pharmacies. Pharmacies were grouped by distance from the nearest GP practice and sampled to reflect the distribution by distance of all pharmacies in Nottingham. Data on one thousand consecutive prescription items were collected from each pharmacy and the number of NMS eligible items recorded. All NHS prescriptions were included in the sample. Data were analysed and proportions calculated with 95% confidence intervals used to compare the study results against the theoretical figure of 0.5% of prescription items being eligible for the NMS. A total of 8005 prescription items were collected (a minimum of 1000 items per pharmacy) of which 17 items were eligible to receive the service. The study found that 0.25% (95% confidence intervals: 0.14% to 0.36%) of prescription items were eligible for the NMS which differs significantly from the theoretical assumption of 0.5%. The opportunity rate for the service was lower, 0.21% (95% confidence intervals: 0.10% to 0.32%) of items, as some items eligible for the NMS did not translate into opportunities to offer the service. Of all the prescription items collected in the pharmacies, 28% were collected by patient representatives. The results of this study show that the proportion of items eligible for the NMS dispensed in community pharmacies is lower than the Department of Health assumption of 0.5%. This study did not find a significant difference in the rate of NMS opportunities between pharmacies located close to GP practices compared to those further away.

  8. Analytical Implications of Using Practice Theory in Workplace Information Literacy Research

    ERIC Educational Resources Information Center

    Moring, Camilla; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace…

  9. Centroid and Theoretical Rotation: Justification for Their Use in Q Methodology Research

    ERIC Educational Resources Information Center

    Ramlo, Sue

    2016-01-01

    This manuscript's purpose is to introduce Q as a methodology before providing clarification about the preferred factor analytical choices of centroid and theoretical (hand) rotation. Stephenson, the creator of Q, designated that only these choices allowed for scientific exploration of subjectivity while not violating assumptions associated with…

  10. NLPIR: A Theoretical Framework for Applying Natural Language Processing to Information Retrieval.

    ERIC Educational Resources Information Center

    Zhou, Lina; Zhang, Dongsong

    2003-01-01

    Proposes a theoretical framework called NLPIR that integrates natural language processing (NLP) into information retrieval (IR) based on the assumption that there exists representation distance between queries and documents. Discusses problems in traditional keyword-based IR, including relevance, and describes some existing NLP techniques…

  11. 39 Questionable Assumptions in Modern Physics

    NASA Astrophysics Data System (ADS)

    Volk, Greg

    2009-03-01

    The growing body of anomalies in new energy, low energy nuclear reactions, astrophysics, atomic physics, and entanglement, combined with the failure of the Standard Model and string theory to predict many of the most basic fundamental phenomena, all point to a need for major new paradigms. Not Band-Aids, but revolutionary new ways of conceptualizing physics, in the spirit of Thomas Kuhn's The Structure of Scientific Revolutions. This paper identifies a number of long-held but unproven assumptions currently being challenged by an increasing number of alternative scientists. Two common themes, both with venerable histories, keep recurring in the many alternative theories being proposed: (1) Mach's Principle, and (2) toroidal, vortex particles. Matter-based Mach's Principle differs from both space-based universal frames and observer-based Einsteinian relativity. Toroidal particles, in addition to explaining electron spin and the fundamental constants, satisfy the basic requirement of Gauss's misunderstood B Law, that motion itself circulates. Though a comprehensive theory is beyond the scope of this paper, it will suggest alternatives to the long list of assumptions in context.

  12. Application of the Recreation Opportunity Spectrum for Outdoor Recreation Planning on Army Installations.

    DTIC Science & Technology

    1982-03-01

    to preference types, and uses capacity estimation; therefore, it is basically a good system for recreation and resource inventory and classification...quantity, and distribution of recreational resources. Its basic unit of inventory is landform, or the homogeneity of physical features used to...by Clark and Stankey, "the basic assumption underlying the ROS is that quality recreational experiences are best assured by providing a diverse set of

  13. Extended physics as a theoretical framework for systems biology?

    PubMed

    Miquel, Paul-Antoine

    2011-08-01

    In this essay we examine whether a theoretical and conceptual framework for systems biology could be built from the Bailly and Longo (2008, 2009) proposal. These authors aim to understand life as a coherent critical structure, and propose to develop an extended physical approach to evolution, as a diffusion of biomass in a space of complexity. Their attempt leads to a simple mathematical reconstruction of Gould's assumption (1989) concerning the bacterial world as a "left wall of least complexity", which we examine. Extended physical systems are characterized by their constructive properties: time acts on them, and new properties emerge through their history, opening up the list of their initial properties. This conceptual and theoretical framework is nothing more than a philosophical assumption, but as such it provides a new and exciting approach to the evolution of life, and to the transition between physics and biology. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Change in Soil Porosity under Load

    NASA Astrophysics Data System (ADS)

    Dyba, V. P.; Skibin, E. G.

    2017-11-01

    The article considers the theoretical basis of the process of soil compaction under various loading paths and compares the theoretical assumptions with the results of tests of clay soil on a stabilometer. A variant of the critical state model of a compacting rigid-plastic medium, the strength characteristics of which depend on the porosity coefficient, is also considered. The loading surface is determined from the results of compression and stabilometric tests. To refine the results of this task, it is necessary to carry out stabilometric tests under conditions of simple loading, i.e. with the confining pressure proportional to the vertical pressure, σ3 = kσ1. Within the study, attempts were made to confirm the model presented at the beginning of the article by laboratory tests. Analysis of the results confirmed the stated theoretical assumptions.

  15. Methodological problems in the use of indirect comparisons for evaluating healthcare interventions: survey of published systematic reviews.

    PubMed

    Song, Fujian; Loke, Yoon K; Walsh, Tanya; Glenny, Anne-Marie; Eastwood, Alison J; Altman, Douglas G

    2009-04-03

    To investigate basic assumptions and other methodological problems in the application of indirect comparison in systematic reviews of competing healthcare interventions. Survey of published systematic reviews. Inclusion criteria: systematic reviews published between 2000 and 2007 in which an indirect approach had been explicitly used. Identified reviews were assessed for comprehensiveness of the literature search, method for indirect comparison, and whether assumptions about similarity and consistency were explicitly mentioned. The survey included 88 review reports. In 13 reviews, indirect comparison was informal. Results from different trials were naively compared without using a common control in six reviews. Adjusted indirect comparison was usually done using classic frequentist methods (n=49) or more complex methods (n=18). The key assumption of trial similarity was explicitly mentioned in only 40 of the 88 reviews. The consistency assumption was not explicit in most cases where direct and indirect evidence were compared or combined (18/30). Evidence from head-to-head comparison trials was not systematically searched for or not included in nine cases. Identified methodological problems were an unclear understanding of underlying assumptions, inappropriate search and selection of relevant trials, use of inappropriate or flawed methods, lack of objective and validated methods to assess or improve trial similarity, and inadequate comparison or inappropriate combination of direct and indirect evidence. Adequate understanding of the basic assumptions underlying indirect and mixed treatment comparisons is crucial to resolving these methodological problems.
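
    For readers unfamiliar with the adjusted indirect comparison the survey examines: in its classic frequentist form (usually attributed to Bucher et al., 1997), treatments A and B are each compared with a common control C, and the indirect A-versus-B effect is the difference of the two direct effects on the log scale. A minimal sketch (Python), with hypothetical inputs:

        import math

        def indirect_comparison(d_ac, se_ac, d_bc, se_bc):
            # d_* are direct effects vs the common control C, on the log
            # scale (e.g. log odds ratios); validity rests on the trial
            # similarity assumption discussed in the abstract above.
            d_ab = d_ac - d_bc
            se_ab = math.sqrt(se_ac**2 + se_bc**2)
            return d_ab, (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)

        estimate, ci = indirect_comparison(-0.40, 0.15, -0.10, 0.20)
        print(estimate, ci)   # indirect log(OR) of A vs B with its 95% CI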

  16. The basic aerodynamics of floatation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davies, M.J.; Wood, D.H.

    1983-09-01

    The original derivation of the basic theory governing the aerodynamics of both hovercraft and modern floatation ovens requires the validity of some extremely crude assumptions. However, the basic theory is surprisingly accurate. It is shown that this accuracy occurs because the final expression of the basic theory can be derived by approximating the full Navier-Stokes equations in a manner that clearly shows the limitations of the theory. These limitations are used in discussing the relatively small discrepancies between theory and experiment, which may not be significant for practical purposes.

  17. Idiographic versus Nomothetic Approaches to Research in Organizations.

    DTIC Science & Technology

    1981-07-01

    alternative methodologic assumption based on intensive examination of one or a few cases under the theoretic assumption of dynamic interactionism is, with...phenomenological studies the researcher may not enter the actual setting but instead examines symbolic meanings as they constitute themselves in...B. Interactionism in personality from a historical perspective. Psychological Bulletin, 1974, 81, 1026-1048. Elashoff, J.D.; & Thoresen, C.E

  18. Facilitative Dimensions in Interpersonal Relations: Verifying the Theoretical Assumptions of Carl Rogers in School, Family Education, Client-Centered Therapy, and Encounter Groups

    ERIC Educational Resources Information Center

    Tausch, Reinhard

    1978-01-01

    Summarized numerous different projects which investigated assumptions made by Carl Rogers about the necessary and sufficient conditions for significant positive change in person-to-person contact. Findings agree with Rogers's views on the importance of empathy, genuineness, and respect. Presented at the Thirtieth Congress of the Deutsche Gesellschaft für…

  19. Individual Change and the Timing and Onset of Important Life Events: Methods, Models, and Assumptions

    ERIC Educational Resources Information Center

    Grimm, Kevin; Marcoulides, Katerina

    2016-01-01

    Researchers are often interested in studying how the timing of a specific event affects concurrent and future development. When faced with such research questions there are multiple statistical models to consider and those models are the focus of this paper as well as their theoretical underpinnings and assumptions regarding the nature of the…

  20. Estimation Methods for Non-Homogeneous Regression - Minimum CRPS vs Maximum Likelihood

    NASA Astrophysics Data System (ADS)

    Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.; Zeileis, Achim

    2017-04-01

    Non-homogeneous regression models are widely used to statistically post-process numerical weather prediction models. Such regression models correct for errors in mean and variance and are capable of forecasting a full probability distribution. In order to estimate the corresponding regression coefficients, CRPS minimization has been performed in many meteorological post-processing studies over the last decade. In contrast to maximum likelihood estimation, CRPS minimization is claimed to yield more calibrated forecasts. Theoretically, both scoring rules used as optimization criteria should be able to locate a similar and unknown optimum. Discrepancies might result from a wrong distributional assumption about the observed quantity. To address this theoretical concept, this study compares maximum likelihood and minimum CRPS estimation for different distributional assumptions. First, a synthetic case study shows that, for an appropriate distributional assumption, both estimation methods yield similar regression coefficients. The log-likelihood estimator is slightly more efficient. A real-world case study for surface temperature forecasts at different sites in Europe confirms these results but shows that surface temperature does not always follow the classical assumption of a Gaussian distribution. KEYWORDS: ensemble post-processing, maximum likelihood estimation, CRPS minimization, probabilistic temperature forecasting, distributional regression models
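
    The contrast between the two estimators is easy to reproduce. A minimal sketch (Python) for a Gaussian predictive distribution, using the standard closed-form Gaussian CRPS; the constant-parameter toy model and synthetic data are assumptions for illustration, not the study's setup:

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        y = rng.normal(2.0, 1.5, size=500)      # synthetic observations

        def mean_crps(params, y):
            # closed-form CRPS of N(mu, sigma^2), averaged over the sample
            mu, log_sigma = params
            sigma = np.exp(log_sigma)
            z = (y - mu) / sigma
            crps = sigma * (z * (2 * norm.cdf(z) - 1)
                            + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))
            return crps.mean()

        def neg_loglik(params, y):
            mu, log_sigma = params
            return -norm.logpdf(y, mu, np.exp(log_sigma)).sum()

        for score in (mean_crps, neg_loglik):
            fit = minimize(score, x0=[0.0, 0.0], args=(y,))
            print(score.__name__, fit.x[0], np.exp(fit.x[1]))
        # With a correct distributional assumption, both criteria recover
        # similar (mu, sigma), echoing the synthetic case study above.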

  1. Mediating objects: scientific and public functions of models in nineteenth-century biology.

    PubMed

    Ludwig, David

    2013-01-01

    The aim of this article is to examine the scientific and public functions of two- and three-dimensional models in the context of three episodes from nineteenth-century biology. I argue that these models incorporate both data and theory by presenting theoretical assumptions in the light of concrete data or organizing data through theoretical assumptions. Despite their diverse roles in scientific practice, they all can be characterized as mediators between data and theory. Furthermore, I argue that these different mediating functions often reflect their different audiences that included specialized scientists, students, and the general public. In this sense, models in nineteenth-century biology can be understood as mediators between theory, data, and their diverse audiences.

  2. 5 CFR 841.502 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... EMPLOYEES RETIREMENT SYSTEM-GENERAL ADMINISTRATION Employee Deductions and Government Contributions § 841... standards (using dynamic assumptions) and expressed as a level percentage of aggregate basic pay. Normal...

  3. Experimental investigation of two-phase heat transfer in a porous matrix.

    NASA Technical Reports Server (NTRS)

    Von Reth, R.; Frost, W.

    1972-01-01

    One-dimensional two-phase flow transpiration cooling through porous metal is studied experimentally. The experimental data are compared with a previous one-dimensional analysis. Good agreement with the calculated temperature distribution is obtained as long as the basic assumptions of the analytical model are satisfied. Deviations from the basic assumptions are caused by nonhomogeneous and oscillating flow conditions. A preliminary derivation of nondimensional parameters which characterize the stable and unstable flow conditions is given. Superheated liquid droplets observed sputtering from the heated surface indicated incomplete evaporation at heat fluxes well in excess of the latent energy transport. A parameter is developed to account for the nonequilibrium thermodynamic effects. Measured and calculated pressure drops show contradicting trends, which are attributed to capillary forces.

  4. An Extension of the Partial Credit Model with an Application to the Measurement of Change.

    ERIC Educational Resources Information Center

    Fischer, Gerhard H.; Ponocny, Ivo

    1994-01-01

    An extension to the partial credit model, the linear partial credit model, is considered under the assumption of a certain linear decomposition of the item × category parameters into basic parameters. A conditional maximum likelihood algorithm for estimating basic parameters is presented and illustrated with simulation and an empirical study. (SLD)
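
    A worked form of the decomposition may help. In the linear partial credit model, the item-by-category parameters are constrained to be linear combinations of a smaller set of basic parameters (a sketch in common Rasch-model notation, which is an assumption here rather than the paper's own):

        \beta_{ih} = \sum_{l=1}^{p} w_{ihl}\,\eta_l , \qquad i = 1,\dots,k; \; h = 1,\dots,m_i ,

    where \beta_{ih} is the parameter of category h of item i, the \eta_l are basic parameters (e.g., treatment or change effects), and the w_{ihl} are fixed, known weights. Conditional maximum likelihood then conditions on person total scores so that the person parameters drop out of the estimation.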

  5. The Not So Common Sense: Differences in How People Judge Social and Political Life.

    ERIC Educational Resources Information Center

    Rosenberg, Shawn W.

    This interdisciplinary book challenges two basic assumptions that orient much contemporary social scientific thinking. Offering theory and empirical research, the book rejects the classic liberal view that people share a basic common sense or rationality, while at the same time questioning the view of contemporary social theory that meaning is…

  6. How Mean is the Mean?

    PubMed Central

    Speelman, Craig P.; McGann, Marek

    2013-01-01

    In this paper we voice concerns about the uncritical manner in which the mean is often used as a summary statistic in psychological research. We identify a number of implicit assumptions underlying the use of the mean and argue that the fragility of these assumptions should be more carefully considered. We examine some of the ways in which the potential violation of these assumptions can lead us into significant theoretical and methodological error. Illustrations of alternative models of research already extant within psychology are used to explore methods of research that are less mean-dependent, and we suggest that a critical assessment of the assumptions underlying the use of the mean should play a more explicit role in the processes of study design and review. PMID:23888147

  7. Adolescent Egocentrism and Formal Operations: Tests of a Theoretical Assumption.

    ERIC Educational Resources Information Center

    Lapsley, David K.; And Others

    1986-01-01

    Describes two studies of the theoretical relation between adolescent egocentrism and formal operations. Study 1 used the Adolescent Egocentrism Scale (AES) and Lunzer's battery of formal reasoning tasks to assess 183 adolescents. Study 2 administered the AES, the Imaginary Audience Scale (IAS), and the Test of Logical Thinking to 138 adolescents.…

  8. Ever since language and learning: afterthoughts on the Piaget-Chomsky debate.

    PubMed

    Piattelli-Palmarini, M

    1994-01-01

    The central arguments and counter-arguments presented by several participants during the debate between Piaget and Chomsky at the Royaumont Abbey in October 1975 are here reconstructed in a particularly concise chronological and "logical" sequence. Once the essential points of this important exchange are thus clearly laid out, it is easy to see that recent developments in generative grammar, as well as new data on language acquisition, especially the acquisition of pronouns by the congenitally deaf child, corroborate the "language specificity" thesis defended by Chomsky. By the same token these data and these new theoretical refinements refute the Piagetian hypothesis that language is constructed upon abstractions from sensorimotor schemata. Moreover, in the light of modern evolutionary theory, Piaget's basic assumptions on the biological roots of cognition, language and learning turn out to be unfounded. In hindsight, all this accrues to the validity of Fodor's seemingly "paradoxical" argument against "learning" as a transition from "less" powerful to "more" powerful conceptual systems.

  9. On computing stress in polymer systems involving multi-body potentials from molecular dynamics simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Yao, E-mail: fu5@mailbox.sc.edu, E-mail: jhsong@cec.sc.edu; Song, Jeong-Hoon, E-mail: fu5@mailbox.sc.edu, E-mail: jhsong@cec.sc.edu

    2014-08-07

    Hardy stress definition has been restricted to pair potentials and embedded-atom method potentials due to the basic assumptions in the derivation of a symmetric microscopic stress tensor. Force decomposition required in the Hardy stress expression becomes obscure for multi-body potentials. In this work, we demonstrate the invariance of the Hardy stress expression for a polymer system modeled with multi-body interatomic potentials including up to four-atom interactions, by applying central force decomposition of the atomic force. The balance of momentum has been demonstrated to be valid theoretically and tested under various numerical simulation conditions. The validity of momentum conservation justifies the extension of the Hardy stress expression to multi-body potential systems. The computed Hardy stress has been observed to converge to the virial stress of the system with increasing spatial averaging volume. This work provides a feasible and reliable linkage between the atomistic and continuum scales for multi-body potential systems.
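
    For orientation, a schematic of the Hardy estimate in the pair-interaction case (the notation follows the general continuum-atomistic literature and is an assumption here, not a quotation from the paper):

        \boldsymbol{\sigma}(\mathbf{x}) = -\sum_i m_i\, \mathbf{u}_i \otimes \mathbf{u}_i\, \psi(\mathbf{x}_i - \mathbf{x}) + \frac{1}{2} \sum_i \sum_{j \neq i} \mathbf{f}_{ij} \otimes \mathbf{x}_{ij}\, B_{ij}(\mathbf{x}) ,

    where \psi is a localization (weighting) function, \mathbf{u}_i the velocity fluctuation of atom i, \mathbf{x}_{ij} = \mathbf{x}_i - \mathbf{x}_j, and B_{ij} the bond function obtained by integrating \psi along the i-j segment. A central force decomposition rewrites the force arising from a multi-body potential as a sum of pairwise terms \mathbf{f}_{ij} directed along \mathbf{x}_{ij}, which is what lets the expression above carry over.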

  10. Automated Routines for Calculating Whole-Stream Metabolism: Theoretical Background and User's Guide

    USGS Publications Warehouse

    Bales, Jerad D.; Nardi, Mark R.

    2007-01-01

    In order to standardize methods and facilitate rapid calculation and archival of stream-metabolism variables, the Stream Metabolism Program was developed to calculate gross primary production, net ecosystem production, respiration, and selected other variables from continuous measurements of dissolved-oxygen concentration, water temperature, and other user-supplied information. Methods for calculating metabolism from continuous measurements of dissolved-oxygen concentration and water temperature are fairly well known, but a standard set of procedures and computation software for all aspects of the calculations were not available previously. The Stream Metabolism Program addresses this deficiency with a stand-alone executable computer program written in Visual Basic.NET, which runs in the Microsoft Windows environment. All equations and assumptions used in the development of the software are documented in this report. Detailed guidance on application of the software is presented, along with a summary of the data required to use the software. Data from either a single station or paired (upstream, downstream) stations can be used with the software to calculate metabolism variables.

  11. On the Shallow Processing (Dis)Advantage: Grammar and Economy.

    PubMed

    Koornneef, Arnout; Reuland, Eric

    2016-01-01

    In the psycholinguistic literature it has been proposed that readers and listeners often adopt a "good-enough" processing strategy in which a "shallow" representation of an utterance, driven by (top-down) extra-grammatical processes, has a processing advantage over a "deep" (bottom-up) grammatically driven representation of that same utterance. In the current contribution we claim, on both theoretical and experimental grounds, that this proposal is overly simplistic. Most importantly, in the domain of anaphora there is now an accumulating body of evidence showing that the anaphoric dependencies between (reflexive) pronominals and their antecedents are subject to an economy hierarchy. In this economy hierarchy, deriving anaphoric dependencies by deep, grammatical operations incurs lower processing costs than doing so by shallow, extra-grammatical operations. In addition, in cases of ambiguity, when both a shallow and a deep derivation are available to the parser, the latter is actually preferred. This, we argue, contradicts the basic assumptions of the shallow-deep dichotomy and, hence, a rethinking of the good-enough processing framework is warranted.

  12. [Conversation analysis for improving nursing communication].

    PubMed

    Yi, Myungsun

    2007-08-01

    Nursing communication has become more important than ever before because the quality of nursing services largely depends on the quality of communication in a very competitive health care environment. The aim of this article was to introduce ways to improve nursing communication using conversation analysis. This was a review study of conversation analysis, critically examining previous studies in nursing communication and interpersonal relationships. The study provided the theoretical background and basic assumptions of conversation analysis, which was influenced by ethnomethodology, phenomenology, and sociolinguistics. In addition, the characteristics and analysis methods of conversation analysis were illustrated in detail. Lastly, it was shown how conversation analysis can help improve communication, by examining research that has used conversation analysis not only for ordinary conversations but also for extraordinary or difficult conversations, such as those between patients with dementia and their professional nurses. Conversation analysis can help improve nursing communication by providing various structures and patterns as well as prototypes of conversation, and by suggesting specific problems and problem-solving strategies in communication.

  13. The British welfare state and mental health problems: the continuing relevance of the work of Claus Offe.

    PubMed

    Pilgrim, David

    2012-09-01

    It is now over thirty years since Claus Offe theorised the crisis tendencies of the welfare state in late capitalism. As part of that work he explored ongoing and irresolvable forms of crisis management in parliamentary democracies: capitalism cannot live with the welfare state but also cannot live without it. This article examines the continued relevance of this analysis by Offe, by applying its basic assumptions to the response of the British welfare state to mental health problems at the turn of the twenty-first century. His general theoretical abstractions are tested against the empirical picture of mental health service priorities, evident since the 1980s, in sections dealing with: re-commodification tendencies; the ambiguity of wage labour in the mental health workforce; the emergence of new social movements; and the limits of legalism. © 2012 The Author. Sociology of Health & Illness © 2012 Foundation for the Sociology of Health & Illness/Blackwell Publishing Ltd.

  14. Validity in work-based assessment: expanding our horizons.

    PubMed

    Govaerts, Marjan; van der Vleuten, Cees P M

    2013-12-01

    Although work-based assessments (WBA) may come closest to assessing habitual performance, their use for summative purposes is not undisputed. Most criticism of WBA stems from approaches to validity consistent with the quantitative psychometric framework. However, there is increasing research evidence that indicates that the assumptions underlying the predictive, deterministic framework of psychometrics may no longer hold. In this discussion paper we argue that meaningfulness and appropriateness of current validity evidence can be called into question and that we need alternative strategies to assessment and validity inquiry that build on current theories of learning and performance in complex and dynamic workplace settings. Drawing from research in various professional fields we outline key issues within the mechanisms of learning, competence and performance in the context of complex social environments and illustrate their relevance to WBA. In reviewing recent socio-cultural learning theory and research on performance and performance interpretations in work settings, we demonstrate that learning, competence (as inferred from performance) as well as performance interpretations are to be seen as inherently contextualised, and can only be understood 'in situ'. Assessment in the context of work settings may, therefore, be more usefully viewed as a socially situated interpretive act. We propose constructivist-interpretivist approaches towards WBA in order to capture and understand contextualised learning and performance in work settings. Theoretical assumptions underlying interpretivist assessment approaches call for a validity theory that provides the theoretical framework and conceptual tools to guide the validation process in the qualitative assessment inquiry. Basic principles of rigour specific to qualitative research have been established, and they can and should be used to determine validity in interpretivist assessment approaches. If used properly, these strategies generate trustworthy evidence that is needed to develop the validity argument in WBA, allowing for in-depth and meaningful information about professional competence. © 2013 John Wiley & Sons Ltd.

  15. Network-level reproduction number and extinction threshold for vector-borne diseases.

    PubMed

    Xue, Ling; Scoglio, Caterina

    2015-06-01

    The basic reproduction number of deterministic models is an essential quantity for predicting whether an epidemic will spread or not. Thresholds for disease extinction contribute crucial knowledge for the control, elimination, and mitigation of infectious diseases. Relationships between the basic reproduction numbers of two deterministic network-based ordinary differential equation vector-host models and the extinction thresholds of the corresponding stochastic continuous-time Markov chain models are derived under some assumptions. Numerical simulation results for malaria and Rift Valley fever transmission on heterogeneous networks are in agreement with analytical results without any assumptions, reinforcing that the relationships may always exist and posing a mathematical problem of proving their existence in general. Moreover, numerical simulations show that the basic reproduction number does not monotonically increase or decrease with the extinction threshold. Consistent trends of extinction probability observed through numerical simulations provide novel insights into mitigation strategies to increase the disease extinction probability. These findings may improve understanding of thresholds for disease persistence in order to control vector-borne diseases.
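
    As a pointer to how such reproduction numbers are computed for the deterministic models: R0 is the spectral radius of the next-generation matrix. A minimal sketch (Python) for a generic two-compartment vector-host linearization; the structure and parameter values are illustrative assumptions, not the paper's network model:

        import numpy as np

        # Linearized infection subsystem:
        #   I_h' = b1*I_v - gamma*I_h   (hosts infected by vectors)
        #   I_v' = b2*I_h - mu*I_v      (vectors infected by hosts)
        b1, b2, gamma, mu = 0.3, 0.2, 0.1, 0.25

        F = np.array([[0.0, b1], [b2, 0.0]])   # new-infection terms
        V = np.diag([gamma, mu])               # outflow/transition terms
        R0 = max(abs(np.linalg.eigvals(F @ np.linalg.inv(V))))
        print(R0)   # equals sqrt(b1*b2/(gamma*mu)) for this 2x2 case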

  16. Patient Autonomy in a High-Tech Care Context - A Theoretical Framework.

    PubMed

    Lindberg, Catharina; Fagerström, Cecilia; Willman, Ania

    2018-06-12

    To synthesise and interpret previous findings with the aim of developing a theoretical framework for patient autonomy in a high-tech care context. Putting the somewhat abstract concept of patient autonomy into practice can prove difficult since when it is highlighted in healthcare literature the patient perspective is often invisible. Autonomy presumes that a person has experience, education, self-discipline and decision-making capacity. Reference to autonomy in relation to patients in high-tech care environments could therefore be considered paradoxical, as in most cases these persons are vulnerable, with impaired physical and/or metacognitive capacity, thus making extended knowledge of patient autonomy for these persons even more important. Theory development. The basic approaches in theory development by Walker and Avant were used to create a theoretical framework through an amalgamation of the results from three qualitative studies conducted previously by the same research group. A theoretical framework - the control-partnership-transition framework - was delineated disclosing different parts co-creating the prerequisites for patient autonomy in high-tech care environments. Assumptions and propositional statements that guide theory development were also outlined, as were guiding principles for use in day-to-day nursing care. Four strategies used by patients were revealed: the strategy of control, the strategy of partnership, the strategy of trust, and the strategy of transition. An extended knowledge base, founded on theoretical reasoning about patient autonomy, could facilitate nursing care that would allow people to remain/become autonomous in the role of patient in high-tech care environments. The control-partnership-transition framework would be of help in supporting and defending patient autonomy when caring for individual patients, as it provides an understanding of the strategies employed by patients to achieve autonomy in high-tech care contexts. The guiding principles for patient autonomy presented could be used in nursing guidelines. This article is protected by copyright. All rights reserved.

  17. The Federal Role and Chapter 1: Rethinking Some Basic Assumptions.

    ERIC Educational Resources Information Center

    Kirst, Michael W.

    In the 20 years since the major Federal program for the disadvantaged began, surprisingly little has changed from its original vision. It is now time to question some of the basic policies of Chapter 1 of the Education Consolidation and Improvement Act in view of the change in conceptions about the Federal role and the recent state and local…

  18. Achieving Successful Employment Outcomes with the Use of Assistive Technology. Report from the Study Group, Institute on Rehabilitation Issues (24th, Washington, DC, May 1998).

    ERIC Educational Resources Information Center

    Radtke, Jean, Ed.

    Developed as a result of an institute on rehabilitation issues, this document is a guide to assistive technology as it affects successful competitive employment outcomes for people with disabilities. Chapter 1 offers basic information on assistive technology including basic assumptions, service provider approaches, options for technology…

  19. 5 CFR 842.702 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... for valuation of the System, based on dynamic assumptions. The present value factors are unisex... EMPLOYEES RETIREMENT SYSTEM-BASIC ANNUITY Alternative Forms of Annuities § 842.702 Definitions. In this...

  20. Sense of Community as Construct and Theory: Authors' Response to McMillan

    ERIC Educational Resources Information Center

    Nowell, Branda; Boyd, Neil

    2011-01-01

    In this article, we respond to criticisms posed by McMillan (2011) of our recent paper, "Viewing Community as Responsibility as well as a Resource: Deconstructing the Theoretical Roots of Psychological Sense of Community." We clarify that the focus of our article was to explore the macro theoretical frameworks and second-order assumptions that…

  1. Argumentation and Participation in the Primary Mathematics Classroom: Two Episodes and Related Theoretical Abductions

    ERIC Educational Resources Information Center

    Krummheuer, Gotz

    2007-01-01

    The main assumption of this article is that learning mathematics depends on the student's participation in processes of collective argumentation. On the empirical level, such processes will be analyzed with Toulmin's theory of argumentation and Goffman's idea of decomposition of the speaker's role. On the theoretical level, different statuses of…

  2. Social Representations of the Development of Intelligence, Parental Values and Parenting Styles: A Theoretical Model for Analysis

    ERIC Educational Resources Information Center

    Miguel, Isabel; Valentim, Joaquim Pires; Carugati, Felice

    2013-01-01

    Within the theoretical framework of social representations theory, a substantial body of literature has advocated and shown that, as interpretative systems and forms of knowledge concurring in the construction of a social reality, social representations are guides for action, influencing behaviours and social relations. Based on this assumption,…

  3. Steady-state heat conduction in quiescent fluids: Incompleteness of the Navier-Stokes-Fourier equations

    NASA Astrophysics Data System (ADS)

    Brenner, Howard

    2011-10-01

    Linear irreversible thermodynamic principles are used to demonstrate, by counterexample, the existence of a fundamental incompleteness in the basic pre-constitutive mass, momentum, and energy equations governing fluid mechanics and transport phenomena in continua. The demonstration is effected by addressing the elementary case of steady-state heat conduction (and transport processes in general) occurring in quiescent fluids. The counterexample questions the universal assumption of equality of the four physically different velocities entering into the basic pre-constitutive mass, momentum, and energy conservation equations. Explicitly, it is argued that such equality is an implicit constitutive assumption rather than an established empirical fact of unquestioned authority. Such equality, if indeed true, would require formal proof of its validity, currently absent from the literature. In fact, our counterexample shows the assumption of equality to be false. As the current set of pre-constitutive conservation equations appearing in textbooks are regarded as applicable both to continua and noncontinua (e.g., rarefied gases), our elementary counterexample negating belief in the equality of all four velocities impacts on all aspects of fluid mechanics and transport processes, continua and noncontinua alike.

  4. A Test of Major Assumptions about Behavior Change: A Comprehensive Look at the Effects of Passive and Active HIV-Prevention Interventions Since the Beginning of the Epidemic

    ERIC Educational Resources Information Center

    Albarracin, Dolores; Gillette, Jeffrey C.; Earl, Allison N.; Glasman, Laura R.; Durantini, Marta R.; Ho, Moon-Ho

    2005-01-01

    This meta-analysis tested the major theoretical assumptions about behavior change by examining the outcomes and mediating mechanisms of different preventive strategies in a sample of 354 HIV-prevention interventions and 99 control groups, spanning the past 17 years. There were 2 main conclusions from this extensive review. First, the most…

  5. Model error in covariance structure models: Some implications for power and Type I error

    PubMed Central

    Coffman, Donna L.

    2010-01-01

    The present study investigated the degree to which violation of the parameter drift assumption affects the Type I error rate for the test of close fit and power analysis procedures proposed by MacCallum, Browne, and Sugawara (1996) for both the test of close fit and the test of exact fit. The parameter drift assumption states that as sample size increases both sampling error and model error (i.e. the degree to which the model is an approximation in the population) decrease. Model error was introduced using a procedure proposed by Cudeck and Browne (1992). The empirical power for both the test of close fit, in which the null hypothesis specifies that the Root Mean Square Error of Approximation (RMSEA) ≤ .05, and the test of exact fit, in which the null hypothesis specifies that RMSEA = 0, is compared with the theoretical power computed using the MacCallum et al. (1996) procedure. The empirical power and theoretical power for both the test of close fit and the test of exact fit are nearly identical under violations of the assumption. The results also indicated that the test of close fit maintains the nominal Type I error rate under violations of the assumption. PMID:21331302
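
    The power computation at issue follows the noncentral chi-square framing of MacCallum, Browne, and Sugawara (1996), in which model fit tests are indexed by RMSEA (epsilon). A minimal sketch (Python); the sample size and degrees of freedom are conventional illustrative assumptions:

        from scipy.stats import ncx2

        n, df, alpha = 200, 40, 0.05
        eps0, eps_a = 0.05, 0.08            # H0 (close fit) vs alternative
        nc0 = (n - 1) * df * eps0**2        # noncentrality under H0
        nc_a = (n - 1) * df * eps_a**2      # noncentrality under Ha
        crit = ncx2.ppf(1 - alpha, df, nc0)
        power = 1 - ncx2.cdf(crit, df, nc_a)
        print(round(power, 3))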

  6. The effect of errors in the assignment of the transmission functions on the accuracy of the thermal sounding of the atmosphere

    NASA Technical Reports Server (NTRS)

    Timofeyev, Y. M.

    1979-01-01

    In order to test the error introduced by the values of the transmission function assumed for Soviet and American radiometers sounding the atmosphere thermally from orbiting satellites, the assumptions of the transmission calculation are varied with respect to atmospheric CO2 content, transmission frequency, and atmospheric absorption. The error arising from variations of the assumptions from the standard basic model is calculated.

  7. Student Services: Programs and Functions. A Report on the Administration of Selected Student and Campus Services of the University of Illinois at Chicago Circle. Part 1 and 2.

    ERIC Educational Resources Information Center

    Bentz, Robert P.; And Others

    A commuter institution is one to which students commute. The two basic assumptions of this study are: (1) the Chicago Circle campus of the University of Illinois will remain a commuter institution during the decade ahead; and (2) the campus will increasingly serve a more heterogeneous student body. These assumptions have important implications for…

  8. Effect of body size and body mass on δ13C and δ15N in coastal fishes and cephalopods

    NASA Astrophysics Data System (ADS)

    Vinagre, C.; Máguas, C.; Cabral, H. N.; Costa, M. J.

    2011-11-01

    Carbon and nitrogen isotopes have been widely used in the investigation of trophic relations, energy pathways, trophic levels, and migrations, under the assumption that δ13C is independent of body size and that variation in δ15N occurs exclusively due to ontogenetic changes in diet and not body size increase per se. However, several studies have shown that these assumptions are uncertain. Data from food webs containing a large number of species lack theoretical support for these assumptions because very few species have been tested for δ13C and δ15N variation in captivity. However, if sampling comprises a wide range of body sizes from various species, the variation of δ13C and δ15N with body size can be investigated. While a correlation between body size and δ13C or δ15N can be due to ontogenetic diet shifts, stability in these values throughout the size spectrum can be considered an indication that δ13C and δ15N in the muscle tissue of such species is independent of body size within that size range, and thus the basic assumptions can be applied in the interpretation of such food webs. The present study investigated the variation in muscle δ13C and δ15N with body size and body mass of coastal fishes and cephalopods. It was concluded that muscle δ13C and δ15N did not vary with body size or mass for all bony fishes with only one exception, the dragonet Callionymus lyra. Muscle δ13C and δ15N also did not vary with body size or mass in cartilaginous fishes and cephalopods, meaning that body size/mass per se has no effect on δ13C or δ15N for most species analysed and within the size ranges sampled. The assumption that δ13C is independent of body size and that variation in δ15N is not affected by body size increase per se was upheld for most organisms and can be applied to the coastal food web studied, taking into account that C. lyra is an exception.
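
    A minimal sketch (Python) of the kind of per-species check described above: regress the isotope signal on body size and test whether the slope differs from zero. The data are synthetic placeholders generated with no true size effect; the paper's measurements are not reproduced here.

        import numpy as np
        from scipy.stats import linregress

        rng = np.random.default_rng(1)
        size_cm = rng.uniform(5, 40, 60)          # body sizes of one species
        d15n = 12.0 + rng.normal(0, 0.4, 60)      # no built-in size effect

        fit = linregress(size_cm, d15n)
        print(f"slope={fit.slope:.3f}, p={fit.pvalue:.2f}")
        # A non-significant slope over the sampled range supports treating
        # d15N as independent of body size for that species.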

  9. Pulsational mode fluctuations and their basic conservation laws

    NASA Astrophysics Data System (ADS)

    Borah, B.; Karmakar, P. K.

    2015-01-01

    We propose a theoretical hydrodynamic model for investigating the basic features of nonlinear pulsational mode stability in a partially charged dust molecular cloud within the framework of the Jeans homogenization assumption. The inhomogeneous cloud is modeled as a quasi-neutral multifluid consisting of warm electrons, warm ions, and identical inertial cold dust grains with partial ionization in a neutral gaseous background. The grain charge is assumed not to vary on the time scale of the fluctuation evolution. The active inertial roles of the thermal species are included. We apply a standard multiple scaling technique centered on the gravito-electrostatic equilibrium to understand the fluctuations on astrophysical scales of space and time. It is found that electrostatic and self-gravitational eigenmodes co-exist as diverse solitary spectral patterns governed by a pair of Korteweg-de Vries (KdV) equations. In addition, all the relevant classical conserved quantities associated with the KdV system under translational invariance are methodologically derived and numerically analyzed. A full numerical shape-analysis of the fluctuations, scale lengths and perturbed densities with multi-parameter variation of judicious plasma conditions is carried out. A correlation of the perturbed densities and gravito-electrostatic spectral patterns is also graphically indicated. It is demonstrated that the solitary mass, momentum and energy densities also evolve like solitary spectral patterns which remain conserved throughout the spatiotemporal scales of the fluctuation dynamics. Astrophysical and space environments significant to our results are briefly highlighted.

  10. A guide to understanding social science research for natural scientists.

    PubMed

    Moon, Katie; Blackman, Deborah

    2014-10-01

    Natural scientists are increasingly interested in social research because they recognize that conservation problems are commonly social problems. Interpreting social research, however, requires at least a basic understanding of the philosophical principles and theoretical assumptions of the discipline, which are embedded in the design of social research. Natural scientists who engage in social science but are unfamiliar with these principles and assumptions can misinterpret their results. We developed a guide to assist natural scientists in understanding the philosophical basis of social science to support the meaningful interpretation of social research outcomes. The 3 fundamental elements of research are ontology, what exists in the human world that researchers can acquire knowledge about; epistemology, how knowledge is created; and philosophical perspective, the philosophical orientation of the researcher that guides her or his action. Many elements of the guide also apply to the natural sciences. Natural scientists can use the guide to assist them in interpreting social science research to determine how the ontological position of the researcher can influence the nature of the research; how the epistemological position can be used to support the legitimacy of different types of knowledge; and how philosophical perspective can shape the researcher's choice of methods and affect interpretation, communication, and application of results. The use of this guide can also support and promote the effective integration of the natural and social sciences to generate more insightful and relevant conservation research outcomes. © 2014 Society for Conservation Biology.

  11. Deciphering flood frequency curves from a coupled human-nature system perspective

    NASA Astrophysics Data System (ADS)

    Li, H. Y.; Abeshu, G. W.; Wang, W.; Ye, S.; Guo, J.; Bloeschl, G.; Leung, L. R.

    2017-12-01

    Most previous studies deriving or applying flood frequency curves (FFCs) are underpinned by the stationarity assumption. To examine the theoretical robustness of this basic assumption, we analyzed the observed FFCs at hundreds of catchments in the contiguous United States along gradients of climate conditions and human influences. The shape of FFCs is described using three similarity indices: mean annual flood (MAF), coefficient of variation (CV), and a seasonality index defined using circular statistics. The characteristics of catchments are quantified with a small number of dimensionless indices, including in particular: (1) the climatic aridity index, AI, a measure of the competition between energy and water availability; and (2) a reservoir impact index, defined as the total upstream reservoir storage capacity normalized by the annual streamflow volume. The linkages between these two sets of indices are then explored based on a combination of mathematical derivations from the Budyko formula, simple but physically based reservoir operation models, and other auxiliary data. It is found that the shape of FFCs shifts from arid to humid climates, and from periods with weak human influences to periods with strong influences. The seasonality of floods is found to be largely controlled by the synchronization between the seasonal cycles of precipitation and solar radiation in pristine catchments, but also by the reservoir regulation capacity in managed catchments. Our findings may help improve flood-risk assessment and mitigation in both natural and regulated river systems across various climate gradients.
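
    The seasonality index mentioned above is typically built from circular statistics. A minimal sketch (Python) of one common mean-vector convention, which is an assumption here since the abstract does not give the formula:

        import numpy as np

        def flood_seasonality(days_of_year, year_length=365.25):
            # Map flood dates onto the unit circle and average the vectors.
            theta = 2 * np.pi * np.asarray(days_of_year) / year_length
            x, y = np.cos(theta).mean(), np.sin(theta).mean()
            strength = np.hypot(x, y)        # 0 = uniform, 1 = a fixed date
            mean_day = ((np.arctan2(y, x) % (2 * np.pi))
                        * year_length / (2 * np.pi))
            return strength, mean_day

        print(flood_seasonality([150, 160, 170, 145, 155]))  # late-May cluster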

  12. Implicit assumptions underlying simple harvest models of marine bird populations can mislead environmental management decisions.

    PubMed

    O'Brien, Susan H; Cook, Aonghais S C P; Robinson, Robert A

    2017-10-01

    Assessing the potential impact of additional mortality from anthropogenic causes on animal populations requires detailed demographic information. However, these data are frequently lacking, making simple algorithms, which require little data, appealing. Because of their simplicity, these algorithms often rely on implicit assumptions, some of which may be quite restrictive. Potential Biological Removal (PBR) is a simple harvest model that estimates the number of additional mortalities that a population can theoretically sustain without causing population extinction. However, PBR relies on a number of implicit assumptions, particularly around density dependence and population trajectory that limit its applicability in many situations. Among several uses, it has been widely employed in Europe in Environmental Impact Assessments (EIA), to examine the acceptability of potential effects of offshore wind farms on marine bird populations. As a case study, we use PBR to estimate the number of additional mortalities that a population with characteristics typical of a seabird population can theoretically sustain. We incorporated this level of additional mortality within Leslie matrix models to test assumptions within the PBR algorithm about density dependence and current population trajectory. Our analyses suggest that the PBR algorithm identifies levels of mortality which cause population declines for most population trajectories and forms of population regulation. Consequently, we recommend that practitioners do not use PBR in an EIA context for offshore wind energy developments. Rather than using simple algorithms that rely on potentially invalid implicit assumptions, we recommend use of Leslie matrix models for assessing the impact of additional mortality on a population, enabling the user to explicitly define assumptions and test their importance. Copyright © 2017 Elsevier Ltd. All rights reserved.
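
    For reference, the PBR algorithm itself is a one-line formula, here in the standard form due to Wade (1998); the seabird-like parameter values below are hypothetical:

        def pbr(n_min: float, r_max: float, f_r: float) -> float:
            # Potential Biological Removal:
            #   n_min : conservative (lower-percentile) abundance estimate
            #   r_max : maximum annual net population growth rate
            #   f_r   : recovery factor in (0, 1]
            return n_min * 0.5 * r_max * f_r

        print(pbr(n_min=10_000, r_max=0.12, f_r=0.5))   # -> 300.0 per year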

  13. Carbon chemistry in dense molecular clouds: Theory and observational constraints

    NASA Technical Reports Server (NTRS)

    Blake, Geoffrey A.

    1990-01-01

    For the most part, gas phase models of the chemistry of dense molecular clouds predict the abundances of simple species rather well. However, for larger molecules, and even for small systems rich in carbon, these models often fail spectacularly. Researchers present a brief review of the basic assumptions and results of large scale modeling of the carbon chemistry in dense molecular clouds. Particular attention is given to the influence of the gas phase C/O ratio in molecular clouds, and the likely role grains play in maintaining this ratio as clouds evolve from initially diffuse objects to denser cores with associated stellar and planetary formation. Recent spectral line surveys at centimeter and millimeter wavelengths, along with selected observations in the submillimeter, have now produced an accurate inventory of the gas phase carbon budget in several different types of molecular clouds, though gaps in our knowledge clearly remain. The constraints these observations place on theoretical models of interstellar chemistry can be used to gain insights into why the models fail, and also to show which neglected processes must be included in more complete analyses. Looking toward the future, larger molecules are especially difficult to study both experimentally and theoretically in such dense, cold regions, and some new methods are therefore outlined which may ultimately push the detectability of small carbon chains and rings to much heavier species.

  14. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
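
    The two readings of the problem are easy to make concrete by enumeration. A minimal sketch (Python); the point, as in the article, is that the answer is fixed only once the sampling scenario is:

        from itertools import product

        families = list(product("BG", repeat=2))     # (elder, younger)
        at_least_one_boy = [f for f in families if "B" in f]
        elder_is_boy = [f for f in families if f[0] == "B"]

        both_boys = lambda fs: sum(f == ("B", "B") for f in fs) / len(fs)
        print(both_boys(at_least_one_boy))   # 1/3 under "at least one boy"
        print(both_boys(elder_is_boy))       # 1/2 under "the elder is a boy"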

  15. Motion and Stability of Saturated Soil Systems under Dynamic Loading.

    DTIC Science & Technology

    1985-04-04

    ...theoretical/computational models. The continuing research effort will extend and refine the theoretical models, allow for compressibility of soil as...motion of soil and water and, therefore, a correct theory of liquefaction should not include this assumption. Finite element methodologies have been

  16. Implementing Geographical Key Concepts: Design of a Symbiotic Teacher Training Course Based on Empirical and Theoretical Evidence

    ERIC Educational Resources Information Center

    Fögele, Janis; Mehren, Rainer

    2015-01-01

    A central desideratum for the professionalization of qualified teachers is an improved practice of further teacher education. The present work constitutes a course of in-service training, which is built upon both a review of empirical findings concerning the efficacy of in-service training courses for teachers and theoretical assumptions about the…

  17. Critical assessment of inverse gas chromatography as means of assessing surface free energy and acid-base interaction of pharmaceutical powders.

    PubMed

    Telko, Martin J; Hickey, Anthony J

    2007-10-01

    Inverse gas chromatography (IGC) has been employed as a research tool for decades. Despite this record of use and proven utility in a variety of applications, the technique is not routinely used in pharmaceutical research. In other fields the technique has flourished. IGC is experimentally relatively straightforward, but analysis requires that certain theoretical assumptions are satisfied. The assumptions made to acquire some of the recently reported data are somewhat modified compared to initial reports. Most publications in the pharmaceutical literature have made use of a simplified equation for the determination of acid/base surface properties, resulting in parameter values that are inconsistent with prior methods. In comparing the surface properties of different batches of alpha-lactose monohydrate, new data have been generated and compared with the literature to allow critical analysis of the theoretical assumptions and their importance to the interpretation of the data. The commonly used (simplified) approach was compared with the more rigorous approach originally outlined in the surface chemistry literature. (c) 2007 Wiley-Liss, Inc.
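
    For context, the quantities involved come from retention measurements: the dispersive surface energy in IGC is commonly obtained from the Schultz relation (a standard form in the IGC literature, stated here as background rather than taken from the paper):

        RT \ln V_N = 2 N_A\, a \sqrt{\gamma_S^{\,d}} \sqrt{\gamma_L^{\,d}} + C ,

    where V_N is the net retention volume of an n-alkane probe, a its cross-sectional area, \gamma_L^{d} its dispersive surface tension, and \gamma_S^{d}, recovered from the slope of the resulting alkane line, the dispersive component of the solid's surface energy. Acid/base parameters are then inferred from how polar probes deviate from that alkane line, which is exactly where the simplified and the more rigorous treatments diverge.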

  18. Critical frontier of the triangular Ising antiferromagnet in a field

    NASA Astrophysics Data System (ADS)

    Qian, Xiaofeng; Wegewijs, Maarten; Blöte, Henk W.

    2004-03-01

    We study the critical line of the triangular Ising antiferromagnet in an external magnetic field by means of a finite-size analysis of results obtained by transfer-matrix and Monte Carlo techniques. We compare the shape of the critical line with predictions of two different theoretical scenarios. Both scenarios, while plausible, involve assumptions. The first scenario is based on the generalization of the model to a vertex model, and the assumption that the exact analytic form of the critical manifold of this vertex model is determined by the zeroes of an O(2) gauge-invariant polynomial in the vertex weights. However, it proves impossible to fit the coefficients of such polynomials, of orders up to 10, so as to reproduce the numerical data for the critical points. The second theoretical prediction is based on the assumption that a renormalization mapping exists of the Ising model on the Coulomb gas, and analysis of the resulting renormalization equations. It leads to a shape of the critical line that is inconsistent with the first prediction, but consistent with the numerical data.

  19. Social factors in space station interiors

    NASA Technical Reports Server (NTRS)

    Cranz, Galen; Eichold, Alice; Hottes, Klaus; Jones, Kevin; Weinstein, Linda

    1987-01-01

    Using the example of the chair, which is often written into space station planning but which serves no non-cultural function in zero gravity, difficulties in overcoming cultural assumptions are discussed. An experimental approach is called for which would allow designers to separate cultural assumptions from logistic, social and psychological necessities. Simulations, systematic doubt and monitored brainstorming are recommended as part of basic research so that the designer will approach the problems of space module design with a complete program.

  20. The Eleventh Quadrennial Review of Military Compensation. Supporting Research Papers

    DTIC Science & Technology

    2012-06-01

    value. 4. BAH + BAS is roughly equal to expenditures for housing and food for servicemembers. In the first phase of the formal model, we further...assume that taxes, housing, and food are the only basic living expenses. Then, in the next phase, we include estimates of noncash benefits not included...assumption 4 with assumption 2 implies that civilian housing and food expenses are also equal to military BAH and BAS. However, civilian housing and food

  1. Costing interventions in primary care.

    PubMed

    Kernick, D

    2000-02-01

    Against a background of increasing demands on limited resources, studies that relate the benefits of health interventions to the resources they consume will be an important part of any decision-making process in primary care, and an accurate assessment of costs will be an important part of any economic evaluation. Although there is no such thing as a gold-standard cost estimate, there are a number of basic costing concepts that underlie any costing study. How costs are derived and combined will depend on the assumptions made in their derivation. It is important to be clear about what assumptions have been made, and why, in order to maintain consistency across comparative studies and prevent inappropriate conclusions being drawn. This paper outlines some costing concepts and principles to enable primary care practitioners and researchers to gain a basic understanding of costing exercises and their pitfalls.

  2. Understanding sport continuation: an integration of the theories of planned behaviour and basic psychological needs.

    PubMed

    Gucciardi, Daniel F; Jackson, Ben

    2015-01-01

    Fostering individuals' long-term participation in activities that promote positive development such as organised sport is an important agenda for research and practice. We integrated the theories of planned behaviour (TPB) and basic psychological needs (BPN) to identify factors associated with young adults' continuation in organised sport over a 12-month period. Prospective study, including an online psycho-social assessment at Time 1 and an assessment of continuation in sport approximately 12 months later. Participants (N=292) aged between 17 and 21 years (M=18.03; SD=1.29) completed an online survey assessing the theories of planned behaviour and basic psychological needs constructs. Bayesian structural equation modelling (BSEM) was employed to test the hypothesised theoretical sequence, using informative priors for structural relations based on empirical and theoretical expectations. The analyses revealed support for the robustness of the hypothesised theoretical model in terms of the pattern of relations as well as the direction and strength of associations among the constructs derived from quantitative summaries of existing research and theoretical expectations. The satisfaction of basic psychological needs was associated with more positive attitudes, higher levels of perceived behavioural control, and more favourable subjective norms; positive attitudes and perceived behavioural control were associated with higher behavioural intentions; and both intentions and perceived behavioural control predicted sport continuation. This study demonstrated the utility of Bayesian structural equation modelling for testing the robustness of an integrated theoretical model, which is informed by empirical evidence from meta-analyses and theoretical expectations, for understanding sport continuation. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  3. More similarities than differences in contemporary theories of social development?: a plea for theory bridging.

    PubMed

    Leaper, Campbell

    2011-01-01

    Many contemporary theories of social development are similar and/or share complementary constructs. Yet, there have been relatively few efforts toward theoretical integration. The present chapter represents a call for increased theory bridging. The problem of theoretical fragmentation in psychology is reviewed. Seven highlighted reasons for this predicament include differences between behavioral sciences and other sciences, theoretical paradigms as social identities, the uniqueness assumption, information overload, field fixation, linguistic fragmentation, and few incentives for theoretical integration. Afterward, the feasibility of theoretical synthesis is considered. Finally, some possible directions are proposed for theoretical integration among five contemporary theories of social and gender development: social cognitive theory, expectancy-value theory, cognitive-developmental theory, gender schema theory, and self-categorization theory.

  4. Interpretation of the results of statistical measurements. [search for basic probability model

    NASA Technical Reports Server (NTRS)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional that defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  5. The Disk Instability Model for SU UMa systems - a Comparison of the Thermal-Tidal Model and Plain Vanilla Model

    NASA Astrophysics Data System (ADS)

    Cannizzo, John K.

    2017-01-01

    We utilize the time-dependent accretion disk model described by Ichikawa & Osaki (1992) to explore two basic ideas for the outbursts in the SU UMa systems: Osaki's Thermal-Tidal Model and the basic accretion disk limit cycle model. We explore a range of possible input parameters and model assumptions to delineate under what conditions each model may be preferred.

  6. A Simple Diagnostic Model of the Circulation Beneath an Ice Shelf

    NASA Astrophysics Data System (ADS)

    Jenkins, Adrian; Nøst, Ole Anders

    2017-04-01

    The ocean circulation beneath ice shelves supplies the heat required to melt ice and exports the resulting freshwater. It therefore plays a key role in determining the mass balance and geometry of the ice shelves and hence the restraint they impose on the outflow of grounded ice from the interior of the ice sheet. Despite this critical role in regulating the ice sheet's contribution to eustatic sea level, an understanding of some of the most basic features of the circulation is lacking. The conventional paradigm is one of a buoyancy-forced overturning circulation, with inflow of warm, salty water along the seabed and outflow of cooled and freshened waters along the ice base. However, most sub-ice-shelf cavities are broad relative to the internal Rossby radius, so a horizontal circulation accompanies the overturning. Primitive equation ocean models applied to idealised geometries produce cyclonic gyres of comparable magnitude, but in the absence of a theoretical understanding of what controls the gyre strength, those solutions can only be validated against each other. Furthermore, we have no understanding of how the gyre circulation should change given more complex geometries. To begin to address this gap in our theoretical understanding, we present a simple, linear, steady-state model for the circulation beneath an ice shelf. Our approach is analogous to Stommel's classic analysis of the wind-driven gyres, but is complicated by the fact that his most basic assumption of homogeneity is inappropriate here. The only forcing on the flow beneath an ice shelf arises from the horizontal density gradients set up by melting. We thus arrive at a diagnostic model which gives us the depth-dependent horizontal circulation that results from an imposed geometry and density distribution. We describe the development of the model and present some preliminary solutions for the simplest cavity geometries.
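
    In such a linear, geostrophic, hydrostatic setting, the depth dependence of the horizontal flow is set by the thermal-wind relation, in which the melt-induced horizontal density gradients determine the vertical shear. A minimal sketch of that standard balance (not reproduced from the paper, which builds a fuller diagnostic model):

    $$\frac{\partial u}{\partial z} = \frac{g}{f\rho_0}\,\frac{\partial \rho}{\partial y}, \qquad \frac{\partial v}{\partial z} = -\frac{g}{f\rho_0}\,\frac{\partial \rho}{\partial x},$$

    where $f$ is the Coriolis parameter, $\rho_0$ a reference density, and $g$ the gravitational acceleration.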

  7. Behavioral health at-risk contracting--a rate development and financial reporting guide.

    PubMed

    Zinser, G R

    1994-01-01

    The process of developing rates for behavioral capitation contracts can seem mysterious and intimidating. The following article explains several key features of the method used to develop capitation rates. These include: (1) a basic understanding of the mechanics of rate calculation; (2) awareness of the variables to be considered and assumptions to be made; (3) a source of information to use as a basis for these assumptions; and (4) a system to collect detailed actual experience data.
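
    As a concrete illustration of point (1), the mechanics of rate calculation: a capitation rate is typically quoted per member per month (PMPM), combining expected utilization with a unit cost. The sketch below is hypothetical; the utilization and cost figures are invented for illustration, not drawn from the article.

```python
# Hypothetical PMPM (per-member-per-month) capitation rate sketch.
# All figures are invented for illustration.

def pmpm_rate(annual_units_per_1000: float, unit_cost: float) -> float:
    """Rate for one service category: expected annual units per member
    times unit cost, spread over 12 months."""
    units_per_member_per_year = annual_units_per_1000 / 1000.0
    return units_per_member_per_year * unit_cost / 12.0

# Example: 90 outpatient visits per 1,000 members per year at $120 per visit.
print(f"PMPM: ${pmpm_rate(90, 120.0):.2f}")  # PMPM: $0.90
```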

  8. An Examination of Brazil and the United States as Potential Partners in a Joint Supersonic Military Fighter Aircraft Codevelopment and Production Program.

    DTIC Science & Technology

    1986-09-01

    Brazilian-American Chamber of Commerce, Mr. Frank J. Devine, Executive Director; Embraer, Empresa Brasileira De Aeronautica, Mr. Salo Roth, Vice President ... Throughout this study the following assumptions have been made. First, it is assumed that the reader has a basic familiarity with aircraft. Therefore ... of the weapons acquisition process. Third, the assumption is made that most readers are familiar with U.S. procedures involving the sale of

  9. The four-principle formulation of common morality is at the core of bioethics mediation method.

    PubMed

    Ahmadi Nasab Emran, Shahram

    2015-08-01

    Bioethics mediation is increasingly used as a method in clinical ethics cases. My goal in this paper is to examine the implicit theoretical assumptions of the bioethics mediation method developed by Dubler and Liebman. According to them, the distinguishing feature of bioethics mediation is that the method is useful in most cases of clinical ethics in which conflict is the main issue, which implies that there is either no real ethical issue or, if there is, it is not the key to finding a resolution. I question the tacit assumption of non-normativity of the mediation method in bioethics by examining the various senses in which bioethics mediation might be non-normative or neutral. The major normative assumption of the mediation method is the existence of common morality. In addition, the four-principle formulation of the theory articulated by Beauchamp and Childress implicitly provides the normative content for the method. Full acknowledgement of the theoretical and normative assumptions of bioethics mediation helps clinical ethicists better understand the nature of their job. In addition, the need for a robust philosophical background, even in what appears to be a purely practical method of mediation, cannot be overemphasized. Acknowledgement of the normative nature of the bioethics mediation method necessitates a more critical attitude of bioethics mediators towards the norms they usually take for granted as valid.

  10. Basic stages in the development of the theory of Ramjet Engines (RJE)

    NASA Technical Reports Server (NTRS)

    Merkulov, I. A.

    1977-01-01

    Basic periods in the history of the development of ramjet engine theory are cited. The periods include the first experimental tests as well as the development of basic ideas and theoretical development of the cosmic ramjet engine.

  11. Epistemological issues in the study of microbial life: alternative terran biospheres?

    PubMed

    Cleland, Carol E

    2007-12-01

    The assumption that all life on Earth today shares the same basic molecular architecture and biochemistry is part of the paradigm of modern biology. This paper argues that there is little theoretical or empirical support for this widely held assumption. Scientists know that life could have been at least modestly different at the molecular level and it is clear that alternative molecular building blocks for life were available on the early Earth. If the emergence of life is, like other natural phenomena, highly probable given the right chemical and physical conditions, then it seems likely that the early Earth hosted multiple origins of life, some of which produced chemical variations on life as we know it. While these points are often conceded, it is nevertheless maintained that any primitive alternatives to familiar life would have been eliminated long ago, either amalgamated into a single form of life through lateral gene transfer (LGT) or alternatively out-competed by our putatively more evolutionarily robust form of life. Besides, the argument continues, if such life forms still existed, we surely would have encountered telling signs of them by now. These arguments do not hold up well under close scrutiny. They reflect a host of assumptions that are grounded in our experience with large multicellular organisms and, most importantly, do not apply to microbial forms of life, which cannot be easily studied without the aid of sophisticated technologies. Significantly, the most powerful molecular biology techniques available, polymerase chain reaction (PCR) amplification of rRNA genes augmented by metagenomic analysis, could not detect such microbes if they existed. Given the profound philosophical and scientific importance that such a discovery would represent, a dedicated search for 'shadow microbes' (heretofore unrecognized 'alien' forms of terran microbial life) seems in order. The best place to start such a search is with puzzling (anomalous) phenomena, such as desert varnish, that resist classification as 'biological' or 'nonbiological'.

  12. What procedure to choose while designing a fuzzy control? Towards mathematical foundations of fuzzy control

    NASA Technical Reports Server (NTRS)

    Kreinovich, Vladik YA.; Quintana, Chris; Lea, Robert

    1991-01-01

    Fuzzy control has been successfully applied in industrial systems. However, there is some caution in using it. The reason is that it is based on quite reasonable ideas, but each of these ideas can be implemented in several different ways, and depending on which implementation is chosen, different results are achieved. Some implementations lead to high-quality control; some do not. And since there are no theoretical methods for choosing the implementation, the basic way to choose one now is experimental. But if one chooses a method that is good for several examples, there is no guarantee that it will work well in all of them. Hence the caution. A theoretical basis for choosing fuzzy control procedures is provided. In order to choose a procedure that transforms fuzzy knowledge into a control, one needs, first, to choose a membership function for each of the fuzzy terms that the experts use; second, to choose operations on uncertainty values that correspond to 'and' and 'or'; and third, when a membership function for the control is obtained, one must defuzzify it, that is, somehow generate the value of the control u that will actually be used. A general approach that helps to make all these choices is described: namely, it is proved that under reasonable assumptions membership functions should be linear or fractionally linear, defuzzification must be described by a centroid rule, and all possible 'and' and 'or' operations are characterized. Thus, a theoretical explanation of the existing semi-heuristic choices is given and a basis for further research on optimal fuzzy control is formulated.
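
    To make the centroid rule concrete: given a membership function mu(u) over candidate controls u, the defuzzified output is the centre of mass of mu. A minimal sketch (illustrative only, not the paper's formulation):

```python
import numpy as np

# Centroid defuzzification sketch: u* = sum(u * mu(u)) / sum(mu(u)),
# evaluated on a uniform grid of candidate control values.
u = np.linspace(0.0, 10.0, 1001)             # candidate control values
mu = np.maximum(0.0, 1.0 - np.abs(u - 6.0))  # triangular membership centred at 6

u_star = np.sum(u * mu) / np.sum(mu)         # centroid rule
print(round(float(u_star), 3))               # ~6.0 for a symmetric membership
```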

  13. Psychological theory and pedagogical effectiveness: the learning promotion potential framework.

    PubMed

    Tomlinson, Peter

    2008-12-01

    After a century of educational psychology, eminent commentators are still lamenting problems besetting the appropriate relating of psychological insights to teaching design, a situation not helped by the persistence of crude assumptions concerning the nature of pedagogical effectiveness. To propose an analytical or meta-theoretical framework based on the concept of learning promotion potential (LPP) as a basis for understanding the basic relationship between psychological insights and teaching strategies, and to draw out implications for psychology-based pedagogical design, development and research. This is a theoretical and meta-theoretical paper relying mainly on conceptual analysis, though also calling on psychological theory and research. Since teaching consists essentially in activity designed to promote learning, it follows that a teaching strategy has the potential in principle to achieve particular kinds of learning gains (LPP) to the extent that it embodies or stimulates the relevant learning processes on the part of learners and enables the teacher's functions of on-line monitoring and assistance for such learning processes. Whether a teaching strategy actually does realize its LPP by way of achieving its intended learning goals depends also on the quality of its implementation, in conjunction with other factors in the situated interaction that teaching always involves. The core role of psychology is to provide well-grounded indication of the nature of such learning processes and the teaching functions that support them, rather than to directly generate particular ways of teaching. A critically eclectic stance towards potential sources of psychological insight is argued for. Applying this framework, the paper proposes five kinds of issue to be attended to in the design and evaluation of psychology-based pedagogy. Other work proposing comparable ideas is briefly reviewed, with particular attention to similarities and a key difference with the ideas of Oser and Baeriswyl (2001).

  14. Shifts in rotifer life history in response to stable isotope enrichment: testing theories of isotope effects on organismal growth

    PubMed Central

    2017-01-01

    In ecology, stable isotope labelling is commonly used for tracing material transfer in trophic interactions, nutrient budgets and biogeochemical processes. The main assumption in this approach is that enrichment with a heavy isotope has no effect on organismal growth and metabolism. This assumption is, however, challenged by theoretical considerations and experimental studies on kinetic isotope effects in vivo. Here, I demonstrate profound changes in life histories of the rotifer Brachionus plicatilis fed 15N-enriched algae (0.4–5.0 at%), i.e. at the enrichment levels commonly used in ecological studies. These findings support theoretically predicted effects of heavy isotope enrichment on growth, metabolism and ageing in biological systems and underline the importance of accounting for such effects when using stable isotope labelling in experimental studies. PMID:28405367

  15. Personality psychology: lexical approaches, assessment methods, and trait concepts reveal only half of the story--why it is time for a paradigm shift.

    PubMed

    Uher, Jana

    2013-03-01

    This article develops a comprehensive philosophy-of-science for personality psychology that goes far beyond the scope of the lexical approaches, assessment methods, and trait concepts that currently prevail. One of the field's most important guiding scientific assumptions, the lexical hypothesis, is analysed from meta-theoretical viewpoints to reveal that it explicitly describes two sets of phenomena that must be clearly differentiated: 1) lexical repertoires and the representations that they encode and 2) the kinds of phenomena that are represented. Thus far, personality psychologists largely explored only the former, but have seriously neglected studying the latter. Meta-theoretical analyses of these different kinds of phenomena and their distinct natures, commonalities, differences, and interrelations reveal that personality psychology's focus on lexical approaches, assessment methods, and trait concepts entails a) erroneous meta-theoretical assumptions about what the phenomena being studied actually are, and thus how they can be analysed and interpreted, b) that contemporary personality psychology is largely based on everyday psychological knowledge, and c) a fundamental circularity in the scientific explanations used in trait psychology. These findings seriously challenge the widespread assumptions about the causal and universal status of the phenomena described by prominent personality models. The current state of knowledge about the lexical hypothesis is reviewed, and implications for personality psychology are discussed. Ten desiderata for future research are outlined to overcome the current paradigmatic fixations that are substantially hampering intellectual innovation and progress in the field.

  16. Is the Surface Potential Integral of a Dipole in a Volume Conductor Always Zero? A Cloud Over the Average Reference of EEG and ERP.

    PubMed

    Yao, Dezhong

    2017-03-01

    Currently, average reference is one of the most widely adopted references in EEG and ERP studies. The theoretical assumption is that the surface potential integral over a volume conductor is zero; thus the average of scalp potential recordings might approximate the theoretically desired zero reference. However, such a zero-integral assumption has been proved only for a spherical surface. In this short communication, three counter-examples are given to show that the potential integral over the surface of a volume conductor containing a dipole may not be zero. It depends on the shape of the conductor and the orientation of the dipole. This fact on the one hand means that average reference is not a theoretical 'gold standard' reference, and on the other hand reminds us that the practical accuracy of average reference is determined not only by the well-known electrode array density and its coverage but also intrinsically by the head shape. Reference selection thus remains a fundamental problem to be fixed in various EEG and ERP studies.
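
    In the usual notation (a sketch of the standard argument, not taken from the paper), the average reference re-expresses each channel $v_i$ against the mean of all $N$ channels:

    $$\tilde v_i \;=\; v_i \;-\; \frac{1}{N}\sum_{j=1}^{N} v_j,$$

    which approximates a zero-referenced potential only insofar as $\oint_S V\,\mathrm{d}S = 0$ over the scalp surface $S$, the very assumption the counter-examples challenge.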

  17. Mourning and psychosis: a psychoanalytic perspective.

    PubMed

    Tizón, Jorge L

    2010-12-01

    The author attempts to develop some of the basic models and concepts relating to mourning processes in psychotic patients on the assumption that situations of loss and mourning are key moments for psychoanalysis, psychotherapy, and therapeutic approaches in general. Secondly, he reminds us that 'mourning processes in psychotics' are not always 'psychotic mourning processes', that is to say, that they do not necessarily occur within, or give rise to, a psychotic clinical picture. These ideas are illustrated by a number of sessions and vignettes concerning two psychotic patients in psychotherapeutic and psychoanalytic treatment. In theoretical terms, it seems vitally important in this context to combine a relationship-based approach within a framework of special psychoanalytic psychopathology with an updated view of processes of mourning and affective loss. A fundamental requirement at clinical level is to determine the role to be played by psychoanalytically based treatments in combined, integrated or global therapies when working with psychotic patients. For this purpose, the paper ends by outlining a set of principles and objectives for such treatments. Copyright © 2010 Institute of Psychoanalysis.

  18. A Memory Based Model of Posttraumatic Stress Disorder: Evaluating Basic Assumptions Underlying the PTSD Diagnosis

    PubMed Central

    Rubin, David C.; Berntsen, Dorthe; Johansen, Malene Klindt

    2009-01-01

    In the mnemonic model of PTSD, the current memory of a negative event, not the event itself, determines symptoms. The model is an alternative to the current event-based etiology of PTSD represented in the DSM. The model accounts for important and reliable findings that are often inconsistent with the current diagnostic view and that have been neglected by theoretical accounts of the disorder, including the following observations. The diagnosis needs objective information about the trauma and peritraumatic emotions, but uses retrospective memory reports that can have substantial biases. Negative events and emotions that do not satisfy the current diagnostic criteria for a trauma can be followed by symptoms that would otherwise qualify for PTSD. Predisposing factors that affect the current memory have large effects on symptoms. The inability-to-recall-an-important-aspect-of-the-trauma symptom does not correlate with other symptoms. Loss or enhancement of the trauma memory affects PTSD symptoms in predictable ways. Special mechanisms that apply only to traumatic memories are not needed, increasing parsimony and the knowledge that can be applied to understanding PTSD. PMID:18954211

  19. The assumed relation between occupation and inequality in health.

    PubMed

    Madsen, Jacob; Kanstrup, Anne Marie; Josephsson, Staffan

    2016-01-01

    Occupational science and therapy scholars have argued that research on inequality in health is needed. At the same time, a knowledge gap exists in occupational science and therapy about how to understand and take action on health inequalities. To identify how inequality in health, high-risk areas of health, and engagement in health for low-income adult citizens have been described and conceptualized in contemporary occupational science and therapy literature. A structured literature review of 37 publications in occupational science and therapy literature, published from 2004 to 2014. The review revealed several descriptions and conceptualizations based on environmental, social, cultural, historical, and personal perspectives on occupation and on existing occupational science concepts. However, these descriptions were mainly based on assumptions regarding the relation between occupation and inequality in health, and on statements about the need to explore this relation. Basic theory and reasoning, as well as empirical studies, on inequality in health are missing in occupational science and therapy. Based on the findings and theoretical trends, the authors suggest a transactional perspective on occupation as a possible frame for understanding inequality in health and related issues.

  20. Modelling ADHD: A review of ADHD theories through their predictions for computational models of decision-making and reinforcement learning.

    PubMed

    Ziegler, Sigurd; Pedersen, Mads L; Mowinckel, Athanasia M; Biele, Guido

    2016-12-01

    Attention deficit hyperactivity disorder (ADHD) is characterized by altered decision-making (DM) and reinforcement learning (RL), for which competing theories propose alternative explanations. Computational modelling contributes to understanding DM and RL by integrating behavioural and neurobiological findings, and could elucidate pathogenic mechanisms behind ADHD. This review of neurobiological theories of ADHD describes predictions for the effect of ADHD on DM and RL as described by the drift-diffusion model of DM (DDM) and a basic RL model. Empirical studies employing these models are also reviewed. While theories often agree on how ADHD should be reflected in model parameters, each theory implies a unique combination of predictions. Empirical studies agree with the theories' assumptions of a lowered DDM drift rate in ADHD, while findings are less conclusive for boundary separation. The few studies employing RL models support a lower choice sensitivity in ADHD, but not an altered learning rate. The discussion outlines research areas for further theoretical refinement in the ADHD field. Copyright © 2016 Elsevier Ltd. All rights reserved.
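
    The 'basic RL model' referred to here is conventionally a delta-rule value update with a learning rate, paired with a softmax choice rule whose inverse temperature captures choice sensitivity. A minimal sketch under that assumption (illustrative; not code from the review, and the parameter names alpha/beta are the conventional ones):

```python
import numpy as np

# Delta-rule learning with softmax choice on a two-armed bandit (illustrative).
rng = np.random.default_rng(0)
alpha, beta = 0.3, 2.0          # learning rate, choice sensitivity
q = np.zeros(2)                 # value estimates for two options
p_reward = np.array([0.8, 0.2]) # hidden reward probabilities

for trial in range(200):
    p_choice = np.exp(beta * q) / np.sum(np.exp(beta * q))  # softmax rule
    choice = rng.choice(2, p=p_choice)
    reward = float(rng.random() < p_reward[choice])
    q[choice] += alpha * (reward - q[choice])               # delta rule

print(np.round(q, 2))  # q[0] should approach ~0.8
```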

  1. Forecasting Social Unrest Using Activity Cascades

    PubMed Central

    Cadena, Jose; Korkmaz, Gizem; Kuhlman, Chris J.; Marathe, Achla; Ramakrishnan, Naren; Vullikanti, Anil

    2015-01-01

    Social unrest is endemic in many societies, and recent news has drawn attention to happenings in Latin America, the Middle East, and Eastern Europe. Civilian populations mobilize, sometimes spontaneously and sometimes in an organized manner, to raise awareness of key issues or to demand changes in governing or other organizational structures. It is of key interest to social scientists and policy makers to forecast civil unrest using indicators observed on media such as Twitter, news, and blogs. We present an event forecasting model using a notion of activity cascades in Twitter (proposed by Gonzalez-Bailon et al., 2011) to predict the occurrence of protests in three countries of Latin America: Brazil, Mexico, and Venezuela. The basic assumption is that the emergence of a suitably detected activity cascade is a precursor or a surrogate to a real protest event that will happen “on the ground.” Our model supports the theoretical characterization of large cascades using spectral properties and uses properties of detected cascades to forecast events. Experimental results on many datasets, including the recent June 2013 protests in Brazil, demonstrate the effectiveness of our approach. PMID:26091012

  2. On the Shallow Processing (Dis)Advantage: Grammar and Economy

    PubMed Central

    Koornneef, Arnout; Reuland, Eric

    2016-01-01

    In the psycholinguistic literature it has been proposed that readers and listeners often adopt a “good-enough” processing strategy in which a “shallow” representation of an utterance driven by (top-down) extra-grammatical processes has a processing advantage over a “deep” (bottom-up) grammatically-driven representation of that same utterance. In the current contribution we claim, both on theoretical and experimental grounds, that this proposal is overly simplistic. Most importantly, in the domain of anaphora there is now an accumulating body of evidence showing that the anaphoric dependencies between (reflexive) pronominals and their antecedents are subject to an economy hierarchy. In this economy hierarchy, deriving anaphoric dependencies by deep—grammatical—operations requires less processing costs than doing so by shallow—extra-grammatical—operations. In addition, in case of ambiguity when both a shallow and a deep derivation are available to the parser, the latter is actually preferred. This, we argue, contradicts the basic assumptions of the shallow–deep dichotomy and, hence, a rethinking of the good-enough processing framework is warranted. PMID:26903897

  3. Design and Analysis of an Electromagnetic Thrust Bearing

    NASA Technical Reports Server (NTRS)

    Banerjee, Bibhuti; Rao, Dantam K.

    1996-01-01

    A double-acting electromagnetic thrust bearing is normally used to counter the axial loads in many rotating machines that employ magnetic bearings. It essentially consists of an actuator and drive electronics. Existing thrust bearing design programs are based on several assumptions. These assumptions, however, are often violated in practice. For example, no distinction is made between maximum external loads and maximum bearing forces, which are assumed to be identical. Furthermore, it is assumed that the maximum flux density in the air gap occurs at the nominal gap position of the thrust runner. The purpose of this paper is to present a clear theoretical basis for the design of the electromagnetic thrust bearing which obviates such assumptions.
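
    For context, a textbook estimate (not reproduced from the paper) of the attractive force developed across the air gap by one electromagnet face relates the gap flux density $B$, the pole face area $A_g$, and the permeability of free space $\mu_0$:

    $$F \;\approx\; \frac{B^2 A_g}{2\mu_0},$$

    which is why the gap position at which the maximum flux density actually occurs matters when sizing the bearing against the maximum external load.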

  4. The Central Registry for Child Abuse Cases: Rethinking Basic Assumptions

    ERIC Educational Resources Information Center

    Whiting, Leila

    1977-01-01

    Class data pools on abused and neglected children and their families are found desirable for program planning, but identification by name is of questionable value and possibly a dangerous invasion of civil liberties. (MS)

  5. Self-transcendent positive emotions increase spirituality through basic world assumptions.

    PubMed

    Van Cappellen, Patty; Saroglou, Vassilis; Iweins, Caroline; Piovesana, Maria; Fredrickson, Barbara L

    2013-01-01

    Spirituality has mostly been studied in psychology as implied in the process of overcoming adversity, being triggered by negative experiences, and providing positive outcomes. By reversing this pathway, we investigated whether spirituality may also be triggered by self-transcendent positive emotions, which are elicited by stimuli appraised as demonstrating higher good and beauty. In two studies, elevation and/or admiration were induced using different methods. These emotions were compared to two control groups, a neutral state and a positive emotion (mirth). Self-transcendent positive emotions increased participants' spirituality (Studies 1 and 2), especially for the non-religious participants (Study 1). Two basic world assumptions, i.e., belief in life as meaningful (Study 1) and in the benevolence of others and the world (Study 2) mediated the effect of these emotions on spirituality. Spirituality should be understood not only as a coping strategy, but also as an upward spiralling pathway to and from self-transcendent positive emotions.

  6. Exploring super-Gaussianity toward robust information-theoretical time delay estimation.

    PubMed

    Petsatodis, Theodoros; Talantzis, Fotios; Boukis, Christos; Tan, Zheng-Hua; Prasad, Ramjee

    2013-03-01

    Time delay estimation (TDE) is a fundamental component of speaker localization and tracking algorithms. Most existing systems are based on the generalized cross-correlation method, assuming Gaussianity of the source. It has been shown that the distribution of speech captured with far-field microphones is highly varying, depending on the noise and reverberation conditions. Thus the performance of TDE is expected to fluctuate depending on the underlying assumption for the speech distribution, being also subject to multi-path reflections and competing background noise. This paper investigates the effect on TDE of modeling the source signal with different speech-based distributions. An information-theoretic TDE method indirectly encapsulating higher-order statistics (HOS) formed the basis of this work. The underlying assumption of a Gaussian-distributed source has been replaced by that of a generalized Gaussian distribution, which allows evaluating the problem under a larger set of speech-shaped distributions, ranging from Gaussian to Laplacian and Gamma. Closed forms of the univariate and multivariate entropy expressions of the generalized Gaussian distribution are derived to evaluate the TDE. The results indicate that TDE based on the specific criterion is independent of the underlying assumption for the distribution of the source, for the same covariance matrix.
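
    For orientation, the generalized cross-correlation baseline mentioned above picks the lag that maximizes the cross-correlation between two microphone signals. A minimal sketch with plain (unweighted) cross-correlation on synthetic signals; the paper's entropy-based criterion is not implemented here:

```python
import numpy as np

# Time-delay estimation via plain cross-correlation (illustrative sketch).
rng = np.random.default_rng(1)
fs = 16000                         # sample rate (Hz), assumed
true_delay = 25                    # delay of mic 2 relative to mic 1, in samples
s = rng.standard_normal(4096)      # surrogate source signal
x1 = s + 0.05 * rng.standard_normal(s.size)
x2 = np.roll(s, true_delay) + 0.05 * rng.standard_normal(s.size)

corr = np.correlate(x2, x1, mode="full")
lag = np.argmax(corr) - (s.size - 1)  # zero lag sits at index N-1
print(lag, lag / fs)                  # ~25 samples, ~1.6 ms
```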

  7. NMR studies of excluded volume interactions in peptide dendrimers.

    PubMed

    Sheveleva, Nadezhda N; Markelov, Denis A; Vovk, Mikhail A; Mikhailova, Maria E; Tarasenko, Irina I; Neelov, Igor M; Lähderanta, Erkki

    2018-06-11

    Peptide dendrimers are good candidates for diverse biomedical applications due to their biocompatibility and low toxicity. The local orientational mobility of groups with different radial localization inside dendrimers is an important characteristic for drug and gene delivery, synthesis of nanoparticles, and other specific purposes. In this paper we focus on the validation of two theoretical assumptions for dendrimers: (i) independence of NMR relaxation from excluded volume effects and (ii) similarity of the mobilities of side and terminal segments of dendrimers. For this purpose we study the 1H NMR spin-lattice relaxation time, T1H, of two similar peptide dendrimers of the second generation, with and without side fragments in their inner segments. Temperature dependences of 1/T1H in the range from 283 to 343 K were measured for inner and terminal groups of the dendrimers dissolved in deuterated water. We have shown that the 1/T1H temperature dependences of the inner groups for both dendrimers (with and without side fragments) practically coincide despite the different densities of atoms inside these dendrimers. This result confirms the first theoretical assumption. The second assumption is confirmed by the 1/T1H temperature dependences of the terminal groups, which are similar for both dendrimers.

  8. Systematicity and a Categorical Theory of Cognitive Architecture: Universal Construction in Context

    PubMed Central

    Phillips, Steven; Wilson, William H.

    2016-01-01

    Why does the capacity to think certain thoughts imply the capacity to think certain other, structurally related, thoughts? Despite decades of intensive debate, cognitive scientists have yet to reach a consensus on an explanation for this property of cognitive architecture—the basic processes and modes of composition that together afford cognitive capacity—called systematicity. Systematicity is generally considered to involve a capacity to represent/process common structural relations among the equivalently cognizable entities. However, the predominant theoretical approaches to the systematicity problem, i.e., classical (symbolic) and connectionist (subsymbolic), require arbitrary (ad hoc) assumptions to derive systematicity. That is, their core principles and assumptions do not provide the necessary and sufficient conditions from which systematicity follows, as required of a causal theory. Hence, these approaches fail to fully explain why systematicity is a (near) universal property of human cognition, albeit in restricted contexts. We review an alternative, category theory approach to the systematicity problem. As a mathematical theory of structure, category theory provides necessary and sufficient conditions for systematicity in the form of universal construction: each systematically related cognitive capacity is composed of a common component and a unique component. Moreover, every universal construction can be viewed as the optimal construction in the given context (category). From this view, universal constructions are derived from learning, as an optimization. The ultimate challenge, then, is to explain the determination of context. If context is a category, then a natural extension toward addressing this question is higher-order category theory, where categories themselves are the objects of construction. PMID:27524975

  9. A strategy for detecting the conservation of folding-nucleus residues in protein superfamilies.

    PubMed

    Michnick, S W; Shakhnovich, E

    1998-01-01

    Nucleation-growth theory predicts that fast-folding peptide sequences fold to their native structure via structures in a transition-state ensemble that share a small number of native contacts (the folding nucleus). Experimental and theoretical studies of proteins suggest that residues participating in folding nuclei are conserved among homologs. We attempted to determine if this is true in proteins with highly diverged sequences but identical folds (superfamilies). We describe a strategy based on comparisons of residue conservation in natural superfamily sequences with simulated sequences (generated with a Monte-Carlo sequence design strategy) for the same proteins. The basic assumptions of the strategy were that natural sequences will conserve residues needed for folding and stability plus function, the simulated sequences contain no functional conservation, and nucleus residues make native contacts with each other. Based on these assumptions, we identified seven potential nucleus residues in ubiquitin superfamily members. Non-nucleus conserved residues were also identified; these are proposed to be involved in stabilizing native interactions. We found that all superfamily members conserved the same potential nucleus residue positions, except those for which the structural topology is significantly different. Our results suggest that the conservation of the nucleus of a specific fold can be predicted by comparing designed simulated sequences with natural highly diverged sequences that fold to the same structure. We suggest that such a strategy could be used to help plan protein folding and design experiments, to identify new superfamily members, and to subdivide superfamilies further into classes having a similar folding mechanism.
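
    The comparison at the heart of the strategy, contrasting per-position conservation in natural versus designed sequences, can be sketched with column entropies over two alignments. A toy illustration (not the authors' code; the sequences are invented):

```python
import math
from collections import Counter

def column_entropies(alignment):
    """Shannon entropy (bits) of each column in a list of equal-length sequences."""
    entropies = []
    for i in range(len(alignment[0])):
        counts = Counter(seq[i] for seq in alignment)
        total = sum(counts.values())
        entropies.append(-sum((c / total) * math.log2(c / total)
                              for c in counts.values()))
    return entropies

natural = ["MKVI", "MKVL", "MRVI"]    # hypothetical natural homologs
designed = ["MAVI", "MKLL", "MRVV"]   # hypothetical designed (simulated) sequences

# Low entropy in BOTH sets suggests folding/stability constraints (nucleus
# candidates among native-contact positions); low entropy ONLY in the natural
# set suggests functional constraints absent from the designed sequences.
for i, (hn, hd) in enumerate(zip(column_entropies(natural),
                                 column_entropies(designed))):
    print(i, round(hn, 2), round(hd, 2))
```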

  10. Sliding friction between polymer surfaces: A molecular interpretation

    NASA Astrophysics Data System (ADS)

    Allegra, Giuseppe; Raos, Guido

    2006-04-01

    For two contacting rigid bodies, the friction force F is proportional to the normal load and independent of the macroscopic contact area and relative velocity V (Amontons' law). With two mutually sliding polymer samples, the surface irregularities transmit deformation to the underlying material. Energy loss along the deformation cycles is responsible for the friction force, which now appears to depend strongly on V [see, e.g., N. Maeda et al., Science 297, 379 (2002)]. We base our theoretical interpretation on the assumption that polymer chains are mainly subjected to oscillatory "reptation" along their "tubes." At high deformation frequencies—i.e., with a large sliding velocity V—the internal viscosity due to the rotational energy barriers around chain bonds hinders intramolecular mobility. As a result, energy dissipation and the correlated friction force strongly diminish at large V. Derived from a linear differential equation for chain dynamics, our results are basically consistent with the experimental data of Maeda et al. [Science 297, 379 (2002)] on modified polystyrene. Although the bulk polymer is below Tg, we regard the first few chain layers below the surface to be in the liquid state. In particular, the observed maximum of F vs. V is consistent with physically reasonable values of the molecular parameters. As a general result, the ratio F/V is a steadily decreasing function of V, tending to V^-2 for large velocities. We evaluate a much smaller friction for a cross-linked polymer under the assumption that the junctions are effectively immobile, also in agreement with the experimental results of Maeda et al. [Science 297, 379 (2002)].

  11. How Does The Climate Change?

    NASA Astrophysics Data System (ADS)

    Jones, R. N.

    2011-12-01

    In 1997, maximum temperature in SE Australia shifted up by 0.8°C at p(H0)<0.01. Rainfall decreased by 13% in 1997-2010 compared to 1900-1996. Statistically significant shifts also occur in impact indicators: baumé levels in winegrapes shift >21 days earlier from 1998, streamflow records decrease by 30-70% from 1997, and the annual mean forest fire danger index increased by 38% from 1997. Despite catastrophic fires killing 178 people in early 2009, the public remains unaware of this large change in their exposure. When regional temperature was separated into internally and externally forced components, the latter component was found to warm in two steps, in 1968-73 and 1997. These dates coincide with shifts in zonal mean temperature (24-44S; Figure 1). Climate model output shows similar step-and-trend behavior. Tests run on zonal, hemispheric, and global mean temperature observations found shifts in all regions. 1997 marks a shift in global temperature of 0.3°C at p(H0)<0.01. Similar shifts occur in long-term tide gauge records around the globe (e.g., Figure 2) and in ocean heat content. The prevailing paradigm for how climate variables change is a signal-plus-noise construct, combining a smooth signal with variations caused by internal climate variability. There seems to be no sound theoretical basis for this assumption. On the contrary, complex system behavior would suggest non-linear responses to externally forced change, especially at the regional scale. Some of our most basic assumptions about how climate changes may need to be re-examined.
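
    A minimal sketch of the kind of step-change test implied by the quoted significance levels, comparing means before and after a candidate breakpoint with a two-sample t-test on synthetic data (illustrative only; the abstract does not specify the authors' actual test):

```python
import numpy as np
from scipy import stats

# Illustrative step-change check at a known candidate breakpoint (e.g. 1997).
rng = np.random.default_rng(2)
years = np.arange(1960, 2011)
temp = 0.3 * (years >= 1997) + 0.15 * rng.standard_normal(years.size)  # synthetic

before, after = temp[years < 1997], temp[years >= 1997]
t, p = stats.ttest_ind(after, before, equal_var=False)  # Welch's t-test
print(f"shift = {after.mean() - before.mean():+.2f} K, p(H0) = {p:.3g}")
```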

  12. Interpretative potential of dental metrics for biodistance analysis in hunter-gatherers from central Argentina. A theoretical-methodological approach.

    PubMed

    Luna, L H

    2015-10-01

    The use of dental metrics as a reliable tool for the assessment of biological distances has diversified dramatically in the last decades. In this paper some of the basic assumptions on this issue and the potential of cervical measurements in biodistance protocols are discussed. A sample of 1173 permanent teeth from 57 male and female individuals, recovered in Chenque I site (western Pampas, central Argentina), a Late Holocene hunter-gatherer cemetery, is examined in order to test the impact of exogenous factors that may have influenced the phenotypic manifestation and affected dental crown sizes. The statistical association between dental metric data, obtained by measuring the mesiodistal and buccolingual diameters of the crown and cervix, and the quantification of hypoplastic defects as a measure to evaluate the influence of the environment in the dental phenotypic expression is evaluated. The results show that socioenvironmental stress did not affect dental metrics and that only the more stable teeth (first incisors, canines, first premolars and first molars) and three variables (buccolingual diameter of the crown and both mesiodistal and buccolingual measurements of the cervix) should be included in multivariate analyses. These suggestions must be strengthened with additional studies of other regional samples to identify factors of variation among populations, so as to develop general guidelines for dental survey and biodistance analysis, but they are a first step for discussing assumptions usually used and maximizing the available information for low-density hunter-gatherer societies. Copyright © 2015 Elsevier GmbH. All rights reserved.

  13. Velocity Measurement by Scattering from Index of Refraction Fluctuations Induced in Turbulent Flows

    NASA Technical Reports Server (NTRS)

    Lading, Lars; Saffman, Mark; Edwards, Robert

    1996-01-01

    Induced phase screen scattering is defined as light scattered from weak index-of-refraction fluctuations induced by turbulence. The basic assumptions and requirements for induced phase screen scattering, including scale requirements, are presented.

  14. Undergraduate Cross Registration.

    ERIC Educational Resources Information Center

    Grupe, Fritz H.

    This report discusses various aspects of undergraduate cross-registration procedures, including the dimensions, values, roles and functions, basic assumptions, and facilitating and encouragment of cross-registration. Dimensions of cross-registration encompass financial exchange, eligibility, program limitations, type of grade and credit; extent of…

  15. The Peace Movement: An Exercise in Micro-Macro Linkages.

    ERIC Educational Resources Information Center

    Galtung, Johan

    1988-01-01

    Contends that the basic assumption of the peace movement is the abuse of military power by the state. Argues that the peace movement is most effective through linkages with cultural, political, and economic forces in society. (BSR)

  16. Graduate Education in Psychology: A Comment on Rogers' Passionate Statement

    ERIC Educational Resources Information Center

    Brown, Robert C., Jr.; Tedeschi, James T.

    1972-01-01

    The authors hope that this critical evaluation can place Carl Rogers' assumptions in perspective; they propose a compromise program meant to satisfy the basic aims of a humanistic psychology program. For Rogers' rejoinder, see AA 512 869. (MB)

  17. Waveform model for an eccentric binary black hole based on the effective-one-body-numerical-relativity formalism

    NASA Astrophysics Data System (ADS)

    Cao, Zhoujian; Han, Wen-Biao

    2017-08-01

    Binary black hole systems are among the most important sources for gravitational wave detection. They are also good objects for theoretical research in general relativity. A gravitational waveform template is important for data analysis. The effective-one-body-numerical-relativity (EOBNR) model has played an essential role in the LIGO data analysis. For future space-based gravitational wave detection, many binary systems will retain some orbital eccentricity. At the same time, the eccentric binary is also an interesting topic for theoretical study in general relativity. In this paper, we construct the first eccentric binary waveform model based on the effective-one-body-numerical-relativity framework. Our basic assumption in the model construction is that the eccentricity involved is small. We have compared our eccentric EOBNR model to the circular one used in the LIGO data analysis. We have also tested our eccentric EOBNR model against another recently proposed eccentric binary waveform model, against numerical relativity simulation results, and against perturbation approximation results for extreme-mass-ratio binary systems. Compared to numerical relativity simulations with an eccentricity as large as about 0.2, the overlap factor for our eccentric EOBNR model is better than 0.98 for all tested cases, including spinless and spinning binaries and equal-mass and unequal-mass binaries. We hope our eccentric model can be the starting point for developing a faithful template for future space-based gravitational wave detectors.
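
    The overlap factor quoted above is, in the standard convention (not restated in the abstract), the noise-weighted inner product between two waveforms, normalized and maximized over relative time and phase shifts:

    $$\mathcal{O}(h_1, h_2) \;=\; \max_{t_0,\,\phi_0}\, \frac{\langle h_1 \mid h_2 \rangle}{\sqrt{\langle h_1 \mid h_1 \rangle\,\langle h_2 \mid h_2 \rangle}}, \qquad \langle a \mid b \rangle \;=\; 4\,\mathrm{Re}\!\int_0^\infty \frac{\tilde a(f)\,\tilde b^{*}(f)}{S_n(f)}\,\mathrm{d}f,$$

    where $S_n(f)$ is the one-sided noise power spectral density of the detector.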

  18. Anastasia Might Still Be Alive, But the Monarchy Is Dead.

    ERIC Educational Resources Information Center

    Eisner, Elliot W.

    1983-01-01

    Criticizes the previous article on positivism in educational thought by Denis Phillips. Takes issue with Phillips' assumption that, at the base of theoretical disputes and inquiry, there exists a final and absolute truth. (GC)

  1. The zoom lens of attention: Simulating shuffled versus normal text reading using the SWIFT model

    PubMed Central

    Schad, Daniel J.; Engbert, Ralf

    2012-01-01

    Assumptions on the allocation of attention during reading are crucial for theoretical models of eye guidance. The zoom lens model of attention postulates that attentional deployment can vary from a sharp focus to a broad window. The model is closely related to the foveal load hypothesis, i.e., the assumption that the perceptual span is modulated by the difficulty of the fixated word. However, these important theoretical concepts for cognitive research have not been tested quantitatively in eye movement models. Here we show that the zoom lens model, implemented in the SWIFT model of saccade generation, captures many important patterns of eye movements. We compared the model's performance to experimental data from normal and shuffled text reading. Our results demonstrate that the zoom lens of attention might be an important concept for eye movement control in reading. PMID:22754295

  2. Coherent structures in turbulence and Prandtl's mixing length theory (27th Ludwig Prandtl Memorial Lecture)

    NASA Astrophysics Data System (ADS)

    Landahl, M. T.

    1984-08-01

    The fundamental ideas behind Prandtl's famous mixing length theory are discussed in the light of newer findings from experimental and theoretical research on coherent turbulence structures in the region near solid walls. A simple theoretical model for 'flat' structures is used to examine the fundamental assumptions behind Prandtl's theory. The model is validated by comparisons with conditionally sampled velocity data obtained in recent channel flow experiments. Particular attention is given to the role of pressure fluctuations on the evolution of flat eddies. The validity of Prandtl's assumption that an element of fluid retains its streamwise momentum as it is moved around by turbulence is confirmed for flat eddies. It is demonstrated that spanwise pressure gradients give rise to a contribution to the vertical displacement of a fluid element which is proportional to the distance from the wall. This contribution is particularly important for eddies that are highly elongated in the streamwise direction.
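
    For reference, the mixing-length closure under discussion models the turbulent shear stress through the mean velocity gradient and a length scale $l$; near a wall, $l = \kappa y$ with von Kármán constant $\kappa \approx 0.41$ (a standard statement of the theory, not a formula quoted from the lecture):

    $$\tau_t \;=\; \rho\, l^2 \left|\frac{\partial \bar{u}}{\partial y}\right| \frac{\partial \bar{u}}{\partial y}.$$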

  3. Family learning research in museums: An emerging disciplinary matrix?

    NASA Astrophysics Data System (ADS)

    Ellenbogen, Kirsten M.; Luke, Jessica J.; Dierking, Lynn D.

    2004-07-01

    Thomas Kuhn's notion of a disciplinary matrix provides a useful framework for investigating the growth of research on family learning in and from museums over the last decade. To track the emergence of this disciplinary matrix we consider three issues. First are shifting theoretical perspectives that result in new shared language, beliefs, values, understandings, and assumptions about what counts as family learning. Second are realigning methodologies, driven by underlying disciplinary assumptions about how research in this arena is best conducted, what questions should be addressed, and criteria for valid and reliable evidence. Third is resituating the focus of our research to make the family central to what we study, reflecting a more holistic understanding of the family as an educational institution within larger learning infrastructure. We discuss research that exemplifies these three issues and demonstrates the ways in which shifting theoretical perspectives, realigning methodologies, and resituating research foci signal the existence of a nascent disciplinary matrix.

  4. SW-846 Test Method 1340: In Vitro Bioaccessibility Assay for Lead in Soil

    EPA Pesticide Factsheets

    Describes assay procedures written on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  5. Theoretical astrophysics in the 19th century (Homage to Radó von Kövesligethy)

    NASA Astrophysics Data System (ADS)

    Balázs, Lajos G.

    The nature of astronomical information is determined mostly by the incoming light. Theoretical astrophysics means basically the theory of light emission and its relation to the physical constitution of the emitting celestial bodies. The necessary physical disciplines include the theory of gravitation, the theory of radiation, thermodynamics, and matter-radiation interaction. The most significant theoretical achievement in the 17th-18th centuries was the axiomatic foundation of mechanics and the law of gravitation. In the context of the nature of light, there were two conceptions: Newton contra Huygens, i.e. particle versus wave phenomenon. Using the theory of gravitation, the first speculations appeared on black holes (Michell, Laplace), cosmogony (Kant-Laplace theory), the structure of the Milky Way (Kant), and the explanation of the motion of celestial bodies. The Olbers Paradox, formulated in the 19th century, is still one of the most significant constraints on observational cosmology. The development of thermodynamics, of the theory of matter-radiation interaction, and of electromagnetism provided further important milestones. Maxwell's theory was the classical framework of the interaction between matter and radiation. Kirchhoff and Bunsen's revolutionary discovery of spectral analysis (1859) showed that observation of spectra makes it possible to study the chemical composition of emitting bodies. Thermodynamics predicted the existence of black body radiation but did not succeed in determining the functional form of the wavelength dependence. A combination of the thermodynamic equation of state with the equation of hydrostatics resulted in the first stellar models (Lane, Ritter, Schuster). The first successful spectral equation of black body radiation was the theory of continuous spectra of celestial bodies by Radó von Kövesligethy (published 1885 in Hungarian, 1890 in German). Kövesligethy made several assumptions on the matter-radiation interaction: radiating matter consists of interacting particles, the form of interaction is an inverse power law, the radiation field is represented by the aether, the aether is also made of interacting particles, light is the propagation of the oscillation of the aether particles, and there is an equipartition between the oscillation energy of material and aetheric particles. Based on these assumptions, he derived a spectral equation with the following properties: the spectral distribution of radiation depends only on the temperature, the total irradiated energy is finite (15 years before Planck!), and the wavelength of the intensity maximum is inversely proportional to the temperature (eight years before Wien!). Using his spectral equation, he estimated the temperature of several celestial bodies, including the Sun.
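
    In modern notation, the last two properties can be written compactly (a paraphrase in standard symbols, not Kövesligethy's original formulas):

    $$\int_0^\infty I(\lambda, T)\,\mathrm{d}\lambda \;<\; \infty, \qquad \lambda_{\max}\, T \;=\; \mathrm{const},$$

    i.e. a finite total irradiated energy, and a displacement-law form of the kind later associated with Wien and derived within Planck's theory.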

  6. Flight Measurements of Helicopter Blade Motion with a Comparison between Theoretical and Experimental Results

    DTIC Science & Technology

    1947-04-01

    EXPERIMENTAL RESULTS By Garry C. Myers, Jr. SUMMARY In order to provide basic data on helicopter rotor-blade motion, photographic records of ... ABOUT THE AXIS OF NO FEATHERING Reason for conversion.- At the time that the basic theoretical treatments, such as that of reference 1, were made ... of the mechanical means used for achieving it. This fact may be confirmed by inspection but has also been demonstrated mathematically in reference

  7. Theoretical and experimental study of a fiber optic microphone

    NASA Technical Reports Server (NTRS)

    Hu, Andong; Cuomo, Frank W.; Zuckerwar, Allan J.

    1992-01-01

    Modifications to condenser microphone theory yield new expressions for the membrane deflection at its center; these provide the basic theory for the fiber optic microphone. The theoretical analysis for the membrane amplitude and the phase response of the fiber optic microphone is given in detail in terms of its basic geometrical quantities. A relevant extension to the original concepts of the optical microphone includes the addition of a backplate with holes similar in design to present condenser microphone technology. This approach generates improved damping characteristics and extended frequency response that were not previously considered. The construction and testing of the improved optical fiber microphone provide experimental data that are in good agreement with the theoretical analysis.

  8. Ethics and managed care.

    PubMed

    Perkel, R L

    1996-03-01

    Managed care presents physicians with potential ethical dilemmas different from dilemmas in traditional fee-for-service practice. The ethical assumptions of managed care are explored, with special attention to the evolving dual responsibilities of physicians as patient advocates and as entrepreneurs. A number of proposals are described that delineate issues in support of and in opposition to managed care. Through an understanding of how to apply basic ethics principles to managed care participation, physicians may yet hold on to the basic ethic of the fiduciary doctor-patient relationship.

  9. Development of a Multiple Linear Regression Model to Forecast Facility Electrical Consumption at an Air Force Base.

    DTIC Science & Technology

    1981-09-01

    corresponds to the same square footage that consumed the electrical energy. 3. The basic assumptions of multiple linear regression, as enumerated in... 7. Data related to the sample of bases is assumed to be representative of bases in the population. Limitations Basic limitations on this research were... Ratemaking--Overview. Rand Report R-5894, Santa Monica CA, May 1977. Chatterjee, Samprit, and Bertram Price. Regression Analysis by Example. New York: John

  10. Hazards and occupational risk in hard coal mines - a critical analysis of legal requirements

    NASA Astrophysics Data System (ADS)

    Krause, Marcin

    2017-11-01

    This publication concerns problems of occupational safety and health in hard coal mines, the basic elements of which are mining hazards and occupational risk. The work includes a comparative analysis of selected provisions of general and industry-specific law regarding hazard analysis and occupational risk assessment. Based on a critical analysis of the legal requirements, basic assumptions for practical guidelines on occupational risk assessment in underground coal mines are proposed.

  11. Basic Relationships among Scale, Quality, and Benefits in Sino-Foreign Cooperative Education

    ERIC Educational Resources Information Center

    Lin, Jinhui

    2016-01-01

    The basic relationships among scale, quality, and benefits in Sino-foreign cooperative education are key to the development of cooperative education. It is necessary to construct a theoretical framework for the basic relationships among scale, quality, and benefits in Sino-foreign cooperative education and analyze the questions faced in…

  12. NMR properties of 3He-A in biaxially anisotropic aerogel

    NASA Astrophysics Data System (ADS)

    Dmitriev, V. V.; Krasnikhin, D. A.; Senin, A. A.; Yudin, A. N.

    2012-12-01

    The theoretical model of G.E. Volovik for the A-like phase of 3He in aerogel suggests the formation of a Larkin-Imry-Ma state of the Anderson-Brinkmann-Morel order parameter. Most results of NMR studies of the A-like phase are in good agreement with this model under the assumption of uniaxial anisotropy, except for some experiments in weakly anisotropic aerogel samples. We demonstrate that these results can be described within the same model under the assumption of biaxial anisotropy. The anisotropy parameters in these experiments can be determined from the NMR data.

  13. Derivation of an applied nonlinear Schroedinger equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pitts, Todd Alan; Laine, Mark Richard; Schwarz, Jens

    We derive from first principles a mathematical physics model useful for understanding nonlinear optical propagation (including filamentation). All assumptions necessary for the development are clearly explained. We include the Kerr effect, Raman scattering, and ionization (as well as linear and nonlinear shock, diffraction and dispersion). We explain the phenomenological sub-models and each assumption required to arrive at a complete and consistent theoretical description. The development includes the relationship between shock and ionization and demonstrates why inclusion of Drude model impedance effects alters the nature of the shock operator.
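
    For orientation, a minimal cubic nonlinear Schrödinger backbone that such derivations typically extend can be written as follows (the pared-down form and symbols are our assumption, not the authors' final equation, which adds Raman, ionization, and shock terms):

      $i\,\partial_z A + \frac{1}{2 k_0} \nabla_\perp^2 A - \frac{k''}{2}\, \partial_t^2 A + k_0 n_2 |A|^2 A = 0$

    Here $A$ is the field envelope, the second and third terms capture diffraction and group-velocity dispersion, and the $|A|^2 A$ term is the Kerr effect.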

  14. A Basic Literacy Project for the Correctional Service of Canada: Curriculum Design as a Strategy for Staff Development.

    ERIC Educational Resources Information Center

    Collins, Michael

    1989-01-01

    Describes a Canadian curriculum development project; analyzes underlying policy assumptions. Advocates involvement of prison educators and inmates in the process if curriculum is to meet the educational needs of inmates. (Author/LAM)

  15. Computer Applications in Teaching and Learning.

    ERIC Educational Resources Information Center

    Halley, Fred S.; And Others

    Some examples of the usage of computers in teaching and learning are examination generation, automatic exam grading, student tracking, problem generation, computational examination generators, program packages, simulation, and programing skills for problem solving. These applications are non-trivial and do fulfill the basic assumptions necessary…

  16. Probabilistic Simulation of Territorial Seismic Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baratta, Alessandro; Corbi, Ileana

    2008-07-08

    The paper focuses on a stochastic process for forecasting seismic scenarios over a territory, developed by means of some basic assumptions in the procedure and by elaborating the fundamental parameters recorded during ground motions that have occurred in a seismic area.

  17. Elements of a Research Report.

    ERIC Educational Resources Information Center

    Schurter, William J.

    This guide for writing research or technical reports discusses eleven basic elements of such reports and provides examples of "good" and "bad" wordings. These elements are the title, problem statement, purpose statement, need statement, hypothesis, assumptions, procedures, limitations, terminology, conclusion and recommendations. This guide is…

  18. The Estimation Theory Framework of Data Assimilation

    NASA Technical Reports Server (NTRS)

    Cohn, S.; Atlas, Robert (Technical Monitor)

    2002-01-01

    Lecture 1. The Estimation Theory Framework of Data Assimilation: 1. The basic framework: dynamical and observation models; 2. Assumptions and approximations; 3. The filtering, smoothing, and prediction problems; 4. Discrete Kalman filter and smoother algorithms; and 5. Example: A retrospective data assimilation system
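
    As a concrete companion to items 1 and 4, the following is a minimal sketch of one predict-update cycle of the discrete Kalman filter for a linear-Gaussian state-space model; the model matrices and the scalar example are invented for illustration, not taken from the lecture:

      import numpy as np

      # State-space model (assumed): x_k = A x_{k-1} + w,  y_k = H x_k + v,
      # with w ~ N(0, Q) and v ~ N(0, R).
      def kalman_step(x, P, y, A, H, Q, R):
          # Prediction from the dynamical model
          x_pred = A @ x
          P_pred = A @ P @ A.T + Q
          # Update from the observation model
          S = H @ P_pred @ H.T + R                 # innovation covariance
          K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
          x_new = x_pred + K @ (y - H @ x_pred)
          P_new = (np.eye(len(x)) - K @ H) @ P_pred
          return x_new, P_new

      # Example: a scalar random walk observed with noise.
      A = np.array([[1.0]]); H = np.array([[1.0]])
      Q = np.array([[0.01]]); R = np.array([[0.25]])
      x, P = np.array([0.0]), np.array([[1.0]])
      for y in np.random.default_rng(0).normal(1.0, 0.5, size=20):
          x, P = kalman_step(x, P, np.array([y]), A, H, Q, R)

    A fixed-interval smoother then reuses these filtered estimates in a backward pass, which is the retrospective flavor referred to in item 5.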

  19. Inhibition of return: A phenomenon in search of a definition and a theoretical framework.

    PubMed

    Dukewich, Kristie R; Klein, Raymond M

    2015-07-01

    In a study of scientific nomenclature, we explore the diversity of perspectives researchers endorse for the phenomenon of inhibition of return (IOR). IOR is often described as an effect whereby people are slower to respond to a target presented at a recently stimulated or inspected location as compared to a target presented at a new location. Since its discovery, scores of papers have been published on IOR, and researchers have proposed, accepted and rejected a variety of potential causes, mechanisms, effects and components for the phenomenon. Experts in IOR were surveyed about their opinions regarding various aspects of IOR and the literature exploring it. We found variety both between and within the experts surveyed, suggesting that most researchers hold implicit, and often quite unique, assumptions about IOR. These widely varied assumptions may be hindering the creation or acceptance of a central theoretical framework regarding IOR; and this variety may portend that what has been given the label "IOR" may be more than one phenomenon requiring more than one theoretical explanation. We wonder whether scientific progress in domains other than IOR might be affected by applying our nomenclature to too broad (or perhaps too narrow) a range of phenomena.

  20. Using a matrix-analytical approach to synthesizing evidence solved incompatibility problem in the hierarchy of evidence.

    PubMed

    Walach, Harald; Loef, Martin

    2015-11-01

    The hierarchy of evidence presupposes linearity and additivity of effects, as well as commutativity of knowledge structures. It thereby implicitly assumes a classical theoretical model. This is an argumentative article that uses theoretical analysis based on pertinent literature and known facts to examine the standard view of methodology. We show that the assumptions of the hierarchical model are wrong. The knowledge structures gained by various types of studies are not sequentially indifferent, that is, do not commute. External validity and internal validity are at least partially incompatible concepts. Therefore, one needs a different theoretical structure, typical of quantum-type theories, to model this situation. The consequence of this situation is that the implicit assumptions of the hierarchical model are wrong, if generalized to the concept of evidence in total. The problem can be solved by using a matrix-analytical approach to synthesizing evidence. Here, research methods that produce different types of evidence that complement each other are synthesized to yield the full knowledge. We show by an example how this might work. We conclude that the hierarchical model should be complemented by a broader reasoning in methodology. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Relationship between Organizational Culture and the Use of Psychotropic Medicines in Nursing Homes: A Systematic Integrative Review.

    PubMed

    Sawan, Mouna; Jeon, Yun-Hee; Chen, Timothy F

    2018-03-01

    Psychotropic medicines are commonly used in nursing homes, despite marginal clinical benefits and association with harm in the elderly. Organizational culture is proposed as a factor explaining the high-level use of psychotropic medicines. Schein describes three levels of culture: artifacts, espoused values, and basic assumptions. This integrative review aimed to investigate the facets and role of organizational culture in the use of psychotropic medicines in nursing homes. Five databases were searched for qualitative, quantitative, and mixed method empirical studies up to 13 February 2017. Articles were included if they examined an aspect of organizational culture according to Schein's theory and the use of psychotropic medicines in nursing homes for the management of behavioral and sleep disturbances in residents. Article screening and data extraction were performed independently by one reviewer and checked by the research team. The integrative review method, an approach similar to the method of constant comparison analysis was utilized for data analysis. Twenty-four studies met the inclusion criteria: 13 used quantitative methods, 9 used qualitative methods, 1 was quasi-qualitative, and 1 used mixed methods. Included studies were found to only address two aspects of organizational culture in relation to the use of psychotropic medicines: artifacts and espoused values. No studies addressed the basic assumptions, the unsaid taken-for-granted beliefs, which provide explanations for in/consistencies between the ideal use of psychotropic medicines and the actual use of psychotropic medicines. Previous studies suggest that organizational culture influences the use of psychotropic medicines in nursing homes; however, what is known is descriptive of culture only at the surface level, that is the artifacts and espoused values. Hence, future research that explains the impact of the basic assumptions of culture on the use of psychotropic medicines is important.

  2. Empathy for Carnivores

    DTIC Science & Technology

    2013-05-23

    ...this section. It helps to identify and remove cognitive biases and unseen assumptions. THEORETICAL TIES TO EMPATHY We had been hopelessly labouring... attempts to gauge the satisfaction of future circumstances and their sustainability in light of the anticipated future system as a whole. In simulating his

  3. Using graph-based assessments within socratic tutorials to reveal and refine students' analytical thinking about molecular networks.

    PubMed

    Trujillo, Caleb; Cooper, Melanie M; Klymkowsky, Michael W

    2012-01-01

    Biological systems, from the molecular to the ecological, involve dynamic interaction networks. To examine student thinking about networks we used graphical responses, since they are relatively easy to evaluate for implied, but unarticulated, assumptions. Senior college-level molecular biology students were presented with simple molecular-level scenarios; surprisingly, most students failed to articulate the basic assumptions needed to generate reasonable graphical representations; their graphs often contradicted their explicit assumptions. We then developed a tiered Socratic tutorial built around leading questions (prompts) designed to provoke metacognitive reflection. When applied in a group or individual setting, there was clear improvement in targeted areas. Our results highlight the promise of using graphical responses and Socratic prompts in a tutorial context, as both a formative assessment for students and an informative feedback system for instructors. Copyright © 2011 Wiley Periodicals, Inc.

  4. A close examination of double filtering with fold change and t test in microarray analysis

    PubMed Central

    2009-01-01

    Background Many researchers use the double filtering procedure with fold change and t test to identify differentially expressed genes, in the hope that the double filtering will provide extra confidence in the results. Due to its simplicity, the double filtering procedure has been popular with applied researchers despite the development of more sophisticated methods. Results This paper, for the first time to our knowledge, provides theoretical insight on the drawback of the double filtering procedure. We show that fold change assumes all genes to have a common variance while t statistic assumes gene-specific variances. The two statistics are based on contradicting assumptions. Under the assumption that gene variances arise from a mixture of a common variance and gene-specific variances, we develop the theoretically most powerful likelihood ratio test statistic. We further demonstrate that the posterior inference based on a Bayesian mixture model and the widely used significance analysis of microarrays (SAM) statistic are better approximations to the likelihood ratio test than the double filtering procedure. Conclusion We demonstrate through hypothesis testing theory, simulation studies and real data examples, that well constructed shrinkage testing methods, which can be united under the mixture gene variance assumption, can considerably outperform the double filtering procedure. PMID:19995439
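
    To make the procedure under critique concrete, here is a minimal sketch with invented data and cutoffs: a gene is kept only if it passes both a fold-change filter and a t-test filter.

      import numpy as np
      from scipy.stats import ttest_ind

      rng = np.random.default_rng(1)
      n_genes, n_per_group = 1000, 5
      control = rng.normal(0.0, 1.0, size=(n_genes, n_per_group))
      treated = rng.normal(0.0, 1.0, size=(n_genes, n_per_group))
      treated[:50] += 1.5          # spike in 50 truly changed genes

      # Log-scale data assumed, so a difference of means is a log fold change.
      log_fold = treated.mean(axis=1) - control.mean(axis=1)
      t_stat, p_val = ttest_ind(treated, control, axis=1)

      # Double filter: pass BOTH |log fold change| >= 1 AND p < 0.05.
      selected = (np.abs(log_fold) >= 1.0) & (p_val < 0.05)
      print(selected.sum(), "genes pass both filters")

    The paper's point is visible in the two statistics themselves: the fold change ranks genes as if variances were common, while the t statistic rescales each gene by its own variance estimate.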

  5. Thinking in Arithmetic.

    ERIC Educational Resources Information Center

    Resnick, Lauren B.; And Others

    This paper discusses a radically different set of assumptions to improve educational outcomes for disadvantaged students. It is argued that disadvantaged children, when exposed to carefully organized thinking-oriented instruction, can acquire the traditional basic skills in the process of reasoning and solving problems. The paper is presented in…

  6. Measurement of Inequality: The Gini Coefficient and School Finance Studies.

    ERIC Educational Resources Information Center

    Lows, Raymond L.

    1984-01-01

    Discusses application of the "Lorenz Curve" (a graphical representation of the concentration of wealth) with the "Gini Coefficient" (an index of inequality) to measure social inequality in school finance studies. Examines the basic assumptions of these measures and suggests a minor reconception. (MCG)
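
    A minimal sketch of the computation the article discusses, with hypothetical per-pupil spending figures: the Gini coefficient equals one minus twice the area under the Lorenz curve.

      import numpy as np

      def gini(values):
          x = np.sort(np.asarray(values, dtype=float))
          n = x.size
          lorenz = np.cumsum(x) / x.sum()      # cumulative share of spending
          # Trapezoidal area under the piecewise-linear Lorenz curve
          area = (lorenz.sum() - lorenz[-1] / 2.0) / n
          return 1.0 - 2.0 * area

      spending = [4800, 5200, 5600, 7400, 11900]   # hypothetical districts
      print(round(gini(spending), 3))              # 0 = equality, 1 = maximal

    Perfect equality puts the Lorenz curve on the diagonal (area 1/2, Gini 0); concentration of wealth bows the curve downward and pushes the Gini toward one.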

  7. Beyond the Virtues-Principles Debate.

    ERIC Educational Resources Information Center

    Keat, Marilyn S.

    1992-01-01

    Indicates basic ontological assumptions in the virtues-principles debate in moral philosophy, noting Aristotle's and Kant's fundamental ideas about morality and considering a hermeneutic synthesis of theories. The article discusses what acceptance of the synthesis might mean in the theory and practice of moral pedagogy, offering examples of…

  8. The Structuring Principle: Political Socialization and Belief Systems

    ERIC Educational Resources Information Center

    Searing, Donald D.; And Others

    1973-01-01

    Assesses the significance of data on childhood political learning to political theory by testing the 'structuring principle,' considered one of the central assumptions of political socialization research. This principle asserts that 'basic orientations acquired during childhood structure the later learning of specific issue beliefs.' The…

  9. The Experience of Disability.

    ERIC Educational Resources Information Center

    Hastings, Elizabeth

    1981-01-01

    The author outlines the experiences of disability and demonstrates that generally unpleasant experiences are the direct result of a basic and false assumption on the part of society. Experiences of the disabled are discussed in areas the author categorizes as exclusion or segregation, deprivation, prejudice, poverty, frustration, and…

  10. Assessment of the Natural Environment.

    ERIC Educational Resources Information Center

    Cantrell, Mary Lynn; Cantrell, Robert P.

    1985-01-01

    Basic assumptions of an ecological-behavioral view of assessing behavior disordered students are reviewed along with a proposed method for ecological analysis and specific techniques for measuring ecological variables (such as environmental units, behaviors of significant others, and behavioral expectations). The use of such information in program…

  11. Sherlock Holmes as a Social Scientist.

    ERIC Educational Resources Information Center

    Ward, Veronica; Orbell, John

    1988-01-01

    Presents a way of teaching the scientific method through studying the adventures of Sherlock Holmes. Asserting that Sherlock Holmes used the scientific method to solve cases, the authors construct Holmes' method through excerpts from novels featuring his adventures. Discusses basic assumptions, paradigms, theory building, and testing. (SLM)

  12. Problems in Choosing a Theory of Basic Writing: Toward a Rhetoric of Scholarly Discourse.

    ERIC Educational Resources Information Center

    Bizzell, Patricia

    This paper discusses some of the problems faced in working with competing theories of basic writing and suggests its own kind of theoretical analysis of nonstandard writing. A brief overview of basic writing theories is presented, and the theories are categorized into two approaches: a traditional approach of teaching by prescription in an…

  13. Testing the basic assumption of the hydrogeomorphic approach to assessing wetland functions.

    PubMed

    Hruby, T

    2001-05-01

    The hydrogeomorphic (HGM) approach for developing "rapid" wetland function assessment methods stipulates that the variables used are to be scaled based on data collected at sites judged to be the best at performing the wetland functions (reference standard sites). A critical step in the process is to choose the least altered wetlands in a hydrogeomorphic subclass to use as a reference standard against which other wetlands are compared. The basic assumption made in this approach is that wetlands judged to have had the least human impact have the highest level of sustainable performance for all functions. The levels at which functions are performed in these least altered wetlands are assumed to be "characteristic" for the subclass and "sustainable." Results from data collected in wetlands in the lowlands of western Washington suggest that the assumption may not be appropriate for this region. Teams developing methods for assessing wetland functions did not find that the least altered wetlands in a subclass had a range of performance levels that could be identified as "characteristic" or "sustainable." Forty-four wetlands in four hydrogeomorphic subclasses (two depressional subclasses and two riverine subclasses) were rated by teams of experts on the severity of their human alterations and on the level of performance of 15 wetland functions. An ordinal scale of 1-5 was used to quantify alterations in water regime, soils, vegetation, buffers, and contributing basin. Performance of functions was judged on an ordinal scale of 1-7. Relatively unaltered wetlands were judged to perform individual functions at levels that spanned all of the seven possible ratings in all four subclasses. The basic assumption of the HGM approach, that the least altered wetlands represent "characteristic" and "sustainable" levels of functioning that are different from those found in altered wetlands, was not confirmed. Although the intent of the HGM approach is to use level of functioning as a metric to assess the ecological integrity or "health" of the wetland ecosystem, the metric does not seem to work in western Washington for that purpose.

  14. Transmission dynamics of Bacillus thuringiensis infecting Plodia interpunctella: a test of the mass action assumption with an insect pathogen.

    PubMed

    Knell, R J; Begon, M; Thompson, D J

    1996-01-22

    Central to theoretical studies of host-pathogen population dynamics is a term describing transmission of the pathogen. This usually assumes that transmission is proportional to the density of infectious hosts or particles and of susceptible individuals. We tested this assumption with the bacterial pathogen Bacillus thuringiensis infecting larvae of Plodia interpunctella, the Indian meal moth. Transmission was found to increase in a more than linear way with host density in fourth and fifth instar P. interpunctella, and to decrease with the density of infectious cadavers in the case of fifth instar larvae. Food availability was shown to play an important part in this process. Therefore, on a number of counts, the usual assumption was found not to apply in our experimental system.
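
    In the usual notation (assumed here; the paper's own symbols may differ), the assumption under test and the generalization suggested by the findings are:

      mass action: $dS/dt = -\beta S I$        generalized: $dS/dt = -\beta S^{q} I^{p}$

    Mass action corresponds to $p = q = 1$; the reported more-than-linear increase of transmission with host density implies $q > 1$, and the decrease with cadaver density in fifth instars implies $p < 1$.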

  15. Concepts of magnetospheric convection

    NASA Technical Reports Server (NTRS)

    Vasyliunas, V. M.

    1975-01-01

    The paper describes the basic theoretical notions of convection applicable to magnetospheres in general and discusses the relative importance of convective and corotational motions, with particular reference to the comparison of the earth and Jupiter. The basic equations relating the E, B, and J fields and the bulk plasma velocity are given for the three principal regions in magnetosphere dynamics, namely, the central object and its magnetic field, the space surrounding the central object, and the external medium outside the magnetosphere. The notion of driving currents of magnetospheric convection and their closure is explained, while consideration of the added effects of the rotation of the central body completes the basic theoretical picture. Flow topology is examined for the two cases where convection dominates over corotation and vice versa.

  16. Phase Transitions in Combinatorial Optimization Problems: Basics, Algorithms and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    Hartmann, Alexander K.; Weigt, Martin

    2005-10-01

    A concise, comprehensive introduction to the topic of statistical physics of combinatorial optimization, bringing together theoretical concepts and algorithms from computer science with analytical methods from physics. The result bridges the gap between statistical physics and combinatorial optimization, investigating problems taken from theoretical computing, such as the vertex-cover problem, with the concepts and methods of theoretical physics. The authors cover rapid developments and analytical methods that are both extremely complex and spread by word-of-mouth, providing all the necessary basics in required detail. Throughout, the algorithms are shown with examples and calculations, while the proofs are given in a way suitable for graduate students, post-docs, and researchers. Ideal for newcomers to this young, multidisciplinary field.

  17. Feminist Theories and Media Studies.

    ERIC Educational Resources Information Center

    Steeves, H. Leslie

    1987-01-01

    Discusses the assumptions that ground radical, liberal, and socialist feminist theoretical frameworks, and reviews feminist media research. Argues that liberal feminism speaks only to White, heterosexual, middle and upper class women and is incapable of addressing most women's concerns. Concludes that socialist feminism offers the greatest…

  18. Ethnographic/Qualitative Research: Theoretical Perspectives and Methodological Strategies.

    ERIC Educational Resources Information Center

    Butler, E. Dean

    This paper examines the metatheoretical concepts associated with ethnographic/qualitative educational inquiry and overviews the more commonly utilized research designs, data collection methods, and analytical approaches. The epistemological and ontological assumptions of this newer approach differ greatly from those of the traditional educational…

  19. Achievement Goal Orientations and Identity Formation Styles

    ERIC Educational Resources Information Center

    Kaplan, Avi; Flum, Hanoch

    2010-01-01

    The present article points to shared underlying theoretical assumptions and central processes of a prominent academic motivation perspective--achievement goal theory--and recent process perspectives in the identity formation literature, and more specifically, identity formation styles. The review highlights the shared definition of achievement…

  20. An Information Theoretic Analysis of Classification Sorting and Cognition by Ninth Grade Children within a Piagetian Setting.

    ERIC Educational Resources Information Center

    Dunlop, David Livingston

    The purpose of this study was to use an information theoretic memory model to quantitatively investigate classification sorting and recall behaviors of various groups of students. The model provided theorems for the determination of information theoretic measures from which inferences concerning mental processing were made. The basic procedure…

  1. Adopting Basic Principles of the United Nations Academic Impact Initiative (UNAI): Can Cultural Differences Be Predicted from Value Orientations and Globalization?

    PubMed Central

    Nechtelberger, Andrea; Renner, Walter; Nechtelberger, Martin; Supeková, Soňa Chovanová; Hadjimarkou, Maria; Offurum, Chino; Ramalingam, Panchalan; Senft, Birgit; Redfern, Kylie

    2017-01-01

    The United Nations Academic Impact (UNAI) Initiative has set forth 10 Basic Principles for higher education. In the present study, a 10 item self-report questionnaire measuring personal endorsement of these principles has been tested by self-report questionnaires with university and post-graduate students from Austria, China, Cyprus, India, Nigeria, and Slovakia (total N = 976, N = 627 female, mean age 24.7 years, s = 5.7). Starting from the assumptions of Moral Foundations Theory (MFT), we expected that personal attitudes toward the UNAI Basic Principles would be predicted by endorsement of various moral foundations as suggested by MFT and by the individual's degree of globalization. Whereas for the Austrian, Cypriot, and Nigerian sub-samples this assumption was largely confirmed, for the Chinese, Indian, and Slovak sub-samples only small amounts of the variance could be explained by regression models. All six sub-samples differed substantially with regard to their overall questionnaire responses: by five discriminant functions 83.6% of participants were classified correctly. We conclude that implementation of UNAI principles should adhere closely to the cultural requirements of the respective society and, where necessary, should be accompanied by thorough informational campaigns about UN educational goals. PMID:29180977

  2. Aspects of fluency in writing.

    PubMed

    Uppstad, Per Henning; Solheim, Oddny Judith

    2007-03-01

    The notion of 'fluency' is most often associated with spoken-language phenomena such as stuttering. The present article investigates the relevance of considering fluency in writing. The basic argument for raising this question is empirical: it follows from a focus on difficulties in written and spoken language as manifestations of different problems which should be investigated separately on the basis of their symptoms. Key-logging instruments provide new possibilities for the study of writing. The obvious use of this new technology is to study writing as it unfolds in real time, instead of focusing only on aspects of the end product. A more sophisticated application is to exploit the key-logging instrument in order to test basic assumptions of contemporary theories of spelling. The present study is a dictation task involving words and non-words, intended to investigate spelling in nine-year-old pupils with regard to their mastery of the doubling of consonants in Norwegian. In this study, we report on differences with regard to temporal measures between a group of strong writers and a group of poor ones. On the basis of these pupils' writing behavior, the relevance of the concept of 'fluency' in writing is highlighted. The interpretation of the results questions basic assumptions of the cognitive hypothesis about spelling; the article concludes by hypothesizing a different conception of spelling.

  3. A New Framework for Cumulus Parametrization - A CPT in action

    NASA Astrophysics Data System (ADS)

    Jakob, C.; Peters, K.; Protat, A.; Kumar, V.

    2016-12-01

    The representation of convection in climate models remains a major Achilles heel in our pursuit of better predictions of global and regional climate. The basic principle underpinning the parametrisation of tropical convection in global weather and climate models is that there exist discernible interactions between the resolved model scale and the parametrised cumulus scale. Furthermore, there must be at least some predictive power in the larger scales for the statistical behaviour on small scales for us to be able to formally close the parametrised equations. The presentation will discuss a new framework for cumulus parametrisation based on the idea of separating the prediction of cloud area from that of velocity. This idea is put into practice by combining an existing multi-scale stochastic cloud model with observations to arrive at the prediction of the area fraction for deep precipitating convection. Using mid-tropospheric humidity and vertical motion as predictors, the model is shown to reproduce the observed behaviour of both the mean and the variability of deep convective area fraction well. The framework allows for the inclusion of convective organisation and can - in principle - be made resolution-aware or resolution-independent. When combined with simple assumptions about cloud-base vertical motion, the model can be used as a closure assumption in any existing cumulus parametrisation. Results of applying this idea in the ECHAM model indicate significant improvements in the simulation of tropical variability, including but not limited to the MJO. This presentation will highlight how the close collaboration of the observational, theoretical and model development communities in the spirit of the climate process teams can lead to significant progress in long-standing issues in climate modelling while preserving the freedom of individual groups in pursuing their specific implementation of an agreed framework.

  4. Are We Ready for Real-world Neuroscience?

    PubMed

    Matusz, Pawel J; Dikker, Suzanne; Huth, Alexander G; Perrodin, Catherine

    2018-06-19

    Real-world environments are typically dynamic, complex, and multisensory in nature and require the support of top-down attention and memory mechanisms for us to be able to drive a car, make a shopping list, or pour a cup of coffee. Fundamental principles of perception and functional brain organization have been established by research utilizing well-controlled but simplified paradigms with basic stimuli. The last 30 years ushered in a revolution in computational power, brain mapping, and signal processing techniques. Drawing on those theoretical and methodological advances, over the years, research has departed more and more from traditional, rigorous, and well-understood paradigms to directly investigate cognitive functions and their underlying brain mechanisms in real-world environments. These investigations typically address the role of one or, more recently, multiple attributes of real-world environments. Fundamental assumptions about perception, attention, or brain functional organization have been challenged by studies adapting the traditional paradigms to emulate, for example, the multisensory nature or varying relevance of stimulation or dynamically changing task demands. Here, we present the state of the field within the emerging heterogeneous domain of real-world neuroscience. To be precise, the aim of this Special Focus is to bring together a variety of the emerging "real-world neuroscientific" approaches. These approaches differ in their principal aims, assumptions, or even definitions of "real-world neuroscience" research. Here, we showcase the commonalities and distinctive features of the different "real-world neuroscience" approaches. To do so, four early-career researchers and the speakers of the Cognitive Neuroscience Society 2017 Meeting symposium under the same title answer questions pertaining to the added value of such approaches in bringing us closer to accurate models of functional brain organization and cognitive functions.

  5. Scale-dependent Normalized Amplitude and Weak Spectral Anisotropy of Magnetic Field Fluctuations in the Solar Wind Turbulence

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Tu, Chuanyi; Marsch, Eckart; He, Jiansen; Wang, Linghua

    2016-01-01

    Turbulence in the solar wind was recently reported to be anisotropic, with the average power spectral index close to -2 when sampling parallel to the local mean magnetic field B0 and close to -5/3 when sampling perpendicular to the local B0. This result was widely considered to be observational evidence for the critical balance theory (CBT), which is derived by making the assumption that the turbulence strength is close to one. However, this basic assumption has not yet been checked carefully with observational data. Here we present for the first time the scale-dependent magnetic-field fluctuation amplitude, which is normalized by the local B0 and evaluated for both parallel and perpendicular sampling directions, using two 30-day intervals of Ulysses data. From our results, the turbulence strength is evaluated as much less than one at small scales in the parallel direction. An even stricter criterion is imposed when selecting the wavelet coefficients for a given sampling direction, so that the time stationarity of the local B0 is better ensured during the local sampling interval. The spectral index for the parallel direction is then found to be -1.75, whereas the spectral index in the perpendicular direction remains close to -1.65. These two new results, namely that the value of the turbulence strength is much less than one in the parallel direction and that the angle dependence of the spectral index is weak, cannot be explained by existing turbulence theories, like CBT, and thus will require new theoretical considerations and promote further observations of solar-wind turbulence.

  6. Artifacts, assumptions, and ambiguity: Pitfalls in comparing experimental results to numerical simulations when studying electrical stimulation of the heart.

    PubMed

    Roth, Bradley J.

    2002-09-01

    Insidious experimental artifacts and invalid theoretical assumptions complicate the comparison of numerical predictions and observed data. Such difficulties are particularly troublesome when studying electrical stimulation of the heart. During unipolar stimulation of cardiac tissue, the artifacts include nonlinearity of membrane dyes, optical signals blocked by the stimulating electrode, averaging of optical signals with depth, lateral averaging of optical signals, limitations of the current source, and the use of excitation-contraction uncouplers. The assumptions involve electroporation, membrane models, electrode size, the perfusing bath, incorrect model parameters, the applicability of a continuum model, and tissue damage. Comparisons of theory and experiment during far-field stimulation are limited by many of these same factors, plus artifacts from plunge and epicardial recording electrodes and assumptions about the fiber angle at an insulating boundary. These pitfalls must be overcome in order to understand quantitatively how the heart responds to an electrical stimulus. (c) 2002 American Institute of Physics.

  7. New directions in evidence-based policy research: a critical analysis of the literature

    PubMed Central

    2014-01-01

    Despite 40 years of research into evidence-based policy (EBP) and a continued drive from both policymakers and researchers to increase research uptake in policy, barriers to the use of evidence are persistently identified in the literature. However, it is not clear what explains this persistence – whether they represent real factors, or if they are artefacts of approaches used to study EBP. Based on an updated review, this paper analyses this literature to explain persistent barriers and facilitators. We critically describe the literature in terms of its theoretical underpinnings, definitions of ‘evidence’, methods, and underlying assumptions of research in the field, and aim to illuminate the EBP discourse by comparison with approaches from other fields. Much of the research in this area is theoretically naive, focusing primarily on the uptake of research evidence as opposed to evidence defined more broadly, and privileging academics’ research priorities over those of policymakers. Little empirical data analysing the processes or impact of evidence use in policy is available to inform researchers or decision-makers. EBP research often assumes that policymakers do not use evidence and that more evidence – meaning research evidence – use would benefit policymakers and populations. We argue that these assumptions are unsupported, biasing much of EBP research. The agenda of ‘getting evidence into policy’ has side-lined the empirical description and analysis of how research and policy actually interact in vivo. Rather than asking how research evidence can be made more influential, academics should aim to understand what influences and constitutes policy, and produce more critically and theoretically informed studies of decision-making. We question the main assumptions made by EBP researchers, explore the implications of doing so, and propose new directions for EBP research, and health policy. PMID:25023520

  8. Helicopter Toy and Lift Estimation

    ERIC Educational Resources Information Center

    Shakerin, Said

    2013-01-01

    A $1 plastic helicopter toy (called a Wacky Whirler) can be used to demonstrate lift. Students can make basic measurements of the toy, use reasonable assumptions and, with the lift formula, estimate the lift, and verify that it is sufficient to overcome the toy's weight. (Contains 1 figure.)
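
    A worked instance of the intended estimate, with every number assumed rather than taken from the article: the lift formula $L = \frac{1}{2} \rho v^2 A C_L$ with air density $\rho \approx 1.2\ \mathrm{kg/m^3}$, blade area $A \approx 2 \times 10^{-3}\ \mathrm{m^2}$, mean blade speed $v \approx 10\ \mathrm{m/s}$, and lift coefficient $C_L \approx 1$ gives $L \approx 0.5 \times 1.2 \times 10^2 \times 0.002 \approx 0.12\ \mathrm{N}$, comfortably above the roughly $0.05\ \mathrm{N}$ weight of a 5-g toy.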

  9. The Rural School Principalship: Unique Challenges, Opportunities.

    ERIC Educational Resources Information Center

    Hill, Jonathan

    1993-01-01

    Presents findings based on author's research and experience as principal in California's Mojave Desert. Five basic characteristics distinguish the rural principalship: lack of an assistant principal or other support staff; assumption of other duties, including central office tasks, teaching, or management of another site; less severe student…

  10. Teacher Education: Of the People, by the People, and for the People.

    ERIC Educational Resources Information Center

    Clinton, Hillary Rodham

    1985-01-01

    Effective inservice programs are necessary to ensure that current reforms in education are properly implemented. Inservice programs must meet the needs of both the educational system and educators. Six basic policy assumptions dealing with what is needed in inservice education are discussed. (DF)

  11. School Discipline Disproportionality: Culturally Competent Interventions for African American Males

    ERIC Educational Resources Information Center

    Simmons-Reed, Evette A.; Cartledge, Gwendolyn

    2014-01-01

    Exclusionary policies are practiced widely in schools despite being associated with extremely poor outcomes for culturally and linguistically diverse students, particularly African American males with and without disabilities. This article discusses zero tolerance policies, the related research questioning their basic assumptions, and the negative…

  12. Educational Evaluation: Analysis and Responsibility.

    ERIC Educational Resources Information Center

    Apple, Michael W., Ed.; And Others

    This book presents controversial aspects of evaluation and aims at broadening perspectives and insights in the evaluation field. Chapter 1 criticizes modes of evaluation and the basic rationality behind them and focuses on assumptions that have problematic consequences. Chapter 2 introduces concepts of evaluation and examines methods of grading…

  13. General Nature of Multicollinearity in Multiple Regression Analysis.

    ERIC Educational Resources Information Center

    Liu, Richard

    1981-01-01

    Discusses multiple regression, a very popular statistical technique in the field of education. One of the basic assumptions in regression analysis requires that independent variables in the equation should not be highly correlated. The problem of multicollinearity and some of the solutions to it are discussed. (Author)
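
    A standard diagnostic for the problem described is the variance inflation factor, $VIF_j = 1/(1 - R_j^2)$, where $R_j^2$ is obtained by regressing predictor $j$ on the remaining predictors. A minimal sketch with synthetic data (all figures assumed):

      import numpy as np

      def vif(X):
          X = np.asarray(X, dtype=float)
          out = []
          for j in range(X.shape[1]):
              y = X[:, j]
              others = np.delete(X, j, axis=1)
              A = np.column_stack([np.ones(len(y)), others])  # add intercept
              coef, *_ = np.linalg.lstsq(A, y, rcond=None)
              resid = y - A @ coef
              r2 = 1.0 - resid.var() / y.var()
              out.append(1.0 / (1.0 - r2))
          return out

      rng = np.random.default_rng(2)
      x1 = rng.normal(size=100)
      x2 = x1 + rng.normal(scale=0.1, size=100)   # nearly collinear with x1
      x3 = rng.normal(size=100)
      print(vif(np.column_stack([x1, x2, x3])))   # large VIFs flag x1, x2

    Values near 1 indicate independent predictors; values above roughly 5-10 are a common rule of thumb for troublesome collinearity.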

  14. Feminism, Communication and the Politics of Knowledge.

    ERIC Educational Resources Information Center

    Gallagher, Margaret

    Recent retrieval of pre-nineteenth century feminist thought provides a telling lesson in the politics of knowledge creation and control. From a feminist perspective, very little research carried out within the critical research paradigm questions the "basic assumptions, conventional wisdom, media myths and the accepted way of doing…

  15. A Neo-Kohlbergian Approach to Morality Research.

    ERIC Educational Resources Information Center

    Rest, James R.; Narvaez, Darcia; Thoma, Stephen J.; Bebeau, Muriel J.

    2000-01-01

    Proposes a model of moral judgment that builds on Lawrence Kohlberg's core assumptions. Addresses the concerns that have surfaced related to Kohlberg's work in moral judgment. Presents an overview of this model using Kohlberg's basic starting points, ideas from cognitive science, and developments in moral philosophy. (CMK)

  16. Reconciling Time, Space and Function: A New Dorsal-Ventral Stream Model of Sentence Comprehension

    ERIC Educational Resources Information Center

    Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias

    2013-01-01

    We present a new dorsal-ventral stream framework for language comprehension which unifies basic neurobiological assumptions (Rauschecker & Scott, 2009) with a cross-linguistic neurocognitive sentence comprehension model (eADM; Bornkessel & Schlesewsky, 2006). The dissociation between (time-dependent) syntactic structure-building and…

  17. Qualitative Research in Counseling Psychology: Conceptual Foundations

    ERIC Educational Resources Information Center

    Morrow, Susan L.

    2007-01-01

    Beginning with calls for methodological diversity in counseling psychology, this article addresses the history and current state of qualitative research in counseling psychology. It identifies the historical and disciplinary origins as well as basic assumptions and underpinnings of qualitative research in general, as well as within counseling…

  18. Individual behavioral phenotypes: an integrative meta-theoretical framework. Why "behavioral syndromes" are not analogs of "personality".

    PubMed

    Uher, Jana

    2011-09-01

    Animal researchers are increasingly interested in individual differences in behavior. Their interpretation as meaningful differences in behavioral strategies stable over time and across contexts, adaptive, heritable, and acted upon by natural selection has triggered new theoretical developments. However, the analytical approaches used to explore behavioral data still address population-level phenomena, and statistical methods suitable to analyze individual behavior are rarely applied. I discuss fundamental investigative principles and analytical approaches to explore whether, in what ways, and under which conditions individual behavioral differences are actually meaningful. I elaborate the meta-theoretical ideas underlying common theoretical concepts and integrate them into an overarching meta-theoretical and methodological framework. This unravels commonalities and differences, and shows that assumptions of analogy to concepts of human personality are not always warranted and that some theoretical developments may be based on methodological artifacts. Yet, my results also highlight possible directions for new theoretical developments in animal behavior research. Copyright © 2011 Wiley Periodicals, Inc.

  19. On the phase space structure of IP3 induced Ca2+ signalling and concepts for predictive modeling

    NASA Astrophysics Data System (ADS)

    Falcke, Martin; Moein, Mahsa; Tilūnaitė, Agnė; Thul, Rüdiger; Skupin, Alexander

    2018-04-01

    The correspondence between mathematical structures and experimental systems is the basis of the generalizability of results found with specific systems and is the basis of the predictive power of theoretical physics. While physicists have confidence in this correspondence, it is less recognized in cellular biophysics. On the one hand, the complex organization of cellular dynamics involving a plethora of interacting molecules and the basic observation of cell variability seem to question its possibility. The practical difficulties of deriving the equations describing cellular behaviour from first principles support these doubts. On the other hand, ignoring such a correspondence would severely limit the possibility of predictive quantitative theory in biophysics. Additionally, the existence of functional modules (like pathways) across cell types suggests also the existence of mathematical structures with comparable universality. Only a few cellular systems have been sufficiently investigated in a variety of cell types to follow up these basic questions. IP3 induced Ca2+ signalling is one of them, and the mathematical structure corresponding to it is the subject of ongoing discussion. We review the system's general properties observed in a variety of cell types. They are captured by a reaction-diffusion system. We discuss the phase space structure of its local dynamics. The spiking regime corresponds to noisy excitability. Models focussing on different aspects can be derived starting from this phase space structure. We discuss how the initial assumptions on the set of stochastic variables and phase space structure shape the predictions of parameter dependencies of the mathematical models resulting from the derivation.

  1. Intergenerational resource transfers with random offspring numbers

    PubMed Central

    Arrow, Kenneth J.; Levin, Simon A.

    2009-01-01

    A problem common to biology and economics is the transfer of resources from parents to children. We consider the issue under the assumption that the number of offspring is unknown and can be represented as a random variable. There are 3 basic assumptions. The first assumption is that a given body of resources can be divided into consumption (yielding satisfaction) and transfer to children. The second assumption is that the parents' welfare includes a concern for the welfare of their children; this is recursive in the sense that the children's welfares include concern for their children and so forth. However, the welfare of a child from a given consumption is counted somewhat differently (generally less) than that of the parent (the welfare of a child is “discounted”). The third assumption is that resources transferred may grow (or decline). In economic language, investment, including that in education or nutrition, is productive. Under suitable restrictions, precise formulas for the resulting allocation of resources are found, demonstrating that, depending on the shape of the utility curve, uncertainty regarding the number of offspring may or may not favor increased consumption. The results imply that wealth (stock of resources) will ultimately have a log-normal distribution. PMID:19617553
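
    The three assumptions combine into a recursion of the following form (the notation is ours and deliberately simplified): resources $w$ are split into consumption $c$ and a transfer $w - c$ that grows by a factor $g$ and is divided among a random number $N$ of children,

      $V(w) = \max_{0 \le c \le w} \left\{ u(c) + \delta\, \mathbb{E}_N\!\left[ N \cdot V\!\left( g (w - c) / N \right) \right] \right\}$

    where $u$ is the satisfaction from consumption and $\delta < 1$ is the discount applied to each child's welfare.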

  2. Telepresence for space: The state of the concept

    NASA Technical Reports Server (NTRS)

    Smith, Randy L.; Gillan, Douglas J.; Stuart, Mark A.

    1990-01-01

    The purpose here is to examine the concept of telepresence critically. To accomplish this goal, first, the assumptions that underlie telepresence and its applications are examined, and second, the issues raised by that examination are discussed. Also, these assumptions and issues are used as a means of shifting the focus in telepresence from development to user-based research. The most basic assumption of telepresence is that the information being provided to the human must be displayed in a natural fashion, i.e., the information should be displayed to the same human sensory modalities, and in the same fashion, as if the person were actually at the remote site. A further fundamental assumption for the functional use of telepresence is that a sense of being present in the work environment will produce superior performance. In other words, that sense of being there would allow the human operator of a distant machine to take greater advantage of his or her considerable perceptual, cognitive, and motor capabilities in the performance of a task than would more limited task-related feedback. Finally, a third fundamental assumption of functional telepresence is that the distant machine under the operator's control must substantially resemble a human in dexterity.

  3. Questionable assumptions hampered interpretation of a network meta-analysis of primary care depression treatments.

    PubMed

    Linde, Klaus; Rücker, Gerta; Schneider, Antonius; Kriston, Levente

    2016-03-01

    We aimed to evaluate the underlying assumptions of a network meta-analysis investigating which depression treatment works best in primary care and to highlight challenges and pitfalls of interpretation under consideration of these assumptions. We reviewed 100 randomized trials investigating pharmacologic and psychological treatments for primary care patients with depression. Network meta-analysis was carried out within a frequentist framework using response to treatment as outcome measure. Transitivity was assessed by epidemiologic judgment based on theoretical and empirical investigation of the distribution of trial characteristics across comparisons. Homogeneity and consistency were investigated by decomposing the Q statistic. There were important clinical and statistically significant differences between "pure" drug trials comparing pharmacologic substances with each other or placebo (63 trials) and trials including a psychological treatment arm (37 trials). Overall network meta-analysis produced results well comparable with separate meta-analyses of drug trials and psychological trials. Although the homogeneity and consistency assumptions were mostly met, we considered the transitivity assumption unjustifiable. An exchange of experience between reviewers and, if possible, some guidance on how reviewers addressing important clinical questions can proceed in situations where important assumptions for valid network meta-analysis are not met would be desirable. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Response time distributions in rapid chess: a large-scale decision making experiment.

    PubMed

    Sigman, Mariano; Etchemendy, Pablo; Slezak, Diego Fernández; Cecchi, Guillermo A

    2010-01-01

    Rapid chess provides an unparalleled laboratory to understand decision making in a natural environment. In a chess game, players choose consecutively around 40 moves in a finite time budget. The goodness of each choice can be determined quantitatively since current chess algorithms estimate precisely the value of a position. Web-based chess produces vast amounts of data, millions of decisions per day, incommensurable with traditional psychological experiments. We generated a database of response times (RTs) and position values in rapid chess games. We measured robust emergent statistical observables: (1) RT distributions are long-tailed and show qualitatively distinct forms at different stages of the game; (2) RTs of successive moves are highly correlated both for intra- and inter-player moves. These findings have theoretical implications since they deny two basic assumptions of sequential decision making algorithms: RTs are not stationary and cannot be generated by a state-function. Our results also have practical implications. First, we characterized the capacity of blunders and score fluctuations to predict a player's strength, which is yet an open problem in chess software. Second, we show that the winning likelihood can be reliably estimated from a weighted combination of remaining times and position evaluation.
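
    A minimal sketch of how the two emergent observables could be checked on one game's sequence of response times; the series below is a simulated stand-in, not the chess database:

      import numpy as np

      rng = np.random.default_rng(3)
      rts = rng.lognormal(mean=0.0, sigma=1.0, size=40)  # long-tailed stand-in

      # Tail heaviness: a mean far above the median is a crude long-tail signal.
      print("mean/median ratio:", rts.mean() / np.median(rts))

      # Serial dependence: lag-1 correlation of successive move RTs.
      print("lag-1 correlation:", np.corrcoef(rts[:-1], rts[1:])[0, 1])

    In real games the lag-1 correlation is substantial, which is exactly the property that rules out stationary, state-function-generated RTs.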

  6. Capillary Flow in Containers of Polygonal Section: Theory and Experiment

    NASA Technical Reports Server (NTRS)

    Weislogel, Mark M.; Rame, Enrique (Technical Monitor)

    2001-01-01

    An improved understanding of the large-length-scale capillary flows arising in a low-gravity environment is critical to the engineering community concerned with the design and analysis of spacecraft fluids management systems. Because a significant portion of liquid behavior in spacecraft is capillary dominated, it is natural to consider designs that best exploit the spontaneous character of such flows. In the present work, a recently verified asymptotic analysis is extended to approximate spontaneous capillary flows in a large class of cylindrical containers of irregular polygonal section experiencing a step reduction in gravitational acceleration. Drop tower tests are conducted using partially-filled irregular triangular containers for comparison with the theoretical predictions. The degree to which the experimental data agree with the theory is a testament to the robustness of the basic analytical assumption of predominantly parallel flow. As a result, the closed form analytical expressions presented serve as simple, accurate tools for predicting bulk flow characteristics essential to practical low-g system design and analysis. Equations for predicting corner wetting rates, total container flow rates, and transient surface shapes are provided that are relevant also to terrestrial applications such as capillary flow in porous media.
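
    As a toy illustration of the kind of closed-form prediction described, the sketch below evaluates a corner-wetting front that advances diffusively, L(t) = G sqrt(t), the classic scaling for spontaneous capillary corner flow; the coefficient G, which in the theory depends on corner geometry and fluid properties, is an assumed placeholder here, not a value from this work.

        import numpy as np

        G = 0.01                       # assumed geometry/fluid coefficient, m / s**0.5
        t = np.linspace(0.0, 2.2, 12)  # a typical drop-tower low-g window, s
        L = G * np.sqrt(t)             # corner wetting length, m
        for ti, Li in zip(t, L):
            print(f"t = {ti:4.2f} s, L = {100 * Li:5.2f} cm")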

  7. Thanatos and massive psychic trauma: the impact of the death instinct on knowing, remembering, and forgetting.

    PubMed

    Laub, Dori; Lee, Susanna

    2003-01-01

    The connection between massive psychic trauma and the concept of the death instinct is explored using the basic assumptions that the death instinct is unleashed through and is in a sense characteristic of traumatic experience, and that the concept of the death instinct is indispensable to the understanding and treatment of trauma. Characteristics of traumatic experience, such as dissolution of the empathic bond, failure to assimilate experience into psychic representation and structure, a tendency to repeat traumatic experience, and a resistance to remembering and knowing, are considered as trauma-induced death instinct derivatives. An initial focus is on the individual, on how death instinct manifestations can be discerned in the survivors of trauma. Next the intergenerational force of trauma is examined; a clinical vignette illustrates how the death instinct acts on and is passed on to the children of survivors. Finally, the cultural or societal aspects of trauma are considered, with an eye to how death instinct derivatives permeate cultural responses (or failures to respond) to trauma. Because trauma causes a profound destructuring and decathexis, it is concluded that the concept of the death instinct is a clinical and theoretical necessity.

  8. The inverse niche model for food webs with parasites

    USGS Publications Warehouse

    Warren, Christopher P.; Pascual, Mercedes; Lafferty, Kevin D.; Kuris, Armand M.

    2010-01-01

    Although parasites represent an important component of ecosystems, few field and theoretical studies have addressed the structure of parasites in food webs. We evaluate the structure of parasitic links in an extensive salt marsh food web, with a new model distinguishing parasitic links from non-parasitic links among free-living species. The proposed model is an extension of the niche model for food web structure, motivated by the potential role of size (and related metabolic rates) in structuring food webs. The proposed extension captures several properties observed in the data, including patterns of clustering and nestedness, better than does a random model. By relaxing specific assumptions, we demonstrate that two essential elements of the proposed model are the similarity of a parasite's hosts and the increasing degree of parasite specialization, along a one-dimensional niche axis. Thus, inverting one of the basic rules of the original model, the rule determining consumers' generality, appears critical. Our results support the role of size as one of the organizing principles underlying niche space and food web topology. They also strengthen the evidence for the non-random structure of parasitic links in food webs and open the door to addressing questions concerning the consequences and origins of this structure.
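
    For context, a minimal sketch of the original Williams-Martinez niche model that the inverse model extends; parameter values are illustrative. Each species receives a niche value, a feeding-range width drawn so that expected connectance matches a target C, and a range centre placed below its own niche value; the inverse model described above alters the rule that sets consumers' generality.

        import numpy as np

        rng = np.random.default_rng(0)

        def niche_model(S=30, C=0.1):
            """Original niche model (the base structure the authors invert)."""
            beta = (1.0 - 2.0 * C) / (2.0 * C)  # tunes expected connectance to C
            n = rng.uniform(0.0, 1.0, S)        # niche values on [0, 1]
            r = n * rng.beta(1.0, beta, S)      # feeding-range widths
            c = rng.uniform(r / 2.0, n)         # feeding-range centres
            A = np.zeros((S, S), dtype=int)     # A[i, j] = 1 if i eats j
            for i in range(S):
                lo, hi = c[i] - r[i] / 2.0, c[i] + r[i] / 2.0
                A[i] = ((n >= lo) & (n <= hi)).astype(int)
            return A

        A = niche_model()
        print("realized connectance:", A.sum() / A.size)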

  9. Investigation of the asymptotic state of rotating turbulence using large-eddy simulation

    NASA Technical Reports Server (NTRS)

    Squires, Kyle D.; Chasnov, Jeffrey R.; Mansour, Nagi N.; Cambon, Claude

    1993-01-01

    The study of turbulent flows in rotating reference frames has long been an area of considerable scientific and engineering interest. Because of its importance, the subject has motivated over the years a large number of theoretical, experimental, and computational studies. The bulk of these previous works has served to demonstrate that the effect of system rotation on turbulence is subtle and remains exceedingly difficult to predict. Of particular interest in many studies, including the present work, is the effect of solid-body rotation on an initially isotropic turbulent flow. One of the principal reasons for interest in this flow is that it represents the most basic turbulent flow whose structure is altered by system rotation, without the complicating effects introduced by mean strains or flow inhomogeneities. The assumption of statistical homogeneity considerably simplifies analysis and computation. The principal objective of the present study has been to examine the asymptotic state of solid-body rotation applied to an initially isotropic, high Reynolds number turbulent flow. Of particular interest has been to determine the degree of two-dimensionalization and the existence of asymptotic self-similar states in homogeneous rotating turbulence.

  10. Comparison of Inboard-Outboard Pedestal Temperature Measurements in JET Using ECE Diagnostics

    NASA Astrophysics Data System (ADS)

    Barrera, L.; de la Luna, E.; Figini, L.

    2008-03-01

    Despite considerable theoretical and experimental effort, a complete physical model describing the particle and energy losses during ELMs remains elusive. On the experimental front, an improved description of the spatial structure (poloidal asymmetry, radial distribution) and the dynamics of the ELM crash is a key requirement for answering some of the basic outstanding questions concerning the physics of ELMs. A significant number of diagnostics are now capable of fast measurements of the pedestal profile during an ELM; however, there is a lack of data from the inboard midplane, so assumptions of poloidal symmetry on the flux surfaces often have to be made. The aim of this work is to explore the capabilities of the electron cyclotron emission (ECE) diagnostics to provide simultaneous measurements of the edge temperature for both the inboard and outboard plasma midplane. Access to the inboard region of the plasma is achieved in JET by using 1st harmonic O-mode polarization, as it is not affected by harmonic overlap with the 2nd harmonic. This paper focuses on the validation of the inboard ECE data and the identification of the limitations of the measurements and the data analysis.

  11. Architectures for Quantum Simulation Showing a Quantum Speedup

    NASA Astrophysics Data System (ADS)

    Bermejo-Vega, Juan; Hangleiter, Dominik; Schwarz, Martin; Raussendorf, Robert; Eisert, Jens

    2018-04-01

    One of the main aims in the field of quantum simulation is to achieve a quantum speedup, often referred to as "quantum computational supremacy": the experimental realization of a quantum device that computationally outperforms classical computers. In this work, we show that one can devise versatile and feasible schemes of two-dimensional, dynamical quantum simulators showing such a quantum speedup, building on intermediate problems involving nonadaptive, measurement-based quantum computation. In each of the schemes, an initial product state is prepared, potentially involving an element of randomness as in disordered models, followed by a short-time evolution under a basic translationally invariant Hamiltonian with simple nearest-neighbor interactions and a mere sampling measurement in a fixed basis. The correctness of the final-state preparation in each scheme is fully efficiently certifiable. We discuss experimental necessities and possible physical architectures, inspired by platforms of cold atoms in optical lattices and a number of others, as well as specific assumptions that enter the complexity-theoretic arguments. This work shows that benchmark settings exhibiting a quantum speedup may require little control, in contrast to universal quantum computing. Thus, our proposal puts a convincing experimental demonstration of a quantum speedup within reach in the near term.

  12. Re-evaluating the link between brain size and behavioural ecology in primates.

    PubMed

    Powell, Lauren E; Isler, Karin; Barton, Robert A

    2017-10-25

    Comparative studies have identified a wide range of behavioural and ecological correlates of relative brain size, with results differing between taxonomic groups, and even within them. In primates for example, recent studies contradict one another over whether social or ecological factors are critical. A basic assumption of such studies is that with sufficiently large samples and appropriate analysis, robust correlations indicative of selection pressures on cognition will emerge. We carried out a comprehensive re-examination of correlates of primate brain size using two large comparative datasets and phylogenetic comparative methods. We found evidence in both datasets for associations between brain size and ecological variables (home range size, diet and activity period), but little evidence for an effect of social group size, a correlation which has previously formed the empirical basis of the Social Brain Hypothesis. However, reflecting divergent results in the literature, our results exhibited instability across datasets, even when they were matched for species composition and predictor variables. We identify several potential empirical and theoretical difficulties underlying this instability and suggest that these issues raise doubts about inferring cognitive selection pressures from behavioural correlates of brain size. © 2017 The Author(s).
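
    The phylogenetic comparative machinery behind such analyses reduces, in its simplest form, to generalized least squares with a phylogenetic covariance matrix. A minimal sketch with invented data follows; in a real analysis the covariance matrix C would be derived from a dated phylogeny rather than written by hand.

        import numpy as np

        # Hypothetical data: y = log brain size; X = intercept and log home-range size
        y = np.array([2.1, 2.5, 1.8, 3.0, 2.7])
        X = np.column_stack([np.ones(5), [0.5, 1.0, 0.2, 1.6, 1.2]])

        # Phylogenetic covariance (shared branch lengths), assumed known here
        C = np.array([[1.0, 0.5, 0.2, 0.1, 0.1],
                      [0.5, 1.0, 0.2, 0.1, 0.1],
                      [0.2, 0.2, 1.0, 0.1, 0.1],
                      [0.1, 0.1, 0.1, 1.0, 0.6],
                      [0.1, 0.1, 0.1, 0.6, 1.0]])

        Ci = np.linalg.inv(C)
        beta = np.linalg.solve(X.T @ Ci @ X, X.T @ Ci @ y)  # PGLS/GLS estimate
        print("intercept, slope:", beta)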

  13. Non-Classical Order in Sphere Forming ABAC Tetrablock Copolymers

    NASA Astrophysics Data System (ADS)

    Zhang, Jingwen; Sides, Scott; Bates, Frank

    2013-03-01

    AB diblock and ABC triblock copolymers have been studied thoroughly. ABAC tetrablock copolymers, which represent the simplest variation on the ABC triblock, breaking the molecular symmetry by inserting some of the A block between the B and C blocks, were studied systematically in this research. The model system is poly(styrene-b-isoprene-b-styrene-b-ethylene oxide) (SISO) tetrablock terpolymers, and the resulting morphologies were characterized by nuclear magnetic resonance, gel permeation chromatography, small-angle X-ray scattering, transmission electron microscopy, differential scanning calorimetry, and dynamic mechanical spectroscopy. Two novel phases are reported for the first time in a single-component block copolymer: a hexagonally ordered spherical phase and a tentatively identified dodecagonal quasicrystalline (QC) phase. In particular, the discovery of the QC phase bridges the world of soft matter to that of metals. This unusual set of morphologies is discussed in the context of segregation under the constraints associated with the tetrablock molecular architecture. Theoretical calculations based on the assumption of Gaussian chain statistics provide valuable insights into the molecular configurations associated with these morphologies. This work was supported by the U.S. Department of Energy, Basic Energy Sciences, Division of Materials Science and Engineering, under contract number DEAC05-00OR22725 with UT-Battelle LLC at Oak Ridge National Laboratory.

  14. Electron theory of fast and ultrafast dissipative magnetization dynamics.

    PubMed

    Fähnle, M; Illg, C

    2011-12-14

    For metallic magnets we review the experimental and electron-theoretical investigations of fast magnetization dynamics (on a timescale of ns to 100 ps) and of laser-pulse-induced ultrafast dynamics (a few hundred fs). It is argued that for both situations the dominant contributions to the dissipative part of the dynamics arise from the excitation of electron-hole pairs and from the subsequent relaxation of these pairs by spin-dependent scattering processes, which transfer angular momentum to the lattice. By effective field theories (generalized breathing and bubbling Fermi-surface models) it is shown that the Gilbert equation of motion, which is often used to describe the fast dissipative magnetization dynamics, must be extended in several aspects. The basic assumptions of the Elliott-Yafet theory, which is often used to describe the ultrafast spin relaxation after laser-pulse irradiation, are discussed very critically. However, it is shown that for Ni this theory probably yields a value for the spin-relaxation time T(1) in good agreement with the experimental value. A relation between the quantity α characterizing the damping of the fast dynamics in simple situations and the time T(1) is derived. © 2011 IOP Publishing Ltd
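
    The Gilbert equation referred to above can be integrated directly in its Landau-Lifshitz form to show how the damping constant α drives relaxation toward the effective field. The sketch below uses toy parameter values (field, damping, time step), not numbers from the review.

        import numpy as np

        gamma, alpha = 1.76e11, 0.02       # gyromagnetic ratio (rad/s/T), Gilbert damping
        H = np.array([0.0, 0.0, 1.0])      # effective field along z, T
        m = np.array([1.0, 0.0, 0.0])      # unit magnetization, initially transverse
        dt, steps = 1e-14, 100000          # 1 ns of dynamics in total

        pref = gamma / (1.0 + alpha ** 2)
        for _ in range(steps):
            mxH = np.cross(m, H)
            dm = -pref * (mxH + alpha * np.cross(m, mxH))
            m = m + dm * dt
            m /= np.linalg.norm(m)         # keep |m| = 1
        print(m)                           # m has relaxed close to the field direction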

  15. Chemical bond and superconductivity. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Messmer, R.P.

    1987-07-01

    The search for understanding of the physical mechanisms operating in the recently discovered high-Tc superconductors forces a re-examination of the basic concepts and physical assumptions of current theoretical approaches. The attractive interaction of a more general theory may be rather more complicated than the electron-phonon interaction usually assumed; in fact, it probably contains the critical chemical parameters of the material. This is the motivation for the present work, in which the focus is two-fold: first, to call attention to some recent developments in our understanding of the chemical bond, and second, to propose that this new understanding is not only germane to the electronic structure of solids but also provides a new perspective on the relationship between the chemical bond and superconductivity. Studying the connection between chemical bonding and superconductivity would seem to be a rather academic exercise if it were not for the high-temperature superconductors. These materials have drawn attention in a dramatic fashion to the ignorance that exists in relating chemistry to the important physical parameters of a superconductor. Although this point was raised in numerous contributions by Matthias, its full import was not apparent as long as the known superconductors were traditional metals and alloys.

  16. "Owning" Knowledge: Looking beyond Politics to Find the Public Good

    ERIC Educational Resources Information Center

    Bernstein-Sierra, Samantha

    2017-01-01

    This chapter explores the theoretical assumptions underlying both the IP system and its counternarrative, academic openness, to encourage stakeholders to look beyond extremes as depicted in political rhetoric, and find a compromise consistent with the common mission of faculty, universities, and publishers.

  17. Epilepsy: An Overview for the Special Educator.

    ERIC Educational Resources Information Center

    Nivens, Maryruth K.

    Intended to dispel myths concerning epilepsy, the paper discusses the history, symptoms and characteristics, possible causes, and current medication approaches to the condition; theoretical assumptions are traced and a definition explained. Charts depict the location of discharge; seizure patterns and accompanying physical/psychological symptoms;…

  18. Cognitive-Developmental and Behavior-Analytic Theories: Evolving into Complementarity

    ERIC Educational Resources Information Center

    Overton, Willis F.; Ennis, Michelle D.

    2006-01-01

    Historically, cognitive-developmental and behavior-analytic approaches to the study of human behavior change and development have been presented as incompatible alternative theoretical and methodological perspectives. This presumed incompatibility has been understood as arising from divergent sets of metatheoretical assumptions that take the form…

  19. Dreaming and Schizophrenia.

    ERIC Educational Resources Information Center

    Stickney, Jeffrey L.

    Parallels between dream states and schizophrenia suggest that the study of dreams may offer some information about schizophrenia. A major theoretical assumption of the research on dreaming and schizophrenia is that, in schizophrenics, the dream state intrudes on the awake state creating a dreamlike symptomatology. This theory, called the REM…

  20. Escaping the Tyranny of Belief

    ERIC Educational Resources Information Center

    Wiswell, Albert K.; Wells, C. Leanne

    2004-01-01

    This study describes an action research case study through which the dynamics of identifying and changing strongly held assumptions illustrate the differences between experiences that serve to strengthen beliefs from those that lead to learning. Theoretical considerations are presented linking cognitive schema, action science, attribution theory,…

  1. Relative coronal abundances derived from X-ray observations 3: The effect of cascades on the relative intensity of Fe (XVII) line fluxes, and a revised iron abundance

    NASA Technical Reports Server (NTRS)

    Walker, A. B. C., Jr.; Rugge, H. R.; Weiss, K.

    1974-01-01

    Permitted lines in the optically thin coronal X-ray spectrum were analyzed to find the distribution of coronal material, as a function of temperature, without special assumptions concerning coronal conditions. The resonance lines of N, O, Ne, Na, Mg, Al, Si, S, and Ar which dominate the quiet coronal spectrum below 25 Å were observed. Coronal models were constructed and the relative abundances of these elements were determined. The intensity in the lines of the 2p-3d transitions near 15 Å was used in conjunction with these coronal models, with the assumption of coronal excitation, to determine the Fe XVII abundance. The relative intensities of the 2p-3d Fe XVII lines observed in the corona agreed with theoretical prediction. Using a more complete theoretical model, and higher resolution observations, a revised calculation of the iron abundance relative to hydrogen of 0.000026 was made.

  2. Survival estimation and the effects of dependency among animals

    USGS Publications Warehouse

    Schmutz, Joel A.; Ward, David H.; Sedinger, James S.; Rexstad, Eric A.

    1995-01-01

    Survival models assume that fates of individuals are independent, yet the robustness of this assumption has been poorly quantified. We examine how empirically derived estimates of the variance of survival rates are affected by dependency in survival probability among individuals. We used Monte Carlo simulations to generate known amounts of dependency among pairs of individuals and analyzed these data with Kaplan-Meier and Cormack-Jolly-Seber models. Dependency significantly increased these empirical variances as compared to theoretically derived estimates of variance from the same populations. Using resighting data from 168 pairs of black brant, we used a resampling procedure and program RELEASE to estimate empirical and mean theoretical variances. We estimated that the relationship between paired individuals caused the empirical variance of the survival rate to be 155% larger than the empirical variance for unpaired individuals. Monte Carlo simulations and use of this resampling strategy can provide investigators with information on how robust their data are to this common assumption of independent survival probabilities.
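
    The effect described, dependency inflating the empirical variance above the theoretical (independence-based) variance, can be reproduced with a few lines of simulation. In the sketch below, a fraction rho of pairs share a single survival outcome; all parameter values are illustrative, not the study's.

        import numpy as np

        rng = np.random.default_rng(1)
        p, n_pairs, reps, rho = 0.85, 168, 5000, 0.5

        est = []
        for _ in range(reps):
            shared = rng.random(n_pairs) < rho            # does the pair share one fate?
            fate_a = rng.random(n_pairs) < p
            fate_b = np.where(shared, fate_a, rng.random(n_pairs) < p)
            est.append(np.concatenate([fate_a, fate_b]).mean())

        emp_var = np.var(est)
        theo_var = p * (1.0 - p) / (2 * n_pairs)          # binomial, assumes independence
        print("variance inflation:", emp_var / theo_var)  # roughly 1 + rho here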

  3. A control-volume method for analysis of unsteady thrust augmenting ejector flows

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.

    1988-01-01

    A method for predicting transient thrust-augmenting ejector characteristics is presented. The analysis blends classic self-similar turbulent jet descriptions with a control-volume discretization of the mixing region to capture transient effects in a new way. Division of the ejector into an inlet, diffuser, and mixing region corresponds with the assumption of viscous-dominated phenomena in the latter. The inlet and diffuser analyses are simplified by a quasi-steady treatment, justified by the assumption that pressure is the forcing function in those regions. Details of the theoretical foundation, the solution algorithm, and sample calculations are given.

  4. Quality Control and Nondestructive Evaluation Techniques for Composites. Part 2. Physiochemical Characterization Techniques - A State-of-the Art Review

    DTIC Science & Technology

    1983-05-01

    …in the presence of fillers or without it. The basic assumption made is that the heat of reaction is proportional to the extent of the reaction… a scanning mechanism isolates the frequency range falling on the detector; in this manner the spectrum… [remainder of the scanned excerpt is illegible]

  5. Production process stability - core assumption of INDUSTRY 4.0 concept

    NASA Astrophysics Data System (ADS)

    Chromjakova, F.; Bobak, R.; Hrusecka, D.

    2017-06-01

    Today's industrial enterprises confront the implementation of the INDUSTRY 4.0 concept with a basic problem: stabilised manufacturing and supporting processes. Through stabilisation, they can achieve positive digital management of processes and continuous throughput. Structural stability of horizontal (business) and vertical (digitised) manufacturing processes, supported by the digital technologies of the INDUSTRY 4.0 concept, is required. The results presented in this paper are based on research and a survey conducted in several industrial companies. A basic model for structural process stabilisation in a manufacturing environment is described.

  6. Hepatitis C bio-behavioural surveys in people who inject drugs-a systematic review of sensitivity to the theoretical assumptions of respondent driven sampling.

    PubMed

    Buchanan, Ryan; Khakoo, Salim I; Coad, Jonathan; Grellier, Leonie; Parkes, Julie

    2017-07-11

    New, more effective and better-tolerated therapies for hepatitis C (HCV) have made the elimination of HCV a feasible objective. However, for this to be achieved, it is necessary to have a detailed understanding of HCV epidemiology in people who inject drugs (PWID). Respondent-driven sampling (RDS) can provide prevalence estimates in hidden populations such as PWID. The aims of this systematic review are to identify published studies that use RDS in PWID to measure the prevalence of HCV, and to compare each study against the STROBE-RDS checklist to assess their sensitivity to the theoretical assumptions underlying RDS. Searches were undertaken in accordance with PRISMA systematic review guidelines. Included studies were English-language publications in peer-reviewed journals, which reported the use of RDS to recruit PWID to an HCV bio-behavioural survey. Data were extracted under three headings: (1) survey overview, (2) survey outcomes, and (3) reporting against selected STROBE-RDS criteria. Thirty-one studies met the inclusion criteria. They varied in scale (range 1-15 survey sites) and the sample sizes achieved (range 81-1000 per survey site) but were consistent in describing the use of standard RDS methods including seeds, coupons, and recruitment incentives. Twenty-seven studies (87%) either calculated or reported the intention to calculate population prevalence estimates for HCV, and two used RDS data to calculate the total population size of PWID. Detailed operational and analytical procedures and reporting against selected criteria from the STROBE-RDS checklist varied between studies. There were widespread indications that sampling did not meet the assumptions underlying RDS, which left two studies unable to report an estimated HCV population prevalence in at least one survey location. RDS can be used to estimate a population prevalence of HCV in PWID and to estimate the PWID population size. Accordingly, as a single instrument, it is a useful tool for guiding HCV elimination. However, future studies should report the operational conduct of each survey in accordance with the STROBE-RDS checklist to indicate sensitivity to the theoretical assumptions underlying the method. PROSPERO CRD42015019245.
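
    For readers unfamiliar with RDS estimation, the core of one widely used estimator (RDS-II, the Volz-Heckathorn inverse-degree estimator) fits in a few lines; it assumes sampling probability proportional to self-reported network degree, which is exactly the kind of theoretical assumption the review assesses. The data below are invented.

        import numpy as np

        # Hypothetical recruits: HCV status (1 = positive) and self-reported degree
        status = np.array([1, 0, 1, 1, 0, 1, 0, 1])
        degree = np.array([10, 3, 25, 8, 4, 50, 6, 12])

        w = 1.0 / degree                         # inverse-degree weights
        p_hat = np.sum(w * status) / np.sum(w)   # RDS-II prevalence estimate
        print(p_hat, status.mean())              # weighted vs naive sample proportion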

  7. Testing Atmospheric Retrieval Modeling Assumptions for Transiting Planet Atmospheres: Preparatory science for the James Webb Space Telescope and beyond.

    NASA Astrophysics Data System (ADS)

    Line, Michael

    The field of transiting exoplanet atmosphere characterization has grown considerably over the past decade given the wealth of photometric and spectroscopic data from the Hubble and Spitzer space telescopes. In order to interpret these data, atmospheric models combined with Bayesian approaches are required. From spectra, these approaches permit us to infer fundamental atmospheric properties and how their compositions can relate back to planet formation. However, such approaches must make a wide range of assumptions regarding the physics/parameterizations included in the model atmospheres. There has yet to be a comprehensive investigation exploring how these model assumptions influence our interpretations of exoplanetary spectra. Understanding the impact of these assumptions is especially important since the James Webb Space Telescope (JWST) is expected to invest a substantial portion of its time observing transiting planet atmospheres. It is therefore prudent to optimize and enhance our tools to maximize the scientific return from the revolutionary data to come. The primary goal of the proposed work is to determine the pieces of information we can robustly learn from transiting planet spectra as obtained by JWST and other future, space-based platforms, by investigating commonly overlooked model assumptions. We propose to explore the following effects and how they impact our ability to infer exoplanet atmospheric properties: 1. Stellar/Planetary Uncertainties: Transit/occultation eclipse depths and subsequent planetary spectra are measured relative to their host stars. How do stellar uncertainties, on radius, effective temperature, metallicity, and gravity, as well as uncertainties in the planetary radius and gravity, propagate into the uncertainties on atmospheric composition and thermal structure? Will these uncertainties significantly bias our atmospheric interpretations? Is it possible to use the relative measurements of the planetary spectra to provide additional constraints on the stellar properties? 2. The "1D" Assumption: Atmospheres are inherently three-dimensional. Many exoplanet atmosphere models, especially within retrieval frameworks, assume 1D physics and chemistry when interpreting spectra. How does this "1D" atmosphere assumption bias our interpretation of exoplanet spectra? Do we have to consider global temperature variations such as day-night contrasts or hot spots? What about spatially inhomogeneous molecular abundances and clouds? How will this change our interpretations of phase resolved spectra? 3. Clouds/Hazes: Understanding how clouds/hazes impact transit spectra is absolutely critical if we are to obtain proper estimates of basic atmospheric quantities. How do the assumptions in cloud physics bias our inferences of molecular abundances in transmission? What kind of data (wavelengths, signal-to-noise, resolution) do we need to infer cloud composition, vertical extent, spatial distribution (patchy or global), and size distributions? The proposed work is relevant and timely to the scope of the NASA Exoplanet Research program. The proposed work aims to further develop the critical theoretical modeling tools required to rigorously interpret transiting exoplanet atmosphere data in order to maximize the science return from JWST and beyond. 
This work will serve as a benchmark study for defining the data (wavelength ranges, signal-to-noises, and resolutions) required from a modeling perspective to "characterize exoplanets and their atmospheres in order to inform target and operational choices for current NASA missions, and/or targeting, operational, and formulation data for future NASA observatories". Doing so will allow us to better "understand the chemical and physical processes of exoplanets (their atmospheres)", which will ultimately "improve understanding of the origins of exoplanetary systems" through robust planetary elemental abundance determinations.
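
    One concrete piece of item 1 above, the propagation of stellar and planetary radius uncertainties into the transit depth, follows from depth = (Rp/Rs)^2. A minimal sketch with invented values:

        import numpy as np

        Rp, sRp = 1.2, 0.05          # planet radius and 1-sigma error, Jupiter radii
        Rs, sRs = 1.0, 0.03          # stellar radius and error, solar radii
        RJ_PER_RSUN = 0.1028         # approximate Jupiter-to-solar radius ratio

        k = Rp * RJ_PER_RSUN / Rs    # radius ratio Rp/Rs
        depth = k ** 2               # transit depth, fraction of stellar flux
        frac = 2.0 * np.sqrt((sRp / Rp) ** 2 + (sRs / Rs) ** 2)
        print(f"depth = {1e6 * depth:.0f} ppm +/- {1e6 * depth * frac:.0f} ppm")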

  8. Economic Theory and Broadcasting.

    ERIC Educational Resources Information Center

    Bates, Benjamin J.

    Focusing on access to audience through broadcast time, this paper examines the status of research into the economics of broadcasting. The paper first discusses the status of theory in the study of broadcast economics, both as described directly and as it exists in the statement of the basic assumptions generated by prior work and general…

  9. Dewey and Schon: An Analysis of Reflective Thinking.

    ERIC Educational Resources Information Center

    Bauer, Norman J.

    The challenge to the dominance of rationality in educational philosophy presented by John Dewey and Donald Schon is examined in this paper. The paper identifies basic assumptions of their perspective and explains concepts of reflective thinking, which include biography, context of uncertainty, and "not-yet." A model of reflective thought…

  10. Tiedeman's Approach to Career Development.

    ERIC Educational Resources Information Center

    Harren, Vincent A.

    Basic to Tiedeman's approach to career development and decision making is the assumption that one is responsible for one's own behavior because one has the capacity for choice and lives in a world which is not deterministic. Tiedeman, a cognitive-developmental theorist, views continuity of development as internal or psychological while…

  11. Linking Educational Philosophy with Micro-Level Technology: The Search for a Complete Method.

    ERIC Educational Resources Information Center

    Januszewski, Alan

    Traditionally, educational technologists have not been concerned with social or philosophical questions, and the field does not have a basic educational philosophy. Instead, it is dominated by a viewpoint characterized as "technical rationality" or "technicism"; the most important assumption of this viewpoint is that science…

  12. Conservatism in America--What Does it Mean for Teacher Education?

    ERIC Educational Resources Information Center

    Dolce, Carl J.

    The current conflict among opposing sets of cultural ideals is illustrated by several interrelated conditions. The conservative phenomenon is more complex than the traditional liberal-conservative dichotomy would suggest. Changes in societal conditions invite a reexamination of basic assumptions across the broad spectrum of political ideology.…

  13. Variable thickness transient ground-water flow model. Volume 1. Formulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reisenauer, A.E.

    1979-12-01

    Mathematical formulation for the variable thickness transient (VTT) model of an aquifer system is presented. The basic assumptions are described. Specific data requirements for the physical parameters are discussed. The boundary definitions and solution techniques of the numerical formulation of the system of equations are presented.

  14. A SYSTEMS ANALYSIS OF SCHOOL BOARD ACTION.

    ERIC Educational Resources Information Center

    SCRIBNER, JAY D.

    The basic assumption of the functional-systems theory is that structures fulfill functions in systems and that subsystems operate separately within any type of structure. Relying mainly on Gabriel Almond's paradigm, the author attempts to determine the usefulness of the functional-systems theory in conducting empirical research of school boards.…

  15. Distance-Based and Distributed Learning: A Decision Tool for Education Leaders.

    ERIC Educational Resources Information Center

    McGraw, Tammy M.; Ross, John D.

    This decision tool presents a progression of data collection and decision-making strategies that can increase the effectiveness of distance-based or distributed learning instruction. A narrative and flow chart cover the following steps: (1) basic assumptions, including purpose of instruction, market scan, and financial resources; (2) needs…

  16. Applying the Principles of Specific Objectivity and of Generalizability to the Measurement of Change.

    ERIC Educational Resources Information Center

    Fischer, Gerhard H.

    1987-01-01

    A natural parameterization and formalization of the problem of measuring change in dichotomous data is developed. Mathematically-exact definitions of specific objectivity are presented, and the basic structures of the linear logistic test model and the linear logistic model with relaxed assumptions are clarified. (SLD)
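
    The core constraint of the linear logistic test model (LLTM) named here is that item difficulties are linear combinations of basic parameters, beta = Q eta. A minimal sketch, with an invented weight matrix Q and basic-parameter values:

        import numpy as np

        def rasch_p(theta, beta):
            """Rasch probability of a correct response."""
            return 1.0 / (1.0 + np.exp(-(theta - beta)))

        Q = np.array([[1, 0],    # item 1 requires cognitive operation 1 only
                      [1, 1],    # item 2 requires operations 1 and 2
                      [0, 1]])   # item 3 requires operation 2 only
        eta = np.array([0.4, 0.9])   # assumed basic-parameter (operation) difficulties
        beta = Q @ eta               # implied item difficulties
        print(rasch_p(theta=0.5, beta=beta))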

  17. A Guide to Curriculum Planning in Mathematics. Bulletin No. 6284.

    ERIC Educational Resources Information Center

    Chambers, Donald L.; And Others

    This guide was written under the basic assumptions that the mathematics curriculum must continuously change and that mathematics is most effectively learned through a spiral approach. Further, it is assumed that the audience will be members of district mathematics curriculum committees. Instructional objectives have been organized to reveal the…

  18. Validated Test Method 1315: Mass Transfer Rates of Constituents in Monolithic or Compacted Granular Materials Using a Semi-Dynamic Tank Leaching Procedure

    EPA Pesticide Factsheets

    Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  19. Document-Oriented E-Learning Components

    ERIC Educational Resources Information Center

    Piotrowski, Michael

    2009-01-01

    This dissertation questions the common assumption that e-learning requires a "learning management system" (LMS) such as Moodle or Blackboard. Based on an analysis of the current state of the art in LMSs, we come to the conclusion that the functionality of conventional e-learning platforms consists of basic content management and…

  20. Measuring Protein Interactions by Optical Biosensors

    PubMed Central

    Zhao, Huaying; Boyd, Lisa F.; Schuck, Peter

    2017-01-01

    This unit gives an introduction to the basic techniques of optical biosensing for measuring equilibrium and kinetics of reversible protein interactions. Emphasis is given to the description of robust approaches that will provide reliable results with few assumptions. How to avoid the most commonly encountered problems and artifacts is also discussed. PMID:28369667

  1. A "View from Nowhen" on Time Perception Experiments

    ERIC Educational Resources Information Center

    Riemer, Martin; Trojan, Jorg; Kleinbohl, Dieter; Holzl, Rupert

    2012-01-01

    Systematic errors in time reproduction tasks have been interpreted as a misperception of time and therefore seem to contradict basic assumptions of pacemaker-accumulator models. Here we propose an alternative explanation of this phenomenon based on methodological constraints regarding the direction of time, which cannot be manipulated in…

  2. Teaching Literature: Some Honest Doubts.

    ERIC Educational Resources Information Center

    Rutledge, Donald G.

    1968-01-01

    The possibility that many English teachers take their subject too seriously should be considered. The assumption that literature can to any degree either improve or adversely affect students is doubtful, but the exclusive study of "great literature" in our secondary schools may invite basic reflections too early: a year's steady diet of "King…

  3. East Europe Report, Political, Sociological and Military Affairs, No. 2219

    DTIC Science & Technology

    1983-10-24

    takes place in training booths and classrooms. On the way to warrant officer one must take sociology, Russian, basic construction, materials...polemics. I admit that I like this much more than the obligatory hearty kiss on both cheeks along with, of course, the assumption that polemicists have

  4. Exceptional Children Conference Papers: Behavioral and Emotional Problems.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Arlington, VA.

    Four of the seven conference papers treating behavioral and emotional problems concern the Conceptual Project, an attempt to provide definition and evaluation of conceptual models of the various theories of emotional disturbance and their basic assumptions, and to provide training packages based on these materials. The project is described in…

  5. The Binding Properties of Quechua Suffixes.

    ERIC Educational Resources Information Center

    Weber, David

    This paper sketches an explicitly non-lexicalist application of grammatical theory to Huallaga (Huanuco) Quechua (HgQ). The advantages of applying binding theory to many suffixes that have previously been treated only as objects of the morphology are demonstrated. After an introduction, section 2 outlines basic assumptions about the nature of HgQ…

  6. Validated Test Method 1316: Liquid-Solid Partitioning as a Function of Liquid-to-Solid Ratio in Solid Materials Using a Parallel Batch Procedure

    EPA Pesticide Factsheets

    Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  7. Creating a Healthy Camp Community: A Nurse's Role.

    ERIC Educational Resources Information Center

    Lishner, Kris Miller; Bruya, Margaret Auld

    This book provides an organized, systematic overview of the basic aspects of health program management, nursing practice, and human relations issues in camp nursing. A foremost assumption is that health care in most camps needs improvement. Good health is dependent upon interventions involving social, environmental, and lifestyle factors that…

  8. Fatherless America: Confronting Our Most Urgent Social Problem.

    ERIC Educational Resources Information Center

    Blankenhorn, David

    The United States is rapidly becoming a fatherless society. Fatherlessness is the leading cause of declining child well-being, providing the impetus behind social problems such as crime, domestic violence, and adolescent pregnancy. Challenging the basic assumptions of opinion leaders in academia and in the media, this book debunks the prevailing…

  9. Teaching Strategy: A New Planet.

    ERIC Educational Resources Information Center

    O'Brien, Edward L.

    1998-01-01

    Presents a lesson for middle and secondary school students in which they respond to a hypothetical scenario that enables them to develop a list of basic rights. Expounds that students compare their list of rights to the Universal Declaration of Human Rights in order to explore the assumptions about human rights. (CMK)

  10. Session overview: forest ecosystems

    Treesearch

    John J. Battles; Robert C. Heald

    2004-01-01

    The core assumption of this symposium is that science can provide insight to management. Nowhere is this link more formally established than in regard to the science and management of forest ecosystems. The basic questions addressed are integral to our understanding of nature; the applications of this understanding are crucial to effective stewardship of natural...

  11. A Comprehensive Real-World Distillation Experiment

    ERIC Educational Resources Information Center

    Kazameas, Christos G.; Keller, Kaitlin N.; Luyben, William L.

    2015-01-01

    Most undergraduate mass transfer and separation courses cover the design of distillation columns, and many undergraduate laboratories have distillation experiments. In many cases, the treatment is restricted to simple column configurations and simplifying assumptions are made so as to convey only the basic concepts. In industry, the analysis of a…

  12. Human Praxis: A New Basic Assumption for Art Educators of the Future.

    ERIC Educational Resources Information Center

    Hodder, Geoffrey S.

    1980-01-01

    After analyzing Vincent Lanier's five characteristic roles of art education, the article briefly explains the pedagogy of Paulo Freire, based on human praxis, and applies it to the existing "oppressive" art education system. The article reduces Lanier's roles to resemble a single Freirean model. (SB)

  13. Model-Based Reasoning

    ERIC Educational Resources Information Center

    Ifenthaler, Dirk; Seel, Norbert M.

    2013-01-01

    In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…

  14. Alternate hosts of Blepharipa pratensis (Meigen)

    Treesearch

    Paul A. Godwin; Thomas M. Odell

    1977-01-01

    A current tactic for biological control of the gypsy moth, Lymantria dispar Linnaeus, is to release its parasites in forests susceptible to gypsy moth damage before the gypsy moth arrives. The basic assumption in these anticipatory releases is that the parasites can find and utilize native insects as hosts in the interim. Blepharipa...

  15. Children and Adolescents: Should We Teach Them or Let Them Learn?

    ERIC Educational Resources Information Center

    Rohwer, William D., Jr.

    Research to date has provided too few answers for vital educational questions concerning teaching children or letting them learn. A basic problem is that experimentation usually begins by accepting conventional assumptions about schooling, ignoring experiments that would entail disturbing the ordering of current educational priorities.…

  16. Comparison of theoretical proteomes: identification of COGs with conserved and variable pI within the multimodal pI distribution.

    PubMed

    Nandi, Soumyadeep; Mehra, Nipun; Lynn, Andrew M; Bhattacharya, Alok

    2005-09-09

    Theoretical proteome analysis, generated by plotting theoretical isoelectric points (pI) against molecular masses of all proteins encoded by the genome, shows a multimodal distribution for pI. This multimodal distribution is an effect of allowed combinations of the charged amino acids, and not due to evolutionary causes. The variation in this distribution can be correlated to the organism's ecological niche. Contributions to this variation may be mapped to individual proteins by studying the variation in pI of orthologs across microorganism genomes. The distribution of ortholog pI values showed trimodal distributions for all prokaryotic genomes analyzed, similar to whole-proteome plots. Pairwise analysis of pI variation shows that a few COGs are conserved within, but most vary between, the acidic and basic regions of the distribution, while molecular mass is more highly conserved. At the level of functional grouping of orthologs, five groups vary significantly from the population of orthologs, which is attributed either to conservation at the level of sequences or to a bias for positively or negatively charged residues contributing to the function. Individual COGs conserved in both the acidic and basic regions of the trimodal distribution are identified, and orthologs that best represent the variation in levels of the acidic and basic regions are listed. The analysis of pI distribution using orthologs provides a basis for resolution of theoretical proteome comparison at the level of individual proteins. Orthologs identified that vary significantly between the major acidic and basic regions may be used as representative of the variation of the entire proteome.
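
    The theoretical pI values underlying such plots can be computed directly from sequence. A minimal sketch using Biopython's ProtParam module (assuming Biopython is installed; the short sequences are invented stand-ins for translated ORFs):

        from Bio.SeqUtils.ProtParam import ProteinAnalysis

        seqs = {
            "acidic_like": "MDEEDLLEEDTAEDSEE",
            "basic_like": "MKKRLRKSLKRRAKAKK",
            "neutral": "MAGTLVASSGNTAQLV",
        }
        for name, s in seqs.items():
            pi = ProteinAnalysis(s).isoelectric_point()
            print(f"{name}: pI = {pi:.2f}")

    Run over every protein predicted from a genome, a histogram of these values reproduces the kind of multimodal (acidic/neutral/basic) distribution described above.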

  17. An analytical study of double bend achromat lattice.

    PubMed

    Fakhri, Ali Akbar; Kant, Pradeep; Singh, Gurnam; Ghodke, A D

    2015-03-01

    In a double bend achromat, the Chasman-Green (CG) lattice represents the basic structure for low-emittance synchrotron radiation sources. In the basic structure of the CG lattice, a single focussing quadrupole (QF) magnet is used to form an achromat. In this paper, this CG lattice is discussed and an analytical relation is presented showing the limitation of the basic CG lattice in providing the theoretical minimum beam emittance in the achromatic condition. To satisfy the theoretical minimum beam emittance parameters, achromats having two-, three-, and four-quadrupole structures are presented. In these structures, different arrangements of QF and defocusing quadrupole (QD) magnets are used. An analytical approach treating the quadrupoles as thin lenses has been followed for studying these structures. A study of the Indus-2 lattice, in which the QF-QD-QF configuration has been adopted in the achromat part, is also presented.
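
    The thin-lens approach mentioned here amounts to multiplying 2x2 transfer matrices for drifts and quadrupoles. A minimal sketch with assumed lengths and focal strengths (not Indus-2 values):

        import numpy as np

        def drift(L):
            return np.array([[1.0, L], [0.0, 1.0]])

        def thin_quad(f):
            # f > 0: focusing (QF); f < 0: defocusing (QD) in this plane
            return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

        # Toy QF-QD-QF-like segment; matrices act right to left along the beamline
        M = (drift(1.0) @ thin_quad(1.2) @ drift(0.5) @ thin_quad(-1.5)
             @ drift(0.5) @ thin_quad(1.2) @ drift(1.0))
        print(M)   # segment transfer matrix in the thin-lens approximation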

  18. Properties of the ion-ion hybrid resonator in fusion plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morales, George J.

    2015-10-06

    The project developed theoretical and numerical descriptions of the properties of ion-ion hybrid Alfvén resonators that are expected to arise in the operation of a fusion reactor. The methodology and theoretical concepts were successfully compared to observations made in basic experiments in the LAPD device at UCLA. An assessment was made of the excitation of resonator modes by energetic alpha particles for burning plasma conditions expected in the ITER device. The broader impacts included the generation of basic insight useful to magnetic fusion and space science researchers, defining new avenues for exploration in basic laboratory experiments, establishing broader contacts between experimentalists and theoreticians, completion of a Ph.D. dissertation, and promotion of interest in science through community outreach events and classroom instruction.

  19. Differentiating and defusing theoretical Ecology's criticisms: A rejoinder to Sagoff's reply to Donhauser (2016).

    PubMed

    Donhauser, Justin

    2017-06-01

    In a (2016) paper in this journal, I defuse allegations that theoretical ecological research is problematic because it relies on teleological metaphysical assumptions. Mark Sagoff offers a formal reply. In it, he concedes that I succeeded in establishing that ecologists abandoned robust teleological views long ago and that they use teleological characterizations as metaphors that aid in developing mechanistic explanations of ecological phenomena. Yet he contends that I did not give enduring criticisms of theoretical ecology a fair shake in my paper. He says this is because the enduring criticisms center on concerns about the nature of ecological networks and forces, the instrumentality of ecological laws and theoretical models, and the relation between theoretical and empirical methods in ecology, which that paper does not broach. Below I set apart the distinct criticisms Sagoff presents in his commentary and respond to each in turn. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Satellite Power Systems (SPS) space transportation cost analysis and evaluation

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A picture of Satellite Power Systems space transportation costs at the present time is given with respect to the stated accuracy of the estimates, the reasonableness of the methods used, the assumptions made, and the uncertainty associated with the estimates. The approach used consists of examining space transportation costs from several perspectives, performing a variety of sensitivity analyses or reviews, and examining the findings in terms of internal consistency and external comparison with analogous systems. These approaches are summarized as a theoretical and historical review, including a review of stated and unstated assumptions used to derive the costs, and a performance or technical review. These reviews cover the overall transportation program as well as the individual vehicles proposed. The review of overall cost assumptions is the principal means used for estimating the cost uncertainty. The cost estimates used as the best current estimate are included.

  1. Traumatic memories, eye movements, phobia, and panic: a critical note on the proliferation of EMDR.

    PubMed

    Muris, P; Merckelbach, H

    1999-01-01

    In recent years, Eye Movement Desensitization and Reprocessing (EMDR) has become increasingly popular as a treatment method for Posttraumatic Stress Disorder (PTSD). The current article critically evaluates three recurring assumptions in the EMDR literature: (a) the notion that traumatic memories are fixed and stable and that flashbacks are accurate reproductions of the traumatic incident; (b) the idea that eye movements, or other lateralized rhythmic behaviors, have an inhibitory effect on emotional memories; and (c) the assumption that EMDR is not only effective in treating PTSD, but can also be successfully applied to other psychopathological conditions. There is little support for any of these three assumptions. Meanwhile, the expansion of the theoretical underpinnings of EMDR in the absence of a sound empirical basis casts doubts on the massive proliferation of this treatment method.

  2. The current theoretical assumptions of the Bobath concept as determined by the members of BBTA.

    PubMed

    Raine, Sue

    2007-01-01

    The Bobath concept is a problem-solving approach to the assessment and treatment of individuals following a lesion of the central nervous system that offers therapists a framework for their clinical practice. The aim of this study was to facilitate a group of experts in determining the current theoretical assumptions underpinning the Bobath concept. A four-round Delphi study was used. The expert sample included all 15 members of the British Bobath Tutors Association. Initial statements were identified from the literature, with respondents generating additional statements. Level of agreement was determined by using a five-point Likert scale. Level of consensus was set at 80%. Eighty-five statements were rated from the literature along with 115 generated by the group. Ninety-three statements were identified as representing the theoretical underpinning of the Bobath concept. The Bobath experts agreed that therapists need to be aware of the principles of motor learning such as active participation, opportunities for practice, and meaningful goals. They emphasized that therapy is an interactive process between individual, therapist, and the environment and aims to promote efficiency of movement to the individual's maximum potential rather than normal movement. Treatment was identified by the experts as having "change of functional outcome" at its center.

  3. Interfacing theories of program with theories of evaluation for advancing evaluation practice: Reductionism, systems thinking, and pragmatic synthesis.

    PubMed

    Chen, Huey T

    2016-12-01

    Theories of program and theories of evaluation form the foundation of program evaluation theories. Theories of program reflect assumptions on how to conceptualize an intervention program for evaluation purposes, while theories of evaluation reflect assumptions on how to design useful evaluation. These two types of theories are related but often discussed separately. This paper attempts to use three theoretical perspectives (reductionism, systems thinking, and pragmatic synthesis) to interface them and to discuss the implications for evaluation practice. Reductionism proposes that an intervention program can be broken into crucial components for rigorous analysis; systems thinking views an intervention program as dynamic and complex, requiring holistic examination. In spite of their contributions, reductionism and systems thinking represent the extreme ends of a theoretical spectrum; many real-world programs, however, may fall in the middle. Pragmatic synthesis is being developed to serve these moderate-complexity programs. These three theoretical perspectives have their own strengths and challenges. Knowledge of these three perspectives and their evaluation implications can provide a better guide for designing fruitful evaluations, improving the quality of evaluation practice, informing potential areas for developing cutting-edge evaluation approaches, and contributing to advancing program evaluation toward a mature applied science. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Experimental Control of Simple Pendulum Model

    ERIC Educational Resources Information Center

    Medina, C.

    2004-01-01

    This paper conveys information about a Physics laboratory experiment for students with some theoretical knowledge about oscillatory motion. Students construct a simple pendulum that behaves as an ideal one, and analyze model assumption incidence on its period. The following aspects are quantitatively analyzed: vanishing friction, small amplitude,…

  5. Building Intuitions about Statistical Inference Based on Resampling

    ERIC Educational Resources Information Center

    Watson, Jane; Chance, Beth

    2012-01-01

    Formal inference, which makes theoretical assumptions about distributions and applies hypothesis testing procedures with null and alternative hypotheses, is notoriously difficult for tertiary students to master. The debate about whether this content should appear in Years 11 and 12 of the "Australian Curriculum: Mathematics" has gone on…
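
    A bootstrap interval of the kind such resampling approaches build intuition for takes only a few lines; the data here are invented.

        import numpy as np

        rng = np.random.default_rng(42)
        sample = np.array([12.1, 9.8, 11.4, 10.2, 13.0, 9.5, 10.9, 12.4])

        # Resample with replacement and recompute the mean each time
        boots = np.array([rng.choice(sample, size=sample.size, replace=True).mean()
                          for _ in range(10000)])
        lo, hi = np.percentile(boots, [2.5, 97.5])  # 95% interval, no normality assumption
        print(f"mean = {sample.mean():.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")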

  6. The "New" Economics of Education: Towards a "Unified" Macro/Micro-Educational Planning Policy.

    ERIC Educational Resources Information Center

    Kraft, Richard H.; Nakib, Yasser

    1991-01-01

    Takes issue with conventional human capital theory, questioning assumptions regarding external benefits, internal efficiency, educational purposes, and returns-to-education and manpower needs approaches. Reviews new theoretical directions regarding supply and demand, socialization, labor market segmentation, and overeducation and undereducation,…

  7. The Nature of the University

    ERIC Educational Resources Information Center

    Lenartowicz, Marta

    2015-01-01

    Higher education research frequently refers to the complex external conditions that give our old-fashioned universities a good reason to change. The underlying theoretical assumption of such framing is that organizations are open systems. This paper presents an alternative view, derived from the theory of social systems autopoiesis. It proposes…

  8. Generational Differences in Technology Adoption in Community Colleges

    ERIC Educational Resources Information Center

    Rosario, Victoria C.

    2012-01-01

    This research study investigated the technological perceptions and expectations of community college students, faculty, administrators, and Information Technology (IT) staff. The theoretical framework is based upon two assumptions on the process of technological innovation: it can be explained by diffusion of adoption theory, and by studying the…

  9. Scaffolding Student Participation in Mathematical Practices

    ERIC Educational Resources Information Center

    Moschkovich, Judit N.

    2015-01-01

    The concept of scaffolding can be used to describe various types of adult guidance, in multiple settings, across different time scales. This article clarifies what we mean by scaffolding, considering several questions specifically for scaffolding in mathematics: What theoretical assumptions are framing scaffolding? What is being scaffolded? At…

  10. Selective Mutism: Phenomenological Characteristics.

    ERIC Educational Resources Information Center

    Ford, Mary Ann; Sladeczek, Ingrid E.; Carlson, John; Kratochwill, Thomas R.

    1998-01-01

    To explore factors related to selective mutism (SM), a survey of persons (N=153, including 135 children) with SM was undertaken. Three theoretical assumptions are supported: (1) variant talking behaviors prior to identification of SM; (2) link between SM and social anxiety; (3) potential link between temperament and SM. (EMK)

  11. The Newtonian Mechanistic Paradigm, Special Education, and Contours of Alternatives: An Overview.

    ERIC Educational Resources Information Center

    Heshusius, Lous

    1989-01-01

    The article examines theoretical reorientations in special education away from the Newtonian mechanistic paradigm toward an emerging holistic paradigm. Recent literature is critiqued for renaming theories as paradigms, thereby providing an illusion of change while leaving fundamental mechanistic assumptions in place. (Author/DB)

  12. Interactivism: Change, Sensory-Emotional Intelligence, and Intentionality in Being and Learning.

    ERIC Educational Resources Information Center

    Bichelmeyer, Barbara A.

    This paper documents the theoretical framework of interactivism; articulates the pedagogical theory which frames its assumptions regarding effective educational practice; positions the pedagogy of interactivism against traditional pedagogical practice; and argues for the educational importance of the interactivist view. Interactivism is the term…

  13. Cognitive Processes in Dissociation: An Analysis of Core Theoretical Assumptions

    ERIC Educational Resources Information Center

    Giesbrecht, Timo; Lilienfeld, Scott O.; Lynn, Steven Jay; Merckelbach, Harald

    2008-01-01

    Dissociation is typically defined as the lack of normal integration of thoughts, feelings, and experiences into consciousness and memory. The present article critically evaluates the research literature on cognitive processes in dissociation. The authors' review indicates that dissociation is characterized by subtle deficits in neuropsychological…

  14. Theoretical studies of solar lasers and converters

    NASA Technical Reports Server (NTRS)

    Heinbockel, John H.

    1988-01-01

    The previously constructed one-dimensional model for the simulated operation of an iodine laser assumed that the perfluoroalkyl iodide gas n-C3F7I was incompressible. The present study removes this simplifying assumption and considers n-C3F7I as a compressible fluid.

  15. [General aspects of planning and care in mental health].

    PubMed

    Saforcada, E

    1976-09-01

    This paper reviews some general concepts of planning, especially in the public and welfare sectors, stressing those concerning the major flaws in the Argentine mental health system. The author considers the definition of planning levels and sets forth three: general plan, program, and project. The corresponding implementation is also considered. The importance of feedback from adequate evaluation is stressed, emphasizing three aspects: (a) evaluation of the dynamics, rate, and extent of decrease, increase, or stagnation; (b) assessment of the efficacy of the factors involved; and (c) control and stabilization of goals already attained. The necessity of developing a human ecology encompassing socio-cultural and psycho-social factors is stressed, together with fostering theoretical research and the use of its results by implementation agents. Several differences among prevailing mental health actions are pointed out which allow a distinction between two typical models: the clinical and the sanitarist. The main differences between them lie in the standard location of working sites, the nature of basic actions, the field of action, and the working hypotheses, including etiological and ecological assumptions and the theoretical and methodological framework. A series of criteria for evaluating sanitary techniques and strategies is set forth, among them operative procedures, length of treatments, degree of therapeutic concentration, and general pragmatic criteria. The indicators reviewed are: degree of efficacy, coverage, degree of perseverance in treatment, cultural barriers between patient and therapist, delegation of functions to specialized first-line sanitary agents, and needs for the training of mental health workers. An attempt is made at developing general evaluation criteria for mental health planning, and several indicators are proposed, among which: (a) the cost/efficacy ratio, including in costs the use of economic, human, and physical resources; (b) the preventive capacities of the community; (c) the capacity of the community to generate new types of organization and social dynamics able to cope with increasing mental health demands; (d) the performance and training of first-line personnel and mental health agents; (e) assessment of "distances" between theoretical planning and actual implementation outlines; and (f) the time required for implementing programs.

  16. Statistical thermodynamics of protein folding: Comparison of a mean-field theory with Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Hao, Ming-Hong; Scheraga, Harold A.

    1995-01-01

    A comparative study of protein folding using an analytical theory and computer simulations is reported. The theory is based on an improved mean-field formalism which, in addition to the usual mean-field approximations, takes into account the distributions of energies in the subsets of conformational states. Sequence-specific properties of proteins are parametrized in the theory by two sets of variables, one for the energetics of mean-field interactions and one for the distribution of energies. Simulations are carried out on model polypeptides with different sequences, with different chain lengths, and with different interaction potentials, ranging from strong biases towards certain local chain states (bond angles and torsional angles) to complete absence of local conformational preferences. Theoretical analysis of the simulation results for the model polypeptides reveals three different types of behavior in the folding transition from the statistical coiled state to the compact globular state; these include a cooperative two-state transition, a continuous folding, and a glasslike transition. It is found that, with the fitted theoretical parameters, which are specific to each polypeptide under a given potential, the mean-field theory can describe the thermodynamic properties and folding behavior of the different polypeptides accurately. By comparing the theoretical descriptions with simulation results, we verify the basic assumptions of the theory and, thereby, obtain new insights into the folding transitions of proteins. It is found that the cooperativity of the first-order folding transition of the model polypeptides is determined mainly by long-range interactions, in particular the dipolar orientation; the local interactions (e.g., bond-angle and torsion-angle potentials) have only a marginal effect on the cooperative characteristic of the folding, but have a large impact on the difference in energy between the folded lowest-energy structure and the unfolded conformations of a protein.

  17. Quantitative force measurements using frequency modulation atomic force microscopy—theoretical foundations

    NASA Astrophysics Data System (ADS)

    Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.

    2005-03-01

    Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.
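    In the small-amplitude limit, the conversion the paper rigorously generalizes reduces to a simple relation between frequency shift and force gradient, df/f0 = -(1/2k) dF/dz, so the conservative force follows by integrating the shift. A minimal sketch under that limit (cantilever and tip-sample parameters are illustrative assumptions, not the paper's formulae):

      # Forward model: frequency shift from a toy van der Waals force; then
      # invert by integrating df/f0 back from far away toward the surface:
      #   F(z) = 2k * integral_z^zmax (df(z')/f0) dz'   (F assumed ~0 at zmax)
      import numpy as np

      k, f0 = 40.0, 300e3                   # stiffness [N/m], resonance [Hz]
      z = np.linspace(0.25e-9, 5e-9, 2000)  # tip-sample distance grid [m]

      H, R = 1e-19, 10e-9                   # Hamaker constant [J], tip radius [m]
      F_true = -H * R / (6 * z**2)          # attractive sphere-plane force [N]

      df = -f0 / (2 * k) * np.gradient(F_true, z)   # small-amplitude shift [Hz]

      F_rec = np.zeros_like(z)              # trapezoid integration, tail first
      for i in range(len(z) - 2, -1, -1):
          F_rec[i] = F_rec[i + 1] + (2 * k / f0) * 0.5 * (df[i] + df[i + 1]) * (z[i + 1] - z[i])

      err = np.max(np.abs(F_rec - F_true)) / np.max(np.abs(F_true))
      print(f"max reconstruction error: {100 * err:.2f}% of peak force")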

  18. Theoretical geology

    NASA Astrophysics Data System (ADS)

    Mikeš, Daniel

    2010-05-01

    Present-day geology is mostly empirical in nature. I claim that geology is by nature complex and that the empirical approach is bound to fail. Let us consider the input to be the set of ambient conditions and the output to be the sedimentary rock record. The output can only be deduced from the input if the relation from input to output is known. The fundamental question is therefore the following: can one predict the output from the input, that is, can one predict the behaviour of a sedimentary system? If one can, the empirical/deductive method has a chance; if one cannot, that method is bound to fail. The fundamental problem to solve is therefore: how can one predict the behaviour of a sedimentary system? It is interesting to observe that this question is never asked, and many a study is conducted by the empirical/deductive method; it seems that the empirical method has been accepted as appropriate without question. It is, however, easy to argue that a sedimentary system is by nature complex, that several input parameters vary at the same time, and that they can create similar output in the rock record. It follows trivially from these first principles that in such a case the deductive solution cannot be unique. At the same time, several geological methods depart precisely from the assumption that one particular variable is the dictator/driver and that the others are constant, even though the data do not support such an assumption. The method of "sequence stratigraphy" is a typical example of such a dogma. It can easily be argued that all interpretation resulting from a method built on uncertain or wrong assumptions is erroneous. Still, this method has survived for many years, notwithstanding all the criticism it has received. This is just one example from the present-day geological world, and it is not unique; the only reason one can point to is the so-called "rationality" of today's society. Even the alternative methods criticising sequence stratigraphy actually depart from the same erroneous assumptions and do not solve the fundamental issue that lies at the base of the problem. This problem is straightforward and obvious: a sedimentary system is inherently four-dimensional (three spatial dimensions plus one temporal dimension). Any method using a smaller number of dimensions is bound to fail to describe the evolution of a sedimentary system. It is indicative of the present-day geological world that such fundamental issues are overlooked. Simple common sense leads to the conclusion that in this case the empirical method is bound to fail and that the only method that can solve the problem is the theoretical approach; reasoning that is completely trivial for the traditional exact sciences like physics and mathematics and for applied sciences like engineering, but not for geology, a science that was traditionally descriptive and jumped to empirical science, skipping the stage of theoretical science. I argue that this gap of theoretical geology is left open and needs to be filled. Every discipline in geology lacks a theoretical base. This base can only be built by the theoretical/inductive approach and cannot be built by the empirical/deductive approach. Once a critical mass of geologists realises this flaw in today's geology, we can start solving the fundamental problems in geology.

  19. Backward Dependencies and in-Situ wh-Questions as Test Cases on How to Approach Experimental Linguistics Research That Pursues Theoretical Linguistics Questions

    PubMed Central

    Pablos, Leticia; Doetjes, Jenny; Cheng, Lisa L.-S.

    2018-01-01

    The empirical study of language is a young field in contemporary linguistics. This being the case, and following a natural development process, the field is currently at a stage where different research methods and experimental approaches are being put into question in terms of their validity. Without pretending to provide an answer with respect to the best way to conduct linguistics related experimental research, in this article we aim at examining the process that researchers follow in the design and implementation of experimental linguistics research with a goal to validate specific theoretical linguistic analyses. First, we discuss the general challenges that experimental work faces in finding a compromise between addressing theoretically relevant questions and being able to implement these questions in a specific controlled experimental paradigm. We discuss the Granularity Mismatch Problem (Poeppel and Embick, 2005) which addresses the challenges that research that is trying to bridge the representations and computations of language and their psycholinguistic/neurolinguistic evidence faces, and the basic assumptions that interdisciplinary research needs to consider due to the different conceptual granularity of the objects under study. To illustrate the practical implications of the points addressed, we compare two approaches to perform linguistic experimental research by reviewing a number of our own studies strongly grounded on theoretically informed questions. First, we show how linguistic phenomena similar at a conceptual level can be tested within the same language using measurement of event-related potentials (ERP) by discussing results from two ERP experiments on the processing of long-distance backward dependencies that involve coreference and negative polarity items respectively in Dutch. Second, we examine how the same linguistic phenomenon can be tested in different languages using reading time measures by discussing the outcome of four self-paced reading experiments on the processing of in-situ wh-questions in Mandarin Chinese and French. Finally, we review the implications that our findings have for the specific theoretical linguistics questions that we originally aimed to address. We conclude with an overview of the general insights that can be gained from the role of structural hierarchy and grammatical constraints in processing and the existing limitations on the generalization of results. PMID:29375417

  20. Backward Dependencies and in-Situ wh-Questions as Test Cases on How to Approach Experimental Linguistics Research That Pursues Theoretical Linguistics Questions.

    PubMed

    Pablos, Leticia; Doetjes, Jenny; Cheng, Lisa L-S

    2017-01-01

    The empirical study of language is a young field in contemporary linguistics. This being the case, and following a natural development process, the field is currently at a stage where different research methods and experimental approaches are being put into question in terms of their validity. Without pretending to provide an answer with respect to the best way to conduct linguistics related experimental research, in this article we aim at examining the process that researchers follow in the design and implementation of experimental linguistics research with a goal to validate specific theoretical linguistic analyses. First, we discuss the general challenges that experimental work faces in finding a compromise between addressing theoretically relevant questions and being able to implement these questions in a specific controlled experimental paradigm. We discuss the Granularity Mismatch Problem (Poeppel and Embick, 2005) which addresses the challenges that research that is trying to bridge the representations and computations of language and their psycholinguistic/neurolinguistic evidence faces, and the basic assumptions that interdisciplinary research needs to consider due to the different conceptual granularity of the objects under study. To illustrate the practical implications of the points addressed, we compare two approaches to perform linguistic experimental research by reviewing a number of our own studies strongly grounded on theoretically informed questions. First, we show how linguistic phenomena similar at a conceptual level can be tested within the same language using measurement of event-related potentials (ERP) by discussing results from two ERP experiments on the processing of long-distance backward dependencies that involve coreference and negative polarity items respectively in Dutch. Second, we examine how the same linguistic phenomenon can be tested in different languages using reading time measures by discussing the outcome of four self-paced reading experiments on the processing of in-situ wh-questions in Mandarin Chinese and French. Finally, we review the implications that our findings have for the specific theoretical linguistics questions that we originally aimed to address. We conclude with an overview of the general insights that can be gained from the role of structural hierarchy and grammatical constraints in processing and the existing limitations on the generalization of results.

  1. A new theoretical approach to terrestrial ecosystem science based on multiscale observations and eco-evolutionary optimality principles

    NASA Astrophysics Data System (ADS)

    Prentice, Iain Colin; Wang, Han; Cornwell, William; Davis, Tyler; Dong, Ning; Evans, Bradley; Keenan, Trevor; Peng, Changhui; Stocker, Benjamin; Togashi, Henrique; Wright, Ian

    2016-04-01

    Ecosystem science focuses on biophysical interactions of organisms and their abiotic environment, and comprises vital aspects of Earth system function such as the controls of carbon, water and energy exchanges between ecosystems and the atmosphere. Global numerical models of these processes have proliferated, and have been incorporated as standard components of Earth system models whose ambitious goal is to predict the coupled behaviour of the oceans, atmosphere and land on time scales from minutes to millennia. Unfortunately, however, the performance of most current terrestrial ecosystem models is highly unsatisfactory. Models typically fail the most basic observational benchmarks, and diverge greatly from one another when called upon to predict the response of ecosystem function and composition to environmental changes beyond the narrow range for which they were developed. This situation seems to have arisen for two inter-related reasons. First, general principles underlying many basic terrestrial biogeochemical processes have been neither clearly formulated nor adequately tested. Second, extensive observational data sets that could be used to test process formulations have become available only quite recently, long postdating the emergence of the current modelling paradigm. But the situation has changed now and ecosystem science needs to change too, to reflect both recent theoretical advances and the vast increase in the availability of relevant data sets at scales from the leaf to the globe. This presentation will outline an emerging mathematical theory that links biophysical plant and ecosystem processes through testable hypotheses derived from the principle of optimization by natural selection. The development and testing of this theory has depended on the availability of extensive data sets on climate, leaf traits (including δ13C measurements), and ecosystem properties including green vegetation cover and land-atmosphere CO2 fluxes. Achievements to date include unified explanations for observed climate and elevation effects on leaf CO2 drawdown (ci:ca ratio) and photosynthetic capacity (Vcmax), growth temperature effects on the Jmax:Vcmax ratio, the adaptive nature of acclimation to enhanced CO2 concentration, the controls of leaf versus sapwood respiration, the controls of leaf N content (Narea), the relative constancy of the light use efficiency of gross primary production, and the relative conservatism of leaf dark respiration with climate. These findings call into question many assumptions in supposed "state-of-the-art" terrestrial ecosystem models, and provide a foundation for next-generation global ecosystem models that will rest on a greatly strengthened theoretical and empirical basis.

  2. R0 for vector-borne diseases: impact of the assumption for the duration of the extrinsic incubation period.

    PubMed

    Hartemink, Nienke; Cianci, Daniela; Reiter, Paul

    2015-03-01

    Mathematical modeling and notably the basic reproduction number R0 have become popular tools for the description of vector-borne disease dynamics. We compare two widely used methods to calculate the probability of a vector to survive the extrinsic incubation period. The two methods are based on different assumptions for the duration of the extrinsic incubation period; one method assumes a fixed period and the other method assumes a fixed daily rate of becoming infectious. We conclude that the outcomes differ substantially between the methods when the average life span of the vector is short compared to the extrinsic incubation period.
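    The two assumptions translate into two simple survival probabilities for a vector with an exponentially distributed lifetime: a fixed EIP of length T gives exp(-mu*T), while a fixed daily rate gamma = 1/T of becoming infectious gives gamma/(gamma + mu). A minimal sketch with illustrative parameter values:

      import numpy as np

      T = 10.0                                  # extrinsic incubation period [days]
      for lifespan in (2.0, 10.0, 50.0):        # average vector life span [days]
          mu = 1.0 / lifespan                   # daily mortality rate
          p_fixed = np.exp(-mu * T)             # survive a fixed period T
          p_rate = (1.0 / T) / (1.0 / T + mu)   # infectiousness occurs before death
          print(f"lifespan {lifespan:4.0f} d: fixed {p_fixed:.3f}, rate-based {p_rate:.3f}")

    As the paper concludes, the two methods agree for long-lived vectors but diverge sharply when the life span is short relative to the EIP (0.007 versus 0.167 for the 2-day life span here).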

  3. Likelihood ratio decisions in memory: three implied regularities.

    PubMed

    Glanzer, Murray; Hilford, Andrew; Maloney, Laurence T

    2009-06-01

    We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so. We then show that the regularities appear in data from a range of recognition studies. The analyses and data in our study support the following generalization: Individuals make efficient recognition decisions on the basis of likelihood ratios.
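    A minimal sketch of the likelihood-ratio decision axis in the simplest equal-variance Gaussian case, where log LR(x) = d'x - d'^2/2, so responding "old" when LR > 1 corresponds to x > d'/2; strengthening memory then raises hits and lowers false alarms symmetrically, one form of the mirror effect (the d' values are illustrative assumptions):

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200_000
      for d in (0.5, 1.0, 2.0):                 # memory strength d'
          old = rng.normal(d, 1.0, n)           # familiarity of studied items
          new = rng.normal(0.0, 1.0, n)         # familiarity of lures
          crit = d / 2                          # LR > 1  <=>  x > d/2
          print(f"d'={d:.1f}: hit rate {np.mean(old > crit):.3f}, "
                f"false-alarm rate {np.mean(new > crit):.3f}")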

  4. Shielding of substations against direct lightning strokes by shield wires

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chowdhuri, P.

    1994-01-01

    A new analysis for shielding outdoor substations against direct lightning strokes by shield wires is proposed. The basic assumption of the proposed method is that any lightning stroke which penetrates the shields will cause damage. The second assumption is that a certain level of risk of failure must be accepted, such as one or two failures per 100 years. The proposed method, using an electrogeometric model, was applied to design shield wires for two outdoor substations: (1) a 161-kV/69-kV station, and (2) a 500-kV/161-kV station. The results of the proposed method were also compared with the shielding data of two other substations.
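    A minimal sketch of the striking-distance relation at the heart of electrogeometric models: the leader's final jump reaches whatever lies within a distance that grows with stroke current, often parameterized as r = A*I^b (widely quoted coefficients are roughly A = 10 and b = 0.65, with r in metres and I in kA); shield wires are then positioned so that strokes above the accepted-risk current cannot reach the equipment. The constants below are illustrative assumptions, not the paper's design values:

      def striking_distance(i_ka, a=10.0, b=0.65):
          """Striking distance [m] for a stroke of peak current i_ka [kA]."""
          return a * i_ka**b

      for i in (5.0, 10.0, 20.0, 50.0):
          print(f"I = {i:4.0f} kA -> striking distance ~ {striking_distance(i):5.1f} m")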

  5. The impact of cloud vertical profile on liquid water path retrieval based on the bispectral method: A theoretical study based on large-eddy simulations of shallow marine boundary layer clouds.

    PubMed

    Miller, Daniel J; Zhang, Zhibo; Ackerman, Andrew S; Platnick, Steven; Baum, Bryan A

    2016-04-27

    Passive optical retrievals of cloud liquid water path (LWP), like those implemented for Moderate Resolution Imaging Spectroradiometer (MODIS), rely on cloud vertical profile assumptions to relate optical thickness (τ) and effective radius (re) retrievals to LWP. These techniques typically assume that shallow clouds are vertically homogeneous; however, an adiabatic cloud model is plausibly more realistic for shallow marine boundary layer cloud regimes. In this study a satellite retrieval simulator is used to perform MODIS-like satellite retrievals, which in turn are compared directly to the large-eddy simulation (LES) output. This satellite simulator creates a framework for rigorous quantification of the impact that vertical profile features have on LWP retrievals, and it accomplishes this while also avoiding sources of bias present in previous observational studies. The cloud vertical profiles from the LES are often more complex than either of the two standard assumptions, and the favored assumption was found to be sensitive to cloud regime (cumuliform/stratiform). Confirming previous studies, drizzle and cloud top entrainment of dry air are identified as physical features that bias LWP retrievals away from adiabatic and toward homogeneous assumptions. The mean bias induced by drizzle-influenced profiles was shown to be on the order of 5-10 g/m2. In contrast, the influence of cloud top entrainment was found to be smaller by about a factor of 2. A theoretical framework is developed to explain variability in LWP retrievals by introducing modifications to the adiabatic re profile. In addition to analyzing bispectral retrievals, we also compare results with the vertical profile sensitivity of passive polarimetric retrieval techniques.
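    The leverage of the profile assumption is easy to quantify: for the same retrieved tau and re, the standard conversions are LWP = (2/3)*rho_w*tau*re for a vertically homogeneous cloud and (5/9)*rho_w*tau*re for an adiabatic one, a constant factor of 5/6. A minimal sketch with illustrative MODIS-like retrieval values (the factors are the standard textbook conversions, not taken from this paper):

      rho_w = 1000.0                               # liquid water density [kg/m^3]

      def lwp_homogeneous(tau, re_m):
          return (2.0 / 3.0) * rho_w * tau * re_m  # [kg/m^2]

      def lwp_adiabatic(tau, re_m):
          return (5.0 / 9.0) * rho_w * tau * re_m  # [kg/m^2]

      tau, re = 10.0, 12e-6                        # assumed retrievals: tau=10, re=12 um
      print(f"homogeneous: {1e3 * lwp_homogeneous(tau, re):.1f} g/m^2")
      print(f"adiabatic:   {1e3 * lwp_adiabatic(tau, re):.1f} g/m^2")

    For these values the two assumptions differ by roughly 13 g/m^2, comparable to the 5-10 g/m^2 drizzle-induced biases quantified in the study.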

  6. The impact of cloud vertical profile on liquid water path retrieval based on the bispectral method: A theoretical study based on large-eddy simulations of shallow marine boundary layer clouds

    PubMed Central

    Miller, Daniel J.; Zhang, Zhibo; Ackerman, Andrew S.; Platnick, Steven; Baum, Bryan A.

    2018-01-01

    Passive optical retrievals of cloud liquid water path (LWP), like those implemented for Moderate Resolution Imaging Spectroradiometer (MODIS), rely on cloud vertical profile assumptions to relate optical thickness (τ) and effective radius (re) retrievals to LWP. These techniques typically assume that shallow clouds are vertically homogeneous; however, an adiabatic cloud model is plausibly more realistic for shallow marine boundary layer cloud regimes. In this study a satellite retrieval simulator is used to perform MODIS-like satellite retrievals, which in turn are compared directly to the large-eddy simulation (LES) output. This satellite simulator creates a framework for rigorous quantification of the impact that vertical profile features have on LWP retrievals, and it accomplishes this while also avoiding sources of bias present in previous observational studies. The cloud vertical profiles from the LES are often more complex than either of the two standard assumptions, and the favored assumption was found to be sensitive to cloud regime (cumuliform/stratiform). Confirming previous studies, drizzle and cloud top entrainment of dry air are identified as physical features that bias LWP retrievals away from adiabatic and toward homogeneous assumptions. The mean bias induced by drizzle-influenced profiles was shown to be on the order of 5–10 g/m2. In contrast, the influence of cloud top entrainment was found to be smaller by about a factor of 2. A theoretical framework is developed to explain variability in LWP retrievals by introducing modifications to the adiabatic re profile. In addition to analyzing bispectral retrievals, we also compare results with the vertical profile sensitivity of passive polarimetric retrieval techniques. PMID:29637042

  7. Evaluation of a distributed catchment scale water balance model

    NASA Technical Reports Server (NTRS)

    Troch, Peter A.; Mancini, Marco; Paniconi, Claudio; Wood, Eric F.

    1993-01-01

    The validity of some of the simplifying assumptions in a conceptual water balance model is investigated by comparing simulation results from the conceptual model with simulation results from a three-dimensional physically based numerical model and with field observations. We examine, in particular, assumptions and simplifications related to water table dynamics, vertical soil moisture and pressure head distributions, and subsurface flow contributions to stream discharge. The conceptual model relies on a topographic index to predict saturation excess runoff and on Philip's infiltration equation to predict infiltration excess runoff. The numerical model solves the three-dimensional Richards equation describing flow in variably saturated porous media, and handles seepage face boundaries, infiltration excess and saturation excess runoff production, and soil-driven and atmosphere-driven surface fluxes. The study catchments (a 7.2 sq km catchment and a 0.64 sq km subcatchment) are located in the North Appalachian ridge and valley region of eastern Pennsylvania. Hydrologic data collected during the MACHYDRO 90 field experiment are used to calibrate the models and to evaluate simulation results. It is found that water table dynamics as predicted by the conceptual model are close to the observations in a shallow water well; a linear relationship between a topographic index and the local water table depth is therefore a reasonable assumption for catchment scale modeling. However, the hydraulic equilibrium assumption is not valid for the upper 100 cm layer of the unsaturated zone, and a conceptual model that incorporates a root zone is suggested. Furthermore, theoretical subsurface flow characteristics from the conceptual model are found to be different from field observations, numerical simulation results, and theoretical baseflow recession characteristics based on Boussinesq's groundwater equation.
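    The topographic-index relation being tested can be sketched in a few lines: TOPMODEL-style conceptual models assume the local water table depth varies linearly with TI = ln(a/tan(beta)), where a is the upslope contributing area per unit contour length and beta the local slope. All numbers below are illustrative assumptions:

      import numpy as np

      a = np.array([50.0, 200.0, 1000.0, 5000.0])    # upslope area per contour [m]
      tan_beta = np.array([0.20, 0.10, 0.05, 0.02])  # local slope
      ti = np.log(a / tan_beta)                      # topographic index

      z_mean, f = 1.5, 2.0                           # mean depth [m], scaling [1/m]
      depth = z_mean - (ti - ti.mean()) / f          # linear TI-depth relation
      for t, d in zip(ti, depth):
          note = "  <- saturated (saturation-excess runoff)" if d <= 0 else ""
          print(f"TI = {t:5.2f} -> water table depth {d:5.2f} m{note}")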

  8. Allgemeine Sprachfaehigkeit und Fremdsprachenerwerb. Zur Struktur von Leistungsdimensionen und linguistischer Kompetenz des Fremdsprachenlerners (General Language Ability and Foreign Language Acquisition. On the Structure of Performance Dimensions and the Linguistic Competence of the Foreign Language Learner). Diskussions beitraege aus dem Institute fuer Bildungsforschung, No. 1.

    ERIC Educational Resources Information Center

    Sang, Fritz; Vollmer, Helmut J.

    This study investigates the theoretical plausibility and empirical validity of the assumption that all performance in a foreign language can be traced back to a single factor, the general language ability factor. The theoretical background of this hypothesis is reviewed in detail. The concept of a unitary linguistic competence, interpreted as an…

  9. Theoretical Implications of Disordered Syntactic Comprehension.

    ERIC Educational Resources Information Center

    Rindflesch, Thomas; Reeves, Jennifer E.

    1992-01-01

    Reexamines data from Caplan and Hildebrandt (1988) with a new set of background assumptions and concludes a Government-Binding-based account is not supported. Instead, deficits observed in the process of infinitival complement constructions are attributed to patient inability to fully access the data structure required to support a proposed…

  10. The Theoretical Distribution of Evoked Brainstem Activity in Preterm, High-Risk, and Healthy Infants.

    ERIC Educational Resources Information Center

    Salamy, A.

    1981-01-01

    Determines the frequency distribution of Brainstem Auditory Evoked Potential variables (BAEP) for premature babies at different stages of development--normal newborns, infants, young children, and adults. The author concludes that the assumption of normality underlying most "standard" statistical analyses can be met for many BAEP…

  11. Cultivating Teachers' Morality and the Pedagogy of Emotional Rationality

    ERIC Educational Resources Information Center

    Kim, Minkang

    2013-01-01

    Teachers are expected to act ethically and provide moral role models in performing their duties, even though teacher education has often relegated the cultivation of teachers' ethical awareness and moral development to the margins. When it is addressed, the main theoretical assumptions have relied heavily on the cognitivist developmental theories…

  12. Pedagogies of Indignation and "The Lives of Others"

    ERIC Educational Resources Information Center

    Suissa, Judith

    2017-01-01

    Neel Mukherjee's novel, "The Lives of Others", which depicts characters dealing with a situation of extreme and violent oppression, is used as the basis for looking more closely at some of the theoretical assumptions about hope, agency and critical consciousness that underpin Critical Pedagogy. It is suggested that it may be…

  13. The Syntax and Pragmatics of Fronting in Germanic

    ERIC Educational Resources Information Center

    Light, Caitlin

    2012-01-01

    Across the Germanic language family, we find a type of movement traditionally termed "topicalization," which may be realized in Germanic languages which possess the so-called Verb-Second (V2) constraint, as well as those without it. I will henceforward call this phenomenon "fronting" to avoid theoretical assumptions. This…

  14. Modification of the DSN radio frequency angular tropospheric refraction model

    NASA Technical Reports Server (NTRS)

    Berman, A. L.

    1977-01-01

    The previously derived DSN Radio Frequency Angular Tropospheric Refraction Model contained an assumption which was subsequently seen to be at a variance with the theoretical basis of angular refraction. The modification necessary to correct the model is minor in that the value of a constant is changed.

  15. Stochastic game theory: for playing games, not just for doing theory.

    PubMed

    Goeree, J K; Holt, C A

    1999-09-14

    Recent theoretical advances have dramatically increased the relevance of game theory for predicting human behavior in interactive situations. By relaxing the classical assumptions of perfect rationality and perfect foresight, we obtain much improved explanations of initial decisions, dynamic patterns of learning and adjustment, and equilibrium steady-state distributions.
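    A concrete example of such a relaxation is the logit quantal response equilibrium, where each player's choice probabilities follow a softmax of expected payoffs with rationality parameter lambda instead of an exact best response. A minimal sketch for an asymmetric matching-pennies game (the payoff matrices and lambda are illustrative assumptions):

      import numpy as np

      A = np.array([[9.0, 0.0], [0.0, 1.0]])   # row player's payoffs
      B = np.array([[0.0, 1.0], [1.0, 0.0]])   # column player's payoffs

      def logit(u, lam):
          e = np.exp(lam * (u - u.max()))      # stabilized softmax
          return e / e.sum()

      lam = 2.0
      p = np.array([0.5, 0.5])                 # row mixed strategy
      q = np.array([0.5, 0.5])                 # column mixed strategy
      for _ in range(500):                     # damped fixed-point iteration
          p = 0.5 * p + 0.5 * logit(A @ q, lam)
          q = 0.5 * q + 0.5 * logit(B.T @ p, lam)

      print("logit QRE:", p.round(3), q.round(3))

    Nash equilibrium predicts the row player mixes 50/50 here regardless of the large payoff of 9; the noisy-response equilibrium instead shifts play toward the high-payoff action, in line with the own-payoff effects observed in experiments.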

  16. The Metrical Foot in Diyari.

    ERIC Educational Resources Information Center

    Poser, William

    1989-01-01

    Considers the metrical foot in Diyari, a South Australian language, and concludes that, on the basis of stress alone, an argument can be made for the constituency of the metrical stress foot under certain theoretical assumptions. This conclusion is reinforced by the occurrence in Diyari of other, less theory-dependent phenomena. (46 references) (JL)

  17. The Worldview Dimensions of Individualism and Collectivism: Implications for Counseling.

    ERIC Educational Resources Information Center

    Williams, Bryant

    2003-01-01

    A recent article, "Rethinking Individualism and Collectivism: Evaluation of Theoretical Assumptions and Meta-Analyses" (D. Oyserman, H. M. Coon, & M. Kemmelmeier, 2002), revealed that 170 studies have been conducted on the worldview dimensions of individualism and collectivism. This article reviews the results of the authors'…

  18. Cognitive Processes in Dissociation: Comment on Giesbrecht et al. (2008)

    ERIC Educational Resources Information Center

    Bremner, J. Douglas

    2010-01-01

    In their recent review "Cognitive Processes in Dissociation: An Analysis of Core Theoretical Assumptions," published in "Psychological Bulletin", Giesbrecht, Lynn, Lilienfeld, and Merckelbach (2008) have challenged the widely accepted trauma theory of dissociation, which holds that dissociative symptoms are caused by traumatic stress. In doing so,…

  19. Parent-Child Interaction: Research and Its Practical Implications.

    ERIC Educational Resources Information Center

    Smart, Margaret E.; Minet, Selma B.

    This report, prepared as part of the Project in Television and Early Childhood Education at the University of Southern California, contains a review of landmark and current literature on parent-child interaction (PCI). Major theoretical assumptions, research procedures and findings are analyzed in order to develop a model of parent-child…

  20. Mathematical Formulation of Multivariate Euclidean Models for Discrimination Methods.

    ERIC Educational Resources Information Center

    Mullen, Kenneth; Ennis, Daniel M.

    1987-01-01

    Multivariate models for the triangular and duo-trio methods are described, and theoretical methods are compared to a Monte Carlo simulation. Implications are discussed for a new theory of multidimensional scaling which challenges the traditional assumption that proximity measures and perceptual distances are monotonically related. (Author/GDC)
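    The comparison can be reproduced in miniature for the triangular method: percepts are multivariate Gaussian, two samples come from product A and one from product B, and the respondent calls "odd" the sample farthest from the other two (equivalently, the closest pair is judged the same). A minimal Monte Carlo sketch with illustrative dimensionality and distances:

      import numpy as np

      rng = np.random.default_rng(1)

      def p_correct_triangle(delta, n_trials=100_000, dim=2):
          mu_b = np.zeros(dim); mu_b[0] = delta      # product B shifted by delta
          x1 = rng.normal(np.zeros(dim), 1.0, (n_trials, dim))
          x2 = rng.normal(np.zeros(dim), 1.0, (n_trials, dim))
          x3 = rng.normal(mu_b, 1.0, (n_trials, dim))
          d12 = np.linalg.norm(x1 - x2, axis=1)      # the A-A pair
          d13 = np.linalg.norm(x1 - x3, axis=1)
          d23 = np.linalg.norm(x2 - x3, axis=1)
          # correct iff the A-A pair is the closest of the three pairs
          return np.mean((d12 < d13) & (d12 < d23))

      for delta in (0.0, 1.0, 2.0):
          print(f"delta = {delta:.1f}: P(correct) ~ {p_correct_triangle(delta):.3f}")

    At delta = 0 the estimate recovers the 1/3 guessing rate, a built-in check of the simulation.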

  1. Parent-Child Relationships of Boys in Different Offending Trajectories: A Developmental Perspective

    ERIC Educational Resources Information Center

    Keijsers, Loes; Loeber, Rolf; Branje, Susan; Meeus, Wim

    2012-01-01

    Background: This study tested the theoretical assumption that transformations of parent-child relationships in late childhood and adolescence would differ for boys following different offending trajectories. Methods: Using longitudinal multiinformant data of 503 boys (ages 7-19), we conducted Growth Mixture Modeling to extract offending…

  2. Toward an Instructionally Oriented Theory of Example-Based Learning

    ERIC Educational Resources Information Center

    Renkl, Alexander

    2014-01-01

    Learning from examples is a very effective means of initial cognitive skill acquisition. There is an enormous body of research on the specifics of this learning method. This article presents an instructionally oriented theory of example-based learning that integrates theoretical assumptions and findings from three research areas: learning from…

  3. Issues in the Intellectual Assessment of Hearing Impaired Children

    ERIC Educational Resources Information Center

    Hughes, Deana; Sapp, Gary L.; Kohler, Maxie P.

    2006-01-01

    The assessment of hearing impaired children is fraught with a number of problems. These include lack of valid assessment measures, faulty theoretical assumptions, lack of knowledge regarding the functioning of cognitive processes of these children, and biases against these children. This article briefly considers these issues and describes a study…

  4. Latter-Day Saint Women and Leadership: The Influence of Their Religious Worldview

    ERIC Educational Resources Information Center

    Madsen, Susan R.

    2016-01-01

    The article examines theories, assumptions, concepts, experiences, and practices from the Latter-day Saints' (LDS, or the Mormons) religious worldview to expand existing theoretical constructs and implications of leadership development and education for women. The article elucidates LDS doctrine and culture regarding women and provides specific…

  5. Psychologic-Pedagogical Conditions for Prevention of Suicidal Tendencies among Teenagers

    ERIC Educational Resources Information Center

    Abil, Yerkin A.; Kim, Natalia P.; Baymuhambetova, Botagoz Sh.; Mamiyev, Nurlan B.; Li, Yelena D.; Shumeyko, Tatyana S.

    2016-01-01

    The aim of the research is to develop a complex of psychological-pedagogical conditions directed at the prevention of suicidal tendencies among teenagers. On the basis of an analysis of the scientific literature, the authors disclose the main causes of suicidal behavior in adolescence. To confirm the scientific veracity of the advanced theoretical assumptions, the paper describes an experiment conducted on the basis…

  6. A Theoretical Examination of Psychosocial Issues for Asian Pacific American Students.

    ERIC Educational Resources Information Center

    Kodama, Corinne Maekawa; McEwen, Marylu K.; Liang, Christopher T. H.; Lee, Sunny

    2001-01-01

    Examines psychosocial issues for Asian Pacific American (APA) students, one of the fastest growing but most understudied college populations. Finds that general groupings of developmental issues align somewhat with traditional psychosocial theory, although the underlying assumptions and specific developmental tasks do not fit the experience of…

  7. Should Debbie Do Shale? A Playful Polemic in Honor of Paul Feyerabend.

    ERIC Educational Resources Information Center

    Steedman, P. H.

    1982-01-01

    Examines the epistemological assumptions underlying the teaching of high school science. The author recommends a science course at the high school level in which science is presented, within a historical context, as an essentially theoretical activity which reflects a culture's political, religious, philosophical, aesthetic, and ideological…

  8. The Critical Purchase of Genealogy: Critiquing Student Participation Projects

    ERIC Educational Resources Information Center

    Anderson, Anna

    2015-01-01

    Until recently the dominant critique of "student participation" projects was one based on the theoretical assumptions of critical theory in the form of critical pedagogy. Over the last decade, we have witnessed the emergence of a critical education discourse that theorises and critically analyses such projects using Foucault's notion of…

  9. Generating Synergy between Conceptual Change and Knowledge Building

    ERIC Educational Resources Information Center

    Lee, Chwee Beng

    2010-01-01

    This paper is an initial effort to review the reciprocity between the theoretical traditions of "conceptual change" and "knowledge building" by discussing the underlying epistemological assumptions, objectives, conceptions of concepts and ideas, and mechanisms that bring forth the respective goals of these two traditions. The basis for generating…

  10. Normalizing Catastrophe: An Educational Response

    ERIC Educational Resources Information Center

    Jickling, Bob

    2013-01-01

    Processes of normalizing assumptions and values have been the subjects of theoretical framing and critique for several decades now. Critique has often been tied to issues of environmental sustainability and social justice. Now, in an era of global warming, there is a rising concern that the results of normalizing of present values could be…

  11. A Unified Framework for Monetary Theory and Policy Analysis.

    ERIC Educational Resources Information Center

    Lagos, Ricardo; Wright, Randall

    2005-01-01

    Search-theoretic models of monetary exchange are based on explicit descriptions of the frictions that make money essential. However, tractable versions of these models typically make strong assumptions that render them ill suited for monetary policy analysis. We propose a new framework, based on explicit micro foundations, within which macro…

  12. Ecosystemic Complexity Theory of Conflict: Understanding the Fog of Conflict

    ERIC Educational Resources Information Center

    Brack, Greg; Lassiter, Pamela S.; Hill, Michele B.; Moore, Sarah A.

    2011-01-01

    Counselors often engage in conflict mediation in professional practice. A model for understanding the complex and subtle nature of conflict resolution is presented. The ecosystemic complexity theory of conflict is offered to assist practitioners in navigating the fog of conflict. Theoretical assumptions are discussed with implications for clinical…

  13. Diversity in Literary Response: Revisiting Gender Expectations

    ERIC Educational Resources Information Center

    Brendler, Beth M.

    2014-01-01

    Drawing on and reexamining theories on gender and literacy, derived from research performed between 1974 and 2002, this qualitative study explored the gender assumptions and expectations of Language Arts teachers in a graduate level adolescent literature course at a university in the Midwestern United States. The theoretical framework was…

  14. Practitioner Review: Approaches to Assessment and Treatment of Children with DCD--An Evaluative Review

    ERIC Educational Resources Information Center

    Wilson, Peter H.

    2005-01-01

    Background: Movement clumsiness (or Developmental Coordination Disorder--DCD) has gained increasing recognition as a significant condition of childhood. However, some uncertainty still exists about diagnosis. Accordingly, approaches to assessment and treatment are varied, each drawing on distinct theoretical assumptions about the aetiology of the…

  15. Examining Transfer Effects from Dialogic Discussions to New Tasks and Contexts

    ERIC Educational Resources Information Center

    Reznitskaya, Alina; Glina, Monica; Carolan, Brian; Michaud, Olivier; Rogers, Jon; Sequeira, Lavina

    2012-01-01

    This study investigated whether students who engage in inquiry dialogue with others improve their performance on various tasks measuring argumentation development. The study used an educational environment called Philosophy for Children (P4C) to examine specific theoretical assumptions regarding the role dialogic interaction plays in the…

  16. Toward a Social Approach to Learning in Community Service Learning

    ERIC Educational Resources Information Center

    Cooks, Leda; Scharrer, Erica; Paredes, Mari Castaneda

    2004-01-01

    The authors describe a social approach to learning in community service learning that extends the contributions of three theoretical bodies of scholarship on learning: social constructionism, critical pedagogy, and community service learning. Building on the assumptions about learning described in each of these areas, engagement, identity, and…

  17. A Competency Approach to Developing Leaders--Is This Approach Effective?

    ERIC Educational Resources Information Center

    Richards, Patricia

    2008-01-01

    This paper examines the underlying assumptions that competency-based frameworks are based upon in relation to leadership development. It examines the impetus for this framework becoming the prevailing theoretical base for developing leaders and tracks the historical path to this phenomenon. Research suggests that a competency-based framework may…

  18. High-mass stars in Milky Way clusters

    NASA Astrophysics Data System (ADS)

    Negueruela, Ignacio

    2017-11-01

    Young open clusters are our laboratories for studying high-mass star formation and evolution. Unfortunately, the information that they provide is difficult to interpret, and sometimes contradictory. In this contribution, I present a few examples of the uncertainties that we face when confronting observations with theoretical models and our own assumptions.

  19. Play-Based Art Activities in Early Years: Teachers' Thinking and Practice

    ERIC Educational Resources Information Center

    Savva, Andri; Erakleous, Valentina

    2018-01-01

    The present study reports findings on pre-service teachers' thinking during planning and implementing play-based art activities. "Thinking" (in the present study) is informed by discourses emphasising art teaching and learning in relation to play and theoretical assumptions conceptualising planning as "practice of knowing."…

  20. Partial Least Squares Structural Equation Modeling with R

    ERIC Educational Resources Information Center

    Ravand, Hamdollah; Baghaei, Purya

    2016-01-01

    Structural equation modeling (SEM) has become widespread in educational and psychological research. Its flexibility in addressing complex theoretical models and the proper treatment of measurement error has made it the model of choice for many researchers in the social sciences. Nevertheless, the model imposes some daunting assumptions and…

  1. Learning from Programmed Instruction: Examining Implications for Modern Instructional Technology

    ERIC Educational Resources Information Center

    McDonald, Jason K.; Yanchar, Stephen C.; Osguthorpe, Russell T.

    2005-01-01

    This article reports a theoretical examination of several parallels between contemporary instructional technology (as manifested in one of its most current forms, online learning) and one of its direct predecessors, programmed instruction. We place particular focus on the underlying assumptions of the two movements. Our analysis suggests…

  2. E-Portfolio Evaluation and Vocabulary Learning: Moving from Pedagogy to Andragogy

    ERIC Educational Resources Information Center

    Sharifi, Maryam; Soleimani, Hassan; Jafarigohar, Manoochehr

    2017-01-01

    Current trends in the field of educational technology indicate a shift in pedagogical assumptions and theoretical frameworks that favor active involvement of self-directed learners in a constructivist environment. This study probes the influence of electronic portfolio evaluation on vocabulary learning of Iranian university students and the…

  3. Induction and Processing of the Radiation-Induced Gamma-H2AX Signal and Its Link to the Underlying Pattern of DSB: A Combined Experimental and Modelling Study

    PubMed Central

    Tommasino, Francesco; Friedrich, Thomas; Jakob, Burkhard; Meyer, Barbara; Durante, Marco; Scholz, Michael

    2015-01-01

    We present here an analysis of DSB induction and processing after irradiation with X-rays in an extended dose range based on the use of the γH2AX assay. The study was performed by quantitative flow cytometry measurements, since the use of foci counting would result in reasonable accuracy only in a limited dose range of a few Gy. The experimental data are complemented by a theoretical analysis based on the GLOBLE model. In fact, the original aim of the study was to test GLOBLE predictions against new experimental data, in order to contribute to the validation of the model. Specifically, the γH2AX signal kinetics has been investigated up to 24 h after exposure to increasing photon doses between 2 and 500 Gy. The prolonged persistence of the signal at high doses strongly suggests dose dependence in DSB processing after low LET irradiation. Importantly, in the framework of our modelling analysis, this is related to a gradually increased fraction of DSB clustering at the micrometre scale. The parallel study of γH2AX dose response curves shows the onset of a pronounced saturation in two cell lines at a dose of about 20 Gy. This dose is much lower than expected according to model predictions based on the values usually adopted for the DSB induction yield (≈ 30 DSB/Gy) and for the γH2AX foci extension of approximately 2 Mbp around the DSB. We show and discuss how theoretical predictions and experimental findings can in principle be reconciled by combining an increased DSB induction yield with the assumption of a larger genomic extension for the single phosphorylated regions. As an alternative approach, we also considered in our model the possibility of a 3D spreading mechanism of H2AX phosphorylation around the induced DSB, and applied it to the analysis of both the aspects considered. Our results are found to be supportive of the basic assumptions on which GLOBLE is built. Apart from giving new insights into the H2AX phosphorylation process, experiments performed at high doses are of relevance in the context of radiation therapy, where hypo-fractionated schemes are becoming increasingly popular. PMID:26067661
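    The saturation argument can be put in one line of arithmetic: if each DSB decorates roughly w Mbp of chromatin and the yield is y DSB/Gy, the signal should saturate near D_sat ~ G/(y*w) for genome size G. A minimal sketch (the rough diploid genome size is an assumption added here):

      G = 6000.0                   # approx. diploid human genome [Mbp], assumed

      def d_sat(yield_per_gy, width_mbp):
          """Dose [Gy] at which phosphorylated regions tile the genome."""
          return G / (yield_per_gy * width_mbp)

      print(f"standard values (30 DSB/Gy, 2 Mbp):   ~{d_sat(30.0, 2.0):.0f} Gy")
      print(f"larger extension (30 DSB/Gy, 10 Mbp): ~{d_sat(30.0, 10.0):.0f} Gy")
      print(f"larger yield (60 DSB/Gy, 5 Mbp):      ~{d_sat(60.0, 5.0):.0f} Gy")

    With the commonly adopted values the predicted saturation (~100 Gy) sits far above the observed ~20 Gy, which is exactly why the authors consider an increased yield and/or a larger phosphorylated region.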

  4. A National Research Council Evaluation of the Department of Energy's Marine and Hydrokinetic Resource Assessments

    NASA Astrophysics Data System (ADS)

    Glickson, D.; Holmes, K. J.; Cooke, D.

    2012-12-01

    Marine and hydrokinetic (MHK) resources are increasingly becoming part of energy regulatory, planning, and marketing activities in the U.S. and elsewhere. In particular, state-based renewable portfolio standards and federal production and investment tax credits have led to an increased interest in the possible deployment of MHK technologies. The Energy Policy Act of 2005 (Public Law 109-58) directed the Department of Energy (DOE) to estimate the size of the MHK resource base. In order to help DOE prioritize its overall portfolio of future research, increase the understanding of the potential for MHK resource development, and direct MHK device and/or project developers to locations of greatest promise, the DOE Wind and Water Power Program requested that the National Research Council (NRC) provide an evaluation of the detailed assessments being conducted by five individual resource assessment groups. These resource assessment groups were contracted to estimate the amount of extractable energy from wave, tidal, ocean current, ocean thermal energy conversion, and riverine resources. Performing these assessments requires that each resource assessment group estimate the average power density of the resource base, as well as the basic technology characteristics and spatial and temporal constituents that convert power into electricity for that resource. The NRC committee evaluated the methodologies, technologies, and assumptions associated with each of these resource assessments. The committee developed a conceptual framework for delineating the processes used to develop the assessment results requested by the DOE, with definitions of the theoretical, technical, and practical resource to clarify elements of the overall resource assessment process. This allowed the NRC committee to make a comparison of different methods, terminology, and processes among the five resource assessment groups. The committee concluded that the overall approach taken by the wave resource and tidal resource assessment groups is a useful contribution to understanding the distribution and possible magnitude of energy sources from waves and tides in U.S. waters, but had concerns regarding the usefulness of aggregating the analysis to produce a "single number" estimate of the total national or regional theoretical and technical resource base. The committee had further concerns about the methodologies and assumptions within each assessment, as well as the limited scope of validation exercises. An interim report was released in July 2011, and the committee's final report will be released in Fall 2012.
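    For the wave resource, the quantity being aggregated is essentially a power density along the coast. A minimal sketch using the standard deep-water estimate of omnidirectional wave energy flux per metre of crest, J = rho*g^2*Hs^2*Te/(64*pi), with significant wave height Hs and energy period Te (the sea-state values are illustrative, not from the assessments):

      import math

      rho, g = 1025.0, 9.81        # seawater density [kg/m^3], gravity [m/s^2]

      def wave_energy_flux(hs, te):
          """Deep-water wave energy flux [W per metre of crest]."""
          return rho * g**2 * hs**2 * te / (64.0 * math.pi)

      print(f"Hs = 2 m, Te = 8 s : {wave_energy_flux(2.0, 8.0) / 1e3:5.1f} kW/m")
      print(f"Hs = 3 m, Te = 10 s: {wave_energy_flux(3.0, 10.0) / 1e3:5.1f} kW/m")

    The theoretical resource aggregates such fluxes along depth contours; the technical resource then applies device efficiency and spacing constraints, which is where the committee's concern about "single number" national totals arises.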

  5. Characteristics of Knowledge Interconnectedness in Teaching

    ERIC Educational Resources Information Center

    Antonijevic, Radovan

    2006-01-01

    The paper establishes the basic characteristics, forms, and levels of knowledge interconnectedness in teaching, especially in mathematics and biology teaching. The analysis was carried out by considering basic theoretical views in this field, as well as by establishing the features and levels of knowledge interconnectedness in the…

  6. Academic Public Relations Curricula: How They Compare with the Bateman-Cutlip Commission Standards.

    ERIC Educational Resources Information Center

    McCartney, Hunter P.

    To see what effect the 1975 Bateman-Cutlip Commission's recommendations have had on improving public relations education in the United States, 173 questionnaires were sent to colleges or universities with accredited or comprehensive programs in public relations. Responding to five basic assumptions underlying the commission's recommendations,…

  7. Faculty and Student Attitudes about Transfer of Learning

    ERIC Educational Resources Information Center

    Lightner, Robin; Benander, Ruth; Kramer, Eugene F.

    2008-01-01

    Transfer of learning is using previous knowledge in novel contexts. While this is a basic assumption of the educational process, students may not always perceive all the options for using what they have learned in different, novel situations. Within the framework of transfer of learning, this study outlines an attitudinal survey concerning faculty…

  8. New Directions in Teacher Education: Foundations, Curriculum, Policy.

    ERIC Educational Resources Information Center

    Denton, Jon, Ed.; And Others

    This publication includes presentations made at the Aikin-Stinnett Lecture Series and follow-up papers sponsored by the Instructional Research Laboratory at Texas A&M University. The papers in this collection focus upon the basic assumptions and conceptual bases of teacher education and the use of research in providing a foundation for…

  9. Perspective Making: Constructivism as a Meaning-Making Structure for Simulation Gaming

    ERIC Educational Resources Information Center

    Lainema, Timo

    2009-01-01

    Constructivism has recently gained popularity, although it is not a completely new learning paradigm. Much of the work within e-learning, for example, uses constructivism as a reference "discipline" (explicitly or implicitly). However, some of the work done within the simulation gaming (SG) community discusses what the basic assumptions and…

  10. Spiral Growth in Plants: Models and Simulations

    ERIC Educational Resources Information Center

    Allen, Bradford D.

    2004-01-01

    The analysis and simulation of spiral growth in plants integrates algebra and trigonometry in a botanical setting. When the ideas presented here are used in a mathematics classroom/computer lab, students can better understand how basic assumptions about plant growth lead to the golden ratio and how the use of circular functions leads to accurate…
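    A minimal sketch of the simulation the article builds on: successive primordia are placed at the golden angle, 360/phi^2 ~ 137.5 degrees, on a radius growing as sqrt(n), which produces the familiar evenly packed spirals (the radial scale constant is an illustrative assumption):

      import math

      phi = (1 + math.sqrt(5)) / 2           # golden ratio
      golden_angle = 2 * math.pi / phi**2    # ~137.508 degrees
      c = 1.0                                # assumed radial scale constant

      def primordium(n):
          """Cartesian position of the n-th primordium."""
          r = c * math.sqrt(n)
          theta = n * golden_angle
          return r * math.cos(theta), r * math.sin(theta)

      for n in (1, 2, 3, 5, 8, 13):          # Fibonacci indices align along spirals
          x, y = primordium(n)
          print(f"n = {n:2d}: ({x:6.2f}, {y:6.2f})")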

  11. Dynamic Assessment and Its Implications for RTI Models

    ERIC Educational Resources Information Center

    Wagner, Richard K.; Compton, Donald L.

    2011-01-01

    Dynamic assessment refers to assessment that combines elements of instruction for the purpose of learning something about an individual that cannot be learned as easily or at all from conventional assessment. The origins of dynamic assessment can be traced to Thorndike (1924), Rey (1934), and Vygotsky (1962), who shared three basic assumptions.…

  12. Looking for Skinner and Finding Freud

    ERIC Educational Resources Information Center

    Overskeid, Geir

    2007-01-01

    Sigmund Freud and B. F. Skinner are often seen as psychology's polar opposites. It seems this view is fallacious. Indeed, Freud and Skinner had many things in common, including basic assumptions shaped by positivism and determinism. More important, Skinner took a clear interest in psychoanalysis and wanted to be analyzed but was turned down. His…

  13. Student Teachers' Beliefs about the Teacher's Role in Inclusive Education

    ERIC Educational Resources Information Center

    Domovic, Vlatka; Vizek Vidovic, Vlasta; Bouillet, Dejana

    2017-01-01

    The main aim of this research is to examine the basic features of student teachers' professional beliefs about the teacher's role in relation to teaching mainstream pupils and pupils with developmental disabilities. The starting assumption of this analysis is that teacher professional development is largely dependent upon teachers' beliefs about…

  14. United States Air Force Training Line Simulator. Final Report.

    ERIC Educational Resources Information Center

    Nauta, Franz; Pierce, Michael B.

    This report describes the technical aspects and potential applications of a computer-based model simulating the flow of airmen through basic training and entry-level technical training. The objective of the simulation is to assess the impacts of alternative recruit classification and training policies under a wide variety of assumptions regarding…

  15. Cable in Boston; A Basic Viability Report.

    ERIC Educational Resources Information Center

    Hauben, Jan Ward; And Others

    The viability of urban cable television (CATV) as an economic phenomenon is examined via a case study of its feasibility in Boston, a microcosm of general urban environment. To clarify cable's economics, a unitary concept of viability is used in which all local characteristics, cost assumptions, and growth estimates are structured dynamically as a…

  16. "I Fell off [the Mothering] Track": Barriers to "Effective Mothering" among Prostituted Women

    ERIC Educational Resources Information Center

    Dalla, Rochelle

    2004-01-01

    Ecological theory and basic assumptions for the promotion of effective mothering among low-income and working-poor women are applied in relation to a particularly vulnerable population: street-level prostitution-involved women. Qualitative data from 38 street-level prostituted women shows barriers to effective mothering at the individual,…

  17. Between "Homo Sociologicus" and "Homo Biologicus": The Reflexive Self in the Age of Social Neuroscience

    ERIC Educational Resources Information Center

    Pickel, Andreas

    2012-01-01

    The social sciences rely on assumptions of a unified self for their explanatory logics. Recent work in the new multidisciplinary field of social neuroscience challenges precisely this unproblematic character of the subjective self as basic, well-defined entity. If disciplinary self-insulation is deemed unacceptable, the philosophical challenge…

  18. Fueling a Third Paradigm of Education: The Pedagogical Implications of Digital, Social and Mobile Media

    ERIC Educational Resources Information Center

    Pavlik, John V.

    2015-01-01

    Emerging technologies are fueling a third paradigm of education. Digital, networked and mobile media are enabling a disruptive transformation of the teaching and learning process. This paradigm challenges traditional assumptions that have long characterized educational institutions and processes, including basic notions of space, time, content,…

  19. Using LISREL to Evaluate Measurement Models and Scale Reliability.

    ERIC Educational Resources Information Center

    Fleishman, John; Benson, Jeri

    1987-01-01

    The LISREL program was used to examine measurement model assumptions and to assess the reliability of the Coopersmith Self-Esteem Inventory for Children, Form B. Data on 722 third- through sixth-graders from over 70 schools in a large urban school district were used. The LISREL program assessed (1) the nature of the basic measurement model for the scale, (2) scale invariance across…
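
    The measurement-model logic can be illustrated compactly: for a congeneric one-factor model of the kind LISREL estimates, scale reliability follows from the factor loadings and unique variances. The sketch below uses hypothetical loadings, not the inventory's actual estimates, to compute the composite reliability (McDonald's omega).

    ```python
    import numpy as np

    # Composite reliability (McDonald's omega) for a congeneric one-factor
    # measurement model, x_i = lambda_i * xi + delta_i. Loadings are
    # hypothetical values for illustration only.
    lam = np.array([0.71, 0.64, 0.58, 0.69, 0.52])   # standardized loadings (assumed)
    theta = 1 - lam ** 2                             # standardized unique variances

    omega = lam.sum() ** 2 / (lam.sum() ** 2 + theta.sum())
    print(f"composite reliability (omega) = {omega:.3f}")
    ```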

  20. The Hidden Reason Behind Children's Misbehavior.

    ERIC Educational Resources Information Center

    Nystul, Michael S.

    1986-01-01

    Discusses hidden reason theory, based on the assumptions that (1) the nature of people is positive; (2) a child's most basic psychological need is involvement; and (3) a child has four possible choices in life (good somebody, good nobody, bad somebody, or severely mentally ill). A three-step approach for implementing hidden reason theory is…

  1. 78 FR 26269 - Connect America Fund; High-Cost Universal Service Support

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-06

    ... the model platform, which is the basic framework for the model consisting of key assumptions about the... combination of competitive bidding and a new forward-looking model of the cost of constructing modern multi-purpose networks." Using the cost model to "estimate the support necessary to serve areas where costs...

  2. Response: Training Doctoral Students to Be Scientists

    ERIC Educational Resources Information Center

    Pollio, David E.

    2012-01-01

    The purpose of this article is to begin framing doctoral training for a science of social work. This process starts by examining two seemingly simple questions: "What is a social work scientist?" and "How do we train social work scientists?" In answering the first question, some basic assumptions and concepts about what constitutes a "social work…

  3. Adults with Intellectual and Developmental Disabilities and Participation in Decision Making: Ethical Considerations for Professional-Client Practice

    ERIC Educational Resources Information Center

    Lotan, Gurit; Ells, Carolyn

    2010-01-01

    In this article, the authors challenge professionals to re-examine assumptions about basic concepts and their implications in supporting adults with intellectual and developmental disabilities. The authors focus on decisions with significant implications, such as planning transition from school to adult life, changing living environments, and…

  4. A Convergence of Two Cultures in the Implementation of P.L. 94-142.

    ERIC Educational Resources Information Center

    Haas, Toni J.

    The Education for All Handicapped Children Act (PL 94-142) demanded basic changes in the practices, purposes, and institutional structures of schools to accommodate handicapped students, but did not adequately address the differences between general and special educators in expectations, training, or assumptions about the functions of schooling…

  5. From Earth to Space--Advertising Films Created in a Computer-Based Primary School Task

    ERIC Educational Resources Information Center

    Öman, Anne

    2017-01-01

    Today, teachers orchestrate computer-based tasks in software applications in Swedish primary schools. Meaning is made through various modes, and multimodal perspectives on literacy have the basic assumption that meaning is made through many representational and communicational resources. The case study presented in this paper has analysed pupils'…

  6. Child Sexual Abuse: Intervention and Treatment Issues. The User Manual Series.

    ERIC Educational Resources Information Center

    Faller, Kathleen Coulborn

    This manual describes professional practices in intervention and treatment of sexual abuse and discusses how to address the problems of sexually abused children and their families. It makes an assumption that the reader has basic information about sexual abuse. The discussion focuses primarily on the child's guardian as the abuser. The manual…

  7. A Comparative Analysis of Selected Mechanical Aspects of the Ice Skating Stride.

    ERIC Educational Resources Information Center

    Marino, G. Wayne

    This study quantitatively analyzes selected aspects of the skating strides of above-average and below-average ability skaters. Subproblems were to determine how stride length and stride rate are affected by changes in skating velocity, to ascertain whether the basic assumption that stride length accurately approximates horizontal movement of the…

  8. Implementing a Redesign Strategy: Lessons from Educational Change.

    ERIC Educational Resources Information Center

    Basom, Richard E., Jr.; Crandall, David P.

    The effective implementation of school redesign, based on a social systems approach, is discussed in this paper. A basic assumption is that the interdependence of system elements has implications for a complex change process. Seven barriers to redesign and five critical issues for successful redesign strategy are presented. Seven linear steps for…

  9. Children Are Human Beings

    ERIC Educational Resources Information Center

    Bossard, James H. S.

    2017-01-01

    The basic assumption underlying this article is that the really significant changes in human history are those that occur, not in the mechanical gadgets which men use nor in the institutionalized arrangements by which they live, but in their attitudes and in the values which they accept. The revolutions of the past that have had the greatest…

  10. Civility in Politics and Education. Routledge Studies in Contemporary Philosophy

    ERIC Educational Resources Information Center

    Mower, Deborah, Ed.; Robison, Wade L., Ed.

    2011-01-01

    This book examines the concept of civility and the conditions of civil disagreement in politics and education. Although many assume that civility is merely polite behavior, it functions to aid rational discourse. Building on this basic assumption, the book offers multiple accounts of civility and its contribution to citizenship, deliberative…

  11. Improving Clinical Teaching: The ADN Experience. Pathways to Practice.

    ERIC Educational Resources Information Center

    Haase, Patricia T.; And Others

    Three Florida associate degree in nursing (ADN) demonstration projects of the Nursing Curriculum Project (NCP) are described, and the history of the ADN program and current controversies are reviewed. In 1976, the NCP of the Southern Regional Education Board issued basic assumptions about the role of the ADN graduate, relating them to client…

  12. Development and Validation of a Clarinet Performance Adjudication Scale

    ERIC Educational Resources Information Center

    Abeles, Harold F.

    1973-01-01

    A basic assumption of this study is that there are generally agreed upon performance standards as evidenced by the use of adjudicators for evaluations at contests and festivals. An evaluation instrument was developed to enable raters to measure effectively those aspects of performance that have common standards of proficiency. (Author/RK)

  13. Organize Your School for Improvement

    ERIC Educational Resources Information Center

    Truby, William F.

    2017-01-01

    W. Edwards Deming suggested that 96% of an organization's performance is a function of the organization's structure; he contended that only about 4% of an organization's performance is attributable to its people. This is a fundamental difference, as most school leaders work from the basic assumption that 80% of a school's performance is related to staff and…

  14. Training for Basic Skills or Educating Workers?: Changing Conceptions of Workplace Education Programs.

    ERIC Educational Resources Information Center

    Schultz, Katherine

    Although the National Workplace Literacy Program is relatively new, a new orthodoxy of program development based on particular understandings of literacy and learning has emerged. Descriptions of two model workplace education programs are the beginning points for an examination of the assumptions contained in most reports of workplace education…

  15. Appreciative Inquiry: A Model for Organizational Development and Performance Improvement in Student Affairs

    ERIC Educational Resources Information Center

    Elleven, Russell K.

    2007-01-01

    The article examines a relatively new tool to increase the effectiveness of organizations and people. The recent development and background of Appreciative Inquiry (AI) is reviewed. Basic assumptions of the model are discussed. Implications for departments and divisions of student affairs are analyzed. Finally, suggested readings and workshop…

  16. Resegregation in Norfolk, Virginia. Does Restoring Neighborhood Schools Work?

    ERIC Educational Resources Information Center

    Meldrum, Christina; Eaton, Susan E.

    This report reviews school department data and interviews with officials and others involved in the Norfolk (Virginia) school resegregation plan designed to stem White flight and increase parental involvement. The report finds that all the basic assumptions the local community and the court had about the potential benefits of undoing the city's…

  17. An Economic Theory of School Governance.

    ERIC Educational Resources Information Center

    Rada, Roger D.

    Working from the basic assumption that the primary motivation for those involved in school governance is self-interest, this paper develops and discusses 15 hypotheses that form the essential elements of an economic theory of school governance. The paper opens with a review of previous theories of governance and their origins in social science…

  18. The Effectiveness of Ineffectiveness: A New Approach to Assessing Patterns of Organizational Effectiveness.

    ERIC Educational Resources Information Center

    Cameron, Kim S.

    A way to assess and improve organizational effectiveness is discussed, with a focus on factors that inhibit successful organizational performance. The basic assumption is that it is easier, more accurate, and more beneficial for individuals and organizations to identify criteria of ineffectiveness (faults and weaknesses) than to identify criteria…

  19. Validated Test Method 1314: Liquid-Solid Partitioning as a Function of Liquid-Solid Ratio for Constituents in Solid Materials Using An Up-Flow Percolation Column Procedure

    EPA Pesticide Factsheets

    Describes procedures written on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  20. Testing Intercultural Competence in (International) English: Some Basic Questions and Suggested Answers

    ERIC Educational Resources Information Center

    Camerer, Rudi

    2014-01-01

    The testing of intercultural competence has long been regarded as the field of psychometric test procedures, which claim to analyse an individual's personality by specifying and quantifying personality traits with the help of self-answer questionnaires and the statistical evaluation of these. The underlying assumption is that what is analysed and…

  1. Lifeboat Counseling: The Issue of Survival Decisions

    ERIC Educational Resources Information Center

    Dowd, E. Thomas; Emener, William G.

    1978-01-01

    Rehabilitation counseling, as a profession, needs to look at future world possibilities, especially in light of overpopulation, and be aware that the need may arise for adjusting basic assumptions about human life--from the belief that every individual has a right to a meaningful life to the notion of selecting who shall live. (DTT)

  2. Challenges of Adopting Constructive Alignment in Action Learning Education

    ERIC Educational Resources Information Center

    Remneland Wikhamn, Björn

    2017-01-01

    This paper will critically examine how the two influential pedagogical approaches of action-based learning and constructive alignment relate to each other, and how they may differ in focus and basic assumptions. From the outset, they are based on similar underpinnings, with the student and the learning outcomes in the center. Drawing from…

  3. Curricular Learning Communities and Unprepared Students: How Faculty Can Provide a Foundation for Success

    ERIC Educational Resources Information Center

    Engstrom, Cathy McHugh

    2008-01-01

    The pedagogical assumptions and teaching practices of learning community models reflect exemplary conditions for learning, so using these models with unprepared students seems desirable and worthy of investigation. This chapter describes the key role of faculty in creating active, integrative learning experiences for students in basic skills…

  4. Education in Conflict and Crisis for National Security.

    ERIC Educational Resources Information Center

    McClelland, Charles A.

    A basic assumption is that the level of conflict within and between nations will escalate over the next 50 years. Trying to "muddle through" using the tools and techniques of organized violence may yield national suicide. Therefore, complex conflict resolution skills need to be developed and used by some part of society to quell disorder…

  5. Textbooks as a Possible Influence on Unscientific Ideas about Evolution

    ERIC Educational Resources Information Center

    Tshuma, Tholani; Sanders, Martie

    2015-01-01

    While school textbooks are assumed to be written for and used by students, it is widely acknowledged that they also serve a vital support function for teachers, particularly in times of curriculum change. A basic assumption is that biology textbooks are scientifically accurate. Furthermore, because of the negative impact of…

  6. A basic review on the inferior alveolar nerve block techniques.

    PubMed

    Khalil, Hesham

    2014-01-01

    The inferior alveolar nerve block is the most common injection technique used in dentistry, and many modifications of the conventional nerve block have recently been described in the literature. Selecting the best technique depends on many factors, including the success rate and the complications associated with the selected technique. Dentists should be aware of the current modifications of the inferior alveolar nerve block techniques in order to choose effectively among them. Some operators may have difficulty identifying the anatomical landmarks that guide the inferior alveolar nerve block and rely instead on assumptions as to where the needle should be positioned. Such assumptions can lead to failure, and the failure rate of the inferior alveolar nerve block has been reported to be 20-25%, which is considered very high. In this basic review, the anatomical details of the inferior alveolar nerve will be given together with a description of both its conventional and modified blocking techniques; in addition, an overview of the complications that may result from the application of this important technique will be provided.

  7. A basic review on the inferior alveolar nerve block techniques

    PubMed Central

    Khalil, Hesham

    2014-01-01

    The inferior alveolar nerve block is the most common injection technique used in dentistry, and many modifications of the conventional nerve block have recently been described in the literature. Selecting the best technique depends on many factors, including the success rate and the complications associated with the selected technique. Dentists should be aware of the current modifications of the inferior alveolar nerve block techniques in order to choose effectively among them. Some operators may have difficulty identifying the anatomical landmarks that guide the inferior alveolar nerve block and rely instead on assumptions as to where the needle should be positioned. Such assumptions can lead to failure, and the failure rate of the inferior alveolar nerve block has been reported to be 20-25%, which is considered very high. In this basic review, the anatomical details of the inferior alveolar nerve will be given together with a description of both its conventional and modified blocking techniques; in addition, an overview of the complications that may result from the application of this important technique will be provided. PMID:25886095

  8. A Markov chain model for reliability growth and decay

    NASA Technical Reports Server (NTRS)

    Siegrist, K.

    1982-01-01

    A mathematical model is developed to describe a complex system undergoing a sequence of trials in which there is interaction between the internal states of the system and the outcomes of the trials. For example, the model might describe a system undergoing testing that is redesigned after each failure. The basic assumptions for the model are that the state of the system after a trial depends probabilistically only on the state before the trial and on the outcome of the trial, and that the outcome of a trial depends probabilistically only on the state of the system before the trial. It is shown that under these basic assumptions, the successive states form a Markov chain and the successive states and outcomes jointly form a Markov chain. General results are obtained for the transition probabilities, steady-state distributions, etc. A special case studied in detail describes a system that has two possible states ('repaired' and 'unrepaired') undergoing trials that have three possible outcomes ('inherent failure', 'assignable-cause failure', and 'success'). For this model, the reliability function is computed explicitly and an optimal repair policy is obtained.
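
    The two-state special case lends itself to a compact numerical illustration. The sketch below is a minimal Python example, with hypothetical outcome probabilities and a hypothetical "decay" rule (an inherent failure is assumed to undo a repair); the report itself derives these quantities analytically.

    ```python
    import numpy as np

    # States: 0 = unrepaired, 1 = repaired.
    # Outcomes: 0 = inherent failure, 1 = assignable-cause failure, 2 = success.

    # P_outcome[s, o] = P(outcome o | state s before the trial). Assumption:
    # repair removes the assignable cause, so its probability drops to zero.
    P_outcome = np.array([
        [0.05, 0.20, 0.75],   # unrepaired
        [0.05, 0.00, 0.95],   # repaired
    ])

    # next_state[s, o] = state after the trial. Assumptions: an
    # assignable-cause failure triggers a redesign (state -> repaired);
    # an inherent failure in the repaired state undoes the repair (decay).
    next_state = np.array([
        [0, 1, 0],   # from unrepaired
        [0, 1, 1],   # from repaired
    ])

    # Build the state-to-state transition matrix of the induced Markov chain.
    T = np.zeros((2, 2))
    for s in range(2):
        for o in range(3):
            T[s, next_state[s, o]] += P_outcome[s, o]

    # Steady-state distribution: left eigenvector of T for eigenvalue 1.
    evals, evecs = np.linalg.eig(T.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi /= pi.sum()
    print("Transition matrix:\n", T)
    print("Steady state (unrepaired, repaired):", pi)
    print("Long-run success probability:", pi @ P_outcome[:, 2])
    ```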

  9. Uncertainties and understanding of experimental and theoretical results regarding reactions forming heavy and superheavy nuclei

    NASA Astrophysics Data System (ADS)

    Giardina, G.; Mandaglio, G.; Nasirov, A. K.; Anastasi, A.; Curciarello, F.; Fazio, G.

    2018-02-01

    Experimental and theoretical results for the P_CN fusion probability of reactants in the entrance channel and the W_sur survival probability against fission at deexcitation of the compound nucleus formed in heavy-ion collisions are discussed. The theoretical results for a set of nuclear reactions leading to the formation of compound nuclei (CNs) with charge number Z = 102-122 reveal a strong sensitivity of P_CN to the characteristics of the colliding nuclei in the entrance channel, the dynamics of the reaction mechanism, and the excitation energy of the system. We discuss the validity of assumptions and procedures for the analysis of experimental data, as well as the limits of validity of theoretical results obtained by the use of phenomenological models. The comparison of results obtained in many investigated reactions reveals serious limits of validity of the data analysis and calculation procedures.

  10. Comparison of theoretical proteomes: Identification of COGs with conserved and variable pI within the multimodal pI distribution

    PubMed Central

    Nandi, Soumyadeep; Mehra, Nipun; Lynn, Andrew M; Bhattacharya, Alok

    2005-01-01

    Background: Theoretical proteome analysis, generated by plotting theoretical isoelectric points (pI) against molecular masses of all proteins encoded by the genome, shows a multimodal distribution for pI. This multimodal distribution is an effect of the allowed combinations of the charged amino acids, and not due to evolutionary causes. The variation in this distribution can be correlated with the organism's ecological niche. Contributions to this variation may be mapped to individual proteins by studying the variation in pI of orthologs across microorganism genomes. Results: The distribution of ortholog pI values showed trimodal distributions for all prokaryotic genomes analyzed, similar to whole-proteome plots. Pairwise analysis of pI variation shows that a few COGs are conserved within, but most vary between, the acidic and basic regions of the distribution, while molecular mass is more highly conserved. At the level of functional grouping of orthologs, five groups vary significantly from the population of orthologs, which is attributed either to conservation at the level of sequences or to a bias for positively or negatively charged residues contributing to the function. Individual COGs conserved in both the acidic and basic regions of the trimodal distribution are identified, and orthologs that best represent the variation in levels of the acidic and basic regions are listed. Conclusion: The analysis of pI distribution using orthologs provides a basis for resolution of theoretical proteome comparison at the level of individual proteins. Orthologs identified as varying significantly between the major acidic and basic regions may be used as representative of the variation of the entire proteome. PMID:16150155
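
    As an illustration of how a theoretical pI is computed from sequence alone, the sketch below uses the standard Henderson-Hasselbalch/bisection approach; the pKa values are a common textbook set and are assumptions, not necessarily those used by the authors.

    ```python
    from collections import Counter

    # Assumed pKa values; published pKa sets differ slightly.
    PKA_POS = {"K": 10.5, "R": 12.5, "H": 6.0, "Nterm": 9.0}           # +1 when protonated
    PKA_NEG = {"D": 3.9, "E": 4.1, "C": 8.3, "Y": 10.1, "Cterm": 2.0}  # -1 when deprotonated

    def net_charge(seq: str, ph: float) -> float:
        counts = Counter(seq)
        counts["Nterm"] = counts["Cterm"] = 1
        pos = sum(n / (1 + 10 ** (ph - PKA_POS[aa]))
                  for aa, n in counts.items() if aa in PKA_POS)
        neg = sum(n / (1 + 10 ** (PKA_NEG[aa] - ph))
                  for aa, n in counts.items() if aa in PKA_NEG)
        return pos - neg

    def theoretical_pi(seq: str, lo: float = 0.0, hi: float = 14.0) -> float:
        # Net charge decreases monotonically with pH, so bisect for its zero.
        for _ in range(60):
            mid = (lo + hi) / 2
            if net_charge(seq, mid) > 0:
                lo = mid
            else:
                hi = mid
        return round(mid, 2)

    print(theoretical_pi("MKKLLDDEEHRY"))  # toy sequence
    ```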

  11. Autocorrelated residuals in inverse modelling of soil hydrological processes: a reason for concern or something that can safely be ignored?

    NASA Astrophysics Data System (ADS)

    Scharnagl, Benedikt; Durner, Wolfgang

    2013-04-01

    Models are inherently imperfect because they simplify processes that are themselves imperfectly known and understood. Moreover, the input variables and parameters needed to run a model are typically subject to various sources of error. As a consequence of these imperfections, model predictions will always deviate from corresponding observations. In most applications in soil hydrology, these deviations are clearly not random but rather show a systematic structure. From a statistical point of view, this systematic mismatch may be a reason for concern because it violates one of the basic assumptions made in inverse parameter estimation: the assumption of independence of the residuals. But what are the consequences of simply ignoring the autocorrelation in the residuals, as is current practice in soil hydrology? Are the parameter estimates still valid even though the statistical foundation they are based on is partially collapsed? Theory and practical experience from other fields of science have shown that violation of the independence assumption will result in overconfident uncertainty bounds and that in some cases it may lead to significantly different optimal parameter values. In our contribution, we present three soil hydrological case studies in which the effect of autocorrelated residuals on the estimated parameters was investigated in detail. We explicitly accounted for autocorrelated residuals using a formal likelihood function that incorporates an autoregressive model. The inverse problem was posed in a Bayesian framework, and the posterior probability density function of the parameters was estimated using Markov chain Monte Carlo simulation. In contrast to many other studies in related fields of science, and quite surprisingly, we found that the first-order autoregressive model, often abbreviated as AR(1), did not work well in the soil hydrological setting. We showed that a second-order autoregressive, or AR(2), model performs much better in these applications, leading to parameter and uncertainty estimates that satisfy all the underlying statistical assumptions. For theoretical reasons, these estimates are deemed more reliable than estimates based on the neglect of autocorrelation in the residuals. In compliance with theory and results reported in the literature, our results showed that parameter uncertainty bounds were substantially wider if autocorrelation in the residuals was explicitly accounted for, and the optimal parameter values were slightly different in this case. We argue that the autoregressive model presented here should be used as a matter of routine in inverse modeling of soil hydrological processes.
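
    To make the alternative concrete, the sketch below shows a conditional Gaussian log-likelihood with an AR(2) residual model (a minimal sketch; the function name, interface, and parameter values are assumptions, not the authors' code). Setting phi1 = phi2 = 0 recovers the usual independence assumption.

    ```python
    import numpy as np

    def log_likelihood_ar2(residuals: np.ndarray, phi1: float, phi2: float,
                           sigma: float) -> float:
        """Conditional Gaussian log-likelihood with AR(2) residuals."""
        e = residuals
        # Innovations: what remains after removing the autoregressive structure
        # (the first two residuals are conditioned on, not scored).
        innov = e[2:] - phi1 * e[1:-1] - phi2 * e[:-2]
        n = innov.size
        return (-0.5 * n * np.log(2 * np.pi * sigma ** 2)
                - 0.5 * np.sum(innov ** 2) / sigma ** 2)

    # Usage: residuals = observed minus simulated soil moisture. In a Bayesian
    # setup, phi1, phi2, and sigma are sampled along with the soil hydraulic
    # parameters (e.g., by MCMC), and this term replaces the i.i.d. likelihood.
    rng = np.random.default_rng(0)
    e = rng.standard_normal(200)
    print(log_likelihood_ar2(e, phi1=0.6, phi2=-0.2, sigma=1.0))
    ```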

  12. Elasticity reconstruction: Beyond the assumption of local homogeneity

    NASA Astrophysics Data System (ADS)

    Sinkus, Ralph; Daire, Jean-Luc; Van Beers, Bernard E.; Vilgrain, Valerie

    2010-07-01

    Elasticity imaging is a novel domain which is currently gaining significant interest in the medical field. Most inversion techniques are based on the homogeneity assumption, i.e. the local spatial derivatives of the complex-shear modulus are ignored. This analysis presents an analytic approach in order to overcome this limitation, i.e. first order spatial derivatives of the real-part of the complex-shear modulus are taken into account. Resulting distributions in a gauged breast lesion phantom agree very well with the theoretical expectations. An in-vivo example of a cholangiocarcinoma demonstrates that the new approach provides maps of the viscoelastic properties which agree much better with expectations from anatomy.
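
    Schematically, the contrast can be written as follows (notation assumed, not the paper's exact equations): under local homogeneity the gradient of the complex shear modulus G* is neglected, so a Helmholtz-type equation can be inverted algebraically, whereas the extended approach retains the first-order derivative terms.

    ```latex
    % u: time-harmonic displacement, \rho: density, \omega: angular frequency,
    % G^*: complex shear modulus (assumed notation).
    \[
    \underbrace{G^{*}\,\nabla^{2}u = -\rho\,\omega^{2}u}_{\text{local homogeneity: }\nabla G^{*}\,\approx\,0}
    \qquad\Longrightarrow\qquad
    G^{*} = -\,\frac{\rho\,\omega^{2}u}{\nabla^{2}u}
    \]
    \[
    \text{first-order derivative terms retained:}\qquad
    G^{*}\,\nabla^{2}u + \nabla G^{*}\cdot\nabla u = -\rho\,\omega^{2}u
    \]
    ```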

  13. Theoretical aerodynamic characteristics of a family of slender wing-tail-body combinations

    NASA Technical Reports Server (NTRS)

    Lomax, Harvard; Byrd, Paul F

    1951-01-01

    The aerodynamic characteristics of an airplane configuration composed of a swept-back, nearly constant chord wing and a triangular tail mounted on a cylindrical body are presented. The analysis is based on the assumption that the free-stream Mach number is near unity or that the configuration is slender. The calculations for the tail are made on the assumption that the vortex system trailing back from the wing is either a sheet lying entirely in the plane of the flat tail surface or has completely "rolled up" into two point vortices that lie either in, above, or below the plane of the tail surface.

  14. Validity of the mockwitness paradigm: testing the assumptions.

    PubMed

    McQuiston, Dawn E; Malpass, Roy S

    2002-08-01

    Mockwitness identifications are used to provide a quantitative measure of lineup fairness. Some theoretical and practical assumptions of this paradigm have not been studied in terms of mockwitnesses' decision processes and procedural variation (e.g., instructions, lineup presentation method), and the current experiment was conducted to empirically evaluate these assumptions. Four hundred and eighty mockwitnesses were given physical information about a culprit, received 1 of 4 variations of lineup instructions, and were asked to identify the culprit from either a fair or unfair sequential lineup containing 1 of 2 targets. Lineup bias estimates varied as a result of lineup fairness and the target presented. Mockwitnesses generally reported that the target's physical description was their main source of identifying information. Our findings support the use of mockwitness identifications as a useful technique for sequential lineup evaluation, but only for mockwitnesses who selected only 1 lineup member. Recommendations for the use of this evaluation procedure are discussed.

  15. Twin studies in psychiatry and psychology: science or pseudoscience?

    PubMed

    Joseph, Jay

    2002-01-01

    Twin studies are frequently cited in support of the influence of genetic factors for a wide range of psychiatric conditions and psychological trait differences. The most common method, known as the classical twin method, compares the concordance rates or correlations of reared-together identical (MZ) vs. reared-together same-sex fraternal (DZ) twins. However, drawing genetic inferences from MZ-DZ comparisons is problematic due to methodological problems and questionable assumptions. It is argued that the main theoretical assumption of the twin method--known as the "equal environment assumption"--is not tenable. The twin method is therefore of doubtful value as an indicator of genetic influences. Studies of reared-apart twins are discussed, and it is noted that these studies are also vulnerable to methodological problems and environmental confounds. It is concluded that there is little reason to believe that twin studies provide evidence in favor of genetic influences on psychiatric disorders and human behavioral differences.

  16. Causal inferences on the effectiveness of complex social programs: Navigating assumptions, sources of complexity and evaluation design challenges.

    PubMed

    Chatterji, Madhabi

    2016-12-01

    This paper explores avenues for navigating evaluation design challenges posed by complex social programs (CSPs) and their environments when conducting studies that call for generalizable, causal inferences on an intervention's effectiveness. A definition of a CSP is provided, drawing on examples from different fields, and an evaluation case is analyzed in depth to derive seven major sources of complexity that typify CSPs and threaten the assumptions of textbook-recommended experimental designs for impact evaluations. Theoretically supported alternative methodological strategies are discussed for navigating assumptions and countering the design challenges posed by the complex configurations and ecology of CSPs. Specific recommendations include sequential refinement of the evaluation design through systems thinking and systems-informed logic modeling, and use of extended-term, mixed-methods (ETMM) approaches with exploratory and confirmatory phases of the evaluation. In the proposed approach, logic models are refined through direct induction and interactions with stakeholders. To better guide assumption evaluation, question framing, and selection of appropriate methodological strategies, a multiphase evaluation design is recommended.

  17. Iconic Discourse: The Troubling Legacy of Mina Shaughnessy.

    ERIC Educational Resources Information Center

    Gunner, Jeanne

    1998-01-01

    Examines two debates within the basic writing community (the reaction against Min Zhan Lu's early theoretical work and the recent acrimonious debate regarding Ira Shor's defense of mainstreaming) showing how they reflect conflicting models of the basic writing field, with "critical" discourse challenging the conventions and authority of…

  18. Dropout and Persistence in Adult Basic Education.

    ERIC Educational Resources Information Center

    Clark, Lisa K.

    A review of research on dropouts from adult basic education (ABE) reveals several theoretical models relevant to the study of dropout and persistence in ABE, shows specific variables related to dropout behavior, provides insights as to who drops out and why, identifies some characteristics of disadvantaged adult learners, and reveals some problems…

  19. Students' Levels of Explanations, Models, and Misconceptions in Basic Quantum Chemistry: A Phenomenographic Study

    ERIC Educational Resources Information Center

    Stefani, Christina; Tsaparlis, Georgios

    2009-01-01

    We investigated students' knowledge constructions of basic quantum chemistry concepts, namely atomic orbitals, the Schrodinger equation, molecular orbitals, hybridization, and chemical bonding. Ausubel's theory of meaningful learning provided the theoretical framework and phenomenography the method of analysis. The semi-structured interview with…

  20. Microcomputer Calculation of Theoretical Pre-Exponential Factors for Bimolecular Reactions.

    ERIC Educational Resources Information Center

    Venugopalan, Mundiyath

    1991-01-01

    Described is the application of microcomputers to predict reaction rates based on theoretical atomic and molecular properties taught in undergraduate physical chemistry. Listed is the BASIC program which computes the partition functions for any specific bimolecular reactants. These functions are then used to calculate the pre-exponential factor of…
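
    A minimal modern sketch of the same kind of calculation, in Python rather than BASIC, is given below. As a simplifying assumption it keeps only the translational partition functions (the original program also handles rotational and vibrational contributions), so the numbers are illustrative only.

    ```python
    import math

    KB  = 1.380649e-23    # Boltzmann constant, J/K
    H   = 6.62607015e-34  # Planck constant, J s
    NA  = 6.02214076e23   # Avogadro constant, 1/mol
    AMU = 1.66053907e-27  # atomic mass unit, kg

    def q_trans_per_volume(mass_amu: float, T: float) -> float:
        """Translational partition function per unit volume, (2*pi*m*kB*T/h^2)^(3/2), in m^-3."""
        m = mass_amu * AMU
        return (2 * math.pi * m * KB * T / H ** 2) ** 1.5

    def pre_exponential(mass_a: float, mass_b: float, T: float) -> float:
        """Transition-state-theory A factor for A + B -> products, in L mol^-1 s^-1
        (translational contributions only, an illustrative simplification)."""
        q_a = q_trans_per_volume(mass_a, T)
        q_b = q_trans_per_volume(mass_b, T)
        q_ts = q_trans_per_volume(mass_a + mass_b, T)   # complex mass = m_A + m_B
        a_per_molecule = (KB * T / H) * q_ts / (q_a * q_b)  # m^3 s^-1
        return a_per_molecule * NA * 1000                   # -> L mol^-1 s^-1

    print(f"A(H + H2, 300 K) ~ {pre_exponential(1.008, 2.016, 300):.3e} L/mol/s")
    ```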

  1. The Theoretical Basis of the Effective School Improvement Model (ESI)

    ERIC Educational Resources Information Center

    Scheerens, Jaap; Demeuse, Marc

    2005-01-01

    This article describes the process of theoretical reflection that preceded the development and empirical verification of a model of "effective school improvement". The focus is on basic mechanisms that could be seen as underlying "getting things in motion" and change in education systems. Four mechanisms are distinguished:…

  2. Estimate of the critical exponents from the field-theoretical renormalization group: mathematical meaning of the 'Standard Values'

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pogorelov, A. A.; Suslov, I. M.

    2008-06-15

    New estimates of the critical exponents have been obtained from the field-theoretical renormalization group using a new method for summing divergent series. The results almost coincide with the central values obtained by Le Guillou and Zinn-Justin (the so-called standard values), but have lower uncertainty. It has been shown that the usual field-theoretical estimates implicitly assume smoothness of the coefficient functions. This assumption is open to discussion in view of the existence of an oscillating contribution to the coefficient functions. The appropriate interpretation of that contribution is necessary both for estimating the systematic errors of the standard values and for a further increase in accuracy.

  3. Charting the future course of rural health and remote health in Australia: Why we need theory.

    PubMed

    Bourke, Lisa; Humphreys, John S; Wakerman, John; Taylor, Judy

    2010-04-01

    This paper argues that rural and remote health is in need of theoretical development. Based on the authors' discussions, reflections and critical analyses of literature, this paper proposes key reasons why rural and remote health warrants the development of theoretical frameworks. The paper cites five reasons why theory is needed: (i) theory provides an approach for how a topic is studied; (ii) theory articulates key assumptions in knowledge development; (iii) theory systematises knowledge, enabling it to be transferable; (iv) theory provides predictability; and (v) theory enables comprehensive understanding. This paper concludes with a call for theoretical development in both rural and remote health to expand its knowledge and be more relevant to improving health care for rural Australians.

  4. Design and Experimental Results for a Natural-Laminar-Flow Airfoil for General Aviation Applications

    NASA Technical Reports Server (NTRS)

    Somers, D. M.

    1981-01-01

    A natural-laminar-flow airfoil for general aviation applications, the NLF(1)-0416, was designed and analyzed theoretically and verified experimentally in the Langley Low-Turbulence Pressure Tunnel. The basic objective of combining the high maximum lift of the NASA low-speed airfoils with the low cruise drag of the NACA 6-series airfoils was achieved. The safety requirement that the maximum lift coefficient not be significantly affected with transition fixed near the leading edge was also met. Comparisons of the theoretical and experimental results show excellent agreement. Comparisons with other airfoils, both laminar flow and turbulent flow, confirm the achievement of the basic objective.

  5. Quantum physics in neuroscience and psychology: A neurophysicalmodel of the mind/brain interaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Jeffrey M.; Stapp, Henry P.; Beauregard, Mario

    Neuropsychological research on the neural basis of behavior generally posits that brain mechanisms will ultimately suffice to explain all psychologically described phenomena. This assumption stems from the idea that the brain is made up entirely of material particles and fields, and that all causal mechanisms relevant to neuroscience can therefore be formulated solely in terms of properties of these elements. Thus terms having intrinsic mentalistic and/or experiential content (e.g., "feeling," "knowing," and "effort") are not included as primary causal factors. This theoretical restriction is motivated primarily by ideas about the natural world that have been known to be fundamentally incorrect for more than three quarters of a century. Contemporary basic physical theory differs profoundly from classical physics on the important matter of how the consciousness of human agents enters into the structure of empirical phenomena. The new principles contradict the older idea that local mechanical processes alone can account for the structure of all observed empirical data. Contemporary physical theory brings directly and irreducibly into the overall causal structure certain psychologically described choices made by human agents about how they will act. This key development in basic physical theory is applicable to neuroscience, and it provides neuroscientists and psychologists with an alternative conceptual framework for describing neural processes. Indeed, due to certain structural features of ion channels critical to synaptic function, contemporary physical theory must in principle be used when analyzing human brain dynamics. The new framework, unlike its classical-physics-based predecessor, is erected directly upon, and is compatible with, the prevailing principles of physics, and is able to represent more adequately than classical concepts the neuroplastic mechanisms relevant to the growing number of empirical studies of the capacity of directed attention and mental effort to systematically alter brain function.

  6. Mathematical analysis of frontal affinity chromatography in particle and membrane configurations.

    PubMed

    Tejeda-Mansir, A; Montesinos, R M; Guzmán, R

    2001-10-30

    The scale-up and optimization of large-scale affinity-chromatographic operations in the recovery, separation, and purification of biochemical components are of major industrial importance. The development of mathematical models to describe affinity-chromatographic processes, and the use of these models in computer programs to predict column performance, is an engineering approach that can help accomplish these bioprocess engineering tasks successfully. Most affinity-chromatographic separations are operated in the frontal mode, using fixed-bed columns. Purely diffusive particles, perfusion particles, and membrane-based affinity chromatography are among the main commercially available technologies for these separations. For a particular application, a basic understanding of the main similarities and differences between particle and membrane frontal affinity chromatography, and how these characteristics are reflected in the transport models, is of fundamental relevance. This review presents the basic theoretical considerations used in the development of particle and membrane affinity chromatography models that can be applied in the design and operation of large-scale affinity separations in fixed-bed columns. A transport model for column affinity chromatography that considers column dispersion, particle internal convection, external film resistance, finite kinetic rate, plus macropore and micropore resistances is analyzed as a framework for exploring the mathematical analysis further. Such models provide a generally realistic description of almost all practical systems. Specific mathematical models that take into account geometric considerations and transport effects have been developed for both particle and membrane affinity chromatography systems. Some of the most common simplified models, based on linear driving-force (LDF) and equilibrium assumptions, are emphasized. Analytical solutions of the corresponding simplified dimensionless affinity models are presented. Particular methods for estimating the parameters that characterize the mass-transfer and adsorption mechanisms in affinity systems are described.
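
    As a concrete illustration of the LDF simplification mentioned above, the sketch below applies it to a stirred batch of affinity adsorbent with Langmuir equilibrium; all parameter values are assumptions chosen for illustration, not drawn from the review.

    ```python
    # Linear driving-force (LDF) uptake with Langmuir equilibrium,
    # q* = qm*K*c / (1 + K*c), integrated by forward Euler.
    qm, K = 100.0, 0.5        # capacity (mg/mL solid), affinity (mL/mg) -- assumed
    k_ldf = 0.05              # lumped mass-transfer coefficient, 1/s -- assumed
    V, m = 100.0, 5.0         # liquid volume and adsorbent volume, mL -- assumed

    c, q = 2.0, 0.0           # initial liquid concentration (mg/mL), solid loading
    dt = 0.1
    for step in range(int(600 / dt)):          # simulate 10 minutes
        q_star = qm * K * c / (1 + K * c)      # equilibrium loading at current c
        dq = k_ldf * (q_star - q) * dt         # LDF rate: uptake ~ distance from q*
        q += dq
        c -= dq * m / V                        # liquid-phase mass balance
    print(f"after 10 min: c = {c:.3f} mg/mL, q = {q:.2f} mg/mL")
    ```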

  7. Quantum physics in neuroscience and psychology: a neurophysical model of mind–brain interaction

    PubMed Central

    Schwartz, Jeffrey M; Stapp, Henry P; Beauregard, Mario

    2005-01-01

    Neuropsychological research on the neural basis of behaviour generally posits that brain mechanisms will ultimately suffice to explain all psychologically described phenomena. This assumption stems from the idea that the brain is made up entirely of material particles and fields, and that all causal mechanisms relevant to neuroscience can therefore be formulated solely in terms of properties of these elements. Thus, terms having intrinsic mentalistic and/or experiential content (e.g. ‘feeling’, ‘knowing’ and ‘effort’) are not included as primary causal factors. This theoretical restriction is motivated primarily by ideas about the natural world that have been known to be fundamentally incorrect for more than three-quarters of a century. Contemporary basic physical theory differs profoundly from classic physics on the important matter of how the consciousness of human agents enters into the structure of empirical phenomena. The new principles contradict the older idea that local mechanical processes alone can account for the structure of all observed empirical data. Contemporary physical theory brings directly and irreducibly into the overall causal structure certain psychologically described choices made by human agents about how they will act. This key development in basic physical theory is applicable to neuroscience, and it provides neuroscientists and psychologists with an alternative conceptual framework for describing neural processes. Indeed, owing to certain structural features of ion channels critical to synaptic function, contemporary physical theory must in principle be used when analysing human brain dynamics. The new framework, unlike its classic-physics-based predecessor, is erected directly upon, and is compatible with, the prevailing principles of physics. It is able to represent more adequately than classic concepts the neuroplastic mechanisms relevant to the growing number of empirical studies of the capacity of directed attention and mental effort to systematically alter brain function. PMID:16147524

  8. Negotiating School Conflicts to Prevent Student Delinquency.

    ERIC Educational Resources Information Center

    De Cecco, John P.; Roberts, John K.

    One of 52 theoretical papers on school crime and its relation to poverty, this chapter presents a model of negotiation as a means to resolve school conflict. The assumption is that school conflict is inevitable, but student delinquency is not. Delinquent behavior results from the way that the school deals with conflict. Students resort to…

  9. Quantitative Differences in Retest Effects across Different Methods Used to Construct Alternate Test Forms

    ERIC Educational Resources Information Center

    Arendasy, Martin E.; Sommer, Markus

    2013-01-01

    Allowing respondents to retake a cognitive ability test has shown to increase their test scores. Several theoretical models have been proposed to explain this effect, which make distinct assumptions regarding the measurement invariance of psychometric tests across test administration sessions with regard to narrower cognitive abilities and general…

  10. Computerized Adaptive Test (CAT) Applications and Item Response Theory Models for Polytomous Items

    ERIC Educational Resources Information Center

    Aybek, Eren Can; Demirtasli, R. Nukhet

    2017-01-01

    This article aims to provide a theoretical framework for computerized adaptive tests (CAT) and item response theory models for polytomous items. Besides that, it aims to introduce the simulation and live CAT software to the related researchers. Computerized adaptive test algorithm, assumptions of item response theory models, nominal response…
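
    For concreteness, the sketch below computes category response probabilities under Samejima's graded response model, one of the polytomous IRT models such a CAT framework builds on; the item parameters here are hypothetical. In a live CAT, the next item would typically be chosen to maximize information at the current ability estimate.

    ```python
    import numpy as np

    def grm_probs(theta: float, a: float, b: np.ndarray) -> np.ndarray:
        """Category probabilities under Samejima's graded response model.
        a = discrimination, b = ordered category thresholds (hypothetical)."""
        # Cumulative probabilities P(X >= k) for k = 1..K-1,
        # with boundary values 1 and 0 appended.
        p_star = 1 / (1 + np.exp(-a * (theta - b)))
        p_cum = np.concatenate(([1.0], p_star, [0.0]))
        return p_cum[:-1] - p_cum[1:]          # P(X = k) for k = 0..K-1

    probs = grm_probs(theta=0.5, a=1.2, b=np.array([-1.0, 0.0, 1.5]))
    print(probs, probs.sum())   # probabilities over 4 categories, summing to 1
    ```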

  11. Bayesian Learning and the Psychology of Rule Induction

    ERIC Educational Resources Information Center

    Endress, Ansgar D.

    2013-01-01

    In recent years, Bayesian learning models have been applied to an increasing variety of domains. While such models have been criticized on theoretical grounds, the underlying assumptions and predictions are rarely made concrete and tested experimentally. Here, I use Frank and Tenenbaum's (2011) Bayesian model of rule-learning as a case study to…

  12. Analyzing Data from a Pretest-Posttest Control Group Design: The Importance of Statistical Assumptions

    ERIC Educational Resources Information Center

    Zientek, Linda; Nimon, Kim; Hammack-Brown, Bryn

    2016-01-01

    Purpose: Among the gold standards in human resource development (HRD) research are studies that test theoretically developed hypotheses and use experimental designs. A somewhat typical experimental design would involve collecting pretest and posttest data on individuals assigned to a control or experimental group. Data from such a design that…

  13. Reflective Pedagogy: The Integration of Methodology and Subject-Matter Content in a Graduate-Level Course

    ERIC Educational Resources Information Center

    Jakeman, Rick C.; Henderson, Markesha M.; Howard, Lionel C.

    2017-01-01

    This article presents a critical reflection on how we, instructors of a graduate-level course in higher education administration, sought to integrate theoretical and subject-matter content and research methodology. Our reflection, guided by autoethnography and teacher reflection, challenged both our assumptions about curriculum design and our…

  14. Complexity, Methodology and Method: Crafting a Critical Process of Research

    ERIC Educational Resources Information Center

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  15. "Wrighting" the Self: New Technologies and Textual Subjectivities

    ERIC Educational Resources Information Center

    Sakr, Mona

    2012-01-01

    The expression of the self through multimodal texts is a central theme in education. While it has been suggested that new technologies act as important mediators in the relationship between texts and subjectivity, the mechanisms underlying such mediation have been a neglected topic of research. This paper considers the theoretical assumptions upon…

  16. Daddy, I Know What the Story Means--Now, I Just Need Help with the Words.

    ERIC Educational Resources Information Center

    Bintz, William

    1998-01-01

    Describes an instance of literacy learning involving the author and his two daughters at a local bookstore. Discusses how this literacy event challenged the author to consider alternative assumptions about reading, learning to read, and the relationship between reading and literacy. Offers lingering questions about what theoretical assumptions…

  17. The Unfinished Stories of Two First Nations Mothers

    ERIC Educational Resources Information Center

    Moayeri, Maryam; Smith, Jane

    2010-01-01

    This study is shaped by an underlying theoretical assumption that literacy is a cultural practice, shaped by and shaping social factors such as culture, gender, politics, and economics. As a result, this article focuses on the literacy practices of two mothers who participated in the study. Because of their Aboriginal ancestry and the historical…

  18. On Knowing: Art and Visual Culture.

    ERIC Educational Resources Information Center

    Duncum, Paul, Ed.; Bracey, Ted, Ed.

    The question of whether or not art can be distinguished from all that is called visual culture has become central to art theoretical discussion over recent decades. This collection of essays and responses addresses this question with the specific aim of making sense of an epistemology of art, with the assumption that nothing less than a persuasive…

  19. Unequal Ecological Exchange and Environmental Degradation: A Theoretical Proposition and Cross-National Study of Deforestation, 1990-2000

    ERIC Educational Resources Information Center

    Jorgenson, Andrew K.

    2006-01-01

    Political-economic sociologists have long investigated the dynamics and consequences of international trade. With few exceptions, this area of inquiry ignores the possible connections between trade and environmental degradation. In contrast, environmental sociologists have made several assumptions about the environmental impacts of international…

  20. Impact of Handwriting Training on Fluency, Spelling and Text Quality among Third Graders

    ERIC Educational Resources Information Center

    Hurschler Lichtsteiner, Sibylle; Wicki, Werner; Falmann, Péter

    2018-01-01

    As recent studies and theoretical assumptions suggest that the quality of texts composed by children and adolescents is affected by their transcription skills, this experimental field trial aims at investigating the impact of combined handwriting/spelling training on fluency, spelling and text quality among normally developing 3rd graders…
