Sample records for basic assumption underlying

  1. Artificial Intelligence: Underlying Assumptions and Basic Objectives.

    ERIC Educational Resources Information Center

    Cercone, Nick; McCalla, Gordon

    1984-01-01

    Presents perspectives on methodological assumptions underlying research efforts in artificial intelligence (AI) and charts activities, motivations, methods, and current status of research in each of the major AI subareas: natural language understanding; computer vision; expert systems; search, problem solving, planning; theorem proving and logic…

  2. Teaching Practices: Reexamining Assumptions.

    ERIC Educational Resources Information Center

    Spodek, Bernard, Ed.

    This publication contains eight papers, selected from papers presented at the Bicentennial Conference on Early Childhood Education, that discuss different aspects of teaching practices. The first two chapters reexamine basic assumptions underlying the organization of curriculum experiences for young children. Chapter 3 discusses the need to…

  3. [The Basic-Symptom Concept and its Influence on Current International Research on the Prediction of Psychoses].

    PubMed

    Schultze-Lutter, F

    2016-12-01

The early detection of psychoses has become increasingly relevant in research and clinical practice. In addition to the ultra-high risk (UHR) approach, which targets an immediate risk of developing frank psychosis, the basic symptom approach, which targets the earliest possible detection of the developing disorder, is being used increasingly worldwide. The present review gives an introduction to the development and basic assumptions of the basic symptom concept, summarizes the results of studies on the specificity of basic symptoms for psychoses in different age groups as well as studies of their psychosis-predictive value, and gives an outlook on future research. Moreover, a brief introduction is given to recent initial imaging studies that support one of the main assumptions of the basic symptom concept, i.e., that basic symptoms are the most immediate phenomenological expression of the cerebral aberrations underlying the development of psychosis. From this it is concluded that basic symptoms may provide important information for future neurobiological research on the etiopathology of psychoses. © Georg Thieme Verlag KG Stuttgart · New York.

  4. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
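The second-child puzzle this abstract refers to can be made concrete with a short enumeration. The sketch below is my own illustration, not from the article; it shows how two different assumptions about how the information "there is a girl" was obtained yield different answers, which is exactly the point that the conclusions depend on the underlying scenario:

```python
from fractions import Fraction
from itertools import product

families = list(product("BG", repeat=2))  # four equally likely sibling pairs

# Scenario 1: we are told "at least one child is a girl".
with_girl = [f for f in families if "G" in f]
p_both = Fraction(sum(f == ("G", "G") for f in with_girl), len(with_girl))
print(p_both)  # 1/3

# Scenario 2: we meet one of the two children at random and she is a girl.
# Now each (family, observed child) pair is equally likely.
observations = [(f, i) for f in families for i in range(2)]
girl_seen = [(f, i) for f, i in observations if f[i] == "G"]
p_other = Fraction(sum(f[1 - i] == "G" for f, i in girl_seen), len(girl_seen))
print(p_other)  # 1/2
```

The same observed fact ("a girl exists") gives 1/3 under one sampling assumption and 1/2 under the other, so the probability is undefined until the scenario is explicated.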

  5. An Extension of the Partial Credit Model with an Application to the Measurement of Change.

    ERIC Educational Resources Information Center

    Fischer, Gerhard H.; Ponocny, Ivo

    1994-01-01

An extension to the partial credit model, the linear partial credit model, is considered under the assumption of a certain linear decomposition of the item × category parameters into basic parameters. A conditional maximum likelihood algorithm for estimating basic parameters is presented and illustrated with simulation and an empirical study. (SLD)

  6. On the Basis of the Basic Variety.

    ERIC Educational Resources Information Center

    Schwartz, Bonnie D.

    1997-01-01

    Considers the interplay between source and target language in relation to two points made by Klein and Perdue: (1) the argument that the analysis of the target language should not be used as the model for analyzing interlanguage data; and (2) the theoretical claim that under the technical assumptions of minimalism, the Basic Variety is a "perfect"…

  7. Application of the Recreation Opportunity Spectrum for Outdoor Recreation Planning on Army Installations.

    DTIC Science & Technology

    1982-03-01

to preference types, and uses capacity estimation; therefore, it is basically a good system for recreation and resource inventory and classification...quantity, and distribution of recreational resources. Its basic unit of inventory is landform, or the homogeneity of physical features used to...by Clark and Stankey, "the basic assumption underlying the ROS is that quality recreational experiences are best assured by providing a diverse set of

  8. Methodological problems in the use of indirect comparisons for evaluating healthcare interventions: survey of published systematic reviews.

    PubMed

    Song, Fujian; Loke, Yoon K; Walsh, Tanya; Glenny, Anne-Marie; Eastwood, Alison J; Altman, Douglas G

    2009-04-03

To investigate basic assumptions and other methodological problems in the application of indirect comparison in systematic reviews of competing healthcare interventions. Design: survey of published systematic reviews. Inclusion criteria: systematic reviews published between 2000 and 2007 in which an indirect approach had been explicitly used. Identified reviews were assessed for comprehensiveness of the literature search, the method used for indirect comparison, and whether assumptions about similarity and consistency were explicitly mentioned. The survey included 88 review reports. In 13 reviews, the indirect comparison was informal. Results from different trials were naively compared without a common control in six reviews. Adjusted indirect comparison was usually done using classic frequentist methods (n=49) or more complex methods (n=18). The key assumption of trial similarity was explicitly mentioned in only 40 of the 88 reviews. The consistency assumption was not explicit in most cases where direct and indirect evidence were compared or combined (18/30). Evidence from head-to-head comparison trials was not systematically searched for, or was not included, in nine cases. The identified methodological problems were an unclear understanding of underlying assumptions, inappropriate search and selection of relevant trials, use of inappropriate or flawed methods, lack of objective and validated methods to assess or improve trial similarity, and inadequate comparison or inappropriate combination of direct and indirect evidence. Adequate understanding of the basic assumptions underlying indirect and mixed treatment comparison is crucial to resolving these methodological problems.
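As a concrete illustration of the "classic frequentist methods" the survey mentions, the Bucher adjusted indirect comparison derives an A-versus-B effect from A-versus-C and B-versus-C trials that share a common control C. The sketch below uses hypothetical effect estimates (the numbers are invented for illustration), and it is valid only under the trial-similarity assumption the review highlights:

```python
import math

def bucher_indirect(d_ac, se_ac, d_bc, se_bc):
    """Adjusted indirect comparison of A vs B via a common control C.
    Valid only if the A-vs-C and B-vs-C trials are similar enough that
    the relative effects are exchangeable (the similarity assumption)."""
    d_ab = d_ac - d_bc                           # indirect point estimate
    se_ab = math.sqrt(se_ac**2 + se_bc**2)       # variances add
    ci = (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)
    return d_ab, se_ab, ci

# Hypothetical log odds ratios versus a shared placebo arm:
d_ab, se_ab, ci = bucher_indirect(-0.50, 0.20, -0.30, 0.25)
print(round(d_ab, 2), round(se_ab, 3), [round(x, 2) for x in ci])
```

Note that the indirect standard error is larger than either direct one, which is why head-to-head evidence, where it exists, should not be ignored.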

  9. [A reflection about organizational culture according to psychoanalysis' view].

    PubMed

    Cardoso, Maria Lúcia Alves Pereira

    2008-01-01

This article aims at submitting a reflection on the universal presuppositions of human culture proposed by Freud, as a prop for analyzing presuppositions of organizational culture according to Schein. In an article published in 1984, the latter claims that in order to decipher organizational culture one cannot rely upon the (visible) artifacts or the (perceptible) values, but should take a deeper plunge and identify the basic assumptions underlying organizational culture. Such presuppositions spread into the field of study concerning the individual inner self, within the sphere of psychoanalysis. We have therefore examined Freud's basic assumptions of human culture in order to ascertain their conformity with the paradigms of organizational culture as proposed by Schein.

10. Thermodynamic Properties of Low-Density ¹³²Xe Gas in the Temperature Range 165-275 K

    NASA Astrophysics Data System (ADS)

    Akour, Abdulrahman

    2018-01-01

The method of static fluctuation approximation was used to calculate selected thermodynamic properties (internal energy, entropy, heat capacity, and pressure) for xenon in a particularly low temperature range (165-270 K) under different conditions. This integrated microscopic study starts from a single basic assumption as its main input: the local field operator is replaced with its mean value. A closed set of nonlinear equations is then solved numerically by iteration, with the Hartree-Fock B2-type dispersion potential taken as the most appropriate potential for xenon. The results are in very good agreement with those of an ideal gas.

  11. Is Tissue the Issue? A Critique of SOMPA's Models and Tests.

    ERIC Educational Resources Information Center

    Goodman, Joan F.

    1979-01-01

    A critical view of the underlying theoretical rationale of the System of Multicultural Pluralistic Assessment (SOMPA) model for student assessment is presented. The critique is extensive and questions the basic assumptions of the model. (JKS)

  12. The Disk Instability Model for SU UMa systems - a Comparison of the Thermal-Tidal Model and Plain Vanilla Model

    NASA Astrophysics Data System (ADS)

    Cannizzo, John K.

    2017-01-01

    We utilize the time dependent accretion disk model described by Ichikawa & Osaki (1992) to explore two basic ideas for the outbursts in the SU UMa systems, Osaki's Thermal-Tidal Model, and the basic accretion disk limit cycle model. We explore a range in possible input parameters and model assumptions to delineate under what conditions each model may be preferred.

  13. A Basic Literacy Project for the Correctional Service of Canada: Curriculum Design as a Strategy for Staff Development.

    ERIC Educational Resources Information Center

    Collins, Michael

    1989-01-01

    Describes a Canadian curriculum development project; analyzes underlying policy assumptions. Advocates involvement of prison educators and inmates in the process if curriculum is to meet the educational needs of inmates. (Author/LAM)

  14. Basic principles of respiratory function monitoring in ventilated newborns: A review.

    PubMed

    Schmalisch, Gerd

    2016-09-01

Respiratory monitoring during mechanical ventilation provides a real-time picture of patient-ventilator interaction and is a prerequisite for lung-protective ventilation. Nowadays, measurements of airflow, tidal volume and applied pressures are standard in neonatal ventilators. The measurement of lung volume during mechanical ventilation by tracer gas washout techniques is still under development. The clinical use of capnography, although well established in adults, has not been embraced by neonatologists because of technical and methodological problems in very small infants. While the ventilatory parameters are well defined, the calculation of other physiological parameters is based upon specific assumptions that are difficult to verify. Incomplete knowledge of the theoretical background of these calculations and their limitations can lead to incorrect interpretations with clinical consequences. Therefore, the aim of this review was to describe the basic principles and the underlying assumptions of currently used methods for respiratory function monitoring in ventilated newborns and to highlight methodological limitations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Calculation of Temperature Rise in Calorimetry.

    ERIC Educational Resources Information Center

    Canagaratna, Sebastian G.; Witt, Jerry

    1988-01-01

    Gives a simple but fuller account of the basis for accurately calculating temperature rise in calorimetry. Points out some misconceptions regarding these calculations. Describes two basic methods, the extrapolation to zero time and the equal area method. Discusses the theoretical basis of each and their underlying assumptions. (CW)

  16. Network-level reproduction number and extinction threshold for vector-borne diseases.

    PubMed

    Xue, Ling; Scoglio, Caterina

    2015-06-01

The basic reproduction number of deterministic models is an essential quantity for predicting whether an epidemic will spread or not. Thresholds for disease extinction contribute crucial knowledge for the control, elimination, and mitigation of infectious diseases. Relationships between the basic reproduction numbers of two deterministic network-based ordinary differential equation vector-host models and the extinction thresholds of the corresponding stochastic continuous-time Markov chain models are derived under some assumptions. Numerical simulation results for malaria and Rift Valley fever transmission on heterogeneous networks agree with the analytical results even without those assumptions, suggesting that the relationships may hold in general and posing the mathematical problem of proving their existence. Moreover, numerical simulations show that the basic reproduction number does not monotonically increase or decrease with the extinction threshold. Consistent trends in extinction probability observed through numerical simulations provide novel insights into mitigation strategies for increasing the disease extinction probability. These findings may improve understanding of thresholds for disease persistence and thereby help control vector-borne diseases.
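For readers unfamiliar with the quantity, the basic reproduction number of a compartmental vector-host model is commonly computed as the spectral radius of the next-generation matrix K = F V⁻¹, where F collects new-infection terms and V collects transition terms. The toy sketch below uses invented parameters for a minimal two-compartment model, not the paper's network formulation:

```python
import numpy as np

# Assumed toy parameters (illustrative only):
beta_hv = 0.3   # transmission rate, vector -> host
beta_vh = 0.2   # transmission rate, host -> vector
gamma_h = 0.1   # host recovery rate
mu_v    = 0.05  # vector death rate

# F holds new infections, V holds transitions out of infected states.
F = np.array([[0.0,     beta_hv],
              [beta_vh, 0.0    ]])
V = np.array([[gamma_h, 0.0],
              [0.0,     mu_v]])

K = F @ np.linalg.inv(V)                 # next-generation matrix
R0 = max(abs(np.linalg.eigvals(K)))      # spectral radius
print(R0 > 1)  # epidemic spreads in the deterministic model when R0 > 1
```

For this simple model the spectral radius reduces to sqrt(beta_hv * beta_vh / (gamma_h * mu_v)); the paper's contribution concerns how such deterministic thresholds relate to extinction thresholds of the stochastic counterpart.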

  17. A Guide to Curriculum Planning in Mathematics. Bulletin No. 6284.

    ERIC Educational Resources Information Center

    Chambers, Donald L.; And Others

    This guide was written under the basic assumptions that the mathematics curriculum must continuously change and that mathematics is most effectively learned through a spiral approach. Further, it is assumed that the audience will be members of district mathematics curriculum committees. Instructional objectives have been organized to reveal the…

  18. Intergenerational resource transfers with random offspring numbers

    PubMed Central

    Arrow, Kenneth J.; Levin, Simon A.

    2009-01-01

    A problem common to biology and economics is the transfer of resources from parents to children. We consider the issue under the assumption that the number of offspring is unknown and can be represented as a random variable. There are 3 basic assumptions. The first assumption is that a given body of resources can be divided into consumption (yielding satisfaction) and transfer to children. The second assumption is that the parents' welfare includes a concern for the welfare of their children; this is recursive in the sense that the children's welfares include concern for their children and so forth. However, the welfare of a child from a given consumption is counted somewhat differently (generally less) than that of the parent (the welfare of a child is “discounted”). The third assumption is that resources transferred may grow (or decline). In economic language, investment, including that in education or nutrition, is productive. Under suitable restrictions, precise formulas for the resulting allocation of resources are found, demonstrating that, depending on the shape of the utility curve, uncertainty regarding the number of offspring may or may not favor increased consumption. The results imply that wealth (stock of resources) will ultimately have a log-normal distribution. PMID:19617553

  19. Academic Public Relations Curricula: How They Compare with the Bateman-Cutlip Commission Standards.

    ERIC Educational Resources Information Center

    McCartney, Hunter P.

    To see what effect the 1975 Bateman-Cutlip Commission's recommendations have had on improving public relations education in the United States, 173 questionnaires were sent to colleges or universities with accredited or comprehensive programs in public relations. Responding to five basic assumptions underlying the commission's recommendations,…

  20. United States Air Force Training Line Simulator. Final Report.

    ERIC Educational Resources Information Center

    Nauta, Franz; Pierce, Michael B.

    This report describes the technical aspects and potential applications of a computer-based model simulating the flow of airmen through basic training and entry-level technical training. The objective of the simulation is to assess the impacts of alternative recruit classification and training policies under a wide variety of assumptions regarding…

  1. Children Are Human Beings

    ERIC Educational Resources Information Center

    Bossard, James H. S.

    2017-01-01

    The basic assumption underlying this article is that the really significant changes in human history are those that occur, not in the mechanical gadgets which men use nor in the institutionalized arrangements by which they live, but in their attitudes and in the values which they accept. The revolutions of the past that have had the greatest…

  2. Testing Intercultural Competence in (International) English: Some Basic Questions and Suggested Answers

    ERIC Educational Resources Information Center

    Camerer, Rudi

    2014-01-01

    The testing of intercultural competence has long been regarded as the field of psychometric test procedures, which claim to analyse an individual's personality by specifying and quantifying personality traits with the help of self-answer questionnaires and the statistical evaluation of these. The underlying assumption is that what is analysed and…

  3. The Spouse and Familial Incest: An Adlerian Perspective.

    ERIC Educational Resources Information Center

    Quinn, Kathleen L.

    A major component of Adlerian psychology concerns the belief in responsibility to self and others. In both incest perpetrator and spouse the basic underlying assumption of responsibility to self and others is often not present. Activities and behaviors occur in a social context and as such need to be regarded within a social context that may serve…

  4. The Importance of Woody Twig Ends to Deer in the Southeast

    Treesearch

    Charles T. Cushwa; Robert L. Downing; Richard F. Harlow; David F. Urbston

    1970-01-01

    One of the basic assumptions underlying research on wildlife habitat in the five Atlantic states of the Southeast is that white-tailed deer (Odocoileus virginianus) rely heavily on the ends of woody twigs during the winter. Considerable research has been undertaken to determine methods for increasing and measuring the availability of woody twigs to...

  5. Model for Developing an In-Service Teacher Workshop To Help Multilingual and Multicultural Students.

    ERIC Educational Resources Information Center

    Kachaturoff, Grace; Romatowski, Jane A.

    This is a model for designing an inservice teacher workshop to assist teachers working with multicultural students. The basic assumption underlying the model is universities and schools need to work cooperatively to provide experiences for improving the quality of teaching by increasing awareness of educational issues and situations and by…

  6. Challenging Freedom: Neoliberalism and the Erosion of Democratic Education

    ERIC Educational Resources Information Center

    Karaba, Robert

    2016-01-01

Goodlad et al. (2002) rightly point out that a culture can either resist or support change. Schein's (2010) model of culture indicates that the observable behaviors of a culture can be explained by exposing the underlying shared values and basic assumptions that give meaning to the performance. Yet culture is many-faceted and complex. So Schein advised a…

  7. Cable Television and Education: Proceedings of the CATV and Education Conference, May 11-12, 1973.

    ERIC Educational Resources Information Center

    Cardellino, Earl L., Comp.; Forsythe, Charles G., Comp.

    Edited versions of the conference presentations are compiled. The purpose of the meeting was to bring together media specialists and other educators from throughout Pennsylvania to evaluate the basic assumptions underlying the educational use of cable television (CATV) and to share ideas about the ways in which cable could be used to change the…

  8. Quantitative Methodology: A Guide for Emerging Physical Education and Adapted Physical Education Researchers

    ERIC Educational Resources Information Center

    Haegele, Justin A.; Hodge, Samuel R.

    2015-01-01

    Emerging professionals, particularly senior-level undergraduate and graduate students in kinesiology who have an interest in physical education for individuals with and without disabilities, should understand the basic assumptions of the quantitative research paradigm. Knowledge of basic assumptions is critical for conducting, analyzing, and…

  9. Misleading Theoretical Assumptions in Hypertext/Hypermedia Research.

    ERIC Educational Resources Information Center

    Tergan, Sigmar-Olaf

    1997-01-01

    Reviews basic theoretical assumptions of research on learning with hypertext/hypermedia. Focuses on whether the results of research on hypertext/hypermedia-based learning support these assumptions. Results of empirical studies and theoretical analysis reveal that many research approaches have been misled by inappropriate theoretical assumptions on…

  10. Didactics and History of Mathematics: Knowledge and Self-Knowledge

    ERIC Educational Resources Information Center

    Fried, Michael N.

    2007-01-01

    The basic assumption of this paper is that mathematics and history of mathematics are both forms of knowledge and, therefore, represent different ways of knowing. This was also the basic assumption of Fried (2001) who maintained that these ways of knowing imply different conceptual and methodological commitments, which, in turn, lead to a conflict…

  11. The Discrepancy-Induced Source Comprehension (D-ISC) Model: Basic Assumptions and Preliminary Evidence

    ERIC Educational Resources Information Center

    Braasch, Jason L. G.; Bråten, Ivar

    2017-01-01

    Despite the importance of source attention and evaluation for learning from texts, little is known about the particular conditions that encourage sourcing during reading. In this article, basic assumptions of the discrepancy-induced source comprehension (D-ISC) model are presented, which describes the moment-by-moment cognitive processes that…

  12. 26 CFR 1.404(a)-3 - Contributions of an employer to or under an employees' pension trust or annuity plan that meets...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... representative experience may be used as an assumed retirement age. Different basic assumptions or rates may be used for different classes of risks or different groups where justified by conditions or required by... proper, or except when a change is necessitated by reason of the use of different methods, factors...

  13. A Test of Three Basic Assumptions of Situational Leadership® II Model and Their Implications for HRD Practitioners

    ERIC Educational Resources Information Center

    Zigarmi, Drea; Roberts, Taylor Peyton

    2017-01-01

    Purpose: This study aims to test the following three assertions underlying the Situational Leadership® II (SLII) Model: all four leadership styles are received by followers; all four leadership styles are needed by followers; and if there is a fit between the leadership style a follower receives and needs, that follower will demonstrate favorable…

  14. A Critical Reading of Ecocentrism and Its Meta-Scientific Use of Ecology: Instrumental versus Emancipatory Approaches in Environmental Education and Ecology Education

    ERIC Educational Resources Information Center

    Hovardas, Tasos

    2013-01-01

    The aim of the paper is to make a critical reading of ecocentrism and its meta-scientific use of ecology. First, basic assumptions of ecocentrism will be examined, which involve nature's intrinsic value, postmodern and modern positions in ecocentrism, and the subject-object dichotomy under the lenses of ecocentrism. Then, we will discuss…

  15. Rationality, Authority and Spindles: An Enquiry into Some Neglected Aspects of Organizational Effectiveness and a Partial Application to Public Schools.

    ERIC Educational Resources Information Center

    Allison, Derek J.

    Focusing on the problem of authority, an analysis of the theories of Max Weber, James D. Thompson, and Elliott Jaques forms the basis for this proposal for improved organizational effectiveness in public schools. Basic assumptions are that modern organizations are established and operated under rational principles and subject to rational analysis,…

  16. Knowledge Discovery from Relations

    ERIC Educational Resources Information Center

    Guo, Zhen

    2010-01-01

A basic and classical assumption in the machine learning research area is the "randomness assumption" (also known as the i.i.d. assumption), which states that data are assumed to be independently and identically generated by some known or unknown distribution. This assumption, which is the foundation of most existing approaches in the literature, simplifies…

  17. Teaching Critical Literacy across the Curriculum in Multimedia America.

    ERIC Educational Resources Information Center

    Semali, Ladislaus M.

    The teaching of media texts as a form of textual construction is embedded in the assumption that audiences bring individual preexisting dispositions even though the media may contribute to their shaping of basic attitudes, beliefs, values, and behavior. As summed up by D. Lusted, at the core of such textual construction are basic assumptions that…

  18. Statistical foundations of liquid-crystal theory

    PubMed Central

    Seguin, Brian; Fried, Eliot

    2013-01-01

    We develop a mechanical theory for systems of rod-like particles. Central to our approach is the assumption that the external power expenditure for any subsystem of rods is independent of the underlying frame of reference. This assumption is used to derive the basic balance laws for forces and torques. By considering inertial forces on par with other forces, these laws hold relative to any frame of reference, inertial or noninertial. Finally, we introduce a simple set of constitutive relations to govern the interactions between rods and find restrictions necessary and sufficient for these laws to be consistent with thermodynamics. Our framework provides a foundation for a statistical mechanical derivation of the macroscopic balance laws governing liquid crystals. PMID:23772091

  19. Is the hypothesis of preimplantation genetic screening (PGS) still supportable? A review.

    PubMed

    Gleicher, Norbert; Orvieto, Raoul

    2017-03-27

The hypothesis of preimplantation genetic screening (PGS) was first proposed 20 years ago, suggesting that eliminating aneuploid embryos prior to transfer would improve implantation rates of the remaining embryos during in vitro fertilization (IVF), increase pregnancy and live birth rates, and reduce miscarriages. This improved outcome was based on 5 essential assumptions: (i) Most IVF cycles fail because of aneuploid embryos. (ii) Their elimination prior to embryo transfer will improve IVF outcomes. (iii) A single trophectoderm biopsy (TEB) at blastocyst stage is representative of the whole TE. (iv) TE ploidy reliably represents the inner cell mass (ICM). (v) Ploidy does not change (i.e., self-correct) downstream from blastocyst stage. We aim to review these assumptions and challenge the general hypothesis of PGS. We reviewed 455 publications which, as of January 20, 2017, were listed in PubMed under the search phrase "preimplantation genetic screening (PGS) for aneuploidy". The literature review was performed by both authors, who agreed on the final 55 references. Various reports over the last 18 months have raised significant questions not only about the basic clinical utility of PGS but also about the biological underpinnings of the hypothesis and the technical ability of a single trophectoderm (TE) biopsy to accurately assess an embryo's ploidy, and have suggested that PGS actually negatively affects IVF outcomes while not affecting miscarriage rates. Moreover, due to high rates of false-positive diagnoses as a consequence of high mosaicism rates in the TE, PGS leads to the discarding of large numbers of normal embryos with potential for normal euploid pregnancies if transferred rather than disposed of. We found all 5 basic assumptions underlying the hypothesis of PGS to be unsupported: (i) The association of embryo aneuploidy with IVF failure has to be reevaluated in view of how much more common TE mosaicism is than had until recently been appreciated. (ii) Reliable elimination of presumed aneuploid embryos prior to embryo transfer appears unrealistic. (iii) Mathematical models demonstrate that a single TEB cannot provide reliable information about the whole TE. (iv) TE does not reliably reflect the ICM. (v) Embryos likely still have a strong innate ability to self-correct downstream from blastocyst stage, with the ICM doing so better than the TE. With all 5 basic assumptions underlying the hypothesis of PGS shown to be mistaken, the hypothesis itself no longer appears supportable. Clinical use of PGS for the purpose of improving IVF outcomes should therefore, going forward, be restricted to research studies.

  20. Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.

    PubMed

    Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian

    2011-01-01

    Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
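The traditional crisp-probability calculation that the article generalizes can be sketched in a few lines. The fault tree and likelihood values below are hypothetical, and the sketch deliberately uses the independence assumption the article questions, omitting its fuzzy-set and evidence-theory machinery:

```python
def p_and(*ps):
    # AND gate: all input events occur (independence assumed)
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    # OR gate: at least one input event occurs (independence assumed)
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical crisp basic-event likelihoods:
pump_fails, valve_fails, sensor_fails = 0.01, 0.02, 0.05

# Top event: (pump fails AND valve fails) OR sensor fails
top = p_or(p_and(pump_fails, valve_fails), sensor_fails)
print(round(top, 6))  # 0.05019
```

The article's point is that both inputs to such a calculation are fragile: the crisp likelihoods are often imprecise or incomplete, and the independence baked into `p_and`/`p_or` is frequently unrealistic, motivating fuzzy and evidence-theoretic replacements with a dependency coefficient.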

  1. Surface Oscillations of a Free-Falling Droplet of an Ideal Fluid

    NASA Astrophysics Data System (ADS)

    Kistovich, A. V.; Chashechkin, Yu. D.

    2018-03-01

    According to observations, drops freely falling in the air under the action of gravity are deformed and oscillate in a wide range of frequencies and scales. A technique for calculating surface axisymmetric oscillations of a deformed droplet in the linear approximation under the assumption that the amplitude and wavelength are small when compared to the droplet diameter is proposed. The basic form of an axisymmetric droplet is chosen from observations. The calculation results for surface oscillations agree with recorded data on the varying shape of water droplets falling in the air.

  2. [Medical service marketing at the time of medical insurance].

    PubMed

    Polyakov, I V; Uvarov, S A; Mikhaylova, L S; Lankin, K A

    1997-01-01

Presents approaches to applying the fundamentals of marketing to public health. Medical insurance organizations may work effectively as arbitrators and marketing agents; the basic assumption of marketing theory underlies their activity. The concept of marketing implies investigating the requirements of the users of medical services and developing measures aimed at meeting human requirements for health services and health maintenance.

  3. Telepresence for space: The state of the concept

    NASA Technical Reports Server (NTRS)

    Smith, Randy L.; Gillan, Douglas J.; Stuart, Mark A.

    1990-01-01

    The purpose here is to examine the concept of telepresence critically. To accomplish this goal, first, the assumptions that underlie telepresence and its applications are examined, and second, the issues raised by that examination are discussed. Also, these assumptions and issues are used as a means of shifting the focus in telepresence from development to user-based research. The most basic assumption of telepresence is that the information being provided to the human must be displayed in a natural fashion, i.e., the information should be displayed to the same human sensory modalities, and in the same fashion, as if the person were actually at the remote site. A further fundamental assumption for the functional use of telepresence is that a sense of being present in the work environment will produce superior performance. In other words, that sense of being there would allow the human operator of a distant machine to take greater advantage of his or her considerable perceptual, cognitive, and motor capabilities in the performance of a task than would more limited task-related feedback. Finally, a third fundamental assumption of functional telepresence is that the distant machine under the operator's control must substantially resemble a human in dexterity.

  4. Helping Students to Recognize and Evaluate an Assumption in Quantitative Reasoning: A Basic Critical-Thinking Activity with Marbles and Electronic Balance

    ERIC Educational Resources Information Center

    Slisko, Josip; Cruz, Adrian Corona

    2013-01-01

    There is general agreement that critical thinking is an important element of 21st-century skills. Although critical thinking is a very complex and controversial conception, many would accept that the recognition and evaluation of assumptions is a basic critical-thinking process. When students use a simple mathematical model to reason quantitatively…

  5. Statistical foundations of liquid-crystal theory: I. Discrete systems of rod-like molecules.

    PubMed

    Seguin, Brian; Fried, Eliot

    2012-12-01

    We develop a mechanical theory for systems of rod-like particles. Central to our approach is the assumption that the external power expenditure for any subsystem of rods is independent of the underlying frame of reference. This assumption is used to derive the basic balance laws for forces and torques. By considering inertial forces on par with other forces, these laws hold relative to any frame of reference, inertial or noninertial. Finally, we introduce a simple set of constitutive relations to govern the interactions between rods and find restrictions necessary and sufficient for these laws to be consistent with thermodynamics. Our framework provides a foundation for a statistical mechanical derivation of the macroscopic balance laws governing liquid crystals.

  6. Factors Affecting Post-Service Wage Growth for Veterans

    DTIC Science & Technology

    1991-12-01

    Labor economics is primarily concerned with how employers and employees respond to changes in wages, prices, profits, and the non-pecuniary aspects...of the employment relationship [Ref. 4, p. 3]. Two of the basic assumptions underlying labor economics are that resources are scarce, and that people...Retirees' Post-Service Earnings and Employment, February 1981, Rand Corporation. 4. Ehrenberg, R. G. and Smith, R. S., Modern Labor Economics, 3rd Edition

  7. Causality and headache triggers

    PubMed Central

    Turner, Dana P.; Smitherman, Todd A.; Martin, Vincent T.; Penzien, Donald B.; Houle, Timothy T.

    2013-01-01

    Objective The objective of this study was to explore the conditions necessary to assign causal status to headache triggers. Background The term “headache trigger” is commonly used to label any stimulus that is assumed to cause headaches. However, the assumptions required for determining if a given stimulus in fact has a causal-type relationship in eliciting headaches have not been explicated. Methods Rubin’s Causal Model is synthesized and applied to the context of headache causes. From this application, the conditions necessary to infer that one event (a trigger) causes another (a headache) are outlined using basic assumptions and examples from the relevant literature. Results Although many conditions must be satisfied for a causal attribution, three basic assumptions are identified for determining causality in headache triggers: 1) constancy of the sufferer; 2) constancy of the trigger effect; and 3) constancy of the trigger presentation. A valid evaluation of a potential trigger’s effect can only be undertaken once these three basic assumptions are satisfied during formal or informal studies of headache triggers. Conclusions Evaluating these assumptions is extremely difficult or infeasible in clinical practice, and satisfying them during natural experimentation is unlikely. Researchers, practitioners, and headache sufferers are encouraged to avoid natural experimentation to determine the causal effects of headache triggers. Instead, formal experimental designs or retrospective diary studies using advanced statistical modeling techniques provide the best approaches to satisfy the required assumptions and inform causal statements about headache triggers. PMID:23534872

  8. A Markov chain model for reliability growth and decay

    NASA Technical Reports Server (NTRS)

    Siegrist, K.

    1982-01-01

    A mathematical model is developed to describe a complex system undergoing a sequence of trials in which there is interaction between the internal states of the system and the outcomes of the trials. For example, the model might describe a system undergoing testing that is redesigned after each failure. The basic assumptions for the model are that the state of the system after a trial depends probabilistically only on the state before the trial and on the outcome of the trial, and that the outcome of a trial depends probabilistically only on the state of the system before the trial. It is shown that under these basic assumptions, the successive states form a Markov chain and the successive states and outcomes jointly form a Markov chain. General results are obtained for the transition probabilities, steady-state distributions, etc. A special case studied in detail describes a system that has two possible states ('repaired' and 'unrepaired') undergoing trials that have three possible outcomes ('inherent failure', 'assignable-cause failure', and 'success'). For this model, the reliability function is computed explicitly and an optimal repair policy is obtained.
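
    The two-state, three-outcome special case can be sketched directly. The probabilities below are hypothetical placeholders (the paper derives general results rather than using these numbers), and the assumption encoded in next_state is that an assignable-cause failure triggers a redesign that leaves the system repaired:

```python
import numpy as np

# States: 0 = repaired, 1 = unrepaired.
# Outcomes: 0 = inherent failure, 1 = assignable-cause failure, 2 = success.
# Hypothetical outcome probabilities given the pre-trial state:
outcome_given_state = np.array([
    [0.05, 0.00, 0.95],   # repaired: no assignable causes remain
    [0.05, 0.20, 0.75],   # unrepaired
])

def next_state(state, outcome):
    # An assignable-cause failure leads to a repair; otherwise the state persists.
    return 0 if outcome == 1 else state

# State-to-state transition matrix of the induced Markov chain.
P = np.zeros((2, 2))
for s in range(2):
    for o in range(3):
        P[s, next_state(s, o)] += outcome_given_state[s, o]

# Steady-state distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

# Long-run reliability: steady-state probability of a successful trial.
reliability = float(pi @ outcome_given_state[:, 2])
```

    With these placeholder numbers the 'repaired' state is absorbing, so the chain converges there and the long-run per-trial reliability is 0.95; richer choices of outcome_given_state yield nontrivial steady states.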

  9. The Applied Behavior Analysis Research Paradigm and Single-Subject Designs in Adapted Physical Activity Research.

    PubMed

    Haegele, Justin A; Hodge, Samuel Russell

    2015-10-01

    There are basic philosophical and paradigmatic assumptions that guide scholarly research endeavors, including the methods used and the types of questions asked. Through this article, kinesiology faculty and students with interests in adapted physical activity are encouraged to understand the basic assumptions of applied behavior analysis (ABA) methodology for conducting, analyzing, and presenting research of high quality in this paradigm. The purposes of this viewpoint paper are to present information fundamental to understanding the assumptions undergirding research methodology in ABA, describe key aspects of single-subject research designs, and discuss common research designs and data-analysis strategies used in single-subject studies.

  10. The Robustness of the Studentized Range Statistic to Violations of the Normality and Homogeneity of Variance Assumptions.

    ERIC Educational Resources Information Center

    Ramseyer, Gary C.; Tcheng, Tse-Kia

    The present study was directed at determining the extent to which the Type I error rate is affected by violations of the basic assumptions of the q statistic. Monte Carlo methods were employed, and a variety of departures from the assumptions were examined. (Author)
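
    A minimal version of such a Monte Carlo check can be sketched as follows; the group count, sample size, and exponential alternative are illustrative choices, not the study's actual design:

```python
import numpy as np

rng = np.random.default_rng(0)
k, n, reps = 4, 10, 20_000   # groups, per-group size, Monte Carlo replicates

def q_stat(groups):
    # Studentized range: (max mean - min mean) / sqrt(pooled variance / n)
    means = groups.mean(axis=1)
    s2 = groups.var(axis=1, ddof=1).mean()   # pooled within-group variance
    return (means.max() - means.min()) / np.sqrt(s2 / n)

# 5% critical value under the null (normal data, equal means and variances).
null_q = np.array([q_stat(rng.standard_normal((k, n))) for _ in range(reps)])
crit = np.quantile(null_q, 0.95)

# Empirical Type I error when normality is violated (skewed data, equal means).
skew_q = np.array([q_stat(rng.exponential(size=(k, n))) for _ in range(reps)])
alpha_hat = (skew_q > crit).mean()
```

    An alpha_hat near the nominal 0.05 indicates robustness to this particular violation; repeating the experiment with unequal variances or unequal group sizes probes the homogeneity assumption instead.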

  11. Photometry and spectroscopy of a newly discovered polar - Nova Cygni 1975 (V1500 CYG)

    NASA Technical Reports Server (NTRS)

    Kaluzny, Janusz; Chlebowski, Tomasz

    1988-01-01

    The paper reports photometric and spectroscopic observations which led to the conclusion that Nova Cygni 1975 (V1500 Cyg) is a polar (of AM Her type). The CCD photometry confirms the constancy of the photometric period, which is again interpreted as an orbital cycle. The time-resolved MMT spectra make it possible to reconstruct, under several assumptions, the basic system parameters: M1 = 0.9 and M2 = 0.31 solar masses.

  12. The Nonlinear Dynamic Response of an Elastic-Plastic Thin Plate under Impulsive Loading,

    DTIC Science & Technology

    1987-06-11

    Among those numerical methods, the finite element method is the most effective one. The method presented in this paper is an "influence function" numerical...computational time is much less than the finite element method. Its precision is higher also. II. Basic Assumption and the Influence Function of a Simply...calculation. 2. The Influence Function of a Simply Supported Plate. The motion differential equation of a thin plate can be written as D∇⁴w + ρh ∂²w/∂t² = q(t) (1)

  13. Dark energy cosmology with tachyon field in teleparallel gravity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Motavalli, H., E-mail: Motavalli@Tabrizu.ac.ir; Akbarieh, A. Rezaei; Nasiry, M.

    2016-07-15

    We construct a tachyon teleparallel dark energy model for a homogeneous and isotropic flat universe in which the tachyon, as a non-canonical scalar field, is non-minimally coupled to gravity in the framework of teleparallel gravity. The explicit forms of the potential and coupling functions are obtained under the assumption that the Lagrangian admits a Noether symmetry. The dynamical behavior of the basic cosmological observables is compared with recent observational data, which implies that the tachyon field may serve as a candidate for dark energy.

  14. Alien plants confront expectations of climate change impacts.

    PubMed

    Hulme, Philip E

    2014-09-01

    The success of alien plants in novel environments questions basic assumptions about the fate of native species under climate change. Aliens generally spread faster than the velocity of climate change, display considerable phenotypic plasticity as well as adaptation to new selection pressures, and their ranges are often shaped by biotic rather than climatic factors. Given that many native species also exhibit these attributes, their risk of extinction as a result of climate change might be overestimated. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Can Basic Research on Children and Families Be Useful for the Policy Process?

    ERIC Educational Resources Information Center

    Moore, Kristin A.

    Based on the assumption that basic science is the crucial building block for technological and biomedical progress, this paper examines the relevance for public policy of basic demographic and behavioral sciences research on children and families. The characteristics of basic research as they apply to policy making are explored. First, basic…

  16. Shaping the use of psychotropic medicines in nursing homes: A qualitative study on organisational culture.

    PubMed

    Sawan, Mouna; Jeon, Yun-Hee; Chen, Timothy F

    2018-04-01

    Psychotropic medicines have limited efficacy in the management of behavioural and psychological disturbances, yet they are commonly used in nursing homes. Organisational culture is an important consideration influencing the use of psychotropic medicines. Schein's theory holds that organisational culture is underpinned by basic assumptions: the taken-for-granted beliefs driving organisational members' behaviour and practices. By exploring the basic assumptions of a culture we can find explanations for why psychotropic medicines are prescribed contrary to standards. A qualitative study guided by Schein's theory was conducted using semi-structured interviews with 40 staff representing a broad range of roles from eight nursing homes. Findings from the study suggest two basic assumptions influenced the use of psychotropic medicines: locus of control and the necessity for efficiency or comprehensiveness. Locus of control pertained to whether staff believed they could control decisions when facing negative work experiences. Necessity for efficiency or comprehensiveness concerned how much time and effort was spent on a given task. Participants arrived at decisions to use psychotropic medicines that were inconsistent with ideal standards when they believed they were helpless to do the right thing by the resident and felt it necessary to restrict the time spent on a given task. Basic assumptions tended to provide the rationale for staff to use psychotropic medicines even when such use was not compatible with standards. Organisational culture is an important factor that should be addressed to optimise psychotropic medicine use. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Maximization, learning, and economic behavior

    PubMed Central

    Erev, Ido; Roth, Alvin E.

    2014-01-01

    The rationality assumption that underlies mainstream economic theory has proved to be a useful approximation, despite the fact that systematic violations to its predictions can be found. That is, the assumption of rational behavior is useful in understanding the ways in which many successful economic institutions function, although it is also true that actual human behavior falls systematically short of perfect rationality. We consider a possible explanation of this apparent inconsistency, suggesting that mechanisms that rest on the rationality assumption are likely to be successful when they create an environment in which the behavior they try to facilitate leads to the best payoff for all agents on average, and most of the time. Review of basic learning research suggests that, under these conditions, people quickly learn to maximize expected return. This review also shows that there are many situations in which experience does not increase maximization. In many cases, experience leads people to underweight rare events. In addition, the current paper suggests that it is convenient to distinguish between two behavioral approaches to improve economic analyses. The first, and more conventional approach among behavioral economists and psychologists interested in judgment and decision making, highlights violations of the rational model and proposes descriptive models that capture these violations. The second approach studies human learning to clarify the conditions under which people quickly learn to maximize expected return. The current review highlights one set of conditions of this type and shows how the understanding of these conditions can facilitate market design. PMID:25024182

  18. Maximization, learning, and economic behavior.

    PubMed

    Erev, Ido; Roth, Alvin E

    2014-07-22

    The rationality assumption that underlies mainstream economic theory has proved to be a useful approximation, despite the fact that systematic violations to its predictions can be found. That is, the assumption of rational behavior is useful in understanding the ways in which many successful economic institutions function, although it is also true that actual human behavior falls systematically short of perfect rationality. We consider a possible explanation of this apparent inconsistency, suggesting that mechanisms that rest on the rationality assumption are likely to be successful when they create an environment in which the behavior they try to facilitate leads to the best payoff for all agents on average, and most of the time. Review of basic learning research suggests that, under these conditions, people quickly learn to maximize expected return. This review also shows that there are many situations in which experience does not increase maximization. In many cases, experience leads people to underweight rare events. In addition, the current paper suggests that it is convenient to distinguish between two behavioral approaches to improve economic analyses. The first, and more conventional approach among behavioral economists and psychologists interested in judgment and decision making, highlights violations of the rational model and proposes descriptive models that capture these violations. The second approach studies human learning to clarify the conditions under which people quickly learn to maximize expected return. The current review highlights one set of conditions of this type and shows how the understanding of these conditions can facilitate market design.

  19. PTSD as Meaning Violation: Testing a Cognitive Worldview Perspective.

    PubMed

    Park, Crystal L; Mills, Mary Alice; Edmondson, Donald

    2012-01-01

    The cognitive perspective on post-traumatic stress disorder (PTSD) has been successful in explaining many PTSD-related phenomena and in developing effective treatments, yet some of its basic assumptions remain surprisingly under-examined. The present study tested two of these assumptions: (1) situational appraisals of the event as violating global meaning (i.e., beliefs and goals) is related to PTSD symptomatology, and (2) the effect of situational appraisals of violation on PTSD symptomatology is mediated by global meaning (i.e., views of self and world). We tested these assumptions in a cross-sectional study of 130 college students who had experienced a DSM-IV level trauma. Structural equation modeling showed that appraisals of the extent to which the trauma violated one's beliefs and goals related fairly strongly to PTSD. In addition, the effects of appraisals of belief and goal violations on PTSD symptoms were fully mediated through negative global beliefs about both the self and the world. These findings support the cognitive worldview perspective, highlighting the importance of the meaning individuals assign to traumatic events, particularly the role of meaning violation.

  20. PTSD as Meaning Violation: Testing a Cognitive Worldview Perspective

    PubMed Central

    Park, Crystal L.; Mills, Mary Alice; Edmondson, Donald

    2014-01-01

    The cognitive perspective on post-traumatic stress disorder (PTSD) has been successful in explaining many PTSD-related phenomena and in developing effective treatments, yet some of its basic assumptions remain surprisingly under-examined. The present study tested two of these assumptions: (1) situational appraisals of the event as violating global meaning (i.e., beliefs and goals) is related to PTSD symptomatology, and (2) the effect of situational appraisals of violation on PTSD symptomatology is mediated by global meaning (i.e., views of self and world). We tested these assumptions in a cross-sectional study of 130 college students who had experienced a DSM-IV level trauma. Structural equation modeling showed that appraisals of the extent to which the trauma violated one’s beliefs and goals related fairly strongly to PTSD. In addition, the effects of appraisals of belief and goal violations on PTSD symptoms were fully mediated through negative global beliefs about both the self and the world. These findings support the cognitive worldview perspective, highlighting the importance of the meaning individuals assign to traumatic events, particularly the role of meaning violation. PMID:24860641

  1. Adaptive control: Myths and realities

    NASA Technical Reports Server (NTRS)

    Athans, M.; Valavani, L.

    1984-01-01

    It was found that all currently existing globally stable adaptive algorithms have three basic properties in common: positive realness of the error equation, square-integrability of the parameter adjustment law, and the need for sufficient excitation for asymptotic parameter convergence. Of the three, the first property is of primary importance since it satisfies a sufficient condition for stability of the overall system, which is a baseline design objective. The second property has been instrumental in the proof of asymptotic error convergence to zero, while the third addresses the issue of parameter convergence. Positive-real error dynamics can be generated only if the relative degree (excess of poles over zeros) of the process to be controlled is known exactly; this, in turn, implies perfect modeling. This and other assumptions, such as the absence of nonminimum-phase plant zeros, on which the mathematical arguments are based, do not necessarily reflect properties of real systems. As a result, it is natural to inquire what happens to the designs under less-than-ideal assumptions. The issues arising from violation of the exact modeling assumption, which is extremely restrictive in practice and impacts the most important system property, stability, are discussed.

  2. How biological background assumptions influence scientific risk evaluation of stacked genetically modified plants: an analysis of research hypotheses and argumentations.

    PubMed

    Rocca, Elena; Andersen, Fredrik

    2017-08-14

    Scientific risk evaluations are constructed by specific evidence, value judgements and biological background assumptions. The latter are the framework-setting suppositions we apply in order to understand some new phenomenon. That background assumptions co-determine choice of methodology, data interpretation, and choice of relevant evidence is an uncontroversial claim in modern basic science. Furthermore, it is commonly accepted that, unless explicated, disagreements in background assumptions can lead to misunderstanding as well as miscommunication. Here, we extend the discussion on background assumptions from basic science to the debate over genetically modified (GM) plants risk assessment. In this realm, while the different political, social and economic values are often mentioned, the identity and role of background assumptions at play are rarely examined. We use an example from the debate over risk assessment of stacked genetically modified plants (GM stacks), obtained by applying conventional breeding techniques to GM plants. There are two main regulatory practices of GM stacks: (i) regulate as conventional hybrids and (ii) regulate as new GM plants. We analyzed eight papers representative of these positions and found that, in all cases, additional premises are needed to reach the stated conclusions. We suggest that these premises play the role of biological background assumptions and argue that the most effective way toward a unified framework for risk analysis and regulation of GM stacks is by explicating and examining the biological background assumptions of each position. Once explicated, it is possible to either evaluate which background assumptions best reflect contemporary biological knowledge, or to apply Douglas' 'inductive risk' argument.

  3. Sampling Assumptions in Inductive Generalization

    ERIC Educational Resources Information Center

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…

  4. Issues in the economic evaluation of influenza vaccination by injection of healthy working adults in the US: a review and decision analysis of ten published studies.

    PubMed

    Hogan, Thomas J

    2012-05-01

    The objective was to review recent economic evaluations of influenza vaccination by injection in the US, assess their evidence, and draw conclusions from their collective findings. The literature was searched for economic evaluations of influenza vaccination by injection in healthy working adults in the US published since 1995. Ten evaluations described in nine papers were identified. These were synopsized and their results evaluated, the basic structure of all evaluations was ascertained, and the sensitivity of outcomes to changes in parameter values was explored using a decision model. Areas to improve economic evaluations were noted. Eight of nine evaluations with credible economic outcomes were favourable to vaccination, representing a statistically significant result compared with the proportion of 50% that would be expected if vaccination and no vaccination were economically equivalent. Evaluations shared a basic structure, but differed considerably with respect to cost components, assumptions, methods, and parameter estimates. Sensitivity analysis indicated that changes in parameter values within the feasible range, individually or simultaneously, could reverse economic outcomes. Given the stated misgivings, the methods of estimating the influenza reduction ascribed to vaccination must be researched to confirm that they produce accurate and reliable estimates. Research is also needed to improve estimates of the costs per case of influenza illness and the costs of vaccination. Based on their assumptions, the reviewed papers collectively appear to support the economic benefits of influenza vaccination of healthy adults. Yet the underlying assumptions, methods and parameter estimates themselves warrant further research to confirm they are accurate, reliable and appropriate for economic evaluation purposes.
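
    The shared basic structure such evaluations use can be caricatured as a one-line expected-cost comparison; all parameter values below are hypothetical and serve only to show how the outcome flips sign within plausible ranges:

```python
def net_benefit_per_person(attack_rate, effectiveness, cost_per_case, vacc_cost):
    """Averted illness cost minus vaccination cost, per vaccinated person."""
    return attack_rate * effectiveness * cost_per_case - vacc_cost

# Hypothetical parameter sets (not taken from the reviewed studies):
favourable = net_benefit_per_person(0.07, 0.70, 200.0, 8.0)    # positive
unfavourable = net_benefit_per_person(0.03, 0.50, 200.0, 8.0)  # negative
```

    Because the result is a small difference between two comparable quantities, modest changes in attack rate or vaccine effectiveness reverse the sign, which is the sensitivity behaviour the review reports.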

  5. Quantitative model of price diffusion and market friction based on trading as a mechanistic random process.

    PubMed

    Daniels, Marcus G; Farmer, J Doyne; Gillemot, László; Iori, Giulia; Smith, Eric

    2003-03-14

    We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.
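
    The flavor of the model can be sketched with a toy zero-intelligence order book; the event mix, uniform price band, and tick grid below are illustrative assumptions rather than the paper's calibrated construction:

```python
import numpy as np

rng = np.random.default_rng(0)

bids, asks = [49, 48], [51, 52]          # resting limit orders (price ticks)
events = ["limit", "market", "cancel"]
probs = [0.5, 0.25, 0.25]                # stand-ins for Poisson intensities
mid = []

for _ in range(10_000):
    ev = events[rng.choice(3, p=probs)]
    buy = rng.random() < 0.5
    if ev == "limit":                     # new order inside a 10-tick band
        if buy:
            bids.append(min(asks) - 1 - int(rng.integers(0, 10)))
        else:
            asks.append(max(bids) + 1 + int(rng.integers(0, 10)))
    elif ev == "market":                  # execute against the best quote
        if buy and len(asks) > 1:
            asks.remove(min(asks))
        elif not buy and len(bids) > 1:
            bids.remove(max(bids))
    else:                                 # cancel a random resting order
        if buy and len(bids) > 1:
            bids.pop(int(rng.integers(len(bids))))
        elif not buy and len(asks) > 1:
            asks.pop(int(rng.integers(len(asks))))
    mid.append((max(bids) + min(asks)) / 2)

mid = np.array(mid)
# Mean-squared displacement of the midprice at lags 1 and 10:
msd = {lag: float(np.mean((mid[lag:] - mid[:-lag]) ** 2)) for lag in (1, 10)}
```

    Even this stripped-down version produces diffusive midprice behaviour (mean-squared displacement growing with lag); the paper's dimensional analysis and mean-field theory tie the diffusion rate, spread, and price impact to the order-flow rates themselves.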

  6. Quantitative Model of Price Diffusion and Market Friction Based on Trading as a Mechanistic Random Process

    NASA Astrophysics Data System (ADS)

    Daniels, Marcus G.; Farmer, J. Doyne; Gillemot, László; Iori, Giulia; Smith, Eric

    2003-03-01

    We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.

  7. Belief Structures about People Held by Selected Graduate Students.

    ERIC Educational Resources Information Center

    Dole, Arthur A.; And Others

    Wrightsman has established that assumptions about human nature distinguish religious, occupational, political, gender, and other groups, and that they predict behavior in structured situations. Hjelle and Ziegler proposed a set of nine basic bipolar assumptions about the nature of people: freedom-determinism; rationality-irrationality;…

  8. The EPQ model under conditions of two levels of trade credit and limited storage capacity in supply chain management

    NASA Astrophysics Data System (ADS)

    Chung, Kun-Jen

    2013-09-01

    An inventory problem involves many factors influencing inventory decisions. The traditional economic production quantity (EPQ) model plays a rather important role in inventory analysis. Although traditional EPQ models are still widely used in industry, practitioners frequently question the validity of their assumptions, so their use encounters challenges and difficulties. This article therefore presents a new inventory model that considers two levels of trade credit, a finite replenishment rate, and limited storage capacity together, relaxing the basic assumptions of the traditional EPQ model to broaden its applicability. Keeping the cost-minimisation strategy in mind, four easy-to-use theorems are developed to characterise the optimal solution. Finally, sensitivity analyses are executed to investigate the effects of the various parameters on ordering policies and the annual total relevant costs of the inventory system.
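
    For reference, the traditional EPQ baseline that the paper relaxes balances setup cost against holding cost under a finite replenishment rate; the parameter values below are illustrative:

```python
from math import sqrt

D, P = 4000.0, 10_000.0   # annual demand and production rates (units/year)
K, h = 100.0, 2.0         # setup cost per run, holding cost per unit per year

def annual_cost(q):
    # setup cost + holding cost for a finite replenishment rate P > D
    return D * K / q + h * q * (1 - D / P) / 2

# Closed-form optimum of the classical EPQ model.
q_star = sqrt(2 * D * K / (h * (1 - D / P)))
```

    Here q_star ≈ 816.5 units; the paper's contribution is to characterise the optimum when trade-credit terms and a storage-capacity constraint make this closed form inapplicable.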

  9. Modeling intelligent adversaries for terrorism risk assessment: some necessary conditions for adversary models.

    PubMed

    Guikema, Seth

    2012-07-01

    Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. © 2011 Society for Risk Analysis.

  10. Zipf's word frequency law in natural language: a critical review and future directions.

    PubMed

    Piantadosi, Steven T

    2014-10-01

    The frequency distribution of words has been a key object of study in statistical linguistics for the past 70 years. This distribution approximately follows a simple mathematical form known as Zipf's law. This article first shows that human language has a highly complex, reliable structure in the frequency distribution over and above this classic law, although prior data visualization methods have obscured this fact. A number of empirical phenomena related to word frequencies are then reviewed. These facts are chosen to be informative about the mechanisms giving rise to Zipf's law and are then used to evaluate many of the theoretical explanations of Zipf's law in language. No prior account straightforwardly explains all the basic facts or is supported with independent evaluation of its underlying assumptions. To make progress at understanding why language obeys Zipf's law, studies must seek evidence beyond the law itself, testing assumptions and evaluating novel predictions with new, independent data.
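
    The rank-frequency methodology at issue can be sketched on synthetic data: drawing tokens from an exact Zipf distribution and recovering the exponent with a log-log fit (a sanity check of the method, not an analysis of real corpora):

```python
import numpy as np

rng = np.random.default_rng(0)

V, N = 1000, 200_000            # vocabulary size and token count
p = 1.0 / np.arange(1, V + 1)   # exact Zipf: p(r) proportional to 1/r
p /= p.sum()

tokens = rng.choice(V, size=N, p=p)
counts = np.sort(np.bincount(tokens, minlength=V))[::-1]   # rank-frequency
ranks = np.arange(1, V + 1)

mask = counts > 0
slope, intercept = np.polyfit(np.log(ranks[mask]), np.log(counts[mask]), 1)
# slope should recover the Zipf exponent of -1, up to sampling noise
```

    On real corpora, as the article shows, the interesting structure is precisely in the systematic deviations from this straight-line fit.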

  11. Thin Skin, Deep Damage: Addressing the Wounded Writer in the Basic Writing Course

    ERIC Educational Resources Information Center

    Boone, Stephanie D.

    2010-01-01

    How do institutions and their writing faculties see basic writers? What assumptions about these writers drive writing curricula, pedagogies and assessments? How do writing programs enable or frustrate these writers? How might course design facilitate the outcomes we envision? This article argues that, in order to teach basic writers to enter…

  12. Writing Partners: Service Learning as a Route to Authority for Basic Writers

    ERIC Educational Resources Information Center

    Gabor, Catherine

    2009-01-01

    This article looks at best practices in basic writing instruction in terms of non-traditional audiences and writerly authority. Much conventional wisdom discourages participation in service-learning projects for basic writers because of the assumption that their writing is not yet ready to "go public." Countering this line of thinking, the author…

  13. Introduction to the Application of Web-Based Surveys.

    ERIC Educational Resources Information Center

    Timmerman, Annemarie

    This paper discusses some basic assumptions and issues concerning web-based surveys. Discussion includes: assumptions regarding cost and ease of use; disadvantages of web-based surveys, concerning the inability to compensate for four common errors of survey research: coverage error, sampling error, measurement error and nonresponse error; and…

  14. School, Cultural Diversity, Multiculturalism, and Contact

    ERIC Educational Resources Information Center

    Pagani, Camilla; Robustelli, Francesco; Martinelli, Cristina

    2011-01-01

    The basic assumption of this paper is that school's potential to improve cross-cultural relations, as well as interpersonal relations in general, is enormous. This assumption is supported by a number of theoretical considerations and by the analysis of data we obtained from a study we conducted on the attitudes toward diversity and…

  15. Nuclear Reactions in Micro/Nano-Scale Metal Particles

    NASA Astrophysics Data System (ADS)

    Kim, Y. E.

    2013-03-01

Low-energy nuclear reactions in micro/nano-scale metal particles are described based on the theory of Bose-Einstein condensation nuclear fusion (BECNF). The BECNF theory rests on a single basic assumption capable of explaining the observed LENR phenomena: deuterons in metals undergo Bose-Einstein condensation. The BECNF theory is also a quantitative, predictive physical theory. Experimental tests of the basic assumption and theoretical predictions are proposed. Potential application to energy generation by ignition at low temperatures is described. The generalized BECNF theory is used to carry out theoretical analyses of recently reported experimental results for the hydrogen-nickel system.

  16. High Tech Educators Network Evaluation.

    ERIC Educational Resources Information Center

    O'Shea, Dan

    A process evaluation was conducted to assess the High Tech Educators Network's (HTEN's) activities. Four basic components to the evaluation approach were documentation review, program logic model, written survey, and participant interviews. The model mapped the basic goals and objectives, assumptions, activities, outcome expectations, and…

  17. Scale Dependence of Magnetic Helicity in the Solar Wind

    NASA Technical Reports Server (NTRS)

    Brandenburg, Axel; Subramanian, Kandaswamy; Balogh, Andre; Goldstein, Melvyn L.

    2011-01-01

We determine the magnetic helicity, along with the magnetic energy, at high latitudes using data from the Ulysses mission. The data set spans the time period from 1993 to 1996. The basic assumption of the analysis is that the solar wind is homogeneous. Because the solar wind speed is high, we follow the approach first pioneered by Matthaeus et al. by which, under the assumption of spatial homogeneity, one can use Fourier transforms of the magnetic field time series to construct one-dimensional spectra of the magnetic energy and magnetic helicity under the assumption that the Taylor frozen-in-flow hypothesis is valid. That is a well-satisfied assumption for the data used in this study. The magnetic helicity derives from the skew-symmetric terms of the three-dimensional magnetic correlation tensor, while the symmetric terms of the tensor are used to determine the magnetic energy spectrum. Our results show a sign change of magnetic helicity at wavenumber k ≈ 2 AU^-1 (or frequency ν ≈ 2 µHz) at distances below 2.8 AU and at k ≈ 30 AU^-1 (or ν ≈ 25 µHz) at larger distances. At small scales the magnetic helicity is positive at northern heliographic latitudes and negative at southern latitudes. The positive magnetic helicity at small scales is argued to be the result of turbulent diffusion reversing the sign relative to what is seen at small scales at the solar surface. Furthermore, the magnetic helicity declines toward solar minimum in 1996. The magnetic helicity flux integrated separately over one hemisphere amounts to about 10^45 Mx^2 cycle^-1 at large scales and to a three times lower value at smaller scales.

  18. Validation of the underlying assumptions of the quality-adjusted life-years outcome: results from the ECHOUTCOME European project.

    PubMed

    Beresniak, Ariel; Medina-Lara, Antonieta; Auray, Jean Paul; De Wever, Alain; Praet, Jean-Claude; Tarricone, Rosanna; Torbica, Aleksandra; Dupont, Danielle; Lamure, Michel; Duru, Gerard

    2015-01-01

Quality-adjusted life-years (QALYs) have been used since the 1980s as a standard health outcome measure for conducting cost-utility analyses, which are often inadequately labeled as 'cost-effectiveness analyses'. This synthetic outcome, which combines the quantity of life lived with its quality expressed as a preference score, is currently recommended as the reference case by some health technology assessment (HTA) agencies. While critics of the QALY approach have expressed concerns about equity and ethical issues, surprisingly, very few have tested the basic methodological assumptions supporting the QALY equation so as to establish its scientific validity. The main objective of the ECHOUTCOME European project was to test the validity of the underlying assumptions of the QALY outcome and its relevance in health decision making. An experiment was conducted with 1,361 subjects from Belgium, France, Italy, and the UK. The subjects were asked to express their preferences regarding various hypothetical health states, derived from combining different health states with time durations, in order to compare the observed utility values of the (health state, time) pairs with the utility values calculated using the QALY formula. Observed and calculated utility values of the (health state, time) pairs were significantly different, confirming that preferences expressed by the respondents were not consistent with the QALY theoretical assumptions. This European study contributes to establishing that the QALY multiplicative model is an invalid measure. This explains why cost/QALY estimates may vary greatly, leading to inconsistent recommendations relevant to providing access to innovative medicines and health technologies. HTA agencies should consider other, more robust methodological approaches to guide reimbursement decisions.
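The multiplicative QALY model under test can be stated compactly: the utility of spending time t in health state h is U(h, t) = u(h) × t, where u(h) is a preference weight in [0, 1]. The numeric values below are invented for illustration and are not the study's data.

```python
# Hedged sketch of the QALY multiplicative model (linear-in-time assumption).
def qaly(utility_weight, years):
    """Calculated utility of a (health state, time) pair under the QALY model."""
    return utility_weight * years

# Under the model, 10 years at weight 0.5 equals 5 years in full health:
calculated = qaly(0.5, 10)
assert calculated == qaly(1.0, 5)

# The ECHOUTCOME experiment compared such calculated values against directly
# elicited ("observed") preferences for the same pairs; a systematic mismatch
# (the hypothetical value below is illustrative) is evidence against the model.
observed = 4.2
print(f"calculated={calculated}, observed={observed}")
```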

  19. On Cognitive Constraints and Learning Progressions: The Case of "Structure of Matter"

    ERIC Educational Resources Information Center

    Talanquer, Vicente

    2009-01-01

    Based on the analysis of available research on students' alternative conceptions about the particulate nature of matter, we identified basic implicit assumptions that seem to constrain students' ideas and reasoning on this topic at various learning stages. Although many of these assumptions are interrelated, some of them seem to change or…

  20. Rationality as the Basic Assumption in Explaining Japanese (or Any Other) Business Culture.

    ERIC Educational Resources Information Center

    Koike, Shohei

    Economic analysis, with its explicit assumption that people are rational, is applied to the Japanese and American business cultures to illustrate how the approach is useful for understanding cultural differences. Specifically, differences in cooperative behavior among Japanese and American workers are examined. Economic analysis goes beyond simple…

  1. Standardization of Selected Semantic Differential Scales with Secondary School Children.

    ERIC Educational Resources Information Center

    Evans, G. T.

    A basic assumption of this study is that the meaning continuum registered by an adjective pair remains relatively constant over a large universe of concepts and over subjects within a relatively homogeneous population. An attempt was made to validate this assumption by showing the invariance of the factor structure across different types of…

  2. What's Love Got to Do with It? Rethinking Common Sense Assumptions

    ERIC Educational Resources Information Center

    Trachman, Matthew; Bluestone, Cheryl

    2005-01-01

    One of the most basic tasks in introductory social science classes is to get students to reexamine their common sense assumptions concerning human behavior. This article introduces a shared assignment developed for a learning community that paired an introductory sociology and psychology class. The assignment challenges students to rethink the…

  3. General solutions for the oxidation kinetics of polymers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, K.T.; Clough, R.L.; Wise, J.

    1996-08-01

The simplest general kinetic schemes applicable to the oxidation of polymers are presented, discussed and analyzed in terms of the underlying kinetic assumptions. For the classic basic autoxidation scheme (BAS), which involves three bimolecular termination steps and is applicable mainly to unstabilized polymers, typical assumptions used singly or in groups include (1) long kinetic chain length, (2) a specific ratio of the termination rate constants and (3) insensitivity to the oxygen concentration (e.g., domination by a single termination step). Steady-state solutions for the rate of oxidation are given in terms of one, two, three, or four parameters, corresponding respectively to three, two, one, or zero kinetic assumptions. The recently derived four-parameter solution predicts conditions yielding unusual dependencies of the oxidation rate on oxygen concentration and on initiation rate, as well as conditions leading to some unusual diffusion-limited oxidation profile shapes. For stabilized polymers, unimolecular termination schemes are typically more appropriate than bimolecular. Kinetics incorporating unimolecular termination reactions are shown to result in very simple oxidation expressions which have been experimentally verified for both radiation-initiated oxidation of an EPDM and thermoxidative degradation of nitrile and chloroprene elastomers.

  4. Global dilemmas and the plausibility of whole-system change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harman, W.W.

    1995-05-01

Approaching the global dilemmas of our time with whole-system thinking implies that the much-talked-about problems of environmental degradation, deforestation, desertification, man-made climate change, chronic hunger and poverty, etc. are not so much problems as symptoms of a deeper-level condition that must be dealt with. This has to do with the basic incompatibility between widely proclaimed goals and underlying system assumptions. Pressures toward whole-system change are increasing in intensity. The critical issue is whether that change can be smooth and nondisruptive, or whether it will involve some disintegration of present structures. Constructive interventions are discussed.

  5. A test of the hierarchical model of litter decomposition.

    PubMed

    Bradford, Mark A; Veen, G F Ciska; Bonis, Anne; Bradford, Ella M; Classen, Aimee T; Cornelissen, J Hans C; Crowther, Thomas W; De Long, Jonathan R; Freschet, Gregoire T; Kardol, Paul; Manrubia-Freixa, Marta; Maynard, Daniel S; Newman, Gregory S; Logtestijn, Richard S P; Viketoft, Maria; Wardle, David A; Wieder, William R; Wood, Stephen A; van der Putten, Wim H

    2017-12-01

    Our basic understanding of plant litter decomposition informs the assumptions underlying widely applied soil biogeochemical models, including those embedded in Earth system models. Confidence in projected carbon cycle-climate feedbacks therefore depends on accurate knowledge about the controls regulating the rate at which plant biomass is decomposed into products such as CO 2 . Here we test underlying assumptions of the dominant conceptual model of litter decomposition. The model posits that a primary control on the rate of decomposition at regional to global scales is climate (temperature and moisture), with the controlling effects of decomposers negligible at such broad spatial scales. Using a regional-scale litter decomposition experiment at six sites spanning from northern Sweden to southern France-and capturing both within and among site variation in putative controls-we find that contrary to predictions from the hierarchical model, decomposer (microbial) biomass strongly regulates decomposition at regional scales. Furthermore, the size of the microbial biomass dictates the absolute change in decomposition rates with changing climate variables. Our findings suggest the need for revision of the hierarchical model, with decomposers acting as both local- and broad-scale controls on litter decomposition rates, necessitating their explicit consideration in global biogeochemical models.

  6. Intellectualizing Adult Basic Literacy Education: A Case Study

    ERIC Educational Resources Information Center

    Bradbury, Kelly S.

    2012-01-01

    At a time when accusations of American ignorance and anti-intellectualism are ubiquitous, this article challenges problematic assumptions about intellectualism that overlook the work of adult basic literacy programs and proposes an expanded view of intellectualism. It is important to recognize and to challenge narrow views of intellectualism…

  7. Adult Literacy Programs: Guidelines for Effectiveness.

    ERIC Educational Resources Information Center

    Lord, Jerome E.

    This report is a summary of information from both research and experience about the assumptions and practices that guide successful basic skills programs. The 31 guidelines are basic to building a solid foundation on which effective instructional programs for adults can be developed. The first six guidelines address some important characteristics…

  8. Social Studies Curriculum Guidelines.

    ERIC Educational Resources Information Center

    Manson, Gary; And Others

    These guidelines, which set standards for social studies programs K-12, can be used to update existing programs or may serve as a baseline for further innovation. The first section, "A Basic Rationale for Social Studies Education," identifies the theoretical assumptions basic to the guidelines as knowledge, thinking, valuing, social participation,…

  9. An uncertainty analysis of the flood-stage upstream from a bridge.

    PubMed

    Sowiński, M

    2006-01-01

The paper begins with the formulation of the problem in the form of a general performance function. Next, the Latin hypercube sampling (LHS) technique, a modified version of the Monte Carlo method, is briefly described. The uncertainty analysis of the flood-stage upstream from a bridge itself starts with a description of the hydraulic model. This model concept is based on the HEC-RAS model developed for subcritical flow under a bridge without piers, in which the energy equation is applied. The next section characterizes the basic variables, including a specification of their statistics (means and variances). Next, the problem of correlated variables is discussed and assumptions concerning correlation among the basic variables are formulated. The analysis of results is based on LHS ranking lists obtained from the computer package UNCSAM. Results for two examples are given: one for independent and the other for correlated variables.
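The core LHS idea the abstract mentions can be sketched briefly: each variable's [0, 1) range is split into n equal strata, one sample is drawn per stratum, and the stratum orderings are shuffled independently per variable. This is a generic illustration under an independence assumption, not the UNCSAM implementation.

```python
# Minimal Latin hypercube sampling sketch (pure Python, uniform [0,1) margins).
import random

def lhs(n_samples, n_vars, seed=42):
    rng = random.Random(seed)
    cols = []
    for _ in range(n_vars):
        # one point per stratum [i/n, (i+1)/n), then shuffle the stratum order
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        cols.append(col)
    # transpose: one row per sample point
    return list(zip(*cols))

pts = lhs(5, 2)
for p in pts:
    print(p)
```

The defining property is that each variable's column covers every stratum exactly once, which gives better marginal coverage than plain Monte Carlo for the same sample count.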

  10. Basic lubrication equations

    NASA Technical Reports Server (NTRS)

    Hamrock, B. J.; Dowson, D.

    1981-01-01

Lubricants, usually Newtonian fluids, are assumed to experience laminar flow. The basic equations used to describe the flow are the Navier-Stokes equations of motion. The study of hydrodynamic lubrication is, from a mathematical standpoint, the application of a reduced form of these Navier-Stokes equations in association with the continuity equation. The Reynolds equation can also be derived from first principles, provided of course that the same basic assumptions are adopted in each case. Both methods are used in deriving the Reynolds equation, and the assumptions inherent in reducing the Navier-Stokes equations are specified. Because the Reynolds equation contains viscosity and density terms, and these properties depend on temperature and pressure, it is often necessary to couple the Reynolds equation with the energy equation. The lubricant properties and the energy equation are presented. Film thickness, a parameter of the Reynolds equation, is a function of the elastic behavior of the bearing surface. The governing elasticity equation is therefore presented.
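For reference, one common textbook form of the Reynolds equation described above is shown below, with p the pressure, h the film thickness, η the viscosity, ρ the density, and u_m the mean velocity of the two surfaces; exact notation varies between texts, so this should be read as a representative form rather than the report's own statement.

```latex
\frac{\partial}{\partial x}\!\left(\frac{\rho h^{3}}{12\eta}\,\frac{\partial p}{\partial x}\right)
+ \frac{\partial}{\partial y}\!\left(\frac{\rho h^{3}}{12\eta}\,\frac{\partial p}{\partial y}\right)
= u_{m}\,\frac{\partial (\rho h)}{\partial x} + \frac{\partial (\rho h)}{\partial t},
\qquad u_{m} = \frac{u_{a}+u_{b}}{2}
```

The left side describes pressure-driven (Poiseuille) flow; the right side collects the wedge and squeeze-film terms, which is where the density and viscosity dependence on temperature and pressure enters the coupling with the energy equation.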

  11. Study on low intensity aeration oxygenation model and optimization for shallow water

    NASA Astrophysics Data System (ADS)

    Chen, Xiao; Ding, Zhibin; Ding, Jian; Wang, Yi

    2018-02-01

Aeration/oxygenation is an effective measure to improve the self-purification capacity in shallow water treatment, but high energy consumption, high noise, and expensive management have restrained the development and application of this process. Based on two-film theory, a theoretical model of the three-dimensional partial differential equation of aeration in shallow water is established. In order to simplify the equation, basic assumptions of gas-liquid mass transfer in the vertical direction and concentration diffusion in the horizontal direction are proposed based on engineering practice and are tested against simulated gas holdup obtained by modeling the gas-liquid two-phase flow in an aeration tank under low-intensity conditions. Based on these assumptions and the theory of shallow permeability, the three-dimensional partial differential equation model is simplified and a calculation model of low-intensity aeration oxygenation is obtained. The model is verified by comparison with aeration experiments. The conclusions are as follows: (1) the calculation model of gas-liquid mass transfer in the vertical direction and concentration diffusion in the horizontal direction reflects the aeration process well; (2) under low-intensity conditions, long-term aeration and oxygenation is theoretically feasible for enhancing the self-purification capacity of water bodies; (3) for the same total aeration intensity, multipoint distributed aeration has a marked effect on the diffusion of oxygen concentration in the horizontal direction; (4) in shallow water treatment, reducing the size of aeration equipment through miniaturized, arrayed, low-intensity, and mobile designs can overcome the problems of high energy consumption, large size, and noise, and can provide a useful reference.
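The vertical gas-liquid mass-transfer step in the two-film theory the abstract invokes is commonly written as dC/dt = kLa (Cs - C), where Cs is the saturation dissolved-oxygen concentration and kLa the volumetric mass-transfer coefficient. The sketch below integrates this standard relation with invented parameter values; it is an illustration of the mass-transfer mechanism, not the paper's three-dimensional model.

```python
# Hedged sketch: explicit Euler integration of dC/dt = kLa * (Cs - C).
kLa = 0.15      # 1/h, volumetric mass-transfer coefficient (assumed)
Cs = 9.0        # mg/L, saturation dissolved-oxygen concentration (assumed)
C = 2.0         # mg/L, initial dissolved-oxygen concentration (assumed)
dt = 0.01       # h, time step

for _ in range(int(24 / dt)):          # simulate 24 h of low-intensity aeration
    C += dt * kLa * (Cs - C)           # Euler step toward saturation

print(f"dissolved oxygen after 24 h: {C:.2f} mg/L")
```

The exponential approach to Cs is what makes long-duration, low-intensity aeration plausible: the driving force (Cs - C) stays positive as long as the water is undersaturated.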

  12. Zipf’s word frequency law in natural language: A critical review and future directions

    PubMed Central

    2014-01-01

The frequency distribution of words has been a key object of study in statistical linguistics for the past 70 years. This distribution approximately follows a simple mathematical form known as Zipf's law. This article first shows that human language has a highly complex, reliable structure in the frequency distribution over and above this classic law, although prior data visualization methods have obscured this fact. A number of empirical phenomena related to word frequencies are then reviewed. These facts are chosen to be informative about the mechanisms giving rise to Zipf's law and are then used to evaluate many of the theoretical explanations of Zipf's law in language. No prior account straightforwardly explains all the basic facts or is supported with independent evaluation of its underlying assumptions. To make progress at understanding why language obeys Zipf's law, studies must seek evidence beyond the law itself, testing assumptions and evaluating novel predictions with new, independent data. PMID:24664880

  13. What Is This Substance? What Makes It Different? Mapping Progression in Students' Assumptions about Chemical Identity

    ERIC Educational Resources Information Center

    Ngai, Courtney; Sevian, Hannah; Talanquer, Vicente

    2014-01-01

    Given the diversity of materials in our surroundings, one should expect scientifically literate citizens to have a basic understanding of the core ideas and practices used to analyze chemical substances. In this article, we use the term 'chemical identity' to encapsulate the assumptions, knowledge, and practices upon which chemical…

  14. Science Awareness and Science Literacy through the Basic Physics Course: Physics with a bit of Metaphysics?

    NASA Astrophysics Data System (ADS)

    Rusli, Aloysius

    2016-08-01

Until the 1980s, it was standard practice in Indonesian Basic Physics courses to present physics through its effective technicalities: the ideally elastic spring, the pulley and moving blocks, the thermodynamics of ideal engine models, theoretical electrostatics and electrodynamics with model capacitors and inductors, wave behavior and its various superpositions, hopefully closed with a description of modern physics. A different approach was then also experimented with, using the Hobson and Moore texts, stressing the alternative aim of fostering awareness, not just mastery, of science and the scientific method. This is hypothesized to be more in line with the changed attitude of the so-called Millennials cohort, who are less attentive, if not less interested, and more used to multi-tasking, which suits their shorter span of attention. The upside is increased awareness of science and the scientific method. The downside is that students get less experience of the scientific method, which bases itself intensely on critical observation, analytic thinking to set up conclusions or hypotheses, and checking the consistency of hypotheses with measured data. Another aspect is the recognition that the human person encompasses both a reasoning capacity and a mental-spiritual-cultural capacity. This is considered essential as the world grows ever smaller due to increased communication capacity, causing strong interactions and nonlinear effects, and showing that value systems become more challenging and challenged due to physics/science and its cosmology, which is successfully based on the scientific method. Students should therefore be made aware of the common basis of these two capacities: their assumptions, the reasoning capacity, and the consistency assumption. This shows that the limits of science are its set of basic quantifiable assumptions, and the limits of the mental-spiritual-cultural aspects of life are their set of basic metaphysical (non-quantifiable) assumptions.
The bridging between these two human aspects of life can lead to a “why” of science and a “meaning” of life. A progress report on these efforts is presented, consisting essentially of results indicated by an extended format of the usual weekly reporting used previously in Basic Physics lectures.

  15. Genital Measures: Comments on Their Role in Understanding Human Sexuality

    ERIC Educational Resources Information Center

    Geer, James H.

    1976-01-01

    This paper discusses the use of genital measures in the study of both applied and basic work in human sexuality. Some of the advantages of psychophysiological measures are considered along with cautions concerning unwarranted assumptions. Some of the advances that are possible in both applied and basic work are examined. (Author)

  16. Disease Extinction Versus Persistence in Discrete-Time Epidemic Models.

    PubMed

    van den Driessche, P; Yakubu, Abdul-Aziz

    2018-04-12

We focus on discrete-time infectious disease models in populations that are governed by constant, geometric, Beverton-Holt or Ricker demographic equations, and give a method for computing the basic reproduction number, R0. When R0 < 1 and the demographic population dynamics are asymptotically constant or under geometric growth (non-oscillatory), we prove global asymptotic stability of the disease-free equilibrium of the disease models. Under the same demographic assumption, when R0 > 1, we prove uniform persistence of the disease. We apply our theoretical results to specific discrete-time epidemic models that are formulated for SEIR infections, cholera in humans and anthrax in animals. Our simulations show that a unique endemic equilibrium of each of the three specific disease models is asymptotically stable whenever R0 > 1.
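The threshold behavior at R0 = 1 can be illustrated with a much simpler discrete-time model than the paper's: a constant-population SIS map with per-step transmission factor beta and recovery fraction gamma, for which R0 = beta/gamma. All parameter values here are invented for illustration; this is not one of the paper's SEIR, cholera, or anthrax models.

```python
# Hedged sketch: discrete-time SIS map showing extinction when R0 < 1
# and persistence when R0 > 1 (R0 = beta / gamma).
def simulate_sis(beta, gamma, N=1000.0, I0=10.0, steps=500):
    S, I = N - I0, I0
    for _ in range(steps):
        new_inf = beta * S * I / N     # new infections this step
        new_rec = gamma * I            # recoveries this step
        S, I = S - new_inf + new_rec, I + new_inf - new_rec
    return I

I_sub = simulate_sis(beta=0.1, gamma=0.2)    # R0 = 0.5: disease dies out
I_super = simulate_sis(beta=0.4, gamma=0.2)  # R0 = 2.0: disease persists
print(I_sub, I_super)
```

With R0 > 1 the map settles toward the endemic level S* = N·gamma/beta, mirroring the dichotomy the paper proves for its more general demographic assumptions.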

  17. An assessment of the impact of transition on advanced winged entry vehicle thermal protection system mass

    NASA Technical Reports Server (NTRS)

    Wurster, K. E.

    1981-01-01

    This study examines the impact of turbulent heating on thermal protection system (TPS) mass for advanced winged entry vehicles. Four basic systems are considered: insulative, metallic hot structures, metallic standoff, and hybrid systems. TPS sizings are performed using entry trajectories tailored specifically to the characteristics of each TPS concept under consideration. Comparisons are made between systems previously sized under the assumption of all laminar heating and those sized using a baseline estimate of transition and turbulent heating. The relative effect of different transition criteria on TPS mass requirements is also examined. Also investigated are entry trajectories tailored to alleviate turbulent heating. Results indicate the significant impact of turbulent heating on TPS mass and demonstrate the importance of both accurate transition criteria and entry trajectory tailoring.

  18. Approach/Avoidance Orientations Affect Self-Construal and Identification with In-group

    PubMed Central

    Nussinson, Ravit; Häfner, Michael; Seibt, Beate; Strack, Fritz; Trope, Yaacov

    2011-01-01

    Approach and avoidance are two basic motivational orientations. Their activation influences cognitive and perceptive processes: Previous work suggests that an approach orientation instigates a focus on larger units as compared to avoidance. Study 1 confirms this assumption using a paradigm that more directly taps a person’s tendency to represent objects as belonging to small or large units than prior studies. It was further predicted that the self should also be represented as belonging to larger units, and hence be more interdependent under approach than under avoidance. Study 2 supports this prediction. As a consequence of this focus on belonging to larger units, it was finally predicted that approach results in a stronger identification with one’s in-group than avoidance. Studies 3 and 4 support that prediction. PMID:22844229

  19. 39 Questionable Assumptions in Modern Physics

    NASA Astrophysics Data System (ADS)

    Volk, Greg

    2009-03-01

    The growing body of anomalies in new energy, low energy nuclear reactions, astrophysics, atomic physics, and entanglement, combined with the failure of the Standard Model and string theory to predict many of the most basic fundamental phenomena, all point to a need for major new paradigms. Not Band-Aids, but revolutionary new ways of conceptualizing physics, in the spirit of Thomas Kuhn's The Structure of Scientific Revolutions. This paper identifies a number of long-held, but unproven assumptions currently being challenged by an increasing number of alternative scientists. Two common themes, both with venerable histories, keep recurring in the many alternative theories being proposed: (1) Mach's Principle, and (2) toroidal, vortex particles. Matter-based Mach's Principle differs from both space-based universal frames and observer-based Einsteinian relativity. Toroidal particles, in addition to explaining electron spin and the fundamental constants, satisfy the basic requirement of Gauss's misunderstood B Law, that motion itself circulates. Though a comprehensive theory is beyond the scope of this paper, it will suggest alternatives to the long list of assumptions in context.

  20. Chromatographic multivariate quality control of pharmaceuticals giving strongly overlapped peaks based on the chromatogram profile.

    PubMed

    Escuder-Gilabert, L; Ruiz-Roch, D; Villanueva-Camañas, R M; Medina-Hernández, M J; Sagrado, S

    2004-03-12

In the present paper, the simultaneous quantification of two analytes showing strongly overlapped chromatographic peaks (alpha = 1.02) is studied, under the assumption that both the available equipment and the training of the laboratory staff are basic. A pharmaceutical preparation (Mutabase) containing two drugs of similar physicochemical properties (amitriptyline and perphenazine) is selected as a case study. The assays are carried out under realistic working conditions (i.e. routine testing laboratories). Uncertainty considerations are introduced in the study. A partial least squares model is applied directly to the chromatographic data (with no previous signal transformation) to perform quality control of the pharmaceutical formulation. Under the adequate protocol, the relative error in prediction of the analytes is within the tolerances found in the pharmacopeia (10%). For spiked samples simulating formulation mistakes, the errors found have the same magnitude and sign as the errors introduced.
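The idea of quantifying co-eluting analytes from the whole chromatogram profile can be illustrated with a simpler stand-in for the paper's partial least squares model: plain least-squares unmixing of two known peak profiles. The Gaussian peaks, retention times, and concentrations below are all invented for illustration.

```python
# Hedged sketch: unmix two strongly overlapped (simulated) chromatographic
# peaks by solving the 2x2 normal equations for the concentrations.
import math

def gauss(t, center, width):
    return math.exp(-((t - center) / width) ** 2)

ts = [i * 0.1 for i in range(100)]
p1 = [gauss(t, 5.0, 0.8) for t in ts]   # analyte 1 profile (assumed)
p2 = [gauss(t, 5.4, 0.8) for t in ts]   # analyte 2, strongly overlapped

c1_true, c2_true = 2.0, 0.5
mix = [c1_true * a + c2_true * b for a, b in zip(p1, p2)]  # noiseless mixture

a11 = sum(x * x for x in p1)
a12 = sum(x * y for x, y in zip(p1, p2))
a22 = sum(y * y for y in p2)
b1 = sum(x * m for x, m in zip(p1, mix))
b2 = sum(y * m for y, m in zip(p2, mix))
det = a11 * a22 - a12 * a12
c1 = (b1 * a22 - b2 * a12) / det
c2 = (b2 * a11 - b1 * a12) / det
print(f"recovered c1={c1:.3f}, c2={c2:.3f}")
```

In this noiseless sketch the recovery is exact; the paper's PLS approach handles the realistic case where noise, uncertainty, and unmodeled profile variation make direct inversion unreliable.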

  1. Density functional computational studies on the glucose and glycine Maillard reaction: Formation of the Amadori rearrangement products

    NASA Astrophysics Data System (ADS)

    Jalbout, Abraham F.; Roy, Amlan K.; Shipar, Abul Haider; Ahmed, M. Samsuddin

Theoretical energy changes of various intermediates leading to the formation of the Amadori rearrangement products (ARPs) under different mechanistic assumptions have been calculated, using open-chain glucose (O-Glu)/closed-chain glucose (A-Glu and B-Glu) and glycine (Gly) as a model for the Maillard reaction. Density functional theory (DFT) computations have been applied to the proposed mechanisms under different pH conditions. Thus, the possibility of the formation of different compounds and the electronic energy changes for the different steps in the proposed mechanisms have been evaluated. B-Glu has been found to be more efficient than A-Glu, and A-Glu more efficient than O-Glu, in the reaction. The reaction under basic conditions is the most favorable for the formation of ARPs. Other reaction pathways have been computed and discussed in this work.

  2. A radiosity-based model to compute the radiation transfer of soil surface

    NASA Astrophysics Data System (ADS)

    Zhao, Feng; Li, Yuguang

    2011-11-01

A good understanding of the interactions of electromagnetic radiation with the soil surface is important for further improvement of remote sensing methods. In this paper, a radiosity-based analytical model for soil Directional Reflectance Factor (DRF) distributions was developed and evaluated. The model was specifically dedicated to the study of radiation transfer for soil surfaces under tillage practices. The soil was abstracted as two-dimensional U-shaped or V-shaped geometric structures with periodic macroscopic variations. The roughness of the simulated surfaces was expressed as the ratio of the height to the width of the U- and V-shaped structures. The assumption was made that the shadowing of the soil surface, simulated by U- or V-shaped grooves, has a greater influence on the soil reflectance distribution than the scattering properties of the basic soil particles of silt and clay. Another assumption was that the soil is a perfectly diffuse reflector at a microscopic level, which is a prerequisite for the application of the radiosity method. This radiosity-based analytical model was evaluated against a forward Monte Carlo ray-tracing model under the same structural scenes and identical spectral parameters. The statistics of the two models' BRF fitting results for several soil structures under the same conditions showed good agreement. Using the model, the physical mechanism of the soil bidirectional reflectance pattern was revealed.
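The radiosity method the abstract relies on solves, for each Lambertian patch i, the balance B_i = E_i + rho_i · sum_j F_ij · B_j. The two-patch scene below (emissions, reflectances, and form factors all invented) sketches that fixed-point solve; it is a generic illustration, not the paper's soil geometry.

```python
# Hedged sketch: solve the radiosity equations for a toy two-patch scene
# by fixed-point (Jacobi) iteration.
E = [10.0, 0.0]          # emitted flux: patch 0 is the source (assumed)
rho = [0.3, 0.6]         # diffuse reflectances (assumed)
F = [[0.0, 0.4],         # form factors F[i][j]: fraction of light leaving
     [0.4, 0.0]]         # patch i that reaches patch j (assumed)

B = E[:]                 # initial guess: radiosity = emission only
for _ in range(100):
    B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(2))
         for i in range(2)]

print([round(b, 4) for b in B])
```

Because the reflectances and form factors are below 1, the iteration is a contraction and converges quickly; for the paper's grooved surfaces the same system is just written over many more patches along the U or V profile.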

  3. Actin-based propulsion of a microswimmer.

    PubMed

    Leshansky, A M

    2006-07-01

    A simple hydrodynamic model of actin-based propulsion of microparticles in dilute cell-free cytoplasmic extracts is presented. Under the basic assumption that actin polymerization at the particle surface acts as a force dipole, pushing apart the load and the free (nonanchored) actin tail, the propulsive velocity of the microparticle is determined as a function of the tail length, porosity, and particle shape. The anticipated velocities of the cargo displacement and the rearward motion of the tail are in good agreement with recently reported results of biomimetic experiments. A more detailed analysis of the particle-tail hydrodynamic interaction is presented and compared to the prediction of the simplified model.
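
    The scale of the predicted propulsion can be caricatured with a bare Stokes-drag balance. All numbers are hypothetical (a ~10 pN polymerization force, a 1 um bead, an assumed extract viscosity of 1 Pa s); the paper's model additionally accounts for tail length, porosity, and particle shape:

```python
import math

def particle_velocity(force_pN, radius_um, viscosity_Pa_s=1.0):
    """Speed of a sphere pushed through a viscous fluid, from the
    Stokes-drag balance U = F / (6 * pi * mu * a).
    """
    F = force_pN * 1e-12   # pN  -> N
    a = radius_um * 1e-6   # um  -> m
    U = F / (6 * math.pi * viscosity_Pa_s * a)
    return U * 1e6         # m/s -> um/s

print(round(particle_velocity(10, 1.0), 2), "um/s")
```

    With these assumed numbers, the speed comes out at a fraction of a micron per second, the same order as the bead displacements reported in biomimetic motility assays.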

  4. The basic aerodynamics of floatation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davies, M.J.; Wood, D.H.

    1983-09-01

    The original derivation of the basic theory governing the aerodynamics of both hovercraft and modern floatation ovens requires the validity of some extremely crude assumptions. However, the basic theory is surprisingly accurate. It is shown that this accuracy occurs because the final expression of the basic theory can be derived by approximating the full Navier-Stokes equations in a manner that clearly shows the limitations of the theory. These limitations are used in discussing the relatively small discrepancies between the theory and experiment, which may not be significant for practical purposes.

  5. 5 CFR 841.502 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... EMPLOYEES RETIREMENT SYSTEM-GENERAL ADMINISTRATION Employee Deductions and Government Contributions § 841... standards (using dynamic assumptions) and expressed as a level percentage of aggregate basic pay. Normal...

  6. Experimental investigation of two-phase heat transfer in a porous matrix.

    NASA Technical Reports Server (NTRS)

    Von Reth, R.; Frost, W.

    1972-01-01

    One-dimensional two-phase flow transpiration cooling through porous metal is studied experimentally. The experimental data are compared with a previous one-dimensional analysis. Good agreement with the calculated temperature distribution is obtained as long as the basic assumptions of the analytical model are satisfied. Deviations from the basic assumptions are caused by nonhomogeneous and oscillating flow conditions. A preliminary derivation of nondimensional parameters that characterize the stable and unstable flow conditions is given. Superheated liquid droplets observed sputtering from the heated surface indicated incomplete evaporation at heat fluxes well in excess of the latent energy transport. A parameter is developed to account for the nonequilibrium thermodynamic effects. Measured and calculated pressure drops show contradictory trends, which are attributed to capillary forces.

  7. The Not So Common Sense: Differences in How People Judge Social and Political Life.

    ERIC Educational Resources Information Center

    Rosenberg, Shawn W.

    This interdisciplinary book challenges two basic assumptions that orient much contemporary social scientific thinking. Offering theory and empirical research, the book rejects the classic liberal view that people share a basic common sense or rationality; while at the same time, it questions the view of contemporary social theory that meaning is…

  8. The effects of school closures on influenza outbreaks and pandemics: systematic review of simulation studies.

    PubMed

    Jackson, Charlotte; Mangtani, Punam; Hawker, Jeremy; Olowokure, Babatunde; Vynnycky, Emilia

    2014-01-01

    School closure is a potential intervention during an influenza pandemic and has been investigated in many modelling studies. Our objective was to systematically review the effects of school closure on influenza outbreaks as predicted by simulation studies. We searched Medline and Embase for relevant modelling studies published by the end of October 2012, and handsearched key journals. We summarised the predicted effects of school closure on the peak and cumulative attack rates and the duration of the epidemic. We investigated how these predictions depended on the basic reproduction number, the timing and duration of closure, and the assumed effects of school closures on contact patterns. School closures were usually predicted to be most effective if they caused large reductions in contact, if transmissibility was low (e.g. a basic reproduction number <2), and if attack rates were higher in children than in adults. The cumulative attack rate was expected to change less than the peak, but quantitative predictions varied (e.g. reductions in the peak were frequently 20-60%, but some studies predicted >90% reductions or even increases under certain assumptions). This partly reflected differences in model assumptions, such as those regarding population contact patterns. Simulation studies suggest that school closure can be a useful control measure during an influenza pandemic, particularly for reducing peak demand on health services. However, it is difficult to accurately quantify the likely benefits. Further studies of the effects of reactive school closures on contact patterns are needed to improve the accuracy of model predictions.

  9. A Minimalist Analysis of English Topicalization: A Phase-Based Cartographic Complementizer Phrase (CP) Perspective.

    PubMed

    Tanaka, Hiroyoshi

    Under the basic tenet that syntactic derivation offers an optimal solution to both the phonological realization and the semantic interpretation of linguistic expressions, the recent minimalist framework of syntactic theory claims that the basic unit of derivation is equivalent to a syntactic propositional element, which is called a phase. In this analysis, syntactic derivation is assumed to proceed at phasal projections, which include Complementizer Phrases (CPs). However, some empirical problems have been pointed out concerning the failure of multiple discourse-related elements to occur in the CP domain. This problem can easily be overcome if the alternative approach in the recent minimalist perspective, called the Cartographic CP analysis, is adopted, but this may raise a theoretical issue about the tension between phasality and the four kinds of functional projections assumed in this analysis (Force Phrase (ForceP), Finite Phrase (FinP), Topic Phrase (TopP), and Focus Phrase (FocP)). This paper argues that a hybrid analysis combining these two influential approaches can be proposed by adopting the reasonable assumption that syntactically requisite projections (i.e., ForceP and FinP) are phases and independently constitute a phasehood with the relevant heads in the derivation. This enables us to capture various syntactic properties of the Topicalization construction in English. Our proposed analysis, coupled with some additional assumptions and observations from recent minimalist studies, can be extended to the peculiar properties of temporal/conditional adverbials and imperatives.

  10. The Federal Role and Chapter 1: Rethinking Some Basic Assumptions.

    ERIC Educational Resources Information Center

    Kirst, Michael W.

    In the 20 years since the major Federal program for the disadvantaged began, surprisingly little has changed from its original vision. It is now time to question some of the basic policies of Chapter 1 of the Education Consolidation and Improvement Act in view of the change in conceptions about the Federal role and the recent state and local…

  11. Achieving Successful Employment Outcomes with the Use of Assistive Technology. Report from the Study Group, Institute on Rehabilitation Issues (24th, Washington, DC, May 1998).

    ERIC Educational Resources Information Center

    Radtke, Jean, Ed.

    Developed as a result of an institute on rehabilitation issues, this document is a guide to assistive technology as it affects successful competitive employment outcomes for people with disabilities. Chapter 1 offers basic information on assistive technology including basic assumptions, service provider approaches, options for technology…

  12. 5 CFR 842.702 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... for valuation of the System, based on dynamic assumptions. The present value factors are unisex... EMPLOYEES RETIREMENT SYSTEM-BASIC ANNUITY Alternative Forms of Annuities § 842.702 Definitions. In this...

  13. Steady-state heat conduction in quiescent fluids: Incompleteness of the Navier-Stokes-Fourier equations

    NASA Astrophysics Data System (ADS)

    Brenner, Howard

    2011-10-01

    Linear irreversible thermodynamic principles are used to demonstrate, by counterexample, the existence of a fundamental incompleteness in the basic pre-constitutive mass, momentum, and energy equations governing fluid mechanics and transport phenomena in continua. The demonstration is effected by addressing the elementary case of steady-state heat conduction (and transport processes in general) occurring in quiescent fluids. The counterexample questions the universal assumption of equality of the four physically different velocities entering into the basic pre-constitutive mass, momentum, and energy conservation equations. Explicitly, it is argued that such equality is an implicit constitutive assumption rather than an established empirical fact of unquestioned authority. Such equality, if indeed true, would require formal proof of its validity, currently absent from the literature. In fact, our counterexample shows the assumption of equality to be false. As the current set of pre-constitutive conservation equations appearing in textbooks are regarded as applicable both to continua and noncontinua (e.g., rarefied gases), our elementary counterexample negating belief in the equality of all four velocities impacts on all aspects of fluid mechanics and transport processes, continua and noncontinua alike.

  14. Pedophilia: a diagnosis in search of a disorder.

    PubMed

    Malón, Agustin

    2012-10-01

    This article presents a critical review of the recent controversies concerning the diagnosis of pedophilia in the context of the preparation of the fifth edition of the DSM. The analysis focuses primarily on the relationship between pedophilia and the current DSM-IV-TR definition of mental disorder. Scholars appear not to share numerous basic assumptions, ranging from their underlying ideas about what constitutes a mental disorder to the role of psychiatry in modern society, including irreconcilable theories about human sexuality; this interferes with reaching any kind of consensus as to what the psychiatric status of pedophilia should be. It is questioned whether the diagnosis of pedophilia contained in the DSM is more forensic than therapeutic, focusing on the dangers inherent in the condition of pedophilia (dangerous dysfunction) rather than on its negative effects for the subject (harmful dysfunction). The apparent necessity of the diagnosis of pedophilia in the DSM is supported, but the basis for this diagnosis remains uncertain.

  15. The Prediction Properties of Inverse and Reverse Regression for the Simple Linear Calibration Problem

    NASA Technical Reports Server (NTRS)

    Parker, Peter A.; Geoffrey, Vining G.; Wilson, Sara R.; Szarka, John L., III; Johnson, Nels G.

    2010-01-01

    The calibration of measurement systems is a fundamental but under-studied problem within industrial statistics. The origins of this problem go back to basic chemical analysis based on NIST standards. In today's world these issues extend to mechanical, electrical, and materials engineering. Often, these new scenarios do not provide "gold standards" such as the standard weights provided by NIST. This paper considers the classic "forward regression followed by inverse regression" approach. In this approach the initial experiment treats the "standards" as the regressor and the observed values as the response to calibrate the instrument. The analyst then must invert the resulting regression model in order to use the instrument to make actual measurements in practice. This paper compares this classical approach to "reverse regression," which treats the standards as the response and the observed measurements as the regressor in the calibration experiment. Such an approach is intuitively appealing because it avoids the need for the inverse regression. However, it also violates some of the basic regression assumptions.
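
    The two approaches can be compared in a few lines. The data here are simulated (a hypothetical true calibration line y = 2 + 1.5x with small Gaussian noise); with a well-behaved design both recover the unknown specimen value, and the difference between them shows up in the statistical properties of the predictions rather than in this point estimate:

```python
import random

def fit_line(x, y):
    """Ordinary least squares for y = b0 + b1 * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

random.seed(0)
standards = [1.0 + 0.5 * k for k in range(19)]   # known reference values
readings = [2.0 + 1.5 * x + random.gauss(0, 0.2) for x in standards]

# Classical: regress readings on standards, then invert the fitted line.
b0, b1 = fit_line(standards, readings)
classical = lambda y: (y - b0) / b1

# Reverse: regress standards on readings; no inversion needed, but the
# fixed "standards" are treated as if they were a noisy response.
c0, c1 = fit_line(readings, standards)
reverse = lambda y: c0 + c1 * y

y_new = 2.0 + 1.5 * 5.0   # noiseless reading from a specimen of true value 5.0
print(round(classical(y_new), 2), round(reverse(y_new), 2))
```

    The violated assumption the paper refers to is visible in the reverse fit: the regressor (`readings`) carries the measurement error while the response (`standards`) is fixed by design, the opposite of what ordinary least squares assumes.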

  16. The effect of errors in the assignment of the transmission functions on the accuracy of the thermal sounding of the atmosphere

    NASA Technical Reports Server (NTRS)

    Timofeyev, Y. M.

    1979-01-01

    To test the error introduced by assumed values of the transmission function for Soviet and American radiometers sounding the atmosphere thermally from orbiting satellites, the assumptions of the transmission calculation are varied with respect to atmospheric CO2 content, transmission frequency, and atmospheric absorption. The error arising from variations of these assumptions from the standard basic model is calculated.

  17. Student Services: Programs and Functions. A Report on the Administration of Selected Student and Campus Services of the University of Illinois at Chicago Circle. Part 1 and 2.

    ERIC Educational Resources Information Center

    Bentz, Robert P.; And Others

    A commuter institution is one to which students commute. The two basic assumptions of this study are: (1) the Chicago Circle campus of the University of Illinois will remain a commuter institution during the decade ahead; and (2) the campus will increasingly serve a more heterogeneous student body. These assumptions have important implications for…

  18. Turning great strategy into great performance.

    PubMed

    Mankins, Michael C; Steele, Richard

    2005-01-01

    Despite the enormous time and energy that goes into strategy development, many companies have little to show for their efforts. Indeed, research by the consultancy Marakon Associates suggests that companies on average deliver only 63% of the financial performance their strategies promise. In this article, Michael Mankins and Richard Steele of Marakon present the findings of this research. They draw on their experience with high-performing companies like Barclays, Cisco, Dow Chemical, 3M, and Roche to establish some basic rules for setting and delivering strategy: Keep it simple, make it concrete. Avoid long, drawn-out descriptions of lofty goals and instead stick to clear language describing what your company will and won't do. Debate assumptions, not forecasts. Create cross-functional teams drawn from strategy, marketing, and finance to ensure the assumptions underlying your long-term plans reflect both the real economics of your company's markets and its actual performance relative to competitors. Use a rigorous analytic framework. Ensure that the dialogue between the corporate center and the business units about market trends and assumptions is conducted within a rigorous framework, such as that of "profit pools". Discuss resource deployments early. Create more realistic forecasts and more executable plans by discussing up front the level and timing of critical deployments. Clearly identify priorities. Prioritize tactics so that employees have a clear sense of where to direct their efforts. Continuously monitor performance. Track resource deployment and results against plan, using continuous feedback to reset assumptions and reallocate resources. Reward and develop execution capabilities. Motivate and develop staff. Following these rules strictly can help narrow the strategy-to-performance gap.

  19. Image processing for grazing incidence fast atom diffraction

    NASA Astrophysics Data System (ADS)

    Debiossac, Maxime; Roncin, Philippe

    2016-09-01

    Grazing incidence fast atom diffraction (GIFAD, or FAD) has developed as a surface sensitive technique. Compared with thermal energies helium diffraction (TEAS or HAS), GIFAD is less sensitive to thermal decoherence but also more demanding in terms of surface coherence, the mean distance between defects. Such high quality surfaces can be obtained from freshly cleaved crystals or in a molecular beam epitaxy (MBE) chamber where a GIFAD setup has been installed allowing in situ operation. Based on recent publications by Atkinson et al. (2014) and Debiossac et al. (2014), the paper describes in detail the basic steps needed to measure the relative intensities of the diffraction spots. Care is taken to outline the underlying physical assumptions.

  20. On star formation in stellar systems. I - Photoionization effects in protoglobular clusters

    NASA Technical Reports Server (NTRS)

    Tenorio-Tagle, G.; Bodenheimer, P.; Lin, D. N. C.; Noriega-Crespo, A.

    1986-01-01

    The progressive ionization and subsequent dynamical evolution of nonhomogeneously distributed low-metal-abundance diffuse gas after star formation in globular clusters are investigated analytically, taking the gravitational acceleration due to the stars into account. The basic equations are derived; the underlying assumptions, input parameters, and solution methods are explained; and numerical results for three standard cases (ionization during star formation, ionization during expansion, and evolution resulting in a stable H II region at its equilibrium Stromgren radius) are presented in graphs and characterized in detail. The time scale of residual-gas loss in typical clusters is found to be about the same as the lifetime of a massive star on the main sequence.

  1. Choice Inconsistencies among the Elderly: Evidence from Plan Choice in the Medicare Part D Program: Comment.

    PubMed

    Ketcham, Jonathan D; Kuminoff, Nicolai V; Powers, Christopher A

    2016-12-01

    Consumers' enrollment decisions in Medicare Part D can be explained by Abaluck and Gruber’s (2011) model of utility maximization with psychological biases or by a neoclassical version of their model that precludes such biases. We evaluate these competing hypotheses by applying nonparametric tests of utility maximization and model validation tests to administrative data. We find that 79 percent of enrollment decisions from 2006 to 2010 satisfied basic axioms of consumer theory under the assumption of full information. The validation tests provide evidence against widespread psychological biases. In particular, we find that precluding psychological biases improves the structural model's out-of-sample predictions for consumer behavior.
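
    The flavor of a nonparametric utility-maximization test can be sketched with a minimal Weak Axiom of Revealed Preference (WARP) check over plan choices. The data and the single "cost" attribute are hypothetical stand-ins; the paper's tests use richer plan attributes and administrative data:

```python
def violates_warp(choices):
    """Each observation is (chosen_plan, {plan: cost}).  If plan A is
    chosen when B is weakly cheaper, A is revealed preferred to B; a
    second observation revealing the strict opposite is a violation.
    """
    for a_plan, a_costs in choices:
        for b_plan, b_costs in choices:
            if a_plan == b_plan:
                continue
            if (a_costs[b_plan] <= a_costs[a_plan]
                    and b_costs[a_plan] < b_costs[b_plan]):
                return True
    return False

consistent = [("A", {"A": 100, "B": 120}),
              ("A", {"A": 110, "B": 115})]
inconsistent = [("A", {"A": 120, "B": 100}),   # chose A though B was cheaper
                ("B", {"A": 100, "B": 120})]   # chose B though A was cheaper
print(violates_warp(consistent), violates_warp(inconsistent))
```

    Under full information and stable tastes, choice sequences like `inconsistent` cannot come from a utility maximizer, which is the kind of axiom-level check the quoted 79 percent figure refers to.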

  2. A constitutive model for AS4/PEEK thermoplastic composites under cyclic loading

    NASA Technical Reports Server (NTRS)

    Rui, Yuting; Sun, C. T.

    1990-01-01

    Based on the basic and essential features of the elastic-plastic response of the AS4/PEEK thermoplastic composite subjected to off-axis cyclic loadings, a simple rate-independent constitutive model is proposed to describe the orthotropic material behavior for cyclic loadings. A one-parameter memory surface is introduced to distinguish the virgin deformation and the subsequent deformation process and to characterize the loading range effect. Cyclic softening is characterized by the change of generalized plastic modulus. By the vanishing yield surface assumption, a yield criterion is not needed and it is not necessary to consider loading and unloading separately. The model is compared with experimental results and good agreement is obtained.

  3. Molecular Simulation-Based Structural Prediction of Protein Complexes in Mass Spectrometry: The Human Insulin Dimer

    PubMed Central

    Li, Jinyu; Rossetti, Giulia; Dreyer, Jens; Raugei, Simone; Ippoliti, Emiliano; Lüscher, Bernhard; Carloni, Paolo

    2014-01-01

    Protein electrospray ionization (ESI) mass spectrometry (MS)-based techniques are widely used to provide insight into structural proteomics under the assumption that non-covalent protein complexes being transferred into the gas phase preserve basically the same intermolecular interactions as in solution. Here we investigate the applicability of this assumption by extending our previous structural prediction protocol for single proteins in ESI-MS to protein complexes. We apply our protocol to the human insulin dimer (hIns2) as a test case. Our calculations reproduce the main charge and the collision cross section (CCS) measured in ESI-MS experiments. Molecular dynamics simulations for 0.075 ms show that the complex maximizes intermolecular non-bonded interactions relative to the structure in water, without affecting the cross section. The overall gas-phase structure of hIns2 does exhibit differences from the one in aqueous solution, not inferable from a comparison with calculated CCS. Hence, care should be exercised when interpreting ESI-MS proteomics data based solely on NMR and/or X-ray structural information. PMID:25210764

  4. Walking through the statistical black boxes of plant breeding.

    PubMed

    Xavier, Alencar; Muir, William M; Craig, Bruce; Rainey, Katy Martin

    2016-10-01

    The main statistical procedures in plant breeding are based on Gaussian process and can be computed through mixed linear models. Intelligent decision making relies on our ability to extract useful information from data to help us achieve our goals more efficiently. Many plant breeders and geneticists perform statistical analyses without understanding the underlying assumptions of the methods or their strengths and pitfalls. In other words, they treat these statistical methods (software and programs) like black boxes. Black boxes represent complex pieces of machinery with contents that are not fully understood by the user. The user sees the inputs and outputs without knowing how the outputs are generated. By providing a general background on statistical methodologies, this review aims (1) to introduce basic concepts of machine learning and its applications to plant breeding; (2) to link classical selection theory to current statistical approaches; (3) to show how to solve mixed models and extend their application to pedigree-based and genomic-based prediction; and (4) to clarify how the algorithms of genome-wide association studies work, including their assumptions and limitations.
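
    As a concrete illustration of "solving mixed models" for genomic prediction, here is a minimal ridge-regression BLUP (RR-BLUP) sketch: marker effects u solve the mixed-model equations (Z'Z + lambda*I) u = Z'y, where lambda is an assumed residual-to-marker variance ratio. The genotype and phenotype numbers are hypothetical; real software additionally handles pedigrees, fixed effects, and variance-component estimation:

```python
def rr_blup(Z, y, lam):
    """Solve (Z'Z + lam*I) u = Z'y by Gaussian elimination."""
    n, p = len(Z), len(Z[0])
    # Build the coefficient matrix and right-hand side of the
    # mixed-model (normal) equations.
    A = [[sum(Z[i][j] * Z[i][k] for i in range(n))
          + (lam if j == k else 0.0) for k in range(p)] for j in range(p)]
    b = [sum(Z[i][j] * y[i] for i in range(n)) for j in range(p)]
    for col in range(p):                       # forward elimination
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    u = [0.0] * p
    for r in range(p - 1, -1, -1):             # back substitution
        u[r] = (b[r] - sum(A[r][c] * u[c] for c in range(r + 1, p))) / A[r][r]
    return u

# Four lines genotyped at two markers (-1/0/1 coding) with phenotypes y.
Z = [[1, -1], [1, 1], [-1, 0], [0, 1]]
y = [1.2, 2.1, -1.0, 0.9]
u = rr_blup(Z, y, lam=1.0)
print([round(v, 3) for v in u])
```

    The shrinkage constant `lam` is what distinguishes this from ordinary least squares: it pulls all marker effects toward zero, which is precisely the kind of "black box" behavior users apply without always realizing it.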

  5. Resolving the homology—function relationship through comparative genomics of membrane-trafficking machinery and parasite cell biology

    PubMed Central

    Klinger, Christen M.; Ramirez-Macias, Inmaculada; Herman, Emily K.; Turkewitz, Aaron P.; Field, Mark C.; Dacks, Joel B.

    2016-01-01

    With advances in DNA sequencing technology, it is increasingly common and tractable to informatically look for genes of interest in the genomic databases of parasitic organisms and infer cellular states. Assignment of a putative gene function based on homology to functionally characterized genes in other organisms, though powerful, relies on the implicit assumption of functional homology, i.e. that orthology indicates conserved function. Eukaryotes reveal a dazzling array of cellular features and structural organization, suggesting a concomitant diversity in their underlying molecular machinery. Significantly, examples of novel functions for pre-existing or new paralogues are not uncommon. Do these examples undermine the basic assumption of functional homology, especially in parasitic protists, which are often highly derived? Here we examine the extent to which functional homology exists between organisms spanning the eukaryotic lineage. By comparing membrane trafficking proteins between parasitic protists and traditional model organisms, where direct functional evidence is available, we find that function is indeed largely conserved between orthologues, albeit with significant adaptation arising from the unique biological features within each lineage. PMID:27444378

  6. The Culture-Transmission Motive in Immigrants: A World-Wide Internet Survey

    PubMed Central

    Mchitarjan, Irina; Reisenzein, Rainer

    2015-01-01

    A world-wide internet survey was conducted to test central assumptions of a recent theory of cultural transmission in minorities proposed by the authors. 844 1st to 2nd generation immigrants from a wide variety of countries recruited on a microjob platform completed a questionnaire designed to test eight hypotheses derived from the theory. Support was obtained for all hypotheses. In particular, evidence was obtained for the continued presence, in the immigrants, of the culture-transmission motive postulated by the theory: the desire to maintain the culture of origin and transmit it to the next generation. Support was also obtained for the hypothesized anchoring of the culture-transmission motive in more basic motives fulfilled by cultural groups, the relative intra- and intergenerational stability of the culture-transmission motive, and its motivating effects for action tendencies and desires that support cultural transmission under the difficult conditions of migration. Furthermore, the findings suggest that the assumption that people have a culture-transmission motive belongs to the folk psychology of sociocultural groups, and that immigrants regard the fulfillment of this desire as a moral right. PMID:26529599

  7. The Culture-Transmission Motive in Immigrants: A World-Wide Internet Survey.

    PubMed

    Mchitarjan, Irina; Reisenzein, Rainer

    2015-01-01

    A world-wide internet survey was conducted to test central assumptions of a recent theory of cultural transmission in minorities proposed by the authors. 844 1st to 2nd generation immigrants from a wide variety of countries recruited on a microjob platform completed a questionnaire designed to test eight hypotheses derived from the theory. Support was obtained for all hypotheses. In particular, evidence was obtained for the continued presence, in the immigrants, of the culture-transmission motive postulated by the theory: the desire to maintain the culture of origin and transmit it to the next generation. Support was also obtained for the hypothesized anchoring of the culture-transmission motive in more basic motives fulfilled by cultural groups, the relative intra- and intergenerational stability of the culture-transmission motive, and its motivating effects for action tendencies and desires that support cultural transmission under the difficult conditions of migration. Furthermore, the findings suggest that the assumption that people have a culture-transmission motive belongs to the folk psychology of sociocultural groups, and that immigrants regard the fulfillment of this desire as a moral right.

  8. On a viable first-order formulation of relativistic viscous fluids and its applications to cosmology

    NASA Astrophysics Data System (ADS)

    Disconzi, Marcelo M.; Kephart, Thomas W.; Scherrer, Robert J.

    We consider a first-order formulation of relativistic fluids with bulk viscosity based on a stress-energy tensor introduced by Lichnerowicz. Choosing a barotropic equation-of-state, we show that this theory satisfies basic physical requirements and, under the further assumption of vanishing vorticity, that the equations of motion are causal, both in the case of a fixed background and when the equations are coupled to Einstein's equations. Furthermore, Lichnerowicz's proposal does not fit into the general framework of first-order theories studied by Hiscock and Lindblom, and hence their instability results do not apply. These conclusions apply to the full-fledged nonlinear theory, without any equilibrium or near equilibrium assumptions. Similarities and differences between the approach explored here and other theories of relativistic viscosity, including the Mueller-Israel-Stewart formulation, are addressed. Cosmological models based on the Lichnerowicz stress-energy tensor are studied. As the topic of (relativistic) viscous fluids is also of interest outside the general relativity and cosmology communities, such as, for instance, in applications involving heavy-ion collisions, we make our presentation largely self-contained.

  9. Kronos Observatory Operations Challenges in a Lean Environment

    NASA Astrophysics Data System (ADS)

    Koratkar, Anuradha; Peterson, Bradley M.; Polidan, Ronald S.

    2003-02-01

    Kronos is a multiwavelength observatory designed to map the accretion disks and environments of supermassive black holes using the natural intrinsic variability of accretion-driven sources. Kronos is envisaged as a Medium Explorer mission proposed to the NASA Office of Space Science under the Structure and Evolution of the Universe theme. We will achieve the Kronos science objectives by developing cost-effective techniques for obtaining data from the research spacecraft and assimilating it on the ground. The science operations assumptions for the mission are: (1) a need for flexible scheduling due to the variable nature of the targets; (2) large data volumes but minimal ground station contact; (3) a very small staff for operations. Our first assumption implies that we will have to consider an effective strategy to dynamically reprioritize the observing schedule to maximize science data acquisition. The flexibility we seek greatly increases the science return of the mission, because variability events can be properly captured. Our second assumption implies that we will have to develop some basic on-board analysis strategies to determine which data get downloaded. The small size of the operations staff implies that we need to automate as many routine processes of science operations as possible. In this paper we discuss the various solutions that we are considering to optimize our operations and maximize the science return of the observatory.

  10. Tolerance values of benthic macroinvertebrates for stream biomonitoring: assessment of assumptions underlying scoring systems worldwide.

    PubMed

    Chang, Feng-Hsun; Lawrence, Justin E; Rios-Touma, Blanca; Resh, Vincent H

    2014-04-01

    Tolerance values (TVs) based on benthic macroinvertebrates are one of the most widely used tools for monitoring the biological impacts of water pollution, particularly in streams and rivers. We compiled TVs of benthic macroinvertebrates from 29 regions around the world to test 11 basic assumptions about pollution tolerance (below, "<" means "less tolerant than"): (1) Arthropoda < non-Arthropoda; (2) Insecta < non-Insecta; (3) non-Oligochaeta < Oligochaeta; (4) other macroinvertebrates < Oligochaeta + Chironomidae; (5) other macroinvertebrate taxa < Isopoda + Gastropoda + Hirudinea; (6) Ephemeroptera + Plecoptera + Trichoptera (EPT) < Odonata + Coleoptera + Heteroptera (OCH); (7) EPT < non-EPT insects; (8) Diptera < Insecta; (9) Bivalvia < Gastropoda; (10) Baetidae < other Ephemeroptera; and (11) Hydropsychidae < other Trichoptera. We found that the first eight of these 11 assumptions were supported despite regional variability. In addition, we examined the effect of Best Professional Judgment (BPJ) and non-independence of TVs among countries by performing all analyses using subsets of the original dataset. These subsets included a group based on those systems using TVs that were derived from techniques other than BPJ, and groups based on methods used for TV assignment. The results obtained from these subsets and the entire dataset are similar. We also made seven a priori hypotheses about the regional similarity of TVs based on geography. Only one of these was supported. Development of TVs and the reporting of how they are assigned need to be more rigorous and be better described.
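
    Assumption (6), for example, can be checked with a rank-based comparison. The tolerance values below are hypothetical stand-ins (the study used TVs compiled from 29 regions); the Mann-Whitney U statistic counts taxon pairs consistent with "EPT less tolerant than OCH":

```python
def mann_whitney_u(a, b):
    """Number of (a_i, b_j) pairs with a_i < b_j; ties count 0.5."""
    return sum(1.0 if x < y else 0.5 if x == y else 0.0
               for x in a for y in b)

# Hypothetical tolerance values (0 = intolerant, 10 = tolerant).
ept = [1, 2, 2, 3, 1, 4, 2]    # Ephemeroptera, Plecoptera, Trichoptera
och = [5, 6, 4, 7, 5, 8, 6]    # Odonata, Coleoptera, Heteroptera

u = mann_whitney_u(ept, och)
pairs = len(ept) * len(och)
print(f"{u} of {pairs} pairs support EPT < OCH")
```

    A U statistic near the total number of pairs supports the assumption for that region; repeating the comparison across regions is what reveals the variability the authors report.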

  11. Social factors in space station interiors

    NASA Technical Reports Server (NTRS)

    Cranz, Galen; Eichold, Alice; Hottes, Klaus; Jones, Kevin; Weinstein, Linda

    1987-01-01

    Using the example of the chair, which is often written into space station planning but which serves no non-cultural function in zero gravity, difficulties in overcoming cultural assumptions are discussed. An experimental approach is called for which would allow designers to separate cultural assumptions from logistic, social and psychological necessities. Simulations, systematic doubt and monitored brainstorming are recommended as part of basic research so that the designer will approach the problems of space module design with a complete program.

  12. The Eleventh Quadrennial Review of Military Compensation. Supporting Research Papers

    DTIC Science & Technology

    2012-06-01

    value. 4. BAH + BAS is roughly equal to expenditures for housing and food for servicemembers.22 In the first phase of the formal model, we further...assume that taxes, housing, and food are the only basic living expenses. Then, in the next phase, we include estimates of noncash benefits not included...assumption 4 with assumption 2 implies that civilian housing and food expenses are also equal to military BAH and BAS. However, civilian housing and food

  13. A Monte Carlo Simulation Study of the Reliability of Intraindividual Variability

    PubMed Central

    Estabrook, Ryne; Grimm, Kevin J.; Bowles, Ryan P.

    2012-01-01

Recent research has seen intraindividual variability (IIV) become a useful technique for incorporating trial-to-trial variability into many types of psychological studies. IIV as measured by individual standard deviations (ISDs) has shown unique prediction of several types of positive and negative outcomes (Ram, Rabbitt, Stollery, & Nesselroade, 2005). One unanswered question regarding intraindividual variability is its reliability, and the conditions under which optimal reliability is achieved. Monte Carlo simulation studies were conducted to determine the reliability of the ISD compared to the intraindividual mean. The results indicate that ISDs generally have poor reliability and are sensitive to insufficient measurement occasions, poor test reliability, and unfavorable amounts and distributions of variability in the population. Secondary analysis of psychological data shows that the use of individual standard deviations under unfavorable conditions leads to a marked reduction in statistical power, although careful adherence to the underlying statistical assumptions allows their use as a basic research tool. PMID:22268793

  14. The Nonproliferation Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RAJEN,GAURAV; BIRINGER,KENT L.

    2000-07-28

The aim of this paper is to understand the numerous nuclear-related agreements that involve India and Pakistan, and in so doing identify starting points for future confidence-creating and confidence-building projects. Existing nuclear-related agreements provide a framework under which various projects can be proposed that foster greater nuclear transparency and cooperation in South Asia. The basic assumptions and arguments underlying this paper can be summarized as follows: (1) Increased nuclear transparency between India and Pakistan is a worthwhile objective, as it will support the irreversibility of extant nuclear agreements, the prospects of future agreements, and the balance of opacity and transparency required for stability in times of crisis; (2) Given the current state of Indian and Pakistani relations, incremental progress in increased nuclear transparency is the most likely future outcome; and (3) Incremental progress can be achieved by enhancing the information exchange required by existing nuclear-related agreements.

  15. Material Perception.

    PubMed

    Fleming, Roland W

    2017-09-15

    Under typical viewing conditions, human observers effortlessly recognize materials and infer their physical, functional, and multisensory properties at a glance. Without touching materials, we can usually tell whether they would feel hard or soft, rough or smooth, wet or dry. We have vivid visual intuitions about how deformable materials like liquids or textiles respond to external forces and how surfaces like chrome, wax, or leather change appearance when formed into different shapes or viewed under different lighting. These achievements are impressive because the retinal image results from complex optical interactions between lighting, shape, and material, which cannot easily be disentangled. Here I argue that because of the diversity, mutability, and complexity of materials, they pose enormous challenges to vision science: What is material appearance, and how do we measure it? How are material properties estimated and represented? Resolving these questions causes us to scrutinize the basic assumptions of mid-level vision.

  16. Riddles of masculinity: gender, bisexuality, and thirdness.

    PubMed

    Fogel, Gerald I

    2006-01-01

    Clinical examples are used to illuminate several riddles of masculinity-ambiguities, enigmas, and paradoxes in relation to gender, bisexuality, and thirdness-frequently seen in male patients. Basic psychoanalytic assumptions about male psychology are examined in the light of advances in female psychology, using ideas from feminist and gender studies as well as important and now widely accepted trends in contemporary psychoanalytic theory. By reexamining basic assumptions about heterosexual men, as has been done with ideas concerning women and homosexual men, complexity and nuance come to the fore to aid the clinician in treating the complex characterological pictures seen in men today. In a context of rapid historical and theoretical change, the use of persistent gender stereotypes and unnecessarily limiting theoretical formulations, though often unintended, may mask subtle countertransference and theoretical blind spots, and limit optimal clinical effectiveness.

  17. Costing interventions in primary care.

    PubMed

    Kernick, D

    2000-02-01

    Against a background of increasing demands on limited resources, studies that relate benefits of health interventions to the resources they consume will be an important part of any decision-making process in primary care, and an accurate assessment of costs will be an important part of any economic evaluation. Although there is no such thing as a gold standard cost estimate, there are a number of basic costing concepts that underlie any costing study. How costs are derived and combined will depend on the assumptions that have been made in their derivation. It is important to be clear what assumptions have been made and why in order to maintain consistency across comparative studies and prevent inappropriate conclusions being drawn. This paper outlines some costing concepts and principles to enable primary care practitioners and researchers to have a basic understanding of costing exercises and their pitfalls.

  18. Interpretation of the results of statistical measurements. [search for basic probability model

    NASA Technical Reports Server (NTRS)

    Olshevskiy, V. V.

    1973-01-01

For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional that defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research amounts to a search for a basic probability model.

  19. The Mw 7.7 Bhuj earthquake: Global lessons for earthquake hazard in intra-plate regions

    USGS Publications Warehouse

    Schweig, E.; Gomberg, J.; Petersen, M.; Ellis, M.; Bodin, P.; Mayrose, L.; Rastogi, B.K.

    2003-01-01

The Mw 7.7 Bhuj earthquake occurred in the Kachchh District of the State of Gujarat, India on 26 January 2001, and was one of the most damaging intraplate earthquakes ever recorded. This earthquake is in many ways similar to the three great New Madrid earthquakes that occurred in the central United States in 1811-1812. An Indo-US team is studying the similarities and differences of these sequences in order to learn lessons for earthquake hazard in intraplate regions. Herein we present some preliminary conclusions from that study. Both the Kutch and New Madrid regions have rift-type geotectonic settings. In both regions the strain rates are of the order of 10^-9/yr, and attenuation of seismic waves, as inferred from observations of intensity and liquefaction, is low. These strain rates predict recurrence intervals for Bhuj- or New Madrid-sized earthquakes of several thousand years or more. In contrast, intervals estimated from paleoseismic studies and from other independent data are significantly shorter, probably hundreds of years. Together, these observations may suggest that earthquakes relax high ambient stresses that are locally concentrated by rheologic heterogeneities, rather than by plate-tectonic loading. The latter model generally underlies basic assumptions made in earthquake hazard assessment, namely that the long-term average rate of energy released by earthquakes is determined by the tectonic loading rate, which thus implies an inherent average periodicity of earthquake occurrence. Interpreting the observations in terms of the former model may therefore require re-examining the basic assumptions of hazard assessment.

  20. Behavioral health at-risk contracting--a rate development and financial reporting guide.

    PubMed

    Zinser, G R

    1994-01-01

    The process of developing rates for behavioral capitation contracts can seem mysterious and intimidating. The following article explains several key features of the method used to develop capitation rates. These include: (1) a basic understanding of the mechanics of rate calculation; (2) awareness of the variables to be considered and assumptions to be made; (3) a source of information to use as a basis for these assumptions; and (4) a system to collect detailed actual experience data.

  1. An Examination of Brazil and the United States as Potential Partners in a Joint Supersonic Military Fighter Aircraft Codevelopment and Production Program.

    DTIC Science & Technology

    1986-09-01

    Brazilian-American Chamber of Commerce Mr. Frank J. Devine, Executive Director Embraer, Empresa Brasileira De Aeronautica Mr. Salo Roth Vice President...Throughout this study the following assumptions have been made. First, it is assumed that the reader has a basic familiarity with aircraft. Therefore...of the 5 1 weapons acquisition process. Third, the assumption is made that most readers are familiar with U.S. procedures involving the sale of

  2. SPSS and SAS programs for addressing interdependence and basic levels-of-analysis issues in psychological data.

    PubMed

    O'Connor, Brian P

    2004-02-01

    Levels-of-analysis issues arise whenever individual-level data are collected from more than one person from the same dyad, family, classroom, work group, or other interaction unit. Interdependence in data from individuals in the same interaction units also violates the independence-of-observations assumption that underlies commonly used statistical tests. This article describes the data analysis challenges that are presented by these issues and presents SPSS and SAS programs for conducting appropriate analyses. The programs conduct the within-and-between-analyses described by Dansereau, Alutto, and Yammarino (1984) and the dyad-level analyses described by Gonzalez and Griffin (1999) and Griffin and Gonzalez (1995). Contrasts with general multilevel modeling procedures are then discussed.

  3. Molecular dynamics test of the Brownian description of Na(+) motion in water

    NASA Technical Reports Server (NTRS)

    Wilson, M. A.; Pohorille, A.; Pratt, L. R.

    1985-01-01

The present paper provides the results of molecular dynamics calculations on a Na(+) ion in aqueous solution. Attention is given to the sodium-oxygen and sodium-hydrogen radial distribution functions, the velocity autocorrelation function for the Na(+) ion, the autocorrelation function of the force on the stationary ion, and the accuracy of the Brownian motion assumptions that are basic to hydrodynamic models of ion dynamics in solution. It is pointed out that the presented calculations provide accurate data for testing theories of ion dynamics in solution. The conducted tests show that it is feasible to calculate Brownian friction constants for ions in aqueous solutions. It is found that for Na(+) under the considered conditions the Brownian mobility is in error by only 60 percent.

  4. [Language is not neutral. Commentary about APA style].

    PubMed

    Delgado Sánchez-Mateos, Juan

    2007-05-01

Some basic, not always explicit, characteristics of the editorial style proposed by the American Psychological Association (APA) are reviewed, along with the objections posed by some authors who are critical of the use of this style, starting with the work of Madigan, Johnson, and Linton (1995) and the subsequent controversy. Building on this review, problems related to the underlying assumptions of the style, to ethical aspects of research, and to the epistemological positions defended by the different traditions of research are discussed. In the conclusions, a simpler differentiation between the scientific-technical and communicative-practical systems of enquiry is proposed, together with an explicit commitment, in the text of the report, to the ethical responsibilities derived from authorship and the conduct of the research.

  5. Revealing representational content with pattern-information fMRI--an introductory guide.

    PubMed

    Mur, Marieke; Bandettini, Peter A; Kriegeskorte, Nikolaus

    2009-03-01

    Conventional statistical analysis methods for functional magnetic resonance imaging (fMRI) data are very successful at detecting brain regions that are activated as a whole during specific mental activities. The overall activation of a region is usually taken to indicate involvement of the region in the task. However, such activation analysis does not consider the multivoxel patterns of activity within a brain region. These patterns of activity, which are thought to reflect neuronal population codes, can be investigated by pattern-information analysis. In this framework, a region's multivariate pattern information is taken to indicate representational content. This tutorial introduction motivates pattern-information analysis, explains its underlying assumptions, introduces the most widespread methods in an intuitive way, and outlines the basic sequence of analysis steps.

  6. The Central Registry for Child Abuse Cases: Rethinking Basic Assumptions

    ERIC Educational Resources Information Center

    Whiting, Leila

    1977-01-01

    Class data pools on abused and neglected children and their families are found desirable for program planning, but identification by name is of questionable value and possibly a dangerous invasion of civil liberties. (MS)

  7. Self-transcendent positive emotions increase spirituality through basic world assumptions.

    PubMed

    Van Cappellen, Patty; Saroglou, Vassilis; Iweins, Caroline; Piovesana, Maria; Fredrickson, Barbara L

    2013-01-01

    Spirituality has mostly been studied in psychology as implied in the process of overcoming adversity, being triggered by negative experiences, and providing positive outcomes. By reversing this pathway, we investigated whether spirituality may also be triggered by self-transcendent positive emotions, which are elicited by stimuli appraised as demonstrating higher good and beauty. In two studies, elevation and/or admiration were induced using different methods. These emotions were compared to two control groups, a neutral state and a positive emotion (mirth). Self-transcendent positive emotions increased participants' spirituality (Studies 1 and 2), especially for the non-religious participants (Study 1). Two basic world assumptions, i.e., belief in life as meaningful (Study 1) and in the benevolence of others and the world (Study 2) mediated the effect of these emotions on spirituality. Spirituality should be understood not only as a coping strategy, but also as an upward spiralling pathway to and from self-transcendent positive emotions.

  8. Teaching for Tomorrow: An Exploratory Study of Prekindergarten Teachers' Underlying Assumptions about How Children Learn

    ERIC Educational Resources Information Center

    Flynn, Erin E.; Schachter, Rachel E.

    2017-01-01

    This study investigated eight prekindergarten teachers' underlying assumptions about how children learn, and how these assumptions were used to inform and enact instruction. By contextualizing teachers' knowledge and understanding as it is used in practice we were able to provide unique insight into the work of teaching. Participants focused on…

  9. Statistical Issues for Uncontrolled Reentry Hazards

    NASA Technical Reports Server (NTRS)

    Matney, Mark

    2008-01-01

    A number of statistical tools have been developed over the years for assessing the risk of reentering objects to human populations. These tools make use of the characteristics (e.g., mass, shape, size) of debris that are predicted by aerothermal models to survive reentry. The statistical tools use this information to compute the probability that one or more of the surviving debris might hit a person on the ground and cause one or more casualties. The statistical portion of the analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and how to use that information to arrive at realistic risk numbers. This inevitably involves assumptions that simplify the problem and make it tractable, but it is often difficult to test the accuracy and applicability of these assumptions. This paper looks at a number of these theoretical assumptions, examining the mathematical basis for the hazard calculations, and outlining the conditions under which the simplifying assumptions hold. In addition, this paper will also outline some new tools for assessing ground hazard risk in useful ways. Also, this study is able to make use of a database of known uncontrolled reentry locations measured by the United States Department of Defense. By using data from objects that were in orbit more than 30 days before reentry, sufficient time is allowed for the orbital parameters to be randomized in the way the models are designed to compute. The predicted ground footprint distributions of these objects are based on the theory that their orbits behave basically like simple Kepler orbits. However, there are a number of factors - including the effects of gravitational harmonics, the effects of the Earth's equatorial bulge on the atmosphere, and the rotation of the Earth and atmosphere - that could cause them to diverge from simple Kepler orbit behavior and change the ground footprints. 
The measured latitude and longitude distributions of these objects provide data that can be directly compared with the predicted distributions, providing a fundamental empirical test of the model assumptions.
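The Kepler-orbit assumption mentioned above makes a concrete, testable prediction: for a circular orbit with a randomized (uniform) argument of latitude u over a spherical, non-rotating Earth, the sub-satellite latitude is asin(sin i · sin u), so footprints pile up near latitudes ±i. A deliberately idealized sampling sketch (not the study's actual tools; the inclination value is just an example):

```python
import math
import random

def latitude_samples(inclination_deg=51.6, n=100_000, seed=5):
    """Sample sub-satellite latitudes for a randomized circular Kepler orbit.

    Assumes a uniformly distributed argument of latitude u and a
    spherical, non-rotating Earth, so that
        latitude = asin(sin(i) * sin(u)).
    """
    rng = random.Random(seed)
    sin_i = math.sin(math.radians(inclination_deg))
    return [math.degrees(math.asin(sin_i * math.sin(rng.uniform(0.0, 2.0 * math.pi))))
            for _ in range(n)]
```

The sampled density peaks sharply near ±inclination, which is the baseline against which the measured reentry latitude distributions can be compared; the perturbations listed in the abstract would show up as departures from this simple shape.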

  10. Using effort information with change-in-ratio data for population estimation

    USGS Publications Warehouse

    Udevitz, Mark S.; Pollock, Kenneth H.

    1995-01-01

    Most change-in-ratio (CIR) methods for estimating fish and wildlife population sizes have been based only on assumptions about how encounter probabilities vary among population subclasses. When information on sampling effort is available, it is also possible to derive CIR estimators based on assumptions about how encounter probabilities vary over time. This paper presents a generalization of previous CIR models that allows explicit consideration of a range of assumptions about the variation of encounter probabilities among subclasses and over time. Explicit estimators are derived under this model for specific sets of assumptions about the encounter probabilities. Numerical methods are presented for obtaining estimators under the full range of possible assumptions. Likelihood ratio tests for these assumptions are described. Emphasis is on obtaining estimators based on assumptions about variation of encounter probabilities over time.
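For context, the simplest special case of the CIR family, the classic two-sample estimator under the equal-encounter-probability assumption that this paper generalizes, can be written down directly (a textbook sketch, not the paper's generalized model):

```python
def cir_estimate(p1, p2, removed_x, removed_total):
    """Classic two-sample change-in-ratio population estimate.

    p1, p2: proportion of subclass x in the population before and after
    a known removal of removed_x subclass-x animals out of removed_total
    animals in all. Assumes subclass proportions are estimated without
    bias (equal encounter probabilities across subclasses):
        N1 = (removed_x - p2 * removed_total) / (p1 - p2)
    """
    return (removed_x - p2 * removed_total) / (p1 - p2)

# Worked check: a population of 1000 with 400 in subclass x (p1 = 0.4);
# removing 150 x and 50 others leaves 250 of 800, so p2 = 0.3125.
n_hat = cir_estimate(0.4, 0.3125, 150, 200)  # recovers approximately 1000
```

When encounter probabilities differ between subclasses or over time, the observed proportions are biased and this simple estimator fails, which is precisely the range of assumptions the generalized model makes explicit.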

  11. Polymer physics experiments with single DNA molecules

    NASA Astrophysics Data System (ADS)

    Smith, Douglas E.

    1999-11-01

Bacteriophage DNA molecules were taken as a model flexible polymer chain for the experimental study of polymer dynamics at the single-molecule level. Video fluorescence microscopy was used to directly observe the conformational dynamics of fluorescently labeled molecules, optical tweezers were used to manipulate individual molecules, and micro-fabricated flow cells were used to apply controlled hydrodynamic strain to molecules. These techniques constitute a powerful new experimental approach to the study of basic polymer physics questions. I have used these techniques to study the diffusion and relaxation of isolated and entangled polymer molecules and the hydrodynamic deformation of polymers in elongational and shear flows. These studies revealed a rich, and previously unobserved, "molecular individualism" in the dynamical behavior of single molecules. Individual measurements on ensembles of identical molecules allowed the average conformation to be determined as well as the underlying probability distributions for molecular conformation. Scaling laws, which predict the dependence of properties on chain length and concentration, were also tested. The basic assumptions of the reptation model were directly confirmed by visualizing the dynamics of entangled chains.

  12. Fisher's geometrical model emerges as a property of complex integrated phenotypic networks.

    PubMed

    Martin, Guillaume

    2014-05-01

    Models relating phenotype space to fitness (phenotype-fitness landscapes) have seen important developments recently. They can roughly be divided into mechanistic models (e.g., metabolic networks) and more heuristic models like Fisher's geometrical model. Each has its own drawbacks, but both yield testable predictions on how the context (genomic background or environment) affects the distribution of mutation effects on fitness and thus adaptation. Both have received some empirical validation. This article aims at bridging the gap between these approaches. A derivation of the Fisher model "from first principles" is proposed, where the basic assumptions emerge from a more general model, inspired by mechanistic networks. I start from a general phenotypic network relating unspecified phenotypic traits and fitness. A limited set of qualitative assumptions is then imposed, mostly corresponding to known features of phenotypic networks: a large set of traits is pleiotropically affected by mutations and determines a much smaller set of traits under optimizing selection. Otherwise, the model remains fairly general regarding the phenotypic processes involved or the distribution of mutation effects affecting the network. A statistical treatment and a local approximation close to a fitness optimum yield a landscape that is effectively the isotropic Fisher model or its extension with a single dominant phenotypic direction. The fit of the resulting alternative distributions is illustrated in an empirical data set. These results bear implications on the validity of Fisher's model's assumptions and on which features of mutation fitness effects may vary (or not) across genomic or environmental contexts.
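The isotropic landscape that emerges in the article can be sketched with a small Monte Carlo: random Gaussian mutations act on an n-trait phenotype with Gaussian fitness, and the fraction of beneficial mutations shrinks as mutation size grows. All parameter values below are illustrative:

```python
import math
import random

def beneficial_fraction(n_traits=10, distance=1.0, mut_size=0.3,
                        n_mut=20_000, seed=11):
    """Fraction of random mutations that are beneficial in an isotropic
    Fisher geometrical model.

    The phenotype sits at `distance` from the optimum along the first
    trait axis; fitness = exp(-||z||^2 / 2); mutations are isotropic
    Gaussian with per-trait SD mut_size / sqrt(n_traits), so the typical
    mutation length is about mut_size.
    """
    rng = random.Random(seed)
    sd = mut_size / math.sqrt(n_traits)
    w0 = math.exp(-distance ** 2 / 2.0)
    hits = 0
    for _ in range(n_mut):
        z = [distance + rng.gauss(0.0, sd)]
        z += [rng.gauss(0.0, sd) for _ in range(n_traits - 1)]
        if math.exp(-sum(t * t for t in z) / 2.0) > w0:
            hits += 1
    return hits / n_mut
```

This reproduces Fisher's classic qualitative result: small mutations are beneficial nearly half the time, large ones almost never, and the context (distance to the optimum, dimensionality) shifts the distribution of fitness effects, as the abstract describes.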

  13. Velocity Measurement by Scattering from Index of Refraction Fluctuations Induced in Turbulent Flows

    NASA Technical Reports Server (NTRS)

    Lading, Lars; Saffman, Mark; Edwards, Robert

    1996-01-01

Induced phase-screen scattering is defined as the scattering of light from weak index-of-refraction fluctuations induced by turbulence. The basic assumptions and requirements for induced phase-screen scattering, including scale requirements, are presented.

  14. Undergraduate Cross Registration.

    ERIC Educational Resources Information Center

    Grupe, Fritz H.

This report discusses various aspects of undergraduate cross-registration procedures, including the dimensions, values, roles and functions, basic assumptions, and the facilitation and encouragement of cross-registration. Dimensions of cross-registration encompass financial exchange, eligibility, program limitations, type of grade and credit; extent of…

  15. The Peace Movement: An Exercise in Micro-Macro Linkages.

    ERIC Educational Resources Information Center

    Galtung, Johan

    1988-01-01

    Contends that the basic assumption of the peace movement is the abuse of military power by the state. Argues that the peace movement is most effective through linkages with cultural, political, and economic forces in society. (BSR)

  16. Graduate Education in Psychology: A Comment on Rogers' Passionate Statement

    ERIC Educational Resources Information Center

    Brown, Robert C., Jr.; Tedeschi, James T.

    1972-01-01

The authors hope that this critical evaluation can place Carl Rogers' assumptions into perspective; they propose a compromise program meant to satisfy the basic aims of a humanistic psychology program. For Rogers' rejoinder, see AA 512 869. (MB)

  17. ERP denoising in multichannel EEG data using contrasts between signal and noise subspaces.

    PubMed

    Ivannikov, Andriy; Kalyakin, Igor; Hämäläinen, Jarmo; Leppänen, Paavo H T; Ristaniemi, Tapani; Lyytinen, Heikki; Kärkkäinen, Tommi

    2009-06-15

In this paper, a new method intended for ERP denoising in multichannel EEG data is discussed. The denoising is done by separating the ERP and noise subspaces in multidimensional EEG data with a linear transformation, followed by dimension reduction in which the noise components are ignored during the inverse transformation. The separation matrix is found based on the assumption that the ERP sources are deterministic across all repetitions of the same type of stimulus within the experiment, while the noise sources do not obey this determinacy property. A detailed derivation of the technique is given, together with an analysis of the results of its application to a real high-density EEG data set. The interpretation of the results and the performance of the proposed method under conditions in which the basic assumptions are violated (e.g., when the problem is underdetermined) are also discussed. Moreover, we study how the number of channels and trials used by the method influences the effectiveness of ERP/noise subspace separation. In addition, we explore the impact of different data resampling strategies on the performance of the considered algorithm. The results can help in determining the optimal parameters of the equipment and methods used to elicit and reliably estimate ERPs.
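The determinacy assumption can be illustrated in its most elementary form: if the ERP waveform is literally identical across trials while the noise is independent, trial averaging attenuates the noise by the square root of the number of trials. A toy sketch of this simplest consequence (not the paper's subspace algorithm; parameters illustrative):

```python
import math
import random
import statistics

def average_erp(n_trials=400, n_samples=200, noise_sd=2.0, seed=9):
    """Illustrate the determinacy assumption behind ERP denoising.

    A fixed (deterministic) ERP waveform repeats on every trial, while
    noise is independent across trials; the trial average therefore
    recovers the ERP with residual noise ~ noise_sd / sqrt(n_trials).
    Returns (observed residual RMS, predicted residual RMS).
    """
    rng = random.Random(seed)
    erp = [math.sin(2.0 * math.pi * t / n_samples) for t in range(n_samples)]
    avg = [0.0] * n_samples
    for _ in range(n_trials):
        for t in range(n_samples):
            avg[t] += (erp[t] + rng.gauss(0.0, noise_sd)) / n_trials
    residual = math.sqrt(statistics.fmean((a - e) ** 2 for a, e in zip(avg, erp)))
    return residual, noise_sd / math.sqrt(n_trials)
```

Any trial-to-trial variation in the ERP violates this assumption, which is one of the failure conditions the paper examines.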

  18. The Vocational Turn in Adult Literacy Education and the Impact of the International Adult Literacy Survey

    NASA Astrophysics Data System (ADS)

    Druine, Nathalie; Wildemeersch, Danny

    2000-09-01

The authors critically examine some of the underlying epistemological and theoretical assumptions of the IALS. In doing so, they distinguish between two basic orientations towards literacy. First, the standard approach (of which the IALS is an example) subscribes to the possibility of measuring literacy as abstract, cognitive skills, and endorses the claim that there is an important relationship between literacy skills and economic success in the so-called 'knowledge society.' The second, called the socio-cultural approach, insists on the contextual and power-related character of people's literacy practices. The authors further illustrate that the assumptions of the IALS are rooted in a neo-liberal ideology that forces all members of society to adjust to the exigencies of the globalised economy. In the current, contingent conditions of the risk society, however, it does not seem very wise to limit the learning of adults to enhancing labour-market competencies. Adult education should relate to the concrete literacy practices people already have in their lives. It should make its learners co-responsible actors in their own learning process and participants in a democratic debate on defining the kind of society people want to build.

  19. Application of random survival forests in understanding the determinants of under-five child mortality in Uganda in the presence of covariates that satisfy the proportional and non-proportional hazards assumption.

    PubMed

    Nasejje, Justine B; Mwambi, Henry

    2017-09-07

Uganda, just like any other Sub-Saharan African country, has a high under-five child mortality rate. To inform policy on intervention strategies, sound statistical methods are required to critically identify the factors strongly associated with under-five child mortality rates. The Cox proportional hazards model has been a common choice for analysing such data to understand the factors strongly associated with high child mortality rates, taking age as the time-to-event variable. However, due to its restrictive proportional hazards (PH) assumption, covariates of interest that do not satisfy the assumption are often excluded from the analysis to avoid mis-specifying the model; otherwise, using covariates that clearly violate the assumption would yield invalid results. Survival trees and random survival forests are increasingly popular for analysing survival data, particularly large survey data, and could be attractive alternatives to models with the restrictive PH assumption. In this article, we adopt random survival forests, which have never been used in understanding the factors affecting under-five child mortality rates in Uganda, using Demographic and Health Survey data. Thus the first part of the analysis is based on the classical Cox PH model and the second part on random survival forests in the presence of covariates that do not necessarily satisfy the PH assumption. Random survival forests and the Cox proportional hazards model agree that the sex of the household head, the sex of the child, and the number of births in the past year are strongly associated with under-five child mortality in Uganda, given that all three covariates satisfy the PH assumption. Random survival forests further demonstrated that covariates that were originally excluded from the earlier analysis due to violation of the PH assumption were important in explaining under-five child mortality rates. 
These covariates include the number of children under the age of five in a household, number of births in the past 5 years, wealth index, total number of children ever born and the child's birth order. The results further indicated that the predictive performance for random survival forests built using covariates including those that violate the PH assumption was higher than that for random survival forests built using only covariates that satisfy the PH assumption. Random survival forests are appealing methods in analysing public health data to understand factors strongly associated with under-five child mortality rates especially in the presence of covariates that violate the proportional hazards assumption.

  20. SW-846 Test Method 1340: In Vitro Bioaccessibility Assay for Lead in Soil

    EPA Pesticide Factsheets

    Describes assay procedures written on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  1. Challenging Assumptions of International Public Relations: When Government Is the Most Important Public.

    ERIC Educational Resources Information Center

    Taylor, Maureen; Kent, Michael L.

    1999-01-01

    Explores assumptions underlying Malaysia's and the United States' public-relations practice. Finds many assumptions guiding Western theories and practices are not applicable to other countries. Examines the assumption that the practice of public relations targets a variety of key organizational publics. Advances international public-relations…

  2. Hydrogen isotope retention in beryllium for tokamak plasma-facing applications

    NASA Astrophysics Data System (ADS)

    Anderl, R. A.; Causey, R. A.; Davis, J. W.; Doerner, R. P.; Federici, G.; Haasz, A. A.; Longhurst, G. R.; Wampler, W. R.; Wilson, K. L.

    Beryllium has been used as a plasma-facing material to effect substantial improvements in plasma performance in the Joint European Torus (JET), and it is planned as a plasma-facing material for the first wall (FW) and other components of the International Thermonuclear Experimental Reactor (ITER). The interaction of hydrogenic ions, and charge-exchange neutral atoms from plasmas, with beryllium has been studied in recent years with widely varying interpretations of results. In this paper we review experimental data regarding hydrogenic atom inventories in experiments pertinent to tokamak applications and show that with some very plausible assumptions, the experimental data appear to exhibit rather predictable trends. A phenomenon observed in high ion-flux experiments is the saturation of the beryllium surface such that inventories of implanted particles become insensitive to increased flux and to continued implantation fluence. Methods for modeling retention and release of implanted hydrogen in beryllium are reviewed and an adaptation is suggested for modeling the saturation effects. The TMAP4 code used with these modifications has succeeded in simulating experimental data taken under saturation conditions where codes without this feature have not. That implementation also works well under more routine conditions where the conventional recombination-limited release model is applicable. Calculations of tritium inventory and permeation in the ITER FW during the basic performance phase (BPP) using both the conventional recombination model and the saturation effects assumptions show a difference of several orders of magnitude in both inventory and permeation rate to the coolant.

  3. Ethics and managed care.

    PubMed

    Perkel, R L

    1996-03-01

    Managed care presents physicians with potential ethical dilemmas different from dilemmas in traditional fee-for-service practice. The ethical assumptions of managed care are explored, with special attention to the evolving dual responsibilities of physicians as patient advocates and as entrepreneurs. A number of proposals are described that delineate issues in support of and in opposition to managed care. Through an understanding of how to apply basic ethics principles to managed care participation, physicians may yet hold on to the basic ethic of the fiduciary doctor-patient relationship.

  4. Development of a Multiple Linear Regression Model to Forecast Facility Electrical Consumption at an Air Force Base.

    DTIC Science & Technology

    1981-09-01

    corresponds to the same square footage that consumed the electrical energy. 3. The basic assumptions of multiple linear regression, as enumerated in...7. Data related to the sample of bases is assumed to be representative of bases in the population. Limitations Basic limitations on this research were... Ratemaking --Overview. Rand Report R-5894, Santa Monica CA, May 1977. Chatterjee, Samprit, and Bertram Price. Regression Analysis by Example. New York: John

  5. Hazards and occupational risk in hard coal mines - a critical analysis of legal requirements

    NASA Astrophysics Data System (ADS)

    Krause, Marcin

    2017-11-01

    This publication concerns occupational safety and health in hard coal mines, the basic elements of which are mining hazards and occupational risk. The work includes a comparative analysis of selected provisions of general and industry-specific law regarding hazard analysis and occupational risk assessment. Based on a critical analysis of the legal requirements, basic assumptions for practical guidelines on occupational risk assessment in underground coal mines are proposed.

  6. Applying the compound Poisson process model to the reporting of injury-related mortality rates.

    PubMed

    Kegler, Scott R

    2007-02-16

    Injury-related mortality rate estimates are often analyzed under the assumption that case counts follow a Poisson distribution. Certain types of injury incidents occasionally involve multiple fatalities, however, resulting in dependencies between cases that are not reflected in the simple Poisson model and which can affect even basic statistical analyses. This paper explores the compound Poisson process model as an alternative, emphasizing adjustments to some commonly used interval estimators for population-based rates and rate ratios. The adjusted estimators involve relatively simple closed-form computations, which in the absence of multiple-case incidents reduce to familiar estimators based on the simpler Poisson model. Summary data from the National Violent Death Reporting System are referenced in several examples demonstrating application of the proposed methodology.
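A minimal simulation, under an assumed (purely illustrative) incident rate and cluster-size distribution rather than NVDRS data, of why multi-fatality incidents break the simple Poisson model: for a compound Poisson count the variance-to-mean ratio equals E[N²]/E[N] for cluster size N, which exceeds 1 whenever some incidents involve multiple deaths:

```python
import math
import random

random.seed(42)

# Hypothetical cluster-size distribution: most incidents involve one
# fatality, a few involve several (illustrative values, not NVDRS data).
sizes, probs = [1, 2, 5], [0.90, 0.08, 0.02]
lam = 50.0  # expected incidents per reporting period

def poisson(lam):
    """Knuth's algorithm for a Poisson-distributed draw."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

def period_deaths():
    n = poisson(lam)  # number of incidents in the period
    return sum(random.choices(sizes, weights=probs, k=n)) if n else 0

deaths = [period_deaths() for _ in range(20000)]
mean = sum(deaths) / len(deaths)
var = sum((d - mean) ** 2 for d in deaths) / (len(deaths) - 1)

# For a compound Poisson count, Var/mean = E[N^2]/E[N] (> 1 here);
# a plain Poisson count would give Var/mean = 1.
e_n  = sum(s * p for s, p in zip(sizes, probs))      # E[N]
e_n2 = sum(s * s * p for s, p in zip(sizes, probs))  # E[N^2]
print(var / mean, e_n2 / e_n)
```

Interval estimators that assume Var/mean = 1 will therefore be too narrow; the paper's adjustments widen them by exactly this kind of factor.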

  7. Phonology, reading, and Chomsky and Halle's optimal orthography.

    PubMed

    Steinberg, D D

    1973-09-01

    Chomsky and Halle claim that an orthography based on their underlying phonological representations (UPR) of lexical items would be optimal for English. This paper challenges three of C & H's basic phonological assumptions, that their vowel shift rule is valid, that the UPR is the only sound representation to be listed in the lexicon, and that derived words do not appear as wholes in the lexicon. A less abstract phonological representation level based on the conscious perceptions of speakers, the surface phonemic (SPR), is proposed. An SPR-based orthography has advantages which a UPR-based orthography would not: it is easy to learn and teach, it can be learned at an early age, and it permits rapid detection of rhyme. It is concluded that an orthography based on SPRs, and not UPRs, would be optimal.

  8. A Note on the Wave Action Density of a Viscous Instability Mode on a Laminar Free-shear Flow

    NASA Technical Reports Server (NTRS)

    Balsa, Thomas F.

    1994-01-01

    Using the assumptions of an incompressible and viscous flow at large Reynolds number, we derive the evolution equation for the wave action density of an instability wave traveling on top of a laminar free-shear flow. The instability is considered to be viscous; the purpose of the present work is to include the cumulative effect of the (locally) small viscous correction to the wave, over length and time scales on which the underlying base flow appears inhomogeneous owing to its viscous diffusion. As such, we generalize our previous work for inviscid waves. This generalization appears as an additional (but usually non-negligible) term in the equation for the wave action. The basic structure of the equation remains unaltered.

  9. Factors influencing the thermally-induced strength degradation of B/Al composites

    NASA Technical Reports Server (NTRS)

    Dicarlo, J. A.

    1982-01-01

    Literature data related to the thermally-induced strength degradation of B/Al composites were examined in the light of fracture theories based on reaction-controlled fiber weakening. Under the assumption of a parabolic time-dependent growth for the interfacial reaction product, a Griffith-type fracture model was found to yield simple equations whose predictions were in good agreement with data for boron fiber average strength and for B/Al axial fracture strain. The only variables in these equations were the time and temperature of the thermal exposure and an empirical factor related to fiber surface smoothness prior to composite consolidation. Such variables as fiber diameter and aluminum alloy composition were found to have little influence. The basic and practical implications of the fracture model equations are discussed.

  10. Computer Applications in Teaching and Learning.

    ERIC Educational Resources Information Center

    Halley, Fred S.; And Others

    Some examples of the usage of computers in teaching and learning are examination generation, automatic exam grading, student tracking, problem generation, computational examination generators, program packages, simulation, and programing skills for problem solving. These applications are non-trivial and do fulfill the basic assumptions necessary…

  11. Probabilistic Simulation of Territorial Seismic Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baratta, Alessandro; Corbi, Ileana

    2008-07-08

    The paper is focused on a stochastic process for the prediction of seismic scenarios on the territory, developed by means of some basic assumptions in the procedure and by elaborating the fundamental parameters recorded during ground motions that occurred in a seismic area.

  12. Elements of a Research Report.

    ERIC Educational Resources Information Center

    Schurter, William J.

    This guide for writing research or technical reports discusses eleven basic elements of such reports and provides examples of "good" and "bad" wordings. These elements are the title, problem statement, purpose statement, need statement, hypothesis, assumptions, procedures, limitations, terminology, conclusion and recommendations. This guide is…

  13. The Case for a Hierarchical Cosmology

    ERIC Educational Resources Information Center

    Vaucouleurs, G. de

    1970-01-01

    The development of modern theoretical cosmology is presented and some questionable assumptions of orthodox cosmology are pointed out. Suggests that recent observations indicate that hierarchical clustering is a basic factor in cosmology. The implications of hierarchical models of the universe are considered. Bibliography. (LC)

  14. The Estimation Theory Framework of Data Assimilation

    NASA Technical Reports Server (NTRS)

    Cohn, S.; Atlas, Robert (Technical Monitor)

    2002-01-01

    Lecture 1. The Estimation Theory Framework of Data Assimilation: 1. The basic framework: dynamical and observation models; 2. Assumptions and approximations; 3. The filtering, smoothing, and prediction problems; 4. Discrete Kalman filter and smoother algorithms; and 5. Example: A retrospective data assimilation system
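A scalar sketch of the discrete Kalman filter named in item 4, for an assumed random-walk state observed in noise (all constants here are hypothetical, chosen only for illustration):

```python
import random

random.seed(0)

def kalman_step(x, P, z, Q=1e-4, R=0.04):
    """One predict/update cycle of a scalar Kalman filter.

    x, P : prior state estimate and its variance
    z    : new observation
    Q, R : assumed process and observation noise variances
    """
    # Prediction: random-walk dynamics (state transition = identity).
    x_pred, P_pred = x, P + Q
    # Update: blend prediction and observation via the Kalman gain.
    K = P_pred / (P_pred + R)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new

truth = 1.0
x, P = 0.0, 1.0  # deliberately poor initial guess
for _ in range(50):
    z = truth + random.gauss(0.0, 0.2)  # noisy observation
    x, P = kalman_step(x, P, z)

print(x, P)  # estimate near the truth, variance greatly reduced
```

The smoother and retrospective assimilation items in the outline extend this same predict/update machinery backward in time.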

  15. Saturation behavior: a general relationship described by a simple second-order differential equation.

    PubMed

    Kepner, Gordon R

    2010-04-13

    The numerous natural phenomena that exhibit saturation behavior, e.g., ligand binding and enzyme kinetics, have been approached, to date, via empirical and particular analyses. This paper presents a mechanism-free, and assumption-free, second-order differential equation, designed only to describe a typical relationship between the variables governing these phenomena. It develops a mathematical model for this relation, based solely on the analysis of the typical experimental data plot and its saturation characteristics. Its utility complements the traditional empirical approaches. For the general saturation curve, described in terms of its independent (x) and dependent (y) variables, a second-order differential equation is obtained that applies to any saturation phenomena. It shows that the driving factor for the basic saturation behavior is the probability of the interactive site being free, which is described quantitatively. Solving the equation relates the variables in terms of the two empirical constants common to all these phenomena, the initial slope of the data plot and the limiting value at saturation. A first-order differential equation for the slope emerged that led to the concept of the effective binding rate at the active site and its dependence on the calculable probability the interactive site is free. These results are illustrated using specific cases, including ligand binding and enzyme kinetics. This leads to a revised understanding of how to interpret the empirical constants, in terms of the variables pertinent to the phenomenon under study. The second-order differential equation revealed the basic underlying relations that describe these saturation phenomena, and the basic mathematical properties of the standard experimental data plot. It was shown how to integrate this differential equation, and define the common basic properties of these phenomena. 
The results regarding the importance of the slope and the new perspectives on the empirical constants governing the behavior of these phenomena led to an alternative perspective on saturation behavior kinetics. Their essential commonality was revealed by this analysis, based on the second-order differential equation.
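A sketch of the kind of curve the paper analyzes, written directly in terms of its two empirical constants, the initial slope m and the limiting value L. The specific hyperbolic form below is an assumption on my part, standard for ligand binding and Michaelis-Menten kinetics, not the paper's differential-equation derivation itself:

```python
def saturation(x, m=2.0, L=10.0):
    """Hyperbolic saturation curve with initial slope m and limit L."""
    return (m * x) / (1.0 + m * x / L)

def fraction_free(x, m=2.0, L=10.0):
    """Fraction of the interactive site still free, 1 - y/L."""
    return 1.0 - saturation(x, m, L) / L

# Small x: slope ~ m; large x: y -> L; the free fraction drives both.
print(saturation(1e-6) / 1e-6)   # ~ 2.0 (initial slope)
print(saturation(1e6))           # ~ 10.0 (limiting value)
```

Note how the curve is fully determined by m and L, matching the paper's claim that these two empirical constants are common to all such saturation phenomena.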

  16. Relationship between Organizational Culture and the Use of Psychotropic Medicines in Nursing Homes: A Systematic Integrative Review.

    PubMed

    Sawan, Mouna; Jeon, Yun-Hee; Chen, Timothy F

    2018-03-01

    Psychotropic medicines are commonly used in nursing homes, despite marginal clinical benefits and association with harm in the elderly. Organizational culture is proposed as a factor explaining the high-level use of psychotropic medicines. Schein describes three levels of culture: artifacts, espoused values, and basic assumptions. This integrative review aimed to investigate the facets and role of organizational culture in the use of psychotropic medicines in nursing homes. Five databases were searched for qualitative, quantitative, and mixed method empirical studies up to 13 February 2017. Articles were included if they examined an aspect of organizational culture according to Schein's theory and the use of psychotropic medicines in nursing homes for the management of behavioral and sleep disturbances in residents. Article screening and data extraction were performed independently by one reviewer and checked by the research team. The integrative review method, an approach similar to the method of constant comparison analysis, was utilized for data analysis. Twenty-four studies met the inclusion criteria: 13 used quantitative methods, 9 used qualitative methods, 1 was quasi-qualitative, and 1 used mixed methods. Included studies were found to address only two aspects of organizational culture in relation to the use of psychotropic medicines: artifacts and espoused values. No studies addressed the basic assumptions, the unsaid taken-for-granted beliefs, which provide explanations for consistencies or inconsistencies between the ideal and actual use of psychotropic medicines. Previous studies suggest that organizational culture influences the use of psychotropic medicines in nursing homes; however, what is known is descriptive of culture only at the surface level, that is, the artifacts and espoused values. Hence, future research that explains the impact of the basic assumptions of culture on the use of psychotropic medicines is important.

  17. Estimating the Basic Reproductive Number for African Swine Fever Using the Ukrainian Historical Epidemic of 1977.

    PubMed

    Korennoy, F I; Gulenkin, V M; Gogin, A E; Vergne, T; Karaulov, A K

    2017-12-01

    In 1977, Ukraine experienced a local epidemic of African swine fever (ASF) in the Odessa region. A total of 20 settlements were affected during the course of the epidemic, including both large farms and backyard households. Thanks to timely interventions, virus circulation was successfully eradicated within 6 months, with no additional outbreaks. A detailed report of the outbreak investigation has been publicly available since 2014. The report contains quantitative data that allow the ASF spread dynamics during the epidemic to be studied. In our study, we used this historical epidemic to estimate the basic reproductive number of the ASF virus both within and between farms. The basic reproductive number (R0) represents the average number of secondary infections caused by one infectious unit during its infectious period in a susceptible population. Calculations were made under the assumption of exponential initial growth, by fitting an approximating curve to the initial segments of the epidemic curves. R0 within farms and between farms was estimated at 7.46 (95% confidence interval: 5.68-9.21) and 1.65 (1.42-1.88), respectively. The corresponding daily transmission rates were estimated at 1.07 (0.81-1.32) and 0.09 (0.07-0.10). These estimates based on historical data are consistent with those obtained from the recent epidemic currently affecting eastern Europe. Such results contribute to the published knowledge on ASF transmission dynamics under natural conditions and could be used to model and predict the spread of ASF in affected and non-affected regions and to evaluate the effectiveness of different control measures. © 2016 Blackwell Verlag GmbH.
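The exponential-growth estimation step can be sketched as follows, with made-up weekly counts and an assumed infectious period; the simple relation R0 = 1 + r·T is one common SIR-type approximation, not necessarily the exact formula used in the study:

```python
import math

# Hypothetical counts for the roughly exponential initial phase of an
# outbreak (illustrative numbers, not the 1977 Odessa data).
weeks = [0, 1, 2, 3, 4]
cases = [2, 4, 9, 17, 33]

# Least-squares slope of log(cases) vs time gives the growth rate r.
logc = [math.log(c) for c in cases]
n = len(weeks)
xbar, ybar = sum(weeks) / n, sum(logc) / n
r = (sum((x - xbar) * (y - ybar) for x, y in zip(weeks, logc))
     / sum((x - xbar) ** 2 for x in weeks))

T = 1.0            # assumed mean infectious period, in weeks
R0 = 1.0 + r * T   # simple SIR-type approximation
print(r, R0)
```

With these toy numbers the counts roughly double each week, so r is close to ln 2 and the implied R0 is modest; the within-farm estimate of 7.46 in the study reflects a far steeper initial growth.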

  18. Using graph-based assessments within socratic tutorials to reveal and refine students' analytical thinking about molecular networks.

    PubMed

    Trujillo, Caleb; Cooper, Melanie M; Klymkowsky, Michael W

    2012-01-01

    Biological systems, from the molecular to the ecological, involve dynamic interaction networks. To examine student thinking about networks we used graphical responses, since they are relatively easy to evaluate for implied but unarticulated assumptions. Senior college-level molecular biology students were presented with simple molecular-level scenarios; surprisingly, most students failed to articulate the basic assumptions needed to generate reasonable graphical representations, and their graphs often contradicted their explicit assumptions. We then developed a tiered Socratic tutorial characterized by leading questions (prompts) designed to provoke metacognitive reflection. When applied in a group or individual setting, there was clear improvement in targeted areas. Our results highlight the promise of using graphical responses and Socratic prompts in a tutorial context as both a formative assessment for students and an informative feedback system for instructors. Copyright © 2011 Wiley Periodicals, Inc.
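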

  19. Analysis of a general SIS model with infective vectors on the complex networks

    NASA Astrophysics Data System (ADS)

    Juang, Jonq; Liang, Yu-Hao

    2015-11-01

    A general SIS model with infective vectors on complex networks is studied in this paper. In particular, the model considers the linear combination of three possible routes of disease propagation between infected and susceptible individuals as well as two possible transmission types which describe how the susceptible vectors attack the infected individuals. A new technique based on the basic reproduction matrix is introduced to obtain the following results. First, necessary and sufficient conditions are obtained for the global stability of the model through a unified approach. As a result, we are able to produce the exact basic reproduction number and the precise epidemic thresholds with respect to three spreading strengths, the curing strength or the immunization strength all at once. Second, the monotonicity of the basic reproduction number and the above mentioned epidemic thresholds with respect to all other parameters can be rigorously characterized. Finally, we are able to compare the effectiveness of various immunization strategies under the assumption that the number of persons getting vaccinated is the same for all strategies. In particular, we prove that in the scale-free networks, both targeted and acquaintance immunizations are more effective than uniform and active immunizations and that active immunization is the least effective strategy among those four. We are also able to determine how the vaccine should be used at minimum to control the outbreak of the disease.
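As a sketch of the spectral quantities such threshold results typically involve: the form below, an epidemic threshold at infection-to-cure ratio 1/ρ(A) for the adjacency spectral radius ρ(A), is the standard mean-field SIS result, not necessarily the exact condition derived in this paper for infective vectors:

```python
# Power iteration for the spectral radius of a small adjacency matrix:
# a triangle (nodes 0, 1, 2) with a pendant node 3 attached to node 0.
A = [[0, 1, 1, 1],
     [1, 0, 1, 0],
     [1, 1, 0, 0],
     [1, 0, 0, 0]]
n = len(A)

v = [1.0] * n
rho = 0.0
for _ in range(500):
    w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    rho = max(abs(x) for x in w)       # dominant eigenvalue estimate
    v = [x / rho for x in w]           # renormalize the iterate

# Mean-field SIS epidemic threshold: infection/cure ratio = 1/rho(A).
threshold = 1.0 / rho
print(rho, threshold)
```

Heterogeneous graphs (hubs, as in scale-free networks) inflate ρ(A) and thus depress the threshold, which is why targeted immunization of hubs is so effective in the paper's comparison.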

  20. Thinking in Arithmetic.

    ERIC Educational Resources Information Center

    Resnick, Lauren B.; And Others

    This paper discusses a radically different set of assumptions to improve educational outcomes for disadvantaged students. It is argued that disadvantaged children, when exposed to carefully organized thinking-oriented instruction, can acquire the traditional basic skills in the process of reasoning and solving problems. The paper is presented in…

  1. Measurement of Inequality: The Gini Coefficient and School Finance Studies.

    ERIC Educational Resources Information Center

    Lows, Raymond L.

    1984-01-01

    Discusses application of the "Lorenz Curve" (a graphical representation of the concentration of wealth) with the "Gini Coefficient" (an index of inequality) to measure social inequality in school finance studies. Examines the basic assumptions of these measures and suggests a minor reconception. (MCG)
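The Gini coefficient discussed here can be computed directly from per-district expenditure figures; a minimal sketch with made-up numbers (the rank-weighted formula below is one standard discrete form of the index):

```python
def gini(values):
    """Gini coefficient of non-negative values: 0 means perfect
    equality; (n-1)/n means one unit holds everything."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    # Rank-weighted sum form of the discrete Gini index.
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2.0 * cum) / (n * total) - (n + 1.0) / n

print(gini([5, 5, 5, 5]))    # 0.0  -> perfect equality
print(gini([0, 0, 0, 10]))   # 0.75 -> maximal inequality for n = 4
```

Geometrically, this is twice the area between the Lorenz curve and the line of perfect equality, which is the connection the article draws for school finance data.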

  2. Beyond the Virtues-Principles Debate.

    ERIC Educational Resources Information Center

    Keat, Marilyn S.

    1992-01-01

    Indicates basic ontological assumptions in the virtues-principles debate in moral philosophy, noting Aristotle's and Kant's fundamental ideas about morality and considering a hermeneutic synthesis of theories. The article discusses what acceptance of the synthesis might mean in the theory and practice of moral pedagogy, offering examples of…

  3. The Structuring Principle: Political Socialization and Belief Systems

    ERIC Educational Resources Information Center

    Searing, Donald D.; And Others

    1973-01-01

    Assesses the significance of data on childhood political learning to political theory by testing the "structuring principle," considered one of the central assumptions of political socialization research. This principle asserts that "basic orientations acquired during childhood structure the later learning of specific issue beliefs." The…

  4. The Experience of Disability.

    ERIC Educational Resources Information Center

    Hastings, Elizabeth

    1981-01-01

    The author outlines the experiences of disability and demonstrates that generally unpleasant experiences are the direct result of a basic and false assumption on the part of society. Experiences of the disabled are discussed in areas the author categorizes as exclusion or segregation, deprivation, prejudice, poverty, frustration, and…

  5. Some Remarks on the Theory of Political Education. German Studies Notes.

    ERIC Educational Resources Information Center

    Holtmann, Antonius

    This theoretical discussion explores pedagogical assumptions of political education in West Germany. Three major methodological orientations are discussed: the normative-ontological, empirical-analytical, and dialectical-historical. The author recounts the aims, methods, and basic presuppositions of each of these approaches. Topics discussed…

  6. Assessment of the Natural Environment.

    ERIC Educational Resources Information Center

    Cantrell, Mary Lynn; Cantrell, Robert P.

    1985-01-01

    Basic assumptions of an ecological-behavioral view of assessing behavior disordered students are reviewed along with a proposed method for ecological analysis and specific techniques for measuring ecological variables (such as environmental units, behaviors of significant others, and behavioral expectations). The use of such information in program…

  7. Sherlock Holmes as a Social Scientist.

    ERIC Educational Resources Information Center

    Ward, Veronica; Orbell, John

    1988-01-01

    Presents a way of teaching the scientific method through studying the adventures of Sherlock Holmes. Asserting that Sherlock Holmes used the scientific method to solve cases, the authors construct Holmes' method through excerpts from novels featuring his adventures. Discusses basic assumptions, paradigms, theory building, and testing. (SLM)

  8. Lectures on Dark Matter Physics

    NASA Astrophysics Data System (ADS)

    Lisanti, Mariangela

    Rotation curve measurements from the 1970s provided the first strong indication that a significant fraction of matter in the Universe is non-baryonic. In the intervening years, a tremendous amount of progress has been made on both the theoretical and experimental fronts in the search for this missing matter, which we now know constitutes nearly 85% of the Universe's matter density. This series of lectures provides an introduction to the basics of dark matter physics. It is geared toward the advanced undergraduate or graduate student interested in pursuing research in high-energy physics. The primary goal is to build an understanding of how observations constrain the assumptions that can be made about the astro- and particle physics properties of dark matter. The lectures begin by delineating the basic assumptions that can be inferred about dark matter from rotation curves. A detailed discussion of thermal dark matter follows, motivating Weakly Interacting Massive Particles, as well as lighter-mass alternatives. As an application of these concepts, the phenomenology of direct and indirect detection experiments is discussed in detail.

  9. Testing the basic assumption of the hydrogeomorphic approach to assessing wetland functions.

    PubMed

    Hruby, T

    2001-05-01

    The hydrogeomorphic (HGM) approach for developing "rapid" wetland function assessment methods stipulates that the variables used are to be scaled based on data collected at sites judged to be the best at performing the wetland functions (reference standard sites). A critical step in the process is to choose the least altered wetlands in a hydrogeomorphic subclass to use as a reference standard against which other wetlands are compared. The basic assumption made in this approach is that wetlands judged to have had the least human impact have the highest level of sustainable performance for all functions. The levels at which functions are performed in these least altered wetlands are assumed to be "characteristic" for the subclass and "sustainable." Results from data collected in wetlands in the lowlands of western Washington suggest that the assumption may not be appropriate for this region. Teams developing methods for assessing wetland functions did not find that the least altered wetlands in a subclass had a range of performance levels that could be identified as "characteristic" or "sustainable." Forty-four wetlands in four hydrogeomorphic subclasses (two depressional subclasses and two riverine subclasses) were rated by teams of experts on the severity of their human alterations and on the level of performance of 15 wetland functions. An ordinal scale of 1-5 was used to quantify alterations in water regime, soils, vegetation, buffers, and contributing basin. Performance of functions was judged on an ordinal scale of 1-7. Relatively unaltered wetlands were judged to perform individual functions at levels that spanned all of the seven possible ratings in all four subclasses. The basic assumption of the HGM approach, that the least altered wetlands represent "characteristic" and "sustainable" levels of functioning that are different from those found in altered wetlands, was not confirmed. 
Although the intent of the HGM approach is to use level of functioning as a metric to assess the ecological integrity or "health" of the wetland ecosystem, the metric does not seem to work in western Washington for that purpose.

  10. A generating function approach to HIV transmission with dynamic contact rates

    DOE PAGES

    Romero-Severson, Ethan O.; Meadors, Grant D.; Volz, Erik M.

    2014-04-24

    The basic reproduction number, R0, is often defined as the average number of infections generated by a newly infected individual in a fully susceptible population. The interpretation, meaning, and derivation of R0 are controversial. However, in the context of mean field models, R0 demarcates the epidemic threshold below which the infected population approaches zero in the limit of time. In this manner, R0 has been proposed as a method for understanding the relative impact of public health interventions with respect to disease elimination from a theoretical perspective. The use of R0 is made more complex by both the strong dependency of R0 on the model form and the stochastic nature of transmission. A common assumption in models of HIV transmission that have closed-form expressions for R0 is that a single individual's behavior is constant over time. For this research, we derive expressions for both R0 and the probability of an epidemic in a finite population under the assumption that people periodically change their sexual behavior over time. We illustrate the use of generating functions as a general framework to model the effects of potentially complex assumptions on the number of transmissions generated by a newly infected person in a susceptible population. In conclusion, we find that the relationship between the probability of an epidemic and R0 is not straightforward, but that as the rate of change in sexual behavior increases, both R0 and the probability of an epidemic decrease.
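A branching-process sketch of the epidemic-probability side of this relationship, assuming for illustration a Poisson offspring distribution with mean R0; the paper's generating functions for time-varying sexual behavior are more elaborate than this fixed-rate case:

```python
import math

def epidemic_probability(R0, iters=500):
    """1 - q, where q is the smallest fixed point of the Poisson
    offspring pgf G(s) = exp(R0*(s - 1)); q is the extinction
    probability of the branching process started by one case."""
    q = 0.0
    for _ in range(iters):
        q = math.exp(R0 * (q - 1.0))  # fixed-point iteration q = G(q)
    return 1.0 - q

# Below threshold the outbreak always dies out; above it, a major
# epidemic occurs with probability strictly less than one.
print(epidemic_probability(0.5))  # ~ 0.0
print(epidemic_probability(2.0))  # ~ 0.80
```

Even in this simplest case the epidemic probability is a nonlinear function of R0, consistent with the paper's finding that the two quantities are not related in a straightforward way.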

  12. 7 CFR 1957.2 - Transfer with assumptions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Rural Housing Trust 1987-1, and who are eligible for an FmHA or its successor agency under Public Law 103-354 § 502 loan will be given the same priority by FmHA or its successor agency under Public Law.... FmHA or its successor agency under Public Law 103-354 regulations governing transfers and assumptions...

  13. Adopting Basic Principles of the United Nations Academic Impact Initiative (UNAI): Can Cultural Differences Be Predicted from Value Orientations and Globalization?

    PubMed Central

    Nechtelberger, Andrea; Renner, Walter; Nechtelberger, Martin; Supeková, Soňa Chovanová; Hadjimarkou, Maria; Offurum, Chino; Ramalingam, Panchalan; Senft, Birgit; Redfern, Kylie

    2017-01-01

    The United Nations Academic Impact (UNAI) Initiative has set forth 10 Basic Principles for higher education. In the present study, a 10-item self-report questionnaire measuring personal endorsement of these principles was administered to university and post-graduate students from Austria, China, Cyprus, India, Nigeria, and Slovakia (total N = 976, N = 627 female, mean age 24.7 years, s = 5.7). Starting from the assumptions of Moral Foundations Theory (MFT), we expected that personal attitudes toward the UNAI Basic Principles would be predicted by endorsement of various moral foundations as suggested by MFT and by the individual's degree of globalization. Whereas for the Austrian, Cypriot, and Nigerian sub-samples this assumption was largely confirmed, for the Chinese, Indian, and Slovak sub-samples only small amounts of the variance could be explained by regression models. All six sub-samples differed substantially with regard to their overall questionnaire responses: by five discriminant functions, 83.6% of participants were classified correctly. We conclude that implementation of UNAI principles should adhere closely to the cultural requirements of the respective society and, where necessary, should be accompanied by thorough informational campaigns about UN educational goals. PMID:29180977

  14. Adopting Basic Principles of the United Nations Academic Impact Initiative (UNAI): Can Cultural Differences Be Predicted from Value Orientations and Globalization?

    PubMed

    Nechtelberger, Andrea; Renner, Walter; Nechtelberger, Martin; Supeková, Soňa Chovanová; Hadjimarkou, Maria; Offurum, Chino; Ramalingam, Panchalan; Senft, Birgit; Redfern, Kylie

    2017-01-01

    The United Nations Academic Impact (UNAI) Initiative has set forth 10 Basic Principles for higher education. In the present study, a 10-item self-report questionnaire measuring personal endorsement of these principles was administered to university and post-graduate students from Austria, China, Cyprus, India, Nigeria, and Slovakia (total N = 976, N = 627 female, mean age 24.7 years, s = 5.7). Starting from the assumptions of Moral Foundations Theory (MFT), we expected that personal attitudes toward the UNAI Basic Principles would be predicted by endorsement of various moral foundations as suggested by MFT and by the individual's degree of globalization. Whereas for the Austrian, Cypriot, and Nigerian sub-samples this assumption was largely confirmed, for the Chinese, Indian, and Slovak sub-samples only small amounts of the variance could be explained by regression models. All six sub-samples differed substantially with regard to their overall questionnaire responses: by five discriminant functions, 83.6% of participants were classified correctly. We conclude that implementation of UNAI principles should adhere closely to the cultural requirements of the respective society and, where necessary, should be accompanied by thorough informational campaigns about UN educational goals.

  15. Aspects of fluency in writing.

    PubMed

    Uppstad, Per Henning; Solheim, Oddny Judith

    2007-03-01

    The notion of 'fluency' is most often associated with spoken-language phenomena such as stuttering. The present article investigates the relevance of considering fluency in writing. The basic argument for raising this question is empirical: it follows from a focus on difficulties in written and spoken language as manifestations of different problems, which should be investigated separately on the basis of their symptoms. Key-logging instruments provide new possibilities for the study of writing. The obvious use of this new technology is to study writing as it unfolds in real time, instead of focusing only on aspects of the end product. A more sophisticated application is to exploit the key-logging instrument in order to test basic assumptions of contemporary theories of spelling. The present study is a dictation task involving words and non-words, intended to investigate spelling in nine-year-old pupils with regard to their mastery of the doubling of consonants in Norwegian. In this study, we report on differences with regard to temporal measures between a group of strong writers and a group of poor ones. On the basis of these pupils' writing behavior, the relevance of the concept of 'fluency' in writing is highlighted. The interpretation of the results questions basic assumptions of the cognitive hypothesis about spelling; the article concludes by hypothesizing a different conception of spelling.
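
The temporal measures that key-logging makes possible can be sketched simply: from a sequence of keystroke timestamps one derives inter-key intervals and pause counts. The helper below is a hypothetical illustration of that kind of measure, not the authors' instrument.

```python
def interkey_intervals(timestamps_ms):
    """Inter-key intervals (ms) between successive keystrokes -- a basic
    temporal fluency measure of the kind compared between strong and
    poor writers (illustrative helper, not the study's software)."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def pause_count(timestamps_ms, threshold_ms=2000):
    """Number of within-word pauses exceeding a chosen threshold; the
    2000 ms default is an assumption for illustration only."""
    return sum(1 for iv in interkey_intervals(timestamps_ms) if iv >= threshold_ms)

# A fluent burst followed by one long hesitation:
log = [0, 150, 290, 430, 3100, 3250]
```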

  16. Helicopter Toy and Lift Estimation

    ERIC Educational Resources Information Center

    Shakerin, Said

    2013-01-01

    A $1 plastic helicopter toy (called a Wacky Whirler) can be used to demonstrate lift. Students can make basic measurements of the toy, use reasonable assumptions and, with the lift formula, estimate the lift, and verify that it is sufficient to overcome the toy's weight. (Contains 1 figure.)

  17. The Rural School Principalship: Unique Challenges, Opportunities.

    ERIC Educational Resources Information Center

    Hill, Jonathan

    1993-01-01

    Presents findings based on author's research and experience as principal in California's Mojave Desert. Five basic characteristics distinguish the rural principalship: lack of an assistant principal or other support staff; assumption of other duties, including central office tasks, teaching, or management of another site; less severe student…

  18. Teacher Education: Of the People, by the People, and for the People.

    ERIC Educational Resources Information Center

    Clinton, Hillary Rodham

    1985-01-01

    Effective inservice programs are necessary to ensure that current reforms in education are properly implemented. Inservice programs must meet the needs of both the educational system and educators. Six basic policy assumptions dealing with what is needed in inservice education are discussed. (DF)

  19. School Discipline Disproportionality: Culturally Competent Interventions for African American Males

    ERIC Educational Resources Information Center

    Simmons-Reed, Evette A.; Cartledge, Gwendolyn

    2014-01-01

    Exclusionary policies are practiced widely in schools despite being associated with extremely poor outcomes for culturally and linguistically diverse students, particularly African American males with and without disabilities. This article discusses zero tolerance policies, the related research questioning their basic assumptions, and the negative…

  20. Educational Evaluation: Analysis and Responsibility.

    ERIC Educational Resources Information Center

    Apple, Michael W., Ed.; And Others

    This book presents controversial aspects of evaluation and aims at broadening perspectives and insights in the evaluation field. Chapter 1 criticizes modes of evaluation and the basic rationality behind them and focuses on assumptions that have problematic consequences. Chapter 2 introduces concepts of evaluation and examines methods of grading…

  1. General Nature of Multicollinearity in Multiple Regression Analysis.

    ERIC Educational Resources Information Center

    Liu, Richard

    1981-01-01

    Discusses multiple regression, a very popular statistical technique in the field of education. One of the basic assumptions in regression analysis requires that independent variables in the equation should not be highly correlated. The problem of multicollinearity and some of the solutions to it are discussed. (Author)
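
The multicollinearity diagnosis the record refers to is commonly quantified with variance inflation factors, which can be read off the inverse of the predictor correlation matrix. A minimal sketch, using NumPy and synthetic data (the variable names and cutoff are illustrative, not from the article):

```python
import numpy as np

def vif(X):
    """Variance inflation factors: the diagonal of the inverse of the
    predictor correlation matrix. VIF values far above 1 signal that an
    independent variable is highly correlated with the others."""
    R = np.corrcoef(X, rowvar=False)
    return np.diag(np.linalg.inv(R))

rng = np.random.default_rng(0)
x1 = rng.normal(size=500)
x2 = x1 + 0.05 * rng.normal(size=500)   # nearly collinear with x1
x3 = rng.normal(size=500)               # independent predictor
vifs = vif(np.column_stack([x1, x2, x3]))
# vifs[0] and vifs[1] are large; vifs[2] stays near 1
```

Common remedies mentioned in this literature include dropping or combining the correlated predictors, or using ridge-type estimators.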

  2. Feminism, Communication and the Politics of Knowledge.

    ERIC Educational Resources Information Center

    Gallagher, Margaret

    Recent retrieval of pre-nineteenth century feminist thought provides a telling lesson in the politics of knowledge creation and control. From a feminist perspective, very little research carried out within the critical research paradigm questions the "basic assumptions, conventional wisdom, media myths and the accepted way of doing…

  3. A Neo-Kohlbergian Approach to Morality Research.

    ERIC Educational Resources Information Center

    Rest, James R.; Narvaez, Darcia; Thoma, Stephen J.; Bebeau, Muriel J.

    2000-01-01

    Proposes a model of moral judgment that builds on Lawrence Kohlberg's core assumptions. Addresses the concerns that have surfaced related to Kohlberg's work in moral judgment. Presents an overview of this model using Kohlberg's basic starting points, ideas from cognitive science, and developments in moral philosophy. (CMK)

  4. Reconciling Time, Space and Function: A New Dorsal-Ventral Stream Model of Sentence Comprehension

    ERIC Educational Resources Information Center

    Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias

    2013-01-01

    We present a new dorsal-ventral stream framework for language comprehension which unifies basic neurobiological assumptions (Rauschecker & Scott, 2009) with a cross-linguistic neurocognitive sentence comprehension model (eADM; Bornkessel & Schlesewsky, 2006). The dissociation between (time-dependent) syntactic structure-building and…

  5. Qualitative Research in Counseling Psychology: Conceptual Foundations

    ERIC Educational Resources Information Center

    Morrow, Susan L.

    2007-01-01

    Beginning with calls for methodological diversity in counseling psychology, this article addresses the history and current state of qualitative research in counseling psychology. It identifies the historical and disciplinary origins as well as basic assumptions and underpinnings of qualitative research in general, as well as within counseling…

  6. Three regularities of recognition memory: the role of bias.

    PubMed

    Hilford, Andrew; Maloney, Laurence T; Glanzer, Murray; Kim, Kisok

    2015-12-01

    A basic assumption of Signal Detection Theory is that decisions are made on the basis of likelihood ratios. In a preceding paper, Glanzer, Hilford, and Maloney (Psychonomic Bulletin & Review, 16, 431-455, 2009) showed that the likelihood ratio assumption implies that three regularities will occur in recognition memory: (1) the Mirror Effect, (2) the Variance Effect, and (3) the normalized Receiver Operating Characteristic (z-ROC) Length Effect. The paper offered formal proofs and computational demonstrations that decisions based on likelihood ratios produce the three regularities. A survey of data based on group ROCs from 36 studies validated the likelihood ratio assumption by showing that its three implied regularities are ubiquitous. The study noted, however, that bias, another basic factor in Signal Detection Theory, can obscure the Mirror Effect. In this paper we examine how bias affects the regularities at the theoretical level. The theoretical analysis shows: (1) how bias obscures the Mirror Effect, but not the other two regularities, and (2) four ways to counter that obscuring. We then report the results of five experiments that support the theoretical analysis. The analyses and the experimental results also demonstrate: (1) that the three regularities govern individual, as well as group, performance, (2) that alternative explanations of the regularities are ruled out, and (3) that Signal Detection Theory, correctly applied, gives a simple and unified explanation of recognition memory data.
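
The Mirror Effect under a likelihood-ratio decision rule can be demonstrated with the standard equal-variance Gaussian model: with the criterion fixed at likelihood ratio 1, a stronger condition (larger d') yields both more hits and fewer false alarms. A minimal sketch under those textbook assumptions (not the authors' experimental analysis):

```python
from statistics import NormalDist

def rates(d_prime, shift=0.0):
    """Hit and false-alarm rates for equal-variance Gaussian SDT.
    With a likelihood-ratio criterion of 1 (log-LR = 0), the decision
    boundary sits at d'/2 on the evidence axis; `shift` moves it to
    model bias (illustrative parameterization)."""
    boundary = d_prime / 2.0 + shift
    noise = NormalDist(0.0, 1.0)
    signal = NormalDist(d_prime, 1.0)
    hit = 1.0 - signal.cdf(boundary)
    false_alarm = 1.0 - noise.cdf(boundary)
    return hit, false_alarm

# Mirror pattern: strong condition beats weak on hits AND false alarms.
hit_strong, fa_strong = rates(2.0)
hit_weak, fa_weak = rates(1.0)
```

Adding a common additive `shift` to both conditions is one simple way to see how bias can mask the mirror pattern in raw rates while leaving the underlying distributions unchanged.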

  7. Realistic absorption coefficient of each individual film in a multilayer architecture

    NASA Astrophysics Data System (ADS)

    Cesaria, M.; Caricato, A. P.; Martino, M.

    2015-02-01

    A spectrophotometric strategy, termed the multilayer-method (ML-method), is presented and discussed to realistically calculate the absorption coefficient of each individual layer embedded in multilayer architectures without reverse engineering, numerical refinements, or assumptions about layer homogeneity and thickness. The strategy extends in a non-straightforward way a consolidated route, already published by the authors and here termed the basic-method, able to accurately characterize an absorbing film covering a transparent substrate. The ML-method inherently accounts for non-measurable contributions of the interfaces (including multiple reflections), describes the specific film structure as determined by the multilayer architecture and the deposition approach and parameters used, exploits simple mathematics, and has a wide range of applicability (high-to-weak absorption regions, thick-to-ultrathin films). Reliability tests are performed on films and multilayers based on a well-known material (indium tin oxide) by deliberately changing the film structural quality through doping, thickness-tuning and an underlying supporting-film. Results are found to be consistent with information obtained by standard (optical and structural) analysis, the basic-method and band gap values reported in the literature. The example applications discussed demonstrate the ability of the ML-method to overcome the drawbacks commonly limiting an accurate description of multilayer architectures.
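
For context, the textbook single-film baseline that such methods refine is the Beer-Lambert estimate, which ignores exactly the interface and multiple-reflection contributions the ML-method is designed to capture. A minimal sketch of that baseline (illustrative only; it is not the ML-method itself):

```python
import math

def absorption_coefficient(transmittance, thickness_cm):
    """Naive single-film estimate, alpha = -ln(T) / d (Beer-Lambert).
    Neglects reflection losses at interfaces and multiple reflections,
    which is why multilayer stacks need a more careful treatment."""
    if not 0.0 < transmittance <= 1.0:
        raise ValueError("transmittance must be in (0, 1]")
    return -math.log(transmittance) / thickness_cm

# A film 100 nm (1e-5 cm) thick transmitting 90% of the light:
alpha = absorption_coefficient(0.90, 1e-5)  # in cm^-1
```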

  8. Plant uptake of elements in soil and pore water: field observations versus model assumptions.

    PubMed

    Raguž, Veronika; Jarsjö, Jerker; Grolander, Sara; Lindborg, Regina; Avila, Rodolfo

    2013-09-15

    Contaminant concentrations in various edible plant parts transfer hazardous substances from polluted areas to animals and humans. Thus, the accurate prediction of plant uptake of elements is of significant importance. The processes involved contain many interacting factors and are, as such, complex. In contrast, the most common way to currently quantify element transfer from soils into plants is relatively simple, using an empirical soil-to-plant transfer factor (TF). This practice is based on theoretical assumptions that have been previously shown to not generally be valid. Using field data on concentrations of 61 basic elements in spring barley, soil and pore water at four agricultural sites in mid-eastern Sweden, we quantify element-specific TFs. Our aim is to investigate to which extent observed element-specific uptake is consistent with TF model assumptions and to which extent TFs can be used to predict observed differences in concentrations between different plant parts (root, stem and ear). Results show that for most elements, plant-ear concentrations are not linearly related to bulk soil concentrations, which is congruent with previous studies. This behaviour violates a basic TF model assumption of linearity. However, substantially better linear correlations are found when weighted average element concentrations in whole plants are used for TF estimation. The highest number of linearly-behaving elements was found when relating average plant concentrations to soil pore-water concentrations. In contrast to other elements, essential elements (micronutrients and macronutrients) exhibited relatively small differences in concentration between different plant parts. Generally, the TF model was shown to work reasonably well for micronutrients, whereas it did not for macronutrients. The results also suggest that plant uptake of elements from sources other than the soil compartment (e.g. from air) may be non-negligible. Copyright © 2013 Elsevier Ltd. All rights reserved.
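
The TF model being tested is simply a linearity assumption: C_plant = TF × C_soil. A minimal sketch with hypothetical concentrations (the numbers below are invented for illustration, not the study's field data):

```python
def transfer_factor(c_plant, c_soil):
    """Empirical soil-to-plant transfer factor, TF = C_plant / C_soil
    (dimensionless when both concentrations share units, e.g. mg/kg)."""
    return c_plant / c_soil

def predict_plant(tf, c_soil):
    """Linear TF model prediction: C_plant = TF * C_soil. The study
    tests exactly this linearity assumption against field data."""
    return tf * c_soil

# Hypothetical concentrations (mg/kg) for one element at four sites.
soil = [12.0, 25.0, 40.0, 60.0]
plant = [0.6, 1.2, 2.1, 2.9]
tfs = [transfer_factor(p, s) for p, s in zip(plant, soil)]
# If the TF model held exactly, all four ratios would coincide;
# site-to-site spread in tfs is the kind of deviation the paper examines.
```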

  9. Co-Dependency: An Examination of Underlying Assumptions.

    ERIC Educational Resources Information Center

    Myer, Rick A.; And Others

    1991-01-01

    Discusses need for careful examination of codependency as diagnostic category. Critically examines assumptions that codependency is disease, addiction, or predetermined by the environment. Discusses implications of assumptions. Offers recommendations for mental health counselors focusing on need for systematic research, redirection of efforts to…

  10. Quality Control and Nondestructive Evaluation Techniques for Composites. Part 2. Physiochemical Characterization Techniques - A State-of-the Art Review

    DTIC Science & Technology

    1983-05-01

    in the presence of fillers or without it. The basic assumption made is that the heat of reaction is proportional to the extent of the reaction... disperse the radiation... a scanning mechanism will isolate the frequency range falling on the detector. In this manner, the spectrum... There are two basic

  11. Production process stability - core assumption of INDUSTRY 4.0 concept

    NASA Astrophysics Data System (ADS)

    Chromjakova, F.; Bobak, R.; Hrusecka, D.

    2017-06-01

    Today’s industrial enterprises are confronted, when implementing the INDUSTRY 4.0 concept, with a basic problem: stabilised manufacturing and supporting processes. Through this stabilisation they can achieve positive digital management of processes and continuous throughput. Structural stability of horizontal (business) and vertical (digitised) manufacturing processes is required, supported by the digitalised technologies of the INDUSTRY 4.0 concept. The results presented in this paper are based on research and a survey carried out in several industrial companies. A basic model for structural process stabilisation in a manufacturing environment is described.

  12. Network resilience in the face of health system reform.

    PubMed

    Sheaff, Rod; Benson, Lawrence; Farbus, Lou; Schofield, Jill; Mannion, Russell; Reeves, David

    2010-03-01

    Many health systems now use networks as governance structures. Network 'macroculture' is the complex of artefacts, espoused values and unarticulated assumptions through which network members coordinate network activities. Knowledge of how network macroculture develops is therefore of value for understanding how health networks operate, how health system reforms affect them, and how networks function (and can be used) as governance structures. To examine how quasi-market reforms impact upon health networks' macrocultures, we systematically compared longitudinal case studies of these impacts during 2006-2008 across two care networks, a programme network and a user-experience network in the English NHS. We conducted interviews with key informants, focus groups, non-participant observations of meetings and analyses of key documents. We found that in these networks, artefacts adapted to health system reform faster than espoused values did, and the latter adapted faster than basic underlying assumptions. These findings contribute to knowledge by providing empirical support for theories which hold that changes in networks' core practical activity are what stimulate changes in other aspects of network macroculture. The most powerful way of using network macroculture to manage the formation and operation of health networks therefore appears to be by focusing managerial activity on the ways in which networks produce their core artefacts. 2009 Elsevier Ltd. All rights reserved.

  13. Computation in generalised probabilistic theories

    NASA Astrophysics Data System (ADS)

    Lee, Ciarán M.; Barrett, Jonathan

    2015-08-01

    From the general difficulty of simulating quantum systems using classical systems, and in particular the existence of an efficient quantum algorithm for factoring, it is likely that quantum computation is intrinsically more powerful than classical computation. At present, the best upper bound known for the power of quantum computation is that BQP ⊆ AWPP, where AWPP is a classical complexity class (known to be included in PP, hence PSPACE). This work investigates limits on computational power that are imposed by simple physical, or information-theoretic, principles. To this end, we define a circuit-based model of computation in a class of operationally-defined theories more general than quantum theory, and ask: what is the minimal set of physical assumptions under which the above inclusions still hold? We show that given only an assumption of tomographic locality (roughly, that multipartite states and transformations can be characterized by local measurements), efficient computations are contained in AWPP. This inclusion still holds even without assuming a basic notion of causality (where the notion is, roughly, that probabilities for outcomes cannot depend on future measurement choices). Following Aaronson, we extend the computational model by allowing post-selection on measurement outcomes. Aaronson showed that the corresponding quantum complexity class, PostBQP, is equal to PP. Given only the assumption of tomographic locality, the inclusion in PP still holds for post-selected computation in general theories. Hence in a world with post-selection, quantum theory is optimal for computation in the space of all operational theories. We then consider whether one can obtain relativized complexity results for general theories. It is not obvious how to define a sensible notion of a computational oracle in the general framework that reduces to the standard notion in the quantum case. Nevertheless, it is possible to define computation relative to a ‘classical oracle’. Then, we show there exists a classical oracle relative to which efficient computation in any theory satisfying the causality assumption does not include NP.

  14. Economic Theory and Broadcasting.

    ERIC Educational Resources Information Center

    Bates, Benjamin J.

    Focusing on access to audience through broadcast time, this paper examines the status of research into the economics of broadcasting. The paper first discusses the status of theory in the study of broadcast economics, both as described directly and as it exists in the statement of the basic assumptions generated by prior work and general…

  15. Dewey and Schon: An Analysis of Reflective Thinking.

    ERIC Educational Resources Information Center

    Bauer, Norman J.

    The challenge to the dominance of rationality in educational philosophy presented by John Dewey and Donald Schon is examined in this paper. The paper identifies basic assumptions of their perspective and explains concepts of reflective thinking, which include biography, context of uncertainty, and "not-yet." A model of reflective thought…

  16. Tiedeman's Approach to Career Development.

    ERIC Educational Resources Information Center

    Harren, Vincent A.

    Basic to Tiedeman's approach to career development and decision making is the assumption that one is responsible for one's own behavior because one has the capacity for choice and lives in a world which is not deterministic. Tiedeman, a cognitive-developmental theorist, views continuity of development as internal or psychological while…

  17. Linking Educational Philosophy with Micro-Level Technology: The Search for a Complete Method.

    ERIC Educational Resources Information Center

    Januszewski, Alan

    Traditionally, educational technologists have not been concerned with social or philosophical questions, and the field does not have a basic educational philosophy. Instead, it is dominated by a viewpoint characterized as "technical rationality" or "technicism"; the most important assumption of this viewpoint is that science…

  18. Network Analysis in Comparative Social Sciences

    ERIC Educational Resources Information Center

    Vera, Eugenia Roldan; Schupp, Thomas

    2006-01-01

    This essay describes the pertinence of Social Network Analysis (SNA) for the social sciences in general, and discusses its methodological and conceptual implications for comparative research in particular. The authors first present a basic summary of the theoretical and methodological assumptions of SNA, followed by a succinct overview of its…

  19. Conservatism in America--What Does it Mean for Teacher Education?

    ERIC Educational Resources Information Center

    Dolce, Carl J.

    The current conflict among opposing sets of cultural ideals is illustrated by several interrelated conditions. The conservative phenomenon is more complex than the traditional liberal-conservative dichotomy would suggest. Changes in societal conditions invite a reexamination of basic assumptions across the broad spectrum of political ideology.…

  20. Variable thickness transient ground-water flow model. Volume 1. Formulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reisenauer, A.E.

    1979-12-01

    Mathematical formulation for the variable thickness transient (VTT) model of an aquifer system is presented. The basic assumptions are described. Specific data requirements for the physical parameters are discussed. The boundary definitions and solution techniques of the numerical formulation of the system of equations are presented.

  1. A SYSTEMS ANALYSIS OF SCHOOL BOARD ACTION.

    ERIC Educational Resources Information Center

    Scribner, Jay D.

    The basic assumption of the functional-systems theory is that structures fulfill functions in systems and that subsystems operate separately within any type of structure. Relying mainly on Gabriel Almond's paradigm, the author attempts to determine the usefulness of the functional-systems theory in conducting empirical research of school boards.…

  2. Distance-Based and Distributed Learning: A Decision Tool for Education Leaders.

    ERIC Educational Resources Information Center

    McGraw, Tammy M.; Ross, John D.

    This decision tool presents a progression of data collection and decision-making strategies that can increase the effectiveness of distance-based or distributed learning instruction. A narrative and flow chart cover the following steps: (1) basic assumptions, including purpose of instruction, market scan, and financial resources; (2) needs…

  3. Applying the Principles of Specific Objectivity and of Generalizability to the Measurement of Change.

    ERIC Educational Resources Information Center

    Fischer, Gerhard H.

    1987-01-01

    A natural parameterization and formalization of the problem of measuring change in dichotomous data is developed. Mathematically-exact definitions of specific objectivity are presented, and the basic structures of the linear logistic test model and the linear logistic model with relaxed assumptions are clarified. (SLD)

  4. Validated Test Method 1315: Mass Transfer Rates of Constituents in Monolithic or Compacted Granular Materials Using a Semi-Dynamic Tank Leaching Procedure

    EPA Pesticide Factsheets

    Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  5. Document-Oriented E-Learning Components

    ERIC Educational Resources Information Center

    Piotrowski, Michael

    2009-01-01

    This dissertation questions the common assumption that e-learning requires a "learning management system" (LMS) such as Moodle or Blackboard. Based on an analysis of the current state of the art in LMSs, we come to the conclusion that the functionality of conventional e-learning platforms consists of basic content management and…

  6. Moral Development in Higher Education

    ERIC Educational Resources Information Center

    Liddell, Debora L.; Cooper, Diane L.

    2012-01-01

    In this article, the authors lay out the basic foundational concepts and assumptions that will guide the reader through the chapters to come as the chapter authors explore "how" moral growth can be facilitated through various initiatives on the college campus. This article presents a brief review of the theoretical frameworks that provide the…

  7. Measuring Protein Interactions by Optical Biosensors

    PubMed Central

    Zhao, Huaying; Boyd, Lisa F.; Schuck, Peter

    2017-01-01

    This unit gives an introduction to the basic techniques of optical biosensing for measuring equilibrium and kinetics of reversible protein interactions. Emphasis is given to the description of robust approaches that will provide reliable results with few assumptions. How to avoid the most commonly encountered problems and artifacts is also discussed. PMID:28369667
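
The equilibrium analysis such a unit introduces rests on the standard 1:1 binding model, where fractional occupancy depends only on analyte concentration and the dissociation constant. A minimal sketch under that textbook assumption (not the unit's own protocol):

```python
def equilibrium_binding(conc, kd):
    """Fractional occupancy of a reversible 1:1 protein interaction at
    equilibrium: theta = c / (c + Kd), with conc and Kd in the same
    units. Half-saturation occurs when conc equals Kd."""
    return conc / (conc + kd)

# A titration series spanning the Kd (here arbitrarily set to 1.0 nM):
kd_nm = 1.0
series = [equilibrium_binding(c, kd_nm) for c in (0.1, 1.0, 10.0)]
# occupancy rises from ~9% through 50% to ~91% across the series
```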

  8. A "View from Nowhen" on Time Perception Experiments

    ERIC Educational Resources Information Center

    Riemer, Martin; Trojan, Jorg; Kleinbohl, Dieter; Holzl, Rupert

    2012-01-01

    Systematic errors in time reproduction tasks have been interpreted as a misperception of time and therefore seem to contradict basic assumptions of pacemaker-accumulator models. Here we propose an alternative explanation of this phenomenon based on methodological constraints regarding the direction of time, which cannot be manipulated in…

  9. Teaching Literature: Some Honest Doubts.

    ERIC Educational Resources Information Center

    Rutledge, Donald G.

    1968-01-01

    The possibility that many English teachers take their subject too seriously should be considered. The assumption that literature can to any degree either improve or adversely affect students is doubtful, but the exclusive study of "great literature" in our secondary schools may invite basic reflections too early: a year's steady diet of "King…

  10. East Europe Report, Political, Sociological and Military Affairs, No. 2219

    DTIC Science & Technology

    1983-10-24

    takes place in training booths and classrooms. On the way to warrant officer one must take sociology, Russian, basic construction, materials...polemics. I admit that I like this much more than the obligatory hearty kiss on both cheeks along with, of course, the assumption that polemicists have

  11. Exceptional Children Conference Papers: Behavioral and Emotional Problems.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Arlington, VA.

    Four of the seven conference papers treating behavioral and emotional problems concern the Conceptual Project, an attempt to provide definition and evaluation of conceptual models of the various theories of emotional disturbance and their basic assumptions, and to provide training packages based on these materials. The project is described in…

  12. The Binding Properties of Quechua Suffixes.

    ERIC Educational Resources Information Center

    Weber, David

    This paper sketches an explicitly non-lexicalist application of grammatical theory to Huallaga (Huanuco) Quechua (HgQ). The advantages of applying binding theory to many suffixes that have previously been treated only as objects of the morphology are demonstrated. After an introduction, section 2 outlines basic assumptions about the nature of HgQ…

  13. Validated Test Method 1316: Liquid-Solid Partitioning as a Function of Liquid-to-Solid Ratio in Solid Materials Using a Parallel Batch Procedure

    EPA Pesticide Factsheets

    Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  14. Creating a Healthy Camp Community: A Nurse's Role.

    ERIC Educational Resources Information Center

    Lishner, Kris Miller; Bruya, Margaret Auld

    This book provides an organized, systematic overview of the basic aspects of health program management, nursing practice, and human relations issues in camp nursing. A foremost assumption is that health care in most camps needs improvement. Good health is dependent upon interventions involving social, environmental, and lifestyle factors that…

  15. Fatherless America: Confronting Our Most Urgent Social Problem.

    ERIC Educational Resources Information Center

    Blankenhorn, David

    The United States is rapidly becoming a fatherless society. Fatherlessness is the leading cause of declining child well-being, providing the impetus behind social problems such as crime, domestic violence, and adolescent pregnancy. Challenging the basic assumptions of opinion leaders in academia and in the media, this book debunks the prevailing…

  16. Teaching Strategy: A New Planet.

    ERIC Educational Resources Information Center

    O'Brien, Edward L.

    1998-01-01

    Presents a lesson for middle and secondary school students in which they respond to a hypothetical scenario that enables them to develop a list of basic rights. Expounds that students compare their list of rights to the Universal Declaration of Human Rights in order to explore the assumptions about human rights. (CMK)

  17. Session overview: forest ecosystems

    Treesearch

    John J. Battles; Robert C. Heald

    2004-01-01

    The core assumption of this symposium is that science can provide insight to management. Nowhere is this link more formally established than in regard to the science and management of forest ecosystems. The basic questions addressed are integral to our understanding of nature; the applications of this understanding are crucial to effective stewardship of natural...

  18. A Comprehensive Real-World Distillation Experiment

    ERIC Educational Resources Information Center

    Kazameas, Christos G.; Keller, Kaitlin N.; Luyben, William L.

    2015-01-01

    Most undergraduate mass transfer and separation courses cover the design of distillation columns, and many undergraduate laboratories have distillation experiments. In many cases, the treatment is restricted to simple column configurations and simplifying assumptions are made so as to convey only the basic concepts. In industry, the analysis of a…

  19. Human Praxis: A New Basic Assumption for Art Educators of the Future.

    ERIC Educational Resources Information Center

    Hodder, Geoffrey S.

    1980-01-01

    After analyzing Vincent Lanier's five characteristic roles of art education, the article briefly explains the pedagogy of Paulo Freire, based on human praxis, and applies it to the existing "oppressive" art education system. The article reduces Lanier's roles to resemble a single Freirean model. (SB)

  20. Model-Based Reasoning

    ERIC Educational Resources Information Center

    Ifenthaler, Dirk; Seel, Norbert M.

    2013-01-01

    In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…

  1. Alternate hosts of Blepharipa pratensis (Meigen)

    Treesearch

    Paul A. Godwin; Thomas M. Odell

    1977-01-01

    A current tactic for biological control of the gypsy moth, Lymantria dispar Linnaeus, is to release its parasites in forests susceptible to gypsy moth damage before the gypsy moth arrives. The basic assumption in these anticipatory releases is that the parasites can find and utilize native insects as hosts in the interim. Blepharipa...

  2. Children and Adolescents: Should We Teach Them or Let Them Learn?

    ERIC Educational Resources Information Center

    Rohwer, William D., Jr.

    Research to date has provided too few answers for vital educational questions concerning teaching children or letting them learn. A basic problem is that experimentation usually begins by accepting conventional assumptions about schooling, ignoring experiments that would entail disturbing the ordering of current educational priorities.…

  3. Factors influencing the thermally-induced strength degradation of B/Al composites

    NASA Technical Reports Server (NTRS)

    Dicarlo, J. A.

    1983-01-01

    Literature data related to the thermally-induced strength degradation of B/Al composites were examined in the light of fracture theories based on reaction-controlled fiber weakening. Under the assumption of a parabolic time-dependent growth for the interfacial reaction product, a Griffith-type fracture model was found to yield simple equations whose predictions were in good agreement with data for boron fiber average strength and for B/Al axial fracture strain. The only variables in these equations were the time and temperature of the thermal exposure and an empirical factor related to fiber surface smoothness prior to composite consolidation. Such variables as fiber diameter and aluminum alloy composition were found to have little influence. The basic and practical implications of the fracture model equations are discussed. Previously announced in STAR as N82-24297

  4. How Mean is the Mean?

    PubMed Central

    Speelman, Craig P.; McGann, Marek

    2013-01-01

    In this paper we voice concerns about the uncritical manner in which the mean is often used as a summary statistic in psychological research. We identify a number of implicit assumptions underlying the use of the mean and argue that the fragility of these assumptions should be more carefully considered. We examine some of the ways in which the potential violation of these assumptions can lead us into significant theoretical and methodological error. Illustrations of alternative models of research already extant within Psychology are used to explore less mean-dependent methods of research, and we suggest that a critical assessment of the assumptions underlying the mean's use should play a more explicit role in the process of study design and review. PMID:23888147
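
    The paper's core concern can be illustrated with a small sketch (an assumed example, not taken from the paper): on right-skewed data, the mean drifts away from the typical observation while the median does not.

```python
# Sketch (assumed illustration): how a skewed distribution can make the
# mean a misleading summary statistic. Values are hypothetical reaction
# times: most trials fast, a few very slow outliers.
import random
import statistics

random.seed(1)
times = [random.gauss(300, 20) for _ in range(95)] + [2000, 2500, 3000, 3500, 4000]

mean_rt = statistics.mean(times)
median_rt = statistics.median(times)

# The mean is pulled far above the typical observation by the long tail,
# while the median stays near the bulk of the data.
print(f"mean   = {mean_rt:.0f} ms")
print(f"median = {median_rt:.0f} ms")
```

    Here the implicit assumption (a roughly symmetric distribution around a central value) fails, and the mean no longer describes any "typical" trial.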

  5. Why is it Doing That? - Assumptions about the FMS

    NASA Technical Reports Server (NTRS)

    Feary, Michael; Immanuel, Barshi; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    In the glass cockpit, it's not uncommon to hear exclamations such as "why is it doing that?". Sometimes pilots ask "what were they thinking when they set it this way?" or "why doesn't it tell me what it's going to do next?". Pilots may hold a conceptual model of the automation that is the result of fleet lore, which may or may not be consistent with what the engineers had in mind. But what did the engineers have in mind? In this study, we present some of the underlying assumptions surrounding the glass cockpit. Engineers and designers make assumptions about the nature of the flight task; at the other end, instructor and line pilots make assumptions about how the automation works and how it was intended to be used. These underlying assumptions are seldom recognized or acknowledged. This study is an attempt to explicitly articulate such assumptions to better inform design and training developments. This work is part of a larger project to support training strategies for automation.

  6. A robust two-way semi-linear model for normalization of cDNA microarray data

    PubMed Central

    Wang, Deli; Huang, Jian; Xie, Hehuang; Manzella, Liliana; Soares, Marcelo Bento

    2005-01-01

    Background Normalization is a basic step in microarray data analysis. A proper normalization procedure ensures that the intensity ratios provide meaningful measures of relative expression values. Methods We propose a robust semiparametric method in a two-way semi-linear model (TW-SLM) for normalization of cDNA microarray data. This method does not make the usual assumptions underlying some of the existing methods. For example, it does not assume that: (i) the percentage of differentially expressed genes is small; or (ii) the numbers of up- and down-regulated genes are about the same, as required in the LOWESS normalization method. We conduct simulation studies to evaluate the proposed method and use a real data set from a specially designed microarray experiment to compare the performance of the proposed method with that of the LOWESS normalization approach. Results The simulation results show that the proposed method performs better than the LOWESS normalization method in terms of mean square errors for estimated gene effects. The results of analysis of the real data set also show that the proposed method yields more consistent results between the direct and the indirect comparisons and also can detect more differentially expressed genes than the LOWESS method. Conclusions Our simulation studies and the real data example indicate that the proposed robust TW-SLM method works at least as well as the LOWESS method and works better when the underlying assumptions for the LOWESS method are not satisfied. Therefore, it is a powerful alternative to the existing normalization methods. PMID:15663789
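
    The setting both LOWESS and TW-SLM address can be sketched as follows (a minimal stand-in using a running median, numpy only; the actual LOWESS and TW-SLM procedures are more sophisticated). The sketch assumes, as LOWESS does, that most genes are not differentially expressed, so the smooth trend of the log-ratio M against average intensity A is dye bias to be removed.

```python
# Sketch (assumed minimal form) of intensity-dependent normalization on
# an MA-plot: M = log2(R/G) is corrected by a smooth trend in A.
import numpy as np

rng = np.random.default_rng(7)
n = 2000
A = rng.uniform(6, 14, n)              # average log2 intensity per spot
dye_bias = 0.1 * (A - 10)              # intensity-dependent dye bias
M = dye_bias + rng.normal(0, 0.3, n)   # log2 ratios, mostly non-DE genes

# Crude stand-in for LOWESS: a running median of M over windows in A.
order = np.argsort(A)
window = 101
trend = np.empty(n)
for rank, idx in enumerate(order):
    lo, hi = max(0, rank - window // 2), min(n, rank + window // 2 + 1)
    trend[idx] = np.median(M[order[lo:hi]])

M_norm = M - trend                     # normalized log ratios
print(abs(M_norm.mean()))
```

    When the non-DE assumption fails (e.g., many up-regulated genes), the estimated trend absorbs real signal, which is the failure mode the TW-SLM method is designed to avoid.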

  7. Sliding friction between polymer surfaces: A molecular interpretation

    NASA Astrophysics Data System (ADS)

    Allegra, Giuseppe; Raos, Guido

    2006-04-01

    For two contacting rigid bodies, the friction force F is proportional to the normal load and independent of the macroscopic contact area and relative velocity V (Amontons' law). With two mutually sliding polymer samples, the surface irregularities transmit deformation to the underlying material. Energy loss along the deformation cycles is responsible for the friction force, which now appears to depend strongly on V [see, e.g., N. Maeda et al., Science 297, 379 (2002)]. We base our theoretical interpretation on the assumption that polymer chains are mainly subjected to oscillatory "reptation" along their "tubes." At high deformation frequencies—i.e., with a large sliding velocity V—the internal viscosity due to the rotational energy barriers around chain bonds hinders intramolecular mobility. As a result, energy dissipation and the correlated friction force strongly diminish at large V. Derived from a linear differential equation for chain dynamics, our results are basically consistent with the experimental data by Maeda et al. [Science 297, 379 (2002)] on modified polystyrene. Although the bulk polymer is below Tg, we regard the first few chain layers below the surface to be in the liquid state. In particular, the observed maximum of F vs V is consistent with physically reasonable values of the molecular parameters. As a general result, the ratio F /V is a steadily decreasing function of V, tending to V-2 for large velocities. We evaluate a much smaller friction for a cross-linked polymer under the assumption that the junctions are effectively immobile, also in agreement with the experimental results of Maeda et al. [Science 297, 379 (2002)].

  8. Teaching "Instant Experience" with Graphical Model Validation Techniques

    ERIC Educational Resources Information Center

    Ekstrøm, Claus Thorn

    2014-01-01

    Graphical model validation techniques for linear normal models are often used to check the assumptions underlying a statistical model. We describe an approach to provide "instant experience" in looking at a graphical model validation plot, so it becomes easier to validate if any of the underlying assumptions are violated.
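
    The "instant experience" idea of comparing the real diagnostic against diagnostics simulated under the fitted model can be sketched as follows (an assumed minimal version using residual skewness as the diagnostic; the article works with full validation plots).

```python
# Sketch (assumptions: ordinary least squares, numpy only): compare a
# diagnostic from the real residuals against the same diagnostic under
# data simulated from the fitted model, where normality holds by
# construction.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
X = np.column_stack([np.ones(n), x])
# True errors are right-skewed (exponential), violating normality.
y = 1.0 + 2.0 * x + rng.exponential(1.0, n) - 1.0

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma = resid.std(ddof=2)

def skew(r):
    return float(((r - r.mean()) ** 3).mean() / r.std() ** 3)

obs_skew = skew(resid)

# Residual skewness under the fitted normal model, many simulated fits.
sim_skews = []
for _ in range(500):
    y_sim = X @ beta + rng.normal(0, sigma, n)
    b, *_ = np.linalg.lstsq(X, y_sim, rcond=None)
    sim_skews.append(skew(y_sim - X @ b))

# The observed skewness falls outside the simulated envelope, flagging
# the violated normality assumption.
print(obs_skew, np.quantile(sim_skews, 0.975))
```

    In the graphical version, the real residual plot is hidden among simulated ones; if a student can pick it out, an assumption is likely violated.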

  9. R0 for vector-borne diseases: impact of the assumption for the duration of the extrinsic incubation period.

    PubMed

    Hartemink, Nienke; Cianci, Daniela; Reiter, Paul

    2015-03-01

    Mathematical modeling and notably the basic reproduction number R0 have become popular tools for the description of vector-borne disease dynamics. We compare two widely used methods to calculate the probability of a vector to survive the extrinsic incubation period. The two methods are based on different assumptions for the duration of the extrinsic incubation period; one method assumes a fixed period and the other method assumes a fixed daily rate of becoming infectious. We conclude that the outcomes differ substantially between the methods when the average life span of the vector is short compared to the extrinsic incubation period.
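
    The two assumptions compared can be written down directly (notation assumed here: mu is the vector's daily mortality rate, T the EIP duration in days). A fixed-length EIP gives survival exp(-mu*T); a fixed daily rate 1/T of becoming infectious gives the competing-risks expression (1/T)/((1/T) + mu).

```python
# Sketch (assumed notation): probability that a vector survives the
# extrinsic incubation period (EIP) under the two assumptions compared
# in the paper.
import math

def surv_fixed_period(mu, T):
    """EIP has a fixed duration T: survive T days of mortality mu."""
    return math.exp(-mu * T)

def surv_fixed_rate(mu, T):
    """Becoming infectious at constant daily rate 1/T: competing risks."""
    gamma = 1.0 / T
    return gamma / (gamma + mu)

T = 10.0                 # illustrative EIP of 10 days
for mu in (0.05, 0.2):   # long-lived vs. short-lived vector
    print(mu, surv_fixed_period(mu, T), surv_fixed_rate(mu, T))
```

    With mu = 0.2 (mean lifespan 5 days, shorter than the 10-day EIP) the two assumptions give roughly 0.14 versus 0.33, illustrating the paper's conclusion that outcomes diverge substantially when vector lifespan is short relative to the EIP.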

  10. Likelihood ratio decisions in memory: three implied regularities.

    PubMed

    Glanzer, Murray; Hilford, Andrew; Maloney, Laurence T

    2009-06-01

    We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so. We then show that the regularities appear in data from a range of recognition studies. The analyses and data in our study support the following generalization: Individuals make efficient recognition decisions on the basis of likelihood ratios.
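
    The likelihood-ratio decision axis can be sketched for the simplest case (an assumed equal-variance Gaussian model, not the paper's data): the observer responds "old" when the ratio of the old-item density to the new-item density at the observed strength exceeds 1.

```python
# Sketch (assumed equal-variance Gaussian model): a likelihood-ratio
# decision rule for old/new recognition. The "old" (studied) strength
# distribution sits above the "new" distribution.
import math

def normal_pdf(x, mu, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

mu_new, mu_old = 0.0, 1.0

def log_lr(x):
    """Log likelihood ratio log[f_old(x) / f_new(x)]."""
    return math.log(normal_pdf(x, mu_old) / normal_pdf(x, mu_new))

# For equal variances the log-LR is linear in x and crosses 0 at the
# midpoint between the means, so "respond old iff log_lr(x) > 0" is the
# same as "x > 0.5" here.
for x in (-1.0, 0.5, 2.0):
    print(x, log_lr(x))
```

    Placing the criterion on the likelihood-ratio axis rather than the strength axis is the assumption from which the paper derives the mirror, variance, and z-ROC length regularities.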

  11. Shielding of substations against direct lightning strokes by shield wires

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chowdhuri, P.

    1994-01-01

    A new analysis for shielding outdoor substations against direct lightning strokes by shield wires is proposed. The basic assumption of this proposed method is that any lightning stroke which penetrates the shields will cause damage. The second assumption is that a certain level of risk of failure must be accepted, such as one or two failures per 100 years. The proposed method, using an electrogeometric model, was applied to design shield wires for two outdoor substations: (1) 161-kV/69-kV station, and (2) 500-kV/161-kV station. The results of the proposed method were also compared with the shielding data of two other substations.
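
    The electrogeometric model's core relation can be sketched as follows (the paper's exact constants are not given here; the widely cited Love equation r = 10 * I^0.65, with r in metres and I in kA, is assumed for illustration).

```python
# Sketch (assumed constants): the striking-distance relation at the
# heart of electrogeometric shielding analysis.
def striking_distance(i_ka, k=10.0, exponent=0.65):
    """Striking distance in metres for a stroke of peak current i_ka (kA)."""
    return k * i_ka ** exponent

# Small-current strokes have short striking distances and can slip
# through gaps in the shield geometry, which is why a residual risk of
# failure (e.g., one or two per 100 years) must be accepted.
for i in (5, 10, 30):
    print(i, round(striking_distance(i), 1))
```

    Shield-wire placement then amounts to choosing a geometry whose exposed arcs only admit strokes below a tolerably small current.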

  12. Can organizations benefit from worksite health promotion?

    PubMed Central

    Leviton, L C

    1989-01-01

    A decision-analytic model was developed to project the future effects of selected worksite health promotion activities on employees' likelihood of chronic disease and injury and on employer costs due to illness. The model employed a conservative set of assumptions and a limited five-year time frame. Under these assumptions, hypertension control and seat belt campaigns prevent a substantial amount of illness, injury, and death. Sensitivity analysis indicates that these two programs pay for themselves and under some conditions show a modest savings to the employer. Under some conditions, smoking cessation programs pay for themselves, preventing a modest amount of illness and death. Cholesterol reduction by behavioral means does not pay for itself under these assumptions. These findings imply priorities in prevention for employer and employee alike. PMID:2499556

  13. Blind separation of positive sources by globally convergent gradient search.

    PubMed

    Oja, Erkki; Plumbley, Mark

    2004-09-01

    The instantaneous noise-free linear mixing model in independent component analysis is largely a solved problem under the usual assumption of independent nongaussian sources and full column rank mixing matrix. However, with some prior information on the sources, like positivity, new analysis and perhaps simplified solution methods may yet become possible. In this letter, we consider the task of independent component analysis when the independent sources are known to be nonnegative and well grounded, which means that they have a nonzero pdf in the region of zero. It can be shown that in this case, the solution method is basically very simple: an orthogonal rotation of the whitened observation vector into nonnegative outputs will give a positive permutation of the original sources. We propose a cost function whose minimum coincides with nonnegativity and derive the gradient algorithm under the whitening constraint, under which the separating matrix is orthogonal. We further prove that in the Stiefel manifold of orthogonal matrices, the cost function is a Lyapunov function for the matrix gradient flow, implying global convergence. Thus, this algorithm is guaranteed to find the nonnegative well-grounded independent sources. The analysis is complemented by a numerical simulation, which illustrates the algorithm.
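
    The "basically very simple" solution method can be sketched in two dimensions (a grid search over rotation angles is substituted here for the letter's gradient flow on the Stiefel manifold; sources and mixing matrix are assumed for illustration).

```python
# Sketch (2-D, grid search instead of the gradient algorithm): whiten a
# mixture of nonnegative well-grounded sources, then find the orthogonal
# rotation whose outputs are nonnegative.
import numpy as np

rng = np.random.default_rng(42)
n = 20000
S = rng.exponential(1.0, (2, n))          # nonnegative, well-grounded sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])    # full-rank mixing matrix
X = A @ S

# Whitening matrix from the (centered) covariance, applied to the raw
# data so nonnegativity is preserved up to an orthogonal rotation.
cov = np.cov(X)
vals, vecs = np.linalg.eigh(cov)
V = vecs @ np.diag(vals ** -0.5) @ vecs.T
Z = V @ X

def negativity(theta):
    """Cost: total squared negative part of the rotated outputs."""
    c, s = np.cos(theta), np.sin(theta)
    Y = np.array([[c, -s], [s, c]]) @ Z
    return float(np.sum(np.minimum(Y, 0.0) ** 2))

thetas = np.linspace(0, 2 * np.pi, 720, endpoint=False)
best = min(thetas, key=negativity)
c, s = np.cos(best), np.sin(best)
Y = np.array([[c, -s], [s, c]]) @ Z       # recovered (scaled, permuted) sources
print(negativity(best))
```

    At the minimizing angle the outputs are a positive permutation of the original sources, as the letter's analysis predicts.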

  14. Operant conditioning of autobiographical memory retrieval.

    PubMed

    Debeer, Elise; Raes, Filip; Williams, J Mark G; Craeynest, Miet; Hermans, Dirk

    2014-01-01

    Functional avoidance is considered as one of the key mechanisms underlying overgeneral autobiographical memory (OGM). According to this view OGM is regarded as a learned cognitive avoidance strategy, based on principles of operant conditioning; i.e., individuals learn to avoid the emotionally painful consequences associated with the retrieval of specific negative memories. The aim of the present study was to test one of the basic assumptions of the functional avoidance account, namely that autobiographical memory retrieval can be brought under operant control. Here 41 students were instructed to retrieve personal memories in response to 60 emotional cue words. Depending on the condition, they were punished with an aversive sound for the retrieval of specific or nonspecific memories in an operant conditioning procedure. Analyses showed that the course of memory specificity significantly differed between conditions. After the procedure participants punished for nonspecific memories retrieved significantly more specific memories compared to participants punished for specific memories. However, whereas memory specificity significantly increased in participants punished for specific memories, it did not significantly decrease in participants punished for nonspecific memories. Thus, while our findings indicate that autobiographical memory retrieval can be brought under operant control, they do not support a functional avoidance view on OGM.

  15. Faculty and Student Attitudes about Transfer of Learning

    ERIC Educational Resources Information Center

    Lightner, Robin; Benander, Ruth; Kramer, Eugene F.

    2008-01-01

    Transfer of learning is using previous knowledge in novel contexts. While this is a basic assumption of the educational process, students may not always perceive all the options for using what they have learned in different, novel situations. Within the framework of transfer of learning, this study outlines an attitudinal survey concerning faculty…

  16. New Directions in Teacher Education: Foundations, Curriculum, Policy.

    ERIC Educational Resources Information Center

    Denton, Jon, Ed.; And Others

    This publication includes presentations made at the Aikin-Stinnett Lecture Series and follow-up papers sponsored by the Instructional Research Laboratory at Texas A&M University. The papers in this collection focus upon the basic assumptions and conceptual bases of teacher education and the use of research in providing a foundation for…

  17. Perspective Making: Constructivism as a Meaning-Making Structure for Simulation Gaming

    ERIC Educational Resources Information Center

    Lainema, Timo

    2009-01-01

    Constructivism has recently gained popularity, although it is not a completely new learning paradigm. Much of the work within e-learning, for example, uses constructivism as a reference "discipline" (explicitly or implicitly). However, some of the work done within the simulation gaming (SG) community discusses what the basic assumptions and…

  18. Spiral Growth in Plants: Models and Simulations

    ERIC Educational Resources Information Center

    Allen, Bradford D.

    2004-01-01

    The analysis and simulation of spiral growth in plants integrates algebra and trigonometry in a botanical setting. When the ideas presented here are used in a mathematics classroom/computer lab, students can better understand how basic assumptions about plant growth lead to the golden ratio and how the use of circular functions leads to accurate…
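
    The link from growth assumptions to the golden ratio can be sketched with standard mathematics (an assumed connection, not the article's own code): counting spiral elements in phyllotaxis yields consecutive Fibonacci numbers, whose ratios converge to phi = (1 + sqrt(5)) / 2.

```python
# Sketch: the simplest additive growth assumption (each term is the sum
# of the previous two) drives the ratio of consecutive counts toward
# the golden ratio.
import math

phi = (1 + math.sqrt(5)) / 2

a, b = 1, 1
for _ in range(20):
    a, b = b, a + b

# Ratio of consecutive Fibonacci numbers approaches phi.
print(b / a, phi)
```

    Circular functions then enter when the model places successive elements at a fixed divergence angle of 360 degrees / phi**2 (about 137.5 degrees) around the stem.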

  19. Dynamic Assessment and Its Implications for RTI Models

    ERIC Educational Resources Information Center

    Wagner, Richard K.; Compton, Donald L.

    2011-01-01

    Dynamic assessment refers to assessment that combines elements of instruction for the purpose of learning something about an individual that cannot be learned as easily or at all from conventional assessment. The origins of dynamic assessment can be traced to Thorndike (1924), Rey (1934), and Vygotsky (1962), who shared three basic assumptions.…

  20. Looking for Skinner and Finding Freud

    ERIC Educational Resources Information Center

    Overskeid, Geir

    2007-01-01

    Sigmund Freud and B. F. Skinner are often seen as psychology's polar opposites. It seems this view is fallacious. Indeed, Freud and Skinner had many things in common, including basic assumptions shaped by positivism and determinism. More important, Skinner took a clear interest in psychoanalysis and wanted to be analyzed but was turned down. His…

  1. Student Teachers' Beliefs about the Teacher's Role in Inclusive Education

    ERIC Educational Resources Information Center

    Domovic, Vlatka; Vizek Vidovic, Vlasta; Bouillet, Dejana

    2017-01-01

    The main aim of this research is to examine the basic features of student teachers' professional beliefs about the teacher's role in relation to teaching mainstream pupils and pupils with developmental disabilities. The starting assumption of this analysis is that teacher professional development is largely dependent upon teachers' beliefs about…

  2. Cable in Boston; A Basic Viability Report.

    ERIC Educational Resources Information Center

    Hauben, Jan Ward; And Others

    The viability of urban cable television (CATV) as an economic phenomenon is examined via a case study of its feasibility in Boston, a microcosm of general urban environment. To clarify cable's economics, a unitary concept of viability is used in which all local characteristics, cost assumptions, and growth estimates are structured dynamically as a…

  3. "I Fell off [the Mothering] Track": Barriers to "Effective Mothering" among Prostituted Women

    ERIC Educational Resources Information Center

    Dalla, Rochelle

    2004-01-01

    Ecological theory and basic assumptions for the promotion of effective mothering among low-income and working-poor women are applied in relation to a particularly vulnerable population: street-level prostitution-involved women. Qualitative data from 38 street-level prostituted women shows barriers to effective mothering at the individual,…

  4. Between "Homo Sociologicus" and "Homo Biologicus": The Reflexive Self in the Age of Social Neuroscience

    ERIC Educational Resources Information Center

    Pickel, Andreas

    2012-01-01

    The social sciences rely on assumptions of a unified self for their explanatory logics. Recent work in the new multidisciplinary field of social neuroscience challenges precisely this unproblematic character of the subjective self as basic, well-defined entity. If disciplinary self-insulation is deemed unacceptable, the philosophical challenge…

  5. Fueling a Third Paradigm of Education: The Pedagogical Implications of Digital, Social and Mobile Media

    ERIC Educational Resources Information Center

    Pavlik, John V.

    2015-01-01

    Emerging technologies are fueling a third paradigm of education. Digital, networked and mobile media are enabling a disruptive transformation of the teaching and learning process. This paradigm challenges traditional assumptions that have long characterized educational institutions and processes, including basic notions of space, time, content,…

  6. Using LISREL to Evaluate Measurement Models and Scale Reliability.

    ERIC Educational Resources Information Center

    Fleishman, John; Benson, Jeri

    1987-01-01

    LISREL program was used to examine measurement model assumptions and to assess reliability of Coopersmith Self-Esteem Inventory for Children, Form B. Data on 722 third-sixth graders from over 70 schools in large urban school district were used. LISREL program assessed (1) nature of basic measurement model for scale, (2) scale invariance across…

  7. What Are We Looking For?--Pro Critical Realism in Text Interpretation

    ERIC Educational Resources Information Center

    Siljander, Pauli

    2011-01-01

    A visible role in the theoretical discourses on education has been played in the last couple of decades by the constructivist epistemologies, which have questioned the basic assumptions of realist epistemologies. The increased popularity of interpretative approaches especially has put the realist epistemologies on the defensive. Basing itself on…

  8. The Hidden Reason Behind Children's Misbehavior.

    ERIC Educational Resources Information Center

    Nystul, Michael S.

    1986-01-01

    Discusses hidden reason theory based on the assumptions that: (1) the nature of people is positive; (2) a child's most basic psychological need is involvement; and (3) a child has four possible choices in life (good somebody, good nobody, bad somebody, or severely mentally ill). A three step approach for implementing hidden reason theory is…

  9. 78 FR 26269 - Connect America Fund; High-Cost Universal Service Support

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-06

    ... the model platform, which is the basic framework for the model consisting of key assumptions about the... combination of competitive bidding and a new forward-looking model of the cost of constructing modern multi-purpose networks.'' Using the cost model to ``estimate the support necessary to serve areas where costs...

  10. The Effective Elementary School Principal: Theoretical Bases, Research Findings and Practical Implications.

    ERIC Educational Resources Information Center

    Burnett, I. Emett, Jr.; Pankake, Anita M.

    Although much of the current school reform movement relies on the basic assumption of effective elementary school administration, insufficient effort has been made to synthesize key concepts found in organizational theory and management studies with relevant effective schools research findings. This paper attempts such a synthesis to help develop…

  11. Response: Training Doctoral Students to Be Scientists

    ERIC Educational Resources Information Center

    Pollio, David E.

    2012-01-01

    The purpose of this article is to begin framing doctoral training for a science of social work. This process starts by examining two seemingly simple questions: "What is a social work scientist?" and "How do we train social work scientists?" In answering the first question, some basic assumptions and concepts about what constitutes a "social work…

  12. Adults with Intellectual and Developmental Disabilities and Participation in Decision Making: Ethical Considerations for Professional-Client Practice

    ERIC Educational Resources Information Center

    Lotan, Gurit; Ells, Carolyn

    2010-01-01

    In this article, the authors challenge professionals to re-examine assumptions about basic concepts and their implications in supporting adults with intellectual and developmental disabilities. The authors focus on decisions with significant implications, such as planning transition from school to adult life, changing living environments, and…

  13. A Convergence of Two Cultures in the Implementation of P.L. 94-142.

    ERIC Educational Resources Information Center

    Haas, Toni J.

    The Education for All Handicapped Children Act (PL 94-142) demanded basic changes in the practices, purposes, and institutional structures of schools to accommodate handicapped students, but did not adequately address the differences between general and special educators in expectations, training, or assumptions about the functions of schooling…

  14. From Earth to Space--Advertising Films Created in a Computer-Based Primary School Task

    ERIC Educational Resources Information Center

    Öman, Anne

    2017-01-01

    Today, teachers orchestrate computer-based tasks in software applications in Swedish primary schools. Meaning is made through various modes, and multimodal perspectives on literacy have the basic assumption that meaning is made through many representational and communicational resources. The case study presented in this paper has analysed pupils'…

  15. Child Sexual Abuse: Intervention and Treatment Issues. The User Manual Series.

    ERIC Educational Resources Information Center

    Faller, Kathleen Coulborn

    This manual describes professional practices in intervention and treatment of sexual abuse and discusses how to address the problems of sexually abused children and their families. It makes an assumption that the reader has basic information about sexual abuse. The discussion focuses primarily on the child's guardian as the abuser. The manual…

  16. A Comparative Analysis of Selected Mechanical Aspects of the Ice Skating Stride.

    ERIC Educational Resources Information Center

    Marino, G. Wayne

    This study quantitatively analyzes selected aspects of the skating strides of above-average and below-average ability skaters. Subproblems were to determine how stride length and stride rate are affected by changes in skating velocity, to ascertain whether the basic assumption that stride length accurately approximates horizontal movement of the…

  17. Implementing a Redesign Strategy: Lessons from Educational Change.

    ERIC Educational Resources Information Center

    Basom, Richard E., Jr.; Crandall, David P.

    The effective implementation of school redesign, based on a social systems approach, is discussed in this paper. A basic assumption is that the interdependence of system elements has implications for a complex change process. Seven barriers to redesign and five critical issues for successful redesign strategy are presented. Seven linear steps for…

  18. Civility in Politics and Education. Routledge Studies in Contemporary Philosophy

    ERIC Educational Resources Information Center

    Mower, Deborah, Ed.; Robison, Wade L., Ed.

    2011-01-01

    This book examines the concept of civility and the conditions of civil disagreement in politics and education. Although many assume that civility is merely polite behavior, it functions to aid rational discourse. Building on this basic assumption, the book offers multiple accounts of civility and its contribution to citizenship, deliberative…

  19. Improving Clinical Teaching: The ADN Experience. Pathways to Practice.

    ERIC Educational Resources Information Center

    Haase, Patricia T.; And Others

    Three Florida associate degree in nursing (ADN) demonstration projects of the Nursing Curriculum Project (NCP) are described, and the history of the ADN program and current controversies are reviewed. In 1976, the NCP of the Southern Regional Education Board issued basic assumptions about the role of the ADN graduate, relating them to client…

  20. Development and Validation of a Clarinet Performance Adjudication Scale

    ERIC Educational Resources Information Center

    Abeles, Harold F.

    1973-01-01

    A basic assumption of this study is that there are generally agreed upon performance standards as evidenced by the use of adjudicators for evaluations at contests and festivals. An evaluation instrument was developed to enable raters to measure effectively those aspects of performance that have common standards of proficiency. (Author/RK)

  1. Organize Your School for Improvement

    ERIC Educational Resources Information Center

    Truby, William F.

    2017-01-01

    W. Edwards Deming has suggested 96% of organization performance is a function of the organization's structure. He contends only about 4% of an organization's performance is attributable to the people. This is a fundamental difference as most school leaders work with the basic assumption that 80% of a school's performance is related to staff and…

  2. Training for Basic Skills or Educating Workers?: Changing Conceptions of Workplace Education Programs.

    ERIC Educational Resources Information Center

    Schultz, Katherine

    Although the National Workplace Literacy Program is relatively new, a new orthodoxy of program development based on particular understandings of literacy and learning has emerged. Descriptions of two model workplace education programs are the beginning points for an examination of the assumptions contained in most reports of workplace education…

  3. Appreciative Inquiry: A Model for Organizational Development and Performance Improvement in Student Affairs

    ERIC Educational Resources Information Center

    Elleven, Russell K.

    2007-01-01

    The article examines a relatively new tool to increase the effectiveness of organizations and people. The recent development and background of Appreciative Inquiry (AI) is reviewed. Basic assumptions of the model are discussed. Implications for departments and divisions of student affairs are analyzed. Finally, suggested readings and workshop…

  4. Resegregation in Norfolk, Virginia. Does Restoring Neighborhood Schools Work?

    ERIC Educational Resources Information Center

    Meldrum, Christina; Eaton, Susan E.

    This report reviews school department data and interviews with officials and others involved in the Norfolk (Virginia) school resegregation plan designed to stem White flight and increase parental involvement. The report finds that all the basic assumptions the local community and the court had about the potential benefits of undoing the city's…

  5. An Economic Theory of School Governance.

    ERIC Educational Resources Information Center

    Rada, Roger D.

    Working from the basic assumption that the primary motivation for those involved in school governance is self-interest, this paper develops and discusses 15 hypotheses that form the essential elements of an economic theory of school governance. The paper opens with a review of previous theories of governance and their origins in social science…

  6. The Effectiveness of Ineffectiveness: A New Approach to Assessing Patterns of Organizational Effectiveness.

    ERIC Educational Resources Information Center

    Cameron, Kim S.

    A way to assess and improve organizational effectiveness is discussed, with a focus on factors that inhibit successful organizational performance. The basic assumption is that it is easier, more accurate, and more beneficial for individuals and organizations to identify criteria of ineffectiveness (faults and weaknesses) than to identify criteria…

  7. Validated Test Method 1314: Liquid-Solid Partitioning as a Function of Liquid-Solid Ratio for Constituents in Solid Materials Using An Up-Flow Percolation Column Procedure

    EPA Pesticide Factsheets

    Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  8. Lifeboat Counseling: The Issue of Survival Decisions

    ERIC Educational Resources Information Center

    Dowd, E. Thomas; Emener, William G.

    1978-01-01

    Rehabilitation counseling, as a profession, needs to look at future world possibilities, especially in light of overpopulation, and be aware that the need may arise for adjusting basic assumptions about human life--from the belief that every individual has a right to a meaningful life to the notion of selecting who shall live. (DTT)

  9. Challenges of Adopting Constructive Alignment in Action Learning Education

    ERIC Educational Resources Information Center

    Remneland Wikhamn, Björn

    2017-01-01

    This paper critically examines how the two influential pedagogical approaches of action-based learning and constructive alignment relate to each other, and how they may differ in focus and basic assumptions. From the outset, they are based on similar underpinnings, with the student and the learning outcomes at the center. Drawing from…

  10. Curricular Learning Communities and Unprepared Students: How Faculty Can Provide a Foundation for Success

    ERIC Educational Resources Information Center

    Engstrom, Cathy McHugh

    2008-01-01

    The pedagogical assumptions and teaching practices of learning community models reflect exemplary conditions for learning, so using these models with unprepared students seems desirable and worthy of investigation. This chapter describes the key role of faculty in creating active, integrative learning experiences for students in basic skills…

  11. Education in Conflict and Crisis for National Security.

    ERIC Educational Resources Information Center

    McClelland, Charles A.

    A basic assumption is that the level of conflict within and between nations will escalate over the next 50 years. Trying to "muddle through" using the tools and techniques of organized violence may yield national suicide. Therefore, complex conflict resolution skills need to be developed and used by some part of society to quell disorder…

  12. Textbooks as a Possible Influence on Unscientific Ideas about Evolution

    ERIC Educational Resources Information Center

    Tshuma, Tholani; Sanders, Martie

    2015-01-01

    While school textbooks are assumed to be written for and used by students, it is widely acknowledged that they also serve a vital support function for teachers, particularly in times of curriculum change. A basic assumption is that biology textbooks are scientifically accurate. Furthermore, because of the negative impact of…

  13. A basic review on the inferior alveolar nerve block techniques.

    PubMed

    Khalil, Hesham

    2014-01-01

    The inferior alveolar nerve block is the most common injection technique used in dentistry, and many modifications of the conventional nerve block have recently been described in the literature. Selecting the best technique depends on many factors, including the success rate and the complications associated with the selected technique. Dentists should be aware of the current modifications of the inferior alveolar nerve block techniques in order to choose effectively among them. Some operators may encounter difficulty in identifying the anatomical landmarks that guide the inferior alveolar nerve block and rely instead on assumptions as to where the needle should be positioned. Such assumptions can lead to failure; indeed, the failure rate of the inferior alveolar nerve block has been reported to be 20-25%, which is considered very high. In this basic review, the anatomical details of the inferior alveolar nerve are given together with a description of both the conventional and modified blocking techniques; in addition, an overview of the complications that may result from the application of this important technique is provided.

  14. A basic review on the inferior alveolar nerve block techniques

    PubMed Central

    Khalil, Hesham

    2014-01-01

    The inferior alveolar nerve block is the most common injection technique used in dentistry, and many modifications of the conventional nerve block have recently been described in the literature. Selecting the best technique depends on many factors, including the success rate and the complications associated with the selected technique. Dentists should be aware of the current modifications of the inferior alveolar nerve block techniques in order to choose effectively among them. Some operators may encounter difficulty in identifying the anatomical landmarks that guide the inferior alveolar nerve block and rely instead on assumptions as to where the needle should be positioned. Such assumptions can lead to failure; indeed, the failure rate of the inferior alveolar nerve block has been reported to be 20-25%, which is considered very high. In this basic review, the anatomical details of the inferior alveolar nerve are given together with a description of both the conventional and modified blocking techniques; in addition, an overview of the complications that may result from the application of this important technique is provided. PMID:25886095

  15. Three-class ROC analysis--the equal error utility assumption and the optimality of three-class ROC surface using the ideal observer.

    PubMed

    He, Xin; Frey, Eric C

    2006-08-01

    Previously, we have developed a decision model for three-class receiver operating characteristic (ROC) analysis based on decision theory. The proposed decision model maximizes the expected decision utility under the assumption that incorrect decisions have equal utilities under the same hypothesis (equal error utility assumption). This assumption reduced the dimensionality of the "general" three-class ROC analysis and provided a practical figure-of-merit to evaluate three-class task performance. However, it also limits the generality of the resulting model because the equal error utility assumption does not apply to all clinical three-class decision tasks. The goal of this study was to investigate the optimality of the proposed three-class decision model with respect to several other decision criteria. In particular, besides the maximum expected utility (MEU) criterion used in the previous study, we investigated the maximum-correctness (MC, or minimum-error), maximum likelihood (ML), and Neyman-Pearson (N-P) criteria. We found that, by making assumptions for both the MEU and N-P criteria, all of these decision criteria lead to the previously proposed three-class decision model. As a result, this model maximizes the expected utility under the equal error utility assumption, maximizes the probability of making correct decisions, satisfies the N-P criterion in the sense that it maximizes the sensitivity of one class given the sensitivities of the other two classes, and its ROC surface contains the maximum likelihood decision operating point. Although the proposed three-class ROC analysis model is not optimal in the general sense, owing to the equal error utility assumption, the range of criteria for which it is optimal increases its applicability for evaluating and comparing a range of diagnostic systems.
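
    The expected-utility decision rule described in this abstract can be sketched in a few lines; the utility values and posterior probabilities below are illustrative assumptions, not figures from the study:

```python
import numpy as np

# Utility matrix for a three-class decision (illustrative values, not the
# study's): entry U[d, h] is the utility of deciding class d when the true
# hypothesis is h. Under the equal error utility assumption, the incorrect
# decisions under the same hypothesis (off-diagonal entries in each column)
# share one utility value.
U = np.array([
    [1.0, 0.2, 0.1],
    [0.2, 1.0, 0.1],
    [0.2, 0.2, 1.0],
])

def decide(posteriors, utilities):
    """Pick the class that maximizes expected decision utility."""
    expected = utilities @ posteriors  # expected utility of each decision
    return int(np.argmax(expected))

p = np.array([0.2, 0.5, 0.3])  # posterior probabilities for classes 0..2
print(decide(p, U))  # prints 1
```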

  16. A Note on the Assumption of Identical Distributions for Nonparametric Tests of Location

    ERIC Educational Resources Information Center

    Nordstokke, David W.; Colp, S. Mitchell

    2018-01-01

    Often, when testing for shift in location, researchers will utilize nonparametric statistical tests in place of their parametric counterparts when there is evidence or belief that the assumptions of the parametric test are not met (i.e., normally distributed dependent variables). An underlying and often unattended to assumption of nonparametric…

  17. Use of Positive Pressures to Establish Vulnerability Curves 1

    PubMed Central

    Cochard, Hervé; Cruiziat, Pierre; Tyree, Melvin T.

    1992-01-01

    Loss of hydraulic conductivity occurs in stems when the water in xylem conduits is subjected to sufficiently negative pressure. According to the air-seeding hypothesis, this loss of conductivity occurs when air bubbles are sucked into water-filled conduits through micropores adjacent to air spaces in the stem. Results in this study showed that loss of hydraulic conductivity occurred in stem segments pressurized in a pressure chamber while the xylem water was under positive pressure. Vulnerability curves can be defined as a plot of percentage loss of hydraulic conductivity versus the pressure difference between xylem water and the outside air inducing the loss of conductivity. Vulnerability curves were similar whether loss of conductivity was induced by lowering the xylem water pressure or by raising the external air pressure. These results are consistent with the air-seeding hypothesis of how embolisms are nucleated, but not with the nucleation of embolisms at hydrophobic cracks because the latter requires negative xylem water pressure. The results also call into question some basic underlying assumptions used in the determination of components of tissue water potential using “pressure-volume” analysis. PMID:16652947

  18. Risk-dependent reward value signal in human prefrontal cortex

    PubMed Central

    Tobler, Philippe N.; Christopoulos, George I.; O'Doherty, John P.; Dolan, Raymond J.; Schultz, Wolfram

    2009-01-01

    When making choices under uncertainty, people usually consider both the expected value and risk of each option, and choose the one with the higher utility. Expected value increases the expected utility of an option for all individuals. Risk increases the utility of an option for risk-seeking individuals, but decreases it for risk averse individuals. In 2 separate experiments, one involving imperative (no-choice), the other choice situations, we investigated how predicted risk and expected value aggregate into a common reward signal in the human brain. Blood oxygen level dependent responses in lateral regions of the prefrontal cortex increased monotonically with increasing reward value in the absence of risk in both experiments. Risk enhanced these responses in risk-seeking participants, but reduced them in risk-averse participants. The aggregate value and risk responses in lateral prefrontal cortex contrasted with pure value signals independent of risk in the striatum. These results demonstrate an aggregate risk and value signal in the prefrontal cortex that would be compatible with basic assumptions underlying the mean-variance approach to utility. PMID:19369207

  19. Coevolution at protein complex interfaces can be detected by the complementarity trace with important impact for predictive docking

    PubMed Central

    Madaoui, Hocine; Guerois, Raphaël

    2008-01-01

    Protein surfaces are under significant selection pressure to maintain interactions with their partners throughout evolution. Capturing how selection pressure acts at the interfaces of protein–protein complexes is a fundamental issue with high interest for the structural prediction of macromolecular assemblies. We tackled this issue under the assumption that, throughout evolution, mutations should minimally disrupt the physicochemical compatibility between specific clusters of interacting residues. This constraint drove the development of the so-called Surface COmplementarity Trace in Complex History score (SCOTCH), which was found to discriminate with high efficiency the structure of biological complexes. SCOTCH performances were assessed not only with respect to other evolution-based approaches, such as conservation and coevolution analyses, but also with respect to statistically based scoring methods. Validated on a set of 129 complexes of known structure exhibiting both permanent and transient intermolecular interactions, SCOTCH appears as a robust strategy to guide the prediction of protein–protein complex structures. Of particular interest, it also provides a basic framework to efficiently track how protein surfaces could evolve while keeping their partners in contact. PMID:18511568

  20. Linear models for assessing mechanisms of sperm competition: the trouble with transformations.

    PubMed

    Eggert, Anne-Katrin; Reinhardt, Klaus; Sakaluk, Scott K

    2003-01-01

    Although sperm competition is a pervasive selective force shaping the reproductive tactics of males, the mechanisms underlying different patterns of sperm precedence remain obscure. Parker et al. (1990) developed a series of linear models designed to identify two of the more basic mechanisms: sperm lotteries and sperm displacement; the models can be tested experimentally by manipulating the relative numbers of sperm transferred by rival males and determining the paternity of offspring. Here we show that tests of the model derived for sperm lotteries can result in misleading inferences about the underlying mechanism of sperm precedence because the required inverse transformations may lead to a violation of fundamental assumptions of linear regression. We show that this problem can be remedied by reformulating the model using the actual numbers of offspring sired by each male, and log-transforming both sides of the resultant equation. Reassessment of data from a previous study (Sakaluk and Eggert 1996) using the corrected version of the model revealed that we should not have excluded a simple sperm lottery as a possible mechanism of sperm competition in decorated crickets, Gryllodes sigillatus.
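
    The log-transform remedy described above can be sketched as follows; the sperm and offspring ratios are simulated, not the crickets' data. A fair lottery predicts O2/O1 = S2/S1, so on the log scale the relationship is linear with slope 1 and intercept 0:

```python
import numpy as np

# Hedged sketch of the corrected sperm-lottery test (simulated data, not the
# study's). Regressing log(O2/O1) on log(S2/S1) avoids the inverse
# transformation that can violate linear-regression assumptions.
rng = np.random.default_rng(0)
S_ratio = rng.uniform(0.25, 4.0, size=50)             # sperm-number ratios S2/S1
O_ratio = S_ratio * rng.lognormal(0.0, 0.1, size=50)  # offspring ratios + noise

slope, intercept = np.polyfit(np.log(S_ratio), np.log(O_ratio), 1)
# Under a fair lottery, slope is close to 1 and intercept close to 0.
```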

  1. Reinforcement Learning Trees

    PubMed Central

    Zhu, Ruoqing; Zeng, Donglin; Kosorok, Michael R.

    2015-01-01

    In this paper, we introduce a new type of tree-based method, reinforcement learning trees (RLT), which exhibits significantly improved performance over traditional methods such as random forests (Breiman, 2001) under high-dimensional settings. The innovations are three-fold. First, the new method implements reinforcement learning at each selection of a splitting variable during the tree construction processes. By splitting on the variable that brings the greatest future improvement in later splits, rather than choosing the one with largest marginal effect from the immediate split, the constructed tree utilizes the available samples in a more efficient way. Moreover, such an approach enables linear combination cuts at little extra computational cost. Second, we propose a variable muting procedure that progressively eliminates noise variables during the construction of each individual tree. The muting procedure also takes advantage of reinforcement learning and prevents noise variables from being considered in the search for splitting rules, so that towards terminal nodes, where the sample size is small, the splitting rules are still constructed from only strong variables. Last, we investigate asymptotic properties of the proposed method under basic assumptions and discuss rationale in general settings. PMID:26903687

  2. 10 CFR 436.17 - Establishing energy or water cost data.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... escalation rate assumptions under § 436.14. When energy costs begin to accrue at a later time, subtract the... assumptions under § 436.14. When water costs begin to accrue at a later time, subtract the present value of... Methodology and Procedures for Life Cycle Cost Analyses § 436.17 Establishing energy or water cost data. (a...

  3. Political Assumptions Underlying Pedagogies of National Education: The Case of Student Teachers Teaching 'British Values' in England

    ERIC Educational Resources Information Center

    Sant, Edda; Hanley, Chris

    2018-01-01

    Teacher education in England now requires that student teachers follow practices that do not undermine "fundamental British values" where these practices are assessed against a set of ethics and behaviour standards. This paper examines the political assumptions underlying pedagogical interpretations about the education of national…

  4. Description of bipolar charge transport in polyethylene using a fluid model with a constant mobility: model prediction

    NASA Astrophysics Data System (ADS)

    LeRoy, S.; Segur, P.; Teyssedre, G.; Laurent, C.

    2004-01-01

    We present a conduction model aimed at describing bipolar transport and space charge phenomena in low density polyethylene under dc stress. In the first part we recall the basic requirements for the description of charge transport and charge storage in disordered media with emphasis on the case of polyethylene. A quick review of available conduction models is presented and our approach is compared with these models. Then, the bases of the model are described and related assumptions are discussed. Finally, results on external current, trapped and free space charge distributions, field distribution and recombination rate are presented and discussed, considering a constant dc voltage, a step-increase of the voltage, and a polarization-depolarization protocol for the applied voltage. It is shown that the model is able to describe the general features reported for external current, electroluminescence and charge distribution in polyethylene.

  5. Exploring individual differences in children's mathematical skills: a correlational and dimensional approach.

    PubMed

    Sigmundsson, H; Polman, R C J; Lorås, H

    2013-08-01

    Individual differences in mathematical skills are typically explained by an innate capability to solve mathematical tasks. At the behavioural level this implies a consistent level of mathematical achievement that can be captured by strong relationships between tasks, as well as by a single statistical dimension that underlies performance on all mathematical tasks. To investigate this general assumption, the present study explored interrelations and dimensions of mathematical skills. For this purpose, 68 ten-year-old children from two schools were tested using nine mathematics tasks from the Basic Knowledge in Mathematics Test. Relatively low-to-moderate correlations between the mathematics tasks indicated most tasks shared less than 25% of their variance. There were four principal components, accounting for 70% of the variance in mathematical skill across tasks and participants. The high specificity in mathematical skills was discussed in relation to the principle of task specificity of learning.
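
    The correlational and dimensional approach described above can be sketched with simulated scores (the data here are random, not the study's): pairwise r² gives the variance shared between tasks, and the eigenvalues of the correlation matrix give the principal-component variances:

```python
import numpy as np

# Simulated task scores (random, not the study's data): 68 children, 9 tasks.
rng = np.random.default_rng(1)
scores = rng.normal(size=(68, 9))

R = np.corrcoef(scores, rowvar=False)           # 9x9 task correlation matrix
shared = R[np.triu_indices(9, k=1)] ** 2        # r^2 = shared variance per pair
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]  # principal-component variances
explained = np.cumsum(eigvals) / 9              # cumulative variance explained
n_components = int(np.searchsorted(explained, 0.70) + 1)  # components for >= 70%
```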

  6. Flat Engineered Multichannel Reflectors

    NASA Astrophysics Data System (ADS)

    Asadchy, V. S.; Díaz-Rubio, A.; Tcvetkova, S. N.; Kwon, D.-H.; Elsakka, A.; Albooyeh, M.; Tretyakov, S. A.

    2017-07-01

    Recent advances in engineered gradient metasurfaces have enabled unprecedented opportunities for light manipulation using optically thin sheets, such as anomalous refraction, reflection, or focusing of an incident beam. Here, we introduce a concept of multichannel functional metasurfaces, which are able to control incoming and outgoing waves in a number of propagation directions simultaneously. In particular, we reveal a possibility to engineer multichannel reflectors. Under the assumption of reciprocity and energy conservation, we find that there exist three basic functionalities of such reflectors: specular, anomalous, and retroreflections. Multichannel response of a general flat reflector can be described by a combination of these functionalities. To demonstrate the potential of the introduced concept, we design and experimentally test three different multichannel reflectors: three- and five-channel retroreflectors and a three-channel power splitter. Furthermore, by extending the concept to reflectors supporting higher-order Floquet harmonics, we forecast the emergence of other multichannel flat devices, such as isolating mirrors, complex splitters, and multi-functional gratings.

  7. Involuntary detention and treatment of the mentally ill: China's 2012 Mental Health Law.

    PubMed

    Ding, Chunyan

    2014-01-01

    The long-awaited Mental Health Law of China was passed on 26 October 2012 and took effect on 1 May 2013. Being the first national legislation on mental health, it establishes a basic legal framework to regulate mental health practice and recognizes the fundamental rights of persons with mental disorders. This article focuses on the system of involuntary detention and treatment of the mentally ill under the new law, which is expected to prevent the so-called "Being misidentified as mentally disordered" cases in China. A systematic examination of the new system demonstrates that the Mental Health Law of China implicitly holds two problematic assumptions and does not provide adequate protection of the fundamental rights of the involuntary patients. Administrative enactments and further national legislative efforts are needed to remedy these flaws in the new law. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Psychological needs and the facilitation of integrative processes.

    PubMed

    Ryan, R M

    1995-09-01

    The assumption that there are innate integrative or actualizing tendencies underlying personality and social development is reexamined. Rather than viewing such processes as either nonexistent or as automatic, I argue that they are dynamic and dependent upon social-contextual supports pertaining to basic human psychological needs. To develop this viewpoint, I conceptually link the notion of integrative tendencies to specific developmental processes, namely intrinsic motivation; internalization; and emotional integration. These processes are then shown to be facilitated by conditions that fulfill psychological needs for autonomy, competence, and relatedness, and forestalled within contexts that frustrate these needs. Interactions between psychological needs and contextual supports account, in part, for the domain and situational specificity of motivation, experience, and relative integration. The meaning of psychological needs (vs. wants) is directly considered, as are the relations between concepts of integration and autonomy and those of independence, individualism, efficacy, and cognitive models of "multiple selves."

  9. [The right to food in obesogenic environments: Reflections on the role of health professionals].

    PubMed

    Piaggio, Laura Raquel

    2016-01-01

    Faced with the current obesity epidemic, this article problematizes the way the right to food is often circumscribed to situations of nutritional deficit. It is argued that the right to adequate food is violated in obesogenic environments and that protection of the right requires the establishment of measures to regulate advertising and marketing practices regarding ultra-processed products. The work suggests that the main barriers to the implementation of such measures are the strategies employed by Big Food; among these, strategies that have the scientific community as a target and/or means are highlighted. Certain basic underlying assumptions are identified in the discourse of health professionals that contribute to create a framework of legitimacy regarding the consumption of ultra-processed products. The adoption of an ethical position that is free of conflicts of interest is suggested, so as to advocate for needed regulatory measures of a statutory nature.

  10. On computing stress in polymer systems involving multi-body potentials from molecular dynamics simulation

    NASA Astrophysics Data System (ADS)

    Fu, Yao; Song, Jeong-Hoon

    2014-08-01

    The Hardy stress definition has been restricted to pair potentials and embedded-atom method potentials because of the basic assumptions made in deriving a symmetric microscopic stress tensor. The force decomposition required in the Hardy stress expression becomes obscure for multi-body potentials. In this work, we demonstrate the invariance of the Hardy stress expression for a polymer system modeled with multi-body interatomic potentials, including up to four-atom interactions, by applying a central force decomposition of the atomic force. The balance of momentum is shown to hold theoretically and is tested under various numerical simulation conditions. The validity of momentum conservation justifies extending the Hardy stress expression to multi-body potential systems. The computed Hardy stress is observed to converge to the virial stress of the system with increasing spatial averaging volume. This work provides a feasible and reliable linkage between the atomistic and continuum scales for multi-body potential systems.

  11. The Importance of the Assumption of Uncorrelated Errors in Psychometric Theory

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.; Patelis, Thanos

    2015-01-01

    A critical discussion of the assumption of uncorrelated errors in classical psychometric theory and its applications is provided. It is pointed out that this assumption is essential for a number of fundamental results and underlies the concept of parallel tests, the Spearman-Brown's prophecy and the correction for attenuation formulas as well as…

  12. Under What Assumptions Do Site-by-Treatment Instruments Identify Average Causal Effects?

    ERIC Educational Resources Information Center

    Reardon, Sean F.; Raudenbush, Stephen W.

    2011-01-01

    The purpose of this paper is to clarify the assumptions that must be met if this--multiple site, multiple mediator--strategy, hereafter referred to as "MSMM," is to identify the average causal effects (ATE) in the populations of interest. The authors' investigation of the assumptions of the multiple-mediator, multiple-site IV model demonstrates…

  13. Keeping Things Simple: Why the Human Development Index Should Not Diverge from Its Equal Weights Assumption

    ERIC Educational Resources Information Center

    Stapleton, Lee M.; Garrod, Guy D.

    2007-01-01

    Using a range of statistical criteria rooted in Information Theory we show that there is little justification for relaxing the equal weights assumption underlying the United Nation's Human Development Index (HDI) even if the true HDI diverges significantly from this assumption. Put differently, the additional model complexity that unequal weights…

  14. Power and Method: Political Activism and Educational Research. Critical Social Thought Series.

    ERIC Educational Resources Information Center

    Gitlin, Andrew, Ed.

    This book scrutinizes some basic assumptions about educational research with the aim that such research may act more powerfully on those persistent and important problems of our schools surrounding issues of race, class, and gender. In particular, the 13 essays in this book examine how power is infused in research by addressing such questions as…

  15. Philosophical Ethnography: Or, How Philosophy and Ethnography Can Live Together in the World of Educational Research

    ERIC Educational Resources Information Center

    Feinberg, Walter

    2006-01-01

    This essay explores a disciplinary hybrid, called here, philosophical ethnography. Philosophical ethnography is a philosophy of the everyday and ethnography in the context of intercultural discourse about coordinating meaning, evaluation, norms and action. Its basic assumption is that in the affairs of human beings truth, justice and beauty are…

  16. The Future of Family Business Education in UK Business Schools

    ERIC Educational Resources Information Center

    Collins, Lorna; Seaman, Claire; Graham, Stuart; Stepek, Martin

    2013-01-01

    Purpose: This practitioner paper aims to question basic assumptions about management education and to argue that a new paradigm is needed for UK business schools which embraces an oft neglected, yet economically vital, stakeholder group, namely family businesses. It seeks to pose the question of why we have forgotten to teach about family business…

  17. Social Maladjustment and Students with Behavioral and Emotional Disorders: Revisiting Basic Assumptions and Assessment Issues

    ERIC Educational Resources Information Center

    Olympia, Daniel; Farley, Megan; Christiansen, Elizabeth; Pettersson, Hollie; Jenson, William; Clark, Elaine

    2004-01-01

    While much of the current focus in special education remains on reauthorization of the Individuals with Disabilities Act of 1997, disparities in the identification of children with serious emotional disorders continue to plague special educators and school psychologists. Several years after the issue of social maladjustment and its relationship to…

  18. Locations of Racism in Education: A Speech Act Analysis of a Policy Chain

    ERIC Educational Resources Information Center

    Arneback, Emma; Quennerstedt, Ann

    2016-01-01

    This article explores how racism is located in an educational policy chain and identifies how its interpretation changes throughout the chain. A basic assumption is that the policy formation process can be seen as a chain in which international, national and local policies are "links"--separate entities yet joined. With Sweden as the…

  19. Pedagogical and Social Climate in School Questionnaire: Factorial Validity and Reliability of the Teacher Version

    ERIC Educational Resources Information Center

    Dimitrova, Radosveta; Ferrer-Wreder, Laura; Galanti, Maria Rosaria

    2016-01-01

    This study evaluated the factorial structure of the Pedagogical and Social Climate in School (PESOC) questionnaire among 307 teachers in Bulgaria. The teacher edition of PESOC consists of 11 scales (i.e., Expectations for Students, Unity Among Teachers, Approach to Students, Basic Assumptions About Students' Ability to Learn, School-Home…

  20. The Education System in Greece. [Revised.]

    ERIC Educational Resources Information Center

    EURYDICE Central Unit, Brussels (Belgium).

    The education policy of the Greek government rests on the basic assumption that effective education is a social goal and that every citizen has a right to an education. A brief description of the Greek education system and of the adjustments made to give practical meaning to the provisions on education in the Constitution is presented in the…

  1. Experiences in Rural Mental Health II: Organizing a Low Budget Program.

    ERIC Educational Resources Information Center

    Hollister, William G.; And Others

    Based on a North Carolina feasibility study (1967-73) which focused on development of a pattern for providing comprehensive mental health services to rural people, this second program guide deals with the organization of a low-budget program. Presenting the basic assumptions utilized in the development of a low-budget program in Franklin and…

  2. Student Achievement in Basic College Mathematics: Its Relationship to Learning Style and Learning Method

    ERIC Educational Resources Information Center

    Gunthorpe, Sydney

    2006-01-01

    From the assumption that students are more successful when the learning method matches their learning style, it follows that developing courses that correlate learning method with learning style would be more successful for students. Albuquerque Technical Vocational Institute (TVI) in New Mexico has attempted to provide students with more…

  3. Reds, Greens, Yellows Ease the Spelling Blues.

    ERIC Educational Resources Information Center

    Irwin, Virginia

    1971-01-01

    This document reports on a color-coding innovation designed to improve the spelling ability of high school seniors. This color-coded system is based on two assumptions: that color will appeal to the students and that there are three principal reasons for misspelling. Two groups were chosen for the experiments. A basic list of spelling demons was…

  4. The Politics and Coverage of Terror: From Media Images to Public Consciousness.

    ERIC Educational Resources Information Center

    Wittebols, James H.

    This paper presents a typology of terrorism which is grounded in how media differentially cover each type. The typology challenges some of the basic assumptions, such as that the media "allow" themselves to be exploited by terrorists and "encourage" terrorism, and the conventional wisdom about the net effects of the media's…

  5. The Past as Prologue: Examining the Consequences of Business as Usual. Center Paper 01-93.

    ERIC Educational Resources Information Center

    Jones, Dennis P.; And Others

    This study examined the ability of California to meet increased demand for postsecondary education without significantly altering the basic historical assumptions and policies that have governed relations between the state and its institutions of higher learning. Results of a series of analyses that estimated projected enrollments and costs under…

  6. Initial Comparison of Single Cylinder Stirling Engine Computer Model Predictions with Test Results

    NASA Technical Reports Server (NTRS)

    Tew, R. C., Jr.; Thieme, L. G.; Miao, D.

    1979-01-01

    A Stirling engine digital computer model developed at NASA Lewis Research Center was configured to predict the performance of the GPU-3 single-cylinder rhombic drive engine. Revisions to the basic equations and assumptions are discussed. Model predictions are compared with the early results of the Lewis Research Center GPU-3 tests.

  7. Effects of Problem Scope and Creativity Instructions on Idea Generation and Selection

    ERIC Educational Resources Information Center

    Rietzschel, Eric F.; Nijstad, Bernard A.; Stroebe, Wolfgang

    2014-01-01

    The basic assumption of brainstorming is that increased quantity of ideas results in increased generation as well as selection of creative ideas. Although previous research suggests that idea quantity correlates strongly with the number of good ideas generated, quantity has been found to be unrelated to the quality of selected ideas. This article…

  8. Methods of Evaluation To Determine the Preservation Needs in Libraries and Archives: A RAMP Study with Guidelines.

    ERIC Educational Resources Information Center

    Cunha, George M.

    This Records and Archives Management Programme (RAMP) study is intended to assist in the development of basic training programs and courses in document preservation and restoration, and to promote harmonization of such training both within the archival profession and within the broader information field. Based on the assumption that conservation…

  9. The Role of the Social Studies in Public Education.

    ERIC Educational Resources Information Center

    Byrne, T. C.

    This paper was prepared for a social studies curriculum conference in Alberta in June, 1967. It provides a point of view on curriculum building which could be useful in establishing a national service in this field. The basic assumption is that the social studies should in some measure change the behavior of the students (a sharp departure from…

  10. Twisting of thin walled columns perfectly restrained at one end

    NASA Technical Reports Server (NTRS)

    Lazzarino, Lucio

    1938-01-01

    Proceeding from the basic assumptions of the Batho-Bredt theory on twisting failure of thin-walled columns, the discrepancies most frequently encountered are analyzed. A generalized approximate method is suggested for the determination of the disturbances in the stress condition of the column, induced by the constrained warping in one of the end sections.

  11. Adolescent Literacy in Europe--An Urgent Call for Action

    ERIC Educational Resources Information Center

    Sulkunen, Sari

    2013-01-01

    This article focuses on the literacy of the adolescents who, in most European countries, are about to leave or have recently left basic education with the assumption that they have the command of functional literacy as required in and for further studies, citizenship, work life and a fulfilling life as individuals. First, the overall performance…

  12. Is the European (Active) Citizenship Ideal Fostering Inclusion within the Union? A Critical Review

    ERIC Educational Resources Information Center

    Milana, Marcella

    2008-01-01

    This article reviews: (1) the establishment and functioning of EU citizenship; (2) the resulting perception of education for European active citizenship; and (3) the question of its adequacy for enhancing democratic values and practices within the Union. Key policy documents produced by the EU help to unfold the basic assumptions on which…

  13. Improving Child Management Practices of Parents and Teachers. Maxi I Practicum. Final Report.

    ERIC Educational Resources Information Center

    Adreani, Arnold J.; McCaffrey, Robert

    The practicum design reported in this document was based on one basic assumption, that the adult perceptions of children influence adult behavior toward children which in turn influences the child's behavior. Therefore, behavior changes by children could best be effected by changing the adult perception of, and behavior toward, the child.…

  14. Going off the Grid: Re-Examining Technology in the Basic Writing Classroom

    ERIC Educational Resources Information Center

    Clay-Buck, Holly; Tuberville, Brenda

    2015-01-01

    The notion that today's students are constantly exposed to information technology has become so pervasive that it seems the academic conversation assumes students are "tech savvy." The proliferation of apps and smart phones aimed at the traditional college-aged population feeds into this assumption, aided in no small part by a growing…

  15. Network model and short circuit program for the Kennedy Space Center electric power distribution system

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Assumptions made and techniques used in modeling the power network to the 480 volt level are discussed. Basic computational techniques used in the short circuit program are described along with a flow diagram of the program and operational procedures. Procedures for incorporating network changes are included in this user's manual.

  16. Patterns and Policies: The Changing Demographics of Foreign Language Instruction. Issues in Language Program Direction: A Series of Annual Volumes.

    ERIC Educational Resources Information Center

    Liskin-Gasparro, Judith E., Ed.

    This collection of papers is divided into three parts. Part 1, "Changing Patterns: Curricular Implications," includes "Basic Assumptions Revisited: Today's French and Spanish Students at a Large Metropolitan University" (Gail Guntermann, Suzanne Hendrickson, and Carmen de Urioste) and "Le Francais et Mort, Vive le…

  17. Empirical Tests of the Assumptions Underlying Models for Foreign Exchange Rates.

    DTIC Science & Technology

    1984-03-01

    Research Report CCS 481, "Empirical Tests of the Assumptions Underlying Models for Foreign Exchange Rates," by P. Brockett and B. Golany, March 1984. The report applies these tests to U.S. dollar to Japanese yen foreign exchange rates; conclusions and discussion are given in Section VI.

  18. Using the Folstein Mini Mental State Exam (MMSE) to explore methodological issues in cognitive aging research.

    PubMed

    Monroe, Todd; Carter, Michael

    2012-09-01

    Cognitive scales are used frequently in geriatric research and practice. These instruments are constructed with underlying assumptions that are a part of their validation process. A common measurement scale used in older adults is the Folstein Mini Mental State Exam (MMSE). The MMSE was designed to screen for cognitive impairment and is used often in geriatric research. This paper has three aims. Aim one was to explore four potential threats to validity in the use of the MMSE: (1) administering the exam without meeting the underlying assumptions, (2) not reporting that the underlying assumptions were assessed prior to test administration, (3) use of variable and inconsistent cut-off scores for the determination of presence of cognitive impairment, and (4) failure to adjust the scores based on the demographic characteristics of the tested subject. Aim two was to conduct a literature search to determine if the assumptions of (1) education level assessment, (2) sensory assessment, and (3) language fluency were being met and clearly reported in published research using the MMSE. Aim three was to provide recommendations to minimize threats to validity in research studies that use cognitive scales, such as the MMSE. We found inconsistencies in published work in reporting whether or not subjects meet the assumptions that underlie a reliable and valid MMSE score. These inconsistencies can pose threats to the reliability of exam results. Fourteen of the 50 studies reviewed reported inclusion of all three of these assumptions. Inconsistencies in reporting the inclusion of the underlying assumptions for a reliable score could mean that subjects were not appropriate to be tested by use of the MMSE or that an appropriate test administration of the MMSE was not clearly reported. Thus, the research literature could have threats to both validity and reliability based on misuse of or improperly reported use of the MMSE. Six recommendations are provided to minimize these threats in future research.

  19. The Space-Time Conservation Element and Solution Element Method: A New High-Resolution and Genuinely Multidimensional Paradigm for Solving Conservation Laws. 1; The Two Dimensional Time Marching Schemes

    NASA Technical Reports Server (NTRS)

    Chang, Sin-Chung; Wang, Xiao-Yen; Chow, Chuen-Yen

    1998-01-01

    A new high-resolution and genuinely multidimensional numerical method for solving conservation laws is being developed. It was designed to avoid the limitations of the traditional methods and was built from the ground up with extensive physics considerations. Nevertheless, its foundation is mathematically simple enough that one can build from it a coherent, robust, efficient and accurate numerical framework. Two basic beliefs that set the new method apart from the established methods are at the core of its development. The first belief is that, in order to capture physics more efficiently and realistically, the modeling focus should be placed on the original integral form of the physical conservation laws, rather than the differential form. The latter form follows from the integral form under the additional assumption that the physical solution is smooth, an assumption that is difficult to realize numerically in a region of rapid change, such as a boundary layer or a shock. The second belief is that, with proper modeling of the integral and differential forms themselves, the resulting numerical solution should automatically be consistent with the properties derived from the integral and differential forms, e.g., the jump conditions across a shock and the properties of characteristics. Therefore a much simpler and more robust method can be developed by not using the above derived properties explicitly.

  20. Non-stationary hydrologic frequency analysis using B-spline quantile regression

    NASA Astrophysics Data System (ADS)

    Nasri, B.; Bouezmarni, T.; St-Hilaire, A.; Ouarda, T. B. M. J.

    2017-11-01

    Hydrologic frequency analysis is commonly used by engineers and hydrologists to provide the basic information for planning, design and management of hydraulic and water resources systems under the assumption of stationarity. However, with increasing evidence of climate change, the assumption of stationarity, which is a prerequisite for traditional frequency analysis, may no longer hold, and hence the results of conventional analysis would become questionable. In this study, we consider a framework for frequency analysis of extremes based on B-spline quantile regression, which allows data to be modeled in the presence of non-stationarity and/or dependence on covariates with linear and non-linear dependence. A Markov Chain Monte Carlo (MCMC) algorithm was used to estimate quantiles and their posterior distributions. A coefficient of determination and the Bayesian information criterion (BIC) for quantile regression are used in order to select the best model, i.e. for each quantile we choose the degree and number of knots of the adequate B-spline quantile regression model. The method is applied to annual maximum and minimum streamflow records in Ontario, Canada. Climate indices are considered to describe the non-stationarity in the variable of interest and to estimate the quantiles in this case. The results show large differences between the non-stationary quantiles and their stationary equivalents for annual maximum and minimum discharges with high annual non-exceedance probabilities.
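    Quantile regression of the kind described in this abstract estimates a conditional quantile by minimising the check (pinball) loss at level tau. A minimal sketch of that loss, assuming nothing beyond the standard definition (the function name is illustrative, not from the paper):

```python
def pinball_loss(y, y_hat, tau):
    """Average check (pinball) loss at quantile level tau (0 < tau < 1).

    Minimising this loss over a family of curves (B-splines in the
    abstract) yields an estimate of the tau-th conditional quantile:
    under-predictions are weighted by tau, over-predictions by 1 - tau.
    """
    total = 0.0
    for yi, yh in zip(y, y_hat):
        err = yi - yh
        total += tau * err if err >= 0 else (tau - 1) * err
    return total / len(y)
```

    For tau = 0.9, under-predicting an observation by one unit costs 0.9 while over-predicting by one unit costs only 0.1, which is what pushes the fitted curve toward the upper quantile.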

  1. Program to analyze aquifer test data and check for validity with the Jacob method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Field, M.S.

    1993-01-01

    The Jacob straight-line method of aquifer analysis deals with the late-time data and small radius of the Theis type curve, which plot as a straight line if the drawdown data are plotted on an arithmetic scale and the time data on a logarithmic (base 10) scale. Correct analysis with the Jacob method normally assumes that (1) the data lie on a straight line, (2) the value of the dimensionless time factor is less than 0.01, and (3) the site's hydrogeology conforms to the method's assumptions and limiting conditions. Items 1 and 2 are usually considered for the Jacob method, but item 3 is often ignored, which can lead to incorrect calculations of aquifer parameters. A BASIC computer program was developed to analyze aquifer test data with the Jacob method and to test the validity of its use. Aquifer test data are entered into the program and manipulated so that the slope and time intercept of the straight line drawn through the data (excluding early-time and late-time data) can be used to calculate transmissivity and storage coefficient. Late-time data are excluded to eliminate the effects of positive and negative boundaries. The time-drawdown data are then converted into dimensionless units to determine whether the Jacob method's assumptions are valid for the hydrogeologic conditions under which the test was conducted.
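    The slope-and-intercept calculation the abstract describes uses the standard Jacob relations: transmissivity T = 2.3Q/(4*pi*ds) from the drawdown change per log cycle, storage coefficient S = 2.25*T*t0/r^2 from the zero-drawdown intercept, and the validity check u = r^2*S/(4*T*t) < 0.01. A sketch under those formulas; the function signature and units are assumptions, not taken from the OSTI program:

```python
import math

def jacob_straight_line(Q, delta_s, t0, r, t_earliest):
    """Jacob straight-line estimates from a semi-log drawdown plot.

    Q          -- pumping rate
    delta_s    -- drawdown change per log10 cycle of time (fitted slope)
    t0         -- zero-drawdown time intercept of the fitted line
    r          -- radial distance to the observation well
    t_earliest -- earliest time lying on the fitted straight line
    All quantities in consistent units (e.g. metres and seconds).
    """
    T = 2.3 * Q / (4 * math.pi * delta_s)   # transmissivity
    S = 2.25 * T * t0 / r**2                # storage coefficient
    # Validity check from the abstract: the dimensionless time factor
    # u must be < 0.01 over the portion of the data that was fitted.
    u = r**2 * S / (4 * T * t_earliest)
    return T, S, u < 0.01
```

    Returning the boolean alongside the estimates mirrors the program's purpose: the parameter values are meaningless if the dimensionless-time criterion fails for the fitted interval.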

  2. Flood return level analysis of Peaks over Threshold series under changing climate

    NASA Astrophysics Data System (ADS)

    Li, L.; Xiong, L.; Hu, T.; Xu, C. Y.; Guo, S.

    2016-12-01

    Obtaining insights into future flood estimation is of great significance for water planning and management. Traditional flood return level analysis with the stationarity assumption has been challenged by changing environments. A method that takes the nonstationarity context into consideration has been extended to derive flood return levels for Peaks over Threshold (POT) series. With application to POT series, a Poisson distribution is normally assumed to describe the arrival rate of exceedance events, but this distribution assumption has at times been reported as invalid. The Negative Binomial (NB) distribution is therefore proposed as an alternative to the Poisson distribution assumption. Flood return levels were extrapolated in a nonstationarity context for the POT series of the Weihe basin, China under future climate scenarios. The results show that the flood return levels estimated under nonstationarity can differ depending on whether a Poisson or an NB distribution is assumed. The difference is found to be related to the threshold value of the POT series. The study indicates the importance of distribution selection in flood return level analysis under nonstationarity and provides a reference on the impact of climate change on flood estimation in the Weihe basin for the future.
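    The Poisson-versus-NB choice above is commonly screened with a variance-to-mean (dispersion) diagnostic: a Poisson count has index near 1, while an index well above 1 favours the Negative Binomial. A hedged sketch (function names and the method-of-moments fit are generic illustrations, not the paper's procedure):

```python
def dispersion_index(counts):
    """Variance-to-mean ratio of annual exceedance counts.

    Under the Poisson assumption for POT arrival counts the index is
    ~1; values well above 1 (overdispersion) favour the NB alternative.
    """
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return var / mean

def nb_moments(counts):
    """Method-of-moments Negative Binomial fit (size, prob).

    Defined only when the counts are overdispersed (variance > mean);
    otherwise returns None, signalling that Poisson suffices.
    """
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    if var <= mean:
        return None
    size = mean ** 2 / (var - mean)
    prob = mean / var
    return size, prob
```

    In practice one would follow the screen with a formal test or information criterion, but the moment check already shows how the threshold choice (which changes the counts) drives the distribution choice.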

  3. Effect of body size and body mass on δ 13 C and δ 15 N in coastal fishes and cephalopods

    NASA Astrophysics Data System (ADS)

    Vinagre, C.; Máguas, C.; Cabral, H. N.; Costa, M. J.

    2011-11-01

    Carbon and nitrogen isotopes have been widely used in the investigation of trophic relations, energy pathways, trophic levels and migrations, under the assumption that δ 13C is independent of body size and that variation in δ 15N occurs exclusively due to ontogenetic changes in diet and not body size increase per se. However, several studies have shown that these assumptions are uncertain. Data from food-webs containing an important number of species lack theoretical support on these assumptions because very few species have been tested for δ 13C and δ 15N variation in captivity. However, if sampling comprises a wide range of body sizes from various species, the variation of δ 13C and δ 15N with body size can be investigated. While correlation between body size and δ 13C and δ 15N can be due to ontogenetic diet shifts, stability in such values throughout the size spectrum can be considered an indication that δ 13C and δ 15N in muscle tissues of such species is independent of body size within that size range, and thus the basic assumptions can be applied in the interpretation of such food webs. The present study investigated the variation in muscle δ 13C and δ 15N with body size and body mass of coastal fishes and cephalopods. It was concluded that muscle δ 13C and δ 15N did not vary with body size or mass for all bony fishes with only one exception, the dragonet Callionymus lyra. Muscle δ 13C and δ 15N also did not vary with body size or mass in cartilaginous fishes and cephalopods, meaning that body size/mass per se have no effect on δ 13C or δ 15N, for most species analysed and within the size ranges sampled. The assumption that δ 13C is independent of body size and that variation in δ 15N is not affected by body size increase per se was upheld for most organisms and can be applied to the coastal food web studied taking into account that C. lyra is an exception.
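    The independence check described above (does muscle delta-13C or delta-15N drift with body size?) amounts to testing for correlation across the sampled size range. A minimal Pearson correlation sketch, illustrative rather than the study's actual analysis (which would also involve a significance test):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples.

    Values near 0 across a wide body-size range are the kind of
    evidence used to argue that isotope values are size-independent.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

    As the abstract stresses, a near-zero r only supports size-independence within the sampled size range; extrapolating beyond it would reintroduce the untested assumption.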

  4. Are We Ready for Real-world Neuroscience?

    PubMed

    Matusz, Pawel J; Dikker, Suzanne; Huth, Alexander G; Perrodin, Catherine

    2018-06-19

    Real-world environments are typically dynamic, complex, and multisensory in nature and require the support of top-down attention and memory mechanisms for us to be able to drive a car, make a shopping list, or pour a cup of coffee. Fundamental principles of perception and functional brain organization have been established by research utilizing well-controlled but simplified paradigms with basic stimuli. The last 30 years ushered in a revolution in computational power, brain mapping, and signal processing techniques. Drawing on those theoretical and methodological advances, over the years, research has departed more and more from traditional, rigorous, and well-understood paradigms to directly investigate cognitive functions and their underlying brain mechanisms in real-world environments. These investigations typically address the role of one or, more recently, multiple attributes of real-world environments. Fundamental assumptions about perception, attention, or brain functional organization have been challenged by studies adapting the traditional paradigms to emulate, for example, the multisensory nature or varying relevance of stimulation or dynamically changing task demands. Here, we present the state of the field within the emerging heterogeneous domain of real-world neuroscience. To be precise, the aim of this Special Focus is to bring together a variety of the emerging "real-world neuroscientific" approaches. These approaches differ in their principal aims, assumptions, or even definitions of "real-world neuroscience" research. Here, we showcase the commonalities and distinctive features of the different "real-world neuroscience" approaches. To do so, four early-career researchers and the speakers of the Cognitive Neuroscience Society 2017 Meeting symposium under the same title answer questions pertaining to the added value of such approaches in bringing us closer to accurate models of functional brain organization and cognitive functions.

  5. Memory Errors in Alibi Generation: How an Alibi Can Turn Against Us.

    PubMed

    Crozier, William E; Strange, Deryn; Loftus, Elizabeth F

    2017-01-01

    Alibis play a critical role in the criminal justice system. Yet research on the process of alibi generation and evaluation is still nascent. Indeed, similar to other widely investigated psychological phenomena in the legal system - such as false confessions, historical claims of abuse, and eyewitness memory - the basic assumptions underlying alibi generation and evaluation require closer empirical scrutiny. To date, the majority of alibi research investigates the social psychological aspects of the process. We argue that applying our understanding of basic human memory is critical to a complete understanding of the alibi process. Specifically, we challenge the use of alibi inconsistency as an indication of guilt by outlining the "cascading effects" that can put innocents at risk for conviction. We discuss how normal encoding and storage processes can pose problems at retrieval, particularly for innocent suspects, that can result in alibi inconsistencies over time. Those inconsistencies are typically misunderstood as intentional deception, first by law enforcement, affecting the investigation, then by prosecutors, affecting prosecution decisions, and finally by juries, ultimately affecting guilt judgments. Put differently, despite the universal nature of memory inconsistencies, a single error can produce a cascading effect, rendering an innocent individual's alibi, ironically, proof of guilt. Copyright © 2017 John Wiley & Sons, Ltd.

  6. Assumptions of Statistical Tests: What Lies Beneath.

    PubMed

    Jupiter, Daniel C

    We have discussed many statistical tests and tools in this series of commentaries, and while we have mentioned the underlying assumptions of the tests, we have not explored them in detail. We stop to look at some of the assumptions of the t-test and linear regression, justify and explain them, mention what can go wrong when the assumptions are not met, and suggest some solutions in this case. Copyright © 2017 American College of Foot and Ankle Surgeons. Published by Elsevier Inc. All rights reserved.
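    As a concrete illustration of the kind of assumption-checking this commentary describes: a crude equal-variance screen for the two-sample t-test, with Welch's statistic as the standard fallback when the assumption fails. This is a generic sketch, not code from the article; the function names and the ratio-of-4 rule of thumb are illustrative assumptions.

```python
import math

def _sample_var(x):
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / (len(x) - 1)

def variance_ratio(a, b):
    """Ratio of larger to smaller sample variance.

    A common rule of thumb treats ratios above ~4 as a warning that
    the pooled t-test's equal-variance assumption is dubious.
    """
    va, vb = _sample_var(a), _sample_var(b)
    return max(va, vb) / min(va, vb)

def welch_t(a, b):
    """Welch's t statistic, which does not assume equal variances.

    A standard remedy when the equal-variance assumption of the
    pooled two-sample t-test is not met.
    """
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    return (ma - mb) / math.sqrt(_sample_var(a) / na + _sample_var(b) / nb)
```

    The same workflow (state the assumption, screen for it, switch to a robust alternative when it fails) carries over to the normality and linearity assumptions the commentary discusses for regression.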

  7. Improving inference for aerial surveys of bears: The importance of assumptions and the cost of unnecessary complexity.

    PubMed

    Schmidt, Joshua H; Wilson, Tammy L; Thompson, William L; Reynolds, Joel H

    2017-07-01

    Obtaining useful estimates of wildlife abundance or density requires thoughtful attention to potential sources of bias and precision, and it is widely understood that addressing incomplete detection is critical to appropriate inference. When the underlying assumptions of sampling approaches are violated, both increased bias and reduced precision of the population estimator may result. Bear (Ursus spp.) populations can be difficult to sample and are often monitored using mark-recapture distance sampling (MRDS) methods, although obtaining adequate sample sizes can be cost prohibitive. With the goal of improving inference, we examined the underlying methodological assumptions and estimator efficiency of three datasets collected under an MRDS protocol designed specifically for bears. We analyzed these data using MRDS, conventional distance sampling (CDS), and open-distance sampling approaches to evaluate the apparent bias-precision tradeoff relative to the assumptions inherent under each approach. We also evaluated the incorporation of informative priors on detection parameters within a Bayesian context. We found that the CDS estimator had low apparent bias and was more efficient than the more complex MRDS estimator. When combined with informative priors on the detection process, precision was increased by >50% compared to the MRDS approach with little apparent bias. In addition, open-distance sampling models revealed a serious violation of the assumption that all bears were available to be sampled. Inference is directly related to the underlying assumptions of the survey design and the analytical tools employed. We show that for aerial surveys of bears, avoidance of unnecessary model complexity, use of prior information, and the application of open population models can be used to greatly improve estimator performance and simplify field protocols. Although we focused on distance sampling-based aerial surveys for bears, the general concepts we addressed apply to a variety of wildlife survey contexts.

  8. Bartnik’s splitting conjecture and Lorentzian Busemann function

    NASA Astrophysics Data System (ADS)

    Amini, Roya; Sharifzadeh, Mehdi; Bahrampour, Yousof

    2018-05-01

    In 1988 Bartnik posed the splitting conjecture about the cosmological space-time. This conjecture has been proved by several people, with different approaches and by using additional assumptions such as the 'S-ray condition' and the 'level set condition'. It is known that the 'S-ray condition' yields the 'level set condition'. We have proved that the two are indeed equivalent, by giving a different proof under the assumption of the 'level set condition'. In addition, we have shown several properties of the cosmological space-time in the presence of the 'level set condition'. Finally, we provide a proof of the conjecture under a different assumption on the cosmological space-time; we first prove some results, not requiring the timelike convergence condition, which help us to state our proofs.

  9. Validity in work-based assessment: expanding our horizons.

    PubMed

    Govaerts, Marjan; van der Vleuten, Cees P M

    2013-12-01

    Although work-based assessments (WBA) may come closest to assessing habitual performance, their use for summative purposes is not undisputed. Most criticism of WBA stems from approaches to validity consistent with the quantitative psychometric framework. However, there is increasing research evidence that indicates that the assumptions underlying the predictive, deterministic framework of psychometrics may no longer hold. In this discussion paper we argue that the meaningfulness and appropriateness of current validity evidence can be called into question and that we need alternative strategies to assessment and validity inquiry that build on current theories of learning and performance in complex and dynamic workplace settings. Drawing from research in various professional fields we outline key issues within the mechanisms of learning, competence and performance in the context of complex social environments and illustrate their relevance to WBA. In reviewing recent socio-cultural learning theory and research on performance and performance interpretations in work settings, we demonstrate that learning, competence (as inferred from performance) as well as performance interpretations are to be seen as inherently contextualised, and can only be understood 'in situ'. Assessment in the context of work settings may, therefore, be more usefully viewed as a socially situated interpretive act. We propose constructivist-interpretivist approaches towards WBA in order to capture and understand contextualised learning and performance in work settings. Theoretical assumptions underlying interpretivist assessment approaches call for a validity theory that provides the theoretical framework and conceptual tools to guide the validation process in the qualitative assessment inquiry. Basic principles of rigour specific to qualitative research have been established, and they can and should be used to determine validity in interpretivist assessment approaches. If used properly, these strategies generate trustworthy evidence that is needed to develop the validity argument in WBA, allowing for in-depth and meaningful information about professional competence. © 2013 John Wiley & Sons Ltd.

  10. Supply-demand balance in outward-directed networks and Kleiber's law

    PubMed Central

    Painter, Page R

    2005-01-01

    Background: Recent theories have attempted to derive the value of the exponent α in the allometric formula for scaling of basal metabolic rate from the properties of distribution network models for arteries and capillaries. It has recently been stated that a basic theorem relating the sum of nutrient currents to the specific nutrient uptake rate, together with a relationship claimed to be required in order to match nutrient supply to nutrient demand in 3-dimensional outward-directed networks, leads to Kleiber's law (b = 3/4). Methods: The validity of the supply-demand matching principle and the assumptions required to prove the basic theorem are assessed. The supply-demand principle is evaluated by examining the supply term and the demand term in outward-directed lattice models of nutrient and water distribution systems and by applying the principle to fractal-like models of mammalian arterial systems. Results: Application of the supply-demand principle to bifurcating fractal-like networks that are outward-directed does not predict 3/4-power scaling, and evaluation of water distribution system models shows that the matching principle does not match supply to demand in such systems. Furthermore, proof of the basic theorem is shown to require that the covariance of nutrient uptake and current path length is 0, an assumption unlikely to be true in mammalian arterial systems. Conclusion: The supply-demand matching principle does not lead to a satisfactory explanation for the approximately 3/4-power scaling of mammalian basal metabolic rate. PMID:16283939

  11. Supply-demand balance in outward-directed networks and Kleiber's law.

    PubMed

    Painter, Page R

    2005-11-10

    Recent theories have attempted to derive the value of the exponent alpha in the allometric formula for scaling of basal metabolic rate from the properties of distribution network models for arteries and capillaries. It has recently been stated that a basic theorem relating the sum of nutrient currents to the specific nutrient uptake rate, together with a relationship claimed to be required in order to match nutrient supply to nutrient demand in 3-dimensional outward-directed networks, leads to Kleiber's law (b = 3/4). The validity of the supply-demand matching principle and the assumptions required to prove the basic theorem are assessed. The supply-demand principle is evaluated by examining the supply term and the demand term in outward-directed lattice models of nutrient and water distribution systems and by applying the principle to fractal-like models of mammalian arterial systems. Application of the supply-demand principle to bifurcating fractal-like networks that are outward-directed does not predict 3/4-power scaling, and evaluation of water distribution system models shows that the matching principle does not match supply to demand in such systems. Furthermore, proof of the basic theorem is shown to require that the covariance of nutrient uptake and current path length is 0, an assumption unlikely to be true in mammalian arterial systems. The supply-demand matching principle does not lead to a satisfactory explanation for the approximately 3/4-power scaling of mammalian basal metabolic rate.
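    The covariance condition in the basic theorem is simple arithmetic to verify. The sketch below (illustrative values only, not taken from the paper) checks the exact identity sum(u*l) = n*mean(u)*mean(l) + n*cov(u, l), which shows why the theorem's proof needs the covariance of nutrient uptake and current path length to vanish:

    ```python
    import numpy as np

    # Hypothetical per-capillary uptake rates u and current path lengths l;
    # values are illustrative, not taken from the paper.
    rng = np.random.default_rng(0)
    n = 1000
    l = rng.uniform(1.0, 5.0, n)                 # current path lengths
    u = 2.0 + 0.5 * l + rng.normal(0, 0.1, n)    # uptake correlated with path length

    # Exact identity: sum(u*l) = n*mean(u)*mean(l) + n*cov(u, l)  (population covariance)
    lhs = np.sum(u * l)
    cov = np.mean((u - u.mean()) * (l - l.mean()))
    rhs = n * u.mean() * l.mean() + n * cov
    assert np.isclose(lhs, rhs)

    # The simplification sum(u*l) ~ n*mean(u)*mean(l) is exact only when cov = 0;
    # here the positive correlation makes it underestimate the true sum.
    rel_error = abs(lhs - n * u.mean() * l.mean()) / lhs
    print(rel_error)
    ```

    With correlated u and l the dropped covariance term is a systematic, not random, error, which is the point of the paper's criticism.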

  12. Optimal post-experiment estimation of poorly modeled dynamic systems

    NASA Technical Reports Server (NTRS)

    Mook, D. Joseph

    1988-01-01

    Recently, a novel strategy for post-experiment state estimation of discretely-measured dynamic systems has been developed. The method accounts for errors in the system dynamic model equations in a more general and rigorous manner than do filter-smoother algorithms. The dynamic model error terms do not require the usual process noise assumptions of zero-mean, symmetrically distributed random disturbances. Instead, the model error terms require no prior assumptions other than piecewise continuity. The resulting state estimates are more accurate than filters for applications in which the dynamic model error clearly violates the typical process noise assumptions, and the available measurements are sparse and/or noisy. Estimates of the dynamic model error, in addition to the states, are obtained as part of the solution of a two-point boundary value problem, and may be exploited for numerous reasons. In this paper, the basic technique is explained, and several example applications are given. Included among the examples are both state estimation and exploitation of the model error estimates.

  13. Comparisons of Multilevel Modeling and Structural Equation Modeling Approaches to Actor-Partner Interdependence Model.

    PubMed

    Hong, Sehee; Kim, Soyoung

    2018-01-01

    There are basically two modeling approaches applicable to analyzing an actor-partner interdependence model: the multilevel modeling (hierarchical linear model) and the structural equation modeling. This article explains how to use these two models in analyzing an actor-partner interdependence model and how these two approaches work differently. As an empirical example, marital conflict data were used to analyze an actor-partner interdependence model. The multilevel modeling and the structural equation modeling produced virtually identical estimates for a basic model. However, the structural equation modeling approach allowed more realistic assumptions on measurement errors and factor loadings, rendering better model fit indices.
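    As a rough illustration of the actor-partner logic (a hypothetical simulation, not the paper's marital conflict data), the fixed actor and partner effects that both the multilevel and SEM specifications target can be recovered by ordinary least squares when measurement error is ignored:

    ```python
    import numpy as np

    # Hypothetical dyadic data: x1/x2 are the two partners' predictors,
    # a and p are the true actor and partner effects (made-up values).
    rng = np.random.default_rng(1)
    n_dyads = 5000
    x1 = rng.normal(size=n_dyads)
    x2 = rng.normal(size=n_dyads)
    a, p = 0.5, 0.3
    y1 = a * x1 + p * x2 + rng.normal(scale=0.5, size=n_dyads)  # member 1's outcome

    # Regress member 1's outcome on own predictor (actor effect)
    # and on the partner's predictor (partner effect).
    X = np.column_stack([np.ones(n_dyads), x1, x2])
    beta, *_ = np.linalg.lstsq(X, y1, rcond=None)
    print(beta[1], beta[2])  # close to the true actor (0.5) and partner (0.3) effects
    ```

    A full APIM would model both members' outcomes jointly with correlated within-dyad residuals; the SEM route additionally lets each predictor carry a measurement model, which is the extra flexibility the abstract credits with the better fit indices.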

  14. Status of the Space Station environmental control and life support system design concept

    NASA Technical Reports Server (NTRS)

    Ray, C. D.; Humphries, W. R.

    1986-01-01

    The current status of the Space Station (SS) environmental control and life support system (ECLSS) design is outlined. The concept has been defined at the subsystem level. Data supporting these definitions are provided which identify general configurations for all modules. Requirements, guidelines, and assumptions used in generating these configurations are detailed. The basic two-US-module 'core' Space Station is addressed, along with system synergism issues and early man-tended and future growth considerations. Beyond these basic studies, options related to variations in the 'core' module makeup and more austere Station concepts, as well as commonality, automation, and design-to-cost issues, are also addressed.

  15. Refraction effects of atmosphere on geodetic measurements to celestial bodies

    NASA Technical Reports Server (NTRS)

    Joshi, C. S.

    1973-01-01

    The problem is considered of obtaining accurate values of refraction corrections for geodetic measurements of celestial bodies. The basic principles of optics governing the phenomenon of refraction are defined, and differential equations are derived for the refraction corrections. The corrections fall into two main categories: (1) refraction effects due to change in the direction of propagation, and (2) refraction effects mainly due to change in the velocity of propagation. The various assumptions made by earlier investigators are reviewed along with the basic principles of improved models designed by investigators of the twentieth century. The accuracy problem for various quantities is discussed, and the conclusions and recommendations are summarized.

  16. Energy Conversion Alternatives Study (ECAS), Westinghouse phase 1. Volume 8: Open-cycle MHD. [energy conversion efficiency and design analysis of electric power plants employing magnetohydrodynamics

    NASA Technical Reports Server (NTRS)

    Hoover, D. Q.

    1976-01-01

    Electric power plant costs and efficiencies are presented for three basic open-cycle MHD systems: (1) direct coal fired system, (2) a system with a separately fired air heater, and (3) a system burning low-Btu gas from an integrated gasifier. Power plant designs were developed corresponding to the basic cases with variation of major parameters for which major system components were sized and costed. Flow diagrams describing each design are presented. A discussion of the limitations of each design is made within the framework of the assumptions made.

  17. Impact of actuarial assumptions on pension costs: A simulation analysis

    NASA Astrophysics Data System (ADS)

    Yusof, Shaira; Ibrahim, Rose Irnawaty

    2013-04-01

    This study investigates the sensitivity of pension costs to changes in the underlying assumptions of a hypothetical pension plan, in order to gain a perspective on the relative importance of the various actuarial assumptions via a simulation analysis. Simulation analyses are used to examine the impact of actuarial assumptions on pension costs. Two actuarial assumptions are considered in this study: mortality rates and interest rates. To calculate pension costs, the Accrued Benefit Cost Method with constant amount (CA) and constant percentage of salary (CS) modifications is used. The mortality assumptions and the implied mortality experience of the plan can potentially have a significant impact on pension costs, while the interest rate assumption is inversely related to pension costs. Results of the study have important implications for analysts of pension costs.
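    The inverse relation between the interest-rate assumption and pension cost follows directly from discounting. A minimal sketch, with hypothetical benefit, deferral, and payment-period values rather than the paper's plan:

    ```python
    # Minimal discounting sketch (hypothetical plan parameters, not from the study):
    # the present value of a deferred benefit stream falls as the valuation
    # interest rate i rises, which is the inverse relation the abstract reports.
    def pension_cost(annual_benefit, years_deferred, payment_years, i):
        """PV of a level annual benefit paid for payment_years, starting
        one year after a deferral of years_deferred years, at rate i."""
        v = 1.0 / (1.0 + i)
        annuity = (1.0 - v ** payment_years) / i      # annuity-immediate factor
        return annual_benefit * v ** years_deferred * annuity

    cost_at_4pct = pension_cost(10_000, 20, 15, 0.04)
    cost_at_6pct = pension_cost(10_000, 20, 15, 0.06)
    print(cost_at_4pct > cost_at_6pct)  # True: higher assumed rate, lower cost
    ```

    Both the deferral discount v^years_deferred and the annuity factor shrink as i grows, so the effect compounds over long deferral periods.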

  18. Intospace a European industrial initiative to commercialise space

    NASA Astrophysics Data System (ADS)

    von der Lippe, Juergen K.; Sprenger, Heinz J.

    2005-07-01

    Intospace, founded in 1985, was the response to the government's request to provide evidence for the industrial promise of commercial utilisation of space systems such as Spacelab and the already planned space station. The company was set up with an exceptional structure comprising 95 shareholders from all over western Europe, drawn from space and non-space industry and financial institutes. The companies that joined as shareholders committed, beyond the basic capital, to cover financial losses up to a given limit, allowing the company to invest in market development. Compared to other commercial initiatives in the European space scenario, the product that Intospace was supposed to offer was without doubt the most demanding one regarding its market prospects. The primary product of Intospace was to provide services to commercial customers for using microgravity for research and production in space. This was based on the assumption that an effective operational infrastructure with frequent flights of Spacelab and Eureca would be available, leading finally to the space station with Columbus. A further assumption had been that basic research projects of the agencies would provide sufficient data as a basis for commercial project planning. The conflict with these assumptions is best illustrated by the fact that the lifetime of Intospace is framed by the two shuttle disasters: the Challenger accident a couple of months after the foundation of Intospace, and the Columbia accident, with Spacehab on board, leading to the liquidation of the company. The paper will present the background behind the foundation of the Intospace initiative and describe the objectives and major strategic steps to develop the market.

  19. The unique world of the Everett version of quantum theory

    NASA Astrophysics Data System (ADS)

    Squires, Euan J.

    1988-03-01

    We ask whether the basic Everett assumption, that there are no changes of the wavefunction other than those given by the Schrödinger equation, is compatible with experience. We conclude that it is, provided we allow the world of observation to be partially a creation of consciousness. The model suggests the possible existence of quantum paranormal effects.

  20. New Schools for the Cities: Designs for Equality and Excellence. A Working Paper prepared for the Citizens' Crusade Against Poverty.

    ERIC Educational Resources Information Center

    Pressman, Harvey

    This paper outlines several schemes for developing quality private schools for inner city students. The basic assumption justifying the proposal that such schools be independently managed is that the urban public school systems have patently failed to educate poor children. Therefore, a new national network of independent schools should be…

  1. ENRICHMENT PROGRAM FOR ACADEMICALLY TALENTED JUNIOR HIGH SCHOOL STUDENTS FROM LOW INCOME FAMILIES.

    ERIC Educational Resources Information Center

    PRESSMAN, HARVEY

    A PROPOSAL FOR AN ENRICHMENT PROGRAM FOR ACADEMICALLY TALENTED JUNIOR HIGH SCHOOL STUDENTS FROM LOW-INCOME FAMILIES IN CERTAIN AREAS OF BOSTON IS PRESENTED. BASIC ASSUMPTIONS ARE THAT THERE IS AN OBVIOUS AND PRESSING NEED TO GIVE EXTRA HELP TO THE ABLE STUDENT FROM A DISADVANTAGED BACKGROUND, AND THAT A RELATIVELY BRIEF ENRICHMENT EXPERIENCE FOR…

  2. Redwoods—responsibilities for a long-lived species/resource

    Treesearch

    Robert Ewing

    2017-01-01

    What responsibilities do humans have to ensure that redwoods survive? And what values and strategies are required to accomplish such a purpose? A basic assumption is that the saving of a species, or more broadly of an ecosystem, is ultimately about human survival and that there is a responsibility to use all tools available to this end. To date, our actions to sustain...

  3. Comments on "Contact Diffusion Interaction of Materials with Cladding"

    NASA Technical Reports Server (NTRS)

    Morris, J. F.

    1972-01-01

    A Russian paper by A. A. Babad-Zakhryapina contributes much to the understanding of fuel-clad interactions, and thus to nuclear thermionic technology. In that publication the basic diffusion expression is a simple one. A more general but complicated equation for this mass transport results from the present work. With appropriate assumptions, however, the new relation reduces to Babad-Zakhryapina's version.

  4. First order ball bearing kinematics

    NASA Technical Reports Server (NTRS)

    Kingbury, E.

    1984-01-01

    Two first order equations are given connecting geometry and internal motions in an angular contact ball bearing. Total speed, kinematic equivalence, basic speed ratio, and modal speed ratio are defined and discussed; charts are given for the speed ratios covering all bearings and all rotational modes. Instances where specific first order assumptions might fail are discussed, and the resulting effects on bearing performance reviewed.

  5. Forest inventories generate scientifically sound information on the forest resource, but do our data and information really matter?

    Treesearch

    Christoph Keinn; Goran Stahl

    2009-01-01

    Current research in forest inventory focuses very much on technical-statistical problems geared mainly to the optimization of data collection and information generation. The basic assumption is that better information leads to better decisions and, therefore, to better forest management and forest policy. Not many studies, however, strive to explicitly establish the...

  6. Four Scenarios for Determining the Size and Reusability of Learning Objects

    ERIC Educational Resources Information Center

    Schoonenboom, Judith

    2012-01-01

    The best method for determining the size of learning objects (LOs) so as to optimise their reusability has been a topic of debate for years now. Although there appears to be agreement on basic assumptions, developed guidelines and principles are often in conflict. This study shows that this confusion stems from the fact that in the literature,…

  7. A Survey of Report of Risk Management for Clay County, Florida.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee.

    Risk management encompasses far more than an insurance program alone. The basic elements consist of--(1) elimination or reduction of exposure to loss, (2) protection from exposure to loss, (3) assumption of risk loss, and (4) transfer of risk to a professional carrier. This survey serves as a means of evaluating the methods of application of these…

  8. Data-Driven Leadership: Determining Your Indicators and Building Your Dashboards

    ERIC Educational Resources Information Center

    Copeland, Mo

    2016-01-01

    For years, schools have tended to approach budgets with some basic assumptions and aspirations and general wish lists but with scant data to drive the budget conversation. Suppose there were a better way? What if the conversation started with a review of the last five to ten years of data on three key mission- and strategy-driven indicators:…

  9. The "Cause" of Low Self-Control: The Influence of Maternal Self-Control

    ERIC Educational Resources Information Center

    Nofziger, Stacey

    2008-01-01

    Self-control theory is one of the most tested theories within the field of criminology. However, one of the basic assumptions of the theory has remained largely ignored. Gottfredson and Hirschi stated that the focus of their general theory of crime is the "connection between the self-control of the parent and the subsequent self-control of the…

  10. Consumption of Mass Communication--Construction of a Model on Information Consumption Behaviour.

    ERIC Educational Resources Information Center

    Sepstrup, Preben

    A general conceptual model on the consumption of information is introduced. Information as the output of the mass media is treated as a product, and a model on the consumption of this product is developed by merging elements from consumer behavior theory and mass communication theory. Chapter I gives basic assumptions about the individual and the…

  11. Treating the Tough Adolescent: A Family-Based, Step-by-Step Guide. The Guilford Family Therapy Series.

    ERIC Educational Resources Information Center

    Sells, Scott P.

    A model for treating difficult adolescents and their families is presented. Part 1 offers six basic assumptions about the causes of severe behavioral problems and presents the treatment model with guidelines necessary to address each of these six causes. Case examples highlight and clarify major points within each of the 15 procedural steps of the…

  12. The Nature of Living Systems: An Exposition of the Basic Concepts in General Systems Theory.

    ERIC Educational Resources Information Center

    Miller, James G.

    General systems theory is a set of related definitions, assumptions, and propositions which deal with reality as an integrated hierarchy of organizations of matter and energy. In this paper, the author defines the concepts of space, time, matter, energy, and information in terms of their meaning in general systems theory. He defines a system as a…

  13. Cognitive access to numbers: the philosophical significance of empirical findings about basic number abilities.

    PubMed

    Giaquinto, Marcus

    2017-02-19

    How can we acquire a grasp of cardinal numbers, even the first very small positive cardinal numbers, given that they are abstract mathematical entities? That problem of cognitive access is the main focus of this paper. All the major rival views about the nature and existence of cardinal numbers face difficulties; and the view most consonant with our normal thought and talk about numbers, the view that cardinal numbers are sizes of sets, runs into the cognitive access problem. The source of the problem is the plausible assumption that cognitive access to something requires causal contact with it. It is argued that this assumption is in fact wrong, and that in this and similar cases, we should accept that a certain recognize-and-distinguish capacity is sufficient for cognitive access. We can then go on to solve the cognitive access problem, and thereby support the set-size view of cardinal numbers, by paying attention to empirical findings about basic number abilities. To this end, some selected studies of infants, pre-school children and a trained chimpanzee are briefly discussed. This article is part of a discussion meeting issue 'The origins of numerical abilities'. © 2017 The Author(s).

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bogen, K.T.; Conrado, C.L.; Robison, W.L.

    A detailed analysis of uncertainty and interindividual variability in estimated doses was conducted for a rehabilitation scenario for Bikini Island at Bikini Atoll, in which the top 40 cm of soil would be removed in the housing and village area, and the rest of the island is treated with potassium fertilizer, prior to an assumed resettlement date of 1999. Predicted doses were considered for the following fallout-related exposure pathways: ingested Cesium-137 and Strontium-90, external gamma exposure, and inhalation and ingestion of Americium-241 + Plutonium-239+240. Two dietary scenarios were considered: (1) imported foods are available (IA), and (2) imported foods are unavailable (only local foods are consumed) (IUA). Corresponding calculations of uncertainty in estimated population-average dose showed that after approximately 5 y of residence on Bikini, the upper and lower 95% confidence limits with respect to uncertainty in this dose are estimated to be approximately 2-fold higher and lower than its population-average value, respectively (under both IA and IUA assumptions). Corresponding calculations of interindividual variability in the expected value of dose with respect to uncertainty showed that after approximately 5 y of residence on Bikini, the upper and lower 95% confidence limits with respect to interindividual variability in this dose are estimated to be approximately 2-fold higher and lower than its expected value, respectively (under both IA and IUA assumptions). For reference, the expected values of population-average dose at age 70 were estimated to be 1.6 and 5.2 cSv under the IA and IUA dietary assumptions, respectively. Assuming that 200 Bikini resettlers would be exposed to local foods (under both IA and IUA assumptions), the maximum 1-y dose received by any Bikini resident is most likely to be approximately 2 and 8 mSv under the IA and IUA assumptions, respectively.

  15. The Robustness of LOGIST and BILOG IRT Estimation Programs to Violations of Local Independence.

    ERIC Educational Resources Information Center

    Ackerman, Terry A.

    One of the important underlying assumptions of all item response theory (IRT) models is that of local independence. This assumption requires that the response to an item on a test not be influenced by the response to any other items. This assumption is often taken for granted, with little or no scrutiny of the response process required to answer…

  16. Fundamental Assumptions and Aims Underlying the Principles and Policies of Federal Financial Aid to Students. Research Report.

    ERIC Educational Resources Information Center

    Johnstone, D. Bruce

    As background to the National Dialogue on Student Financial Aid, this essay discusses the fundamental assumptions and aims that underlie the principles and policies of federal financial aid to students. These eight assumptions and aims are explored: (1) higher education is the province of states, and not of the federal government; (2) the costs of…

  17. Thinking science with thinking machines: The multiple realities of basic and applied knowledge in a research border zone.

    PubMed

    Hoffman, Steve G

    2015-04-01

    Some scholars dismiss the distinction between basic and applied science as passé, yet substantive assumptions about this boundary remain obdurate in research policy, popular rhetoric, the sociology and philosophy of science, and, indeed, at the level of bench practice. In this article, I draw on a multiple ontology framework to provide a more stable affirmation of a constructivist position in science and technology studies that cannot be reduced to a matter of competing perspectives on a single reality. The analysis is grounded in ethnographic research in the border zone of Artificial Intelligence science. I translate in-situ moments in which members of neighboring but differently situated labs engage in three distinct repertoires that render the reality of basic and applied science: partitioning, flipping, and collapsing. While the essences of scientific objects are nowhere to be found, the boundary between basic and applied is neither illusion nor mere propaganda. Instead, distinctions among scientific knowledge are made real as a matter of course.

  18. On the physical parameters for Centaurus X-3 and Hercules X-1.

    NASA Technical Reports Server (NTRS)

    Mccluskey, G. E., Jr.; Kondo, Y.

    1972-01-01

    It is shown how upper and lower limits on the physical parameters of the X-ray sources in Centaurus X-3 and Hercules X-1 may be determined from a reasonably simple and straightforward consideration. The basic assumption is that component A (the non-X-ray emitting component) is not a star collapsing toward its Schwarzschild radius (i.e., a black hole). This assumption appears reasonable, since component A, the central occulting star, appears to physically occult component X. If component A is a 'normal' star, both observation and theory indicate that its mass is not greater than about 60 solar masses. The possibility that component X is either a neutron star or a white dwarf is considered.

  19. The millennium development goals and household energy requirements in Nigeria.

    PubMed

    Ibitoye, Francis I

    2013-01-01

    Access to clean and affordable energy is critical for the realization of the United Nations' Millennium Development Goals, or MDGs. In many developing countries, a large proportion of household energy requirements is met by use of non-commercial fuels such as wood, animal dung, and crop residues, and the associated health and environmental hazards of these are well documented. In this work, a scenario analysis of energy requirements in Nigeria's households is carried out to compare estimates between 2005 and 2020 under a reference scenario with estimates under the assumption that Nigeria will meet the millennium goals. Requirements for energy under the MDG scenario are measured by the impacts on energy use of a reduction by half, by 2015, in (a) the number of households without access to electricity for basic services, (b) the number of households without access to modern energy carriers for cooking, and (c) the number of families living in one-room households in Nigeria's overcrowded urban slums. For these to be achieved, household electricity consumption would increase by about 41% over the study period, while the use of modern fuels would more than double. This migration to the use of modern fuels for cooking results in a reduction in overall fuelwood consumption, from 5 GJ/capita in 2005 to 2.9 GJ/capita in 2015.

  20. Regularity Results for a Class of Functionals with Non-Standard Growth

    NASA Astrophysics Data System (ADS)

    Acerbi, Emilio; Mingione, Giuseppe

    We consider the integral functional ∫ f(x, Du) dx under non-standard growth assumptions that we call p(x) type: namely, we assume that |z|^p(x) ≤ f(x, z) ≤ L(1 + |z|^p(x)), a relevant model case being the functional ∫ |Du|^p(x) dx. Under sharp assumptions on the continuous function p(x) > 1 we prove regularity of minimizers. Energies exhibiting this growth appear in several models from mathematical physics.

  1. Questionable assumptions hampered interpretation of a network meta-analysis of primary care depression treatments.

    PubMed

    Linde, Klaus; Rücker, Gerta; Schneider, Antonius; Kriston, Levente

    2016-03-01

    We aimed to evaluate the underlying assumptions of a network meta-analysis investigating which depression treatment works best in primary care and to highlight challenges and pitfalls of interpretation under consideration of these assumptions. We reviewed 100 randomized trials investigating pharmacologic and psychological treatments for primary care patients with depression. Network meta-analysis was carried out within a frequentist framework using response to treatment as the outcome measure. Transitivity was assessed by epidemiologic judgment based on theoretical and empirical investigation of the distribution of trial characteristics across comparisons. Homogeneity and consistency were investigated by decomposing the Q statistic. There were clinically important and statistically significant differences between "pure" drug trials comparing pharmacologic substances with each other or placebo (63 trials) and trials including a psychological treatment arm (37 trials). Overall, network meta-analysis produced results well comparable with separate meta-analyses of drug trials and psychological trials. Although the homogeneity and consistency assumptions were mostly met, we considered the transitivity assumption unjustifiable. An exchange of experience between reviewers and, if possible, some guidance on how reviewers addressing important clinical questions can proceed in situations where important assumptions for valid network meta-analysis are not met would be desirable. Copyright © 2016 Elsevier Inc. All rights reserved.
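    The Q-statistic decomposition used for the homogeneity and consistency checks can be illustrated in miniature. The sketch below uses made-up effect sizes and a plain subgroup split standing in for the design-by-design decomposition of a full network; it verifies that Cochran's Q splits exactly into within-group and between-group parts:

    ```python
    import numpy as np

    # Illustrative data (not from the review): inverse-variance fixed-effect
    # meta-analysis with two "designs" (groups) of trials.
    y = np.array([0.2, 0.35, 0.1, 0.6, 0.55, 0.8])   # effect estimates (e.g. log-OR)
    w = np.array([50., 40., 60., 30., 45., 25.])      # inverse-variance weights
    group = np.array([0, 0, 0, 1, 1, 1])              # design membership

    pooled = np.sum(w * y) / np.sum(w)
    q_total = np.sum(w * (y - pooled) ** 2)           # Cochran's Q over all trials

    q_within = 0.0    # heterogeneity within each design
    q_between = 0.0   # disagreement between designs (inconsistency analogue)
    for g in np.unique(group):
        m = group == g
        pooled_g = np.sum(w[m] * y[m]) / np.sum(w[m])
        q_within += np.sum(w[m] * (y[m] - pooled_g) ** 2)
        q_between += np.sum(w[m]) * (pooled_g - pooled) ** 2

    # Exact decomposition: Q_total = Q_within + Q_between
    assert np.isclose(q_total, q_within + q_between)
    ```

    In a network meta-analysis the same algebra is applied design by design, so a large between-design component flags inconsistency even when each design looks homogeneous on its own.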

  2. International Organisations and the Construction of the Learning Active Citizen: An Analysis of Adult Learning Policy Documents from a Durkheimian Perspective

    ERIC Educational Resources Information Center

    Field, John; Schemmann, Michael

    2017-01-01

    The article analyses how citizenship is conceptualised in policy documents of four key international organisations. The basic assumption is that public policy has not turned away from adult learning for active citizenship, but that there are rather new ways in which international governmental organisations conceptualise and in some cases seek to…

  3. Learners in the English Learning and Skills Sector: The Implications of Half-Right Policy Assumptions

    ERIC Educational Resources Information Center

    Hodgson, Ann; Steer, Richard; Spours, Ken; Edward, Sheila; Coffield, Frank; Finlay, Ian; Gregson, Maggie

    2007-01-01

    The English Learning and Skills Sector (LSS) contains a highly diverse range of learners and covers all aspects of post-16 learning with the exception of higher education. In the research on which this paper is based we are concerned with the effects of policy on three types of learners--unemployed adults attempting to improve their basic skills…

  4. The Trouble with Levels: A Reexamination of Craik and Lockhart's Framework for Memory Research

    ERIC Educational Resources Information Center

    Baddeley, Alan D.

    1978-01-01

    Begins by discussing a number of problems in applying a levels-of-processing approach to memory as proposed in the late 1960s and then revised in 1972 by Craik and Lockhart, suggests that some of the basic assumptions are false, and argues for information-processing models devised to study working memory and reading, which aim to explore the…

  5. Modernism, Postmodernism, or Neither? A Fresh Look at "Fine Art"

    ERIC Educational Resources Information Center

    Kamhi, Michelle Marder

    2006-01-01

    Numerous incidents have been reported in recent years wherein a work of art is mistaken as trash. The question is, how have people reached the point in the civilized world where a purported work of art cannot be distinguished from a pile of rubbish or a grid of condensation pipes? The answer to that question lies in the basic assumption of nearly…

  6. New Strategies for Delivering Library Resources to Users: Rethinking the Mechanisms in which Libraries Are Processing and Delivering Bibliographic Records

    ERIC Educational Resources Information Center

    El-Sherbini, Magda; Wilson, Amanda J

    2007-01-01

    The focus of this paper is to examine the current library practice of processing and delivering information and to introduce alternative scenarios that may keep librarians relevant in the technological era. In the scenarios presented here, the authors will attempt to challenge basic assumptions about the usefulness of and need for OPAC systems,…

  7. How Content and Symbolism in Mother Goose May Contribute to the Development of a Child's Integrated Psyche.

    ERIC Educational Resources Information Center

    Abrams, Joan

    Based on the assumption that the content and symbolism of nursery rhymes reflect the particular needs of those who respond to them, this paper analyzes Mother Goose rhymes in relation to the psychological stages of child development. Each basic need of the child, as defined in Bruno Bettelheim's "The Uses of Enchantment," is applied to…

  8. United States Air Force Agency Financial Report 2014

    DTIC Science & Technology

    2014-01-01

    basic sciences and 45 semester hours in humanities and social sciences. This 90 semester hour total comprises 60 percent of the total academic... Test and Evaluation Support $723; F-35 $628; Defense Research Sciences $373; GPS III-Operational Control Segment $373; Long Range Strike Bomber $359... Research, Development, Test & Evaluation; Family Housing & Military Construction; (Less: Earned Revenue); Net Cost before Losses/(Gains) from Actuarial Assumption

  9. Analytics in Online and Offline Language Learning Environments: The Role of Learning Design to Understand Student Online Engagement

    ERIC Educational Resources Information Center

    Rienties, Bart; Lewis, Tim; McFarlane, Ruth; Nguyen, Quan; Toetenel, Lisette

    2018-01-01

    Language education has a rich history of research and scholarship focusing on the effectiveness of learning activities and the impact these have on student behaviour and outcomes. One of the basic assumptions in foreign language pedagogy and CALL in particular is that learners want to be able to communicate effectively with native speakers of…

  10. Changing the Culture of Fuel Efficiency: A Change in Attitude

    DTIC Science & Technology

    2014-05-09

2011 September). Organizational Culture: Assessment and Transformation. Journal of Change Management, 11(3), 305-328. Bandura, A. (1986). Social ...describes that, “organizational culture is a set of basic assumptions that a group has invented, discovered or developed in learning to cope with its...change. In the first category they found the most influential factors are leadership, attraction-selection-attrition, socialization, reward systems

  11. Take-off and Landing

    DTIC Science & Technology

    1975-01-01

Studies Program. The results of AGARD work are reported to the member nations and the NATO Authorities through the AGARD series of publications of...calculated based on a low altitude mission profile. 2. GROUND RULES AND BASIC ASSUMPTIONS Base Design All aircraft synthesized for this study are...In this study manoeuverability is defined in terms of specific excess power (as shown in Fig. 5) at specified Mach number, altitude, and load

  12. Patterns in δ15N in roots, stems, and leaves of sugar maple and American beech seedlings, saplings, and mature trees

    Treesearch

    L.H. Pardo; P. Semaoune; P.G. Schaberg; C. Eagar; M. Sebilo

    2013-01-01

    Stable isotopes of nitrogen (N) in plants are increasingly used to evaluate ecosystem N cycling patterns. A basic assumption in this research is that plant δ15N reflects the δ15N of the N source. Recent evidence suggests that plants may fractionate on uptake, transport, or transformation of N. If the...

  13. High Voltage Testing. Volume 2. Specifications and Test Procedures

    DTIC Science & Technology

    1982-08-01

    the greatest impact on the initial assumption and criteria developed in the published criteria documents include: dielectric withstanding voltage...3382-75 Measurement of Energy and Integrated Charge Transfer Due to Partial Discharges (Corona) Using Bridge Techniques. ASTM-D 3426 - Dielectric... Energy (NEMA Publication No. WC 7-1971). NEMA Publication No. 109 - AIEE-EEI-NEMA Standard Basic Insulation Level. 092-57 - Method of Test for Flash and

  14. Critical points of metal vapors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khomkin, A. L., E-mail: alhomkin@mail.ru; Shumikhin, A. S.

    2015-09-15

A new method is proposed for calculating the parameters of critical points and binodals for the vapor–liquid (insulator–metal) phase transition in vapors of metals with multielectron valence shells. The method is based on a model developed earlier for the vapors of alkali metals, atomic hydrogen, and exciton gas, proceeding from the assumption that the cohesion determining the basic characteristics of metals under normal conditions is also responsible for their properties in the vicinity of the critical point. It is proposed to calculate the cohesion of multielectron atoms using well-known scaling relations for the binding energy, which are constructed for most metals in the periodic table by processing the results of many numerical calculations. The adopted model allows the parameters of critical points and binodals for the vapor–liquid phase transition in metal vapors to be calculated using published data on the properties of metals under normal conditions. The parameters of critical points have been calculated for a large number of metals and show satisfactory agreement with experimental data for alkali metals and with available estimates for all other metals. Binodals of metals have been calculated for the first time.

15. Ultrafast electron diffraction study of ab-plane dynamics in superconducting Bi2Sr2CaCu2O8+d

    NASA Astrophysics Data System (ADS)

    Konstantinova, Tatiana; Reid, Alexander; Wu, Lijun; Durr, Hermann; Wang, Xijie; Zhu, Yimei

The role of phonons and other collective modes in cooperative electron phenomena in high-TC cuprate superconductors is a topic of extensive interest. Time-resolved experiments provide the temporal hierarchy of the bosonic modes interacting with electrons. However, the majority of research in this field explores the dynamics of electronic states and can draw only indirect conclusions about the involvement of the lattice. We report a time-resolved study of the lattice response of optimally doped Bi2Sr2CaCu2O8+d to photo-excitation by means of ultrafast electron diffraction, which is directly sensitive to atomic motion. Data analysis utilizing Bloch-wave calculation of diffraction peak intensity allows separation of the Cu-O in-plane vibration, which builds up on the sub-picosecond time scale, from the low-energy phonon population growth, which proceeds at a much slower rate. This study confirms the assumption of strong electron coupling to the Cu-O plane phonons. This work was supported by the US DOE, Office of Science, Basic Energy Science, Materials Science and Engineering Division under Contract No: DE-AC02-98CH10886; DOE LDRD funding under contract DE-AC02-76SF00515 and BNL.

  16. Non-Classical Order in Sphere Forming ABAC Tetrablock Copolymers

    NASA Astrophysics Data System (ADS)

    Zhang, Jingwen; Sides, Scott; Bates, Frank

    2013-03-01

AB diblock and ABC triblock copolymers have been studied thoroughly. ABAC tetrablock copolymers, representing the simplest variation from ABC triblocks, breaking the molecular symmetry by inserting some of the A block between the B and C blocks, have been studied systematically in this research. The model system comprises poly(styrene-b-isoprene-b-styrene-b-ethylene oxide) (SISO) tetrablock terpolymers, and the resulting morphologies were characterized by nuclear magnetic resonance, gel permeation chromatography, small-angle X-ray scattering, transmission electron microscopy, differential scanning calorimetry and dynamic mechanical spectroscopy. Two novel phases are discovered for the first time in a single-component block copolymer: a hexagonally ordered spherical phase and a tentatively identified dodecagonal quasicrystalline (QC) phase. In particular, the discovery of the QC phase bridges the world of soft matter to that of metals. These unusual morphologies will be discussed in the context of segregation under the constraints associated with the tetrablock molecular architecture. Theoretical calculations based on the assumption of Gaussian chain statistics provide valuable insights into the molecular configurations associated with these morphologies. This work was supported by the U.S. Department of Energy, Basic Energy Sciences, Division of Materials Science and Engineering, under contract number DE-AC05-00OR22725 with UT-Battelle LLC at Oak Ridge National Lab.

  17. Neuroscience of Internet Pornography Addiction: A Review and Update.

    PubMed

    Love, Todd; Laier, Christian; Brand, Matthias; Hatch, Linda; Hajela, Raju

    2015-09-18

Many recognize that several behaviors potentially affecting the reward circuitry in human brains lead to a loss of control and other symptoms of addiction in at least some individuals. Regarding Internet addiction, neuroscientific research supports the assumption that underlying neural processes are similar to substance addiction. The American Psychiatric Association (APA) has recognized one such Internet related behavior, Internet gaming, as a potential addictive disorder warranting further study, in the 2013 revision of their Diagnostic and Statistical Manual. Other Internet related behaviors, e.g., Internet pornography use, were not covered. Within this review, we give a summary of the concepts proposed to underlie addiction and give an overview of neuroscientific studies on Internet addiction and Internet gaming disorder. Moreover, we review the available neuroscientific literature on Internet pornography addiction and connect the results to the addiction model. The review leads to the conclusion that Internet pornography addiction fits into the addiction framework and shares similar basic mechanisms with substance addiction. Together with studies on Internet addiction and Internet Gaming Disorder, we see strong evidence for considering addictive Internet behaviors as behavioral addictions. Future research needs to address whether or not there are specific differences between substance and behavioral addiction.

  18. Analysis of the reflective multibandgap solar cell concept

    NASA Technical Reports Server (NTRS)

    Stern, T. G.

    1983-01-01

A new and unique approach to improving photovoltaic conversion efficiency, the reflective multibandgap solar cell concept, was examined. This concept uses back surface reflectors and light trapping with several physically separated cells of different bandgaps to make more effective use of energy from different portions of the solar spectrum. Preliminary tests performed under General Dynamics Independent Research and Development (IRAD) funding have demonstrated the capability for achieving in excess of 20% conversion efficiency with aluminum gallium arsenide and silicon. This study analyzed the ultimate potential for high conversion efficiency with 2, 3, 4, and 5 different bandgap materials, determined the appropriate bandgaps needed to achieve this optimized efficiency, and identified potential problems or constraints. The analysis indicated that an improvement in efficiency of better than 40% could be attained in this multibandgap approach, compared to a single bandgap converter under the same assumptions. Increased absorption loss on the back surface reflector was found to incur a minimal penalty on efficiency for two and three bandgap systems. Current models for bulk absorption losses in III-V materials were found to be inadequate for explaining laboratory observed transmission losses. Recommendations included the continued development of high bandgap back surface reflector cells and basic research on semiconductor absorption mechanisms.

  19. Neuroscience of Internet Pornography Addiction: A Review and Update

    PubMed Central

    Love, Todd; Laier, Christian; Brand, Matthias; Hatch, Linda; Hajela, Raju

    2015-01-01

    Many recognize that several behaviors potentially affecting the reward circuitry in human brains lead to a loss of control and other symptoms of addiction in at least some individuals. Regarding Internet addiction, neuroscientific research supports the assumption that underlying neural processes are similar to substance addiction. The American Psychiatric Association (APA) has recognized one such Internet related behavior, Internet gaming, as a potential addictive disorder warranting further study, in the 2013 revision of their Diagnostic and Statistical Manual. Other Internet related behaviors, e.g., Internet pornography use, were not covered. Within this review, we give a summary of the concepts proposed underlying addiction and give an overview about neuroscientific studies on Internet addiction and Internet gaming disorder. Moreover, we reviewed available neuroscientific literature on Internet pornography addiction and connect the results to the addiction model. The review leads to the conclusion that Internet pornography addiction fits into the addiction framework and shares similar basic mechanisms with substance addiction. Together with studies on Internet addiction and Internet Gaming Disorder we see strong evidence for considering addictive Internet behaviors as behavioral addiction. Future research needs to address whether or not there are specific differences between substance and behavioral addiction. PMID:26393658

  20. Some important considerations in the development of stress corrosion cracking test methods.

    NASA Technical Reports Server (NTRS)

    Wei, R. P.; Novak, S. R.; Williams, D. P.

    1972-01-01

A discussion of some of the precautions that the development of fracture-mechanics-based test methods for studying stress corrosion cracking requires. Following a review of pertinent analytical fracture mechanics considerations and of basic test methods, the implications for stress corrosion cracking studies of the crack-growth kinetics that determine time to failure and life are examined. It is shown that the basic assumptions of the linear-elastic fracture mechanics analyses must be clearly recognized and satisfied in experimentation, and that the effects of incubation and nonsteady-state crack growth must also be properly taken into account in determining the crack growth kinetics, if valid data are to be obtained from fracture-mechanics-based test methods.

  1. Acoustic Absorption in Porous Materials

    NASA Technical Reports Server (NTRS)

    Kuczmarski, Maria A.; Johnston, James C.

    2011-01-01

    An understanding of both the areas of materials science and acoustics is necessary to successfully develop materials for acoustic absorption applications. This paper presents the basic knowledge and approaches for determining the acoustic performance of porous materials in a manner that will help materials researchers new to this area gain the understanding and skills necessary to make meaningful contributions to this field of study. Beginning with the basics and making as few assumptions as possible, this paper reviews relevant topics in the acoustic performance of porous materials, which are often used to make acoustic bulk absorbers, moving from the physics of sound wave interactions with porous materials to measurement techniques for flow resistivity, characteristic impedance, and wavenumber.
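To make the impedance and wavenumber quantities above concrete, here is a minimal sketch of the widely used empirical Delany-Bazley model for a porous absorber, which estimates both from the measured flow resistivity. The coefficients are the commonly quoted power-law fits and the function names, default air properties, and example values are our own assumptions, so treat this as an illustration rather than the paper's method:

```python
import cmath
import math

RHO0, C0 = 1.21, 343.0  # assumed air density (kg/m^3) and sound speed (m/s)

def delany_bazley(f, sigma):
    """Characteristic impedance Zc and complex wavenumber k of a porous
    material at frequency f (Hz) from its flow resistivity sigma
    (Pa*s/m^2), using the empirical Delany-Bazley power-law fits."""
    X = RHO0 * f / sigma  # dimensionless frequency parameter
    Zc = RHO0 * C0 * (1 + 0.0571 * X**-0.754 - 1j * 0.087 * X**-0.732)
    k = (2 * math.pi * f / C0) * (1 + 0.0978 * X**-0.700 - 1j * 0.189 * X**-0.595)
    return Zc, k

def absorption_coefficient(f, sigma, d):
    """Normal-incidence absorption of a hard-backed layer of thickness d (m)."""
    Zc, k = delany_bazley(f, sigma)
    Zs = -1j * Zc / cmath.tan(k * d)         # surface impedance of the layer
    R = (Zs - RHO0 * C0) / (Zs + RHO0 * C0)  # pressure reflection coefficient
    return 1 - abs(R)**2
```

For example, a 5 cm layer with a (hypothetical) flow resistivity of 10,000 Pa·s/m² yields an absorption coefficient between 0 and 1 at 1 kHz, as required for a passive material.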

  2. Hydrogen donors and acceptors and basic amino acids jointly contribute to carcinogenesis.

    PubMed

    Tang, Man; Zhou, Yanchao; Li, Yiqi; Zou, Juntong; Yang, Beicheng; Cai, Li; Zhang, Xuelan; Liu, Qiuyun

    2017-01-01

A hypothesis is postulated that high content of hydrogen donors and acceptors, and basic amino acids cause the intracellular trapping of the H+ and Cl- ions, which increases cancer risks as local formation of HCl is mutagenic to DNA. Other cations such as Ca2+, and weak acids such as short-chain organic acids may attenuate the intracellular gathering of the H+ and Cl-, two of the most abundant ions in the cells. Current data on increased cancer risks in diabetic and obese patients are consistent with the assumption that hydrogen bonding propensity on glucose, triglycerides and other molecules is among the causative factors. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. MHD processes in the outer heliosphere

    NASA Technical Reports Server (NTRS)

    Burlaga, L. F.

    1984-01-01

    The magnetic field measurements from Voyager and the magnetohydrodynamic (MHD) processes in the outer heliosphere are reviewed. A bibliography of the experimental and theoretical work concerning magnetic fields and plasmas observed in the outer heliosphere is given. Emphasis in this review is on basic concepts and dynamical processes involving the magnetic field. The theory that serves to explain and unify the interplanetary magnetic field and plasma observations is magnetohydrodynamics. Basic physical processes and observations that relate directly to solutions of the MHD equations are emphasized, but obtaining solutions of this complex system of equations involves various assumptions and approximations. The spatial and temporal complexity of the outer heliosphere and some approaches for dealing with this complexity are discussed.

  4. Basic statistics (the fundamental concepts).

    PubMed

    Lim, Eric

    2014-12-01

An appreciation and understanding of statistics is important to all practising clinicians, not simply researchers, because mathematics is the fundamental basis on which we base clinical decisions, usually with reference to benefit in relation to risk. Unless a clinician has a basic understanding of statistics, he or she will never be in a position to question healthcare management decisions that have been handed down from generation to generation, will not be able to conduct research effectively, and cannot evaluate the validity of published evidence (usually making an assumption that most published work is either all good or all bad). This article provides a brief introduction to basic statistical methods and illustrates their use in common clinical scenarios. In addition, pitfalls of incorrect usage have been highlighted. However, it is not meant to be a substitute for formal training or for consultation with a qualified and experienced medical statistician prior to starting any research project.

  5. The Equations of Oceanic Motions

    NASA Astrophysics Data System (ADS)

    Müller, Peter

    2006-10-01

Modeling and prediction of oceanographic phenomena and climate is based on the integration of dynamic equations. The Equations of Oceanic Motions derives and systematically classifies the most common dynamic equations used in physical oceanography, from large scale thermohaline circulations to those governing small scale motions and turbulence. After establishing the basic dynamical equations that describe all oceanic motions, Müller then derives approximate equations, emphasizing the assumptions made and physical processes eliminated. He distinguishes between geometric, thermodynamic and dynamic approximations and between the acoustic, gravity, vortical and temperature-salinity modes of motion. Basic concepts and formulae of equilibrium thermodynamics, vector and tensor calculus, curvilinear coordinate systems, and the kinematics of fluid motion and wave propagation are covered in appendices. Providing the basic theoretical background for graduate students and researchers of physical oceanography and climate science, this book will serve as both a comprehensive text and an essential reference.

  6. [Introduction to Exploratory Factor Analysis (EFA)].

    PubMed

    Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón

    2012-03-01

Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of this technique. To present in a clear and concise manner the main applications of this technique, to determine the basic requirements for its use, providing a step-by-step description of its methodology, and to establish the elements that must be taken into account during its preparation in order not to produce erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology to achieve factor derivation, global adjustment evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Published by Elsevier España. All rights reserved.
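One piece of the factor-derivation methodology such reviews describe, deciding how many factors to retain, is commonly done with the Kaiser criterion (correlation-matrix eigenvalues greater than 1). A minimal sketch, with a hypothetical function name and no claim that the review endorses this particular heuristic:

```python
import numpy as np

def kaiser_factor_count(data):
    """Count factors to retain under the Kaiser criterion: eigenvalues
    of the variables' correlation matrix that exceed 1. `data` is an
    (observations x variables) array. A common but criticized heuristic;
    scree plots and parallel analysis are frequent alternatives."""
    R = np.corrcoef(data, rowvar=False)  # correlation matrix of the variables
    eigvals = np.linalg.eigvalsh(R)      # real eigenvalues (R is symmetric)
    return int(np.sum(eigvals > 1.0))
```

On six simulated variables driven by two strong latent factors, the criterion retains two factors, matching the generating structure.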

  7. A Logistic Regression and Markov Chain Model for the Prediction of Nation-state Violent Conflicts and Transitions

    DTIC Science & Technology

    2016-03-24

McCarthy, Blood Meridian 1.1 General Issue Violent conflict between competing groups has been a pervasive and driving force for all of human history...It has evolved from small skirmishes between unarmed groups, wielding rudimentary weapons, to industrialized global conflagrations. Global...methodology is presented in Figure 2. Figure 2: Study Methodology 5 1.6 Study Assumptions and Limitations Assumptions Four underlying assumptions were

  8. 29 CFR 4050.2 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... interest rate means the rate of interest applicable to underpayments of guaranteed benefits by the PBGC... of proof of death, individuals not located are presumed living. Missing participant annuity assumptions means the interest rate assumptions and actuarial methods for valuing benefits under § 4044.52 of...

  9. 29 CFR 4050.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... interest rate means the rate of interest applicable to underpayments of guaranteed benefits by the PBGC... of proof of death, individuals not located are presumed living. Missing participant annuity assumptions means the interest rate assumptions and actuarial methods for valuing benefits under § 4044.52 of...

  10. 29 CFR 4050.2 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... interest rate means the rate of interest applicable to underpayments of guaranteed benefits by the PBGC... of proof of death, individuals not located are presumed living. Missing participant annuity assumptions means the interest rate assumptions and actuarial methods for valuing benefits under § 4044.52 of...

  11. 29 CFR 4050.2 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... interest rate means the rate of interest applicable to underpayments of guaranteed benefits by the PBGC... of proof of death, individuals not located are presumed living. Missing participant annuity assumptions means the interest rate assumptions and actuarial methods for valuing benefits under § 4044.52 of...

  12. 29 CFR 4050.2 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... interest rate means the rate of interest applicable to underpayments of guaranteed benefits by the PBGC... of proof of death, individuals not located are presumed living. Missing participant annuity assumptions means the interest rate assumptions and actuarial methods for valuing benefits under § 4044.52 of...

  13. From Generating in the Lab to Tutoring Systems in Classrooms.

    PubMed

    McNamara, Danielle S; Jacovina, Matthew E; Snow, Erica L; Allen, Laura K

    2015-01-01

    Work in cognitive and educational psychology examines a variety of phenomena related to the learning and retrieval of information. Indeed, Alice Healy, our honoree, and her colleagues have conducted a large body of groundbreaking research on this topic. In this article we discuss how 3 learning principles (the generation effect, deliberate practice and feedback, and antidotes to disengagement) discussed in Healy, Schneider, and Bourne (2012) have influenced the design of 2 intelligent tutoring systems that attempt to incorporate principles of skill and knowledge acquisition. Specifically, this article describes iSTART-2 and the Writing Pal, which provide students with instruction and practice using comprehension and writing strategies. iSTART-2 provides students with training to use effective comprehension strategies while self-explaining complex text. The Writing Pal provides students with instruction and practice to use basic writing strategies when writing persuasive essays. Underlying these systems are the assumptions that students should be provided with initial instruction that breaks down the tasks into component skills and that deliberate practice should include active generation with meaningful feedback, all while remaining engaging. The implementation of these assumptions is complicated by the ill-defined natures of comprehension and writing and supported by the use of various natural language processing techniques. We argue that there is value in attempting to integrate empirically supported learning principles into educational activities, even when there is imperfect alignment between them. Examples from the design of iSTART-2 and Writing Pal guide this argument.

  14. Retrocausation Or Extant Indefinite Reality?

    NASA Astrophysics Data System (ADS)

    Houtkooper, Joop M.

    2006-10-01

The possibility of retrocausation has been considered to explain the occurrence of anomalous phenomena in which the ostensible effects precede their causes. A scrutiny of both experimental methodology and the experimental data is called for. A review of experimental data reveals the existence of such effects to be a serious possibility. The experimental methodology entails some conceptual difficulties, depending on the underlying assumptions about the effects. A major point is an ambiguity between anomalous acquisition of information and retrocausation in exerted influences. A unifying theory has been proposed, based upon the fundamental randomness of quantum mechanics. Quantum mechanical randomness may be regarded as a tenacious phenomenon that apparently is resolved only by the human observer of the random variable in question. This has led to the "observational theory" of anomalous phenomena, which is based upon the assumption that the preference of a motivated observer is able to interact with the extant indefinite random variable being observed. This observational theory has led to a novel prediction, which has been corroborated in experiments. Moreover, different classes of anomalous phenomena can be explained by the same basic mechanism. This forgoes retroactive causation but instead requires that macroscopic physical variables remain in a state of indefinite reality, and thus remain influenceable by mental efforts, until they are observed. More work is needed to discover the relevant psychological and neurophysiological variables involved in effective motivated observation. Besides these practicalities, the fundamentals still have some interesting loose ends.

  15. Household transmissibility of avian influenza A (H7N9) virus, China, February to May 2013 and October 2013 to March 2014

    PubMed Central

    Yang, Y; Zhang, Y; Fang, L; Halloran, M E; Ma, M; Liang, S; Kenah, E; Britton, T; Chen, E; Hu, J; Tang, F; Cao, W; Feng, Z; Longini, I M

    2015-01-01

    To study human-to-human transmissibility of the avian influenza A (H7N9) virus in China, household contact information was collected for 125 index cases during the spring wave (February to May 2013), and for 187 index cases during the winter wave (October 2013 to March 2014). Using a statistical model, we found evidence for human-to-human transmission, but such transmission is not sustainable. Under plausible assumptions about the natural history of disease and the relative transmission frequencies in settings other than household, we estimate the household secondary attack rate (SAR) among humans to be 1.4% (95% CI: 0.8 to 2.3), and the basic reproductive number R0 to be 0.08 (95% CI: 0.05 to 0.13). The estimates range from 1.3% to 2.2% for SAR and from 0.07 to 0.12 for R0 with reasonable changes in the assumptions. There was no significant change in the human-to-human transmissibility of the virus between the two waves, although a minor increase was observed in the winter wave. No sex or age difference in the risk of infection from a human source was found. Human-to-human transmissibility of H7N9 continues to be limited, but it needs to be closely monitored for potential increase via genetic reassortment or mutation. PMID:25788253
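As a back-of-envelope illustration of the secondary attack rate arithmetic above, a raw SAR with a Wilson score interval can be computed as follows. The counts in the example are hypothetical, and the paper's 1.4% (95% CI: 0.8 to 2.3) comes from a full statistical transmission model, not this naive proportion:

```python
import math

def sar_with_wilson_ci(secondary, contacts, z=1.96):
    """Naive household secondary attack rate (SAR) as a simple
    proportion, with an approximate 95% Wilson score interval."""
    p = secondary / contacts
    denom = 1 + z**2 / contacts
    center = (p + z**2 / (2 * contacts)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / contacts
                                   + z**2 / (4 * contacts**2))
    return p, (center - half, center + half)

# Hypothetical counts: 7 secondary infections among 500 household contacts
sar, (lo, hi) = sar_with_wilson_ci(7, 500)  # sar = 0.014, i.e. 1.4%
```

The Wilson interval is preferred over the textbook Wald interval here because the proportion is small, where the Wald form can dip below zero.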

  16. Are Assumptions of Well-Known Statistical Techniques Checked, and Why (Not)?

    PubMed Central

    Hoekstra, Rink; Kiers, Henk A. L.; Johnson, Addie

    2012-01-01

A valid interpretation of most statistical techniques requires that one or more assumptions be met. In published articles, however, little information tends to be reported on whether the data satisfy the assumptions underlying the statistical techniques used. This could be due to self-selection: Only manuscripts with data fulfilling the assumptions are submitted. Another explanation could be that violations of assumptions are rarely checked for in the first place. We studied whether and how 30 researchers checked fictitious data for violations of assumptions in their own working environment. Participants were asked to analyze the data as they would their own data, for which often used and well-known techniques such as the t-procedure, ANOVA and regression (or non-parametric alternatives) were required. It was found that the assumptions of the techniques were rarely checked, and that if they were, it was regularly by means of a statistical test. Interviews afterward revealed a general lack of knowledge about assumptions, the robustness of the techniques with regard to the assumptions, and how (or whether) assumptions should be checked. These data suggest that checking for violations of assumptions is not a well-considered choice, and that the use of statistics can be described as opportunistic. PMID:22593746
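An assumption check of the kind discussed above (here, roughly equal variances before a pooled t-procedure) can be sketched in a few lines. This is an illustrative rule-of-thumb screen with a function name and cutoff of our own choosing, not the formal tests (e.g., Levene's) the study's participants might have applied:

```python
import statistics

def variance_ratio_check(a, b, threshold=4.0):
    """Crude equal-variance screen sometimes applied before a pooled
    t-procedure: flag trouble when the larger sample variance exceeds
    the smaller by more than `threshold` (a rule-of-thumb cutoff,
    not a formal hypothesis test)."""
    va, vb = statistics.variance(a), statistics.variance(b)
    ratio = max(va, vb) / min(va, vb)
    return ratio, ratio <= threshold
```

For instance, the groups [1, 2, 3, 4, 5] and [10, 20, 30, 40, 50] give a variance ratio of 100.0 and fail the screen, signalling that a pooled-variance procedure is inappropriate.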

  17. ASP-G: an ASP-based method for finding attractors in genetic regulatory networks

    PubMed Central

    Mushthofa, Mushthofa; Torres, Gustavo; Van de Peer, Yves; Marchal, Kathleen; De Cock, Martine

    2014-01-01

Motivation: Boolean network models are suitable to simulate GRNs in the absence of detailed kinetic information. However, reducing the biological reality implies making assumptions on how genes interact (interaction rules) and how their state is updated during the simulation (update scheme). The exact choice of the assumptions largely determines the outcome of the simulations. In most cases, however, the biologically correct assumptions are unknown. An ideal simulation thus implies testing different rules and schemes to determine those that best capture an observed biological phenomenon. This is not trivial because most current methods to simulate Boolean network models of GRNs and to compute their attractors impose specific assumptions that cannot be easily altered, as they are built into the system. Results: To allow for a more flexible simulation framework, we developed ASP-G. We show the correctness of ASP-G in simulating Boolean network models and obtaining attractors under different assumptions by successfully recapitulating the detection of attractors of previously published studies. We also provide an example of how performing simulation of network models under different settings helps determine the assumptions under which a certain conclusion holds. The main added value of ASP-G is in its modularity and declarativity, making it more flexible and less error-prone than traditional approaches. ASP-G's declarative nature comes at the expense of speed relative to more dedicated systems, but it still achieves good efficiency with respect to computational time. Availability and implementation: The source code of ASP-G is available at http://bioinformatics.intec.ugent.be/kmarchal/Supplementary_Information_Musthofa_2014/asp-g.zip. Contact: Kathleen.Marchal@UGent.be or Martine.DeCock@UGent.be Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25028722
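For readers new to the underlying objects, attractor computation under one fixed assumption set (Boolean states with a synchronous update scheme) can be sketched by brute force in a few lines. ASP-G itself uses Answer Set Programming precisely so that such rule and scheme choices stay swappable; the code below is our illustration, not the tool's implementation:

```python
from itertools import product

def find_attractors(update, n):
    """Enumerate the attractors of an n-node Boolean network under the
    synchronous update scheme. `update` maps a state tuple to the next
    state tuple. Exhaustive over all 2**n states, so only feasible for
    small networks."""
    attractors = set()
    for state in product((0, 1), repeat=n):
        seen = {}
        while state not in seen:           # follow the trajectory until it repeats
            seen[state] = len(seen)
            state = update(state)
        start = seen[state]                # index where the cycle begins
        cycle = [s for s, i in seen.items() if i >= start]
        attractors.add(tuple(sorted(cycle)))  # canonical form for set membership
    return attractors
```

For a toy two-gene network where each gene copies the other's state, this finds three attractors: the fixed points (0,0) and (1,1), and the two-state cycle (0,1) ↔ (1,0).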

  18. Experimental measurement of binding energy, selectivity, and allostery using fluctuation theorems.

    PubMed

    Camunas-Soler, Joan; Alemany, Anna; Ritort, Felix

    2017-01-27

    Thermodynamic bulk measurements of binding reactions rely on the validity of the law of mass action and the assumption of a dilute solution. Yet, important biological systems such as allosteric ligand-receptor binding, macromolecular crowding, or misfolded molecules may not follow these assumptions and may require a particular reaction model. Here we introduce a fluctuation theorem for ligand binding and an experimental approach using single-molecule force spectroscopy to determine binding energies, selectivity, and allostery of nucleic acids and peptides in a model-independent fashion. A similar approach could be used for proteins. This work extends the use of fluctuation theorems beyond unimolecular folding reactions, bridging the thermodynamics of small systems and the basic laws of chemical equilibrium. Copyright © 2017, American Association for the Advancement of Science.
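For orientation, the best-known fluctuation theorems that this line of work builds on relate nonequilibrium work W to the equilibrium free-energy difference ΔG; stated in standard form (our summary of the classical results, not the paper's new ligand-binding theorem):

```latex
% Crooks fluctuation theorem: forward vs. reverse work distributions
\frac{P_F(W)}{P_R(-W)} = e^{(W - \Delta G)/k_B T}
% and the Jarzynski equality it implies upon integration:
\left\langle e^{-W/k_B T} \right\rangle = e^{-\Delta G/k_B T}
```

In single-molecule force spectroscopy these relations allow ΔG to be recovered from repeated irreversible pulling experiments, which is the general strategy the binding fluctuation theorem extends.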

  19. On the accuracy of personality judgment: a realistic approach.

    PubMed

    Funder, D C

    1995-10-01

    The "accuracy paradigm" for the study of personality judgment provides an important, new complement to the "error paradigm" that dominated this area of research for almost 2 decades. The present article introduces a specific approach within the accuracy paradigm called the Realistic Accuracy Model (RAM). RAM begins with the assumption that personality traits are real attributes of individuals. This assumption entails the use of a broad array of criteria for the evaluation of personality judgment and leads to a model that describes accuracy as a function of the availability, detection, and utilization of relevant behavioral cues. RAM provides a common explanation for basic moderators of accuracy, sheds light on how these moderators interact, and outlines a research agenda that includes the reintegration of the study of error with the study of accuracy.

  20. The limits of discipline: towards interdisciplinary food studies.

    PubMed

    Wilk, Richard

    2012-11-05

    While the number of scholars working on the broad topic of food has never been greater, the topic is still divided among numerous disciplines and specialists who do not often communicate with each other. This paper discusses some of the deep differences between disciplinary approaches, and concludes that food scientists differ in some of their basic assumptions about human nature. After outlining some of the institutional issues standing in the way of interdisciplinary work, the paper argues for a more synthetic and empirical approach, grounded in the study of everyday life. True interdisciplinary collaboration will have to go beyond assembling multidisciplinary teams. Instead we must accept the limitations of the classic disciplinary paradigms, and be willing to question and test our methods and assumptions. Copyright © 2012 Elsevier Inc. All rights reserved.

  1. An Analysis of the Economic Assumptions Underlying Fiscal Plans FY1981 - FY1984.

    DTIC Science & Technology

    1986-06-01

    An Analysis of the Economic Assumptions Underlying Fiscal Plans FY1981 - FY1984, by Robert Welch Beck, June 1986. Thesis Advisor: P. M. Carrick. Approved for public release; distribution is unlimited.

  2. Inexperience and risky decisions of young adolescents, as pedestrians and cyclists, in interactions with lorries, and the effects of competency versus awareness education.

    PubMed

    Twisk, Divera; Vlakveld, Willem; Mesken, Jolieke; Shope, Jean T; Kok, Gerjo

    2013-06-01

    Road injuries are a prime cause of death in early adolescence. Often road safety education (RSE) is used to target risky road behaviour in this age group. These RSE programmes are frequently based on the assumption that deliberate risk taking rather than lack of competency underlies risk behaviour. This study tested the competency of 10-13 year olds, by examining their decisions - as pedestrians and cyclists - in dealing with blind spot areas around lorries. Also, the effects of an awareness programme and a competency programme on these decisions were evaluated. Table-top models were used, representing seven scenarios that differed in complexity: one basic scenario to test the identification of blind spot areas, and 6 traffic scenarios to test behaviour in traffic situations of low or high task complexity. Using a quasi-experimental design (pre-test and post-test reference group design without randomization), the programme effects were assessed by requiring participants (n=62) to show, for each table-top traffic scenario, how they would act if they were in that traffic situation. On the basic scenario, at pre-test 42% of the youngsters identified all blind spots correctly, but only 27% showed safe behaviour in simple scenarios and 5% in complex scenarios. The competency programme yielded improved performance on the basic scenario but not on the traffic scenarios, whereas the awareness programme did not result in any improvements. The correlation between improvements on the basic scenarios and the traffic scenarios was not significant. Young adolescents have not yet mastered the necessary skills for safe performance in simple and complex traffic situations, thus underlining the need for effective prevention programmes. RSE may improve the understanding of blind spot areas but this does not 'automatically' transfer to performance in traffic situations. Implications for the design of RSE are discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Pulsational mode fluctuations and their basic conservation laws

    NASA Astrophysics Data System (ADS)

    Borah, B.; Karmakar, P. K.

    2015-01-01

    We propose a theoretical hydrodynamic model for investigating the basic features of nonlinear pulsational mode stability in a partially charged dust molecular cloud within the framework of the Jeans homogenization assumption. The inhomogeneous cloud is modeled as a quasi-neutral multifluid consisting of the warm electrons, warm ions, and identical inertial cold dust grains with partial ionization in a neutral gaseous background. The grain-charge is assumed not to vary in the fluctuation evolution time scale. The active inertial roles of the thermal species are included. We apply a standard multiple scaling technique centered on the gravito-electrostatic equilibrium to understand the fluctuations on the astrophysical scales of space and time. It is found that electrostatic and self-gravitational eigenmodes co-exist as diverse solitary spectral patterns governed by a pair of Korteweg-de Vries (KdV) equations. In addition, all the relevant classical conserved quantities associated with the KdV system under translational invariance are methodologically derived and numerically analyzed. A full numerical shape-analysis of the fluctuations, scale lengths and perturbed densities with multi-parameter variation of judicious plasma conditions is carried out. A correlation of the perturbed densities and gravito-electrostatic spectral patterns is also graphically indicated. It is demonstrated that the solitary mass, momentum and energy densities also evolve like solitary spectral patterns which remain conserved throughout the spatiotemporal scales of the fluctuation dynamics. Astrophysical and space environments significant to our results are briefly highlighted.
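    The kind of KdV conservation law the authors derive can be checked numerically. The sketch below uses the textbook form u_t + 6 u u_x + u_xxx = 0 and its one-soliton solution (not the paper's gravito-electrostatic pair), verifying that the mass and momentum integrals are unchanged as the pulse propagates:

```python
import numpy as np

# One-soliton solution of u_t + 6*u*u_x + u_xxx = 0:
#   u(x, t) = 2*kap**2 * sech(kap*(x - 4*kap**2*t))**2
# Its "mass" integral(u dx) = 4*kap and "momentum" integral(u^2 dx)
# = (16/3)*kap**3 are conserved quantities of the KdV system.
kap = 0.8
x, dx = np.linspace(-60.0, 60.0, 20001, retstep=True)

def soliton(t):
    return 2 * kap**2 / np.cosh(kap * (x - 4 * kap**2 * t)) ** 2

for t in (0.0, 5.0):
    u = soliton(t)
    mass = u.sum() * dx           # -> 4*kap = 3.2 at both times
    momentum = (u**2).sum() * dx  # -> (16/3)*kap**3, about 2.7307
    print(t, round(mass, 4), round(momentum, 4))
```

    The same check applied to a discretized solution of the full system is a standard sanity test that a numerical KdV scheme respects the conservation laws.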

  4. Economics, ecologics, and mechanics: The dynamics of responding under conditions of varying motivation

    PubMed Central

    Killeen, Peter R.

    1995-01-01

    The mechanics of behavior developed by Killeen (1994) is extended to deal with deprivation and satiation and with recovery of arousal at the beginning of sessions. The extended theory is validated against satiation curves and within-session changes in response rates. Anomalies, such as (a) the positive correlation between magnitude of an incentive and response rates in some contexts and a negative correlation in other contexts and (b) the greater prominence of incentive effects when magnitude is varied within the session rather than between sessions, are explained in terms of the basic interplay of drive and incentive motivation. The models are applied to data from closed economies in which changes of satiation levels play a key role in determining the changes in behavior. Relaxation of various assumptions leads to closed-form models for response rates and demand functions in these contexts, ones that show reasonable accord with the data and reinforce arguments for unit price as a controlling variable. The central role of deprivation level in this treatment distinguishes it from economic models. It is argued that traditional experiments should be redesigned to reveal basic principles, that ecologic experiments should be redesigned to test the applicability of those principles in more natural contexts, and that behavioral economics should consist of the applications of these principles to economic contexts, not the adoption of economic models as alternatives to behavioral analysis. PMID:16812776

  5. Assumptions Underlying the Use of Different Types of Simulations.

    ERIC Educational Resources Information Center

    Cunningham, J. Barton

    1984-01-01

    Clarifies appropriateness of certain simulation approaches by distinguishing between different types of simulations--experimental, predictive, evaluative, and educational--on the basis of purpose, assumptions, procedures, and criteria for evaluating. The kinds of questions each type best responds to are discussed. (65 references) (MBR)

  6. ADJECTIVES AS NOUN PHRASES.

    ERIC Educational Resources Information Center

    ROSS, JOHN ROBERT

    THIS ANALYSIS OF UNDERLYING SYNTACTIC STRUCTURE IS BASED ON THE ASSUMPTION THAT THE PARTS OF SPEECH CALLED "VERBS" AND "ADJECTIVES" ARE TWO SUBCATEGORIES OF ONE MAJOR LEXICAL CATEGORY, "PREDICATE." FROM THIS ASSUMPTION, THE HYPOTHESIS IS ADVANCED THAT, IN LANGUAGES EXHIBITING THE COPULA, THE DEEP STRUCTURE OF SENTENCES CONTAINING PREDICATE…

  7. 24 CFR 58.4 - Assumption authority.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., decision-making, and action that would otherwise apply to HUD under NEPA and other provisions of law that... environmental review, decision-making and action for programs authorized by the Native American Housing... separate decision regarding assumption of responsibilities for each of these Acts and communicate that...

  8. THE MODELING OF THE FATE AND TRANSPORT OF ENVIRONMENTAL POLLUTANTS

    EPA Science Inventory

    Current models that predict the fate of organic compounds released to the environment are based on the assumption that these compounds exist exclusively as neutral species. This assumption is untrue under many environmental conditions, as some molecules can exist as cations, anions, ...
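    The ionization point can be made concrete with the Henderson-Hasselbalch relation; the pKa used below is an illustrative placeholder, not a value from the EPA work:

```python
def fraction_neutral_acid(pH, pKa):
    """Fraction of a monoprotic acid HA remaining neutral at a given
    pH, from Henderson-Hasselbalch: [A-]/[HA] = 10**(pH - pKa)."""
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

# A hypothetical acidic pollutant with pKa 4.7 is >99% ionized at
# pH 7, so a fate model that tracks only the neutral species would
# miss most of the compound's mass.
print(round(fraction_neutral_acid(7.0, 4.7), 3))  # 0.005
```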

  9. Nonlinear interaction of a fast magnetogasdynamic shock with a tangential discontinuity

    NASA Technical Reports Server (NTRS)

    Neubauer, F. M.

    1973-01-01

    A basic problem, which is of considerable interest in geoastrophysical applications of magnetogasdynamics, is the nonlinear interaction of a fast shock (S sub f) with a tangential discontinuity (T). The problem is treated for an arbitrary S sub f interacting with an arbitrary T under the assumption that in the frame of reference in which S sub f and T are at rest, the flow is superfast on both sides of T, and that a steady flow develops. As a result of the nonlinear analysis a flow pattern is obtained consisting of the incident discontinuities S sub f 1 and T2 and a transmitted fast shock S sub f 3, the modified tangential discontinuity T4, and a reflected fast shock S sub f 5 or fast rarefaction wave R sub f 5. The results are discussed in terms of seven significant similarity parameters. In addition, special cases, such as changes in magnetic field direction only or changes in density or velocity shear only, are discussed.

  10. On computing stress in polymer systems involving multi-body potentials from molecular dynamics simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Yao (E-mail: fu5@mailbox.sc.edu); Song, Jeong-Hoon (E-mail: jhsong@cec.sc.edu)

    2014-08-07

    Hardy stress definition has been restricted to pair potentials and embedded-atom method potentials due to the basic assumptions in the derivation of a symmetric microscopic stress tensor. Force decomposition required in the Hardy stress expression becomes obscure for multi-body potentials. In this work, we demonstrate the invariance of the Hardy stress expression for a polymer system modeled with multi-body interatomic potentials including up to four-atom interactions, by applying central force decomposition of the atomic force. The balance of momentum has been demonstrated to be valid theoretically and tested under various numerical simulation conditions. The validity of momentum conservation justifies the extension of the Hardy stress expression to multi-body potential systems. Computed Hardy stress has been observed to converge to the virial stress of the system with increasing spatial averaging volume. This work provides a feasible and reliable linkage between the atomistic and continuum scales for multi-body potential systems.

  11. Influence of Tempo and Rhythmic Unit in Musical Emotion Regulation.

    PubMed

    Fernández-Sotos, Alicia; Fernández-Caballero, Antonio; Latorre, José M

    2016-01-01

    This article is based on the assumption of musical power to change the listener's mood. The paper studies the outcome of two experiments on the regulation of emotional states in a series of participants who listen to different auditions. The present research focuses on note value, an important musical cue related to rhythm. The influence of two concepts linked to note value is analyzed separately and discussed together. The two musical cues under investigation are tempo and rhythmic unit. The participants are asked to label music fragments by using opposite meaningful words belonging to four semantic scales, namely "Tension" (ranging from Relaxing to Stressing), "Expressiveness" (Expressionless to Expressive), "Amusement" (Boring to Amusing) and "Attractiveness" (Pleasant to Unpleasant). The participants also have to indicate how much they feel certain basic emotions while listening to each music excerpt. The rated emotions are "Happiness," "Surprise," and "Sadness." This study makes it possible to draw some interesting conclusions about the associations between note value and emotions.

  12. Pressure and temperature dependence of shear modulus and yield strength for aluminum, copper, and tungsten under shock compression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peng Jianxiang; Jing Fuqian; Li Dahong

    2005-07-01

    Experimental data for the shear modulus and yield strength of shocked aluminum, copper, and tungsten were systematically analyzed. Comparisons between these data and calculations using the Steinberg-Cochran-Guinan (SCG) constitutive model [D. J. Steinberg, S. G. Cochran, and M. W. Guinan, J. Appl. Phys. 51, 1498 (1980)] indicate that the yield strength has the same dependence on pressure and temperature as the shear modulus for aluminum for shock pressures up to 50 GPa, for copper to 100 GPa, and for tungsten to 200 GPa. Therefore, the assumption Y'_P/Y_0 = G'_P/G_0, Y'_T/Y_0 = G'_T/G_0 is basically acceptable for these materials, and the SCG model can be used to describe the shear modulus and yield strength of the shocked material at high pressure and temperature.

  13. Introduction and Application of non-stationary Standardized Precipitation Index Considering Probability Distribution Function and Return Period

    NASA Astrophysics Data System (ADS)

    Park, J.; Lim, Y. J.; Sung, J. H.; Kang, H. S.

    2017-12-01

    The widely used meteorological drought index, the Standardized Precipitation Index (SPI), basically assumes stationarity, but recent change in the climate has led to a need to review this hypothesis. In this study, a new non-stationary SPI that considers not only the modified probability distribution parameter but also the return period under the non-stationary process has been proposed. The results are evaluated for two severe drought cases during the last 10 years in South Korea. As a result, the SPIs based on the non-stationary hypothesis underestimated the drought severity relative to the stationary SPI, even though these two past droughts were recognized as significantly severe. This may be because the variances of summer and autumn precipitation become larger over time, which makes the shape of the probability distribution function wider than before. This understanding implies that drought severity expressed by a statistical index such as the SPI can be distorted by the stationarity assumption, and a cautious approach is needed when assigning drought levels under a changing climate.
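    For reference, the stationary SPI that such studies generalize reduces to a gamma fit followed by a normal-quantile transform. This sketch uses synthetic data and the common two-parameter gamma assumption, not the authors' non-stationary formulation:

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Stationary SPI: fit one gamma distribution to the whole
    accumulation series, then map each cumulative probability through
    the inverse standard normal.  A non-stationary variant would
    instead let the fitted parameters (and hence drought levels and
    return periods) drift with time."""
    shape, loc, scale = stats.gamma.fit(precip, floc=0)
    return stats.norm.ppf(stats.gamma.cdf(precip, shape, loc=loc, scale=scale))

rng = np.random.default_rng(0)
monthly_mm = rng.gamma(shape=2.0, scale=50.0, size=360)  # synthetic totals
z = spi(monthly_mm)
print(round(float(z.mean()), 2), round(float(z.std()), 2))  # near 0 and 1
```

    Because the index is standardized against the fitted distribution, any widening of that distribution over time directly rescales how severe a given precipitation deficit appears, which is the distortion the abstract describes.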

  14. Transmission Heterogeneity and Autoinoculation in a Multisite Infection Model of HPV

    PubMed Central

    Brouwer, Andrew F.; Meza, Rafael; Eisenberg, Marisa C.

    2015-01-01

    The human papillomavirus (HPV) is sexually transmitted and can infect oral, genital, and anal sites in the human epithelium. Here, we develop a multisite transmission model that includes autoinoculation, to study HPV and other multisite diseases. Under a homogeneous-contacts assumption, we analyze the basic reproduction number R0, as well as type and target reproduction numbers, for a two-site model. In particular, we find that R0 occupies a space between taking the maximum of next generation matrix terms for same site transmission and taking the geometric average of cross-site transmission terms in such a way that heterogeneity in the same-site transmission rates increases R0 while heterogeneity in the cross-site transmission decreases it. Additionally, autoinoculation adds considerable complexity to the form of R0. We extend this analysis to a heterosexual population, which additionally yields dynamics analogous to those of vector–host models. We also examine how these issues of heterogeneity may affect disease control, using type and target reproduction numbers. PMID:26518265
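    The heterogeneity result can be illustrated with a toy two-site next-generation matrix; the entries below are hypothetical, not fitted HPV parameters:

```python
import numpy as np

def r0(K):
    """Basic reproduction number: spectral radius of the
    next-generation matrix K."""
    return max(abs(np.linalg.eigvals(K)))

# Two-site matrix: diagonal entries are same-site transmission,
# off-diagonal entries are cross-site transmission.
K = np.array([[1.2, 0.3],
              [0.3, 1.2]])
# Spread the same-site rates while keeping their mean fixed:
K_het = np.array([[1.5, 0.3],
                  [0.3, 0.9]])

print(round(r0(K), 3))    # 1.5 (same-site + cross-site in the symmetric case)
print(r0(K_het) > r0(K))  # True: same-site heterogeneity raises R0
```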

  15. Decision-making and problem-solving methods in automation technology

    NASA Technical Reports Server (NTRS)

    Hankins, W. W.; Pennington, J. E.; Barker, L. K.

    1983-01-01

    The state of the art in the automation of decision making and problem solving is reviewed. The information upon which the report is based was derived from literature searches, visits to university and government laboratories performing basic research in the area, and a 1980 Langley Research Center sponsored conference on the subject. It is the contention of the authors that the technology in this area is being generated by research primarily in the three disciplines of Artificial Intelligence, Control Theory, and Operations Research. Under the assumption that the state of the art in decision making and problem solving is reflected in the problems being solved, specific problems and methods of their solution are often discussed to elucidate particular aspects of the subject. Synopses of the following major topic areas comprise most of the report: (1) detection and recognition; (2) planning and scheduling; (3) learning; (4) theorem proving; (5) distributed systems; (6) knowledge bases; (7) search; (8) heuristics; and (9) evolutionary programming.

  16. Calculation of thermomechanical fatigue life based on isothermal behavior

    NASA Technical Reports Server (NTRS)

    Halford, Gary R.; Saltsman, James F.

    1987-01-01

    The isothermal and thermomechanical fatigue (TMF) crack initiation response of a hypothetical material was analyzed. Expected thermomechanical behavior was evaluated numerically based on simple, isothermal, cyclic stress-strain - time characteristics and on strainrange versus cyclic life relations that have been assigned to the material. The attempt was made to establish basic minimum requirements for the development of a physically accurate TMF life-prediction model. A worthy method must be able to deal with the simplest of conditions: that is, those for which thermal cycling, per se, introduces no damage mechanisms other than those found in isothermal behavior. Under these assumed conditions, the TMF life should be obtained uniquely from known isothermal behavior. The ramifications of making more complex assumptions will be dealt with in future studies. Although analyses are only in their early stages, considerable insight has been gained in understanding the characteristics of several existing high-temperature life-prediction methods. The present work indicates that the most viable damage parameter is based on the inelastic strainrange.

  17. Introduction and application of non-stationary standardized precipitation index considering probability distribution function and return period

    NASA Astrophysics Data System (ADS)

    Park, Junehyeong; Sung, Jang Hyun; Lim, Yoon-Jin; Kang, Hyun-Suk

    2018-05-01

    The widely used meteorological drought index, the Standardized Precipitation Index (SPI), basically assumes stationarity, but recent changes in the climate have led to a need to review this hypothesis. In this study, a new non-stationary SPI that considers not only the modified probability distribution parameter but also the return period under the non-stationary process was proposed. The results were evaluated for two severe drought cases during the last 10 years in South Korea. As a result, the SPIs based on the non-stationary hypothesis indicated less severe drought than the stationary SPI, even though these two past droughts were recognized as significantly severe. This may be because the variances of summer and autumn precipitation become larger over time, which makes the probability distribution wider than before. This implies that drought severity expressed by a statistical index such as the SPI can be distorted by the stationarity assumption, and a cautious approach is needed when assigning drought levels under a changing climate.

  18. Epistemological issues in the study of microbial life: alternative terran biospheres?

    PubMed

    Cleland, Carol E

    2007-12-01

    The assumption that all life on Earth today shares the same basic molecular architecture and biochemistry is part of the paradigm of modern biology. This paper argues that there is little theoretical or empirical support for this widely held assumption. Scientists know that life could have been at least modestly different at the molecular level and it is clear that alternative molecular building blocks for life were available on the early Earth. If the emergence of life is, like other natural phenomena, highly probable given the right chemical and physical conditions then it seems likely that the early Earth hosted multiple origins of life, some of which produced chemical variations on life as we know it. While these points are often conceded, it is nevertheless maintained that any primitive alternatives to familiar life would have been eliminated long ago, either amalgamated into a single form of life through lateral gene transfer (LGT) or alternatively out-competed by our putatively more evolutionarily robust form of life. Besides, the argument continues, if such life forms still existed, we surely would have encountered telling signs of them by now. These arguments do not hold up well under close scrutiny. They reflect a host of assumptions that are grounded in our experience with large multicellular organisms and, most importantly, do not apply to microbial forms of life, which cannot be easily studied without the aid of sophisticated technologies. Significantly, the most powerful molecular biology techniques available, polymerase chain reaction (PCR) amplification of rRNA genes augmented by metagenomic analysis, could not detect such microbes if they existed. Given the profound philosophical and scientific importance that such a discovery would represent, a dedicated search for 'shadow microbes' (heretofore unrecognized 'alien' forms of terran microbial life) seems in order. The best place to start such a search is with puzzling (anomalous) phenomena, such as desert varnish, that resist classification as 'biological' or 'nonbiological'.

  19. Soot Formation in Purely-Curved Premixed Flames and Laminar Flame Speeds of Soot-Forming Flames

    NASA Technical Reports Server (NTRS)

    Buchanan, Thomas; Wang, Hai

    2005-01-01

    The research addressed here is a collaborative project between University of Delaware and Case Western Reserve University. There are two basic and related scientific objectives. First, we wish to demonstrate the suitability of spherical/cylindrical, laminar, premixed flames in the fundamental study of the chemical and physical processes of soot formation. Our reasoning is that the flame standoff distance in spherical/cylindrical flames under microgravity can be substantially larger than that in a flat burner-stabilized flame. Therefore the spherical/cylindrical flame is expected to give better spatial resolution to probe the soot inception and growth chemistry than flat flames. Second, we wish to examine the feasibility of determining the laminar flame speed of soot forming flames. Our basic assumption is that under the adiabatic condition (in the absence of conductive heat loss), the amount and dynamics of soot formed in the flame is unique for a given fuel/air mixture. The laminar flame speed can be rigorously defined as long as the radiative heat loss can be determined. This laminar flame speed characterizes the flame soot formation and dynamics in addition to the heat release rate. The research involves two integral parts: experiments on spherical and cylindrical sooting flames in microgravity (CWRU), and the computational counterpart (UD) that aims to simulate sooting laminar flames and the sooting limits of near-adiabatic flames. The computational work is described in this report, followed by a summary of the accomplishments achieved to date. Details of the microgravity experiments will be discussed in a separate, final report prepared by the co-PI, Professor C.-J. Sung of CWRU. Here only a brief discussion of these experiments will be given.

  20. Historical Thinking and Other Unnatural Acts: Charting the Future of Teaching the Past. Critical Perspectives on the Past.

    ERIC Educational Resources Information Center

    Wineburg, Sam

    What ways of thinking, writing, and questioning would be lost if we eliminated history from the curriculum? The essays in this book begin with the basic assumption that history teaches people a way to make choices, to balance opinions, to tell stories, and to become uneasy--when necessary--about the stories that are told. The book is concerned…

  1. Recombination-generation currents in degenerate semiconductors

    NASA Technical Reports Server (NTRS)

    Von Roos, O.

    1978-01-01

    The classical Shockley-Read-Hall theory of free carrier recombination and generation via traps is extended to degenerate semiconductors. A concise and simple expression is found which avoids completely the concept of a Fermi level, a concept which is alien to nonequilibrium situations. Assumptions made in deriving the recombination generation current are carefully delineated and are found to be basically identical to those made in the original theory applicable to nondegenerate semiconductors.

  2. French NATO Policy: The Next Five Years

    DTIC Science & Technology

    1990-06-01

    tradeoffs on the ambitious French modernization programs. Most dramatic have been the projected strategic consequences of perestroika: France, like... project power into areas of French influence in the Third World. In the mid-1980s, France was spending roughly 3.9 percent of gross domestic product on...policy environment and its effects on the basic assumptions underpinning French policy. He concludes that in the future, France will be easier to work

  3. Techniques for the computation in demographic projections of health manpower.

    PubMed

    Horbach, L

    1979-01-01

    Some basic principles and algorithms are presented which can be used for projective calculations of medical staff on the basis of demographic data. The effects of modifications of the input data, such as by health policy measures concerning training capacity, can be demonstrated by repeated calculations under varied assumptions. Such models give a variety of results and may highlight the probable future balance between health manpower supply and requirements.
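    A minimal sketch of such a projective calculation, with purely illustrative numbers, shows how rerunning under a modified training intake changes the projected supply:

```python
def project_workforce(initial, years, annual_intake, attrition_rate):
    """Simple stock-flow projection of a medical workforce: each year
    add new graduates and remove a fixed fraction to retirement,
    emigration and death.  Rerunning with a modified intake shows the
    effect of a training-capacity policy measure on future supply."""
    stock = [float(initial)]
    for _ in range(years):
        stock.append(stock[-1] * (1.0 - attrition_rate) + annual_intake)
    return stock

# Hypothetical inputs: 10,000 physicians, 3% annual attrition, and a
# policy choice between 600 and 800 graduates per year.
baseline = project_workforce(10_000, 10, annual_intake=600, attrition_rate=0.03)
expanded = project_workforce(10_000, 10, annual_intake=800, attrition_rate=0.03)
print(round(baseline[-1]), round(expanded[-1]))
```

    Comparing the two runs against a projected requirement series is the balance check the abstract describes.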

  4. Forging a Combat Mobility Culture

    DTIC Science & Technology

    2006-04-01

    values and beliefs, and basic assumptions. Artifacts are the most visible aspects of an organization. They include physical environment... Leadership, Command, and Communication Studies Academic Year 2006 Coursebook, edited by Sharon McBride, Maxwell AFB, AL: Air Command and Staff... Air Force Doing it Right?” In Leadership, Command, and Communication Studies Academic Year 2006 Coursebook. Edited by Sharon McBride. Maxwell AFB, AL: Air Command and Staff College, October 2005.

  5. Modeling precipitation δ 18O variability in East Asia since the Last Glacial Maximum: temperature and amount effects across different timescales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wen, Xinyu; Liu, Zhengyu; Chen, Zhongxiao

    Water isotopes in precipitation have played a key role in the reconstruction of past climate on millennial timescales and longer. But, for midlatitude regions like East Asia with complex terrain, the reliability behind the basic assumptions of the temperature effect and amount effect is based on modern observational data and still remains unclear for past climate. In the present work, we reexamine the two basic effects on seasonal, interannual, and millennial timescales in a set of time slice experiments for the period 22–0 ka using an isotope-enabled atmospheric general circulation model (AGCM). Our study confirms the robustness of the temperature and amount effects on the seasonal cycle over China in the present climatic conditions, with the temperature effect dominating in northern China and the amount effect dominating in the far south of China but no distinct effect in the transition region of central China. However, our analysis shows that neither temperature nor amount effect is significantly dominant over China on millennial and interannual timescales, which is a challenge to those classic assumptions in past climate reconstruction. This work helps shed light on the interpretation of the proxy record of δ 18O from a modeling point of view.

  6. Modeling precipitation δ 18O variability in East Asia since the Last Glacial Maximum: temperature and amount effects across different timescales

    DOE PAGES

    Wen, Xinyu; Liu, Zhengyu; Chen, Zhongxiao; ...

    2016-11-06

    Water isotopes in precipitation have played a key role in the reconstruction of past climate on millennial timescales and longer. But, for midlatitude regions like East Asia with complex terrain, the reliability behind the basic assumptions of the temperature effect and amount effect is based on modern observational data and still remains unclear for past climate. In the present work, we reexamine the two basic effects on seasonal, interannual, and millennial timescales in a set of time slice experiments for the period 22–0 ka using an isotope-enabled atmospheric general circulation model (AGCM). Our study confirms the robustness of the temperature and amount effects on the seasonal cycle over China in the present climatic conditions, with the temperature effect dominating in northern China and the amount effect dominating in the far south of China but no distinct effect in the transition region of central China. However, our analysis shows that neither temperature nor amount effect is significantly dominant over China on millennial and interannual timescales, which is a challenge to those classic assumptions in past climate reconstruction. This work helps shed light on the interpretation of the proxy record of δ 18O from a modeling point of view.

  7. Dynamics of an HIV-1 infection model with cell mediated immunity

    NASA Astrophysics Data System (ADS)

    Yu, Pei; Huang, Jianing; Jiang, Jiao

    2014-10-01

    In this paper, we study the dynamics of an improved mathematical model of the HIV-1 virus with cell-mediated immunity. This new 5-dimensional model is based on the combination of a basic 3-dimensional HIV-1 model and a 4-dimensional immunity response model, which more realistically describes the dynamics between the uninfected cells, infected cells, virus, the CTL response cells, and CTL effector cells. Our 5-dimensional model may be reduced to the 4-dimensional model by applying a quasi-steady state assumption on the virus variable. However, it is shown in this paper that the virus must be retained in the modeling, and that a quasi-steady state assumption should be applied carefully, since it may miss some important dynamical behavior of the system. Detailed bifurcation analysis is given to show that the system has three equilibrium solutions, namely the infection-free equilibrium, the infectious equilibrium without CTL, and the infectious equilibrium with CTL, and a series of bifurcations including two transcritical bifurcations and one or two possible Hopf bifurcations occur from these three equilibria as the basic reproduction number is varied. The mathematical methods applied in this paper include characteristic equations, the Routh-Hurwitz condition, the fluctuation lemma, Lyapunov functions, and computation of normal forms. Numerical simulation is also presented to demonstrate the applicability of the theoretical predictions.
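    The caution about quasi-steady state reductions can be illustrated on the basic 3-variable target-cell model, a simpler cousin of the 5-dimensional system above. All parameter values below are illustrative placeholders, not the paper's; the check is that when viral clearance is fast, the reduced model tracks the full one:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Basic target-cell-limited model (illustrative parameters):
#   T' = s - d*T - b*T*V   (uninfected cells)
#   I' = b*T*V - a*I       (infected cells)
#   V' = k*I - c*V         (free virus)
s, d, b, a, k, c = 10.0, 0.1, 2e-3, 0.5, 50.0, 5.0

def full(t, y):
    T, I, V = y
    return [s - d*T - b*T*V, b*T*V - a*I, k*I - c*V]

def reduced(t, y):
    # Quasi-steady-state assumption: virus equilibrates fast, so set
    # V = k*I/c and drop V as a state variable.
    T, I = y
    V = k * I / c
    return [s - d*T - b*T*V, b*T*V - a*I]

sol3 = solve_ivp(full, (0.0, 300.0), [100.0, 0.0, 1.0], rtol=1e-8, atol=1e-10)
sol2 = solve_ivp(reduced, (0.0, 300.0), [100.0, c / k], rtol=1e-8, atol=1e-10)

T3, I3, V3 = sol3.y[:, -1]
T2, I2 = sol2.y[:, -1]
print(round(V3 * c / (k * I3), 3))  # QSS ratio, near 1 once transients decay
print(round(T3, 1), round(T2, 1))   # both settle toward T* = a*c/(b*k) = 25
```

    The abstract's warning is about the transient, not the equilibrium: during rapid changes V lags k*I/c, which is exactly where a careless reduction can miss dynamical behavior such as the onset of oscillations.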

  8. A novel methodology for estimating upper limits of major cost drivers for profitable conceptual launch system architectures

    NASA Astrophysics Data System (ADS)

    Rhodes, Russel E.; Byrd, Raymond J.

    1998-01-01

    This paper presents a "back of the envelope" technique for fast, on-the-spot assessment of the affordability (profitability) of commercial space transportation architectural concepts. The tool presented here is not intended to replace conventional, detailed costing methodology. The process described enables "quick look" estimations and assumptions to determine effectively whether an initial concept (with its attendant cost-estimating line items) provides focus for major leapfrog improvement. The Cost Charts Users Guide provides a generic sample tutorial, building an approximate understanding of the basic launch system cost factors and their representative magnitudes. This process enables the user to develop a net "cost (and price) per payload-mass unit to orbit" incorporating a variety of significant cost drivers, supplemental to basic vehicle cost estimates. If acquisition-cost and recurring-cost factors (as a function of cost per payload-mass unit to orbit) do not meet the predetermined system-profitability goal, the concept in question will be clearly seen as non-competitive. Because the tool has inherent flexibility, multiple analytical approaches and a variety of interrelated assumptions can be examined in a quick, on-the-spot cost approximation analysis. The technique allows determination of concept conformance to system objectives.

  9. Approximate calculation of multispar cantilever and semicantilever wings with parallel ribs under direct and indirect loading

    NASA Technical Reports Server (NTRS)

    Sanger, Eugen

    1932-01-01

    A method is presented for approximate static calculation, which is based on the customary assumption of rigid ribs, while taking into account the systematic errors in the calculation results due to this arbitrary assumption. The procedure is given in greater detail for semicantilever and cantilever wings with polygonal spar plan form and for wings under direct loading only. The last example illustrates the advantages of the use of influence lines for such wing structures and their practical interpretation.

  10. Plants do not count… or do they? New perspectives on the universality of senescence

    PubMed Central

    Salguero-Gómez, Roberto; Shefferson, Richard P; Hutchings, Michael J

    2013-01-01

    1. Senescence, the physiological decline that results in decreasing survival and/or reproduction with age, remains one of the most perplexing topics in biology. Most theories explaining the evolution of senescence (i.e. antagonistic pleiotropy, accumulation of mutations, disposable soma) were developed decades ago. Even though these theories have implicitly focused on unitary animals, they have also been used as the foundation from which the universality of senescence across the tree of life is assumed. 2. Surprisingly, little is known about the general patterns, causes and consequences of whole-individual senescence in the plant kingdom. There are important differences between plants and most animals, including modular architecture, the absence of early determination of cell lines between the soma and gametes, and cellular division that does not always shorten telomere length. These characteristics violate the basic assumptions of the classical theories of senescence and therefore call the generality of senescence theories into question. 3. This Special Feature contributes to the field of whole-individual plant senescence with five research articles addressing topics ranging from physiology to demographic modelling and comparative analyses. These articles critically examine the basic assumptions of senescence theories such as age-specific gene action, the evolution of senescence regardless of the organism's architecture and environmental filtering, and the role of abiotic agents on mortality trajectories. 4. Synthesis. Understanding the conditions under which senescence has evolved is of general importance across biology, ecology, evolution, conservation biology, medicine, gerontology, law and social sciences. The question ‘why is senescence universal or why is it not?’ naturally calls for an evolutionary perspective. 
Senescence is a puzzling phenomenon, and new insights will be gained by uniting methods, theories and observations from formal demography, animal demography and plant population ecology. Plants are more amenable than animals to experiments investigating senescence, and there is a wealth of published plant demographic data that enable interpretation of experimental results in the context of their full life cycles. It is time to make plants count in the field of senescence. PMID:23853389

  11. Fair lineups are better than biased lineups and showups, but not because they increase underlying discriminability.

    PubMed

    Smith, Andrew M; Wells, Gary L; Lindsay, R C L; Penrod, Steven D

    2017-04-01

    Receiver Operating Characteristic (ROC) analysis has recently come into vogue for assessing the underlying discriminability and the applied utility of lineup procedures. Two primary assumptions underlie recommendations that ROC analysis be used to assess the applied utility of lineup procedures: (a) ROC analysis of lineups measures underlying discriminability, and (b) the procedure that produces superior underlying discriminability produces superior applied utility. These same assumptions underlie a recently derived diagnostic-feature detection theory, a theory of discriminability intended to explain recent patterns observed in ROC comparisons of lineups. We demonstrate, however, that these assumptions are incorrect when ROC analysis is applied to lineups. We also demonstrate that a structural phenomenon of lineups, differential filler siphoning, and not the psychological phenomenon of diagnostic-feature detection, explains why lineups are superior to showups and why fair lineups are superior to biased lineups. In the process of our proofs, we show that computational simulations have assumed, unrealistically, that all witnesses share exactly the same decision criteria. When criterial variance is included in computational models, differential filler siphoning emerges. The result proves a dissociation between ROC curves and underlying discriminability: higher ROC curves for lineups than for showups, and for fair than for biased lineups, despite no increase in underlying discriminability. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
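    The criterial-variance point can be illustrated with a small signal-detection simulation (a hypothetical sketch with assumed parameter values, not the authors' code): target-absent trials are simulated for a showup and a fair six-person lineup, with each simulated witness drawing their own decision criterion. Fillers siphon choices away from the innocent suspect, so false identifications drop even though underlying discriminability is unchanged.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000  # simulated target-absent trials (assumed sample size)

    def false_id_rate(n_fillers, crit_sd):
        # Each witness has their own criterion (criterial variance).
        crit = rng.normal(1.0, crit_sd, n)
        # Innocent suspect (column 0) and fillers are all drawn from the
        # same "innocent" memory-strength distribution: no change in
        # underlying discriminability between procedures.
        faces = rng.normal(0.0, 1.0, (n, 1 + n_fillers))
        picked = faces.argmax(axis=1)          # witness considers the best match...
        chose = faces.max(axis=1) > crit       # ...and picks it only if above criterion
        return np.mean(chose & (picked == 0))  # innocent suspect falsely identified

    showup = false_id_rate(n_fillers=0, crit_sd=0.5)
    lineup = false_id_rate(n_fillers=5, crit_sd=0.5)  # fillers siphon choices
    ```

    The lineup's lower false-ID rate here is purely structural: fillers sometimes out-match the innocent suspect and absorb the choice.
    
    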

  12. On the Use of Rank Tests and Estimates in the Linear Model.

    DTIC Science & Technology

    1982-06-01

    assumption A5, McKean and Hettmansperger (1976) show that τ̂ = (W(N-c) - W(c+1))/(2Z_{α/2}) (14), where 2Z_{α/2} is the 1-α interpercentile range of the standard... r(.75n) - r(.25n)) (13). The window width h incorporates a resistant estimate of scale, the interquartile range of the residuals, and a normalizing... alternative estimate of τ is available with the additional assumption of symmetry of the error distribution. ASSUMPTION: A5. Suppose the underlying error

  13. Fourier's law of heat conduction: quantum mechanical master equation analysis.

    PubMed

    Wu, Lian-Ao; Segal, Dvira

    2008-06-01

    We derive the macroscopic Fourier's Law of heat conduction from the exact gain-loss time convolutionless quantum master equation under three assumptions for the interaction kernel. To second order in the interaction, we show that the first two assumptions are natural results of the long time limit. The third assumption can be satisfied by a family of interactions consisting of an exchange effect. The pure exchange model directly leads to energy diffusion in a weakly coupled spin-1/2 chain.

  14. Ontology Extraction Tools: An Empirical Study with Educators

    ERIC Educational Resources Information Center

    Hatala, M.; Gasevic, D.; Siadaty, M.; Jovanovic, J.; Torniai, C.

    2012-01-01

    Recent research in Technology-Enhanced Learning (TEL) demonstrated several important benefits that semantic technologies can bring to the TEL domain. An underlying assumption for most of these research efforts is the existence of a domain ontology. The second unspoken assumption follows that educators will build domain ontologies for their…

  15. Extracurricular Business Planning Competitions: Challenging the Assumptions

    ERIC Educational Resources Information Center

    Watson, Kayleigh; McGowan, Pauric; Smith, Paul

    2014-01-01

    Business planning competitions [BPCs] are a commonly offered yet under-examined extracurricular activity. Given the extent of sceptical comment about business planning, this paper offers what the authors believe is a much-needed critical discussion of the assumptions that underpin the provision of such competitions. In doing so it is suggested…

  16. Hybrid Approaches and Industrial Applications of Pattern Recognition,

    DTIC Science & Technology

    1980-10-01

    emphasized that the probability distribution in (9) is correct only under the assumption that P(w|x) is known exactly. In practice this assumption will... sufficient precision. The alternative would be to take the probability distribution of estimates of P(w|x) into account in the analysis. However, from the

  17. Diagnostic tools for nearest neighbors techniques when used with satellite imagery

    Treesearch

    Ronald E. McRoberts

    2009-01-01

    Nearest neighbors techniques are non-parametric approaches to multivariate prediction that are useful for predicting both continuous and categorical forest attribute variables. Although some assumptions underlying nearest neighbor techniques are common to other prediction techniques such as regression, other assumptions are unique to nearest neighbor techniques....

  18. Ontological, Epistemological and Methodological Assumptions: Qualitative versus Quantitative

    ERIC Educational Resources Information Center

    Ahmed, Abdelhamid

    2008-01-01

    The review to follow is a comparative analysis of two studies conducted in the field of TESOL in Education published in "TESOL QUARTERLY." The aspects to be compared are as follows. First, a brief description of each study will be presented. Second, the ontological, epistemological and methodological assumptions underlying each study…

  19. Questionable Validity of Poisson Assumptions in a Combined Loglinear/MDS Mapping Model.

    ERIC Educational Resources Information Center

    Gleason, John M.

    1993-01-01

    This response to an earlier article on a combined log-linear/MDS model for mapping journals by citation analysis discusses the underlying assumptions of the Poisson model with respect to characteristics of the citation process. The importance of empirical data analysis is also addressed. (nine references) (LRW)

  20. Shattering the Glass Ceiling: Women in School Administration.

    ERIC Educational Resources Information Center

    Patterson, Jean A.

    Consistent with national trends, white males hold the majority of public school administrator positions in North Carolina. This paper examines the barriers and underlying assumptions that have prevented women and minorities from gaining access to high-level positions in educational administration. These include: (1) the assumption that leadership…

  1. Transferring Goods or Splitting a Resource Pool

    ERIC Educational Resources Information Center

    Dijkstra, Jacob; Van Assen, Marcel A. L. M.

    2008-01-01

    We investigated the consequences for exchange outcomes of the violation of an assumption underlying most social psychological research on exchange. This assumption is that the negotiated direct exchange of commodities between two actors (pure exchange) can be validly represented as two actors splitting a fixed pool of resources (split pool…

  2. Preparing Democratic Education Leaders

    ERIC Educational Resources Information Center

    Young, Michelle D.

    2010-01-01

    Although it is common to hear people espouse the importance of education to ensuring a strong and vibrant democracy, the assumptions underlying such statements are rarely unpacked. Two of the most widespread, though not necessarily complementary, assumptions include: (1) to truly participate in a democracy, citizens must be well educated; and (2)…

  3. Commentary on Coefficient Alpha: A Cautionary Tale

    ERIC Educational Resources Information Center

    Green, Samuel B.; Yang, Yanyun

    2009-01-01

    The general use of coefficient alpha to assess reliability should be discouraged on a number of grounds. The assumptions underlying coefficient alpha are unlikely to hold in practice, and violation of these assumptions can result in nontrivial negative or positive bias. Structural equation modeling was discussed as an informative process both to…

  4. Timber value—a matter of choice: a study of how end use assumptions affect timber values.

    Treesearch

    John H. Beuter

    1971-01-01

    The relationship between estimated timber values and actual timber prices is discussed. Timber values are related to how, where, and when the timber is used. An analysis demonstrates the relative values of a typical Douglas-fir stand under assumptions about timber use.

  5. Mexican-American Cultural Assumptions and Implications.

    ERIC Educational Resources Information Center

    Carranza, E. Lou

    The search for presuppositions of a people's thought is not new. Octavio Paz and Samuel Ramos have both attempted to describe the assumptions underlying the Mexican character. Paz described Mexicans as private, defensive, and stoic, characteristics taken to the extreme in the "pachuco." Ramos, on the other hand, described Mexicans as…

  6. Discrete Deterministic and Stochastic Petri Nets

    NASA Technical Reports Server (NTRS)

    Zijal, Robert; Ciardo, Gianfranco

    1996-01-01

    Petri nets augmented with timing specifications have gained wide acceptance in the area of performance and reliability evaluation of complex systems exhibiting concurrency, synchronization, and conflicts. The state space of time-extended Petri nets is mapped onto its basic underlying stochastic process, which can be shown to be Markovian under the assumption of exponentially distributed firing times. The integration of exponentially and non-exponentially distributed timing is still one of the major problems for the analysis and was first attacked for continuous-time Petri nets at the cost of structural or analytical restrictions. We propose a discrete deterministic and stochastic Petri net (DDSPN) formalism with no imposed structural or analytical restrictions, where transitions can fire either in zero time or according to arbitrary firing times that can be represented as the time to absorption in a finite absorbing discrete-time Markov chain (DTMC). Exponentially distributed firing times are then approximated arbitrarily well by geometric distributions. Deterministic firing times are a special case of the geometric distribution. The underlying stochastic process of a DDSPN is then also a DTMC, from which the transient and stationary solution can be obtained by standard techniques. A comprehensive algorithm and some state space reduction techniques for the analysis of DDSPNs are presented, comprising the automatic detection of conflicts and confusions, which removes a major obstacle for the analysis of discrete time models.
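    The geometric approximation of exponential firing times can be sketched as follows (illustrative parameters; `dt` is the assumed tick width of the discrete clock): with per-tick firing probability p = 1 − exp(−λ·dt), the number of ticks until firing is geometric, and the resulting firing time approaches an Exp(λ) distribution as dt → 0.

    ```python
    import math
    import random

    def geometric_firing_time(lam, dt, rng):
        # Fire with probability p on each tick of a discrete clock of width dt;
        # the tick count is geometric, approximating an Exp(lam) firing time.
        p = 1.0 - math.exp(-lam * dt)
        t = dt
        while rng.random() >= p:
            t += dt
        return t

    rng = random.Random(42)
    lam, dt = 2.0, 0.01
    samples = [geometric_firing_time(lam, dt, rng) for _ in range(50_000)]
    mean_time = sum(samples) / len(samples)  # ≈ 1/lam for small dt
    ```

    The exact mean is dt/p = dt/(1 − e^(−λ·dt)) ≈ 1/λ + dt/2, so the discretization bias shrinks linearly with the tick width.
    
    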

  7. A new framework of statistical inferences based on the valid joint sampling distribution of the observed counts in an incomplete contingency table.

    PubMed

    Tian, Guo-Liang; Li, Hui-Qiong

    2017-08-01

    Some existing confidence interval methods and hypothesis testing methods in the analysis of a contingency table with incomplete observations in both margins entirely depend on an underlying assumption that the sampling distribution of the observed counts is a product of independent multinomial/binomial distributions for complete and incomplete counts. However, it can be shown that this independence assumption is incorrect and can result in unreliable conclusions because of the under-estimation of the uncertainty. Therefore, the first objective of this paper is to derive the valid joint sampling distribution of the observed counts in a contingency table with incomplete observations in both margins. The second objective is to provide a new framework for analyzing incomplete contingency tables based on the derived joint sampling distribution of the observed counts by developing a Fisher scoring algorithm to calculate maximum likelihood estimates of parameters of interest, bootstrap confidence interval methods, and bootstrap hypothesis-testing methods. We compare the differences between the valid sampling distribution and the sampling distribution under the independence assumption. Simulation studies showed that average/expected confidence-interval widths of parameters based on the sampling distribution under the independence assumption are shorter than those based on the new sampling distribution, yielding unrealistic results. A real data set is analyzed to illustrate the application of the new sampling distribution for incomplete contingency tables, and the analysis results again confirm the conclusions obtained from the simulation studies.
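    The bootstrap machinery involved can be illustrated with a generic percentile-bootstrap confidence interval (a sketch on made-up data; the paper's methods resample from the derived joint sampling distribution, not this simple binary sample):

    ```python
    import random

    def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=1):
        # Percentile bootstrap: resample with replacement, recompute the
        # statistic, and take empirical quantiles of the replicates.
        rng = random.Random(seed)
        reps = sorted(
            stat([rng.choice(data) for _ in data]) for _ in range(n_boot)
        )
        lo = reps[int(n_boot * alpha / 2)]
        hi = reps[int(n_boot * (1 - alpha / 2)) - 1]
        return lo, hi

    data = [1] * 30 + [0] * 70  # hypothetical sample: 30 successes in 100 trials
    lo, hi = bootstrap_ci(data, lambda xs: sum(xs) / len(xs))
    ```

    The point of the paper is that the *resampling model* matters: resampling under an incorrect independence assumption yields intervals that are too narrow, regardless of how the quantiles are taken.
    
    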

  8. Dynamic Self-Consistent Field Theories for Polymer Blends and Block Copolymers

    NASA Astrophysics Data System (ADS)

    Kawakatsu, Toshihiro

    Understanding the behavior of phase-separated domain structures and the rheological properties of multi-component polymeric systems requires detailed information on the dynamics of domains and on the conformations of the constituent polymer chains. Self-consistent field (SCF) theory is a useful tool for such problems because the conformational entropy of polymer chains in inhomogeneous systems can be evaluated quantitatively with it. However, when we turn our attention to dynamic properties in a non-equilibrium state, the basic assumption of SCF theory, i.e., the assumption of equilibrium chain conformation, breaks down. To avoid this difficulty, dynamic SCF theories were developed. In this chapter, we give a brief review of recent developments in dynamic SCF theories and discuss where the cutting edge of this theory lies.

  9. Values and assumptions in the development of DSM-III and DSM-III-R: an insider's perspective and a belated response to Sadler, Hulgus, and Agich's "On values in recent American psychiatric classification".

    PubMed

    Spitzer, R L

    2001-06-01

    It is widely acknowledged that the approach taken in the development of a classification of mental disorders is guided by various values and assumptions. The author, who played a central role in the development of DSM-III (American Psychiatric Association [1980] Diagnostic and statistical manual of mental disorders, 3rd ed. Washington, DC: Author) and DSM-III-R (American Psychiatric Association [1987] Diagnostic and statistical manual of mental disorders, 3rd ed, rev. Washington, DC: Author), will explicate the basic values and assumptions that guided the development of these two diagnostic manuals. In so doing, the author will respond to the critique of DSM-III and DSM-III-R made by Sadler et al. in their 1994 paper (Sadler JZ, Hulgus YF, Agich GJ [1994] On values in recent American psychiatric classification. J Med Phil 19:261-277). The author will attempt to demonstrate that the stated goals of DSM-III and DSM-III-R are not inherently in conflict and are easily explicated by appealing to widely held values and assumptions, most of which appeared in the literature during the development of the manuals. Furthermore, the author will demonstrate that it is not true that DSM-III places greater emphasis on reliability over validity and is covertly committed to a biological approach to explaining psychiatric disturbance.

  10. Comparison of Two Methods for Detecting Alternative Splice Variants Using GeneChip® Exon Arrays

    PubMed Central

    Fan, Wenhong; Stirewalt, Derek L.; Radich, Jerald P.; Zhao, Lueping

    2011-01-01

    The Affymetrix GeneChip Exon Array can be used to detect alternative splice variants. Microarray Detection of Alternative Splicing (MIDAS) and Partek® Genomics Suite (Partek® GS) are among the most popular analytical methods used to analyze exon array data. While both methods utilize statistical significance for testing, MIDAS and Partek® GS could produce somewhat different results due to different underlying assumptions. Comparing MIDAS and Partek® GS is quite difficult due to their substantially different mathematical formulations and assumptions regarding alternative splice variants. For meaningful comparison, we have used the previously published generalized probe model (GPM) which encompasses both MIDAS and Partek® GS under different assumptions. We analyzed a colon cancer exon array data set using MIDAS, Partek® GS and GPM. MIDAS and Partek® GS produced quite different sets of genes that are considered to have alternative splice variants. Further, we found that GPM produced results similar to MIDAS as well as to Partek® GS under their respective assumptions. Within the GPM, we show how discoveries relating to alternative variants can be quite different due to different assumptions. MIDAS focuses on relative changes in expression values across different exons within genes and tends to be robust but less efficient. Partek® GS, however, uses absolute expression values of individual exons within genes and tends to be more efficient but more sensitive to the presence of outliers. From our observations, we conclude that MIDAS and Partek® GS produce complementary results, and discoveries from both analyses should be considered. PMID:23675234

  11. Approximations of Two-Attribute Utility Functions

    DTIC Science & Technology

    1976-09-01

    preferred to") be a binary relation on the set of simple probability measures or 'gambles' defined on a set T of consequences. Throughout this study it...simplifying independence assumptions. Although there are several approaches to this problem, the present study will focus on approximations of u... study will elicit additional interest in the topic. 2. REMARKS ON APPROXIMATION THEORY. This section outlines a few basic ideas of approximation theory

  12. WalkThrough Example Procedures for MAMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruggiero, Christy E.; Gaschen, Brian Keith; Bloch, Jeffrey Joseph

    This documentation is a growing set of walk-through examples of analyses using the MAMA V2.0 software. It does not cover all the features or possibilities of the MAMA software, but addresses using many of the basic analysis tools to quantify particle size and shape in an image. This document will continue to evolve as additional procedures and examples are added. The starting assumption is that the MAMA software has been successfully installed.

  13. Deepening and Extending Channels for Navigation, Georgetown Harbor, South Carolina. Review of Reports.

    DTIC Science & Technology

    1978-01-01

    FEASIBILITY REPORT... REVIEW OF REPORTS... JANUARY 1978... ADOPTED JANUARY 28, 1958... ENVIRONMENTAL ASSESSMENT... review. As a result of this review, it was judged that some of the basic assumptions presented in the draft report were no longer applicable and that

  14. Predictability of currency market exchange

    NASA Astrophysics Data System (ADS)

    Ohira, Toru; Sazuka, Naoya; Marumo, Kouhei; Shimizu, Tokiko; Takayasu, Misako; Takayasu, Hideki

    2002-05-01

    We analyze tick data of the yen-dollar exchange rate with a focus on its up and down movements. We show that a rather particular conditional probability structure exists in such high-frequency data. This result provides evidence to question one of the basic assumptions of traditional market theory, in which such bias in high-frequency price movements is regarded as absent. We also systematically construct a random walk model reflecting this probability structure.
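    The kind of conditional structure at issue can be illustrated on synthetic data (not the yen-dollar ticks; the reversal bias below is assumed purely for illustration): estimate P(up | previous move) by counting consecutive move pairs.

    ```python
    import random
    from collections import Counter

    rng = random.Random(0)

    # Synthetic up/down tick sequence with a built-in reversal bias:
    # an up move is less likely after an up move than after a down move.
    moves, prev = [], 1
    for _ in range(100_000):
        p_up = 0.3 if prev == 1 else 0.7  # assumed bias, for illustration
        prev = 1 if rng.random() < p_up else -1
        moves.append(prev)

    # Conditional probabilities estimated from consecutive pairs.
    pairs = Counter(zip(moves, moves[1:]))
    p_up_after_up = pairs[(1, 1)] / (pairs[(1, 1)] + pairs[(1, -1)])
    p_up_after_down = pairs[(-1, 1)] / (pairs[(-1, 1)] + pairs[(-1, -1)])
    # An unbiased random walk would give both probabilities ≈ 0.5.
    ```

    Under the traditional random-walk assumption both conditional probabilities would be near 0.5; a systematic departure of the kind the study reports is exactly what this counting exercise detects.
    
    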

  15. New Beginnings: Ensuring Quality Bilingual/ESL Instruction in New York City Public Schools. Executive Summary [and] Report of the Chancellor's Bilingual/ESL Education Practitioners' Workgroup and Policy/Research Panels.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Office of Bilingual Education.

    The report presents a conceptual framework and related strategies designed to help policymakers and practitioners re-examine, and when necessary, rework the basic assumptions and practices defining the educational experiences of bilingual/English-as-a-Second-Language (ESL) learners in New York City (New York) public schools. The report consists of…

  16. Performance of species occurrence estimators when basic assumptions are not met: a test using field data where true occupancy status is known

    USGS Publications Warehouse

    Miller, David A. W.; Bailey, Larissa L.; Grant, Evan H. Campbell; McClintock, Brett T.; Weir, Linda A.; Simons, Theodore R.

    2015-01-01

    Our results demonstrate that even small probabilities of misidentification and among-site detection heterogeneity can have severe effects on estimator reliability if ignored. We challenge researchers to place greater attention on both heterogeneity and false positives when designing and analysing occupancy studies. We provide 9 specific recommendations for the design, implementation and analysis of occupancy studies to better meet this challenge.

  17. Conditioned Limit Theorems for Some Null Recurrent Markov Processes

    DTIC Science & Technology

    1976-08-01

    Chapter 1 INTRODUCTION. 1.1 Summary of Results. Let (Vk, k ≥ 0) be a discrete-time Markov process with state space E ⊂ (-∞, ∞) and let S be...explain our results in some detail. We begin by stating our three basic assumptions: (1) (Vk, k ≥ 0) is a Markov process with state space E ⊂ (-∞, ∞); (2)... 3. CONDITIONING ON {T > n}... 3.1 Preliminary Results

  18. Anthropometric Source Book. Volume 1: Anthropometry for Designers

    DTIC Science & Technology

    1978-07-01

    diet initiates replacement of the tissue loss incurred in the first day or two of flight. Any further caloric excess or deficit would be superimposed...the Skylab missions, a calorically inadequate basic diet was supplied as a result of the assumption that in-flight requirements were less than those...from one-g to weightlessness conditions or vice versa, any remaining volume changes are probably tissue changes. If a diet is calorically inadequate

  19. The Assumption of Adequacy: Operation Safe Haven, A Chaplain’s View.

    DTIC Science & Technology

    1999-06-04

    poverty, their ignorance regarding everything from literacy to the most basic hygiene was overwhelming. One chaplain assistant from Fort Carson...perspective, the Panamanians, ninety percent of whom lived in absolute poverty, were less than enamored with this state of affairs. The Canal Zone...was soon discovered that the entire adult population on the island of Cuba is addicted to nicotine), and a brand new pair of running shoes. While going

  20. Why Are Experts Correlated? Decomposing Correlations between Judges

    ERIC Educational Resources Information Center

    Broomell, Stephen B.; Budescu, David V.

    2009-01-01

    We derive an analytic model of the inter-judge correlation as a function of five underlying parameters. Inter-cue correlation and the number of cues capture our assumptions about the environment, while differentiations between cues, the weights attached to the cues, and (un)reliability describe assumptions about the judges. We study the relative…

  1. Contexts and Pragmatics Learning: Problems and Opportunities of the Study Abroad Research

    ERIC Educational Resources Information Center

    Taguchi, Naoko

    2018-01-01

    Despite different epistemologies and assumptions, all theories in second language (L2) acquisition emphasize the centrality of context in understanding L2 acquisition. Under the assumption that language emerges from use in context, the cognitivist approach focuses on distributions and properties of input to infer both learning objects and process…

  2. Marking and Moderation in the UK: False Assumptions and Wasted Resources

    ERIC Educational Resources Information Center

    Bloxham, Sue

    2009-01-01

    This article challenges a number of assumptions underlying marking of student work in British universities. It argues that, in developing rigorous moderation procedures, we have created a huge burden for markers which adds little to accuracy and reliability but creates additional work for staff, constrains assessment choices and slows down…

  3. 29 CFR 4010.8 - Plan actuarial information.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... Assumptions for decrements other than mortality and retirement (such as turnover or disability) used to... than 25 years of service. Employee A is an active participant who is age 40 and has completed 5 years... entitled under the assumption that A works until age 58. (2) Example 2. Employee B is also an active...

  4. An Assessment of Propensity Score Matching as a Nonexperimental Impact Estimator: Evidence from Mexico's PROGRESA Program

    ERIC Educational Resources Information Center

    Diaz, Juan Jose; Handa, Sudhanshu

    2006-01-01

    Not all policy questions can be addressed by social experiments. Nonexperimental evaluation methods provide an alternative to experimental designs but their results depend on untestable assumptions. This paper presents evidence on the reliability of propensity score matching (PSM), which estimates treatment effects under the assumption of…

  5. 29 CFR 4044.53 - Mortality assumptions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... assumptions. (a) General rule. Subject to paragraph (b) of this section (regarding certain death benefits...), and (g) of this section to value benefits under § 4044.52. (b) Certain death benefits. If an annuity for one person is in pay status on the valuation date, and if the payment of a death benefit after the...

  6. 29 CFR 4044.53 - Mortality assumptions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... assumptions. (a) General rule. Subject to paragraph (b) of this section (regarding certain death benefits...), and (g) of this section to value benefits under § 4044.52. (b) Certain death benefits. If an annuity for one person is in pay status on the valuation date, and if the payment of a death benefit after the...

  7. 29 CFR 4044.53 - Mortality assumptions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... assumptions. (a) General rule. Subject to paragraph (b) of this section (regarding certain death benefits...), and (g) of this section to value benefits under § 4044.52. (b) Certain death benefits. If an annuity for one person is in pay status on the valuation date, and if the payment of a death benefit after the...

  8. 29 CFR 4044.53 - Mortality assumptions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... assumptions. (a) General rule. Subject to paragraph (b) of this section (regarding certain death benefits...), and (g) of this section to value benefits under § 4044.52. (b) Certain death benefits. If an annuity for one person is in pay status on the valuation date, and if the payment of a death benefit after the...

  9. Under What Assumptions Do Site-by-Treatment Instruments Identify Average Causal Effects?

    ERIC Educational Resources Information Center

    Reardon, Sean F.; Raudenbush, Stephen W.

    2013-01-01

    The increasing availability of data from multi-site randomized trials provides a potential opportunity to use instrumental variables methods to study the effects of multiple hypothesized mediators of the effect of a treatment. We derive nine assumptions needed to identify the effects of multiple mediators when using site-by-treatment interactions…
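
    The paper's setting is multi-site and multi-mediator; as a minimal illustration of the underlying instrumental-variables idea only (not the authors' estimator), the single-instrument Wald/2SLS estimate is the ratio cov(Z, Y) / cov(Z, D), with Z the randomized assignment, D the treatment received, and Y the outcome. Data below are hypothetical.

```python
# Minimal single-instrument IV sketch (illustrative, not the paper's
# multi-site estimator): beta_IV = cov(Z, Y) / cov(Z, D).

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

def iv_estimate(z, d, y):
    """Wald estimator: effect of D on Y using Z as instrument."""
    return cov(z, y) / cov(z, d)

z = [0, 0, 1, 1, 0, 1]            # randomized assignment
d = [0, 0, 1, 0, 0, 1]            # treatment actually received
y = [1.0, 2.0, 5.0, 2.0, 1.5, 6.0]
print(iv_estimate(z, d, y))
```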

  10. An identifiable model for informative censoring

    USGS Publications Warehouse

    Link, W.A.; Wegman, E.J.; Gantz, D.T.; Miller, J.J.

    1988-01-01

    The usual model for censored survival analysis requires the assumption that censoring of observations arises only due to causes unrelated to the lifetime under consideration. It is easy to envision situations in which this assumption is unwarranted, and in which use of the Kaplan-Meier estimator and associated techniques will lead to unreliable analyses.
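
    The Kaplan-Meier product-limit estimator the abstract refers to can be sketched as follows (an illustrative stdlib-Python version; names and data are ours, not the paper's). Its validity rests exactly on the non-informative-censoring assumption the paper questions.

```python
# Minimal Kaplan-Meier product-limit estimator (illustrative).
# Assumes censoring is unrelated to the lifetime -- the assumption
# the paper argues can be unwarranted.

def kaplan_meier(times, events):
    """times: observed times; events: 1 = death observed, 0 = censored.
    Returns [(t, S(t))] at each time where a death occurs."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        ties = sum(1 for tt, _ in data if tt == t)
        if deaths:
            survival *= 1 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= ties
        i += ties
    return curve

print(kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0]))
```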

  11. Biological control agents elevate hantavirus by subsidizing deer mouse populations

    Treesearch

    Dean E. Pearson; Ragan M. Callaway

    2006-01-01

    Biological control of exotic invasive plants using exotic insects is practiced under the assumption that biological control agents are safe if they do not directly attack non-target species. We tested this assumption by evaluating the potential for two host-specific biological control agents (Urophora spp.), widely established in North America for spotted...

  12. Assumptions Underlying Curriculum Decisions in Australia: An American Perspective.

    ERIC Educational Resources Information Center

    Willis, George

    An analysis of the cultural and historical context in which curriculum decisions are made in Australia and a comparison with educational assumptions in the United States is the purpose of this paper. Methodology is based on personal teaching experience and observation in Australia. Seven factors are identified upon which curricular decisions in…

  13. Parabolic Systems with p, q-Growth: A Variational Approach

    NASA Astrophysics Data System (ADS)

    Bögelein, Verena; Duzaar, Frank; Marcellini, Paolo

    2013-10-01

    We consider the evolution problem associated with a convex integrand $f: \mathbb{R}^{Nn} \to [0,\infty)$ satisfying a non-standard $p,q$-growth assumption. To establish the existence of solutions we introduce the concept of variational solutions. In contrast to weak solutions, that is, mappings $u: \Omega_T \to \mathbb{R}^n$ which solve $\partial_t u - \operatorname{div} Df(Du) = 0$ weakly in $\Omega_T$, variational solutions exist under a much weaker assumption on the gap $q - p$. Here, we prove the existence of variational solutions provided the integrand $f$ is strictly convex and $\frac{2n}{n+2} < p \le q < p + 1$. These variational solutions turn out to be unique under certain mild additional assumptions on the data. Moreover, if the gap satisfies the natural stronger assumption $2 \le p \le q < p + \min\{1, 4/n\}$, we show that variational solutions are actually weak solutions. This means that solutions $u$ admit the necessary higher integrability of the spatial derivative $Du$ to satisfy the parabolic system in the weak sense, that is, we prove that $u \in L^q_{\mathrm{loc}}\big(0,T; W^{1,q}_{\mathrm{loc}}(\Omega,\mathbb{R}^N)\big)$.

  14. A general method for handling missing binary outcome data in randomized controlled trials

    PubMed Central

    Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen

    2014-01-01

    Aims The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. Design We propose a sensitivity analysis where standard analyses, which could include ‘missing = smoking’ and ‘last observation carried forward’, are embedded in a wider class of models. Setting We apply our general method to data from two smoking cessation trials. Participants A total of 489 and 1758 participants from two smoking cessation trials. Measurements The abstinence outcomes were obtained using telephone interviews. Findings The estimated intervention effects from both trials depend on the sensitivity parameters used. The findings differ considerably in magnitude and statistical significance under quite extreme assumptions about the missing data, but are reasonably consistent under more moderate assumptions. Conclusions A new method for undertaking sensitivity analyses when handling missing data in trials with binary outcomes allows a wide range of assumptions about the missing data to be assessed. In two smoking cessation trials the results were insensitive to all but extreme assumptions. PMID:25171441
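
    One way such a sensitivity analysis can embed the standard analyses is to index the assumed abstinence rate among participants with missing outcomes by a single parameter. The sketch below is our own illustration of that idea, not the authors' exact model; the parameter names and numbers are hypothetical.

```python
# Illustrative sensitivity-parameter sketch: p_miss is the assumed
# abstinence probability among participants with missing outcomes.
# p_miss = 0 reproduces the standard 'missing = smoking' analysis.

def abstinence_rate(observed_successes, observed_n, n_missing, p_miss):
    """Overall abstinence rate in one arm under the assumption that
    missing participants abstain with probability p_miss."""
    total = observed_n + n_missing
    return (observed_successes + n_missing * p_miss) / total

# Hypothetical arm: 40/80 observed abstinent, 20 missing outcomes.
for p in (0.0, 0.25, 0.5):
    print(round(abstinence_rate(40, 80, 20, p), 3))
```

    Sweeping the parameter from 0 toward the observed abstinence rate shows how far conclusions depend on the missing-data assumption, which is the pattern the abstract describes.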

  15. Supplementation of an Artificial Medium for the Parasitoid Exorista larvarum (Diptera: Tachinidae) With Hemolymph of Hermetia illucens (Diptera: Stratiomyidae) or Antheraea pernyi (Lepidoptera: Saturniidae).

    PubMed

    Dindo, Maria Luisa; Vandicke, Jonas; Marchetti, Elisa; Spranghers, Thomas; Bonte, Jochem; De Clercq, Patrick

    2016-04-01

    The effect of supplementing a basic insect-free artificial medium for the tachinid Exorista larvarum (L.) with hemolymph of the black soldier fly, Hermetia illucens (L.), or the Chinese oak silkworm, Antheraea pernyi (Guérin-Méneville), was investigated. The supplementation (20% w/w) was based on the assumption that insect additives may optimize the media for this parasitoid. Egg hatch, pupal and adult yields, and sex ratio did not differ among the enriched and basic media. Preimaginal development was faster on both hemolymph-enriched media than on the basic medium. Although development was shorter on the medium supplemented with H. illucens hemolymph than on the basic medium, puparium weights were comparable on the two media. The female flies reared on the medium enriched with H. illucens hemolymph did not lay more eggs than the control females, but their eggs yielded significantly more puparia. Conversely, the medium enriched with A. pernyi hemolymph yielded lower female puparium weights than the basic medium and produced only one ovipositing female out of the five female adults obtained. These results indicate that the in vitro development of E. larvarum improved when the basic artificial medium was enriched with H. illucens hemolymph, whereas supplementation with A. pernyi hemolymph negatively affected the quality of the in vitro-reared females.

  16. Causal analysis of ordinal treatments and binary outcomes under truncation by death.

    PubMed

    Wang, Linbo; Richardson, Thomas S; Zhou, Xiao-Hua

    2017-06-01

    It is common that in multi-arm randomized trials, the outcome of interest is "truncated by death," meaning that it is only observed or well-defined conditioning on an intermediate outcome. In this case, in addition to pairwise contrasts, the joint inference for all treatment arms is also of interest. Under a monotonicity assumption we present methods for both pairwise and joint causal analyses of ordinal treatments and binary outcomes in presence of truncation by death. We illustrate via examples the appropriateness of our assumptions in different scientific contexts.

  17. Wald Sequential Probability Ratio Test for Analysis of Orbital Conjunction Data

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Markley, F. Landis; Gold, Dara

    2013-01-01

    We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.
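
    The paper derives a conjunction-specific likelihood ratio; as background only, a generic Wald SPRT for Bernoulli observations (not the paper's form) can be sketched as follows, with Wald's classical stopping bounds.

```python
import math

# Generic Wald SPRT sketch (illustrative, not the paper's
# conjunction-specific likelihood ratio): sequentially test
# H0: p = p0 against H1: p = p1 for Bernoulli data, stopping when the
# cumulative log-likelihood ratio crosses Wald's approximate bounds.

def wald_sprt(observations, p0, p1, alpha=0.05, beta=0.05):
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(observations, start=1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue", len(observations)

print(wald_sprt([1, 1, 1, 1, 1, 1], p0=0.1, p1=0.9))
```

    The appeal of the sequential form, as in the abstract, is that a decision can be reached as soon as the accumulated evidence is strong enough, rather than after a fixed sample size.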

  18. Robustness of location estimators under t-distributions: a literature review

    NASA Astrophysics Data System (ADS)

    Sumarni, C.; Sadik, K.; Notodiputro, K. A.; Sartono, B.

    2017-03-01

    The assumption of normality is commonly used in the estimation of parameters in statistical modelling, but this assumption is very sensitive to outliers. The t-distribution is more robust than the normal distribution since t-distributions have longer tails. The robustness measures of location estimators under t-distributions are reviewed and discussed in this paper. For the purpose of illustration, we use the onion yield data, which include outliers, as a case study, and show that the t model produces a better fit than the normal model.
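
    The downweighting behind a t-model location estimate can be illustrated with a simple iteratively reweighted average, where an observation at standardized distance z from the centre gets weight (nu + 1) / (nu + z^2). This sketch is our own illustration (fixed scale, hypothetical data), not the paper's method.

```python
# Illustrative t-model location estimate via iteratively reweighted
# averaging (fixed scale for simplicity). Outliers receive weight
# (nu + 1) / (nu + z^2), so they barely move the estimate, unlike
# the ordinary sample mean.

def t_location(xs, nu=3.0, scale=1.0, iters=50):
    mu = sorted(xs)[len(xs) // 2]  # start near the median
    for _ in range(iters):
        ws = [(nu + 1) / (nu + ((x - mu) / scale) ** 2) for x in xs]
        mu = sum(w * x for w, x in zip(ws, xs)) / sum(ws)
    return mu

data = [9.8, 10.1, 10.0, 9.9, 10.2, 50.0]   # one gross outlier
print(round(sum(data) / len(data), 2))       # sample mean, pulled to ~16.67
print(round(t_location(data), 2))            # stays near 10
```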

  19. Interpreting the concordance statistic of a logistic regression model: relation to the variance and odds ratio of a continuous explanatory variable.

    PubMed

    Austin, Peter C; Steyerberg, Ewout W

    2012-06-20

    When outcomes are binary, the c-statistic (equivalent to the area under the Receiver Operating Characteristic curve) is a standard measure of the predictive accuracy of a logistic regression model. An analytical expression was derived under the assumption that a continuous explanatory variable follows a normal distribution in those with and without the condition. We then conducted an extensive set of Monte Carlo simulations to examine whether the expressions derived under the assumption of binormality allowed for accurate prediction of the empirical c-statistic when the explanatory variable followed a normal distribution in the combined sample of those with and without the condition. We also examine the accuracy of the predicted c-statistic when the explanatory variable followed a gamma, log-normal or uniform distribution in combined sample of those with and without the condition. Under the assumption of binormality with equality of variances, the c-statistic follows a standard normal cumulative distribution function with dependence on the product of the standard deviation of the normal components (reflecting more heterogeneity) and the log-odds ratio (reflecting larger effects). Under the assumption of binormality with unequal variances, the c-statistic follows a standard normal cumulative distribution function with dependence on the standardized difference of the explanatory variable in those with and without the condition. In our Monte Carlo simulations, we found that these expressions allowed for reasonably accurate prediction of the empirical c-statistic when the distribution of the explanatory variable was normal, gamma, log-normal, and uniform in the entire sample of those with and without the condition. The discriminative ability of a continuous explanatory variable cannot be judged by its odds ratio alone, but always needs to be considered in relation to the heterogeneity of the population.
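
    The equal-variance binormal case can be checked numerically. If X is N(mu_d, sigma^2) in each outcome group d, the implied log-odds ratio per unit of X is beta = (mu1 - mu0) / sigma^2, and the c-statistic is Phi(sigma * beta / sqrt(2)), i.e. the standard normal CDF of the product the abstract describes. The numbers below are illustrative, not from the paper.

```python
import math

# Check of the equal-variance binormal identity: the c-statistic
# Phi((mu1 - mu0) / (sigma * sqrt(2))) equals Phi(sigma * beta / sqrt(2))
# when beta = (mu1 - mu0) / sigma**2 is the log-odds ratio per unit of X.

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu0, mu1, sigma = 0.0, 1.0, 1.5           # illustrative group means and SD
beta = (mu1 - mu0) / sigma ** 2           # implied log-odds ratio
auc_direct = phi((mu1 - mu0) / (sigma * math.sqrt(2)))
auc_formula = phi(sigma * beta / math.sqrt(2))
print(round(auc_direct, 4), round(auc_formula, 4))  # identical by algebra
```

    The dependence on sigma * beta is the abstract's point: the same odds ratio yields very different discrimination depending on the heterogeneity of the explanatory variable.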

  20. Assessment of dietary exposure in the French population to 13 selected food colours, preservatives, antioxidants, stabilizers, emulsifiers and sweeteners.

    PubMed

    Bemrah, Nawel; Leblanc, Jean-Charles; Volatier, Jean-Luc

    2008-01-01

    The results of French intake estimates for 13 food additives prioritized by the methods proposed in the 2001 Report from the European Commission on Dietary Food Additive Intake in the European Union are reported. These 13 additives were selected using the first and second tiers of the three-tier approach. The first tier was based on theoretical food consumption data and the maximum permitted level of additives. The second tier used real individual food consumption data and the maximum permitted level of additives for the substances which exceeded the acceptable daily intakes (ADI) in the first tier. In the third tier reported in this study, intake estimates were calculated for the 13 additives (colours, preservatives, antioxidants, stabilizers, emulsifiers and sweeteners) according to two modelling assumptions corresponding to two different food habit scenarios (assumption 1: consumers consume foods that may or may not contain food additives, and assumption 2: consumers always consume foods that contain additives) when possible. In this approach, real individual food consumption data and the occurrence/use-level of food additives reported by the food industry were used. Overall, the results of the intake estimates are reassuring for the majority of additives studied since the risk of exceeding the ADI was low, except for nitrites, sulfites and annatto, whose ADIs were exceeded by either children or adult consumers or by both populations under one and/or two modelling assumptions. Under the first assumption, the ADI is exceeded for high consumers among adults for nitrites and sulfites (155 and 118.4%, respectively) and among children for nitrites (275%). Under the second assumption, the average nitrites dietary exposure in children exceeds the ADI (146.7%). For high consumers, adults exceed the nitrite and sulfite ADIs (223 and 156.4%, respectively) and children exceed the nitrite, annatto and sulfite ADIs (416.7, 124.6 and 130.6%, respectively).
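
    The %ADI figures quoted above come from comparing an estimated daily intake with the acceptable daily intake scaled by body weight. As an illustration only (all numbers hypothetical, and corresponding to the second, always-contains-additive assumption), the arithmetic is:

```python
# Illustrative %ADI computation (hypothetical values). Under the
# 'assumption 2' scenario, every consumed food is taken to contain the
# additive at its reported use level.

def percent_adi(consumptions_g, levels_mg_per_g, body_weight_kg, adi_mg_per_kg):
    """Daily intake summed over foods, expressed as a percentage of the
    acceptable daily intake for the given body weight."""
    intake_mg = sum(c * l for c, l in zip(consumptions_g, levels_mg_per_g))
    return 100 * intake_mg / (body_weight_kg * adi_mg_per_kg)

# e.g. 50 g and 100 g daily of two foods at 0.05 and 0.02 mg/g additive,
# for a 60 kg adult with an ADI of 0.07 mg/kg bw/day:
print(round(percent_adi([50, 100], [0.05, 0.02], 60, 0.07), 1))  # > 100 exceeds the ADI
```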
