On the physical parameters for Centaurus X-3 and Hercules X-1.
NASA Technical Reports Server (NTRS)
Mccluskey, G. E., Jr.; Kondo, Y.
1972-01-01
It is shown how upper and lower limits on the physical parameters of the X-ray sources Centaurus X-3 and Hercules X-1 may be determined from a reasonably simple and straightforward consideration. The basic assumption is that component A (the non-X-ray-emitting component) is not a star collapsing toward its Schwarzschild radius (i.e., a black hole). This assumption appears reasonable since component A appears to physically occult component X. If component A is a 'normal' star, both observation and theory indicate that its mass is not greater than about 60 solar masses. The possibility that component X is either a neutron star or a white dwarf is considered.
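The argument above turns on the Schwarzschild radius. As a hedged illustration (my own numbers and function names, not the paper's), a short calculation shows how compact a 60-solar-mass object would have to be before it counted as a black hole:

```python
# Illustrative sketch: the Schwarzschild radius r_s = 2GM/c^2 for the
# ~60 solar-mass upper limit mentioned in the abstract.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_sun = 1.989e30   # solar mass, kg

def schwarzschild_radius(mass_kg):
    """Radius (m) below which a body of this mass would be a black hole."""
    return 2 * G * mass_kg / c**2

r_s = schwarzschild_radius(60 * M_sun)
print(f"r_s for 60 solar masses: {r_s / 1e3:.0f} km")  # ~177 km
```

Any 'normal' star of that mass is vastly larger than this radius, which is why the occultation argument distinguishes the two cases.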
ERIC Educational Resources Information Center
Straus, Murray A.; Kantor, Glenda Kaufman
One of the reasons why so few parents question the wisdom of "spare the rod and spoil the child" and why so few researchers have investigated the potential adverse effects, is probably the culturally accepted assumption that, when done "in moderation," physical punishment is harmless and sometimes necessary. This study starts from assumptions that…
Psychological abuse: a variable deserving critical attention in domestic violence.
O'Leary, K D
1999-01-01
Policy makers and researchers give psychological abuse considerably less attention than physical abuse in the partner abuse area. One reason for the relative neglect of psychological abuse is that there are difficulties in arriving at a common definition of psychological abuse that might be useful to both the mental health and legal professions. Another reason for the relative neglect of psychological abuse has been an implicit assumption that physical abuse exacts a greater psychological toll on victims than does psychological abuse. At the extreme level of physical abuse, this assumption seems defensible, but at levels of physical aggression that are most common in marriage and long-term relationships, psychological abuse appears to have as great an impact as physical abuse. Even direct ratings of psychological and physical abuse by women in physically abusive relationships indicate that psychological abuse has a greater adverse effect on them than physical abuse. Retrospective reports, longitudinal research, and treatment dropout research all provide evidence that psychological abuse can exact a negative effect on relationships that is as great as that of physical abuse. Finally, psychological abuse almost always precedes physical abuse, so that prevention and treatment efforts clearly need to address psychological abuse. Eight measures of various forms of psychological abuse that have reasonable psychometric properties and considerable construct validity are reviewed and a definition of psychological abuse in intimate relations is provided.
Common-sense chemistry: The use of assumptions and heuristics in problem solving
NASA Astrophysics Data System (ADS)
Maeyer, Jenine Rachel
Students experience difficulty learning and understanding chemistry at higher levels, often because of cognitive biases stemming from common sense reasoning constraints. These constraints can be divided into two categories: assumptions (beliefs held about the world around us) and heuristics (the reasoning strategies or rules used to build predictions and make decisions). A better understanding and characterization of these constraints are of central importance in the development of curriculum and teaching strategies that better support student learning in science. It was the overall goal of this thesis to investigate student reasoning in chemistry, specifically to better understand and characterize the assumptions and heuristics used by undergraduate chemistry students. To achieve this, two mixed-methods studies were conducted, each with quantitative data collected using a questionnaire and qualitative data gathered through semi-structured interviews. The first project investigated the reasoning heuristics used when ranking chemical substances based on the relative value of a physical or chemical property, while the second study characterized the assumptions and heuristics used when making predictions about the relative likelihood of different types of chemical processes. Our results revealed that heuristics for cue selection and decision-making played a significant role in the construction of answers during the interviews. Many study participants relied frequently on one or more of the following heuristics to make their decisions: recognition, representativeness, one-reason decision-making, and arbitrary trend. These heuristics allowed students to generate answers in the absence of requisite knowledge, but often led students astray. When characterizing assumptions, our results indicate that students relied on intuitive, spurious, and valid assumptions about the nature of chemical substances and processes in building their responses. 
In particular, many interviewees seemed to view chemical reactions as macroscopic reassembling processes where favorability was related to the perceived ease with which reactants broke apart or products formed. Students also expressed spurious chemical assumptions based on the misinterpretation and overgeneralization of periodicity and electronegativity. Our findings suggest the need to create more opportunities for college chemistry students to monitor their thinking, develop and apply analytical ways of reasoning, and evaluate the effectiveness of shortcut reasoning procedures in different contexts.
Life Support Baseline Values and Assumptions Document
NASA Technical Reports Server (NTRS)
Anderson, Molly S.; Ewert, Michael K.; Keener, John F.
2018-01-01
The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.
NASA Astrophysics Data System (ADS)
Dancy, Melissa
2004-09-01
It is well known that women are underrepresented in physics. The prevailing view is that there is a "leaky pipeline" of female physicists, which has led to a focus on providing mentors and increasing the opportunity for girls to experience science. The assumption is that the number of women in physics can be increased by integrating women into the existing structure. Although this approach may seem reasonable, women are making only small gains in participation levels. In this paper, I explore the idea that there is no leaky pipeline. Rather, the environment is fundamentally "male," and women will never be equally represented until fundamental changes are made in both our educational system and in the cultural assumptions of our physics community.
NASA Astrophysics Data System (ADS)
Rusli, Aloysius
2016-08-01
Until the 1980s, it was well known and common practice in Indonesian Basic Physics courses to present physics through its effective technicalities: the ideally elastic spring, the pulley and moving blocks, the thermodynamics of ideal engine models, theoretical electrostatics and electrodynamics with model capacitors and inductors, wave behavior and its various superpositions, hopefully closing with a description of modern physics. A different approach was then experimented with, using the Hobson and Moore texts, stressing the alternative aim of fostering awareness, not just mastery, of science and the scientific method. This is hypothesized to be more in line with the changed attitude of the so-called Millennials cohort, who are less attentive, if not less interested, and more used to multitasking, which suits their shorter attention span. The upside is increased awareness of science and the scientific method. The downside is that students get less experience of the scientific method itself, which rests on critical observation, analytic thinking to set up conclusions or hypotheses, and checking the consistency of hypotheses against measured data. Another aspect is the recognition that the human person encompasses both a reasoning capacity and a mental-spiritual-cultural capacity. This is considered essential as the world grows ever smaller through increased communication capacity, causing strong interactions and nonlinear effects, and as value systems become more challenging and more challenged by physics/science and its cosmology, which is successfully based on the scientific method. Students should therefore be made aware of the common basis of these two capacities: their assumptions, the reasoning capacity, and the consistency assumption. This shows that the limits of science are set by its basic quantifiable assumptions, and the limits of the mental-spiritual-cultural aspects of life are set by their basic metaphysical (non-quantifiable) assumptions.
Bridging these two human aspects of life can lead to a “why” of science and a “meaning” of life. A progress report on these efforts is presented, consisting essentially of results indicated by an extended format of the usual weekly reporting previously used in Basic Physics lectures.
Object reasoning for waste remediation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pennock, K.A.; Bohn, S.J.; Franklin, A.L.
1991-08-01
A large number of contaminated waste sites across the United States await site remediation efforts. These sites can be physically complex, composed of multiple, possibly interacting, contaminants distributed throughout one or more media. The Remedial Action Assessment System (RAAS) is being designed and developed to support decisions concerning the selection of remediation alternatives. The goal of this system is to broaden the consideration of remediation alternatives, while reducing the time and cost of making these considerations. The Remedial Action Assessment System is a hybrid system, designed and constructed using object-oriented, knowledge-based systems, and structured programming techniques. RAAS uses a combination of quantitative and qualitative reasoning to consider and suggest remediation alternatives. The reasoning process that drives this application is centered around an object-oriented organization of remediation technology information. This paper describes the information structure and organization used to support this reasoning process. In addition, the paper describes the level of detail of the technology-related information used in RAAS, discusses required assumptions and the procedural implications of these assumptions, and provides the rationale for structuring RAAS in this manner. 3 refs., 3 figs.
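As a hypothetical sketch of the idea (class and attribute names are my own invention, not RAAS's actual design), an object-oriented organization of remediation-technology information might look like:

```python
from dataclasses import dataclass, field

@dataclass
class Technology:
    """One remediation technology, with the facts reasoning would draw on."""
    name: str
    applicable_media: set = field(default_factory=set)  # e.g. {"soil", "groundwater"}
    treats: set = field(default_factory=set)            # contaminant classes handled
    assumptions: list = field(default_factory=list)     # conditions presumed to hold

    def applicable(self, medium, contaminant):
        """Simple qualitative screen: does this technology fit the problem?"""
        return medium in self.applicable_media and contaminant in self.treats

pump_and_treat = Technology(
    "pump and treat",
    applicable_media={"groundwater"},
    treats={"volatile organics"},
    assumptions=["aquifer is sufficiently permeable"],
)
print(pump_and_treat.applicable("groundwater", "volatile organics"))  # True
```

A reasoner could then iterate over such objects, screen them against site conditions, and surface each candidate's stated assumptions for review.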
NASA Technical Reports Server (NTRS)
Gosling, J. T.; Riley, P.; Skoug, R. M.
2001-01-01
We strongly disagree with the essence of the Osherovich comment on one of our papers. The following paragraphs provide the basis of our disagreement and elaborate on why we believe that none of the concluding statements in his comment are true. Our most important point is that the model developed by Osherovich and colleagues can be applied to real data obtained at a single point in space to determine the polytropic index within magnetic clouds if and only if the highly idealized assumptions of that model conform to physical reality. There is good reason to believe that those assumptions do not provide an accurate physical description of real magnetic clouds in the spherically expanding solar wind.
Formalization and analysis of reasoning by assumption.
Bosse, Tibor; Jonker, Catholijn M; Treur, Jan
2006-01-02
This article introduces a novel approach for the analysis of the dynamics of reasoning processes and explores its applicability to the reasoning pattern called reasoning by assumption. More specifically, for a case study in the domain of a Master Mind game, it is shown how empirical human reasoning traces can be formalized and automatically analyzed against the dynamic properties they fulfill. To this end, a variety of dynamic properties have been specified for the pattern of reasoning by assumption, some of which are considered characteristic of the reasoning pattern, whereas others can be used to discriminate among different approaches to the reasoning. These properties have been automatically checked against the traces acquired in the experiments undertaken. The approach turned out to be beneficial from two perspectives. First, checking characteristic properties contributes to the empirical validation of a theory of reasoning by assumption. Second, checking discriminating properties allows the analyst to identify different classes of human reasoners.
Model-based reasoning in the physics laboratory: Framework and initial results
NASA Astrophysics Data System (ADS)
Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.
2015-12-01
[This paper is part of the Focused Collection on Upper Division Physics Courses.] We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to refine the framework and demonstrate its utility by documenting examples of model-based reasoning in the laboratory. When applied to the think-aloud interviews, the framework captures and differentiates students' model-based reasoning and helps identify areas of future research. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of model limitations, and revision. Finally, we document students' challenges in explicitly articulating assumptions when constructing models of experimental systems and further challenges in model construction due to students' insufficient prior conceptual understanding. A modeling perspective reframes many of the seemingly arbitrary technical details of measurement tools and apparatus as an opportunity for authentic and engaging scientific sense making.
The Cosmology of Edgar Allan Poe
NASA Astrophysics Data System (ADS)
Cappi, Alberto
2011-06-01
Eureka is a ``prose poem'' published in 1848, in which Edgar Allan Poe presents his original cosmology. While starting from metaphysical assumptions, Poe develops an evolving Newtonian model of the Universe that has many non-coincidental analogies with modern cosmology. Poe was well informed about astronomical and physical discoveries, and he was influenced by both contemporary science and ancient ideas. For these reasons, Eureka is a unique synthesis of metaphysics, art, and science.
A Closer Look at Men Who Sustain Intimate Terrorism by Women
Hines, Denise A.; Douglas, Emily M.
2010-01-01
Over 30 years of research has established that both men and women are capable of sustaining intimate partner violence (IPV) by their opposite-sex partners, yet little research has examined men's experiences in such relationships. Some experts in the field have forwarded assumptions about men who sustain IPV–for example, that the abuse they experience is trivial or humorous and of no consequence and that, if their abuse was severe enough, they have the financial and psychological resources to easily leave the relationship–but these assumptions have little data to support them. The present study is an in-depth, descriptive examination of 302 men who sustained severe IPV from their women partners within the previous year and sought help. We present information on their demographics, overall mental health, and the types and frequency of various forms of physical and psychological IPV they sustained. We also provide both quantitative and qualitative information about their last physical argument and their reasons for staying in the relationship. It is concluded that, contrary to many assumptions about these men, the IPV they sustain is quite severe and both mentally and physically damaging; their most frequent response to their partner's IPV is to get away from her; and they are often blocked in their efforts to leave, sometimes physically, but more often because of strong psychological and emotional ties to their partners and especially their children. These results are discussed in terms of their implications for policy and practice. PMID:20686677
Supporting students in building interdisciplinary connections across physics and biology
NASA Astrophysics Data System (ADS)
Turpen, Chandra
2014-03-01
Our research team has been engaged in the iterative redesign of an Introductory Physics course for Life Science (IPLS) majors to explicitly bridge biology and physics in ways that are authentic to the disciplines. Our interdisciplinary course provides students opportunities to examine how modeling decisions (e.g. knowing when and how to use different concepts, identifying implicit assumptions, making and justifying assumptions) may differ depending on canonical disciplinary aims and interests. Our focus on developing students' interdisciplinary reasoning skills requires 1) shifting course topics to focus on core ideas that span the disciplines, 2) shifting epistemological expectations, and 3) foregrounding typically tacit disciplinary assumptions. In working to build an authentic interdisciplinary course that bridges physics and biology, we pay careful attention to supporting students in constructing these bridges. This course has been shown to have important impacts: a) students seek meaningful connections between the disciplines, b) students perceive relevance and utility of ideas from different disciplines, and c) students reconcile challenging disciplinary ideas. Although our focus has been on building interdisciplinary coherence, we have succeeded in maintaining strong student learning gains on fundamental physics concepts and allowed students to deepen their understanding of challenging concepts in thermodynamics. This presentation will describe the shifts in course content and the modern pedagogical approaches that have been integrated into the course, and provide an overview of key research results from this project. These results may aid physicists in reconsidering how they can meaningfully reach life-science students. This work is supported by NSF-TUES DUE 11-22818, the HHMI NEXUS grant, and a NSF Graduate Research Fellowship (DGE 0750616).
Moral reasoning and personality traits.
Mudrack, Peter E
2006-06-01
It is often assumed that moral reasoning should not be clearly associated with measures of personality traits. Although this assumption pervades the moral reasoning literature, it may not always be true. This paper provides evidence that moral reasoning, as assessed with P scores of the Defining Issues Test, is indeed positively associated with five traits from the California Psychological Inventory: Achievement via Independence, Intellectual Efficiency, Tolerance, Responsibility, and Capacity for Status. Such relationships make conceptual sense, shed light on the meaning and implications of moral reasoning, call into question prevailing assumptions in the literature, and may encourage investigators to broaden the types of research questions asked in the context of moral reasoning.
Mathematization in introductory physics
NASA Astrophysics Data System (ADS)
Brahmia, Suzanne M.
Mathematization is central to STEM disciplines as a cornerstone of the quantitative reasoning that characterizes these fields. Introductory physics is required for most STEM majors in part so that students develop expert-like mathematization. This dissertation describes coordinated research and curriculum development for strengthening mathematization in introductory physics; it blends scholarship in physics and mathematics education in the form of three papers. The first paper explores mathematization in the context of physics, and makes an original contribution to the measurement of physics students' struggle to mathematize. Instructors naturally assume students have a conceptual mastery of algebra before embarking on a college physics course because these students are enrolled in math courses beyond algebra. This paper provides evidence that refutes the validity of this assumption and categorizes some of the barriers students commonly encounter with quantification and representing ideas symbolically. The second paper develops a model of instruction that can help students progress from their starting points to their instructor's desired endpoints. Instructors recognize that the introductory physics course introduces new ideas at an astonishing rate. More than most physicists realize, however, the way that mathematics is used in the course is foreign to a large portion of class. This paper puts forth an instructional model that can move all students toward better quantitative and physical reasoning, despite the substantial variability of those students' initial states. The third paper describes the design and testing of curricular materials that foster mathematical creativity to prepare students to better understand physics reasoning. 
Few students enter introductory physics with experience generating equations in response to specific challenges involving unfamiliar quantities and units, yet this generative use of mathematics is typical of the thinking involved in doing physics. It contrasts with their more common experience of mathematics as the practice of specified procedures to improve efficiency. This paper describes new curricular materials, based on invention instruction, that provide students with opportunities to generate mathematical relationships in physics, and it presents preliminary evidence of the effectiveness of this method with mathematically underprepared engineering students.
Making Predictions about Chemical Reactivity: Assumptions and Heuristics
ERIC Educational Resources Information Center
Maeyer, Jenine; Talanquer, Vicente
2013-01-01
Diverse implicit cognitive elements seem to support but also constrain reasoning in different domains. Many of these cognitive constraints can be thought of as either implicit assumptions about the nature of things or reasoning heuristics for decision-making. In this study we applied this framework to investigate college students' understanding of…
NASA Astrophysics Data System (ADS)
Dowd, Jason E.; Araujo, Ives; Mazur, Eric
2015-06-01
Although confusion is generally perceived to be negative, educators dating as far back as Socrates, who asked students to question assumptions and wrestle with ideas, have challenged this notion. Can confusion be productive? How should instructors interpret student expressions of confusion? During two semesters of introductory physics that involved Just-in-Time Teaching (JiTT) and research-based reading materials, we evaluated performance on reading assignments while simultaneously measuring students' self-assessment of their confusion over the preclass reading material (N = 137; N_fall = 106, N_spring = 88). We examined the relationship between confusion and correctness, confidence in reasoning, and (in the spring) precourse self-efficacy. We find that student expressions of confusion before coming to class are negatively related to correctness on preclass content-related questions, confidence in reasoning on those questions, and self-efficacy, but weakly positively related to final grade when controlling for these factors (β = 0.23, p = 0.03).
Gleason-Busch theorem for sequential measurements
NASA Astrophysics Data System (ADS)
Flatt, Kieran; Barnett, Stephen M.; Croke, Sarah
2017-12-01
Gleason's theorem is a statement that, given some reasonable assumptions, the Born rule used to calculate probabilities in quantum mechanics is essentially unique [A. M. Gleason, Indiana Univ. Math. J. 6, 885 (1957), 10.1512/iumj.1957.6.56050]. We show that Gleason's theorem contains within it also the structure of sequential measurements, and along with this the state update rule. We give a small set of axioms, which are physically motivated and analogous to those in Busch's proof of Gleason's theorem [P. Busch, Phys. Rev. Lett. 91, 120403 (2003), 10.1103/PhysRevLett.91.120403], from which the familiar Kraus operator form follows. An axiomatic approach has practical relevance as well as fundamental interest, in making clear those assumptions which underlie the security of quantum communication protocols. Interestingly, the two-time formalism is seen to arise naturally in this approach.
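As a hedged numerical illustration (my own example, not taken from the paper), the Kraus-operator update rule that the axiomatic approach arrives at can be checked in the simplest case, a projective qubit measurement: outcome i occurs with probability p_i = tr(K_i ρ K_i†), after which ρ → K_i ρ K_i† / p_i, with completeness Σ_i K_i†K_i = I.

```python
import numpy as np

# Kraus operators for a projective measurement in the computational basis.
K0 = np.array([[1, 0], [0, 0]], dtype=complex)  # project onto |0>
K1 = np.array([[0, 0], [0, 1]], dtype=complex)  # project onto |1>
# Completeness: sum_i K_i^dag K_i = I
assert np.allclose(K0.conj().T @ K0 + K1.conj().T @ K1, np.eye(2))

rho = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)  # |+><+|

def update(K, rho):
    """Born-rule probability of the outcome and the post-measurement state."""
    post = K @ rho @ K.conj().T
    p = np.real(np.trace(post))
    return p, post / p

p0, rho0 = update(K0, rho)
print(p0)                 # 0.5: probability of outcome 0
print(np.real(rho0))      # state collapses to |0><0|
```

Feeding `rho0` back into `update` models a sequential measurement, which is exactly the structure the paper extracts from Gleason-type assumptions.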
Shriver, K A
1986-01-01
Realistic estimates of economic depreciation are required for analyses of tax policy, economic growth and production, and national income and wealth. The purpose of this paper is to examine the stability assumption underlying the econometric derivation of empirical estimates of economic depreciation for industrial machinery and equipment. The results suggest that a reasonable stability of economic depreciation rates of decline may exist over time. Thus, the assumption of a constant rate of economic depreciation may be a reasonable approximation for further empirical economic analyses.
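The constant-rate assumption examined above amounts to geometric decline. As a hedged illustration (my own figures, not the paper's data), an asset's value under a constant depreciation rate d falls by the same fraction each year:

```python
def depreciated_value(initial, d, years):
    """Asset value after `years` of constant-rate (geometric) decline d."""
    return initial * (1 - d) ** years

# A machine worth $100,000 depreciating at a constant 15% per year:
print(round(depreciated_value(100_000, 0.15, 5), 2))  # 44370.53
```

Testing whether d itself stays stable over time is what the stability analysis in the paper addresses.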
Violation of local realism with freedom of choice
Scheidl, Thomas; Ursin, Rupert; Kofler, Johannes; Ramelow, Sven; Ma, Xiao-Song; Herbst, Thomas; Ratschbacher, Lothar; Fedrizzi, Alessandro; Langford, Nathan K.; Jennewein, Thomas; Zeilinger, Anton
2010-01-01
Bell’s theorem shows that local realistic theories place strong restrictions on observable correlations between different systems, giving rise to Bell’s inequality which can be violated in experiments using entangled quantum states. Bell’s theorem is based on the assumptions of realism, locality, and the freedom to choose between measurement settings. In experimental tests, “loopholes” arise which allow observed violations to still be explained by local realistic theories. Violating Bell’s inequality while simultaneously closing all such loopholes is one of the most significant still open challenges in fundamental physics today. In this paper, we present an experiment that violates Bell’s inequality while simultaneously closing the locality loophole and addressing the freedom-of-choice loophole, also closing the latter within a reasonable set of assumptions. We also explain that the locality and freedom-of-choice loopholes can be closed only within nondeterminism, i.e., in the context of stochastic local realism. PMID:21041665
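As a hedged sketch (my own illustration, not the experiment's analysis code), the quantum prediction such experiments confirm can be computed directly: for the singlet state and the standard measurement angles, the CHSH combination of correlations reaches 2√2 ≈ 2.83, beyond the local-realist bound of 2.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def meas(theta):
    """Spin measurement along an axis at angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state (|01> - |10>)/sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def corr(a, b):
    """Correlation E(a, b) = <psi| A(a) (x) B(b) |psi>."""
    return np.real(psi.conj() @ np.kron(meas(a), meas(b)) @ psi)

# Standard CHSH settings
a, ap, b, bp = 0, np.pi / 2, np.pi / 4, -np.pi / 4
S = corr(a, b) + corr(a, bp) + corr(ap, b) - corr(ap, bp)
print(abs(S))  # 2*sqrt(2) ~ 2.828 > 2: violates the local-realist bound
```

The loophole discussion in the abstract concerns whether an observed violation of this bound can still be explained away by local realistic mechanisms, not the quantum prediction itself.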
Models in biology: ‘accurate descriptions of our pathetic thinking’
2014-01-01
In this essay I will sketch some ideas for how to think about models in biology. I will begin by trying to dispel the myth that quantitative modeling is somehow foreign to biology. I will then point out the distinction between forward and reverse modeling and focus thereafter on the former. Instead of going into mathematical technicalities about different varieties of models, I will focus on their logical structure, in terms of assumptions and conclusions. A model is a logical machine for deducing the latter from the former. If the model is correct, then, if you believe its assumptions, you must, as a matter of logic, also believe its conclusions. This leads to consideration of the assumptions underlying models. If these are based on fundamental physical laws, then it may be reasonable to treat the model as ‘predictive’, in the sense that it is not subject to falsification and we can rely on its conclusions. However, at the molecular level, models are more often derived from phenomenology and guesswork. In this case, the model is a test of its assumptions and must be falsifiable. I will discuss three models from this perspective, each of which yields biological insights, and this will lead to some guidelines for prospective model builders. PMID:24886484
ERIC Educational Resources Information Center
Slisko, Josip; Cruz, Adrian Corona
2013-01-01
There is a general agreement that critical thinking is an important element of 21st century skills. Although critical thinking is a very complex and controversial conception, many would accept that recognition and evaluation of assumptions is a basic critical-thinking process. When students use simple mathematical model to reason quantitatively…
Blumenthal, J A
1998-02-01
Courts and legislatures have begun to develop the "reasonable woman standard" (RWS) as a criterion for deciding sexual harassment trials. This standard rests on assumptions of a "wide divergence" between the perceptions of men and women viewing social-sexual behavior that may be considered harassing. Narrative reviews of the literature on such perceptions have suggested that these assumptions are only minimally supported. To test these assumptions quantitatively, a meta-analytic review was conducted that assessed the size, stability, and moderators of gender differences in perceptions of sexual harassment. The effect of the actor's status relative to the target was also evaluated meta-analytically, as one alternative to the importance of gender effects. Results supported the claims of the narrative reviews of a relatively small gender effect and drew attention to the status effect. In discussing the legal implications of the present findings, earlier calls for caution in establishing the reasonable woman standard are echoed, and one alternative to the RWS, the "reasonable victim standard," is discussed.
How unequal fluxes of high energy astrophysical neutrinos and antineutrinos can fake new physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nunokawa, Hiroshi; Panes, Boris; Funchal, Renata Zukanovich
Flavor ratios of very high energy astrophysical neutrinos, which can be studied at the Earth by a neutrino telescope such as IceCube, can serve to diagnose their production mechanism at the astrophysical source. The flavor ratios for neutrinos and antineutrinos can be quite different, as we do not know how they are produced in the astrophysical environment. Due to this uncertainty, the neutrino and antineutrino flavor ratios at the Earth could also be quite different. Nonetheless, in fitting the high energy astrophysical neutrino data, it is generally assumed that the flavor ratios for neutrinos and antineutrinos are the same at the Earth. This is a reasonable assumption given the limited statistics of the data we currently have. However, in the future the fit must be performed allowing for a possible discrepancy in these two fractions in order to be able to disentangle different production mechanisms at the source from new physics in the neutrino sector. To reinforce this issue, in this work we show that a wrong assumption about the distribution of neutrino flavor ratios at the Earth may indeed lead to misleading interpretations of IceCube results.
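The averaged-oscillation propagation that such flavor-ratio analyses rely on can be sketched as follows. The mixing angles are typical best-fit values assumed here purely for illustration, and the CP phase is set to zero; this is a minimal sketch, not the authors' analysis.

```python
import numpy as np

# Averaged-oscillation flavor propagation: after astrophysical distances,
#   f_beta(Earth) = sum_alpha f_alpha(source) * sum_i |U_ai|^2 |U_bi|^2,
# with U the PMNS mixing matrix.  Angles below are typical best-fit
# values, assumed here for illustration; the CP phase is set to zero.
th12, th23, th13 = np.radians([33.4, 49.0, 8.6])

def rot(i, j, th):
    """Real rotation by angle th in the (i, j) plane."""
    R = np.eye(3)
    R[i, i] = R[j, j] = np.cos(th)
    R[i, j], R[j, i] = np.sin(th), -np.sin(th)
    return R

U = rot(1, 2, th23) @ rot(0, 2, th13) @ rot(0, 1, th12)  # real PMNS
P = (U**2) @ (U**2).T   # P[a, b] = averaged oscillation probability a -> b

def earth_ratio(source):
    """Map a source flavor ratio (nu_e, nu_mu, nu_tau) to the Earth."""
    return np.asarray(source, float) @ P

# A pion-decay source, (1 : 2 : 0), arrives close to (1 : 1 : 1):
print(earth_ratio([1/3, 2/3, 0.0]))
```

The point made in the abstract is that separate source ratios for neutrinos and antineutrinos would each propagate this way, so assuming identical ratios at the Earth is only an approximation.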
Thermal machines beyond the weak coupling regime
NASA Astrophysics Data System (ADS)
Gallego, R.; Riera, A.; Eisert, J.
2014-12-01
How much work can be extracted from a heat bath using a thermal machine? The study of this question has a very long history in statistical physics in the weak-coupling limit, when applied to macroscopic systems. However, the assumption that thermal heat baths remain uncorrelated with associated physical systems is less reasonable on the nano-scale and in the quantum setting. In this work, we establish a framework of work extraction in the presence of quantum correlations. We show in a mathematically rigorous and quantitative fashion that quantum correlations and entanglement emerge as limitations to work extraction compared to what would be allowed by the second law of thermodynamics. At the heart of the approach are operations that capture the naturally non-equilibrium dynamics encountered when putting physical systems into contact with each other. We discuss various limits that relate to known results and put our work into the context of approaches to finite-time quantum thermodynamics.
Uniform circular motion in general relativity: existence and extendibility of the trajectories
NASA Astrophysics Data System (ADS)
de la Fuente, Daniel; Romero, Alfonso; Torres, Pedro J.
2017-06-01
The concept of uniform circular motion in a general spacetime is introduced as a particular case of a planar motion. The initial value problem of the corresponding differential equation is analysed in detail. Geometrically, an observer that obeys a uniform circular motion is characterized as a Lorentzian helix. The completeness of inextensible trajectories is studied in generalized Robertson-Walker spacetimes and in a relevant family of pp-wave spacetimes. Under reasonable assumptions, the physical interpretation of such results is that a uniform circular observer lives forever, implying the absence of the singularities defined by these timelike curves.
NASA Astrophysics Data System (ADS)
Watson, Norman F.
The relative merits of gimballed INS based on mechanical gyroscopes and strapdown INS based on ring laser gyroscopes are compared with regard to their use in 1 nm/hr combat aircraft navigation. Navigation performance, velocity performance, attitude performance, body axis outputs, environmental influences, reliability and maintainability, cost, and physical parameters are taken into consideration. Some of the advantages which have been claimed elsewhere for the laser INS, such as dramatically lower life cycle costs than for gimballed INS, are shown to be unrealistic under reasonable assumptions.
Hadronic and nuclear interactions in QCD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Despite the evidence that QCD - or something close to it - gives a correct description of the structure of hadrons and their interactions, it seems paradoxical that the theory has thus far had very little impact in nuclear physics. One reason for this is that the application of QCD to distances larger than 1 fm involves coherent, non-perturbative dynamics which is beyond present calculational techniques. For example, in QCD the nuclear force can evidently be ascribed to quark interchange and gluon exchange processes. These, however, are as complicated to analyze from a fundamental point of view as is the analogous covalent bond in molecular physics. Since a detailed description of quark-quark interactions and the structure of hadronic wavefunctions is not yet well understood in QCD, it is evident that a quantitative first-principles description of the nuclear force will require a great deal of theoretical effort. Another reason for the limited impact of QCD in nuclear physics has been the conventional assumption that nuclear interactions can for the most part be analyzed in terms of an effective meson-nucleon field theory or potential model in isolation from the details of the short-distance quark and gluon structure of hadrons. These lectures argue that this view is untenable: in fact, there is no correspondence principle which yields traditional nuclear physics as a rigorous large-distance or non-relativistic limit of QCD dynamics. On the other hand, the distinctions between standard nuclear physics dynamics and QCD at nuclear dimensions are extremely interesting and illuminating for both particle and nuclear physics.
Tumor Restrictions to Oncolytic Virus
Vähä-Koskela, Markus; Hinkkanen, Ari
2014-01-01
Oncolytic virotherapy has advanced since the days of its conception, but therapeutic efficacy in the clinic does not seem to reach the same level as in animal models. One reason is premature oncolytic virus clearance in humans, which is a reasonable assumption considering the immune-stimulating nature of the oncolytic agents. However, several studies are beginning to reveal layers of restriction to oncolytic virotherapy that are present before an adaptive neutralizing immune response. Some of these barriers are present constitutively, halting infection before it even begins, whereas others are raised by minute cues triggered by virus infection. Indeed, we and others have noticed that delivering viruses to tumors may not be the biggest obstacle to successful therapy; instead, the physical make-up of the tumor and its capacity to mount antiviral defenses seem to be the most important efficacy determinants. In this review, we summarize the constitutive and innate barriers to oncolytic virotherapy and discuss strategies to overcome them. PMID:28548066
In Defense of Artificial Replacement.
Shiller, Derek
2017-06-01
If it is within our power to provide a significantly better world for future generations at a comparatively small cost to ourselves, we have a strong moral reason to do so. One way of providing a significantly better world may involve replacing our species with something better. It is plausible that in the not-too-distant future, we will be able to create artificially intelligent creatures with whatever physical and psychological traits we choose. Granted this assumption, it is argued that we should engineer our extinction so that our planet's resources can be devoted to making artificial creatures with better lives. © 2017 John Wiley & Sons Ltd.
Schroyens, Walter; Fleerackers, Lieve; Maes, Sunile
2010-01-01
Two experiments (N1 = 117 and N2 = 245) on reasoning with knowledge-rich conditionals showed a main effect of logical validity, which was due to the negative effect of counter-examples being smaller for valid than for invalid arguments. These findings support the thesis that some people tend to inhibit background knowledge inconsistent with the hypothetical truth of the premises, while others tend to abandon the implicit truth-assumption when they have factual evidence to the contrary. Findings show that adhering to the truth-assumption in the face of conflicting evidence requires an investment of time and effort which people with a higher general aptitude are more likely to make. PMID:21228921
Structure induction in diagnostic causal reasoning.
Meder, Björn; Mayrhofer, Ralf; Waldmann, Michael R
2014-07-01
Our research examines the normative and descriptive adequacy of alternative computational models of diagnostic reasoning from single effects to single causes. Many theories of diagnostic reasoning are based on the normative assumption that inferences from an effect to its cause should reflect solely the empirically observed conditional probability of cause given effect. We argue against this assumption, as it neglects alternative causal structures that may have generated the sample data. Our structure induction model of diagnostic reasoning takes into account the uncertainty regarding the underlying causal structure. A key prediction of the model is that diagnostic judgments should not only reflect the empirical probability of cause given effect but should also depend on the reasoner's beliefs about the existence and strength of the link between cause and effect. We confirmed this prediction in 2 studies and showed that our theory better accounts for human judgments than alternative theories of diagnostic reasoning. Overall, our findings support the view that in diagnostic reasoning people go "beyond the information given" and use the available data to make inferences on the (unobserved) causal rather than on the (observed) data level. (c) 2014 APA, all rights reserved.
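As a toy illustration of the idea of structure induction (my own simplification, with assumed priors and parameterization, not the authors' exact model), one can average the diagnostic probability over a "link" and a "no-link" causal structure, each with uniform parameter priors integrated on a grid:

```python
import numpy as np

# Toy structure-averaging sketch (assumptions mine, not the authors'
# exact model).  S1: cause -> effect link with noisy-OR strength w1;
# S0: no link.  b = P(cause), w0 = background strength.  Uniform priors
# are integrated on a grid, and the diagnostic probability
# P(cause | effect) is averaged over parameters and structures.
def diagnostic_judgment(n_ce, n_cE, n_Ce, n_CE):
    g = np.linspace(0.01, 0.99, 40)
    b, w0, w1 = np.meshgrid(g, g, g, indexing="ij")

    def lik(pe_c, pe_nc):
        return ((b * pe_c) ** n_ce * (b * (1 - pe_c)) ** n_cE
                * ((1 - b) * pe_nc) ** n_Ce * ((1 - b) * (1 - pe_nc)) ** n_CE)

    L1 = lik(w0 + w1 - w0 * w1, w0)       # link present (noisy-OR)
    L0 = lik(w0, w0)                      # no link (constant in w1)
    pce = b * (w0 + w1 - w0 * w1)         # joint P(cause, effect) under S1
    diag1 = np.sum(L1 * pce / (pce + (1 - b) * w0)) / np.sum(L1)
    diag0 = np.sum(L0 * b) / np.sum(L0)   # no link: P(cause|effect) = P(cause)
    post1 = L1.mean() / (L1.mean() + L0.mean())   # equal structure priors
    return post1 * diag1 + (1 - post1) * diag0

# Contingency data: 6 cause&effect, 2 cause only, 2 effect only, 6 neither.
# The structure-averaged judgment deviates from the empirical
# P(cause | effect) = 6/8, reflecting uncertainty about the causal link.
print(diagnostic_judgment(6, 2, 2, 6))
```

This captures the qualitative prediction in the abstract: the judgment depends on beliefs about the existence and strength of the causal link, not solely on the observed conditional probability.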
Children's and Their Friends' Moral Reasoning: Relations with Aggressive Behavior
ERIC Educational Resources Information Center
Gasser, Luciano; Malti, Tina
2012-01-01
Friends' moral characteristics such as their moral reasoning represent an important social contextual factor for children's behavioral socialization. Guided by this assumption, we compared the effects of children's and friends' moral reasoning on their aggressive behavior in a low-risk sample of elementary school children. Peer nominations and…
ERIC Educational Resources Information Center
Ifenthaler, Dirk; Seel, Norbert M.
2013-01-01
In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…
Fostering deliberations about health innovation: what do we want to know from publics?
Lehoux, Pascale; Daudelin, Genevieve; Demers-Payette, Olivier; Boivin, Antoine
2009-06-01
As more complex and uncertain forms of health innovation keep emerging, scholars are increasingly voicing arguments in favour of public involvement in health innovation policy. The current conceptualization of this involvement is, however, somewhat problematic as it tends to assume that scientific facts form a "hard," indisputable core around which "soft," relative values can be attached. This paper, by giving precedence to epistemological issues, explores what there is to know from public involvement. We argue that knowledge and normative assumptions are co-constitutive of each other and pivotal to the ways in which both experts and non-experts reason about health innovations. Because knowledge and normative assumptions are different but interrelated ways of reasoning, public involvement initiatives need to emphasise deliberative processes that maximise mutual learning within and across various groups of both experts and non-experts (who, we argue, all belong to the "publics"). Hence, we believe that what researchers might wish to know from publics is how their reasoning is anchored in normative assumptions (what makes a given innovation desirable?) and in knowledge about the plausibility of their effects (are they likely to be realised?). Accordingly, one sensible goal of greater public involvement in health innovation policy would be to refine normative assumptions and make their articulation with scientific observations explicit and openly contestable. The paper concludes that we must differentiate between normative assumptions and knowledge, rather than set up a dichotomy between them or confound them.
Is the local linearity of space-time inherited from the linearity of probabilities?
NASA Astrophysics Data System (ADS)
Müller, Markus P.; Carrozza, Sylvain; Höhn, Philipp A.
2017-02-01
The appearance of linear spaces, describing physical quantities by vectors and tensors, is ubiquitous in all of physics, from classical mechanics to the modern notion of local Lorentz invariance. However, as natural as this seems to the physicist, most computer scientists would argue that something like a ‘local linear tangent space’ is not very typical and in fact a quite surprising property of any conceivable world or algorithm. In this paper, we take the perspective of the computer scientist seriously, and ask whether there could be any inherently information-theoretic reason to expect this notion of linearity to appear in physics. We give a series of simple arguments, spanning quantum information theory, group representation theory, and renormalization in quantum gravity, that supports a surprising thesis: namely, that the local linearity of space-time might ultimately be a consequence of the linearity of probabilities. While our arguments involve a fair amount of speculation, they have the virtue of being independent of any detailed assumptions on quantum gravity, and they are in harmony with several independent recent ideas on emergent space-time in high-energy physics.
ERIC Educational Resources Information Center
Haegele, Justin A.; Hodge, Samuel R.
2015-01-01
Emerging professionals, particularly senior-level undergraduate and graduate students in kinesiology who have an interest in physical education for individuals with and without disabilities, should understand the basic assumptions of the quantitative research paradigm. Knowledge of basic assumptions is critical for conducting, analyzing, and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Góźdź, A., E-mail: andrzej.gozdz@umcs.lublin.pl; Góźdź, M., E-mail: mgozdz@kft.umcs.lublin.pl
The theory of neutrino oscillations rests on the assumption that the interaction basis and the physical (mass) basis of neutrino states are different. Therefore a neutrino is produced in a certain well-defined superposition of three mass eigenstates, which propagate separately and may be detected as a different superposition. This is called flavor oscillations. It is, however, not clear why neutrinos behave this way, i.e., what is the underlying mechanism which leads to the production of a superposition of physical states in a single reaction. In this paper we argue that one of the reasons may be connected with the temporal structure of the process. In order to discuss the role of time in processes on the quantum level, we use a special formulation of quantum mechanics, which is based on the projection time evolution. We arrive at the conclusion that for short reaction times the formation of a superposition of states of similar masses is natural.
The Hidden Reason Behind Children's Misbehavior.
ERIC Educational Resources Information Center
Nystul, Michael S.
1986-01-01
Discusses hidden reason theory based on the assumptions that: (1) the nature of people is positive; (2) a child's most basic psychological need is involvement; and (3) a child has four possible choices in life (good somebody, good nobody, bad somebody, or severely mentally ill.) A three step approach for implementing hidden reason theory is…
Logic Brightens My Day: Evidence for Implicit Sensitivity to Logical Validity
ERIC Educational Resources Information Center
Trippas, Dries; Handley, Simon J.; Verde, Michael F.; Morsanyi, Kinga
2016-01-01
A key assumption of dual process theory is that reasoning is an explicit, effortful, deliberative process. The present study offers evidence for an implicit, possibly intuitive component of reasoning. Participants were shown sentences embedded in logically valid or invalid arguments. Participants were not asked to reason but instead rated the…
Halo-independent direct detection analyses without mass assumptions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Adam J.; Fox, Patrick J.; Kahn, Yonatan
2015-10-01
Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the m_χ–σ_n plane. Recently, methods which are independent of the DM halo velocity distribution have been developed which present results in the v_min–g̃ plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from v_min to nuclear recoil momentum (p_R), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call h̃(p_R). The entire family of conventional halo-independent g̃(v_min) plots for all DM masses is directly found from the single h̃(p_R) plot through a simple rescaling of axes. By considering results in h̃(p_R) space, one can determine if two experiments are inconsistent for all masses and all physically possible halos, or for what range of dark matter masses the results are inconsistent for all halos, without the necessity of multiple g̃(v_min) plots for different DM masses. We conduct a sample analysis comparing the CDMS II Si events to the null results from LUX, XENON10, and SuperCDMS using our method and discuss how the results can be strengthened by imposing the physically reasonable requirement of a finite halo escape velocity.
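The axis-rescaling step this abstract describes can be sketched as follows: for elastic scattering, v_min = p_R / (2μ), with μ the DM-nucleus reduced mass, so one mass-independent curve yields the conventional plot for any mass. Units and the example curve below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# For elastic scattering, v_min = p_R / (2 mu), with mu the DM-nucleus
# reduced mass.  A single mass-independent curve h(p_R) therefore yields
# the conventional g(v_min) curve for ANY dark-matter mass by rescaling
# the x-axis only.  Units and the example curve are assumptions.
def reduced_mass(m_chi, m_N):
    return m_chi * m_N / (m_chi + m_N)

def g_of_vmin(p_R, h, m_chi, m_N):
    """Rescale h(p_R) to g(v_min) for dark-matter mass m_chi.

    p_R in MeV, masses in GeV; v_min is returned in units of c.
    The y-values are unchanged -- only the axis is relabelled.
    """
    mu = reduced_mass(m_chi, m_N) * 1e3   # GeV -> MeV
    v_min = p_R / (2.0 * mu)
    return v_min, h

# Hypothetical h(p_R) curve on a recoil-momentum grid (MeV):
p_R = np.linspace(10.0, 200.0, 50)
h = 1.0 / p_R**2                          # made-up shape for illustration
v_si, g_si = g_of_vmin(p_R, h, m_chi=9.0, m_N=26.0)   # ~silicon target
```

Because only the x-axis moves, comparisons between experiments carried out in h̃(p_R) space automatically hold for every dark matter mass at once, which is the point of the method.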
Halo-independent direct detection analyses without mass assumptions
Anderson, Adam J.; Fox, Patrick J.; Kahn, Yonatan; ...
2015-10-06
Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the m_χ–σ_n plane. Recently, methods which are independent of the DM halo velocity distribution have been developed which present results in the v_min–g̃ plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from v_min to nuclear recoil momentum (p_R), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call h̃(p_R). The entire family of conventional halo-independent g̃(v_min) plots for all DM masses is directly found from the single h̃(p_R) plot through a simple rescaling of axes. By considering results in h̃(p_R) space, one can determine if two experiments are inconsistent for all masses and all physically possible halos, or for what range of dark matter masses the results are inconsistent for all halos, without the necessity of multiple g̃(v_min) plots for different DM masses. As a result, we conduct a sample analysis comparing the CDMS II Si events to the null results from LUX, XENON10, and SuperCDMS using our method and discuss how the results can be strengthened by imposing the physically reasonable requirement of a finite halo escape velocity.
NASA Astrophysics Data System (ADS)
Salmon, R. A.; Priestley, R. K.; Goven, J. F.
2014-12-01
Scientists, policymakers and science communicators generally work from an assumption that science communication, or 'outreach', is good, and often operate from primarily practice-based knowledge. Meanwhile, the science, technology and society (STS) community, which is strongly grounded in theory and critical analysis, is critical of certain aspects of science communication, particularly in controversial areas of science. Unfortunately, these two groups rarely speak to each other, and when they do they don't necessarily understand one another. Much of this confusion relates to different assumptions around the goals of science communication, as well as differing understandings of the various roles and responsibilities in both science and society. The result, unfortunately, is that science communication practice and theory fail to inform each other. This research is a collaboration between a scientist communicator with a positive attitude to outreach, who works in the field of climate change; a political theorist with expertise in public dialogue around biotechnology, who has been critical of motivations for engaging the public with science; and a science historian and science communicator who has uncovered surprising and significant changes in public attitudes towards nuclear science and technology in New Zealand. By exploring our understanding of science communication through these diverse disciplinary lenses, and considering three fields of science that are or have been highly controversial for different reasons, we have identified several subtleties in both the politics of communicating different areas of controversial science, and the difficulties of finding a common language across social and physical sciences. We conclude that greater reflexivity about our own roles and assumptions, and increased efforts at enhanced understanding across disciplines, is central to applying the theories in STS to the practice of communication by scientists.
CMG-Augmented Control of a Hovering VTOL Platform
NASA Technical Reports Server (NTRS)
Lim, K. B.; Moerder, D. D.
2007-01-01
This paper describes how Control Moment Gyroscopes (CMGs) can be used for stability augmentation of a thrust vectoring system for a generic Vertical Take-Off and Landing platform. The response characteristics of the platform which uses only thrust vectoring and a second configuration which includes a single-gimbal CMG array are simulated and compared for hovering flight while subject to severe air turbulence. Simulation results demonstrate the effectiveness of a CMG array in its ability to significantly reduce the agility requirement on the thrust vectoring system. Although the generic CMG configuration is modeled with simplifying physical assumptions, the numerical results also suggest that reasonably sized CMGs will likely be sufficient for a small hovering vehicle.
Suicide Notes in Hong Kong in 2000
ERIC Educational Resources Information Center
Wong, Paul W. C.; Yeung, April W. M.; Chan, Wincy S. C.; Yip, Paul S. F.; Tang, Arthur K. H.
2009-01-01
Suicide notes have been regarded as one of the most informative data sources to understand the reasons why people commit suicide. However, there is a paucity of suicide note studies, leaving researchers with an assumption that this phenomenon remains static over time. This study examines this assumption by comparing the characteristics of…
Abusive Administration: A Case Study
ERIC Educational Resources Information Center
Jefferson, Anne L.
2006-01-01
In the academic world, there is an assumption of reasonable administrative conduct. In fact, to ensure such conduct, universities, like other public institutions, may have collective agreements to reinforce this assumption. However, in some cases, the university as employer can be very quick off the mark should any faculty member wander into what it…
New Assumptions to Guide SETI Research
NASA Technical Reports Server (NTRS)
Colombano, S. P.
2018-01-01
The recent Kepler discoveries of Earth-like planets offer the opportunity to focus our attention on detecting signs of life and technology in specific planetary systems, but I feel we need to become more flexible in our assumptions. The reason is that, while it is still reasonable and conservative to assume that life is most likely to have originated in conditions similar to ours, the vast time differences in potential evolutions render the likelihood of "matching" technologies very slim. In light of these challenges I propose a more "aggressive" approach to future SETI exploration in directions that until now have received little consideration.
NASA Technical Reports Server (NTRS)
Perez-Peraza, J.; Alvarez, M.; Laville, A.; Gallegos, A.
1985-01-01
The study of charge changing cross sections of fast ions colliding with matter provides the fundamental basis for the analysis of the charge states produced in such interactions. Given the high degree of complexity of the phenomena, there is no theoretical treatment able to give a comprehensive description. In fact, the processes involved depend strongly on the basic parameters of the projectile, such as velocity, charge state, and atomic number, and on the target parameters: the physical state (molecular, atomic or ionized matter) and density. The target velocity may also influence the process, through the temperature of the traversed medium. In addition, multiple electron transfer in single collisions further complicates the phenomena. In simplified cases, such as protons moving through atomic hydrogen, considerable agreement has been obtained between theory and experiment. However, in general the available theoretical approaches have only limited validity in restricted regions of the basic parameters. Since most measurements of charge changing cross sections are performed in atomic matter at ambient temperature, models are commonly based on the assumption of targets at rest; at astrophysical scales, however, temperature displays a wide range in atomic and ionized matter. Therefore, due to the lack of experimental data, an attempt is made here to quantify temperature-dependent cross sections on the basis of somewhat arbitrary but physically reasonable assumptions.
ERIC Educational Resources Information Center
Cullipher, S.; Sevian, H.; Talanquer, V.
2015-01-01
The ability to evaluate options and make informed decisions about problems in relevant contexts is a core competency in science education that requires the use of both domain-general and discipline-specific knowledge and reasoning strategies. In this study we investigated the implicit assumptions and modes of reasoning applied by individuals with…
Memory Activation and the Availability of Explanations in Sequential Diagnostic Reasoning
ERIC Educational Resources Information Center
Mehlhorn, Katja; Taatgen, Niels A.; Lebiere, Christian; Krems, Josef F.
2011-01-01
In the field of diagnostic reasoning, it has been argued that memory activation can provide the reasoner with a subset of possible explanations from memory that are highly adaptive for the task at hand. However, few studies have experimentally tested this assumption. Even less empirical and theoretical work has investigated how newly incoming…
In Their Own Words: A Qualitative Study of the Reasons Australian University Students Plagiarize
ERIC Educational Resources Information Center
Devlin, Marcia; Gray, Kathleen
2007-01-01
The ways in which universities and individual academics attempt to deter and respond to student plagiarism may be based on untested assumptions about particular or primary reasons for this behaviour. Using a series of group interviews, this qualitative study gathered the views of 56 Australian university students on the possible reasons for…
On Cognitive Constraints and Learning Progressions: The Case of "Structure of Matter"
ERIC Educational Resources Information Center
Talanquer, Vicente
2009-01-01
Based on the analysis of available research on students' alternative conceptions about the particulate nature of matter, we identified basic implicit assumptions that seem to constrain students' ideas and reasoning on this topic at various learning stages. Although many of these assumptions are interrelated, some of them seem to change or…
Just Say No to Affirmative Action
ERIC Educational Resources Information Center
Heriot, Gail
2011-01-01
The assumption behind the fierce competition for admission to elite colleges and universities is clear: The more elite the school one attends, the brighter one's future. That assumption, however, may well be flawed. The research examined recently by the U.S. Commission on Civil Rights provides strong reason to believe that attending the most…
7 CFR 1779.88 - Transfers and assumptions.
Code of Federal Regulations, 2013 CFR
2013-01-01
... borrowers will include a one-time nonrefundable transfer fee to the Agency of no more than 1 percent... reasonable debt-paying ability considering their assets and income at the time of transfer, and (ii) The... 7 Agriculture 12 2013-01-01 2013-01-01 false Transfers and assumptions. 1779.88 Section 1779.88...
ERIC Educational Resources Information Center
Sánchez Tapia, Ingrid; Gelman, Susan A.; Hollander, Michelle A.; Manczak, Erika M.; Mannheim, Bruce; Escalante, Carmen
2016-01-01
Teleological reasoning involves the assumption that entities exist for a purpose (giraffes have long necks for reaching leaves). This study examines how teleological reasoning relates to cultural context, by studying teleological reasoning in 61 Quechua-speaking Peruvian preschoolers (M[subscript age] = 5.3 years) and adults in an indigenous…
Strong Stackelberg reasoning in symmetric games: An experimental replication and extension
Colman, Andrew M.; Lawrence, Catherine L.
2014-01-01
In common interest games in which players are motivated to coordinate their strategies to achieve a jointly optimal outcome, orthodox game theory provides no general reason or justification for choosing the required strategies. In the simplest cases, where the optimal strategies are intuitively obvious, human decision makers generally coordinate without difficulty, but how they achieve this is poorly understood. Most theories seeking to explain strategic coordination have limited applicability, or require changes to the game specification, or introduce implausible assumptions or radical departures from fundamental game-theoretic assumptions. The theory of strong Stackelberg reasoning, according to which players choose strategies that would maximize their own payoffs if their co-players could invariably anticipate any strategy and respond with a best reply to it, avoids these problems and explains strategic coordination in all dyadic common interest games. Previous experiments have provided evidence for strong Stackelberg reasoning in asymmetric games. Here we report evidence from two experiments consistent with players being influenced by strong Stackelberg reasoning in a wide variety of symmetric 3 × 3 games but tending to revert to other choice criteria when strong Stackelberg reasoning generates small payoffs. PMID:24688846
Strong Stackelberg reasoning in symmetric games: An experimental replication and extension.
Pulford, Briony D; Colman, Andrew M; Lawrence, Catherine L
2014-01-01
In common interest games in which players are motivated to coordinate their strategies to achieve a jointly optimal outcome, orthodox game theory provides no general reason or justification for choosing the required strategies. In the simplest cases, where the optimal strategies are intuitively obvious, human decision makers generally coordinate without difficulty, but how they achieve this is poorly understood. Most theories seeking to explain strategic coordination have limited applicability, or require changes to the game specification, or introduce implausible assumptions or radical departures from fundamental game-theoretic assumptions. The theory of strong Stackelberg reasoning, according to which players choose strategies that would maximize their own payoffs if their co-players could invariably anticipate any strategy and respond with a best reply to it, avoids these problems and explains strategic coordination in all dyadic common interest games. Previous experiments have provided evidence for strong Stackelberg reasoning in asymmetric games. Here we report evidence from two experiments consistent with players being influenced by strong Stackelberg reasoning in a wide variety of symmetric 3 × 3 games but tending to revert to other choice criteria when strong Stackelberg reasoning generates small payoffs.
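A minimal sketch of the strong Stackelberg choice rule described above (the Hi-Lo payoffs are an illustrative assumption, not the games used in the experiments):

```python
# Strong Stackelberg reasoning in a symmetric game: evaluate each own
# strategy i as if the co-player could anticipate it and would best-reply
# (ties broken in the chooser's favour), then pick the best such i.
# A[i][j] is the row player's payoff; by symmetry the co-player's payoff
# when I play i and they play j is A[j][i].
def strong_stackelberg(A):
    n = len(A)
    payoffs = []
    for i in range(n):
        best_reply = max(A[j][i] for j in range(n))
        replies = [j for j in range(n) if A[j][i] == best_reply]
        payoffs.append(max(A[i][j] for j in replies))
    return max(range(n), key=lambda i: payoffs[i]), payoffs

# Hi-Lo: both players prefer (H, H), yet orthodox theory cannot single
# it out; strong Stackelberg reasoning selects H (index 0) directly.
hi_lo = [[3, 0],
         [0, 1]]
choice, payoffs = strong_stackelberg(hi_lo)
print(choice, payoffs)   # -> 0 [3, 1]
```

This also illustrates the reversion noted in the abstract: when the Stackelberg payoff of a strategy is small, the rule's recommendation carries little weight and other choice criteria can dominate.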
Blacksher, Erika; Lovasi, Gina S
2012-03-01
Built environment characteristics have been linked to health outcomes and health disparities. However, the effects of an environment on behavior may depend on human perception, interpretation, motivation, and other forms of human agency. We draw on epidemiological and ethical concepts to articulate a critique of research on the built environment and physical activity. We identify problematic assumptions and enumerate both scientific and ethical reasons to incorporate subjective perspectives and public engagement strategies into built environment research and interventions. We maintain that taking agency seriously is essential to the pursuit of health equity and the broader demands of social justice in public health, an important consideration as studies of the built environment and physical activity increasingly focus on socially disadvantaged communities. Attention to how people understand their environment and navigate competing demands can improve the scientific value of ongoing efforts to promote active living and health, while also better fulfilling our ethical obligations to the individuals and communities whose health we strive to protect. Copyright © 2011 Elsevier Ltd. All rights reserved.
Voorspoels, Wouter; Navarro, Daniel J; Perfors, Amy; Ransom, Keith; Storms, Gert
2015-09-01
A robust finding in category-based induction tasks is for positive observations to raise the willingness to generalize to other categories while negative observations lower the willingness to generalize. This pattern is referred to as monotonic generalization. Across three experiments we find systematic non-monotonicity effects, in which negative observations raise the willingness to generalize. Experiments 1 and 2 show that this effect emerges in hierarchically structured domains when a negative observation from a different category is added to a positive observation. They also demonstrate that this is related to a specific kind of shift in the reasoner's hypothesis space. Experiment 3 shows that the effect depends on the assumptions that the reasoner makes about how inductive arguments are constructed. Non-monotonic reasoning occurs when people believe the facts were put together by a helpful communicator, but monotonicity is restored when they believe the observations were sampled randomly from the environment. Copyright © 2015 Elsevier Inc. All rights reserved.
Monte Carlo simulation of wave sensing with a short pulse radar
NASA Technical Reports Server (NTRS)
Levine, D. M.; Davisson, L. D.; Kutz, R. L.
1977-01-01
A Monte Carlo simulation is used to study the ocean wave sensing potential of a radar which scatters short pulses at small off-nadir angles. In the simulation, realizations of a random surface are created commensurate with an assigned probability density and power spectrum. Then the signal scattered back to the radar is computed for each realization using a physical optics analysis which takes wavefront curvature and finite radar-to-surface distance into account. In the case of a Pierson-Moskowitz spectrum and a normally distributed surface, reasonable assumptions for a fully developed sea, it has been found that the cumulative distribution of time intervals between peaks in the scattered power provides a measure of surface roughness. This observation is supported by experiments.
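The first step of such a simulation, drawing surface realizations commensurate with an assigned spectrum, can be sketched as a random-phase spectral sum. This is our illustrative reconstruction, not the authors' code: the simplified Pierson-Moskowitz-like wavenumber spectrum, the wind speed U, and the domain size are assumed values.

```python
import math
import random

def surface_realization(n=256, L=256.0, modes=64, g=9.81, U=10.0, seed=0):
    """One realization of a Gaussian random sea surface eta(x) on [0, L),
    synthesized as a random-phase sum over wavenumber modes weighted by a
    simplified Pierson-Moskowitz-like spectrum (fully developed sea)."""
    rng = random.Random(seed)
    dk = 2.0 * math.pi / L                    # wavenumber spacing (rad/m)
    ks = [dk * m for m in range(1, modes + 1)]
    amps, phases = [], []
    for k in ks:
        # Simplified wavenumber spectrum; constants 0.0081 and 0.74 are the
        # usual Pierson-Moskowitz values, applied here schematically.
        S = (0.0081 / (2.0 * k ** 3)) * math.exp(-0.74 * (g / (U * U * k)) ** 2)
        amps.append(math.sqrt(2.0 * S * dk))  # amplitude so each mode carries S*dk variance
        phases.append(rng.uniform(0.0, 2.0 * math.pi))
    xs = [L * i / n for i in range(n)]
    return [sum(a * math.cos(k * x + p) for a, k, p in zip(amps, ks, phases))
            for x in xs]
```

Repeating this with fresh seeds yields the ensemble of realizations over which scattering statistics, such as the distribution of intervals between peaks in the scattered power, can be accumulated.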
Some Thoughts on the Implications of Faster-Than-Light Interstellar Space Travel
NASA Astrophysics Data System (ADS)
Crawford, I. A.
1995-09-01
There are reasons for believing that faster-than-light (FTL) interstellar space travel may be consistent with the laws of physics, and a brief review of various FTL travel concepts is presented. It is argued that FTL travel would revolutionise the scientific exploration of the Universe, but would only significantly shorten the Galactic colonisation timescale from the 10^6 years estimated on the assumption of sub-light interstellar travel if the mass-production of FTL space vehicles proves to be practical. FTL travel would permit the development of interstellar social and political institutions which would probably be impossible otherwise, and may therefore strengthen the 'zoo hypothesis' as an explanation for the apparent absence of extraterrestrial beings in the Solar System.
ERIC Educational Resources Information Center
Kagan, Jerome
Noting that a reluctance to question some assumptions of social and behavior sciences is one reason for the halting progress in these fields, this book examines three potentially misleading ideas and reasons for their continued popularity. Chapter 1 critiques the idea that all behavior is influenced by one's psychological construction of the…
Nonrational Processes in Ethical Decision Making
ERIC Educational Resources Information Center
Rogerson, Mark D.; Gottlieb, Michael C.; Handelsman, Mitchell M.; Knapp, Samuel; Younggren, Jeffrey
2011-01-01
Most current ethical decision-making models provide a logical and reasoned process for making ethical judgments, but these models are empirically unproven and rely upon assumptions of rational, conscious, and quasi-legal reasoning. Such models predominate despite the fact that many nonrational factors influence ethical thought and behavior,…
Religious Conviction, Morality and Social Convention among Finnish Adolescents
ERIC Educational Resources Information Center
Vainio, Annukka
2011-01-01
The assumptions of Kohlberg, Turiel and Shweder regarding the features of moral reasoning were compared empirically. The moral reasoning of Finnish Evangelical Lutheran, Conservative Laestadian and non-religious adolescents was studied using Kohlberg's Moral Judgment Interview and Turiel Rule Transgression Interview methods. Religiosity and choice…
Exploring examinee behaviours as validity evidence for multiple-choice question examinations.
Surry, Luke T; Torre, Dario; Durning, Steven J
2017-10-01
Clinical-vignette multiple choice question (MCQ) examinations are used widely in medical education. Standardised MCQ examinations are used by licensure and certification bodies to award credentials that are meant to assure stakeholders as to the quality of physicians. Such uses are based on the interpretation of MCQ examination performance as giving meaningful information about the quality of clinical reasoning. There are several assumptions foundational to these interpretations and uses of standardised MCQ examinations. This study explores the implicit assumption that cognitive processes elicited by clinical-vignette MCQ items are like the processes thought to occur with 'real-world' clinical reasoning as theorised by dual-process theory. Fourteen participants (three medical students, five residents and six staff physicians) completed three sets of five timed MCQ items (total 15) from the Medical Knowledge Self-Assessment Program (MKSAP). Upon answering a set of MCQs, each participant completed a retrospective think aloud (TA) protocol. Using constant comparative analysis (CCA) methods sensitised by dual-process theory, we performed a qualitative thematic analysis. Examinee behaviours fell into three categories: clinical reasoning behaviours, test-taking behaviours and reactions to the MCQ. Consistent with dual-process theory, statements about clinical reasoning behaviours were divided into two sub-categories: analytical reasoning and non-analytical reasoning. Each of these categories included several themes. Our study provides some validity evidence that test-takers' descriptions of their cognitive processes during completion of high-quality clinical-vignette MCQs align with processes expected in real-world clinical reasoning. This supports one of the assumptions important for interpretations of MCQ examination scores as meaningful measures of clinical reasoning. 
Our observations also suggest that MCQs elicit other cognitive processes, including certain test-taking behaviours, that seem 'inauthentic' to real-world clinical reasoning. Further research is needed to explore if similar themes arise in other contexts (e.g. simulated patient encounters) and how observed behaviours relate to performance on MCQ-based assessments. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
2017-04-17
Keywords: Cyberphysical Systems, Formal Methods, Requirements Patterns, AADL, Assume Guarantee Reasoning Environment. Rockwell Collins has been addressing these challenges by developing compositional reasoning methods that permit the verification of systems that exceed
Reason, Language and Education: Philosophical Assumptions for New Curricular Orientations
ERIC Educational Resources Information Center
Papastephanou, Marianna; Koutselini, Mary
2006-01-01
A theory of reason, language and their interconnection constitutes a research topic of epistemological, ontological and metaphysical significance. It also represents a crucial point of contention between defenders and detractors of postmodernism. Therefore, in this article we set out to discuss its stakes and search for its most accomplished…
Educational Applications of the Dialectic: Theory and Research.
ERIC Educational Resources Information Center
Slife, Brent D.
The field of education has largely ignored the concept of the dialectic, except in the Socratic teaching method, and even there bipolar meaning or reasoning has not been recognized. Mainstream educational psychology bases its assumptions about human reasoning and learning on current demonstrative concepts of information processing and levels of…
ERIC Educational Resources Information Center
Thacker, Rebecca A.; Gohmann, Stephen F.
1993-01-01
Discusses the "reasonable woman" standard in sexual harassment cases and gender-based differences in defining harassment. Investigates the issue of these differences in the emotional and psychological effects of hostile environments, using data from a survey of 8,523 public employees. (SK)
Quantum dot laser optimization: selectively doped layers
NASA Astrophysics Data System (ADS)
Korenev, Vladimir V.; Konoplev, Sergey S.; Savelyev, Artem V.; Shernyakov, Yurii M.; Maximov, Mikhail V.; Zhukov, Alexey E.
2016-08-01
Edge emitting quantum dot (QD) lasers are discussed. It has recently been proposed to use modulation p-doping of the layers adjacent to the QD layers in order to control the QDs' charge state. Experimentally, this has proven useful for enhancing ground state lasing and suppressing the onset of excited state lasing at high injection. These results have also been confirmed by numerical calculations involving solution of drift-diffusion equations. However, a deep understanding of the physical reasons for such behavior, and laser optimization, require analytical approaches to the problem. In this paper, under a set of assumptions, we provide an analytical model that explains the major effects of selective p-doping. Capture rates of electrons and holes can be calculated by solving Poisson equations for electrons and holes around the charged QD layer. The charge itself is governed by the capture rates and the selective doping concentration. We analyzed this self-consistent set of equations and showed that it can be used to optimize QD laser performance and to explain the underlying physics.
NASA Astrophysics Data System (ADS)
Hess, Julian; Wang, Yongqi
2016-11-01
A new mixture model for granular-fluid flows, which is thermodynamically consistent with the entropy principle, is presented. The extra pore pressure, described by a pressure diffusion equation, and the hypoplastic material behavior, obeying a transport equation, are taken into account. The model is applied to granular-fluid flows, using a closing assumption in conjunction with the dynamic fluid pressure to describe the pressure-like residual unknowns, thereby overcoming previous uncertainties in the modeling process. Besides the thermodynamically consistent modeling, numerical simulations are carried out and demonstrate physically reasonable results, including simple shear flow, in order to investigate the vertical distribution of the physical quantities, and a mixture flow down an inclined plane by means of the depth-integrated model. The results give insight into the ability of the deduced model to capture the key characteristics of granular-fluid flows. We acknowledge the support of the Deutsche Forschungsgemeinschaft (DFG) for this work within the Project Number WA 2610/3-1.
Physics evaluation of compact tokamak ignition experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uckan, N.A.; Houlberg, W.A.; Sheffield, J.
1985-01-01
At present, several approaches for compact, high-field tokamak ignition experiments are being considered. A comprehensive method for analyzing the potential physics operating regimes and plasma performance characteristics of such ignition experiments with 0-D (analytic) and 1-1/2-D (WHIST) transport models is presented. The results from both calculations are in agreement and show that there are regimes in parameter space in which a class of small (R_0 ≈ 1-2 m), high-field (B_0 ≈ 8-13 T) tokamaks with aB_0S/q_* ≈ 25 ± 5 and κ = b/a ≈ 1.6-2.0 appears ignitable for a reasonable range of transport assumptions. Considering both the density and beta limits, an evaluation of the performance is presented for various forms of χ_e and χ_i, including degradation at high power and sawtooth activity. The prospects of ohmic ignition are also examined. 16 refs., 13 figs.
Relativities of fundamentality
NASA Astrophysics Data System (ADS)
McKenzie, Kerry
2017-08-01
S-dualities have been held to have radical implications for our metaphysics of fundamentality. In particular, it has been claimed that they make the fundamentality status of a physical object theory-relative in an important new way. But what physicists have had to say on the issue has not been clear or consistent, and in particular seems to be ambiguous between whether S-dualities demand an anti-realist interpretation of fundamentality talk or merely a revised realism. This paper is an attempt to bring some clarity to the matter. After showing that even antecedently familiar fundamentality claims are true only relative to a raft of metaphysical, physical, and mathematical assumptions, I argue that the relativity of fundamentality inherent in S-duality nevertheless represents something new, and that part of the reason for this is that it has both realist and anti-realist implications for fundamentality talk. I close by discussing the broader significance that S-dualities have for structuralist metaphysics and for fundamentality metaphysics more generally.
The Project Physics Course (Modularized) for Grades 10-12.
ERIC Educational Resources Information Center
Flint, William
This report was produced by the Sedro-Woolley Project which has the goal of infusing environmental education into the whole curriculum of a school district. Included are assumptions which the author believes are appropriate to environmental education; a relating of these assumptions to some topics of chemistry and physics; an outline of specific…
ERIC Educational Resources Information Center
Sobel, David M.; Kirkham, Natasha Z.
2007-01-01
A fundamental assumption of the causal graphical model framework is the Markov assumption, which posits that learners can discriminate between two events that are dependent because of a direct causal relation between them and two events that are independent conditional on the value of another event(s). Sobel and Kirkham (2006) demonstrated that…
Life Support Baseline Values and Assumptions Document
NASA Technical Reports Server (NTRS)
Anderson, Molly S.; Ewert, Michael K.; Keener, John F.; Wagner, Sandra A.
2015-01-01
The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. With the ability to accurately compare different technologies' performance for the same function, managers will be able to make better decisions regarding technology development.
Evaluation of a distributed catchment scale water balance model
NASA Technical Reports Server (NTRS)
Troch, Peter A.; Mancini, Marco; Paniconi, Claudio; Wood, Eric F.
1993-01-01
The validity of some of the simplifying assumptions in a conceptual water balance model is investigated by comparing simulation results from the conceptual model with simulation results from a three-dimensional physically based numerical model and with field observations. We examine, in particular, assumptions and simplifications related to water table dynamics, vertical soil moisture and pressure head distributions, and subsurface flow contributions to stream discharge. The conceptual model relies on a topographic index to predict saturation excess runoff and on Philip's infiltration equation to predict infiltration excess runoff. The numerical model solves the three-dimensional Richards equation describing flow in variably saturated porous media, and handles seepage face boundaries, infiltration excess and saturation excess runoff production, and soil driven and atmosphere driven surface fluxes. The study catchments (a 7.2 sq km catchment and a 0.64 sq km subcatchment) are located in the North Appalachian ridge and valley region of eastern Pennsylvania. Hydrologic data collected during the MACHYDRO 90 field experiment are used to calibrate the models and to evaluate simulation results. Water table dynamics as predicted by the conceptual model are close to the observations in a shallow water well; a linear relationship between a topographic index and the local water table depth is therefore a reasonable assumption for catchment scale modeling. However, the hydraulic equilibrium assumption is not valid for the upper 100 cm layer of the unsaturated zone, and a conceptual model that incorporates a root zone is suggested. Furthermore, theoretical subsurface flow characteristics from the conceptual model are found to differ from field observations, numerical simulation results, and theoretical baseflow recession characteristics based on Boussinesq's groundwater equation.
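The topographic index at the heart of such conceptual models can be illustrated with a minimal sketch. This is our own toy example, not the paper's code: the TOPMODEL-style index ln(a / tan β) is computed along a 1-D hillslope profile, where a is the upslope contributing area per unit contour width and tan β the local slope; the elevations, grid spacing, and flat-cell guard are assumed values.

```python
import math

def topographic_index(elevations, dx=10.0):
    """ln(a / tan(beta)) at interior points of a downhill-ordered 1-D hillslope."""
    out = []
    for i in range(1, len(elevations)):
        a = i * dx                                  # upslope area per unit contour width
        slope = (elevations[i - 1] - elevations[i]) / dx
        out.append(math.log(a / max(slope, 1e-6)))  # guard against flat cells
    return out

profile = [100.0, 96.0, 93.0, 91.0, 90.0]           # monotonically downhill elevations (m)
print([round(v, 2) for v in topographic_index(profile)])  # -> [3.22, 4.2, 5.01, 5.99]
```

The index grows downslope as contributing area accumulates and slope flattens, which is why high-index cells mark the likely locations of saturation excess runoff.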
Condition trees as a mechanism for communicating the meaning of uncertainties
NASA Astrophysics Data System (ADS)
Beven, Keith
2015-04-01
Uncertainty communication for environmental problems is fraught with difficulty for good epistemic reasons. The fact that most sources of uncertainty are subject to, and often dominated by, epistemic uncertainties means that the unthinking use of probability theory might actually be misleading and lead to false inference (even in some cases where the assumptions of a probabilistic error model might seem to be reasonably valid). This therefore creates problems in communicating the meaning of probabilistic uncertainties of model predictions to potential users (there are many examples in hydrology, hydraulics, climate change and other domains). It is suggested that one way of being more explicit about the meaning of uncertainties is to associate each type of application with a condition tree of assumptions that need to be made in producing an estimate of uncertainty. The condition tree then provides a basis for discussion and communication of assumptions about uncertainties with users. Agreement of assumptions (albeit generally at some institutional level) will provide some buy-in on the part of users, and a basis for commissioning of future studies. Even in some relatively well-defined problems, such as mapping flood risk, such a condition tree can be rather extensive, but by making each step in the tree explicit then an audit trail is established for future reference. This can act to provide focus in the exercise of agreeing more realistic assumptions.
Lessons Learned from Numerical Simulations of the F-16XL Aircraft at Flight Conditions
NASA Technical Reports Server (NTRS)
Rizzi, Arthur; Jirasek, Adam; Lamar, John; Crippa, Simone; Badcock, Kenneth; Boelens, Oklo
2009-01-01
Nine groups participating in the Cranked Arrow Wing Aerodynamics Project International (CAWAPI) project have contributed steady and unsteady viscous simulations of a full-scale, semi-span model of the F-16XL aircraft. Three different categories of flight Reynolds/Mach number combinations were computed and compared with flight-test measurements for the purpose of code validation and improved understanding of the flight physics. Steady-state simulations are done with several turbulence models of different complexity with no topology information required and which overcome Boussinesq-assumption problems in vortical flows. Detached-eddy simulation (DES) and its successor delayed detached-eddy simulation (DDES) have been used to compute the time accurate flow development. Common structured and unstructured grids as well as individually-adapted unstructured grids were used. Although discrepancies are observed in the comparisons, overall reasonable agreement is demonstrated for surface pressure distribution, local skin friction and boundary velocity profiles at subsonic speeds. The physical modeling, steady or unsteady, and the grid resolution both contribute to the discrepancies observed in the comparisons with flight data, but at this time it cannot be determined how much each part contributes to the whole. Overall it can be said that the technology readiness of CFD-simulation technology for the study of vehicle performance has matured since 2001 such that it can be used today with a reasonable level of confidence for complex configurations.
Dimensions of the Feminist Research Methodology Debate: Impetus, Definitions, Dilemmas & Stances.
ERIC Educational Resources Information Center
Reinharz, Shulamit
For various well-documented reasons, the feminist social movement has been critical of academia as a worksetting and of the social sciences as a set of disciplines. For these reasons, feminists claim that the assumptions underlying several research designs and procedures are sexist. They have developed a feminist methodology to examine these…
The mathematical bases for qualitative reasoning
NASA Technical Reports Server (NTRS)
Kalagnanam, Jayant; Simon, Herbert A.; Iwasaki, Yumi
1991-01-01
The practices of researchers in many fields who use qualitative reasoning are summarized and explained. The goal is to gain an understanding of the formal assumptions and mechanisms that underlie this kind of analysis. The explanations given are based on standard mathematical formalisms, particularly on ordinal properties, continuous differentiable functions, and the mathematics of nonlinear dynamic systems.
Substance Abuse Counselors and Moral Reasoning: Hypothetical and Authentic Dilemmas
ERIC Educational Resources Information Center
Sias, Shari M.
2009-01-01
This exploratory study examined the assumption that the level of moral reasoning (Defining Issues Test; J. R. Rest, 1986) used in solving hypothetical and authentic dilemmas is similar for substance abuse counselors (N = 188). The statistical analyses used were paired-sample t tests, Pearson product-moment correlation, and simultaneous multiple…
Evaluation of Final Examination Papers in Engineering: A Case Study Using Bloom's Taxonomy
ERIC Educational Resources Information Center
Swart, A. J.
2010-01-01
Questions are used to obtain information, stimulate thinking, and redirect reasoning. Academics in higher education use questions on a daily basis to stimulate thinking and reasoning in students. Final examination papers are used by academics to assess the retention and application skills of students. The assumption, however, exists that questions…
Direct observation limits on antimatter gravitation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fischler, Mark; Lykken, Joe; Roberts, Tom
2008-06-01
The proposed Antihydrogen Gravity experiment at Fermilab (P981) will directly measure the gravitational attraction ḡ between antihydrogen and the Earth, with an accuracy of 1% or better. The following key question has been asked by the PAC: Is a possible 1% difference between ḡ and g already ruled out by other evidence? This memo presents the key points of existing evidence, to answer whether such a difference is ruled out (a) on the basis of direct observational evidence; and/or (b) on the basis of indirect evidence, combined with reasoning based on strongly held theoretical assumptions. The bottom line is that there are no direct observations or measurements of gravitational asymmetry which address the antimatter sector. There is evidence which by indirect reasoning can be taken to rule out such a difference, but the analysis needed to draw that conclusion rests on models and assumptions which are in question for other reasons and are thus worth testing. There is no compelling evidence or theoretical reason to rule out such a difference at the 1% level.
Evaluation of a vortex-based subgrid stress model using DNS databases
NASA Technical Reports Server (NTRS)
Misra, Ashish; Lund, Thomas S.
1996-01-01
The performance of a SubGrid Stress (SGS) model for Large-Eddy Simulation (LES) developed by Misra & Pullin (1996) is studied for forced and decaying isotropic turbulence on a 32^3 grid. The physical viability of the model assumptions is tested using DNS databases. The results from LES of forced turbulence at Taylor Reynolds number R_λ ≈ 90 are compared with filtered DNS fields. Probability density functions (pdfs) of the subgrid energy transfer, total dissipation, and the stretch of the subgrid vorticity by the resolved velocity-gradient tensor show reasonable agreement with the DNS data. The model is also tested in LES of decaying isotropic turbulence, where it correctly predicts the decay rate and energy spectra measured by Comte-Bellot & Corrsin (1971).
Comparison of NGA-West2 directivity models
Spudich, Paul A.; Rowshandel, Badie; Shahi, Shrey; Baker, Jack W.; Chiou, Brian S-J
2014-01-01
Five directivity models have been developed based on data from the NGA-West2 database and based on numerical simulations of large strike-slip and reverse-slip earthquakes. All models avoid the use of normalized rupture dimension, enabling them to scale up to the largest earthquakes in a physically reasonable way. Four of the five models are explicitly “narrow-band” (in which the effect of directivity is maximum at a specific period that is a function of earthquake magnitude). Several strategies for determining the zero-level for directivity have been developed. We show comparisons of maps of the directivity amplification. This comparison suggests that the predicted geographic distributions of directivity amplification are dominated by effects of the models' assumptions, and more than one model should be used for ruptures dipping less than about 65 degrees.
Team reasoning and collective rationality: piercing the veil of obviousness.
Colman, Andrew M; Pulford, Briony D; Rose, Jo
2008-06-01
The experiments reported in our target article provide strong evidence of collective utility maximization, and the findings suggest that team reasoning should now be included among the social value orientations used in cognitive and social psychology. Evidential decision theory offers a possible alternative explanation for our results but fails to predict intuitively compelling strategy choices in simple games with asymmetric team-reasoning outcomes. Although many of our experimental participants evidently used team reasoning, some appear to have ignored the other players' expected strategy choices and used lower-level, nonstrategic forms of reasoning. Standard payoff transformations cannot explain the experimental findings, nor team reasoning in general, without an unrealistic assumption that players invariably reason nonstrategically.
Inflation-Theory Implications for Extraterrestrial Visitation
NASA Astrophysics Data System (ADS)
Deardoff, J.; Haisch, B.; Maccabee, B.; Puthoff, H. E.
It has recently been argued that anthropic reasoning applied to inflation theory reinforces the prediction that we should find ourselves part of a large, galaxy-sized civilisation, thus strengthening Fermi's paradox concerning `Where are they?' Furthermore, superstring and M-brane theory allow for the possibility of parallel universes, some of which in principle could be habitable. In addition, discussion of such exotic transport concepts as `traversable wormholes' now appears in the rigorous physics literature. As a result, the `We are alone' solution to Fermi's paradox, based on the constraints of earlier 20th century viewpoints, appears today to be inconsistent with new developments in our best current physics and astrophysics theories. Therefore we reexamine and reevaluate the present assumption that extraterrestrials or their probes are not in the vicinity of Earth, and argue instead that some evidence of their presence might be found in certain high-quality UFO reports. This study follows up on previous arguments that (1) interstellar travel for advanced civilizations is not a priori ruled out by physical principles and therefore may be practicable, and (2) such advanced civilisations may value the search for knowledge from uncontaminated species more than direct, interspecies communication, thereby accounting for apparent covertness regarding their presence.
Baker, Sarah E; Painter, Elizabeth E; Morgan, Brandon C; Kaus, Anna L; Petersen, Evan J; Allen, Christopher S; Deyle, Gail D; Jensen, Gail M
2017-01-01
Clinical reasoning is essential to physical therapist practice. Solid clinical reasoning processes may lead to greater understanding of the patient condition, early diagnostic hypothesis development, and well-tolerated examination and intervention strategies, as well as mitigate the risk of diagnostic error. However, the complex and often subconscious nature of clinical reasoning can impede the development of this skill. Protracted tools have been published to help guide self-reflection on clinical reasoning but might not be feasible in typical clinical settings. This case illustrates how the Systematic Clinical Reasoning in Physical Therapy (SCRIPT) tool can be used to guide the clinical reasoning process and prompt a physical therapist to search the literature to answer a clinical question and facilitate formal mentorship sessions in postprofessional physical therapist training programs. The SCRIPT tool enabled the mentee to generate appropriate hypotheses, plan the examination, query the literature to answer a clinical question, establish a physical therapist diagnosis, and design an effective treatment plan. The SCRIPT tool also facilitated the mentee's clinical reasoning and provided the mentor insight into the mentee's clinical reasoning. The reliability and validity of the SCRIPT tool have not been formally studied. Clinical mentorship is a cornerstone of postprofessional training programs and intended to develop advanced clinical reasoning skills. However, clinical reasoning is often subconscious and, therefore, a challenging skill to develop. The use of a tool such as the SCRIPT may facilitate developing clinical reasoning skills by providing a systematic approach to data gathering and making clinical judgments to bring clinical reasoning to the conscious level, facilitate self-reflection, and make a mentored physical therapist's thought processes explicit to his or her clinical mentor. © 2017 American Physical Therapy Association
Mallinckrodt, C H; Lin, Q; Molenberghs, M
2013-01-01
The objective of this research was to demonstrate a framework for drawing inference from sensitivity analyses of incomplete longitudinal clinical trial data via a re-analysis of data from a confirmatory clinical trial in depression. A likelihood-based approach that assumed missing at random (MAR) was the primary analysis. Robustness to departure from MAR was assessed by comparing the primary result to those from a series of analyses that employed varying missing not at random (MNAR) assumptions (selection models, pattern mixture models and shared parameter models) and to MAR methods that used inclusive models. The key sensitivity analysis used multiple imputation assuming that after dropout the trajectory of drug-treated patients was that of placebo-treated patients with a similar outcome history (placebo multiple imputation). This result was used as the worst reasonable case to define the lower limit of plausible values for the treatment contrast. The endpoint contrast from the primary analysis was -2.79 (p = .013). In placebo multiple imputation, the result was -2.17. Results from the other sensitivity analyses ranged from -2.21 to -3.87 and were symmetrically distributed around the primary result. Hence, no clear evidence of bias from missing not at random data was found. In the worst reasonable case scenario, the treatment effect was 80% of the magnitude of the primary result. Therefore, it was concluded that a treatment effect existed. The structured sensitivity framework of using a worst reasonable case result based on a controlled imputation approach with transparent and debatable assumptions, supplemented by a series of plausible alternative models under varying assumptions, was useful in this specific situation and holds promise as a generally useful framework. Copyright © 2012 John Wiley & Sons, Ltd.
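The controlled-imputation idea can be sketched in a few lines. In this simplified, hypothetical illustration (single imputation with the placebo-arm mean, rather than the full multiple-imputation procedure described above), drug-arm dropouts are assumed to behave like placebo patients after dropout:

```python
import numpy as np

def placebo_imputation_contrast(drug, placebo, drug_observed):
    """Endpoint contrast (drug - placebo) when drug-arm dropouts are
    imputed with the placebo-arm mean: a single-imputation simplification
    of the placebo multiple-imputation idea (illustrative, not the
    trial's actual procedure).

    drug, placebo : arrays of endpoint values (dropouts' entries unused)
    drug_observed : boolean mask, True where the drug value was observed
    """
    drug = np.asarray(drug, dtype=float)
    placebo = np.asarray(placebo, dtype=float)
    mask = np.asarray(drug_observed, dtype=bool)
    imputed = drug.copy()
    imputed[~mask] = placebo.mean()   # dropouts assumed to behave like placebo
    return imputed.mean() - placebo.mean()
```

Because dropouts are assigned placebo-like outcomes, the imputed contrast is pulled toward zero relative to an analysis of completers only, which is what makes it a defensible "worst reasonable case" bound.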
Numerical solution of the electron transport equation
NASA Astrophysics Data System (ADS)
Woods, Mark
The electron transport equation has been solved many times for a variety of reasons. The main difficulty in its numerical solution is that it is a very stiff boundary value problem. The most common numerical methods for solving boundary value problems are symmetric collocation methods and shooting methods. Both of these types of methods can only be applied to the electron transport equation if the boundary conditions are altered with unrealistic assumptions because they require too many points to be practical. Further, they result in oscillating and negative solutions, which are physically meaningless for the problem at hand. For these reasons, all numerical methods for this problem to date are a bit unusual because they were designed to try to avoid the problem of extreme stiffness. This dissertation shows that there is no need to introduce spurious boundary conditions or invent other numerical methods for the electron transport equation. Rather, methods for very stiff boundary value problems already exist within the numerical analysis literature. We demonstrate one such method in which the fast and slow modes of the boundary value problem are essentially decoupled. This allows an upwind finite difference method to be applied to each mode as appropriate. This greatly reduces the number of points needed in the mesh, and we demonstrate how it eliminates the need to define new boundary conditions. The method is verified by showing that, under certain restrictive assumptions, the electron transport equation has an exact solution that can be written as an integral. We show that the solution from the upwind method agrees with the quadrature evaluation of the exact solution, confirming that the upwind method properly solves the electron transport equation. Further, it is demonstrated that the output of the upwind method can be used to compute auroral light emissions.
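The upwind strategy can be illustrated on a model stiff (singularly perturbed) boundary value problem, -εu'' + u' = 0 with u(0) = 0 and u(1) = 1, which has a sharp boundary layer at x = 1. This model equation is my own illustration, not the electron transport equation itself; the point is that first-order upwind differencing of the convective term yields a monotone, oscillation-free solution even when the mesh is far too coarse to resolve ε (a centered scheme would oscillate whenever h > 2ε):

```python
import numpy as np

def upwind_bvp(eps, n):
    """Solve -eps*u'' + u' = 0 on [0, 1], u(0) = 0, u(1) = 1,
    with first-order upwind (backward) differencing of u'."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    A = np.zeros((n - 1, n - 1))
    b = np.zeros(n - 1)
    for i in range(n - 1):
        # row i is the equation at interior node i+1
        A[i, i] = 2 * eps / h**2 + 1 / h       # diagonal
        if i > 0:
            A[i, i - 1] = -eps / h**2 - 1 / h  # upwind (backward) neighbor
        if i < n - 2:
            A[i, i + 1] = -eps / h**2          # downwind neighbor
    b[-1] += eps / h**2                        # known value u(1) = 1
    u = np.zeros(n + 1)
    u[-1] = 1.0
    u[1:-1] = np.linalg.solve(A, b)
    return x, u
```

With eps = 0.005 and only 50 cells (h = 0.02 = 4ε), the upwind solution stays monotone and non-negative, flat away from the layer, exactly the qualitative behavior the collocation and shooting methods discussed above fail to deliver.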
The gene-editing of super-ego.
Hofmann, Bjørn
2018-04-17
New emerging biotechnologies, such as gene editing, vastly extend our ability to alter the human being. This comes together with strong aspirations to improve humans not only physically, but also mentally, morally, and socially. These conjoined ambitions aggregate to what can be labelled "the gene editing of super-ego." This article investigates a general way used to argue for new biotechnologies, such as gene-editing: if it is safe and efficacious to implement technology X for the purpose of a common good Y, why should we not do so? This is a rhetorical question with a conditional, and may be dismissed as such. Moreover, investigating the question transformed into a formal argument reveals that the argument does not hold either. Nonetheless, the compelling force of the question calls for closer scrutiny, revealing that this way of arguing for biotechnology is based on five assumptions. Analysis of these assumptions shows their significant axiological, empirical, and philosophical challenges. This makes it reasonable to claim that these kinds of question-based promotions of specific biotechnologies fail. Hence, the aspirations to make a super-man with a super-ego appear fundamentally flawed. As these types of moral bioenhancement arguments become more prevalent, a revealing hype test is suggested: What is special about this technology (e.g., gene editing), compared to existing methods, that makes it successful in improving human social characteristics in order to make the world a better place for all? Valid answers to this question will provide good reasons to pursue such technologies. Hence, the aim is not to bar the development of modern biotechnology, but rather to ensure good developments and applications of highly potent technologies. So far, we still have a long way to go to make persons with goodness gene(s).
Advanced quantitative measurement methodology in physics education research
NASA Astrophysics Data System (ADS)
Wang, Jing
The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods, and the interpretation of the results they produce, should be connected to the educational context. In making this connection, issues of educational models often arise. Many widely used statistical methods do not make assumptions about the mental structure of subjects, nor do they provide explanations tailored to the educational audience. There are also other methods that consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumption and parameter estimation, and are complicated mathematically. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and the pure mathematical methods. The comparison is based on the performance of the two types of methods under physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments.
The dissertation includes three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. The two theories are both applied to Force Concept Inventory data obtained from students enrolled in The Ohio State University. Effort was made to examine the similarity and difference between the two theories, and the possible explanation to the difference. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on the measure of association for binary data. In quantitative assessment, binary data is often encountered because of its simplicity. The current popular measures of association fail under some extremely unbalanced conditions. However, the occurrence of these conditions is not rare in educational data. Two popular association measures, Pearson's correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages. Special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed. Typical misunderstanding and misusage of EFA are explored.
The EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment on scientific reasoning skills. The reasoning ability structures for U.S. and Chinese students at different educational levels are given by the analysis. A final discussion on the advanced quantitative assessment methodology and the pure mathematical methodology is presented at the end.
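As a minimal, illustrative sketch of the classical test theory side of that comparison (my own example, not code from the dissertation), item difficulty and corrected point-biserial discrimination can be computed directly from a binary response matrix:

```python
import numpy as np

def ctt_item_stats(responses):
    """Classical test theory item statistics for a students x items
    binary (0/1) response matrix.

    difficulty     : proportion of students answering each item correctly
    discrimination : corrected point-biserial, i.e. the correlation of
                     each item score with the total score on the
                     remaining items
    """
    R = np.asarray(responses, dtype=float)
    difficulty = R.mean(axis=0)
    total = R.sum(axis=1)
    discrimination = np.array([
        np.corrcoef(R[:, j], total - R[:, j])[0, 1]
        for j in range(R.shape[1])
    ])
    return difficulty, discrimination
```

Unlike IRT parameters, these statistics are sample-dependent: administer the same items to a stronger cohort and both numbers change, which is one root of the IRT/CTT differences the first part of the dissertation examines.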
Haegele, Justin A; Hodge, Samuel Russell
2015-10-01
There are basic philosophical and paradigmatic assumptions that guide scholarly research endeavors, including the methods used and the types of questions asked. Through this article, kinesiology faculty and students with interests in adapted physical activity are encouraged to understand the basic assumptions of applied behavior analysis (ABA) methodology for conducting, analyzing, and presenting research of high quality in this paradigm. The purposes of this viewpoint paper are to present information fundamental to understanding the assumptions undergirding research methodology in ABA, describe key aspects of single-subject research designs, and discuss common research designs and data-analysis strategies used in single-subject studies.
Cognitive-psychology expertise and the calculation of the probability of a wrongful conviction.
Rouder, Jeffrey N; Wixted, John T; Christenfeld, Nicholas J S
2018-05-08
Cognitive psychologists are familiar with how their expertise in understanding human perception, memory, and decision-making is applicable to the justice system. They may be less familiar with how their expertise in statistical decision-making and their comfort working in noisy real-world environments are just as applicable. Here we show how this expertise in ideal-observer models may be leveraged to calculate the probability of guilt of Gary Leiterman, a man convicted of murder on the basis of DNA evidence. We show, using elementary probability theory, that Leiterman is likely a victim of a tragic contamination event rather than a murderer. Making any calculation of the probability of guilt necessarily relies on subjective assumptions. The conclusion about Leiterman's innocence is not overly sensitive to the assumptions: the probability of innocence remains high for a wide range of reasonable assumptions. We note that cognitive psychologists may be well suited to make these calculations because as working scientists they may be comfortable with the role a reasonable degree of subjectivity plays in analysis.
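The style of calculation described here can be sketched with Bayes' rule. The numbers below are hypothetical placeholders, not figures from the Leiterman analysis; the point is that when the probability of the evidence under innocence is dominated by a plausible contamination event rather than a coincidental DNA match, the posterior probability of guilt can remain low even after a "match":

```python
def posterior_guilt(prior, p_evidence_given_guilt, p_evidence_given_innocent):
    """Posterior probability of guilt via Bayes' rule."""
    num = prior * p_evidence_given_guilt
    den = num + (1.0 - prior) * p_evidence_given_innocent
    return num / den

# Hypothetical inputs: even granting that a guilty person would match with
# probability 1, a modest per-sample contamination probability under
# innocence (1e-3) swamps a small prior probability of guilt (1e-4).
post = posterior_guilt(prior=1e-4,
                       p_evidence_given_guilt=1.0,
                       p_evidence_given_innocent=1e-3)
```

Varying the placeholder inputs over a wide range is exactly the sensitivity check the authors describe: the qualitative conclusion survives as long as contamination remains far more probable than a random match.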
Mæstad, Ottar; Norheim, Ole Frithjof
2012-11-01
The literature on how to combine efficiency and equity considerations in the social valuation of health allocations has borrowed extensively from applied welfare economics, including the literature on inequality measurement. By so doing, it has adopted normative assumptions that have been applied for evaluating the allocation of welfare (or income) rather than the allocation of health, including the assumption of a monotonically declining social marginal value of welfare/income/health. At the same time, empirical studies that have elicited social preferences for allocation of health have reported results that are seemingly incompatible with this assumption. There are two ways of addressing this inconsistency; we may censor the stated preferences by arguing that they cannot be supported by normative arguments, or we may reject or modify the analytical framework in order to accommodate the stated preferences. We argue that the stated preferences can be supported by normative reasoning and therefore conclude that one should be cautious in applying the standard welfare economic framework to the allocation of health. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Bellac, Michel Le
2014-11-01
The final form of quantum physics, in the particular case of wave mechanics, was established in the years 1925-1927 by Heisenberg, Schrödinger, Born and others, but the synthesis was the work of Bohr, who gave an epistemological interpretation of all the technicalities built up over those years; this interpretation will be examined briefly in Chapter 10. Although Einstein acknowledged the success of quantum mechanics in atomic, molecular and solid state physics, he disagreed deeply with Bohr's interpretation. For many years, he tried to find flaws in the formulation of quantum theory as it had been more or less accepted by a large majority of physicists, but his objections were brushed away by Bohr. However, in an article published in 1935 with Podolsky and Rosen, universally known under the acronym EPR, Einstein thought he had identified a difficulty in the by then standard interpretation. Bohr's obscure, and in part beside the point, answer showed that Einstein had hit a sensitive target. Nevertheless, until 1964, the so-called Bohr-Einstein debate remained purely on a philosophical level, and it was actually forgotten by most physicists, as the few who were aware of it thought it had no practical implications. In 1964, the Northern Irish physicist John Bell realized that the assumptions contained in the EPR article could be tested experimentally. These assumptions led to inequalities, the Bell inequalities, which were in contradiction with quantum mechanical predictions: as we shall see later on, it is extremely likely that the assumptions of the EPR article are not consistent with experiment, which, on the contrary, vindicates the predictions of quantum physics. In Section 3.2, the origin of Bell's inequalities will be explained with an intuitive example, then they will be compared with the predictions of quantum theory in Section 3.3, and finally their experimental status will be reviewed in Section 3.4.
The debate between Bohr and Einstein goes much beyond a simple controversy, which is after all almost eighty years old and has been settled today. In fact, the concept introduced in this debate, that of entanglement, lies at the heart of many very important developments of modern quantum physics, in particular all those linked to quantum information (Chapter 8). Moreover, we shall see that the phenomenon of non-local correlations compels us to revise in depth our space-time representation of quantum processes. These are the two reasons why a whole chapter is devoted to this debate.
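The quantitative core of that comparison can be previewed numerically: for the singlet state, quantum mechanics predicts a correlation E(a, b) = -cos(a - b) between spin measurements along directions a and b, and for suitably chosen angles the CHSH combination of four such correlations reaches 2√2, beyond the bound of 2 obeyed by the local models constrained by Bell's inequalities (a standard computation, sketched here for illustration):

```python
import numpy as np

def chsh(a, a2, b, b2):
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
    for singlet-state correlations E(x, y) = -cos(x - y)."""
    E = lambda x, y: -np.cos(x - y)
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# Angles that maximize the quantum violation: |S| = 2*sqrt(2) > 2
S = chsh(0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4)
```

Any local hidden-variable model of the EPR type must satisfy |S| ≤ 2, so the gap between 2 and 2√2 is precisely what the experiments reviewed in Section 3.4 test.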
RNA Editing Underlies Temperature Adaptation in K+ Channels from Polar Octopuses
Garrett, Sandra; Rosenthal, Joshua J.C.
2014-01-01
To operate in the extreme cold, ion channels from psychrophiles must have evolved structural changes to compensate for their thermal environment. A reasonable assumption would be that the underlying adaptations lie within the encoding genes. Here we show that delayed rectifier K+ channel genes from an Antarctic and a tropical octopus encode channels that differ at only four positions and display very similar behavior when expressed in Xenopus oocytes. However, the transcribed mRNAs are extensively edited, creating functional diversity. One editing site, which recodes an isoleucine to a valine in the channel’s pore, greatly accelerates gating kinetics by destabilizing the open state. This site is extensively edited in both Antarctic and Arctic species, but mostly unedited in tropical species. Thus A-to-I RNA editing can respond to the physical environment. PMID:22223739
Making Sense of Bell's Theorem and Quantum Nonlocality
NASA Astrophysics Data System (ADS)
Boughn, Stephen
2017-05-01
Bell's theorem has fascinated physicists and philosophers since his 1964 paper, which was written in response to the 1935 paper of Einstein, Podolsky, and Rosen. Bell's theorem and its many extensions have led to the claim that quantum mechanics and by inference nature herself are nonlocal in the sense that a measurement on a system by an observer at one location has an immediate effect on a distant entangled system (one with which the original system has previously interacted). Einstein was repulsed by such "spooky action at a distance" and was led to question whether quantum mechanics could provide a complete description of physical reality. In this paper I argue that quantum mechanics does not require spooky action at a distance of any kind and yet it is entirely reasonable to question the assumption that quantum mechanics can provide a complete description of physical reality. The magic of entangled quantum states has little to do with entanglement and everything to do with superposition, a property of all quantum systems and a foundational tenet of quantum mechanics.
NASA Astrophysics Data System (ADS)
Hooshyar, M.; Wang, D.
2016-12-01
The empirical proportionality relationship, which indicates that the ratio of cumulative surface runoff and infiltration to their corresponding potentials are equal, is the basis of the extensively used Soil Conservation Service Curve Number (SCS-CN) method. The objective of this paper is to provide the physical basis of the SCS-CN method and its proportionality hypothesis from the infiltration excess runoff generation perspective. To achieve this purpose, an analytical solution of Richards' equation is derived for ponded infiltration in shallow water table environment under the following boundary conditions: 1) the soil is saturated at the land surface; and 2) there is a no-flux boundary which moves downward. The solution is established based on the assumptions of negligible gravitational effect, constant soil water diffusivity, and hydrostatic soil moisture profile between the no-flux boundary and water table. Based on the derived analytical solution, the proportionality hypothesis is a reasonable approximation for rainfall partitioning at the early stage of ponded infiltration in areas with a shallow water table for coarse textured soils.
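For reference, the SCS-CN method itself partitions event rainfall P into direct runoff Q via a curve number CN. A minimal sketch in customary US units (the runoff equation is the standard one; the inputs in the usage below are illustrative):

```python
def scs_cn_runoff(P, CN, ia_ratio=0.2):
    """Direct runoff Q (inches) from event rainfall P (inches) by the
    SCS Curve Number method. S is the potential maximum retention and
    Ia = ia_ratio * S is the initial abstraction (0.2 is customary).
    """
    S = 1000.0 / CN - 10.0
    Ia = ia_ratio * S
    if P <= Ia:
        return 0.0                       # all rainfall abstracted, no runoff
    return (P - Ia) ** 2 / (P - Ia + S)
```

The proportionality hypothesis examined in the paper states Q/(P - Ia) = F/S, where F is cumulative retention after runoff begins; the runoff equation above follows from that hypothesis together with the water balance P - Ia = Q + F.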
GLOBEC: Global Ocean Ecosystems Dynamics: A component of the US Global Change Research Program
NASA Technical Reports Server (NTRS)
1991-01-01
GLOBEC (GLOBal ocean ECosystems dynamics) is a research initiative proposed by the oceanographic and fisheries communities to address the question of how changes in the global environment are expected to affect the abundance and production of animals in the sea. The approach to this problem is to develop a fundamental understanding of the mechanisms that determine both the abundance of key marine animal populations and their variances in space and time. The assumption is that the physical environment is a major contributor to patterns of abundance and production of marine animals, in large part because the planktonic life stages typical of most marine animals are intrinsically at the mercy of the fluid motions of the medium in which they live. Consequently, the authors reason that a logical approach to predicting the potential impact of a globally changing environment is to understand how the physical environment, both directly and indirectly, contributes to animal abundance and its variability in marine ecosystems. The plans for this coordinated study of the potential impact of global change on ocean ecosystems dynamics are discussed.
Another look through Heisenberg’s microscope
NASA Astrophysics Data System (ADS)
Boughn, Stephen; Reginatto, Marcel
2018-05-01
Heisenberg introduced his famous uncertainty relations in a seminal 1927 paper entitled The Physical Content of Quantum Kinematics and Mechanics. He motivated his arguments with a gedanken experiment, a gamma ray microscope to measure the position of a particle. A primary result was that, due to the quantum nature of light, there is an inherent uncertainty in the determinations of the particle's position and momentum dictated by an indeterminacy relation, δq δp ∼ h. Heisenberg offered this demonstration as 'a direct physical interpretation of the [quantum mechanical] equation pq - qp = -iℏ' but considered the indeterminacy relation to be much more than this. He also argued that it implies limitations on the very meanings of position and momentum and emphasised that these limitations are the source of the statistical character of quantum mechanics. In addition, Heisenberg hoped but was unable to demonstrate that the laws of quantum mechanics could be derived directly from the uncertainty relation. In this paper, we revisit Heisenberg's microscope and argue that the Schrödinger equation for a free particle does indeed follow from the indeterminacy relation together with reasonable statistical assumptions.
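As a numerical aside (my illustration, not part of the paper), the indeterminacy relation is saturated, up to the conventional factor, by a Gaussian wave packet: with ℏ = 1, a real Gaussian ψ(q) of width σ has Δq = σ and Δp = 1/(2σ), so Δq Δp = 1/2 independent of σ. This can be checked on a grid:

```python
import numpy as np

def uncertainty_product(sigma, L=20.0, n=4001):
    """Numerically evaluate the product (Δq)(Δp) for a Gaussian wave
    packet of width sigma, in units with hbar = 1. For a real
    wavefunction with <p> = 0, <p^2> = integral of |psi'(q)|^2 dq."""
    q = np.linspace(-L, L, n)
    h = q[1] - q[0]
    psi = np.exp(-q**2 / (4.0 * sigma**2))
    psi /= np.sqrt(np.sum(psi**2) * h)        # normalize on the grid
    dq = np.sqrt(np.sum(q**2 * psi**2) * h)   # Δq (since <q> = 0 by symmetry)
    dpsi = np.gradient(psi, q)
    dp = np.sqrt(np.sum(dpsi**2) * h)         # Δp
    return dq * dp
```

Non-Gaussian packets give a strictly larger product, which is the sharper statement Δq Δp ≥ ℏ/2 behind Heisenberg's order-of-magnitude relation.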
Women's motivations to have sex in casual and committed relationships with male and female partners.
Armstrong, Heather L; Reissing, Elke D
2015-05-01
Women report a wide variety of reasons to have sex (e.g., Meston & Buss, 2010), and while it is reasonable to assume that those reasons may vary based on the context of the relationship, this assumption has not yet been tested. The purpose of this study was to explore how relationship type, sexual attraction, and the gender of one's partner interact and affect the sexual motivations of women. A total of 510 women (361 who reported exclusively other-sex attraction and 149 who reported same-sex/bisexual attraction) completed the YSEX? questionnaire. Participants rated their sexual motivations for casual sex and sex in a committed relationship with male and/or female partners, depending on reported sexual attraction. Results showed that relationship type affected reported motivation for sex: physical motivations were more strongly endorsed for casual sex, whereas emotional motivations were more strongly endorsed for sex in committed relationships. No significant differences in motivation were reported between women who reported same-sex attraction and those who did not. Women who reported bisexual attraction and identified as being lesbian, bisexual, or another sexual minority reported no significant differences in motivation for sex with male or female partners. The results of this study highlight the importance of relationship context when discussing sexual motivation and suggest a high degree of similarity in motivation for women, regardless of sexual orientation or gender of partner.
A Bottom-Up Approach to Understanding Protein Layer Formation at Solid-Liquid Interfaces
Kastantin, Mark; Langdon, Blake B.; Schwartz, Daniel K.
2014-01-01
A common goal across different fields (e.g. separations, biosensors, biomaterials, pharmaceuticals) is to understand how protein behavior at solid-liquid interfaces is affected by environmental conditions. Temperature, pH, ionic strength, and the chemical and physical properties of the solid surface, among many factors, can control microscopic protein dynamics (e.g. adsorption, desorption, diffusion, aggregation) that contribute to macroscopic properties like time-dependent total protein surface coverage and protein structure. These relationships are typically studied through a top-down approach in which macroscopic observations are explained using analytical models that are based upon reasonable, but not universally true, simplifying assumptions about microscopic protein dynamics. Conclusions connecting microscopic dynamics to environmental factors can be heavily biased by potentially incorrect assumptions. In contrast, more complicated models avoid several of the common assumptions but require many parameters that have overlapping effects on predictions of macroscopic, average protein properties. Consequently, these models are poorly suited for the top-down approach. Because the sophistication incorporated into these models may ultimately prove essential to understanding interfacial protein behavior, this article proposes a bottom-up approach in which direct observations of microscopic protein dynamics specify parameters in complicated models, which then generate macroscopic predictions to compare with experiment. In this framework, single-molecule tracking has proven capable of making direct measurements of microscopic protein dynamics, but must be complemented by modeling to combine and extrapolate many independent microscopic observations to the macro-scale. The bottom-up approach is expected to better connect environmental factors to macroscopic protein behavior, thereby guiding rational choices that promote desirable protein behaviors. PMID:24484895
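To make the single-molecule-tracking side concrete, one of the microscopic dynamics mentioned above, interfacial diffusion, is commonly quantified through the time-averaged mean-squared displacement of tracked trajectories (a generic sketch, not the authors' analysis pipeline):

```python
import numpy as np

def mean_squared_displacement(track, max_lag):
    """Time-averaged MSD of a single trajectory.

    track   : (T, d) array of positions (e.g. x, y in the image plane)
    max_lag : largest lag, in frames, to evaluate
    Returns msd[k-1] = <|r(t + k) - r(t)|^2> for k = 1..max_lag.
    """
    track = np.asarray(track, dtype=float)
    msd = np.empty(max_lag)
    for k in range(1, max_lag + 1):
        disp = track[k:] - track[:-k]                 # displacements at lag k
        msd[k - 1] = np.mean(np.sum(disp**2, axis=1))
    return msd
```

For Brownian motion in d dimensions, MSD(τ) ≈ 2dDτ, so a linear fit over the first few lags yields a per-molecule diffusion coefficient D, one of the microscopic parameters a bottom-up model would take as input.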
Of mental models, assumptions and heuristics: The case of acids and acid strength
NASA Astrophysics Data System (ADS)
McClary, Lakeisha Michelle
This study explored what cognitive resources (i.e., units of knowledge necessary to learn) first-semester organic chemistry students used to make decisions about acid strength and how those resources guided the prediction, explanation and justification of trends in acid strength. We were specifically interested in the identifying and characterizing the mental models, assumptions and heuristics that students relied upon to make their decisions, in most cases under time constraints. The views about acids and acid strength were investigated for twenty undergraduate students. Data sources for this study included written responses and individual interviews. The data was analyzed using a qualitative methodology to answer five research questions. Data analysis regarding these research questions was based on existing theoretical frameworks: problem representation (Chi, Feltovich & Glaser, 1981), mental models (Johnson-Laird, 1983); intuitive assumptions (Talanquer, 2006), and heuristics (Evans, 2008). These frameworks were combined to develop the framework from which our data were analyzed. Results indicated that first-semester organic chemistry students' use of cognitive resources was complex and dependent on their understanding of the behavior of acids. Expressed mental models were generated using prior knowledge and assumptions about acids and acid strength; these models were then employed to make decisions. Explicit and implicit features of the compounds in each task mediated participants' attention, which triggered the use of a very limited number of heuristics, or shortcut reasoning strategies. Many students, however, were able to apply more effortful analytic reasoning, though correct trends were predicted infrequently. Most students continued to use their mental models, assumptions and heuristics to explain a given trend in acid strength and to justify their predicted trends, but the tasks influenced a few students to shift from one model to another model. 
An emergent finding from this project was that the problem representation greatly influenced students' ability to make correct predictions in acid strength.
Economics in "Global Health 2035": a sensitivity analysis of the value of a life year estimates.
Chang, Angela Y; Robinson, Lisa A; Hammitt, James K; Resch, Stephen C
2017-06-01
In "Global health 2035: a world converging within a generation," The Lancet Commission on Investing in Health (CIH) adds the value of increased life expectancy to the value of growth in gross domestic product (GDP) when assessing national well-being. To value changes in life expectancy, the CIH relies on several strong assumptions to bridge gaps in the empirical research. It finds that the value of a life year (VLY) averages 2.3 times GDP per capita for low- and middle-income countries (LMICs) assuming the changes in life expectancy they experienced from 2000 to 2011 are permanent. The CIH VLY estimate is based on a specific shift in population life expectancy and includes a 50 percent reduction for children ages 0 through 4. We investigate the sensitivity of this estimate to the underlying assumptions, including the effects of income, age, and life expectancy, and the sequencing of the calculations. We find that reasonable alternative assumptions regarding the effects of income, age, and life expectancy may reduce the VLY estimates to 0.2 to 2.1 times GDP per capita for LMICs. Removing the reduction for young children increases the VLY, while reversing the sequencing of the calculations reduces the VLY. Because the VLY is sensitive to the underlying assumptions, analysts interested in applying this approach elsewhere must tailor the estimates to the impacts of the intervention and the characteristics of the affected population. Analysts should test the sensitivity of their conclusions to reasonable alternative assumptions. More work is needed to investigate options for improving the approach.
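One place such assumptions enter concretely is the conversion of a value per statistical life (VSL) into a VLY. A common generic approach, sketched here with placeholder numbers rather than the CIH's actual figures, treats the VSL as the discounted sum of equal annual values over remaining life expectancy, so the implied VLY depends directly on the assumed discount rate:

```python
def vly_from_vsl(vsl, remaining_years, discount_rate):
    """Implied value per life year when a VSL is annuitized over the
    remaining life expectancy L at discount rate r:
        VSL = VLY * sum_{t=1..L} (1 + r)^-t
    (an illustrative, generic step; parameters are placeholders).
    """
    r, L = discount_rate, remaining_years
    if r == 0.0:
        annuity = float(L)
    else:
        annuity = (1.0 - (1.0 + r) ** -L) / r
    return vsl / annuity
```

Raising the discount rate shrinks the annuity factor and therefore raises the implied VLY, which is one mechanism by which "reasonable alternative assumptions" can move the estimate across the wide range the authors report.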
Unexpected Results are Usually Wrong, but Often Interesting
NASA Astrophysics Data System (ADS)
Huber, M.
2014-12-01
In climate modeling, an unexpected result is usually wrong, arising from some sort of mistake. Despite the fact that we all bemoan uncertainty in climate, the field is underlain by a robust, successful body of theory, and any properly conducted modeling experiment is posed and conducted within that context. Consequently, if results from a complex climate model disagree with theory or with expectations from simpler models, much skepticism is in order. But this exposes the fundamental tension of using complex, sophisticated models. If simple models and theory were perfect, there would be no reason for complex models; the entire point of sophisticated models is to see if unexpected phenomena arise as emergent properties of the system. In this talk, I will step through some paleoclimate examples, drawn from my own work, of unexpected results that emerge from complex climate models arising from mistakes of two kinds. The first kind is what I call a 'smart mistake': an intentional incorporation of assumptions, boundary conditions, or physics that violates theoretical or observational constraints. The second, a 'dumb mistake', is just that: an unintentional violation. Analysis of such mistaken simulations provides some potentially novel and certainly interesting insights into what is possible and right in paleoclimate modeling by forcing the reexamination of well-held assumptions and theories.
ERIC Educational Resources Information Center
Grotzer, Tina A.; Tutwiler, M. Shane
2014-01-01
This article considers a set of well-researched default assumptions that people make in reasoning about complex causality and argues that, in part, they result from the forms of causal induction that we engage in and the type of information available in complex environments. It considers how information often falls outside our attentional frame…
Aging and social expenditures in Italy: some issues associated with population projections.
Terra Abrami, V
1990-01-01
"After describing the main results of the recent Italian population projections, and some possible consequences...aging may have on social expenditures, this paper focuses on attempts to improve the accuracy of development assumptions, with special regard to natural components. Emphasis is placed on the importance of applying specific methodological tools to define self-explanatory assumptions for fertility and mortality and to produce projections which could be considered, with reasonable limitations, as real forecasts." excerpt
How rational should bioethics be? The value of empirical approaches.
Alvarez, A A
2001-10-01
Rational justification of claims with empirical content calls for empirical and not only normative philosophical investigation. Empirical approaches to bioethics are epistemically valuable, i.e., such methods may be necessary in providing and verifying basic knowledge about cultural values and norms. Our assumptions in moral reasoning can be verified or corrected using these methods. Moral arguments can be initiated or adjudicated by data drawn from empirical investigation. One may argue that individualistic informed consent, for example, is not compatible with the Asian communitarian orientation. But this normative claim rests on an empirical assumption that may be contrary to the fact that some Asians do value and argue for informed consent. Is it necessary and factual to neatly characterize some cultures as individualistic and some as communitarian? Empirical investigation can provide a reasonable way to inform such generalizations. In a multi-cultural context, such as in the Philippines, there is a need to investigate the nature of the local ethos before making any appeal to authenticity. Otherwise we may succumb to the same ethical imperialism we are trying hard to resist. Normative claims that involve empirical premises cannot be reasonably verified or evaluated without utilizing empirical methods along with philosophical reflection. The integration of empirical methods into the standard normative approach to moral reasoning should be reasonably guided by the epistemic demands of claims arising from cross-cultural discourse in bioethics.
Patterns of Clinical Reasoning in Physical Therapist Students.
Gilliland, Sarah; Wainwright, Susan Flannery
2017-05-01
Clinical reasoning is a complex, nonlinear problem-solving process that is influenced by models of practice. The development of physical therapists' clinical reasoning abilities is a crucial yet underresearched aspect of entry-level (professional) physical therapist education. The purpose of this qualitative study was to examine the types of clinical reasoning strategies physical therapist students engage in during a patient encounter. A qualitative descriptive case study design involving within- and across-case analysis was used. Eight second-year, professional physical therapist students from 2 different programs completed an evaluation and initial intervention for a standardized patient followed by a retrospective think-aloud interview to explicate their reasoning processes. Participants' clinical reasoning strategies were examined using a 2-stage qualitative method of thematic analysis. Participants demonstrated consistent signs of development of physical therapy-specific reasoning processes, yet varied in their approach to the case and use of reflection. Participants who gave greater attention to patient education and empowerment also demonstrated greater use of reflection-in-action during the patient encounter. One negative case illustrates the variability in the rate at which students may develop these abilities. Participants demonstrated development toward physical therapist-specific clinical reasoning, yet demonstrated qualitatively different approaches to the patient encounter. Multiple factors, including the use of reflection-in-action, may enable students to develop greater flexibility in their reasoning processes. © 2017 American Physical Therapy Association
NASA Astrophysics Data System (ADS)
Inam, Azhar; Adamowski, Jan; Prasher, Shiv; Halbe, Johannes; Malard, Julien; Albano, Raffaele
2017-08-01
Effective policies, leading to sustainable management solutions for land and water resources, require a full understanding of interactions between socio-economic and physical processes. However, the complex nature of these interactions, combined with limited stakeholder engagement, hinders the incorporation of socio-economic components into physical models. The present study addresses this challenge by integrating the physical Spatial Agro Hydro Salinity Model (SAHYSMOD) with a participatory group-built system dynamics model (GBSDM) that includes socio-economic factors. A stepwise process to quantify the GBSDM is presented, along with governing equations and model assumptions. Sub-modules of the GBSDM, describing agricultural, economic, water and farm management factors, are linked together with feedbacks and finally coupled with the physically based SAHYSMOD model through commonly used tools (i.e., MS Excel and a Python script). The overall integrated model (GBSDM-SAHYSMOD) can be used to help facilitate the role of stakeholders with limited expertise and resources in model and policy development and implementation. Following the development of the integrated model, a testing methodology was used to validate the structure and behavior of the integrated model. Model robustness under different operating conditions was also assessed. The model structure was able to produce anticipated real behaviours under the tested scenarios, from which it can be concluded that the formulated structures generate the right behaviour for the right reasons.
A class of simple bouncing and late-time accelerating cosmologies in f(R) gravity
NASA Astrophysics Data System (ADS)
Kuiroukidis, A.
We consider the field equations for a flat FRW cosmological model, given by Eq. (??), in an a priori generic f(R) gravity model and cast them into a completely normalized and dimensionless system of ODEs for the scale factor and the function f(R), with respect to the scalar curvature R. It is shown that, under reasonable assumptions, namely a power-law functional form for the f(R) gravity model, one can produce simple analytical and numerical solutions describing bouncing cosmological models that in addition exhibit late-time acceleration. The power-law form for the f(R) gravity model is typically considered in the literature as the most concrete, reasonable, practical and viable assumption [see S. D. Odintsov and V. K. Oikonomou, Phys. Rev. D 90 (2014) 124083, arXiv:1410.8183 [gr-qc]].
Cognitive architectures, rationality, and next-generation AI: a prolegomenon
NASA Astrophysics Data System (ADS)
Bello, Paul; Bringsjord, Selmer; Yang, Yingrui
2004-08-01
Computational models that give us insight into the behavior of individuals and the organizations to which they belong will be invaluable assets in our nation's war against terrorists, and state sponsorship of terror organizations. Reasoning and decision-making are essential ingredients in the formula for human cognition, yet the two have almost exclusively been studied in isolation from one another. While we have witnessed the emergence of strong traditions in both symbolic logic and decision theory, we have yet to describe an acceptable interface between the two. Mathematical formulations of decision-making and reasoning have been developed extensively, but both fields make assumptions concerning human rationality that are untenable at best. True to this tradition, artificial intelligence has developed architectures for intelligent agents under these same assumptions. While these digital models of "cognition" tend to perform superbly, given their tremendous capacity for calculation, it is hardly reasonable to develop simulacra of human performance using these techniques. We will discuss some of the challenges associated with the problem of developing integrated cognitive systems for use in modelling, simulation, and analysis, along with some ideas for the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bauböck, Michi; Psaltis, Dimitrios; Özel, Feryal, E-mail: mbaubock@email.arizona.edu
We calculate the effects of spot size on pulse profiles of moderately rotating neutron stars. Specifically, we quantify the bias introduced in radius measurements from the common assumption that spots are infinitesimally small. We find that this assumption is reasonable for spots smaller than 10°–18° and leads to errors that are ≤10% in the radius measurement, depending on the location of the spot and the inclination of the observer. We consider the implications of our results for neutron star radius measurements with the upcoming and planned X-ray missions NICER and LOFT. We calculate the expected spot size for different classes of sources and investigate the circumstances under which the assumption of a small spot is justified.
Is a time symmetric interpretation of quantum theory possible without retrocausality?
NASA Astrophysics Data System (ADS)
Leifer, Matthew S.; Pusey, Matthew F.
2017-06-01
Huw Price has proposed an argument that suggests a time symmetric ontology for quantum theory must necessarily be retrocausal, i.e. it must involve influences that travel backwards in time. One of Price's assumptions is that the quantum state is a state of reality. However, one of the reasons for exploring retrocausality is that it offers the potential for evading the consequences of no-go theorems, including recent proofs of the reality of the quantum state. Here, we show that this assumption can be replaced by a different assumption, called λ-mediation, that plausibly holds independently of the status of the quantum state. We also reformulate the other assumptions behind the argument to place them in a more general framework and pin down the notion of time symmetry involved more precisely. We show that our assumptions imply a timelike analogue of Bell's local causality criterion and, in doing so, give a new interpretation of timelike violations of Bell inequalities. Namely, they show the impossibility of a (non-retrocausal) time symmetric ontology.
Overview of physical models of liquid entrainment in annular gas-liquid flow
NASA Astrophysics Data System (ADS)
Cherdantsev, Andrey V.
2018-03-01
A number of recent papers devoted to the development of physically based models for prediction of liquid entrainment in the annular regime of two-phase flow are analyzed. In these models, the shearing-off of disturbance-wave crests by the gas drag force is taken to be the physical mechanism of the entrainment phenomenon. The models are based on a number of assumptions about the wavy structure, including inception of disturbance waves due to Kelvin-Helmholtz instability, a linear velocity profile inside the liquid film, and a high degree of three-dimensionality of the disturbance waves. The validity of these assumptions is analyzed by comparison with modern experimental observations. It is shown that nearly every assumption is in strong qualitative and quantitative disagreement with experiments, which leads to massive discrepancies between the modeled and real properties of the disturbance waves. As a result, such models over-predict the entrained fraction by several orders of magnitude. The discrepancy is usually reduced using various kinds of empirical corrections. This, combined with the empiricism already included in the models, turns them into another kind of empirical correlation rather than physically based models.
Mapping student thinking in chemical synthesis
NASA Astrophysics Data System (ADS)
Weinrich, Melissa
In order to support the development of learning progressions about central ideas and practices in different disciplines, we need detailed analyses of the implicit assumptions and reasoning strategies that guide students' thinking at different educational levels. In the particular case of chemistry, understanding how new chemical substances are produced (chemical synthesis) is of critical importance. Thus, we have used a qualitative research approach based on individual interviews with first semester general chemistry students (n = 16), second semester organic chemistry students (n = 15), advanced undergraduates (n = 9), first year graduate students (n = 15), and PhD candidates (n = 16) to better characterize diverse students' underlying cognitive elements (conceptual modes and modes of reasoning) when thinking about chemical synthesis. Our results reveal a great variability in the cognitive resources and strategies used by students with different levels of training in the discipline to make decisions, particularly at intermediate levels of expertise. The specific nature of the task had a strong influence on the conceptual sophistication and mode of reasoning that students exhibited. Nevertheless, our data analysis has allowed us to identify common modes of reasoning and assumptions that seem to guide students' thinking at different educational levels. Our results should facilitate the development of learning progressions that help improve chemistry instruction, curriculum, and assessment.
Code of Federal Regulations, 2013 CFR
2013-10-01
... physical, occupational, speech, and other therapists, and services of other health specialists (other than... 42 Public Health 2 2013-10-01 2013-10-01 false Reasonable cost of physical and other therapy... SKILLED NURSING FACILITIES Specific Categories of Costs § 413.106 Reasonable cost of physical and other...
Code of Federal Regulations, 2012 CFR
2012-10-01
... physical, occupational, speech, and other therapists, and services of other health specialists (other than... 42 Public Health 2 2012-10-01 2012-10-01 false Reasonable cost of physical and other therapy... SKILLED NURSING FACILITIES Specific Categories of Costs § 413.106 Reasonable cost of physical and other...
Code of Federal Regulations, 2014 CFR
2014-10-01
... physical, occupational, speech, and other therapists, and services of other health specialists (other than... 42 Public Health 2 2014-10-01 2014-10-01 false Reasonable cost of physical and other therapy... SKILLED NURSING FACILITIES Specific Categories of Costs § 413.106 Reasonable cost of physical and other...
Gender and Physics: Feminist Philosophy and Science Education
ERIC Educational Resources Information Center
Rolin, Kristina
2008-01-01
Physics education reform movements should pay attention to feminist analyses of gender in the culture of physics for two reasons. One reason is that feminist analyses contribute to an understanding of a "chilly climate" women encounter in many physics university departments. Another reason is that feminist analyses reveal that certain styles of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Adam J.; Fox, Patrick J.; Kahn, Yonatan
Results from direct detection experiments are typically interpreted by employing an assumption about the dark matter velocity distribution, with results presented in the m_χ−σ_n plane. Recently, methods which are independent of the DM halo velocity distribution have been developed which present results in the v_min−g̃ plane, but these in turn require an assumption on the dark matter mass. Here we present an extension of these halo-independent methods for dark matter direct detection which does not require a fiducial choice of the dark matter mass. With a change of variables from v_min to nuclear recoil momentum (p_R), the full halo-independent content of an experimental result for any dark matter mass can be condensed into a single plot as a function of a new halo integral variable, which we call h̃(p_R). The entire family of conventional halo-independent g̃(v_min) plots for all DM masses is directly found from the single h̃(p_R) plot through a simple rescaling of axes. By considering results in h̃(p_R) space, one can determine whether two experiments are inconsistent for all masses and all physically possible halos, or for what range of dark matter masses the results are inconsistent for all halos, without the necessity of multiple g̃(v_min) plots for different DM masses. We conduct a sample analysis comparing the CDMS II Si events to the null results from LUX, XENON10, and SuperCDMS using our method and discuss how the results can be strengthened by imposing the physically reasonable requirement of a finite halo escape velocity.
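The axis rescaling described above follows from elastic scattering kinematics, where the minimum DM velocity needed to impart recoil momentum p_R is v_min = p_R/(2μ), with μ the DM-nucleus reduced mass. A minimal sketch under that standard assumption (function names and numerical values are hypothetical, in schematic natural units):

```python
# Sketch of the halo-independent axis rescaling described above.
# Assumes elastic scattering kinematics: v_min = p_R / (2 * mu), where
# mu is the DM-nucleus reduced mass. Masses are in GeV; units schematic.

def reduced_mass(m_chi, m_nucleus):
    """Reduced mass of the DM-nucleus system."""
    return m_chi * m_nucleus / (m_chi + m_nucleus)

def vmin_from_recoil_momentum(p_r, m_chi, m_nucleus):
    """Map one recoil-momentum point to v_min for an assumed DM mass."""
    return p_r / (2.0 * reduced_mass(m_chi, m_nucleus))

# The same p_R point lands at a different v_min for each assumed DM mass,
# which is why one h(p_R) plot encodes the whole family of g(v_min) plots:
for m_chi in (10.0, 100.0):  # hypothetical DM masses, GeV
    print(m_chi, vmin_from_recoil_momentum(0.1, m_chi, 28.0))  # Si nucleus ~28 GeV
```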
Hall-Effect Thruster Simulations with 2-D Electron Transport and Hydrodynamic Ions
NASA Technical Reports Server (NTRS)
Mikellides, Ioannis G.; Katz, Ira; Hofer, Richard H.; Goebel, Dan M.
2009-01-01
A computational approach that has been used extensively in the last two decades for Hall thruster simulations is to solve a diffusion equation and energy conservation law for the electrons in a direction that is perpendicular to the magnetic field, and use discrete-particle methods for the heavy species. This "hybrid" approach has allowed for the capture of bulk plasma phenomena inside these thrusters within reasonable computational times. Regions of the thruster with complex magnetic field arrangements (such as those near eroded walls and magnets) and/or reduced Hall parameter (such as those near the anode and the cathode plume) challenge the validity of the quasi-one-dimensional assumption for the electrons. This paper reports on the development of a computer code that solves numerically the 2-D axisymmetric vector form of Ohm's law, with no assumptions regarding the rate of electron transport in the parallel and perpendicular directions. The numerical challenges related to the large disparity of the transport coefficients in the two directions are met by solving the equations in a computational mesh that is aligned with the magnetic field. The fully-2D approach allows for a large physical domain that extends more than five times the thruster channel length in the axial direction, and encompasses the cathode boundary. Ions are treated as an isothermal, cold (relative to the electrons) fluid, accounting for charge-exchange and multiple-ionization collisions in the momentum equations. A first series of simulations of two Hall thrusters, namely the BPT-4000 and a 6-kW laboratory thruster, quantifies the significance of ion diffusion in the anode region and the importance of the extended physical domain on studies related to the impact of the transport coefficients on the electron flow field.
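The "large disparity of the transport coefficients" motivating the field-aligned mesh can be illustrated with the classical cross-field scaling μ_⊥ = μ_∥/(1 + Ω²), where Ω is the Hall parameter (a standard textbook relation used here for illustration, not the paper's actual code):

```python
# Illustration of the transport-coefficient disparity that motivates the
# magnetic-field-aligned mesh described above. Classical scaling only;
# the Hall-parameter value below is a typical order of magnitude.

def perpendicular_mobility(mu_parallel, hall_parameter):
    """Classical cross-field mobility: mu_perp = mu_par / (1 + Omega^2)."""
    return mu_parallel / (1.0 + hall_parameter ** 2)

# For a Hall parameter of order 100, the parallel and perpendicular
# coefficients differ by roughly four orders of magnitude:
ratio = perpendicular_mobility(1.0, 100.0)
print(ratio)  # ~1e-4
```

A disparity this large makes naive discretization on an unaligned mesh numerically stiff, which is the stated reason for solving Ohm's law on a mesh aligned with the magnetic field.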
77 FR 76380 - Partner's Distributive Share
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-28
...'s Distributive Share AGENCY: Internal Revenue Service (IRS), Treasury. ACTION: Final regulations... partnership's allocations are substantial. However, this commenter also explained that many partnerships are... partnership. This commenter further explained that, provided the partnership's assumptions are reasonable...
The implications of climate change on pavement performance and design.
DOT National Transportation Integrated Search
2011-09-25
Pavements are designed based on historic climatic patterns, reflecting local climate and : incorporating assumptions about a reasonable range of temperatures and precipitation levels. : Given anticipated climate changes and the inherent uncertainty a...
Renormalizing Entanglement Distillation.
Waeldchen, Stephan; Gertis, Janina; Campbell, Earl T; Eisert, Jens
2016-01-15
Entanglement distillation refers to the task of transforming a collection of weakly entangled pairs into fewer highly entangled ones. It is a core ingredient in quantum repeater protocols, which are needed to transmit entanglement over arbitrary distances in order to realize quantum key distribution schemes. Usually, it is assumed that the initial entangled pairs are identically and independently distributed and are uncorrelated with each other, an assumption that might not be reasonable at all in any entanglement generation process involving memory channels. Here, we introduce a framework that captures entanglement distillation in the presence of natural correlations arising from memory channels. Conceptually, we bring together ideas from condensed-matter physics, namely renormalization and matrix-product states and operators, with those of local entanglement manipulation, Markov chain mixing, and quantum error correction. We identify meaningful parameter regions for which we prove convergence to maximally entangled states, arising as the fixed points of a matrix-product operator renormalization flow.
NASA Astrophysics Data System (ADS)
Tornambe, Amedeo
1989-08-01
Theoretical rates of mergings of envelope-deprived components of binary systems, which can give rise to supernova events, are described. The effects of the various assumptions on the physical properties of the progenitor system and of its evolutionary behavior through common envelope phases are discussed. Four cases have been analyzed: CO-CO, He-CO, and He-He double degenerate mergings, and He star-CO dwarf merging. It is found that, above a critical efficiency of the common envelope action in system shrinkage, the rate of CO-CO mergings is not strongly sensitive to the efficiency. Below this critical value, no CO-CO systems will survive for times larger than a few Gyr. In contrast, He-CO dwarf systems will continue to merge at a reasonable rate up to 20 Gyr and beyond, even under extreme conditions.
An event-based architecture for solving constraint satisfaction problems
Mostafa, Hesham; Müller, Lorenz K.; Indiveri, Giacomo
2015-01-01
Constraint satisfaction problems are ubiquitous in many domains. They are typically solved using conventional digital computing architectures that do not reflect the distributed nature of many of these problems, and are thus ill-suited for solving them. Here we present a parallel analogue/digital hardware architecture specifically designed to solve such problems. We cast constraint satisfaction problems as networks of stereotyped nodes that communicate using digital pulses, or events. Each node contains an oscillator implemented using analogue circuits. The non-repeating phase relations among the oscillators drive the exploration of the solution space. We show that this hardware architecture can yield state-of-the-art performance on random SAT problems under reasonable assumptions on the implementation. We present measurements from a prototype electronic chip to demonstrate that a physical implementation of the proposed architecture is robust to practical non-idealities and to validate the theory proposed. PMID:26642827
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Ke; Linden, Tim, E-mail: kefang@umd.edu, E-mail: linden.70@osu.edu
Radio observations at multiple frequencies have detected a significant isotropic emission component between 22 MHz and 10 GHz, commonly termed the ARCADE-2 Excess. The origin of this radio emission is unknown, as the intensity, spectrum and isotropy of the signal are difficult to model with either traditional astrophysical mechanisms or novel physics such as dark matter annihilation. We posit a new model capable of explaining the key components of the excess radio emission. Specifically, we show that the re-acceleration of non-thermal electrons via turbulence in merging galaxy clusters is capable of explaining the intensity, spectrum, and isotropy of the ARCADE-2 data. We examine the parameter spaces of cluster re-acceleration, magnetic field, and merger rate, finding that the radio excess can be reproduced under reasonable assumptions for each. Finally, we point out that future observations will definitively confirm or rule out the contribution of cluster mergers to the isotropic radio background.
NASA Astrophysics Data System (ADS)
Lei, Fan; Li, Xiaoping; Liu, Yanming; Liu, Donglin; Yang, Min; Yu, Yuanyuan
2018-01-01
A two-dimensional axisymmetric inductively coupled plasma (ICP) model and its implementation in the COMSOL Multiphysics simulation platform are described. Specifically, a large-size ICP generator filled with argon is simulated in this study. Distributions of the number density and temperature of electrons are obtained and compared for various input power and pressure settings. In addition, the electron trajectory distribution is obtained in simulation. Finally, the simulation results are compared against experimental data to assess the veracity of the two-dimensional fluid model. Approximate agreement was found (the variation tendencies are the same). The main reasons for the discrepancies in numerical magnitude are the assumption of Maxwellian and Druyvesteyn distributions for the electron energy and the lack of collision cross-section and reaction-rate data for argon plasma.
Economics in “Global Health 2035”: a sensitivity analysis of the value of a life year estimates
Chang, Angela Y; Robinson, Lisa A; Hammitt, James K; Resch, Stephen C
2017-01-01
Background In “Global health 2035: a world converging within a generation,” The Lancet Commission on Investing in Health (CIH) adds the value of increased life expectancy to the value of growth in gross domestic product (GDP) when assessing national well-being. To value changes in life expectancy, the CIH relies on several strong assumptions to bridge gaps in the empirical research. It finds that the value of a life year (VLY) averages 2.3 times GDP per capita for low- and middle-income countries (LMICs) assuming the changes in life expectancy they experienced from 2000 to 2011 are permanent. Methods The CIH VLY estimate is based on a specific shift in population life expectancy and includes a 50 percent reduction for children ages 0 through 4. We investigate the sensitivity of this estimate to the underlying assumptions, including the effects of income, age, and life expectancy, and the sequencing of the calculations. Findings We find that reasonable alternative assumptions regarding the effects of income, age, and life expectancy may reduce the VLY estimates to 0.2 to 2.1 times GDP per capita for LMICs. Removing the reduction for young children increases the VLY, while reversing the sequencing of the calculations reduces the VLY. Conclusion Because the VLY is sensitive to the underlying assumptions, analysts interested in applying this approach elsewhere must tailor the estimates to the impacts of the intervention and the characteristics of the affected population. Analysts should test the sensitivity of their conclusions to reasonable alternative assumptions. More work is needed to investigate options for improving the approach. PMID:28400950
Existence of ``free will'' as a problem of physics
NASA Astrophysics Data System (ADS)
Peres, Asher
1986-06-01
The proof of Bell's inequality is based on the assumption that distant observers can freely and independently choose their experiments. As Bell's inequality is experimentally violated, it appears that distant physical systems may behave as a single, nonlocal, indivisible entity. This apparent contradiction is resolved. It is shown that the “free will” assumption is, under usual circumstances, an excellent approximation. I have set before you life and death, blessing and cursing: therefore choose life.... — Deuteronomy XXX, 19
Role of physical properties of liquids in cavitation erosion
NASA Technical Reports Server (NTRS)
Thiruvengadam, A.
1974-01-01
The dependence of erosion rates on the ambient temperature of water is discussed. The assumption that the gas inside the bubble is compressed adiabatically during collapse gives better agreement with experiments than the assumption that the gas is isothermally compressed. Acoustic impedance is an important liquid parameter that governs the erosion intensity in vibratory devices. The investigation reveals that the major physical properties of liquids governing the intensity of erosion include density, sound speed, surface tension, vapor pressure, gas content, and nuclei distribution.
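The contrast between the two compression assumptions in the abstract above can be made concrete for an ideal gas (a minimal sketch with hypothetical values; real bubble collapse involves additional physics such as vapor condensation):

```python
# Sketch contrasting the two gas-compression assumptions discussed above.
# For an ideal gas compressed from volume V0 to V:
#   isothermal:  p = p0 * (V0 / V)
#   adiabatic:   p = p0 * (V0 / V) ** gamma  (gamma = 5/3, monatomic gas)
# All numbers below are hypothetical.

def isothermal_pressure(p0, v0, v):
    """Final pressure under isothermal compression (pV = const)."""
    return p0 * (v0 / v)

def adiabatic_pressure(p0, v0, v, gamma=5.0 / 3.0):
    """Final pressure under adiabatic compression (pV^gamma = const)."""
    return p0 * (v0 / v) ** gamma

# Compressing a bubble to 1/100 of its initial volume:
print(isothermal_pressure(1.0, 1.0, 0.01))  # ~100x initial pressure
print(adiabatic_pressure(1.0, 1.0, 0.01))   # ~2000x initial pressure
```

The much higher final pressure (and temperature) under the adiabatic assumption illustrates why it yields more intense collapse, consistent with the better experimental agreement reported above.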
Using Socratic Questioning in the Classroom.
ERIC Educational Resources Information Center
Moore, Lori; Rudd, Rick
2002-01-01
Describes the Socratic questioning method and discusses its use in the agricultural education classroom. Presents a four-step model: origin and source of point of view; support, reasons, evidence, and assumptions; conflicting views; and implications and consequences. (JOW)
Resistances to Knowing in the Nuclear Age.
ERIC Educational Resources Information Center
Mack, John E.
1984-01-01
Explores psychological reasons why educators and parents resist dealing with the issue of nuclear war. Describing individual resistance (avoidance) and collective resistance (commitment to a nation's economic and political assumptions), the author discusses implications for nuclear education. (SK)
The Torsion of Members Having Sections Common in Aircraft Construction
NASA Technical Reports Server (NTRS)
Trayer, George W; March, H W
1930-01-01
Within recent years a great variety of approximate torsion formulas and drafting-room processes have been advocated. In some of these, especially where mathematical considerations are involved, the results are extremely complex and are not generally intelligible to engineers. The principal object of this investigation was to determine, by experiment and theoretical investigation, how accurate the more common of these formulas are and on what assumptions they are founded and, if none of the proposed methods proved to be reasonably accurate in practice, to produce simple, practical formulas from reasonably correct assumptions, backed by experiment. A second object was to collect in readily accessible form the most useful of the known results for the more common sections. Formulas for all the important solid sections that have yielded to mathematical treatment are listed. Then follows a discussion of the torsion of tubular rods, with formulas both rigorous and approximate.
Mellers, B A; Schwartz, A; Cooke, A D
1998-01-01
For many decades, research in judgment and decision making has examined behavioral violations of rational choice theory. In that framework, rationality is expressed as a single correct decision shared by experimenters and subjects that satisfies internal coherence within a set of preferences and beliefs. Outside of psychology, social scientists are now debating the need to modify rational choice theory with behavioral assumptions. Within psychology, researchers are debating assumptions about errors for many different definitions of rationality. Alternative frameworks are being proposed. These frameworks view decisions as more reasonable and adaptive than previously thought; "rule following" is one example. Rule following, which occurs when a rule or norm is applied to a situation, often minimizes effort and provides satisfying solutions that are "good enough," though not necessarily the best. When rules are ambiguous, people look for reasons to guide their decisions. They may also let their emotions take charge. This chapter presents recent research on judgment and decision making from traditional and alternative frameworks.
The Moral Insignificance of Self‐consciousness
2017-01-01
Abstract In this paper, I examine the claim that self‐consciousness is highly morally significant, such that the fact that an entity is self‐conscious generates strong moral reasons against harming or killing that entity. This claim is apparently very intuitive, but I argue it is false. I consider two ways to defend this claim: one indirect, the other direct. The best‐known arguments relevant to self‐consciousness's significance take the indirect route. I examine them and argue that (a) in various ways they depend on unwarranted assumptions about self‐consciousness's functional significance, and (b) once these assumptions are undermined, motivation for these arguments dissipates. I then consider the direct route to self‐consciousness's significance, which depends on claims that self‐consciousness has intrinsic value or final value. I argue that whatever intrinsic or final value self‐consciousness possesses, it is not enough to generate strong moral reasons against harming or killing. PMID:28919670
Weak annihilation and new physics in charmless [Formula: see text] decays.
Bobeth, Christoph; Gorbahn, Martin; Vickers, Stefan
We use currently available data of nonleptonic charmless 2-body [Formula: see text] decays ([Formula: see text]) that are mediated by [Formula: see text] QCD- and QED-penguin operators to study weak annihilation and new-physics effects in the framework of QCD factorization. In particular we introduce one weak-annihilation parameter for decays related by [Formula: see text] quark interchange and test this universality assumption. Within the standard model, the data supports this assumption with the only exceptions in the [Formula: see text] system, which exhibits the well-known "[Formula: see text] puzzle", and some tensions in [Formula: see text]. Beyond the standard model, we simultaneously determine weak-annihilation and new-physics parameters from data, employing model-independent scenarios that address the "[Formula: see text] puzzle", such as QED-penguins and [Formula: see text] current-current operators. We discuss also possibilities that allow further tests of our assumption once improved measurements from LHCb and Belle II become available.
Bell violation using entangled photons without the fair-sampling assumption.
Giustina, Marissa; Mech, Alexandra; Ramelow, Sven; Wittmann, Bernhard; Kofler, Johannes; Beyer, Jörn; Lita, Adriana; Calkins, Brice; Gerrits, Thomas; Nam, Sae Woo; Ursin, Rupert; Zeilinger, Anton
2013-05-09
The violation of a Bell inequality is an experimental observation that forces the abandonment of a local realistic viewpoint--namely, one in which physical properties are (probabilistically) defined before and independently of measurement, and in which no physical influence can propagate faster than the speed of light. All such experimental violations require additional assumptions depending on their specific construction, making them vulnerable to so-called loopholes. Here we use entangled photons to violate a Bell inequality while closing the fair-sampling loophole, that is, without assuming that the sample of measured photons accurately represents the entire ensemble. To do this, we use the Eberhard form of Bell's inequality, which is not vulnerable to the fair-sampling assumption and which allows a lower collection efficiency than other forms. Technical improvements of the photon source and high-efficiency transition-edge sensors were crucial for achieving a sufficiently high collection efficiency. Our experiment makes the photon the first physical system for which each of the main loopholes has been closed, albeit in different experiments.
39 Questionable Assumptions in Modern Physics
NASA Astrophysics Data System (ADS)
Volk, Greg
2009-03-01
The growing body of anomalies in new energy, low energy nuclear reactions, astrophysics, atomic physics, and entanglement, combined with the failure of the Standard Model and string theory to predict many of the most basic fundamental phenomena, all point to a need for major new paradigms. Not Band-Aids, but revolutionary new ways of conceptualizing physics, in the spirit of Thomas Kuhn's The Structure of Scientific Revolutions. This paper identifies a number of long-held, but unproven assumptions currently being challenged by an increasing number of alternative scientists. Two common themes, both with venerable histories, keep recurring in the many alternative theories being proposed: (1) Mach's Principle, and (2) toroidal, vortex particles. Matter-based Mach's Principle differs from both space-based universal frames and observer-based Einsteinian relativity. Toroidal particles, in addition to explaining electron spin and the fundamental constants, satisfy the basic requirement of Gauss's misunderstood B Law, that motion itself circulates. Though a comprehensive theory is beyond the scope of this paper, it will suggest alternatives to the long list of assumptions in context.
Mapping Soil Age at Continental Scales
NASA Astrophysics Data System (ADS)
Slessarev, E.; Feng, X.
2017-12-01
Soil age controls the balance between weathered and unweathered minerals in soil, and thus strongly influences many of the biological, geochemical, and hydrological functions of the critical zone. However, most quantitative models of soil development do not represent soil age. Instead, they rely on a steady-state assumption: physical erosion controls the residence time of unweathered minerals in soil, and thus fixes the chemical weathering rate. This assumption may hold true in mountainous landscapes, where physical erosion rates are high. However, the steady-state assumption may fail in low-relief landscapes, where physical erosion rates have been insufficient to remove unweathered minerals left by glaciation and dust deposition since the Last Glacial Maximum (LGM). To test the applicability of the steady-state assumption at continental scales, we developed an empirical predictor for physical erosion, and then simulated soil development since LGM with a numerical model. We calibrated the physical erosion predictor using a compilation of watershed-scale sediment yield data, and in-situ 10Be denudation measurements corrected for weathering by Zr/Ti mass-balance. Physical erosion rates can be predicted using a power-law function of local relief and peak ground acceleration, a proxy for tectonic activity. Coupling physical erosion rates with the numerical model reveals that extensive low-relief areas of North America may depart from steady-state because they were glaciated, or received high dust fluxes during LGM. These LGM legacy effects are reflected in topsoil Ca:Al and Quartz:Feldspar ratios derived from United States Geological Survey data, and in a global compilation of soil pH measurements. Our results quantitatively support the classic idea that soils in the mid-high latitudes of the Northern Hemisphere are "young", in the sense that they are undergoing transient response to LGM conditions. 
Where they occur, such departures from steady-state likely increase mineral weathering rates and the supply of rock-derived nutrients to ecosystems.
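The abstract's empirical predictor is described only qualitatively. As a hedged sketch, a power-law function of local relief and peak ground acceleration might take the form below; the coefficients k, a, and b are placeholders, not the study's calibrated values.

```python
# Hypothetical power-law physical erosion predictor of the general form
# E = k * relief**a * PGA**b described in the abstract. The coefficient
# values are illustrative assumptions, not the calibrated ones.

def erosion_rate(relief_m, pga_g, k=1e-4, a=1.0, b=0.5):
    """Predicted physical erosion rate (arbitrary units).

    relief_m : local relief in meters
    pga_g    : peak ground acceleration (proxy for tectonic activity), in g
    """
    return k * relief_m**a * pga_g**b

# Higher relief or stronger tectonic forcing predicts faster erosion:
steep = erosion_rate(1000.0, 0.4)
gentle = erosion_rate(100.0, 0.4)
```

Under such a predictor, low-relief glaciated terrain gets low erosion rates, which is precisely where the steady-state assumption is expected to fail.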
Neural basis of nonanalytical reasoning expertise during clinical evaluation.
Durning, Steven J; Costanzo, Michelle E; Artino, Anthony R; Graner, John; van der Vleuten, Cees; Beckman, Thomas J; Wittich, Christopher M; Roy, Michael J; Holmboe, Eric S; Schuwirth, Lambert
2015-03-01
Understanding clinical reasoning is essential for patient care and medical education. Dual-processing theory suggests that nonanalytic reasoning is an essential aspect of expertise; however, assessing nonanalytic reasoning is challenging because it is believed to occur on the subconscious level. This assumption makes concurrent verbal protocols less reliable assessment tools. Functional magnetic resonance imaging was used to explore the neural basis of nonanalytic reasoning in internal medicine interns (novices) and board-certified staff internists (experts) while completing United States Medical Licensing Examination and American Board of Internal Medicine multiple-choice questions. The results demonstrated that novices and experts share a common neural network in addition to nonoverlapping neural resources. However, experts manifested greater neural processing efficiency in regions such as the prefrontal cortex during nonanalytical reasoning. These findings reveal a multinetwork system that supports the dual-process mode of expert clinical reasoning during medical evaluation.
Hayes, Brett K; Heit, Evan
2018-05-01
Inductive reasoning entails using existing knowledge to make predictions about novel cases. The first part of this review summarizes key inductive phenomena and critically evaluates theories of induction. We highlight recent theoretical advances, with a special emphasis on the structured statistical approach, the importance of sampling assumptions in Bayesian models, and connectionist modeling. A number of new research directions in this field are identified including comparisons of inductive and deductive reasoning, the identification of common core processes in induction and memory tasks and induction involving category uncertainty. The implications of induction research for areas as diverse as complex decision-making and fear generalization are discussed. This article is categorized under: Psychology > Reasoning and Decision Making; Psychology > Learning. © 2017 Wiley Periodicals, Inc.
Transformation based endorsement systems
NASA Technical Reports Server (NTRS)
Sudkamp, Thomas
1988-01-01
Evidential reasoning techniques classically represent support for a hypothesis by a numeric value or an evidential interval. The combination of support is performed by an arithmetic rule which often requires restrictions to be placed on the set of possibilities. These assumptions usually require the hypotheses to be exhaustive and mutually exclusive. Endorsement based classification systems represent support for the alternatives symbolically rather than numerically. A framework for constructing endorsement systems is presented in which transformations are defined to generate and update the knowledge base. The interaction of the knowledge base and transformations produces a non-monotonic reasoning system. Two endorsement based reasoning systems are presented to demonstrate the flexibility of the transformational approach for reasoning with ambiguous and inconsistent information.
Cadeddu, Maria P.; Marchand, Roger; Orlandi, Emiliano; ...
2017-08-11
Satellite and ground-based microwave radiometers are routinely used for the retrieval of liquid water path (LWP) under all atmospheric conditions. The retrieval of water vapor and LWP from ground-based radiometers during rain has proved to be a difficult challenge for two principal reasons: the inadequacy of the nonscattering approximation in precipitating clouds and the deposition of rain drops on the instrument's radome. In this paper, we combine model computations and real ground-based, zenith-viewing passive microwave radiometer brightness temperature measurements to investigate how total, cloud, and rain LWP retrievals are affected by assumptions on the cloud drop size distribution (DSD) and under which conditions a nonscattering approximation can be considered reasonably accurate. Results show that as long as the drop effective diameter remains below ~200 μm, a nonscattering approximation yields results that are still accurate at frequencies less than 90 GHz. For larger drop sizes, it is shown that higher microwave frequencies contain useful information that can be used to separate cloud and rain LWP provided that the vertical distribution of hydrometeors, as well as the DSD, is reasonably known. The choice of the DSD parameters becomes important to ensure retrievals that are consistent with the measurements. A physical retrieval is tested on a synthetic data set and is then used to retrieve total, cloud, and rain LWP from radiometric measurements during two drizzling cases at the Atmospheric Radiation Measurement Eastern North Atlantic site.
Mathematical Reasoning Requirements in Swedish National Physics Tests
ERIC Educational Resources Information Center
Johansson, Helena
2016-01-01
This paper focuses on one aspect of mathematical competence, namely mathematical reasoning, and how this competency influences students' knowing of physics. This influence was studied by analysing the mathematical reasoning requirements upper secondary students meet when solving tasks in national physics tests. National tests are constructed to…
Denison, Stephanie; Trikutam, Pallavi; Xu, Fei
2014-08-01
A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar contexts with different outcomes. Can infants rapidly acquire probabilistic physical knowledge, such as that some leaves fall and some glasses break, simply by observing the statistical regularity with which objects behave, and apply that knowledge in subsequent reasoning? We taught 11-month-old infants physical constraints on objects and asked them to reason about the probability of different outcomes when objects were drawn from a large distribution. Infants could have reasoned either by using the perceptual similarity between the samples and larger distributions or by applying physical rules to adjust base rates and estimate the probabilities. Infants learned the physical constraints quickly and used them to estimate probabilities, rather than relying on similarity, a version of the representativeness heuristic. These results indicate that infants can rapidly and flexibly acquire physical knowledge about objects following very brief exposure and apply it in subsequent reasoning. PsycINFO Database Record (c) 2014 APA, all rights reserved.
Thinking before sinning: reasoning processes in hedonic consumption
de Witt Huberts, Jessie; Evers, Catharine; de Ridder, Denise
2014-01-01
Whereas hedonic consumption is often labeled as impulsive, findings from self-licensing research suggest that people sometimes rely on reasons to justify hedonic consumption. Although the concept of self-licensing assumes the involvement of reasoning processes, this has not been demonstrated explicitly. Two studies investigated whether people indeed rely on reasons to allow themselves a guilty pleasure. Participants were exposed to a food temptation after which passive and active reasoning was assessed by asking participants to indicate the justifications that applied to them for indulging in that temptation (Study 1) or having them construe reasons to consume the hedonic product (Study 2). Regression analyses indicated that higher levels of temptation predicted the number of reasons employed and construed to justify consumption. By providing evidence for the involvement of reasoning processes, these findings support the assumption of self-licensing theory that temptations not only exert their influence by making us more impulsive, but can also facilitate gratification by triggering deliberative reasoning processes. PMID:25408680
Equal Plate Charges on Series Capacitors?
ERIC Educational Resources Information Center
Illman, B. L.; Carlson, G. T.
1994-01-01
Provides a line of reasoning in support of the contention that the equal charge proposition is at best an approximation. Shows how the assumption of equal plate charge on capacitors in series contradicts the conservative nature of the electric field. (ZWH)
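For context, the idealized textbook treatment that the article questions can be stated in a few lines. This sketch encodes the equal-charge approximation itself (the very proposition the article argues is only approximate); the component values are arbitrary examples.

```python
# The standard (approximate) equal-charge treatment of series capacitors:
# each capacitor carries the same charge Q = C_eq * V, where
# 1/C_eq = sum(1/C_i). The article argues this is at best an approximation.

def series_charge(capacitances, voltage):
    """Return (Q, per-capacitor voltage drops) under the equal-charge assumption."""
    c_eq = 1.0 / sum(1.0 / c for c in capacitances)
    q = c_eq * voltage                     # common charge on every plate pair
    drops = [q / c for c in capacitances]  # voltage divides as V_i = Q / C_i
    return q, drops

# Example: 1 uF and 2 uF in series across 3 V.
q, drops = series_charge([1e-6, 2e-6], 3.0)
```

In this idealization the drops always sum back to the applied voltage; the article's point is that a real electric field's conservative nature is not perfectly captured by it.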
INCORPORATING NONCHEMICAL STRESSORS INTO CUMULATIVE RISK ASSESSMENTS
The risk assessment paradigm has begun to shift from assessing single chemicals using "reasonable worst case" assumptions for individuals to considering multiple chemicals and community-based models. Inherent in community-based risk assessment is examination of all stressors a...
High and Crazy Niggers: Anti-Rationalism in Leroi Jones
ERIC Educational Resources Information Center
Brown, Lloyd W.
1974-01-01
Suggests that the assumption that the theme of anti-rationalism in Jones is an attack on reason as such and that such a theme rests on an exclusive commitment to the irrational needs to be tested. (Author/AM)
The lifespan and life-cycle of self-help groups: a retrospective study of groups in Nottingham, UK.
Chaudhary, Sarah; Avis, Mark; Munn-Giddings, Carol
2010-07-01
This article is based on an analysis of a practice database held by Self Help Nottingham, an organisation that supports local self-help groups. The database contains details of 936 groups that closed between 1982 and 2007. The aim of the study is to provide qualitative and descriptive quantitative information about the life-cycles of self-help groups, the problems that they face throughout their existence and the likelihood of different problems leading to their closure. The database was not collated for research purposes and so we restrict our discussion of the findings to identification of broad patterns regarding the birth and closure rates of different types of group and questions for future research. Comparisons were made between groups that addressed different types of problem, groups with different memberships and groups that had reached different stages in their existence. There was reasonable consistency in the survival rates of different types of group with physical health groups being the most likely to reach maturity followed by mental health and lastly social issue groups. Survival rates for groups that serve different membership populations were reasonably constant although there were some anomalies. There were high levels of consistency regarding the reasons for closure for groups closing at different stages of maturity. The most commonly cited reasons among all groups were the withdrawal of a 'key' member and a decline in membership. The article suggests that some of the assumptions and prescriptions within the existing literature need to be considered in light of more detailed empirical evidence, and it raises questions about the theoretical understanding of self-help groups.
Healing X-ray scattering images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Jiliang; Lhermitte, Julien; Tian, Ye
X-ray scattering images contain numerous gaps and defects arising from detector limitations and experimental configuration. Here, we present a method to heal X-ray scattering images, filling gaps in the data and removing defects in a physically meaningful manner. Unlike generic inpainting methods, this method is closely tuned to the expected structure of reciprocal-space data. In particular, we exploit statistical tests and symmetry analysis to identify the structure of an image; we then copy, average and interpolate measured data into gaps in a way that respects the identified structure and symmetry. Importantly, the underlying analysis methods provide useful characterization of structures present in the image, including the identification of diffuse versus sharp features, anisotropy and symmetry. The presented method leverages known characteristics of reciprocal space, enabling physically reasonable reconstruction even with large image gaps. The method will correspondingly fail for images that violate these underlying assumptions. The method assumes point symmetry and is thus applicable to small-angle X-ray scattering (SAXS) data, but only to a subset of wide-angle data. Our method succeeds in filling gaps and healing defects in experimental images, including extending data beyond the original detector borders.
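The core point-symmetry step can be sketched in a few lines. This is a minimal illustration of the idea, not the authors' implementation: it only fills a gap pixel from its 180-degree partner about the image center, omitting the statistical tests, averaging, and interpolation the paper describes.

```python
import numpy as np

# Minimal sketch of symmetry-based healing: fill masked pixels of a detector
# image from their point-symmetric partners (q -> -q about the image center),
# which is the symmetry assumed valid for SAXS data.

def heal_point_symmetry(image, mask):
    """image: 2D float array; mask: True where data is missing.

    Returns a copy with gaps filled from the point-symmetric pixel
    whenever that partner pixel was actually measured."""
    healed = image.copy()
    flipped = image[::-1, ::-1]       # 180-degree rotation about the center
    flipped_mask = mask[::-1, ::-1]
    fillable = mask & ~flipped_mask   # gap here, but partner pixel measured
    healed[fillable] = flipped[fillable]
    return healed
```

Pixels whose symmetric partner is also masked stay unfilled; the paper's interpolation stage would handle those.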
Theory of rotational transition in atom-diatom chemical reaction
NASA Astrophysics Data System (ADS)
Nakamura, Masato; Nakamura, Hiroki
1989-05-01
Rotational transition in atom-diatom chemical reaction is theoretically studied. A new approximate theory (which we call IOS-DW approximation) is proposed on the basis of the physical idea that rotational transition in reaction is induced by the following two different mechanisms: rotationally inelastic half collision in both initial and final arrangement channels, and coordinate transformation in the reaction zone. This theory gives a fairly compact expression for the state-to-state transition probability. Introducing the additional physically reasonable assumption that reaction (particle rearrangement) takes place in a spatially localized region, we have reduced this expression into a simpler analytical form which can explicitly give the overall rotational state distribution in reaction. Numerical application was made to the H+H2 reaction and demonstrated the theory's effectiveness and simplicity. A further simplified, most naive approximation, the independent-events approximation, was also proposed and demonstrated to work well in the test calculation of H+H2. The overall rotational state distribution is expressed simply by a product sum of the transition probabilities for the three consecutive processes in reaction: inelastic transition in the initial half collision, transition due to particle rearrangement, and inelastic transition in the final half collision.
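The independent-events approximation described in the last sentence amounts to chaining three transition-probability matrices. This hedged sketch shows the bookkeeping only; the toy matrices are illustrative and are not computed from H+H2 dynamics.

```python
import numpy as np

# Independent-events approximation: the overall rotational distribution is a
# product sum of transition probabilities for three consecutive steps:
#   initial half collision -> rearrangement -> final half collision.
# Each P[j, j'] matrix below must be row-stochastic (rows sum to 1).

def overall_distribution(p_initial, p_rearrange, p_final, j0):
    """Distribution over final rotational states j' for initial state j0."""
    chain = p_initial @ p_rearrange @ p_final  # matrix product of the 3 steps
    return chain[j0]
```

Because the product of row-stochastic matrices is row-stochastic, the resulting distribution is automatically normalized.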
Feynman rules for the Standard Model Effective Field Theory in Rξ-gauges
NASA Astrophysics Data System (ADS)
Dedes, A.; Materkowska, W.; Paraskevas, M.; Rosiek, J.; Suxho, K.
2017-06-01
We assume that New Physics effects are parametrized within the Standard Model Effective Field Theory (SMEFT) written in a complete basis of gauge invariant operators up to dimension 6, commonly referred to as "Warsaw basis". We discuss all steps necessary to obtain a consistent transition to the spontaneously broken theory and several other important aspects, including the BRST-invariance of the SMEFT action for linear Rξ-gauges. The final theory is expressed in a basis characterized by SM-like propagators for all physical and unphysical fields. The effect of the non-renormalizable operators appears explicitly in triple or higher multiplicity vertices. In this mass basis we derive the complete set of Feynman rules, without resorting to any simplifying assumptions such as baryon-, lepton-number or CP conservation. As it turns out, for most SMEFT vertices the expressions are reasonably short, with a noticeable exception of those involving 4, 5 and 6 gluons. We have also supplemented our set of Feynman rules, given in an appendix here, with a publicly available Mathematica code working with the FeynRules package and producing output which can be integrated with other symbolic algebra or numerical codes for automatic SMEFT amplitude calculations.
On pendulums and air resistance. The mathematics and physics of Denis Diderot
NASA Astrophysics Data System (ADS)
Dahmen, Sílvio R.
2015-09-01
In this article Denis Diderot's Fifth Memoir of 1748 on the problem of a pendulum damped by air resistance is discussed in its historical as well as mathematical aspects. Diderot wrote the Memoir in order to clarify an assumption Newton made without further justification in the first pages of the Principia in connection with an experiment to verify the Third Law of Motion using colliding pendulums. To explain the differences between experimental and theoretical values, Newton made an assumption about the air resistance experienced by the bob as it traversed its arc. By giving Newton's arguments a mathematical scaffolding and recasting his geometrical reasoning in the language of differential calculus, Diderot provided a step-by-step solution guide to the problem. He also showed that Newton's assumption was equivalent to having assumed a resistance force F_R proportional to the bob's velocity v, when in fact he believed it should be replaced by F_R ∝ v². His solution is presented in full detail and his results are compared to those obtained from a Lindstedt-Poincaré approximation for an oscillator with quadratic damping. It is shown that, up to a prefactor, both results coincide. Some results that follow from his approach are presented and discussed for the first time. Experimental evidence to support Diderot's or Newton's claims is discussed together with the limitations of their solutions. Some misprints in the original memoir are pointed out.
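The contrast between the two damping laws (F_R ∝ v versus F_R ∝ v²) is easy to see numerically. This is a hedged illustration with arbitrary parameter values, not a reproduction of Diderot's or the article's calculation; it integrates the pendulum equation with a semi-implicit Euler scheme.

```python
import math

# Pendulum with damping: theta'' = -(g/L) sin(theta) - damping(theta').
# Linear damping decays exponentially; quadratic damping decays only
# algebraically at small amplitude, so it leaves a larger late-time swing.

def amplitude_after(damping, n_steps=200_000, dt=1e-4, g=9.81, L=1.0, theta0=0.3):
    """Peak |theta| over the second half of the run (late-time amplitude)."""
    theta, omega = theta0, 0.0
    half = n_steps // 2
    peak = 0.0
    for step in range(n_steps):
        alpha = -(g / L) * math.sin(theta) - damping(omega)
        omega += alpha * dt          # semi-implicit Euler: update velocity...
        theta += omega * dt          # ...then position with the new velocity
        if step >= half:
            peak = max(peak, abs(theta))
    return peak

linear_amp = amplitude_after(lambda v: 0.5 * v)              # F_R ~ v
quadratic_amp = amplitude_after(lambda v: 0.5 * v * abs(v))  # F_R ~ v^2
```

With these (arbitrary) coefficients the quadratic law, which Diderot favored, dies out much more slowly than the linear one.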
NASA Astrophysics Data System (ADS)
Martinez, Guillermo F.; Gupta, Hoshin V.
2011-12-01
Methods to select parsimonious and hydrologically consistent model structures are useful for evaluating dominance of hydrologic processes and representativeness of data. While information criteria (appropriately constrained to obey underlying statistical assumptions) can provide a basis for evaluating appropriate model complexity, it is not sufficient to rely upon the principle of maximum likelihood (ML) alone. We suggest that one must also call upon a "principle of hydrologic consistency," meaning that selected ML structures and parameter estimates must be constrained (as well as possible) to reproduce desired hydrological characteristics of the processes under investigation. This argument is demonstrated in the context of evaluating the suitability of candidate model structures for lumped water balance modeling across the continental United States, using data from 307 snow-free catchments. The models are constrained to satisfy several tests of hydrologic consistency, a flow space transformation is used to ensure better consistency with underlying statistical assumptions, and information criteria are used to evaluate model complexity relative to the data. The results clearly demonstrate that the principle of consistency provides a sensible basis for guiding selection of model structures and indicate strong spatial persistence of certain model structures across the continental United States. Further work to untangle reasons for model structure predominance can help to relate conceptual model structures to physical characteristics of the catchments, facilitating the task of prediction in ungaged basins.
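The information-criterion step described above can be made concrete with a toy example. This is a hedged sketch using AIC as the criterion; the candidate names and log-likelihoods are invented for illustration and are not the study's models or values.

```python
# Trading off fit against complexity with an information criterion:
# AIC = 2k - 2 ln(L_max), lower is better. The hydrologic-consistency
# constraints the paper adds would be applied before this comparison.

def aic(n_params, log_likelihood):
    return 2 * n_params - 2 * log_likelihood

# Hypothetical lumped water-balance candidates (made-up log-likelihoods):
candidates = {
    "3-parameter bucket": aic(3, -120.0),
    "5-parameter bucket": aic(5, -118.5),
    "8-parameter bucket": aic(8, -118.0),
}
best = min(candidates, key=candidates.get)
```

Here the extra parameters do not buy enough likelihood to pay their complexity penalty, so the simplest structure wins, which is exactly the parsimony pressure the paper combines with hydrologic-consistency tests.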
Tukker, Arnold; de Koning, Arjan; Wood, Richard; Moll, Stephan; Bouwmeester, Maaike C
2013-02-19
Environmentally extended input output (EE IO) analysis is increasingly used to assess the carbon footprint of final consumption. Official EE IO data are, however, at best available for single countries or regions such as the EU27. This causes problems in assessing pollution embodied in imported products. The popular "domestic technology assumption (DTA)" leads to errors. Improved approaches based on Life Cycle Inventory data, Multiregional EE IO tables, etc. rely on unofficial research data and modeling, making them difficult to implement by statistical offices. The DTA can lead to errors for three main reasons: exporting countries can have higher impact intensities; may use more intermediate inputs for the same output; or may sell the imported products for lower/other prices than those produced domestically. The last factor is relevant for sustainable consumption policies of importing countries, whereas the first factors are mainly a matter of making production in exporting countries more eco-efficient. We elaborated a simple correction for price differences in imports and domestic production using monetary and physical data from official import and export statistics. A case study for the EU27 shows that this "price-adjusted DTA" gives a partial but meaningful adjustment of pollution embodied in trade compared to multiregional EE IO studies.
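The price correction can be sketched with a toy two-sector Leontief model. All numbers below are illustrative assumptions; the paper's actual adjustment uses official monetary and physical trade statistics.

```python
import numpy as np

# Toy EE IO footprint under the domestic technology assumption (DTA),
# with a simple price correction on imports: if imports are cheaper than
# domestic output, their monetary value understates the physical quantity,
# so imported demand is rescaled by (import price / domestic price).

def footprint(A, f, y):
    """A: technical coefficients; f: emission intensities; y: final demand."""
    x = np.linalg.solve(np.eye(len(f)) - A, y)  # Leontief: x = (I - A)^-1 y
    return f @ x

A = np.array([[0.1, 0.2], [0.3, 0.1]])  # illustrative coefficients
f = np.array([0.5, 1.2])                # kg CO2 per unit output (made up)
y_dom = np.array([100.0, 50.0])         # domestically produced final demand
y_imp = np.array([20.0, 10.0])          # imported final demand, monetary units
price_ratio = np.array([0.8, 1.0])      # import price / domestic price

fp_dta = footprint(A, f, y_dom + y_imp)                        # plain DTA
fp_price_adjust = footprint(A, f, y_dom + y_imp / price_ratio)  # adjusted
```

When imports are cheaper (ratio below 1), the adjusted footprint exceeds the plain-DTA one, illustrating the direction of the correction the paper proposes.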
Is a time symmetric interpretation of quantum theory possible without retrocausality?
Pusey, Matthew F.
2017-01-01
Huw Price has proposed an argument that suggests a time symmetric ontology for quantum theory must necessarily be retrocausal, i.e. it must involve influences that travel backwards in time. One of Price's assumptions is that the quantum state is a state of reality. However, one of the reasons for exploring retrocausality is that it offers the potential for evading the consequences of no-go theorems, including recent proofs of the reality of the quantum state. Here, we show that this assumption can be replaced by a different assumption, called λ-mediation, that plausibly holds independently of the status of the quantum state. We also reformulate the other assumptions behind the argument to place them in a more general framework and pin down the notion of time symmetry involved more precisely. We show that our assumptions imply a timelike analogue of Bell's local causality criterion and, in doing so, give a new interpretation of timelike violations of Bell inequalities. Namely, they show the impossibility of a (non-retrocausal) time symmetric ontology. PMID:28690401
Stennett, Andrea; De Souza, Lorraine; Norris, Meriel
2018-07-01
Exercise and physical activity have been found to be beneficial in managing disabilities caused by multiple sclerosis. Despite the known benefits, many people with multiple sclerosis are inactive. This study aimed to identify the prioritised exercise and physical activity practices of people with multiple sclerosis living in the community and the reasons why they engage in these activities. A four-round Delphi questionnaire scoped and determined consensus of priorities for the top 10 exercise and physical activities and the reasons why people with multiple sclerosis (n = 101) engage in these activities. Data were analysed using content analysis, descriptive statistics, and non-parametric tests. The top 10 exercise and physical activity practices and the top 10 reasons why people with multiple sclerosis (n = 70) engaged in these activities were identified and prioritised. Consensus was achieved for the exercise and physical activities (W = 0.744, p < .0001) and for the reasons they engaged in exercise and physical activity (W = 0.723, p < .0001). The exercise and physical activity practices and the reasons people with multiple sclerosis engaged in exercise and physical activity were diverse. These self-selected activities and reasons highlighted that people with multiple sclerosis might conceptualise exercise and physical activity in ways that may not be fully appreciated or understood by health professionals. Consideration of the views of people with multiple sclerosis may be essential if the goal of increasing physical activity in this population is to be achieved. Implications for Rehabilitation Health professionals should work collaboratively with people with multiple sclerosis to understand how they prioritise activities and the underlying reasons for their prioritisations, and embed these into rehabilitation programmes.
Health professionals should utilise activities prioritised by people with multiple sclerosis in the community as a way to support, promote, and sustain exercise and physical activity in this population. Rehabilitation interventions should include both the activities people with multiple sclerosis prioritise and the reasons why they engage in exercise and physical activity as another option for increasing physical activity levels and reducing sedentary behaviours.
Threads that guide or ties that bind: William Kirby and the essentialism story.
Varma, Charissa S
2009-01-01
Nineteenth-century British entomologist William Kirby is best known for his generic division of bees based on tongues and his vigorous defence of natural theology. Focusing on these aspects of Kirby's work has led many current scholars to characterise Kirby as an "essentialist." As a result of this characterisation, many important aspects of his work, Monographia Apum Angliae (1802), have been overlooked or misunderstood. Kirby's religious devotion, for example, has led some scholars to assume Kirby used the term "type" to connect an ontological assumption about essences with a creationist assumption about species fixity, a reading which I argue conceals the variety of ways Kirby employed the term. Also, Kirby frequently cautioned against organising a classification system exclusively by what he called "analytic reasoning," a style of reasoning 20th-century scholars often associate with the Aristotelian logic of division. I argue that Kirby's critique of analytic reasoning brought the virtues of his own methodological agenda into sharp relief. Kirby used metaphors familiar in the natural history literature--Ariadne's thread, the Eleusinian mysteries, and Bacon's bee and spider metaphors--to emphasise the virtues of building tradition and cooperation in the goals and methodological practices of 19th-century British naturalists.
ERIC Educational Resources Information Center
Hart, Christina
2008-01-01
Models are important both in the development of physics itself and in teaching physics. Historically, the consensus models of physics have come to embody particular ontological assumptions and epistemological commitments. Educators have generally assumed that the consensus models of physics, which have stood the test of time, will also work well…
Scientific Reasoning: Theory Evidence Coordination in Physics-Based and Non-Physics-Based Tasks
ERIC Educational Resources Information Center
Ibrahim, Bashirah; Ding, Lin; Mollohan, Katherine N.; Stammen, Andria
2016-01-01
Scientific reasoning is crucial to any scientific discipline. One sub-skill particularly relevant to the scientific enterprise is theory evidence coordination. This study, underpinned by Kuhn's framework for scientific reasoning, investigates how university students coordinate their self-generated theory and evidence in a physics topic (energy)…
Christensen, Nicole; Black, Lisa; Furze, Jennifer; Huhn, Karen; Vendrely, Ann; Wainwright, Susan
2017-02-01
Although clinical reasoning abilities are important learning outcomes of physical therapist entry-level education, best practice standards have not been established to guide clinical reasoning curricular design and learning assessment. This research explored how clinical reasoning is currently defined, taught, and assessed in physical therapist entry-level education programs. A descriptive, cross-sectional survey was administered to physical therapist program representatives. An electronic 24-question survey was distributed to the directors of 207 programs accredited by the Commission on Accreditation in Physical Therapy Education. Descriptive statistical analysis and qualitative content analysis were performed. Post hoc demographic and wave analyses revealed no evidence of nonresponse bias. A response rate of 46.4% (n=96) was achieved. All respondents reported that their programs incorporated clinical reasoning into their curricula. Only 25% of respondents reported a common definition of clinical reasoning in their programs. Most respondents (90.6%) reported that clinical reasoning was explicit in their curricula, and 94.8% indicated that multiple methods of curricular integration were used. Instructor-designed materials were most commonly used to teach clinical reasoning (83.3%). Assessment of clinical reasoning included practical examinations (99%), clinical coursework (94.8%), written examinations (87.5%), and written assignments (83.3%). Curricular integration of clinical reasoning-related self-reflection skills was reported by 91%. A large number of incomplete surveys affected the response rate, and the program directors to whom the survey was sent may not have consulted the faculty members who were most knowledgeable about clinical reasoning in their curricula. The survey construction limited some responses and application of the results. 
Although clinical reasoning was explicitly integrated into program curricula, it was not consistently defined, taught, or assessed within or between the programs surveyed, resulting in significant variability in clinical reasoning education. These findings support the need for the development of best educational practices for clinical reasoning curricula and learning assessment. © 2017 American Physical Therapy Association
If technological intelligent extraterrestrials exist, what biological traits are de rigueur?
NASA Astrophysics Data System (ADS)
Taylor, E. R.
2018-05-01
If extraterrestrials exist in the depths of cosmic space, and are capable of interstellar communications, even space flight, there is no requirement that they be humanoid in form. However, certain humanoid capabilities would be advantageous for fashioning tools and critical to operating spacecraft, as well as to functioning under the disparate extreme conditions in which they may be forced to operate. They would have to be "gas breathing". The reasonable assumption that life elsewhere is based upon the same elements as Earth life, and likewise requires water, stems from the unique properties of water that no other low-molecular-weight nonmetal hydride offers. Only water offers such diversity of chemical properties and reactivity, including the existence of its three common physical states within a temperature range of service to life, avoiding the issues presented by any alternative. They must, like us, possess a large, abstract-thinking brain, and probably at least all the fundamental senses that humankind possesses. They would also be carbon-based life, using oxygen as the electron sink of their biochemistry, for the reasons considered. They are most likely homeothermic like us, though not necessarily mammalian as we are. Their biochemistry could differ somewhat from ours, perhaps presenting contact hazards for both species, as discussed.
Code of Federal Regulations, 2010 CFR
2010-01-01
... borrower developed an adequate supporting database and analyzed a reasonable range of relevant assumptions and alternative futures; (d) The borrower adopted methods and procedures in general use by the...
Clinical reasoning of Filipino physical therapists: Experiences in a developing nation.
Rotor, Esmerita R; Capio, Catherine M
2018-03-01
Clinical reasoning is essential for physical therapists to engage in the process of client care, and has been known to contribute to professional development. The literature on clinical reasoning and experiences has been based on studies from Western and developed nations, from which multiple influencing factors have been found. A developing nation, the Philippines, has distinct social, economic, political, and cultural circumstances. Using a phenomenological approach, this study explored the experiences of Filipino physical therapists with clinical reasoning. Ten therapists working in three settings: 1) hospital; 2) outpatient clinic; and 3) home health were interviewed. Major findings were: a prescription-based referral system limited clinical reasoning; procedural reasoning was a commonly experienced strategy while diagnostic and predictive reasoning were limited; factors that influenced clinical reasoning included practice setting and the professional relationship with the referring physician. Physical therapists' responses suggested a lack of autonomy in practice that appeared to stifle clinical reasoning. Based on our findings, we recommend that the current regulations governing PT practice in the Philippines be updated, and encourage educators to strengthen teaching approaches and strategies that support clinical reasoning. These recommendations are consistent with the global trend toward autonomous practice.
McMahon, Catherine A
2004-03-01
This commentary argues that notwithstanding the implicit logic underpinning philosophical and moral arguments regarding individual rights to access sex selection for non-medical reasons, community concerns about the psychosocial impact of the technology cannot be dismissed or ignored. It is true, however, that such concerns are often based on unsupported assumptions about the impact of the technology on individuals, families and communities and about the propensity of scientists, unless restrained, to act in ways that are irresponsible or dangerous. The research conducted to date has dispelled many of the myths and assumptions about IVF children and their families. Further careful research is now needed to test the extent to which fears and assumptions regarding 'designer babies' are justified.
ERIC Educational Resources Information Center
Besson, Ugo
2010-01-01
This paper presents an analysis of the different types of reasoning and physical explanation used in science, common thought, and physics teaching. It then reflects on the learning difficulties connected with these various approaches, and suggests some possible didactic strategies. Although causal reasoning occurs very frequently in common thought…
The Non-Signalling theorem in generalizations of Bell's theorem
NASA Astrophysics Data System (ADS)
Walleczek, J.; Grössing, G.
2014-04-01
Does "epistemic non-signalling" ensure the peaceful coexistence of special relativity and quantum nonlocality? The possibility of an affirmative answer is of great importance to deterministic approaches to quantum mechanics given recent developments towards generalizations of Bell's theorem. By generalizations of Bell's theorem we here mean efforts that seek to demonstrate the impossibility of any deterministic theory obeying the predictions of Bell's theorem, including not only local hidden-variables theories (LHVTs) but, critically, nonlocal hidden-variables theories (NHVTs) as well, such as de Broglie-Bohm theory. Naturally, in light of the well-established experimental findings from quantum physics, whether or not a deterministic approach to quantum mechanics, including an emergent quantum mechanics, is logically possible depends on compatibility with the predictions of Bell's theorem. With respect to deterministic NHVTs, recent attempts to generalize Bell's theorem have claimed the impossibility of any such approaches to quantum mechanics. The present work offers arguments showing why such efforts towards generalization may fall short of their stated goal. In particular, we challenge the validity of the use of the non-signalling theorem as a conclusive argument in favor of the existence of free randomness, and therefore reject the use of the non-signalling theorem as an argument against the logical possibility of deterministic approaches. We here offer two distinct counter-arguments in support of the possibility of deterministic NHVTs: one argument exposes the circularity of the reasoning which is employed in recent claims, and a second argument is based on the inconclusive metaphysical status of the non-signalling theorem itself.
We proceed by presenting an entirely informal treatment of key physical and metaphysical assumptions, and of their interrelationship, in attempts seeking to generalize Bell's theorem on the basis of an ontic, foundational interpretation of the non-signalling theorem. We here argue that the non-signalling theorem must instead be viewed as an epistemic, operational theorem i.e. one that refers exclusively to what epistemic agents can, or rather cannot, do. That is, we emphasize that the non-signalling theorem is a theorem about the operational inability of epistemic agents to signal information. In other words, as a proper principle, the non-signalling theorem may only be employed as an epistemic, phenomenological, or operational principle. Critically, our argument emphasizes that the non-signalling principle must not be used as an ontic principle about physical reality as such, i.e. as a theorem about the nature of physical reality independently of epistemic agents e.g. human observers. One major reason in favor of our conclusion is that any definition of signalling or of non-signalling invariably requires a reference to epistemic agents, and what these agents can actually measure and report. Otherwise, the non-signalling theorem would equal a general "no-influence" theorem. In conclusion, under the assumption that the non-signalling theorem is epistemic (i.e. "epistemic non-signalling"), the search for deterministic approaches to quantum mechanics, including NHVTs and an emergent quantum mechanics, continues to be a viable research program towards disclosing the foundations of physical reality at its smallest dimensions.
Perceived reasons, incentives, and barriers to physical activity in Swedish elderly men.
Sjörs, Camilla; Bonn, Stephanie E; Trolle Lagerros, Ylva; Sjölander, Arvid; Bälter, Katarina
2014-11-12
Knowledge about factors influencing physical activity behavior is needed in order to tailor physical activity interventions to the individual. The aim of this study was to explore and describe the perceived reasons, barriers, and incentives to increased physical activity, as well as preferable activities, among elderly men in Sweden. In total, 150 men aged 50-86 years responded to a Web-based questionnaire. Men who reported that they exercised sometimes or often received questions about reasons for physical activity (n=104), while men who reported that they never or seldom exercised received questions about barriers (n=46). The most frequent perceived reason for being physically active was health (82%), followed by enjoyment (45%), and a desire to lose/maintain weight (27%). Lack of interest/motivation was identified as the primary perceived barrier (17%). Incentives for increasing the level of activity included becoming more motivated and having a training partner. Walking was the most preferred activity. Enjoyment and maintaining a good health were important reasons for engaging in physical activity among Swedish elderly men.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-29
...] to prepare an EIS for proposed airfield improvements at IAH. The Airport Master Plan (AMP) prepared... to ensure that the assumptions used to develop the AMP remained valid. The HAS cited several reasons...
Hessdalen Light Phenomena and the Inconsistency of the "Car-Headlight" Interpretation
NASA Astrophysics Data System (ADS)
Teodorani, M.
Some gratuitous criticism has attempted to attack research concerning the scientific study of anomalous light phenomena in Hessdalen, Norway, by artfully constructing a castle in the air based on the arbitrary assumption that the "EMBLA 2002" field study was dedicated to car headlights. This paper summarizes and analyzes, in a few essential details, the reasons why this "criticism" does not deserve to be considered as such, as it is only a well-constructed fake. Some epistemological aspects are treated as well.
Trujillo, Caleb; Cooper, Melanie M; Klymkowsky, Michael W
2012-01-01
Biological systems, from the molecular to the ecological, involve dynamic interaction networks. To examine student thinking about networks we used graphical responses, since they are easier to evaluate for implied, but unarticulated assumptions. Senior college-level molecular biology students were presented with simple molecular-level scenarios; surprisingly, most students failed to articulate the basic assumptions needed to generate reasonable graphical representations, and their graphs often contradicted their explicit assumptions. We then developed a tiered Socratic tutorial based on leading questions (prompts) designed to provoke metacognitive reflection. When applied in a group or individual setting, there was clear improvement in targeted areas. Our results highlight the promise of using graphical responses and Socratic prompts in a tutorial context as both a formative assessment for students and an informative feedback system for instructors. Copyright © 2011 Wiley Periodicals, Inc.
Learning Assumptions for Compositional Verification
NASA Technical Reports Server (NTRS)
Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)
2002-01-01
Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.
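The assume-guarantee rule that the framework automates can be illustrated schematically over finite trace sets. This is a deliberate simplification: the assumption A is fixed by hand rather than learned by the L*-style algorithm the paper describes, composition over a shared alphabet reduces to trace-set intersection, and all event names are invented.

```python
# Schematic illustration of the assume-guarantee rule over finite trace sets.
# Premise 1: component M1, restricted by assumption A, satisfies property P.
# Premise 2: the environment M2 satisfies A.
# Conclusion: the composed system M1 || M2 satisfies P.
# Real tools generate A iteratively from model-checking counterexamples;
# here A is simply given, and each model is a finite set of traces.

M1 = {("req", "ack"), ("req", "err")}   # traces of the component
M2 = {("req", "ack")}                   # traces of the environment
A  = {("req", "ack")}                   # candidate assumption
P  = {("req", "ack")}                   # property: every request is acked

premise1 = (M1 & A) <= P    # <A> M1 <P>
premise2 = M2 <= A          # <true> M2 <A>
system_ok = (M1 & M2) <= P  # conclusion, checked directly for comparison

print(premise1 and premise2, system_ok)
```

The point of the rule is that the two premises can each be checked on a single component, avoiding construction of the full composed state space.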
The calculation of aerosol optical properties from aerosol mass is a process subject to uncertainty related to necessary assumptions on the treatment of the chemical species mixing state, density, refractive index, and hygroscopic growth. In the framework of the AQMEII-2 model in...
NASA Astrophysics Data System (ADS)
Russ, Rosemary S.; Odden, Tor Ole B.
2017-12-01
Our field has long valued the goal of teaching students not just the facts of physics, but also the thinking and reasoning skills of professional physicists. The complexity inherent in scientific reasoning demands that we think carefully about how we conceptualize for ourselves, enact in our classes, and encourage in our students the relationship between the multifaceted practices of professional science. The current study draws on existing research in the philosophy of science and psychology to advocate for intertwining two important aspects of scientific reasoning: using evidence from experimentation and modeling. We present a case from an undergraduate physics course to illustrate how these aspects can be intertwined productively and describe specific ways in which these aspects of reasoning can mutually reinforce one another in student learning. We end by discussing implications for this work for instruction in introductory physics courses and for research on scientific reasoning at the undergraduate level.
May, Stephen; Withers, Sarah; Reeve, Sarah; Greasley, Alison
2010-01-01
The aim of this study was to explore the clinical reasoning process used by novice physical therapists in specific patient problems. Nine physical therapists in the UK with limited experience of managing musculoskeletal problems were included. Semi-structured interviews were conducted on how novice physical therapists would assess and manage a patient with a shoulder problem; interviews were transcribed and analyzed using framework analysis. To be included as a final theme, at least 50% of participants had to mention that theme. A large number of items (n = 93) were excluded because fewer than 50% of participants referred to each item. Included items related to seven main themes: history (16), physical exam (13), investigations (1), diagnostic reasoning (1), clinical reasoning process (diagnostic pathway) (3), clinical reasoning process (management pathway) (5) and treatment options (1). Items mostly related to information gathering; although there was some use of hypothetico-deductive clinical reasoning, there appeared to be limited understanding of the clinical implications of the data gathered, and clinical reasoning through pattern recognition was minimal. Major weaknesses were apparent in the clinical reasoning skills of these novice therapists compared to previous reports of expert clinical reasoning, indicating areas for development in the education of student and junior physical therapists. PMID:21655390
When rational men fall sick: an inquiry into some assumptions made by medical anthropologists.
Young, A
1981-12-01
Medical anthropologists spend most of their time eliciting and interpreting people's statements about sickness and health. For this task, they make certain assumptions about the importance of language and reason. In this paper I argue that their assumptions are tailored to fit a hypothetical Rational Man rather than real people. The concept of 'explanatory models of sickness' is used to illustrate this point. My critique begins by drawing attention to two non-cognitive determinants of people's statements: their degree of emotional arousal and their capacities for discoursing on medical subjects. These determinants are briefly discussed and then set aside, to make room for the paper's argument proper. This starts with the observation that medical anthropologists tend to overlook the fact that they have established a cognitive no man's land stretching between their informants' statements and the cognitive structures which are supposed to generate these statements. I survey this void, using a five-fold model of medical knowledge. People use one kind of knowledge to organize their medical experiences and perceptions. In Rational Man writing, this form of knowledge is considered equivalent to cognitive structures (e.g., causal models, classificatory schemes), but I argue that it also includes knowledge of prototypical sickness events and knowledge that is embedded in actions, social relations, and material equipment. The theoretical implications of the five-fold model are outlined. This is followed by an analysis of the reasoning processes in which people use medical knowledge to produce the statements whose meaning we wish to learn. I demonstrate the importance of being able to distinguish operational and monothetic forms of reasoning from pre-operational and polythetic ones. Rational Man writers are described as ignoring the latter pair. The concept of 'prototypes' is reintroduced to illustrate these points.
What Predicts Injury from Physical Punishment? A Test of the Typologies of Violence Hypothesis
ERIC Educational Resources Information Center
Gonzalez, Miriam; Durrant, Joan E.; Chabot, Martin; Trocme, Nico; Brown, Jason
2008-01-01
Objective: This study examined the power of child, perpetrator, and socio-economic characteristics to predict injury in cases of reported child physical abuse. The study was designed to assess the validity of the assumption that physically injurious incidents of child physical abuse are qualitatively different from those that do not result in…
Smoking Behavior among Jordanians: Physical, Psychological, Social, and Economic Reasons.
Sweis, Nadia J
2018-03-12
To highlight the physical, psychological, social, and economic reasons related to sex differences in smoking behaviors in Jordan. A cross-sectional questionnaire-based survey was conducted among Jordanian adult smokers. Sex was a significant predictor of physical reasons related to smoking; when controlling for other factors (t(765) = 5.027; P < 0.001), women were more affected by physical factors than were men. In addition, work status was a significant predictor of physical reasons (t(765) = -2.563; P = 0.011), as was the price of cigarettes (t(765) = 2.224; P = 0.026). Age was a significant predictor of psychological reasons (t(765) = -3.092; P = 0.002): younger individuals were more likely to state psychological factors as their reason for smoking than were older individuals. Conversely, sex was a significant predictor (t(765) = 2.798; P = 0.005) of social reasons for smoking, with more men than women reporting social motivations. Women were more likely to smoke for physical factors that are positively correlated with the price of cigarettes, rendering them less responsive to an increase in the price of cigarettes. Conversely, men were more likely to smoke for social reasons that are negatively correlated with the price of cigarettes; thus, men are more responsive to an increase in the price of cigarettes. Future public policies aiming to combat smoking in Jordan should consider sex differences in smoking behavior because one policy may not necessarily fit all. Copyright © 2018. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Rodriguez Marco, Albert
Battery management systems (BMS) require computationally simple but highly accurate models of the battery cells they are monitoring and controlling. Historically, empirical equivalent-circuit models have been used, but increasingly researchers are focusing their attention on physics-based models due to their greater predictive capabilities. These models are of high intrinsic computational complexity and so must undergo some kind of order-reduction process to make their use by a BMS feasible: we favor methods based on a transfer-function approach to battery cell dynamics. In prior works, transfer functions have been found from full-order PDE models via two simplifying assumptions: (1) a linearization assumption--which is a fundamental necessity in order to make transfer functions--and (2) an assumption made out of expedience that decouples the electrolyte-potential and electrolyte-concentration PDEs in order to render an approach to solve for the transfer functions from the PDEs. This dissertation improves the fidelity of physics-based models by eliminating the need for the second assumption and by linearizing nonlinear dynamics around different constant currents. Electrochemical transfer functions are infinite-order and cannot be expressed as a ratio of polynomials in the Laplace variable s. Thus, for practical use, these systems need to be approximated using reduced-order models that capture the most significant dynamics. This dissertation improves the generation of physics-based reduced-order models by introducing different realization algorithms, which produce a low-order model from the infinite-order electrochemical transfer functions. Physics-based reduced-order models are linear and describe cell dynamics if operated near the setpoint at which they have been generated. Hence, multiple physics-based reduced-order models need to be generated at different setpoints (i.e., state-of-charge, temperature and C-rate) in order to extend the cell operating range.
This dissertation improves the implementation of physics-based reduced-order models by introducing different blending approaches that combine the pre-computed models generated (offline) at different setpoints in order to produce good electrochemical estimates (online) along the cell state-of-charge, temperature and C-rate range.
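The blending idea can be sketched minimally: run pre-computed linear reduced-order models (ROMs) generated at different setpoints and interpolate their outputs by the current operating point. The scalar state-space models and the SOC-based linear weighting below are hypothetical stand-ins for the dissertation's actual electrochemical ROMs and blending approaches.

```python
# Minimal sketch of output blending between pre-computed linear reduced-order
# models (ROMs) generated at different state-of-charge (SOC) setpoints.
# Each ROM here is a scalar discrete-time state-space model; all parameter
# values are hypothetical.

class ROM:
    def __init__(self, a, b, c, d):
        self.a, self.b, self.c, self.d = a, b, c, d
        self.x = 0.0  # internal state

    def step(self, u):
        y = self.c * self.x + self.d * u   # output at this sample
        self.x = self.a * self.x + self.b * u  # advance the state
        return y

# ROMs linearized at SOC = 20% and SOC = 80% setpoints.
rom_lo = ROM(a=0.95, b=0.1, c=1.0, d=0.01)
rom_hi = ROM(a=0.90, b=0.2, c=1.0, d=0.02)

def blended_output(soc, u):
    """Run both ROMs and linearly blend their outputs by SOC."""
    w = (soc - 0.2) / (0.8 - 0.2)       # interpolation weight in [0, 1]
    w = min(max(w, 0.0), 1.0)
    y_lo, y_hi = rom_lo.step(u), rom_hi.step(u)
    return (1 - w) * y_lo + w * y_hi

y = blended_output(soc=0.5, u=1.0)  # midway between setpoints: equal weights
print(round(y, 4))
```

Blending the outputs of continuously running models (rather than switching models and re-initializing states) keeps the estimate continuous as the cell drifts between setpoints.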
A challenge to lepton universality in B-meson decays
Ciezarek, Gregory; Franco Sevilla, Manuel; Hamilton, Brian; ...
2017-06-07
One of the key assumptions of the standard model of particle physics is that the interactions of the charged leptons, namely electrons, muons and taus, differ only because of their different masses. Whereas precision tests comparing processes involving electrons and muons have not revealed any definite violation of this assumption, recent studies of B-meson decays involving the higher-mass tau lepton have resulted in observations that challenge lepton universality at the level of four standard deviations. A confirmation of these results would point to new particles or interactions, and could have profound implications for our understanding of particle physics.
Reinforcing loose foundation stones in trait-based plant ecology.
Shipley, Bill; De Bello, Francesco; Cornelissen, J Hans C; Laliberté, Etienne; Laughlin, Daniel C; Reich, Peter B
2016-04-01
The promise of "trait-based" plant ecology is one of generalized prediction across organizational and spatial scales, independent of taxonomy. This promise is a major reason for the increased popularity of this approach. Here, we argue that some important foundational assumptions of trait-based ecology have not received sufficient empirical evaluation. We identify three such assumptions and, where possible, suggest methods of improvement: (i) traits are functional to the degree that they determine individual fitness, (ii) intraspecific variation in functional traits can be largely ignored, and (iii) functional traits show general predictive relationships to measurable environmental gradients.
Mielenz, Thelma J; Callahan, Leigh F; Edwards, Michael C
2016-03-12
Examine the feasibility of performing an item response theory (IRT) analysis on two of the Centers for Disease Control and Prevention health-related quality of life (CDC HRQOL) modules - the 4-item Healthy Days Core Module (HDCM) and the 5-item Healthy Days Symptoms Module (HDSM). Previous principal components analyses confirm that the two scales both assess a mix of mental (CDC-MH) and physical health (CDC-PH). The purpose is to conduct IRT analysis on the CDC-MH and CDC-PH scales separately. 2182 patients with self-reported or physician-diagnosed arthritis completed a cross-sectional survey including HDCM and HDSM items. Besides global health, the other 8 items ask the number of days that some statement was true; we chose to recode the data into 8 categories based on observed clustering. The IRT assumptions were assessed using confirmatory factor analysis, and the data could be modeled using a unidimensional IRT model. The graded response model was used for IRT analyses, and the CDC-MH and CDC-PH scales were analyzed separately in flexMIRT. The IRT parameter estimates for the five-item CDC-PH all appeared reasonable. The three-item CDC-MH did not have reasonable parameter estimates. The CDC-PH scale is amenable to IRT analysis but the existing CDC-MH scale is not. We suggest either using the HDCM and HDSM as they currently stand or the CDC-PH scale alone if the primary goal is to measure physical-health-related HRQOL.
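For reference, the graded response model used in that analysis assigns each ordered response category a probability given by the difference of adjacent logistic "at or above" curves in the latent trait. The sketch below uses invented discrimination and threshold parameters, not the fitted flexMIRT estimates.

```python
import math

# Sketch of category probabilities under the graded response model, the IRT
# model used for ordinal items such as the recoded "healthy days" responses.
# P(X >= k | theta) is a logistic function of the latent trait theta, and
# category probabilities are differences of adjacent curves. Discrimination
# (a) and thresholds below are hypothetical, not fitted values.

def p_at_or_above(theta, a, b):
    """P(X >= k | theta) for a category threshold b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def category_probs(theta, a, thresholds):
    """Probabilities of each of len(thresholds) + 1 ordered categories."""
    cum = [1.0] + [p_at_or_above(theta, a, b) for b in thresholds] + [0.0]
    return [cum[k] - cum[k + 1] for k in range(len(thresholds) + 1)]

# Four ordered categories defined by three increasing thresholds.
probs = category_probs(theta=0.5, a=1.8, thresholds=[-1.0, 0.0, 1.2])
print([round(p, 3) for p in probs])  # the four probabilities sum to 1
```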
Using Order of Magnitude Calculations to Extend Student Comprehension of Laboratory Data
ERIC Educational Resources Information Center
Dean, Rob L.
2015-01-01
Author Rob Dean previously published an Illuminations article concerning "challenge" questions that encourage students to think imaginatively with approximate quantities, reasonable assumptions, and uncertain information. This article has promoted some interesting discussion, which has prompted him to present further examples. Examples…
ERIC Educational Resources Information Center
Resnick, Lauren B.; And Others
This paper discusses a radically different set of assumptions to improve educational outcomes for disadvantaged students. It is argued that disadvantaged children, when exposed to carefully organized thinking-oriented instruction, can acquire the traditional basic skills in the process of reasoning and solving problems. The paper is presented in…
Multi-layered reasoning by means of conceptual fuzzy sets
NASA Technical Reports Server (NTRS)
Takagi, Tomohiro; Imura, Atsushi; Ushida, Hirohide; Yamaguchi, Toru
1993-01-01
The real world consists of a very large number of instances of events and continuous numeric values. On the other hand, people represent and process their knowledge in terms of abstracted concepts derived from generalization of these instances and numeric values. Logic based paradigms for knowledge representation use symbolic processing both for concept representation and inference. Their underlying assumption is that a concept can be defined precisely. However, as this assumption hardly holds for natural concepts, it follows that symbolic processing cannot deal with such concepts. Thus symbolic processing has essential problems from a practical point of view of applications in the real world. In contrast, fuzzy set theory can be viewed as a stronger and more practical notation than formal, logic based theories because it supports both symbolic processing and numeric processing, connecting the logic based world and the real world. In this paper, we propose multi-layered reasoning by using conceptual fuzzy sets (CFS). The general characteristics of CFS are discussed along with upper layer supervision and context dependent processing.
Bekiari, Alexandra; Kokaridas, Dimitrios; Sakellariou, Kimon
2006-04-01
This study examined associations among physical education teachers' verbal aggressiveness as perceived by students and students' intrinsic motivation and reasons for discipline. The sample consisted of 265 Greek adolescent students who completed four questionnaires, the Verbal Aggressiveness Scale, the Lesson Satisfaction Scale, the Reasons for Discipline Scale, and the Intrinsic Motivation Inventory, during physical education classes. Analysis indicated significant positive correlations among students' perceptions of teachers' verbal aggressiveness with pressure/tension, external reasons, introjected reasons, no reasons, and self-responsibility. Significant negative correlations were noted for students' perceptions of teachers' verbal aggression with lesson satisfaction, enjoyment/interest, competence, effort/importance, intrinsic reasons, and caring. Differences between the two sexes were observed in their perceptions of teachers' verbal aggressiveness, intrinsic motivation, and reasons for discipline. Findings and implications for teachers' type of communication are discussed and suggestions for research made.
Cosmological texture is incompatible with Planck-scale physics
NASA Technical Reports Server (NTRS)
Holman, Richard; Hsu, Stephen D. H.; Kolb, Edward W.; Watkins, Richard; Widrow, Lawrence M.
1992-01-01
Nambu-Goldstone modes are sensitive to the effects of physics at energies comparable to the scale of spontaneous symmetry breaking. We show that as a consequence of this the global texture proposal for structure formation requires rather severe assumptions about the nature of physics at the Planck scale.
Estimating trends in the global mean temperature record
NASA Astrophysics Data System (ADS)
Poppick, Andrew; Moyer, Elisabeth J.; Stein, Michael L.
2017-06-01
Given uncertainties in physical theory and numerical climate simulations, the historical temperature record is often used as a source of empirical information about climate change. Many historical trend analyses appear to de-emphasize physical and statistical assumptions: examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for internal variability in nonparametric rather than parametric ways. However, given a limited data record and the presence of internal variability, estimating radiatively forced temperature trends in the historical record necessarily requires some assumptions. Ostensibly empirical methods can also involve an inherent conflict in assumptions: they require data records that are short enough for naive trend models to be applicable, but long enough for long-timescale internal variability to be accounted for. In the context of global mean temperatures, empirical methods that appear to de-emphasize assumptions can therefore produce misleading inferences, because the trend over the twentieth century is complex and the scale of temporal correlation is long relative to the length of the data record. We illustrate here how a simple but physically motivated trend model can provide better-fitting and more broadly applicable trend estimates and can allow for a wider array of questions to be addressed. In particular, the model allows one to distinguish, within a single statistical framework, between uncertainties in the shorter-term vs. longer-term response to radiative forcing, with implications not only for historical trends but also for uncertainties in future projections. We also investigate how the choice of a statistical description of internal variability affects inferred uncertainties.
While nonparametric methods may seem to avoid making explicit assumptions, we demonstrate how even misspecified parametric statistical methods, if attuned to the important characteristics of internal variability, can result in more accurate uncertainty statements about trends.
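The contrast the abstract draws between time-based and forcing-based covariates can be illustrated with a small synthetic experiment. This is our own sketch, not the authors' code: the forcing curve, climate sensitivity, and AR(1) noise parameters below are illustrative assumptions chosen only to show why a physically motivated covariate can outperform a naive linear-in-time trend.

```python
# Sketch (illustrative assumptions, not the paper's model): compare a trend regressed
# on time with one regressed on radiative forcing, on synthetic data whose forced
# response is nonlinear in time and whose internal variability is AR(1).
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2001)
forcing = 0.0004 * (years - 1900) ** 2   # assumed accelerating forcing (W/m^2)
sensitivity = 0.5                        # assumed response, K per (W/m^2)

# AR(1) noise standing in for internal variability
phi, eps = 0.6, rng.normal(0.0, 0.1, years.size)
noise = np.zeros(years.size)
for t in range(1, years.size):
    noise[t] = phi * noise[t - 1] + eps[t]

temp = sensitivity * forcing + noise     # synthetic global mean temperature anomaly

def r2(x, y):
    """R^2 of an ordinary least-squares fit of y on [1, x]."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

print("R^2, time covariate:   ", round(r2(years.astype(float), temp), 3))
print("R^2, forcing covariate:", round(r2(forcing, temp), 3))
```

Because the forced response here is nonlinear in time but linear in forcing, the forcing covariate fits better; a linear-in-time model misattributes the curvature to noise.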
Validity of the mockwitness paradigm: testing the assumptions.
McQuiston, Dawn E; Malpass, Roy S
2002-08-01
Mockwitness identifications are used to provide a quantitative measure of lineup fairness. Some theoretical and practical assumptions of this paradigm have not been studied in terms of mockwitnesses' decision processes and procedural variation (e.g., instructions, lineup presentation method), and the current experiment was conducted to empirically evaluate these assumptions. Four hundred and eighty mockwitnesses were given physical information about a culprit, received 1 of 4 variations of lineup instructions, and were asked to identify the culprit from either a fair or unfair sequential lineup containing 1 of 2 targets. Lineup bias estimates varied as a result of lineup fairness and the target presented. Mockwitnesses generally reported that the target's physical description was their main source of identifying information. Our findings support the use of mockwitness identifications as a useful technique for sequential lineup evaluation, but only for mockwitnesses who selected only 1 lineup member. Recommendations for the use of this evaluation procedure are discussed.
Extended physics as a theoretical framework for systems biology?
Miquel, Paul-Antoine
2011-08-01
In this essay we examine whether a theoretical and conceptual framework for systems biology could be built from the Bailly and Longo (2008, 2009) proposal. These authors aim to understand life as a coherent critical structure, and propose to develop an extended physical approach of evolution, as a diffusion of biomass in a space of complexity. Their attempt leads to a simple mathematical reconstruction of Gould's assumption (1989) concerning the bacterial world as a "left wall of least complexity" that we will examine. Extended physical systems are characterized by their constructive properties. Time is acting and new properties emerge by their history that can open the list of their initial properties. This conceptual and theoretical framework is nothing more than a philosophical assumption, but as such it provides a new and exciting approach concerning the evolution of life, and the transition between physics and biology. Copyright © 2011 Elsevier Ltd. All rights reserved.
Faith and reason and physician-assisted suicide.
Kaczor, Christopher
1998-08-01
Aquinas's conception of the relationship of faith and reason calls into question the arguments and some of the conclusions advanced in contributions to the debate on physician-assisted suicide by David Thomasma and H. Tristram Engelhardt. An understanding of the nature of theology as based on revelation calls into question Thomasma's theological argument in favor of physician-assisted suicide based on the example of Christ and the martyrs. On the other hand, unaided reason calls into question his assumptions about the nature of death as in some cases a good for the human person. Finally, if Aquinas is right about the relationship of faith and reason, Engelhardt's sharp contrast between "Christian" and "secular" approaches to physician-assisted suicide needs reconsideration, although his conclusions about physician-assisted suicide would find support.
Hennessy, Michael; Bleakley, Amy; Ellithorpe, Morgan
2018-03-01
The reasoned action approach is one of the most successful behavioral theories in the history of social psychology. This study outlines the theoretical principles of reasoned action and considers when it is appropriate to augment it with a new variable. To demonstrate, we use survey data collected from 14- to 17-year-old U.S. adolescents to test how the 'prototype' variables fit into the reasoned action approach. Through confirmatory factor analysis, we find that the prototype measures are normative pressure measures and that, when treated as a separate theoretical construct, prototype identity is not completely mediated by the proximal predictors of behavioral intention. We discuss the assumptions of the two theories and finally consider the distinction between augmenting a specific theory versus combining measures derived from different theoretical perspectives.
Uncertain deduction and conditional reasoning.
Evans, Jonathan St B T; Thompson, Valerie A; Over, David E
2015-01-01
There has been a paradigm shift in the psychology of deductive reasoning. Many researchers no longer think it is appropriate to ask people to assume premises and decide what necessarily follows, with the results evaluated by binary extensional logic. Most everyday and scientific inference is made from more or less confidently held beliefs and not assumptions, and the relevant normative standard is Bayesian probability theory. We argue that the study of "uncertain deduction" should directly ask people to assign probabilities to both premises and conclusions, and report an experiment using this method. We assess this reasoning by two Bayesian metrics: probabilistic validity and coherence according to probability theory. On both measures, participants perform above chance in conditional reasoning, but they do much better when statements are grouped as inferences, rather than evaluated in separate tasks.
NASA Astrophysics Data System (ADS)
Medlyn, B.; Jiang, M.; Zaehle, S.
2017-12-01
There is now ample experimental evidence that the response of terrestrial vegetation to rising atmospheric CO2 concentration is modified by soil nutrient availability. How to represent nutrient cycling processes is thus a key consideration for vegetation models. We have previously used model intercomparison to demonstrate that models incorporating different assumptions predict very different responses at Free-Air CO2 Enrichment experiments. Careful examination of model outputs has provided some insight into the reasons for the different model outcomes, but it is difficult to attribute outcomes to specific assumptions. Here we investigate the impact of individual assumptions in a generic plant carbon-nutrient cycling model. The G'DAY (Generic Decomposition And Yield) model is modified to incorporate alternative hypotheses for nutrient cycling. We analyse the impact of these assumptions in the model using a simple analytical approach known as "two-timing". This analysis identifies the quasi-equilibrium behaviour of the model at the time scales of the component pools. The analysis provides a useful mathematical framework for probing model behaviour and identifying the most critical assumptions for experimental study.
Satellite Power Systems (SPS) space transportation cost analysis and evaluation
NASA Technical Reports Server (NTRS)
1980-01-01
A picture of Satellite Power Systems (SPS) space transportation costs at the present time is given with respect to accuracy as stated, reasonableness of the methods used, assumptions made, and uncertainty associated with the estimates. The approach used consists of examining space transportation costs from several perspectives to perform a variety of sensitivity analyses or reviews and to examine the findings in terms of internal consistency and external comparison with analogous systems. These approaches are summarized as a theoretical and historical review, including a review of the stated and unstated assumptions used to derive the costs, and a performance or technical review. These reviews cover the overall transportation program as well as the individual vehicles proposed. The review of overall cost assumptions is the principal means used for estimating cost uncertainty. The cost estimates used as the best current estimate are included.
The new AP Physics exams: Integrating qualitative and quantitative reasoning
NASA Astrophysics Data System (ADS)
Elby, Andrew
2015-04-01
When physics instructors and education researchers emphasize the importance of integrating qualitative and quantitative reasoning in problem solving, they usually mean using those types of reasoning serially and separately: first students should analyze the physical situation qualitatively/conceptually to figure out the relevant equations, then they should process those equations quantitatively to generate a solution, and finally they should use qualitative reasoning to check that answer for plausibility (Heller, Keith, & Anderson, 1992). The new AP Physics 1 and 2 exams will, of course, reward this approach to problem solving. But one kind of free response question will demand and reward a further integration of qualitative and quantitative reasoning, namely mathematical modeling and sense-making: inventing new equations to capture a physical situation and focusing on proportionalities, inverse proportionalities, and other functional relations to infer what the equation "says" about the physical world. In this talk, I discuss examples of these qualitative-quantitative translation questions, highlighting how they differ from both standard quantitative and standard qualitative questions. I then discuss the kinds of modeling activities that can help AP and college students develop these skills and habits of mind.
26 CFR 1.482-5 - Comparable profits method.
Code of Federal Regulations, 2010 CFR
2010-04-01
... operating profit represents a return for the investment of resources and assumption of risks. Therefore... from a sufficient number of years of data to reasonably measure returns that accrue to uncontrolled... party and uncontrolled comparables include the following— (i) Rate of return on capital employed. The...
ERIC Educational Resources Information Center
Lenartowicz, Marta
2015-01-01
Higher education research frequently refers to the complex external conditions that give our old-fashioned universities a good reason to change. The underlying theoretical assumption of such framing is that organizations are open systems. This paper presents an alternative view, derived from the theory of social systems autopoiesis. It proposes…
Code of Federal Regulations, 2010 CFR
2010-04-01
... of section 414(j), the annual deferral for a taxable year is the present value of the increase during... value must be determined using actuarial assumptions and methods that are reasonable (both individually... paying length of service awards to bona fide volunteers (and their beneficiaries) on account of qualified...
Code of Federal Regulations, 2011 CFR
2011-04-01
... of section 414(j), the annual deferral for a taxable year is the present value of the increase during... value must be determined using actuarial assumptions and methods that are reasonable (both individually... paying length of service awards to bona fide volunteers (and their beneficiaries) on account of qualified...
Code of Federal Regulations, 2013 CFR
2013-04-01
... of section 414(j), the annual deferral for a taxable year is the present value of the increase during... value must be determined using actuarial assumptions and methods that are reasonable (both individually... paying length of service awards to bona fide volunteers (and their beneficiaries) on account of qualified...
Helicopter Toy and Lift Estimation
ERIC Educational Resources Information Center
Shakerin, Said
2013-01-01
A $1 plastic helicopter toy (called a Wacky Whirler) can be used to demonstrate lift. Students can make basic measurements of the toy, use reasonable assumptions and, with the lift formula, estimate the lift, and verify that it is sufficient to overcome the toy's weight. (Contains 1 figure.)
Models for Theory-Based M.A. and Ph.D. Programs.
ERIC Educational Resources Information Center
Botan, Carl; Vasquez, Gabriel
1999-01-01
Presents work accomplished at the 1998 National Communication Association Summer Conference. Outlines reasons for theory-based education in public relations. Presents an integrated model of student outcomes, curriculum, pedagogy, and assessment for theory-based master's and doctoral programs, including assumptions made and rationale for such…
ERIC Educational Resources Information Center
Kryjevskaia, Mila; Stetzer, MacKenzie R.; Grosz, Nathaniel
2014-01-01
We have applied the heuristic-analytic theory of reasoning to interpret inconsistencies in student reasoning approaches to physics problems. This study was motivated by an emerging body of evidence that suggests that student conceptual and reasoning competence demonstrated on one task often fails to be exhibited on another. Indeed, even after…
Influence of ventilation structure on air flow distribution of large turbo-generator
NASA Astrophysics Data System (ADS)
Zhang, Liying; Ding, Shuye; Zhao, Zhijun; Yang, Jingmo
2018-04-01
For the 350 MW air-cooled turbo-generator, the rotor body is ventilated by sub-slots and 94 radial ventilation ducts, and the rotor end draws in air through an arc segment and a straight section. The stator is ventilated with five inlet and eight outlet air branches. In order to analyze the cooling effect of different ventilation schemes, a global physical model including the stator, rotor, casing, and fan is established, and the assumptions and boundary conditions of the solution domain are given. The finite volume method is used to solve the problem, and the air flow distribution characteristics of each part of the motor under different ventilation schemes are obtained. The results show that a baffle at the end of the rotor can eliminate the eddy current there and make the distribution of cooling air more uniform and reasonable. The conclusions can provide a reference for the design of motor ventilation structures.
Ground-based Observations of Large Solar Flares Precursors
NASA Astrophysics Data System (ADS)
Sheyner, Olga; Smirnova, Anna; Snegirev, Sergey
2010-05-01
An important problem of solar-terrestrial physics is the regular forecasting of solar activity phenomena, which negatively influence human health, operating safety, communication, radar sets, and other systems. The possibility of developing a short-term forecasting technique for geoeffective solar flares is presented in this study. The technique is based on the growth of pulsations of the horizontal component of the geomagnetic field before solar proton flares. Long-period (30-60 minute) pulsations of the H-component of the geomagnetic field are detected for events of different intensity on March 22, 1991, November 4, 2001, and November 17, 2001 using wavelet analysis. The amplitudes of fluctuations of the horizontal component of the geomagnetic field with 30-60 minute periods grow at most of the tested stations during the 0.5-3.5 days before the solar flares. The particularities of the spectral components are studied for stations situated at different latitudes. Assumptions about the reasons for the appearance of such precursor fluctuations are made.
ERIC Educational Resources Information Center
Plowman, Sharon Ann
The use of health-related physical fitness tests for sex-fair ability grouping in physical education classes requires the verification of two assumptions: (1) that there exists a direct positive relationship between health-related physical fitness and development and/or improvement of various sport skills; and (2) that there is a physiological…
ERIC Educational Resources Information Center
Spittle, Michael; Jackson, Kevin; Casey, Meghan
2009-01-01
This study explored the reasons people choose physical education teaching as a profession and investigated the relationship of these choices with motivation. Physical education pre-service teachers (n = 324) completed the Academic Motivation Scale (AMS) and a measure of reasons for choosing physical education teaching. Confident interpersonal…
Jones, S.; Hirschi, R.; Pignatari, M.; ...
2015-01-15
We present a comparison of 15 M⊙, 20 M⊙ and 25 M⊙ stellar models from three different codes (GENEC, KEPLER and MESA) and their nucleosynthetic yields. The models are calculated from the main sequence up to the pre-supernova (pre-SN) stage and do not include rotation. The GENEC and KEPLER models hold physics assumptions that are characteristic of the two codes. The MESA code is generally more flexible; overshooting of the convective core during the hydrogen and helium burning phases in MESA is chosen such that the CO core masses are consistent with those in the GENEC models. Full nucleosynthesis calculations are performed for all models using the NuGrid post-processing tool MPPNP, and the key energy-generating nuclear reaction rates are the same for all codes. We are thus able to highlight the key differences between the models that are caused by the contrasting physics assumptions and numerical implementations of the three codes. A reasonable agreement is found between the surface abundances predicted by the models computed using the different codes, with GENEC exhibiting the strongest enrichment of H-burning products and KEPLER exhibiting the weakest. There are large variations in both the structure and composition of the models (the 15 M⊙ and 20 M⊙ models in particular) at the pre-SN stage from code to code, caused primarily by convective shell merging during the advanced stages. For example, the C-shell abundances of O, Ne and Mg predicted by the three codes span one order of magnitude in the 15 M⊙ models. For the alpha elements between Si and Fe the differences are even larger. The s-process abundances in the C shell are modified by the merging of convective shells; the modification is strongest in the 15 M⊙ model, in which the C-shell material is exposed to O-burning temperatures and the γ-process is activated.
The variation in the s-process abundances across the codes is smallest in the 25 M⊙ models, where it is comparable to the impact of nuclear reaction rate uncertainties. In general the differences in the results from the three codes are due to their contrasting physics assumptions (e.g., prescriptions for mass loss and convection). The broadly similar evolution of the 25 M⊙ models gives us reassurance that different stellar evolution codes do produce similar results. For the 15 M⊙ and 20 M⊙ models, however, the different input physics and the interplay between the various convective zones lead to important differences in both the pre-supernova structure and nucleosynthesis predicted by the three codes. For the KEPLER models the core masses are different and therefore an exact match could not be expected.
Expert Systems--The New International Language of Business.
ERIC Educational Resources Information Center
Sondak, Norman E.; And Others
A discussion of expert systems, computer programs designed to simulate human reasoning and expertise, begins with the assumption that few business educators understand the impact that expert systems will have on international business. The fundamental principles of the design and development of expert systems in business are outlined, with special…
A Critical View of Home Education
ERIC Educational Resources Information Center
Lubienski, Chris
2003-01-01
The remarkable spread of home education needs to be considered in light of the arguments driving its growth. While acknowledging that there are many good reasons for individuals to choose home education, this analysis examines some of the most prominent assumptions and claims that advance the practice as a mass movement. Specifically, arguments…
7 CFR 4287.134 - Transfer and assumption.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Agriculture Regulations of the Department of Agriculture (Continued) RURAL BUSINESS-COOPERATIVE SERVICE AND... from liability. Any new loan terms must be within the terms authorized by 4279.126 of subpart B of part... explanation of the reasons for the proposed change in loan terms. (c) Release of liability. The transferor...
Play and Self-Regulation: Lessons from Vygotsky
ERIC Educational Resources Information Center
Bodrova, Elena; Germeroth, Carrie; Leong, Deborah J.
2013-01-01
The authors consider the analysis of the literature on play research by Lillard and others in the January 2013 "Psychological Bulletin," an analysis that questioned the prevailing assumption of a causal relationship between play and child development, especially in the areas of creativity, reasoning, executive function, and regulation of…
48 CFR 252.235-7001 - Indemnification under 10 U.S.C. 2354-cost reimbursement.
Code of Federal Regulations, 2010 CFR
2010-10-01
... System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CLAUSES AND FORMS SOLICITATION... Department of the Navy, the Department) specifically approved the assumption of liability; and (6) Must be certified as just and reasonable by the Secretary of the department or designated representative. (d) A...
48 CFR 252.235-7000 - Indemnification under 10 U.S.C. 2354-fixed price.
Code of Federal Regulations, 2012 CFR
2012-10-01
... DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CLAUSES AND FORMS SOLICITATION PROVISIONS... Department of the Navy, the Department) specifically approved the assumption of liability; and (6) Must be certified as just and reasonable by the Secretary of the department or designated representative. (d) The...
48 CFR 252.235-7001 - Indemnification under 10 U.S.C. 2354-cost reimbursement.
Code of Federal Regulations, 2014 CFR
2014-10-01
... System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CLAUSES AND FORMS SOLICITATION... Department of the Navy, the Department) specifically approved the assumption of liability; and (6) Must be certified as just and reasonable by the Secretary of the department or designated representative. (d) A...
48 CFR 252.235-7000 - Indemnification under 10 U.S.C. 2354-fixed price.
Code of Federal Regulations, 2010 CFR
2010-10-01
... DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CLAUSES AND FORMS SOLICITATION PROVISIONS... Department of the Navy, the Department) specifically approved the assumption of liability; and (6) Must be certified as just and reasonable by the Secretary of the department or designated representative. (d) The...
48 CFR 252.235-7001 - Indemnification under 10 U.S.C. 2354-cost reimbursement.
Code of Federal Regulations, 2011 CFR
2011-10-01
... System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CLAUSES AND FORMS SOLICITATION... Department of the Navy, the Department) specifically approved the assumption of liability; and (6) Must be certified as just and reasonable by the Secretary of the department or designated representative. (d) A...
48 CFR 252.235-7000 - Indemnification under 10 U.S.C. 2354-fixed price.
Code of Federal Regulations, 2013 CFR
2013-10-01
... DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CLAUSES AND FORMS SOLICITATION PROVISIONS... Department of the Navy, the Department) specifically approved the assumption of liability; and (6) Must be certified as just and reasonable by the Secretary of the department or designated representative. (d) The...
48 CFR 252.235-7000 - Indemnification under 10 U.S.C. 2354-fixed price.
Code of Federal Regulations, 2014 CFR
2014-10-01
... DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CLAUSES AND FORMS SOLICITATION PROVISIONS... Department of the Navy, the Department) specifically approved the assumption of liability; and (6) Must be certified as just and reasonable by the Secretary of the department or designated representative. (d) The...
48 CFR 252.235-7001 - Indemnification under 10 U.S.C. 2354-cost reimbursement.
Code of Federal Regulations, 2013 CFR
2013-10-01
... System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CLAUSES AND FORMS SOLICITATION... Department of the Navy, the Department) specifically approved the assumption of liability; and (6) Must be certified as just and reasonable by the Secretary of the department or designated representative. (d) A...
48 CFR 252.235-7000 - Indemnification under 10 U.S.C. 2354-fixed price.
Code of Federal Regulations, 2011 CFR
2011-10-01
... DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CLAUSES AND FORMS SOLICITATION PROVISIONS... Department of the Navy, the Department) specifically approved the assumption of liability; and (6) Must be certified as just and reasonable by the Secretary of the department or designated representative. (d) The...
48 CFR 252.235-7001 - Indemnification under 10 U.S.C. 2354-cost reimbursement.
Code of Federal Regulations, 2012 CFR
2012-10-01
... System DEFENSE ACQUISITION REGULATIONS SYSTEM, DEPARTMENT OF DEFENSE CLAUSES AND FORMS SOLICITATION... Department of the Navy, the Department) specifically approved the assumption of liability; and (6) Must be certified as just and reasonable by the Secretary of the department or designated representative. (d) A...
Sensory Impairments and Autism: A Re-Examination of Causal Modelling
ERIC Educational Resources Information Center
Gerrard, Sue; Rugg, Gordon
2009-01-01
Sensory impairments are widely reported in autism, but remain largely unexplained by existing models. This article examines Kanner's causal reasoning and identifies unsupported assumptions implicit in later empirical work. Our analysis supports a heterogeneous causal model for autistic characteristics. We propose that the development of a…
Testimony and Interrogation of Minors: Assumptions about Maturity and Morality
ERIC Educational Resources Information Center
Owen-Kostelnik, Jessica; Reppucci, N. Dickon; Meyer, Jessica R.
2006-01-01
This article examines the legal histories and social contexts of testimony and interrogation involving minors, developmental research on suggestibility and judgment, interactions between development and legal/sociological contexts, and the reasoning behind how minors are treated in different legal contexts. The authors argue (a) that young…
Teaching Ethics in International Business Courses: The Impacts of Religions
ERIC Educational Resources Information Center
Ruhe, John; Lee, Monle
2008-01-01
Implicit in most comparative ethical studies is the assumption that cultural and religious differences between countries are the major reasons behind the variations in ethical beliefs and business practice across nations. This article examines research on the international ethical issues and the common moral concerns that permeate differing…
42 CFR 403.258 - Statement of actuarial opinion.
Code of Federal Regulations, 2011 CFR
2011-10-01
... actuarial opinion means a signed declaration in which a qualified actuary states that the assumptions used... policy experience, if any, and reasonable expectations. (b) Qualified actuary means— (1) A member in good standing of the American Academy of Actuaries; or (2) A person who has otherwise demonstrated his or her...
42 CFR 403.258 - Statement of actuarial opinion.
Code of Federal Regulations, 2012 CFR
2012-10-01
... actuarial opinion means a signed declaration in which a qualified actuary states that the assumptions used... policy experience, if any, and reasonable expectations. (b) Qualified actuary means— (1) A member in good standing of the American Academy of Actuaries; or (2) A person who has otherwise demonstrated his or her...
42 CFR 403.258 - Statement of actuarial opinion.
Code of Federal Regulations, 2013 CFR
2013-10-01
... actuarial opinion means a signed declaration in which a qualified actuary states that the assumptions used... policy experience, if any, and reasonable expectations. (b) Qualified actuary means— (1) A member in good standing of the American Academy of Actuaries; or (2) A person who has otherwise demonstrated his or her...
42 CFR 403.258 - Statement of actuarial opinion.
Code of Federal Regulations, 2014 CFR
2014-10-01
... actuarial opinion means a signed declaration in which a qualified actuary states that the assumptions used... policy experience, if any, and reasonable expectations. (b) Qualified actuary means— (1) A member in good standing of the American Academy of Actuaries; or (2) A person who has otherwise demonstrated his or her...
29 CFR 4044.53 - Mortality assumptions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... retirement benefit for any reason other than a change in the participant's health status. (2) Non-Social... from Table 4 of Appendix A to this part. (d) Social Security disabled lives. If the individual is Social Security disabled under paragraph (f)(1) of this section, the plan administrator will value the...
Cognitive Integrity Predicts Transitive Inference Performance Bias and Success
ERIC Educational Resources Information Center
Moses, Sandra N.; Villate, Christina; Binns, Malcolm A.; Davidson, Patrick S. R.; Ryan, Jennifer D.
2008-01-01
Transitive inference has traditionally been regarded as a relational proposition-based reasoning task; however, recent investigations question the validity of this assumption. Although some results support the use of a relational proposition-based approach, other studies find evidence for the use of associative learning. We examined whether…
42 CFR 403.258 - Statement of actuarial opinion.
Code of Federal Regulations, 2010 CFR
2010-10-01
... actuarial opinion means a signed declaration in which a qualified actuary states that the assumptions used... policy experience, if any, and reasonable expectations. (b) Qualified actuary means— (1) A member in good standing of the American Academy of Actuaries; or (2) A person who has otherwise demonstrated his or her...
ERIC Educational Resources Information Center
Wyver, Shirley; Engelen, Lina; Bundy, Anita; Naughton, Geraldine
2012-01-01
An assumption made when designing recess interventions in schools is that there is a clear demarcation between eating time and play time. We used observational data conducted as part of the Sydney Playground Project to test if this assumption was correct. The Sydney Playground Project is a cluster randomised controlled trial of a recess…
Ghosh, Avijit; Scott, Dennis O; Maurer, Tristan S
2014-02-14
In this work, we provide a unified theoretical framework describing how drug molecules can permeate across membranes in neutral and ionized forms for unstirred in vitro systems. The analysis provides a self-consistent basis for the origin of the unstirred water layer (UWL) within the Nernst-Planck framework in the fully unstirred limit and further provides an accounting mechanism based simply on the bulk aqueous solvent diffusion constant of the drug molecule. Our framework makes no new assumptions about the underlying physics of molecular permeation. We hold simply that Nernst-Planck is a reasonable approximation at low concentrations and that all physical systems must conserve mass. The applicability of the derived framework has been examined with respect to the effects of both stirring and externally applied voltages on measured permeability. The analysis contains data for 9 compounds extracted from the literature representing a range of permeabilities and aqueous diffusion coefficients. Applicability with respect to ionized permeation is examined using literature data for the permanently charged cation, crystal violet, suggesting that the underlying mechanism for ionized permeation of this molecule is mobile counter-current flow. Copyright © 2013 Elsevier B.V. All rights reserved.
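The series-resistance picture behind the UWL result can be sketched numerically. Under the stated assumptions (Nernst-Planck at low concentration, mass conservation), a neutral solute crosses the unstirred layer and the membrane in series, so their permeability resistances add. The function and all values below are illustrative, not taken from the paper.

```python
# Effective permeability of a membrane in series with an unstirred water
# layer (UWL): resistances add, so 1/P_eff = 1/P_m + h/D_aq.
# Illustrative numbers, not values from the paper.

def effective_permeability(p_membrane, uwl_thickness, d_aqueous):
    """All quantities in consistent units, e.g. cm/s, cm, cm^2/s."""
    p_uwl = d_aqueous / uwl_thickness        # permeability of the UWL alone
    return 1.0 / (1.0 / p_membrane + 1.0 / p_uwl)

# A highly permeable drug becomes UWL-limited in the fully unstirred case:
p_fast = effective_permeability(p_membrane=1e-3,     # cm/s
                                uwl_thickness=0.05,  # cm
                                d_aqueous=5e-6)      # cm^2/s (bulk aqueous)
# p_fast approaches the UWL ceiling D_aq/h = 1e-4 cm/s
```

Stirring shrinks the effective thickness h and raises the D_aq/h ceiling, which is why measured permeability of fast permeants depends on stirring.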
Motivation in caring labor: Implications for the well-being and employment outcomes of nurses.
Dill, Janette; Erickson, Rebecca J; Diefendorff, James M
2016-10-01
For nurses and other caregivers there is a strong emphasis on prosocial forms of motivation, or doing the job because you want to help others, even in formal, institutionalized care settings. This emphasis is based in gendered assumptions that altruistic motivations are the "right" reasons for being a nurse and lead to the best outcomes for workers and patients. Other motivations for pursuing care work, particularly extrinsic motivation, depart from the prosocial model of care and may be indicative of substandard outcomes, but little research has examined variation in care workers' motivations for doing their jobs. In this study, we use survey data collected from 730 acute care hospital nurses working within one health care system in the Midwestern United States to examine whether different sources of motivation for being a nurse are related to nurse job burnout, negative physical symptoms, and turnover intentions. Our findings suggest that nurses who have high intrinsic and extrinsic motivation actually have better perceived health and employment outcomes (i.e., less likely to say that they will leave, lower burnout, fewer negative physical symptoms) than those with high prosocial motivation, who are more likely to report job burnout. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Shogin, Dmitry; Amund Amundsen, Per
2016-10-01
We test the physical relevance of the full and the truncated versions of the Israel-Stewart (IS) theory of irreversible thermodynamics in a cosmological setting. Using a dynamical systems method, we determine the asymptotic future of plane symmetric Bianchi type I spacetimes with a viscous mathematical fluid, keeping track of the magnitude of the relative dissipative fluxes, which determines the applicability of the IS theory. We consider the situations where the dissipative mechanisms of shear and bulk viscosity are involved separately and simultaneously. It is demonstrated that the only case in the given model when the fluid asymptotically approaches local thermal equilibrium, and the underlying assumptions of the IS theory are therefore not violated, is that of a dissipative fluid with vanishing bulk viscosity. The truncated IS equations for shear viscosity are found to produce solutions which manifest pathological dynamical features and, in addition, to be strongly sensitive to the choice of initial conditions. Since these features are observed already in the case of an oversimplified mathematical fluid model, we have no reason to assume that the truncation of the IS transport equations will produce relevant results for physically more realistic fluids. The possible role of bulk and shear viscosity in cosmological evolution is also discussed.
How Galileo and Kepler Countered Aristotle's Cosmological Errors
NASA Astrophysics Data System (ADS)
Gingerich, O.
2009-08-01
Aristotle made two major common sense assumptions that ultimately had to be refuted to open the way to modern science. One was the dichotomy between celestial and terrestrial. The other was the separation of astronomy from physics. Galileo, particularly with his examination of the moon in the Sidereus nuncius, was a pioneer in destroying the first assumption, while Kepler, whose Astronomia nova was subtitled ``based on causes, or celestial physics,'' broke the stranglehold of the second. The importance of these fundamental contributions toward establishing the nature of modern science, which paved the way for Isaac Newton, is often overshadowed by their more specific contributions in optics or mechanics.
Chameleon Effect, the Range of Values Hypothesis and Reproducing the EPR-Bohm Correlations
NASA Astrophysics Data System (ADS)
Accardi, Luigi; Khrennikov, Andrei
2007-02-01
We present a detailed analysis of the assumptions that J. Bell used to show that local realism contradicts QM. We find that Bell's viewpoint on realism is nonphysical, because it implicitly assumes that observed physical variables coincide with ontic variables (i.e., the values of these variables before measurement). The real physical process of measurement is a process of dynamical interaction between a system and a measurement device. Therefore one should check the adequacy of QM not against "Bell's realism," but against adaptive realism (chameleon realism). Dropping Bell's assumption, we are able to construct a natural representation of the EPR-Bohm correlations in the local (adaptive) realistic approach.
Cabin Environment Physics Risk Model
NASA Technical Reports Server (NTRS)
Mattenberger, Christopher J.; Mathias, Donovan Leigh
2014-01-01
This paper presents a Cabin Environment Physics Risk (CEPR) model that predicts the time for an initial failure of Environmental Control and Life Support System (ECLSS) functionality to propagate into a hazardous environment and trigger a loss-of-crew (LOC) event. This physics-of-failure model allows a probabilistic risk assessment of a crewed spacecraft to account for the cabin environment, which can serve as a buffer to protect the crew during an abort from orbit and ultimately enable a safe return. The results of the CEPR model replace the assumption that failure of crew-critical ECLSS functionality causes LOC instantly, and provide a more accurate representation of the spacecraft's risk posture. The instant-LOC assumption is shown to be excessively conservative and, moreover, can impact the relative risk drivers identified for the spacecraft. This, in turn, could lead the design team to allocate mass for equipment to reduce overly conservative risk estimates in a suboptimal configuration, which inherently increases the overall risk to the crew. For example, available mass could be poorly used to add redundant ECLSS components that have a negligible benefit but appear to make the vehicle safer due to poor assumptions about the propagation time of ECLSS failures.
Quantum Nonlocality and Reality
NASA Astrophysics Data System (ADS)
Bell, Mary; Gao, Shan
2016-09-01
Preface; Part I. John Stewart Bell: The Physicist: 1. John Bell: the Irish connection Andrew Whitaker; 2. Recollections of John Bell Michael Nauenberg; 3. John Bell: recollections of a great scientist and a great man Gian-Carlo Ghirardi; Part II. Bell's Theorem: 4. What did Bell really prove? Jean Bricmont; 5. The assumptions of Bell's proof Roderich Tumulka; 6. Bell on Bell's theorem: the changing face of nonlocality Harvey R. Brown and Christopher G. Timpson; 7. Experimental tests of Bell inequalities Marco Genovese; 8. Bell's theorem without inequalities: on the inception and scope of the GHZ theorem Olival Freire, Jr and Osvaldo Pessoa, Jr; 9. Strengthening Bell's theorem: removing the hidden-variable assumption Henry P. Stapp; Part III. Nonlocality: Illusions or Reality?: 10. Is any theory compatible with the quantum predictions necessarily nonlocal? Bernard d'Espagnat; 11. Local causality, probability and explanation Richard A. Healey; 12. Bell inequality and many-worlds interpretation Lev Vaidman; 13. Quantum solipsism and non-locality Travis Norsen; 14. Lessons of Bell's theorem: nonlocality, yes; action at a distance, not necessarily Wayne C. Myrvold; 15. Bell non-locality, Hardy's paradox and hyperplane dependence Gordon N. Fleming; 16. Some thoughts on quantum nonlocality and its apparent incompatibility with relativity Shan Gao; 17. A reasonable thing that just might work Daniel Rohrlich; 18. Weak values and quantum nonlocality Yakir Aharonov and Eliahu Cohen; Part IV. Nonlocal Realistic Theories: 19. Local beables and the foundations of physics Tim Maudlin; 20. John Bell's varying interpretations of quantum mechanics: memories and comments H. Dieter Zeh; 21. Some personal reflections on quantum non-locality and the contributions of John Bell Basil J. Hiley; 22. Bell on Bohm Sheldon Goldstein; 23. Interactions and inequality Philip Pearle; 24. Gravitation and the noise needed in objective reduction models Stephen L. Adler; 25. 
Towards an objective physics of Bell non-locality: palatial twistor theory Roger Penrose; 26. Measurement and macroscopicity: overcoming conceptual imprecision in quantum measurement theory Gregg Jaeger; Index.
Hartman, Jorine E; ten Hacken, Nick H T; Boezen, H Marike; de Greef, Mathieu H G
2013-06-01
What are the perceived reasons for people with chronic obstructive pulmonary disease (COPD) to be physically active or sedentary? Are those reasons related to the actual measured level of physical activity? A mixed-methods study combining qualitative and quantitative approaches. People with mild to very severe COPD. Participants underwent a semi-structured interview and physical activity was measured by a triaxial accelerometer worn for one week. Of 118 enrolled, 115 participants (68% male, mean age 65 years, mean FEV1 57% predicted, mean modified Medical Research Council dyspnoea score 1.4) completed the study. The most frequently reported reason to be physically active was health benefits, followed by enjoyment, continuation of an active lifestyle from the past, and functional reasons. The most frequently reported reason to be sedentary was the weather, followed by health problems, and lack of intrinsic motivation. Mean steps per day ranged between 236 and 18 433 steps. A high physical activity level was related to enjoyment and self-efficacy for physical activity. A low physical activity level was related to the weather influencing health, financial constraints, health and shame. We identified important facilitators to being physically active and barriers that could be amenable to change. Furthermore, we distinguished three important potential strategies for increasing physical activity in sedentary people with COPD, namely reducing barriers and increasing insight into health benefits, tailoring type of activity, and improvement of self-efficacy. Copyright © 2013 Australian Physiotherapy Association. All rights reserved.
Marom, Gil; Bluestein, Danny
2016-01-01
This paper evaluated the influence of various numerical implementation assumptions on predicting blood damage in cardiovascular devices using Lagrangian methods with Eulerian computational fluid dynamics. The implementation assumptions that were tested included various seeding patterns, a stochastic walk model, and simplified trajectory calculations with pathlines. Post-processing implementation options that were evaluated included single-passage and repeated-passage stress accumulation and time averaging. This study demonstrated that the implementation assumptions can significantly affect the resulting stress accumulation, i.e., the blood damage model predictions. Careful consideration should be given to the use of Lagrangian models. Ultimately, the appropriate assumptions should be chosen based on the physics of the specific case, and sensitivity analyses, similar to the ones presented here, should be employed.
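A minimal example of the kind of Lagrangian post-processing at issue: a power-law stress-accumulation damage index summed along one trajectory, with a single passage compared against repeated passages. The constants and the specific linearized functional form are illustrative assumptions, not the paper's model.

```python
# Linearized power-law blood damage index accumulated along a pathline:
# D = C * sum(tau_i**alpha * dt_i**beta). C, alpha, beta are illustrative.

def damage_index(tau, dt, C=1e-5, alpha=2.0, beta=1.0):
    """tau: scalar stress [Pa] at each step; dt: step durations [s]."""
    return C * sum(t ** alpha * d ** beta for t, d in zip(tau, dt))

single = damage_index([10.0, 50.0, 20.0], [0.01, 0.01, 0.01])
# Repeated passages accumulate linearly under this post-processing choice:
triple = damage_index([10.0, 50.0, 20.0] * 3, [0.01] * 9)
```

Other post-processing choices (time averaging, per-passage resets) would change the result for the same flow field, which is exactly the sensitivity the study highlights.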
ERIC Educational Resources Information Center
Spittle, Sharna; Spittle, Michael
2014-01-01
This study explored the reasons for pre-service teachers choosing to specialise in primary physical education and how these choices related to their motivation. Pre-service teachers who then elected to specialise in primary physical education (n = 248) completed the Attractors and Facilitators for Physical Education (AFPE) questionnaire and the…
ERIC Educational Resources Information Center
Denison, Stephanie; Trikutam, Pallavi; Xu, Fei
2014-01-01
A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar…
Uncertain deduction and conditional reasoning
Evans, Jonathan St. B. T.; Thompson, Valerie A.; Over, David E.
2015-01-01
There has been a paradigm shift in the psychology of deductive reasoning. Many researchers no longer think it is appropriate to ask people to assume premises and decide what necessarily follows, with the results evaluated by binary extensional logic. Most everyday and scientific inference is made from more or less confidently held beliefs and not assumptions, and the relevant normative standard is Bayesian probability theory. We argue that the study of “uncertain deduction” should directly ask people to assign probabilities to both premises and conclusions, and report an experiment using this method. We assess this reasoning by two Bayesian metrics: probabilistic validity and coherence according to probability theory. On both measures, participants perform above chance in conditional reasoning, but they do much better when statements are grouped as inferences, rather than evaluated in separate tasks. PMID:25904888
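The coherence metric can be made concrete for modus ponens. Given premise probabilities P(p) and P(q|p), the law of total probability fixes the interval of conclusion probabilities P(q) that are coherent. This sketch and its numbers are our illustration, not the study's materials.

```python
# Coherence interval for modus ponens under probability theory:
# P(q) = P(p)P(q|p) + P(not-p)P(q|not-p), with P(q|not-p) unconstrained
# in [0, 1], so P(q) must lie between the two extremes.

def mp_coherence_interval(p_p, p_q_given_p):
    low = p_p * p_q_given_p          # P(q|not-p) = 0
    high = low + (1.0 - p_p)         # P(q|not-p) = 1
    return low, high

def is_coherent(p_p, p_q_given_p, p_q):
    low, high = mp_coherence_interval(p_p, p_q_given_p)
    return low <= p_q <= high

# With P(p) = 0.8 and P(q|p) = 0.9, coherent P(q) lies in [0.72, 0.92]:
low, high = mp_coherence_interval(0.8, 0.9)
```

A participant's judged P(q) outside this interval violates probability theory regardless of their exact beliefs about the case where p is false.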
Prototypes Reflect Normative Perceptions: Implications for the Development of Reasoned Action Theory
Hennessy, Michael; Bleakley, Amy; Ellithorpe, Morgan
2017-01-01
The reasoned action approach is one of the most successful behavioral theories in the history of social psychology. This study outlines the theoretical principles of reasoned action and considers when it is appropriate to augment it with a new variable. To demonstrate, we use survey data collected from 4–17 year old U.S. adolescents to test how the “prototype” variables fit into the reasoned action approach. Through confirmatory factor analysis, we find that the prototype measures are normative pressure measures and, when treated as a separate theoretical construct, prototype identity is not completely mediated by the proximal predictors of behavioral intention. We discuss the assumptions of the two theories and finally consider the distinction between augmenting a specific theory versus combining measures derived from different theoretical perspectives. PMID:28612624
A general method for handling missing binary outcome data in randomized controlled trials
Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen
2014-01-01
Aims: The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. Design: We propose a sensitivity analysis where standard analyses, which could include ‘missing = smoking’ and ‘last observation carried forward’, are embedded in a wider class of models. Setting: We apply our general method to data from two smoking cessation trials. Participants: A total of 489 and 1758 participants from two smoking cessation trials. Measurements: The abstinence outcomes were obtained using telephone interviews. Findings: The estimated intervention effects from both trials depend on the sensitivity parameters used. The findings differ considerably in magnitude and statistical significance under quite extreme assumptions about the missing data, but are reasonably consistent under more moderate assumptions. Conclusions: A new method for undertaking sensitivity analyses when handling missing data in trials with binary outcomes allows a wide range of assumptions about the missing data to be assessed. In two smoking cessation trials the results were insensitive to all but extreme assumptions. PMID:25171441
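One simple way to embed 'missing = smoking' in a wider class of models is an informative missingness odds ratio (IMOR) as the sensitivity parameter. The sketch below illustrates that idea for a single trial arm; it is a generic construction with made-up counts, not necessarily the exact model of the paper.

```python
# Sensitivity analysis for missing binary outcomes using an informative
# missingness odds ratio (IMOR), delta: the odds of abstinence among
# missing participants are delta times the odds among observed ones.
# delta = 0 reproduces 'missing = smoking'; delta = 1 treats the missing
# as having the observed abstinence rate. Counts are made up.

def imputed_abstinence_rate(n_obs, n_abstinent, n_missing, delta):
    p_obs = n_abstinent / n_obs
    if delta == 0.0:
        p_miss = 0.0                          # missing = smoking
    else:
        odds = delta * p_obs / (1.0 - p_obs)
        p_miss = odds / (1.0 + odds)
    return (n_abstinent + p_miss * n_missing) / (n_obs + n_missing)

# One trial arm: 300 observed (90 abstinent), 100 missing.
worst = imputed_abstinence_rate(300, 90, 100, delta=0.0)  # 0.225
mar = imputed_abstinence_rate(300, 90, 100, delta=1.0)    # 0.300
```

Scanning delta over a plausible range for each arm, and comparing arms at each value, shows how sensitive the estimated intervention effect is to the missing-data assumption.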
Assume-Guarantee Verification of Source Code with Design-Level Assumptions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.
2004-01-01
Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
Optimum runway orientation relative to crosswinds
NASA Technical Reports Server (NTRS)
Falls, L. W.; Brown, S. C.
1972-01-01
Specific magnitudes of crosswinds may exist that could be constraints to the success of an aircraft mission such as the landing of the proposed space shuttle. A method is required to determine the orientation or azimuth of the proposed runway which will minimize the probability of certain critical crosswinds. Two procedures for obtaining the optimum runway orientation relative to minimizing a specified crosswind speed are described and illustrated with examples. The empirical procedure requires only hand calculations on an ordinary wind rose. The theoretical method utilizes wind statistics computed after the bivariate normal elliptical distribution is applied to a data sample of component winds. This method requires only the assumption that the wind components are bivariate normally distributed. This assumption seems to be reasonable. Studies are currently in progress for testing wind components for bivariate normality for various stations. The close agreement between the theoretical and empirical results for the example chosen substantiates the bivariate normal assumption.
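Under the bivariate normal assumption, the crosswind for a runway at azimuth theta is a linear combination of the wind components and hence univariate normal, so the exceedance probability can be evaluated in closed form and scanned over azimuths. The wind statistics below are illustrative, not the paper's data.

```python
import math

# Probability that the crosswind exceeds a critical speed c for a runway
# at azimuth theta, assuming wind components (u = east, v = north) are
# bivariate normal. The crosswind u*cos(t) - v*sin(t) is univariate normal.

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def crosswind_exceedance(theta_deg, c, mu_u, mu_v, var_u, var_v, cov_uv):
    t = math.radians(theta_deg)
    mu = mu_u * math.cos(t) - mu_v * math.sin(t)        # mean crosswind
    var = (math.cos(t) ** 2 * var_u + math.sin(t) ** 2 * var_v
           - 2.0 * math.sin(t) * math.cos(t) * cov_uv)
    sd = math.sqrt(var)
    return 1.0 - (phi((c - mu) / sd) - phi((-c - mu) / sd))

def best_azimuth(c, mu_u, mu_v, var_u, var_v, cov_uv):
    """Integer-degree azimuth minimizing the crosswind exceedance."""
    return min(range(180), key=lambda th: crosswind_exceedance(
        th, c, mu_u, mu_v, var_u, var_v, cov_uv))

# Mean wind from the west (mean u = 6 m/s): the optimum is near east-west
# (azimuth 90 deg), aligning the mean wind with the runway.
az = best_azimuth(c=10.0, mu_u=6.0, mu_v=0.0, var_u=9.0, var_v=9.0, cov_uv=0.0)
```

The same scan over an empirical wind rose corresponds to the hand-calculation procedure the paper describes.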
Twin studies in psychiatry and psychology: science or pseudoscience?
Joseph, Jay
2002-01-01
Twin studies are frequently cited in support of the influence of genetic factors for a wide range of psychiatric conditions and psychological trait differences. The most common method, known as the classical twin method, compares the concordance rates or correlations of reared-together identical (MZ) vs. reared-together same-sex fraternal (DZ) twins. However, drawing genetic inferences from MZ-DZ comparisons is problematic due to methodological problems and questionable assumptions. It is argued that the main theoretical assumption of the twin method--known as the "equal environment assumption"--is not tenable. The twin method is therefore of doubtful value as an indicator of genetic influences. Studies of reared-apart twins are discussed, and it is noted that these studies are also vulnerable to methodological problems and environmental confounds. It is concluded that there is little reason to believe that twin studies provide evidence in favor of genetic influences on psychiatric disorders and human behavioral differences.
Rock physics model-based prediction of shear wave velocity in the Barnett Shale formation
NASA Astrophysics Data System (ADS)
Guo, Zhiqi; Li, Xiang-Yang
2015-06-01
Predicting S-wave velocity is important for reservoir characterization and fluid identification in unconventional resources. A rock physics model-based method is developed for estimating pore aspect ratio and predicting shear wave velocity Vs from the information of P-wave velocity, porosity and mineralogy in a borehole. Statistical distribution of pore geometry is considered in the rock physics models. In the application to the Barnett formation, we compare the high frequency self-consistent approximation (SCA) method that corresponds to isolated pore spaces, and the low frequency SCA-Gassmann method that describes well-connected pore spaces. Inversion results indicate that compared to the surroundings, the Barnett Shale shows less fluctuation in the pore aspect ratio in spite of complex constituents in the shale. The high frequency method provides a more robust and accurate prediction of Vs for all the three intervals in the Barnett formation, while the low frequency method collapses for the Barnett Shale interval. Possible causes for this discrepancy can be explained by the fact that poor in situ pore connectivity and low permeability make well-log sonic frequencies act as high frequencies and thus invalidate the low frequency assumption of the Gassmann theory. In comparison, for the overlying Marble Falls and underlying Ellenburger carbonates, both the high and low frequency methods predict Vs with reasonable accuracy, which may reveal that sonic frequencies are within the transition frequencies zone due to higher pore connectivity in the surroundings.
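The low-frequency limb of the comparison rests on Gassmann's relation, in which fluid substitution changes the bulk modulus but leaves the shear modulus unchanged, so Vs responds to fluid only through density. A minimal sketch; the moduli, porosity and density are illustrative, not values from the Barnett study.

```python
import math

# Gassmann fluid substitution (low-frequency, connected-pore limit).
# Moduli in GPa, density in g/cm^3; all numbers are illustrative.

def gassmann_k_sat(k_dry, k_mineral, k_fluid, phi):
    """Saturated bulk modulus from the dry-frame, mineral and fluid moduli."""
    num = (1.0 - k_dry / k_mineral) ** 2
    den = (phi / k_fluid + (1.0 - phi) / k_mineral
           - k_dry / k_mineral ** 2)
    return k_dry + num / den

def vs_km_s(mu_gpa, rho_g_cc):
    """Shear velocity in km/s; shear modulus is unaffected by the fluid."""
    return math.sqrt(mu_gpa / rho_g_cc)

k_sat = gassmann_k_sat(k_dry=15.0, k_mineral=37.0, k_fluid=2.2, phi=0.08)
vs = vs_km_s(mu_gpa=12.0, rho_g_cc=2.4)
```

When pores are poorly connected, as inferred for the Barnett Shale, sonic frequencies violate the pressure-equilibration assumption behind this relation, consistent with the failure of the low-frequency method reported above.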
Misclassification of Physical Activity Level Due to Exclusion of Workplace Activity
ERIC Educational Resources Information Center
Boslaugh, Sarah E.; Kreuter, Matthew W.; Weaver, Nancy L.; Naleid, Kimberly S.; Brownson, Ross C.
2005-01-01
This study examined the effect of including workplace physical activity in calculating the proportion of adults meeting Centers for Disease Control (CDC) guidelines for physical activity. Data on leisure-time and workplace activity were collected from 1,090 Black and White adults in St. Louis, MO. A series of assumptions were used to equate…
Craft, Daniel F; Kry, Stephen F; Balter, Peter; Salehpour, Mohammad; Woodward, Wendy; Howell, Rebecca M
2018-04-01
Using 3D printing to fabricate patient-specific devices such as tissue compensators, boluses, and phantoms is inexpensive and relatively simple. However, most 3D printing materials have not been well characterized, including their radiologic tissue equivalence. The purposes of this study were to (a) determine the variance in Hounsfield Units (HU) for printed objects, (b) determine if HU varies over time, and (c) calculate the clinical dose uncertainty caused by these material variations. For a sample of 10 printed blocks each of PLA, NinjaFlex, ABS, and Cheetah, the average HU and physical density were tracked at initial printing and over the course of 5 weeks, a typical timeframe for a standard course of radiotherapy. After initial printing, half the blocks were stored in open boxes, the other half in sealed bags with desiccant. Variances in HU and density over time were evaluated for the four materials. Various clinical photon and electron beams were used to evaluate potential errors in clinical depth dose as a function of assumptions made during treatment planning. The clinical depth error was defined as the distance between the correctly calculated 90% isodose line and the 90% isodose line calculated using clinically reasonable, but simplified, assumptions. The average HU measurements of individual blocks of PLA, ABS, NinjaFlex, and Cheetah varied by as much as 121, 30, 178, and 30 HU, respectively. The HU variation over 5 weeks was much smaller for all materials. The magnitude of clinical depth errors depended strongly on the material, energy, and assumptions, but some were as large as 9.0 mm. If proper quality assurance steps are taken, 3D printed objects can be used accurately and effectively in radiation therapy. It is critically important, however, that the properties of any material being used in patient care be well understood and accounted for. © 2018 American Association of Physicists in Medicine.
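The flavor of the depth-error calculation can be illustrated with a water-equivalent-thickness (WET) approximation: the shift of an isodose line behind a slab is roughly the difference between the slab's actual and assumed WET. This is a simplification of the study's beam-by-beam calculation, and the densities below are made up, not the study's measurements.

```python
# Water-equivalent-thickness (WET) approximation for the depth error caused
# by assuming the wrong density for a 3D printed slab in the beam path.
# Densities are illustrative, not the study's measurements.

def wet_mm(thickness_mm, relative_density):
    """Water-equivalent thickness of a slab (1.0 = water)."""
    return thickness_mm * relative_density

def depth_error_mm(thickness_mm, assumed_density, actual_density):
    """Approximate shift of an isodose line behind the slab."""
    return (wet_mm(thickness_mm, actual_density)
            - wet_mm(thickness_mm, assumed_density))

# A 30 mm printed bolus planned as water (1.00) but actually at relative
# density 1.20 shifts the dose depth by about 6 mm:
err = depth_error_mm(30.0, assumed_density=1.00, actual_density=1.20)
```

Errors of this size are comparable to the up-to-9.0 mm clinical depth errors reported, which is why per-material HU characterization is part of the recommended quality assurance.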
Islamic Principles and Physical Education.
ERIC Educational Resources Information Center
Lindsay, Karen; And Others
1987-01-01
Based on interviews with five Islamic respondents, this paper investigates stricter Islamic parents' difficulties with certain assumptions and practices of Australian education, particularly health and physical education. Concerns about modesty and separation of sexes conflict with central aims based on equal educational opportunities and equality…
Code of Federal Regulations, 2011 CFR
2011-10-01
..., occupational, speech, and other therapy services and the services of other health specialists (other than... 42 Public Health 2 2011-10-01 2011-10-01 false Reasonable cost of physical and other therapy... therapy services furnished under arrangements. (a) Principle. The reasonable cost of the services of...
ERIC Educational Resources Information Center
Ding, Lin
2014-01-01
This study seeks to test the causal influences of reasoning skills and epistemologies on student conceptual learning in physics. A causal model, integrating multiple variables that were investigated separately in the prior literature, is proposed and tested through path analysis. These variables include student preinstructional reasoning skills…
College Teaching and the Development of Reasoning
ERIC Educational Resources Information Center
Fuller, Robert G., Ed.; Campbell, Thomas C., Ed.; Dykstra, Dewey I., Jr., Ed.; Stevens, Scott M., Ed.
2009-01-01
This book is intended to offer college faculty members the insights of the development of reasoning movement that enlightened physics educators in the late 1970s and led to a variety of college programs directed at improving the reasoning patterns used by college students. While the original materials were directed at physics concepts, they quickly…
McLachlan, Hugh V
2015-02-01
Wardrope argues against my proposed non-consequentialist policy for the distribution of scarce influenza vaccine in the face of a pandemic. According to him, even if one accepts what he calls my deontological ethical theory, it does not follow that we are required to agree with my proposed randomised allocation of doses of vaccine by means of a lottery. He argues in particular that I fail to consider fully the prophylactic role of vaccination, whereby it serves to protect from infection more people than are vaccinated. He concludes that: 'The benefits and burdens of vaccination are provided impartially and far more effectively by targeted vaccination than impartial lotteries.' He has shown convincingly that this conclusion can be established in the case of his particular envisaged scenario. However, Wardrope gives no reason to suppose that, in the circumstances that we actually face, targeted vaccination would constitute impartial treatment of citizens in the UK. I readily agree with Wardrope that if the state should treat its citizens justly and impartially, it does not necessarily follow that it should distribute vaccinations on the basis of a lottery. That will be a reasonable thing to do only if certain assumptions are made. These assumptions will not always be reasonable. However, they are reasonable ones to make in the actual circumstances that currently apply. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Variation is the universal: making cultural evolution work in developmental psychology.
Kline, Michelle Ann; Shamsudheen, Rubeena; Broesch, Tanya
2018-04-05
Culture is a human universal, yet it is a source of variation in human psychology, behaviour and development. Developmental researchers are now expanding the geographical scope of research to include populations beyond relatively wealthy Western communities. However, far too often culture and context still play a secondary role in the theoretical grounding of developmental psychology research. In this paper, we highlight four false assumptions that are common in psychology, and that detract from the quality of both standard and cross-cultural research in development. These assumptions are: (i) the universality assumption, that empirical uniformity is evidence for universality, while any variation is evidence for culturally derived variation; (ii) the Western centrality assumption, that Western populations represent a normal and/or healthy standard against which development in all societies can be compared; (iii) the deficit assumption, that population-level differences in developmental timing or outcomes are necessarily due to something lacking among non-Western populations; and (iv) the equivalency assumption, that using identical research methods will necessarily produce equivalent and externally valid data across disparate cultural contexts. For each assumption, we draw on cultural evolutionary theory to critique and replace the assumption with a theoretically grounded approach to culture in development. We support these suggestions with positive examples drawn from research in development. Finally, we conclude with a call for researchers to take reasonable steps towards more fully incorporating culture and context into studies of development, by expanding their participant pools in strategic ways. This will lead to a more inclusive and therefore more accurate description of human development. This article is part of the theme issue 'Bridging cultural gaps: interdisciplinary studies in human cultural evolution'. © 2018 The Author(s).
ERIC Educational Resources Information Center
Bostick, Sharon L.; Irwin, Bryan
2012-01-01
Renovating, expanding or building new libraries today is a challenge on several levels. Libraries in general are faced with image issues, such as the assumption that a library building exists only to house print material, followed by the equally erroneous assumption that everything is now available online, thus rendering a physical building…
ERIC Educational Resources Information Center
Allison, Kenneth R.; Dwyer, John J. M.; Goldenberg, Ellie; Fein, Allan; Yoshida, Karen K.; Boutilier, Marie
2005-01-01
This study explored male adolescents' reasons for participating in moderate and vigorous physical activity, perceived barriers to moderate and vigorous physical activity, and suggestions as to what can be done to increase participation in physical activity. A total of 26 male 15- and 16-year-old adolescents participated in focus group sessions,…
Sri Bhashyam, Sumitra; Montibeller, Gilberto
2016-04-01
A key objective for policymakers and analysts dealing with terrorist threats is trying to predict the actions that malicious agents may take. A recent trend in counterterrorism risk analysis is to model the terrorists' judgments, as these will guide their choices of such actions. The standard assumptions in most of these models are that terrorists are fully rational, following all the normative desiderata required for rational choices, such as having a set of constant and ordered preferences, being able to perform a cost-benefit analysis of their alternatives, among many others. However, are such assumptions reasonable from a behavioral perspective? In this article, we analyze the types of assumptions made across various counterterrorism analytical models that represent malicious agents' judgments and discuss their suitability from a descriptive point of view. We then suggest how some of these assumptions could be modified to describe terrorists' preferences more accurately, by drawing knowledge from the fields of behavioral decision research, politics, philosophy of choice, public choice, and conflict management in terrorism. Such insight, we hope, might help make the assumptions of these models more behaviorally valid for counterterrorism risk analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naselsky, Pavel; Jackson, Andrew D.; Liu, Hao, E-mail: naselsky@nbi.ku.dk, E-mail: liuhao@nbi.dk
We present a simplified method for the extraction of meaningful signals from Hanford and Livingston 32 second data for the GW150914 event made publicly available by the LIGO collaboration, and demonstrate its ability to reproduce the LIGO collaboration's own results quantitatively given the assumption that all narrow peaks in the power spectrum are a consequence of physically uninteresting signals and can be removed. After the clipping of these peaks and return to the time domain, the GW150914 event is readily distinguished from broadband background noise. This simple technique allows us to identify the GW150914 event without any assumption regarding its physical origin and with minimal assumptions regarding its shape. We also confirm that the LIGO GW150914 event is uniquely correlated in the Hanford and Livingston detectors for the full 4096 second data at the level of 6–7 σ with a temporal displacement of τ = 6.9 ± 0.4 ms. We have also identified a few events that are morphologically close to GW150914 but less strongly cross correlated with it.
Understanding the LIGO GW150914 event
NASA Astrophysics Data System (ADS)
Naselsky, Pavel; Jackson, Andrew D.; Liu, Hao
2016-08-01
We present a simplified method for the extraction of meaningful signals from Hanford and Livingston 32 second data for the GW150914 event made publicly available by the LIGO collaboration, and demonstrate its ability to reproduce the LIGO collaboration's own results quantitatively given the assumption that all narrow peaks in the power spectrum are a consequence of physically uninteresting signals and can be removed. After the clipping of these peaks and return to the time domain, the GW150914 event is readily distinguished from broadband background noise. This simple technique allows us to identify the GW150914 event without any assumption regarding its physical origin and with minimal assumptions regarding its shape. We also confirm that the LIGO GW150914 event is uniquely correlated in the Hanford and Livingston detectors for the full 4096 second data at the level of 6-7 σ with a temporal displacement of τ = 6.9 ± 0.4 ms. We have also identified a few events that are morphologically close to GW150914 but less strongly cross correlated with it.
Modelling Mathematical Reasoning in Physics Education
NASA Astrophysics Data System (ADS)
Uhden, Olaf; Karam, Ricardo; Pietrocola, Maurício; Pospiech, Gesche
2012-04-01
Many findings from research as well as reports from teachers describe students' problem solving strategies as manipulation of formulas by rote. The resulting dissatisfaction with quantitative physical textbook problems seems to influence the attitude towards the role of mathematics in physics education in general. Mathematics is often seen as a tool for calculation which hinders a conceptual understanding of physical principles. However, the role of mathematics cannot be reduced to this technical aspect. Hence, instead of putting mathematics away we delve into the nature of physical science to reveal the strong conceptual relationship between mathematics and physics. Moreover, we suggest that, for both prospective teaching and further research, a focus on deeply exploring such interdependency can significantly improve the understanding of physics. To provide a suitable basis, we develop a new model which can be used for analysing different levels of mathematical reasoning within physics. It is also a guideline for shifting the attention from technical to structural mathematical skills while teaching physics. We demonstrate its applicability for analysing physical-mathematical reasoning processes with an example.
Psycho-Social Correlates of Organized Physical Activity.
ERIC Educational Resources Information Center
Greendorfer, Susan L.
1987-01-01
The assumption has been that because play, games, and sport are good, positive attitudes, behaviors, and values are inevitably transmitted. This article summarizes research on a variety of topics related to psychosocial correlates of physical activity and urges caution in the way claims are made. (MT)
Lectures on Dark Matter Physics
NASA Astrophysics Data System (ADS)
Lisanti, Mariangela
Rotation curve measurements from the 1970s provided the first strong indication that a significant fraction of matter in the Universe is non-baryonic. In the intervening years, a tremendous amount of progress has been made on both the theoretical and experimental fronts in the search for this missing matter, which we now know constitutes nearly 85% of the Universe's matter density. This series of lectures provides an introduction to the basics of dark matter physics. They are geared for the advanced undergraduate or graduate student interested in pursuing research in high-energy physics. The primary goal is to build an understanding of how observations constrain the assumptions that can be made about the astro- and particle physics properties of dark matter. The lectures begin by delineating the basic assumptions that can be inferred about dark matter from rotation curves. A detailed discussion of thermal dark matter follows, motivating Weakly Interacting Massive Particles, as well as lighter-mass alternatives. As an application of these concepts, the phenomenology of direct and indirect detection experiments is discussed in detail.
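The opening argument from rotation curves can be written in one line: for a star on a circular orbit, Newtonian gravity supplies the centripetal acceleration, which ties the orbital speed to the mass enclosed within the orbit.

```latex
\[
  \frac{v^2(r)}{r} = \frac{G\,M(r)}{r^2}
  \quad\Longrightarrow\quad
  M(r) = \frac{v^2(r)\,r}{G}.
\]
% A flat observed curve, v(r) \approx v_0 at large r, forces
% M(r) = v_0^2\, r / G \propto r, while the luminous mass converges;
% the excess is attributed to dark matter.
\]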
More than just "plug-and-chug": Exploring how physics students make sense with equations
NASA Astrophysics Data System (ADS)
Kuo, Eric
Although a large part of the Physics Education Research (PER) literature investigates students' conceptual understanding in physics, these investigations focus on qualitative, conceptual reasoning. Even in modeling expert problem solving, attention to conceptual understanding means a focus on initial qualitative analysis of the problem; the equations are typically conceived of as tools for "plug-and-chug" calculations. In this dissertation, I explore the ways that undergraduate physics students make conceptual sense of physics equations and the factors that support this type of reasoning through three separate studies. In the first study, I investigate how students can understand physics equations intuitively through use of a particular class of cognitive elements, symbolic forms (Sherin, 2001). Additionally, I show how students leverage this intuitive, conceptual meaning of equations in problem solving. By doing so, these students avoid algorithmic manipulations, instead using a heuristic approach that leverages the equation in a conceptual argument. The second study asks why some students use symbolic forms and others don't. Although it is possible that students simply lack the knowledge required, I argue that this is not the only explanation. Rather, symbolic forms use is connected to particular epistemological stances, in-the-moment views on what kinds of knowledge and reasoning are appropriate in physics. Specifically, stances that value coherence between formal, mathematical knowledge and intuitive, conceptual knowledge are likely to support symbolic forms use. Through the case study of one student, I argue that both reasoning with equations and epistemological stances are dynamic, and that shifts in epistemological stance can produce shifts in whether symbolic forms are used to reason with equations. The third study expands the focus to what influences how students reason with equations across disciplinary problem contexts. 
In seeking to understand differences in how the same student reasons on two similar problems in calculus and physics, I show two factors, beyond the content or structure of the problems, that can help explain why reasoning on these two problems would be so different. This contributes to an understanding of what can support or impede transfer of content knowledge across disciplinary boundaries.
ERIC Educational Resources Information Center
Liu, Shiang-Yao; Lin, Chuan-Shun; Tsai, Chin-Chung
2011-01-01
This study aims to test the nature of the assumption that there are relationships between scientific epistemological views (SEVs) and reasoning processes in socioscientific decision making. A mixed methodology that combines both qualitative and quantitative approaches of data collection and analysis was adopted not only to verify the assumption…
An Unwarranted Fear of Religious Schooling
ERIC Educational Resources Information Center
Kroeker, Frances; Norris, Stephen
2007-01-01
In this article, we challenge the common liberal assumption that religious schooling undermines the goals of liberal civic education, making it impossible for children to acquire tolerance, critical reasoning skills, or personal autonomy. As a framework for this argument, we respond to some of the claims made by Harry Brighouse in his recent book,…
Morality, Culture and the Dialogic Self: Taking Cultural Pluralism Seriously
ERIC Educational Resources Information Center
Haste, Helen; Abrahams, Salie
2008-01-01
This paper explores moral reasoning within the framework of contemporary cultural theory, in which moral functioning is action mediated by tools (such as socially available discourses) within a social and cultural context. This cultural model of a "dialogic moral self" challenges many of the assumptions inherent in the individualistic Kantian…
Acquiescent Responding in Balanced Multidimensional Scales and Exploratory Factor Analysis
ERIC Educational Resources Information Center
Lorenzo-Seva, Urbano; Rodriguez-Fornells, Antoni
2006-01-01
Personality tests often consist of a set of dichotomous or Likert items. These response formats are known to be susceptible to an agreeing-response bias called acquiescence. The common assumption in balanced scales is that the sum of appropriately reversed responses should be reasonably free of acquiescence. However, inter-item correlation (or…
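The balanced-scale logic that this abstract questions can be made concrete with a toy simulation (all numbers hypothetical): an acquiescence bias pushes responses upward on both positively and negatively keyed Likert items, so reverse-coding the negative items before summing largely cancels it, while a naive sum does not.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
trait = rng.normal(0, 1, n)          # latent trait being measured
acq = rng.uniform(0, 1, n)           # individual acquiescence (yea-saying) bias

def likert(x):
    """Round a latent response onto a 1-5 Likert scale."""
    return np.clip(np.round(x), 1, 5)

# Balanced scale: one positively and one negatively keyed item;
# acquiescence pushes responses up on BOTH items
pos_item = likert(3 + trait + acq)
neg_item = likert(3 - trait + acq)

reversed_neg = 6 - neg_item          # reverse-code the negative item
score = pos_item + reversed_neg      # acquiescence approximately cancels

print(np.corrcoef(score, acq)[0, 1])                # small after reversal
print(np.corrcoef(pos_item + neg_item, acq)[0, 1])  # large without reversal
```

The cancellation is only approximate once responses are discretized, which is one route to the inter-item correlation issues the truncated sentence above raises.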
26 CFR 1.6045B-1 - Returns relating to actions affecting basis of securities.
Code of Federal Regulations, 2012 CFR
2012-04-01
... measured for the action. (v) Effect of the action. The quantitative effect of the organizational action on... issuer may file the return before the organizational action if the quantitative effect on basis is determinable beforehand. (ii) Reasonable assumptions. To report the quantitative effect on basis by the due...
26 CFR 1.6045B-1 - Returns relating to actions affecting basis of securities.
Code of Federal Regulations, 2011 CFR
2011-04-01
... measured for the action. (v) Effect of the action. The quantitative effect of the organizational action on... issuer may file the return before the organizational action if the quantitative effect on basis is determinable beforehand. (ii) Reasonable assumptions. To report the quantitative effect on basis by the due...
26 CFR 1.6045B-1 - Returns relating to actions affecting basis of securities.
Code of Federal Regulations, 2013 CFR
2013-04-01
... measured for the action. (v) Effect of the action. The quantitative effect of the organizational action on... issuer may file the return before the organizational action if the quantitative effect on basis is determinable beforehand. (ii) Reasonable assumptions. To report the quantitative effect on basis by the due...
Adolescent Egocentrism and Formal Operations: Tests of a Theoretical Assumption.
ERIC Educational Resources Information Center
Lapsley, David K.; And Others
1986-01-01
Describes two studies of the theoretical relation between adolescent egocentrism and formal operations. Study 1 used the Adolescent Egocentrism Scale (AES) and Lunzer's battery of formal reasoning tasks to assess 183 adolescents. Study 2 administered the AES, the Imaginary Audience Scale (IAS), and the Test of Logical Thinking to 138 adolescents.…
ERIC Educational Resources Information Center
Moallem, Mahnaz
This paper focuses on reflection and reflective thinking as a means of developing expertise in instructional designers. The need for the reflective instructional designer is discussed, and reflective thinking is examined from several perspectives, i.e., controlled thinking, tacit knowledge, epistemic assumption, abductive reasoning, willingness to…
Teaching Composition. What Research Says to the Teacher, Number 18.
ERIC Educational Resources Information Center
Burrows, Alvina T.
Although children's needs for written expression probably parallel those of adults, the reason behind children's choice of writing over speaking in given instances is open to conjecture. Moreover, the common assumption by teachers that children can and should write about personal interests ought to be tempered by the idea that many interests are…
Effects of Reflection Prompts when Learning with Hypermedia
ERIC Educational Resources Information Center
Bannert, Maria
2006-01-01
In this study the assumption was tested experimentally, whether prompting for reflection will enhance hypermedia learning and transfer. Students of the experimental group were prompted at each navigation step in a hypermedia system to say the reasons why they chose this specific information node out loud whereas the students of the control group…
Are Structural Estimates of Auction Models Reasonable? Evidence from Experimental Data
ERIC Educational Resources Information Center
Bajari, Patrick; Hortacsu, Ali
2005-01-01
Recently, economists have developed methods for structural estimation of auction models. Many researchers object to these methods because they find the strict rationality assumptions to be implausible. Using bid data from first-price auction experiments, we estimate four alternative structural models: (1) risk-neutral Bayes-Nash, (2) risk-averse…
ERIC Educational Resources Information Center
Inoue, Noriyuki
2016-01-01
In Western cultures, subjectivity has often been seen as the "black sheep" of educational research because of its heavy emphasis on objectivity. Consequently many research initiatives in education share the assumption that objective reasoning should play a central role. However, mentoring teachers' practice improvement research often…
Assessing the Utility of the Willingness/Prototype Model in Predicting Help-Seeking Decisions
ERIC Educational Resources Information Center
Hammer, Joseph H.; Vogel, David L.
2013-01-01
Prior research on professional psychological help-seeking behavior has operated on the assumption that the decision to seek help is based on intentional and reasoned processes. However, research on the dual-process prototype/willingness model (PWM; Gerrard, Gibbons, Houlihan, Stock, & Pomery, 2008) suggests health-related decisions may also…
Cooperative Education in the New Millennium: Implications for Faculty Development.
ERIC Educational Resources Information Center
Crow, Cal
1997-01-01
Colleges and universities are being challenged to examine their mission, structure, and delivery of services for two reasons with far-reaching implications for cooperative education. The first is the move from an industrial to a postindustrial economy. The second deals with incorrect assumptions about learning based on outmoded information and…
Implicit Schemata and Categories in Memory-Based Language Processing
ERIC Educational Resources Information Center
van den Bosch, Antal; Daelemans, Walter
2013-01-01
Memory-based language processing (MBLP) is an approach to language processing based on exemplar storage during learning and analogical reasoning during processing. From a cognitive perspective, the approach is attractive as a model for human language processing because it does not make any assumptions about the way abstractions are shaped, nor any…
Item Response Theory in the context of Improving Student Reasoning
NASA Astrophysics Data System (ADS)
Goddard, Chase; Davis, Jeremy; Pyper, Brian
2011-10-01
We are interested to see if Item Response Theory can help to better inform the development of reasoning ability in introductory physics. A first pass through our latest batch of data from the Heat and Temperature Conceptual Evaluation, the Lawson Classroom Test of Scientific Reasoning, and the Epistemological Beliefs About Physics Survey may help in this effort.
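As a sketch of what Item Response Theory adds beyond raw scores, here is the standard two-parameter logistic (2PL) item response function; the parameter values below are illustrative, not fitted to the instruments named above.

```python
import numpy as np

def p_correct(theta, a, b):
    """2PL item response function: probability that a student with
    ability theta answers correctly, given discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# A discriminating Lawson-style item (a=2) vs. a weak one (a=0.5), both with b=0
thetas = np.array([-2.0, 0.0, 2.0])
print(p_correct(thetas, a=2.0, b=0.0))   # steep: roughly 0.02, 0.5, 0.98
print(p_correct(thetas, a=0.5, b=0.0))   # flat: changes slowly with ability
```

Items with low discrimination contribute little information about reasoning ability, which is what an IRT analysis of such inventories would flag.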
ERIC Educational Resources Information Center
Russ, Rosemary S.; Odden, Tor Ole B.
2017-01-01
Our field has long valued the goal of teaching students not just the facts of physics, but also the thinking and reasoning skills of professional physicists. The complexity inherent in scientific reasoning demands that we think carefully about how we conceptualize for ourselves, enact in our classes, and encourage in our students the relationship…
Marom, Gil; Bluestein, Danny
2016-01-01
This paper evaluated the influence of various numerical implementation assumptions on predicting blood damage in cardiovascular devices using Lagrangian methods with Eulerian computational fluid dynamics. The implementation assumptions that were tested included various seeding patterns, a stochastic walk model, and simplified trajectory calculations with pathlines. Post-processing implementation options that were evaluated included single-passage and repeated-passage stress accumulation and time averaging. This study demonstrated that the implementation assumptions can significantly affect the resulting stress accumulation, i.e., the blood damage model predictions. Careful consideration should be given to the use of Lagrangian models. Ultimately, the appropriate assumptions should be chosen based on the physics of the specific case, and sensitivity analyses similar to the ones presented here should be employed. PMID:26679833
Automatic and controlled components of judgment and decision making.
Ferreira, Mario B; Garcia-Marques, Leonel; Sherman, Steven J; Sherman, Jeffrey W
2006-11-01
The categorization of inductive reasoning into largely automatic processes (heuristic reasoning) and controlled analytical processes (rule-based reasoning) put forward by dual-process approaches to judgment under uncertainty (e.g., K. E. Stanovich & R. F. West, 2000) has been primarily a matter of assumption with a scarcity of direct empirical findings supporting it. The present authors use the process dissociation procedure (L. L. Jacoby, 1991) to provide convergent evidence validating a dual-process perspective to judgment under uncertainty based on the independent contributions of heuristic and rule-based reasoning. Process dissociations based on experimental manipulation of variables were derived from the most relevant theoretical properties typically used to contrast the two forms of reasoning. These include processing goals (Experiment 1), cognitive resources (Experiment 2), priming (Experiment 3), and formal training (Experiment 4); the results consistently support the authors' perspective. They conclude that judgment under uncertainty is neither an automatic nor a controlled process but that it reflects both processes, with each making independent contributions.
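The process dissociation procedure cited here (Jacoby, 1991) turns inclusion and exclusion performance into independent estimates of the controlled and automatic components via two simple equations. The cell proportions below are hypothetical, chosen only to show the arithmetic.

```python
def process_dissociation(p_inclusion, p_exclusion):
    """Jacoby's (1991) process-dissociation estimates.

    Inclusion:  P = C + A*(1 - C)   (controlled and automatic agree)
    Exclusion:  P = A*(1 - C)       (automatic wins only when control fails)
    """
    controlled = p_inclusion - p_exclusion
    automatic = p_exclusion / (1 - controlled) if controlled < 1 else float("nan")
    return controlled, automatic

# Hypothetical cell means: a manipulation (e.g. cognitive load) lowers
# inclusion but not exclusion, consistent with it impairing only the
# controlled (rule-based) component.
print(process_dissociation(0.80, 0.20))  # controlled ~ 0.6, automatic ~ 0.5
print(process_dissociation(0.60, 0.20))  # controlled ~ 0.4, automatic ~ 1/3
```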
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Giannakopoulou, Dimitra
2006-01-01
This paper discusses our initial experience with introducing automated assume-guarantee verification based on learning in the SPIN tool. We believe that compositional verification techniques such as assume-guarantee reasoning could complement the state-reduction techniques that SPIN already supports, thus increasing the size of systems that SPIN can handle. We present a "light-weight" approach to evaluating the benefits of learning-based assume-guarantee reasoning in the context of SPIN: we turn our previous implementation of learning for the LTSA tool into a main program that externally invokes SPIN to provide the model-checking-related answers. Despite its performance overheads (which mandate a future implementation within SPIN itself), this approach provides accurate information about the savings in memory. We have experimented with several versions of learning-based assume-guarantee reasoning, including a novel heuristic introduced here for generating component assumptions when their environment is unavailable. We illustrate the benefits of learning-based assume-guarantee reasoning in SPIN through the example of a resource arbiter for a spacecraft. Keywords: assume-guarantee reasoning, model checking, learning.
Reasoning, Attitudes, and Learning: What matters in Introductory Physics?
NASA Astrophysics Data System (ADS)
Bateman, Melissa; Pyper, Brian
2009-05-01
Recent research has been revealing a connection between epistemological beliefs, reasoning ability and conceptual understanding. Our project has been taking data collected from the Fall '08 and Winter '09 semesters to supplement existing data, strengthening the statistical value of our sample size. We administered four tests to selected introductory physics courses: the Epistemological Beliefs Assessment for Physical Science, the Lawson Classroom Test of Scientific Reasoning, the Force Concept Inventory, and the Conceptual Survey in Electricity and Magnetism. With these data we have been comparing test results to demographics to answer questions such as: Does gender affect how we learn physics? Does past physics experience affect how we learn physics? Does past math experience affect how we learn physics? And how do math background successes compare to physics background successes? As we answer these questions, we will be better prepared in the physics classroom to identify the struggles of our students and the solutions that help them succeed.
NASA Astrophysics Data System (ADS)
Muslim; Suhandi, A.; Nugraha, M. G.
2017-02-01
The purposes of this study are to determine the quality of reasoning test instruments developed following the framework of the Trends in International Mathematics and Science Study (TIMSS) and to analyse the profile of senior high school students' reasoning skills on physics material. This research used the research and development (R&D) method; the subjects were 104 students at three senior high schools in Bandung selected by random sampling. The reasoning test instruments were constructed following the TIMSS framework as 30 multiple-choice questions covering five topics: parabolic motion and circular motion, Newton's law of gravity, work and energy, harmonic oscillation, and momentum and impulse. The quality of the reasoning tests was analysed using the Content Validity Ratio (CVR) and classical test analysis, including item validity, level of difficulty, discriminating power, reliability, and Ferguson's delta. The students' reasoning skill profiles were analysed using the average achievement scores on the eight reasoning aspects of the TIMSS framework. The results showed that the developed reasoning tests are of good quality as instruments for measuring senior high school students' reasoning skills on the five physics topics and are able to probe students' reasoning on all aspects of the TIMSS framework.
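The classical test statistics listed in this abstract (item difficulty, reliability, Ferguson's delta) can all be computed directly from a scored response matrix. The 6-student, 4-item matrix below is invented for illustration; the formulas are the standard ones (KR-20 for reliability, Ferguson's delta for whole-test discrimination).

```python
import numpy as np

# Toy scored response matrix: rows = students, columns = items (1 = correct)
X = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
])
N, k = X.shape
total = X.sum(axis=1)                   # each student's total score

p = X.mean(axis=0)                      # item difficulty (proportion correct)
q = 1 - p

# KR-20 reliability: internal consistency for dichotomous items
kr20 = (k / (k - 1)) * (1 - (p * q).sum() / total.var())

# Ferguson's delta: how well total scores spread examinees across score levels
freq = np.bincount(total, minlength=k + 1)
delta = (N**2 - (freq**2).sum()) * (k + 1) / (N**2 * k)

print(p, round(kr20, 3), round(delta, 3))   # kr20 = 0.656, delta = 0.972 here
```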
Physical experience enhances science learning.
Kontra, Carly; Lyons, Daniel J; Fischer, Susan M; Beilock, Sian L
2015-06-01
Three laboratory experiments involving students' behavior and brain imaging and one randomized field experiment in a college physics class explored the importance of physical experience in science learning. We reasoned that students' understanding of science concepts such as torque and angular momentum is aided by activation of sensorimotor brain systems that add kinetic detail and meaning to students' thinking. We tested whether physical experience with angular momentum increases involvement of sensorimotor brain systems during students' subsequent reasoning and whether this involvement aids their understanding. The physical experience, a brief exposure to forces associated with angular momentum, significantly improved quiz scores. Moreover, improved performance was explained by activation of sensorimotor brain regions when students later reasoned about angular momentum. This finding specifies a mechanism underlying the value of physical experience in science education and leads the way for classroom practices in which experience with the physical world is an integral part of learning. © The Author(s) 2015.
Darwin and the uses of extinction.
Beer, Gillian
2009-01-01
We currently view extinction with dismay and even horror, but Darwin saw extinction as ordinary and as necessary to evolutionary change. Still, the degree to which extinction is fundamental to his theory is rarely discussed. This essay examines Darwin's linking of the idea of "improvement" with that of natural selection and tracks a cluster of reasons for our changed valuation of extinction now. Those reasons demonstrate how scientific information and ideological preferences have reshaped the concept. The essay challenges the reader to assess some current assumptions about extinction and concludes by considering the shift in Darwin's own understanding from the "Origin" to the late "Autobiography".
Proof Rules for Automated Compositional Verification through Learning
NASA Technical Reports Server (NTRS)
Barringer, Howard; Giannakopoulou, Dimitra; Pasareanu, Corina S.
2003-01-01
Compositional proof systems not only enable the stepwise development of concurrent processes but also provide a basis to alleviate the state explosion problem associated with model checking. An assume-guarantee style of specification and reasoning has long been advocated to achieve compositionality. However, this style of reasoning is often non-trivial, typically requiring human input to determine appropriate assumptions. In this paper, we present novel assume-guarantee rules in the setting of finite labelled transition systems with blocking communication. We show how these rules can be applied in an iterative and fully automated fashion within a framework based on learning.
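For orientation, the simplest non-circular assume-guarantee rule for safety properties, of which the rules in this paper are variants, reads:

```latex
\[
  \frac{\;\langle A \rangle\, M_1\, \langle P \rangle
        \qquad
        \langle \mathit{true} \rangle\, M_2\, \langle A \rangle\;}
       {\langle \mathit{true} \rangle\, M_1 \parallel M_2\, \langle P \rangle}
\]
```

Read: if component $M_1$ guarantees property $P$ under assumption $A$ on its environment, and $M_2$ satisfies $A$ unconditionally, then the composition $M_1 \parallel M_2$ satisfies $P$. The learning framework automates the search for a suitable $A$ (e.g. with an L*-style learning algorithm) instead of requiring the human input mentioned above.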
Regularity Results for a Class of Functionals with Non-Standard Growth
NASA Astrophysics Data System (ADS)
Acerbi, Emilio; Mingione, Giuseppe
We consider the integral functional
NDE Research At Nondestructive Measurement Science At NASA Langley
1989-06-01
our staff include: ultrasonics, nonlinear acoustics, thermal acoustics and diffusion, magnetics, fiber optics, and x-ray tomography. We have a...based on the simple assumption that acoustic waves interact with the sample and reveal "important" properties. In practice, such assumptions have...between the acoustic wave and the media. The most useful models can generally be inverted to determine the physical properties or geometry of the
U(2)⁵ flavor symmetry and lepton universality violation in W→τν̄ τ
Filipuzzi, Alberto; Portolés, Jorge; González-Alonso, Martín
2012-06-26
The seeming violation of universality in the τ lepton coupling to the W boson suggested by LEP-II data is studied using an effective field theory (EFT) approach. Within this framework we explore how this feature fits into the current constraints from electroweak precision observables using different assumptions about the flavor structure of New Physics, namely [U(2)×U(1)]⁵ and U(2)⁵. We show the importance of leptonic and semileptonic tau decay measurements, giving 3–4 TeV bounds on the New Physics effective scale at 90% C.L. We conclude under very general assumptions that it is not possible to accommodate this deviation from universality in the EFT framework, and thus such a signal could only be explained by the introduction of light degrees of freedom or New Physics strongly coupled at the electroweak scale.
Object Individuation and Physical Reasoning in Infancy: An Integrative Account
Baillargeon, Renée; Stavans, Maayan; Wu, Di; Gertner, Yael; Setoh, Peipei; Kittredge, Audrey K.; Bernard, Amélie
2012-01-01
Much of the research on object individuation in infancy has used a task in which two different objects emerge in alternation from behind a large screen, which is then removed to reveal either one or two objects. In their seminal work, Xu and Carey (1996) found that it is typically not until the end of the first year that infants detect a violation when a single object is revealed. Since then, a large number of investigations have modified the standard task in various ways and found that young infants succeed with some but not with other modifications, yielding a complex and unwieldy picture. In this article, we argue that this confusing picture can be better understood by bringing to bear insights from a related subfield of infancy research, physical reasoning. By considering how infants reason about object information within and across physical events, we can make sense of apparently inconsistent findings from different object-individuation tasks. In turn, object-individuation findings deepen our understanding of how physical reasoning develops in infancy. Integrating the insights from physical-reasoning and object-individuation investigations thus enriches both subfields and brings about a clearer account of how infants represent objects and events. PMID:23204946
Walmsley, Christopher W; McCurry, Matthew R; Clausen, Phillip D; McHenry, Colin R
2013-01-01
Finite element analysis (FEA) is a computational technique of growing popularity in the field of comparative biomechanics, and is an easily accessible platform for form-function analyses of biological structures. However, its rapid evolution in recent years from a novel approach to common practice demands some scrutiny with regard to the validity of results and the appropriateness of assumptions inherent in setting up simulations. Both validation and sensitivity analyses remain unexplored in many comparative analyses, and assumptions considered to be 'reasonable' are often assumed to have little influence on the results and their interpretation. Here we report an extensive sensitivity analysis in which high-resolution finite element (FE) models of mandibles from seven species of crocodile were analysed under loads typical of comparative analysis: biting, shaking, and twisting. Simulations explored the effect on both the absolute response and the interspecies pattern of results to variations in commonly used input parameters. Our sensitivity analysis focuses on assumptions relating to the selection of material properties (heterogeneous or homogeneous), scaling (standardising volume, surface area, or length), tooth position (front, mid, or back tooth engagement), and linear load case (type of loading for each feeding type). Our findings show that in a comparative context, FE models are far less sensitive to the selection of material property values and scaling to either volume or surface area than they are to those assumptions relating to the functional aspects of the simulation, such as tooth position and linear load case. Results show a complex interaction between simulation assumptions, depending on the combination of assumptions and the overall shape of each specimen. Keeping assumptions consistent between models in an analysis does not ensure that results can be generalised beyond the specific set of assumptions used.
By the same logic, other comparative datasets will also be sensitive to identical simulation assumptions; hence, modelling assumptions should undergo rigorous selection. The accuracy of input data is paramount, and simulations should take biological context into account. Ideally, validation of simulations should be addressed; however, where validation is impossible or unfeasible, sensitivity analyses should be performed to identify which assumptions have the greatest influence upon the results.
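The kind of assumption sweep described above can be sketched in a few lines. This is a minimal illustration only: the specimen names, baseline responses, and multiplicative assumption factors are invented, and real FE responses interact with specimen geometry rather than factoring cleanly:

```python
# Sketch of a comparative sensitivity analysis: sweep combinations of
# modelling assumptions and ask whether the *ranking* of specimens is
# stable, not just whether absolute response values shift.
from itertools import product

species = ["A", "B", "C"]                     # hypothetical specimens
base_strain = {"A": 1.0, "B": 1.4, "C": 2.1}  # invented baseline responses

# Invented multiplicative effects of each assumption choice:
materials = {"homogeneous": 1.00, "heterogeneous": 0.92}
scalings  = {"volume": 1.00, "surface_area": 1.10}
tooth_pos = {"front": 1.30, "mid": 1.00, "back": 0.75}

rankings = set()
for mat, scale, tooth in product(materials, scalings, tooth_pos):
    factor = materials[mat] * scalings[scale] * tooth_pos[tooth]
    response = {s: base_strain[s] * factor for s in species}
    rankings.add(tuple(sorted(species, key=lambda s: response[s])))

# With purely multiplicative effects the interspecies ranking never changes;
# in real FE models assumptions interact with shape, so rankings can differ.
print(rankings)  # {('A', 'B', 'C')}
```

The point of the toy example is the study's caveat in reverse: only when assumption effects factor out completely is a comparative ranking guaranteed to be robust, which is precisely what cannot be taken for granted in practice.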
Automated Assume-Guarantee Reasoning by Abstraction Refinement
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Giannakopoulou, Dimitra
2008-01-01
Current automated approaches for compositional model checking in the assume-guarantee style are based on learning of assumptions as deterministic automata. We propose an alternative approach based on abstraction refinement. Our new method computes the assumptions for the assume-guarantee rules as conservative and not necessarily deterministic abstractions of some of the components, and refines those abstractions using counter-examples obtained from model checking them together with the other components. Our approach also exploits the alphabets of the interfaces between components and performs iterative refinement of those alphabets as well as of the abstractions. We show experimentally that our preliminary implementation of the proposed alternative achieves similar or better performance than a previous learning-based implementation.
The role of clinician emotion in clinical reasoning: Balancing the analytical process.
Langridge, Neil; Roberts, Lisa; Pope, Catherine
2016-02-01
This review paper identifies and describes the role of clinicians' memory, emotions and physical responses in clinical reasoning processes. Clinical reasoning is complex and multi-factorial; key models of clinical reasoning within musculoskeletal physiotherapy are discussed, highlighting the omission of emotion and subsequent physical responses and how these can affect a clinician when making a decision. It is proposed that clinicians should consider the emotions associated with decision-making, especially when there is concern surrounding a presentation. Reflecting on practice in the clinical environment and subsequently applying this to a patient presentation should involve some acknowledgement of clinicians' physical responses and emotions, and how they may play a part in any decision made. The paper also presents intuition and gut feeling as distinct reasoning methods and discusses how these processes co-exist with more accepted reasoning such as the hypothetico-deductive approach. Musculoskeletal physiotherapy should consider feelings, emotions and physical responses when applying reflective practice principles. Furthermore, clinicians dealing with difficult and challenging presentations should examine the emotional as well as the analytical experience when justifying decisions and learning from practice.
On general features of warm dark matter with reduced relativistic gas
NASA Astrophysics Data System (ADS)
Hipólito-Ricaldi, W. S.; vom Marttens, R. F.; Fabris, J. C.; Shapiro, I. L.; Casarini, L.
2018-05-01
Reduced relativistic gas (RRG) is a useful approach to describe warm dark matter (WDM) or the warmness of baryonic matter in the approximation where the interaction between the particles is irrelevant. The use of the Maxwell distribution leads to the complicated equation of state of the Jüttner model of relativistic ideal gas. The RRG enables one to reproduce the same physical situation in a much simpler form. For this reason RRG can be a useful tool for theories with some sort of "new Physics". On the other hand, even without qualitatively new physical ingredients, the RRG can be used to describe the general features of WDM in a model-independent way. In this sense one can see, in particular, to what extent the cosmological manifestations of WDM may depend on its Particle Physics background. In the present work RRG is used as a complementary approach to derive the main observational features of WDM in a model-independent way. The only assumption concerns a non-negligible velocity v for dark matter particles, parameterized by the warmness parameter b. Relatively high values of b (b² ∼ 10⁻⁶) erase the radiation (photon and neutrino) dominated epoch and cause an early warm matter domination after inflation. Furthermore, the RRG approach enables one to quantify the lack of power in the linear matter spectrum at small scales and, in particular, reproduces the relative transfer function commonly used in the context of WDM with an accuracy of ≲ 1%. A warmness with b² ≲ 10⁻⁶ (equivalent to v ≲ 300 km/s) does not alter the CMB power spectrum significantly and is in agreement with background observational tests.
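For reference, the simplification the abstract alludes to is the RRG equation of state, which in the form commonly quoted in the RRG literature reads (notation and normalization conventions may differ slightly from the paper):

```latex
% RRG equation of state: \rho is the energy density, \rho_d the rest-energy
% (mass) density. Limits: P \to \rho/3 for \rho \gg \rho_d (radiation-like),
% P \to 0 for \rho \to \rho_d (cold matter).
P \;=\; \frac{\rho}{3}\left[1-\left(\frac{\rho_d}{\rho}\right)^{2}\right]

% Energy conservation then gives a closed-form scaling with the scale
% factor a, where b parameterizes the warmness:
\rho(a) \;=\; \rho_{d0}\, a^{-3}\sqrt{1+\frac{b^{2}}{a^{2}}}
```

The two limits make the interpolation explicit: for a ≪ b the density scales as a⁻⁴ (radiation-like), while for a ≫ b it scales as a⁻³ (dust-like), which is why large b erases the radiation-dominated epoch as stated above.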
d’Uva, Teresa Bago; Lindeboom, Maarten; O’Donnell, Owen; van Doorslaer, Eddy
2011-01-01
We propose tests of the two assumptions under which anchoring vignettes identify heterogeneity in reporting of categorical evaluations. Systematic variation in the perceived difference between any two vignette states is sufficient to reject vignette equivalence. Response consistency (the respondent uses the same response scale to evaluate the vignette and herself) is testable given sufficiently comprehensive objective indicators that independently identify response scales. Both assumptions are rejected for reporting of cognitive and physical functioning in a sample of older English individuals, although a weaker test resting on less stringent assumptions does not reject response consistency for cognition. PMID:22184479
Are K-12 Learners Motivated in Physical Education? A Meta-Analysis
ERIC Educational Resources Information Center
Chen, Senlin; Chen, Ang; Zhu, Xihe
2012-01-01
Previous studies devoted to K-12 learner motivation in physical education share a general assumption that students may lack motivation. This meta-analytic study examined published original studies (n = 79) to determine students' motivation level and the association between motivation and outcomes. Original means of motivation measures were…
Obesity, Health and Physical Education: A Bourdieuean Perspective
ERIC Educational Resources Information Center
Fitzpatrick, Katie
2011-01-01
Assumptions and interventions about the so-called "obesity epidemic" pervade health and physical education classrooms and national policy agendas in New Zealand, as they do elsewhere in the Western world. In contrast, critical scholars in these subjects advocate an active deconstruction of the tenets and presumptions underpinning public…
Physical Education Teachers' Cultural Competency
ERIC Educational Resources Information Center
Harrison, Louis, Jr.; Carson, Russell L.; Burden, Joe, Jr.
2010-01-01
The purpose of this study was to evaluate the common assumption that teachers of color (TOC) are more culturally competent than White teachers by assessing physical education teachers' cultural competency. A secondary purpose was to ascertain the possible differences in cultural competence levels of White teachers in diverse school settings versus…
Which Accelerates Faster--A Falling Ball or a Porsche?
ERIC Educational Resources Information Center
Rall, James D.; Abdul-Razzaq, Wathiq
2012-01-01
An introductory physics experiment has been developed to address the issues seen in conventional physics lab classes including assumption verification, technological dependencies, and real world motivation for the experiment. The experiment has little technology dependence and compares the acceleration due to gravity by using position versus time…
Siegmund, Thorsten; Heinemann, Lutz; Kolassa, Ralf; Thomas, Andreas
2017-01-01
Background: For decades, the major source of information used to make therapeutic decisions by patients with diabetes has been glucose measurement in capillary blood samples. Knowledge gained from clinical studies, for example, on the impact of metabolic control on diabetes-related complications, is based on such measurements. Unlike traditional blood glucose measurement systems, systems for continuous glucose monitoring (CGM) measure glucose in interstitial fluid (ISF). The assumption is that glucose levels in blood and ISF are practically the same and that the information provided can be used interchangeably. Thus, therapeutic decisions, that is, the selection of insulin doses, are based on CGM system results interpreted as though they were blood glucose values. Methods: We performed a more detailed analysis and interpretation of glucose profiles obtained with CGM in situations with high glucose dynamics to evaluate this potentially misleading assumption. Results: Consideration of physical activity, hypoglycemic episodes, and meals uncovers clinically relevant differences between glucose levels in blood and ISF that can make it risky, from a therapeutic point of view, to base therapeutic decisions on blood glucose. Conclusions: Further systematic and structured evaluation of whether the use of ISF glucose is safer and more efficient for acute therapeutic decisions is necessary. These data might also have higher prognostic relevance for the long-term metabolic consequences of diabetes. In the long run, it may be reasonable to abandon blood glucose measurements as the basis for diabetes management and switch to using ISF glucose as the appropriate therapeutic target. PMID:28322063
Sliding friction between polymer surfaces: A molecular interpretation
NASA Astrophysics Data System (ADS)
Allegra, Giuseppe; Raos, Guido
2006-04-01
For two contacting rigid bodies, the friction force F is proportional to the normal load and independent of the macroscopic contact area and relative velocity V (Amontons' law). With two mutually sliding polymer samples, the surface irregularities transmit deformation to the underlying material. Energy loss along the deformation cycles is responsible for the friction force, which now appears to depend strongly on V [see, e.g., N. Maeda et al., Science 297, 379 (2002)]. We base our theoretical interpretation on the assumption that polymer chains are mainly subjected to oscillatory "reptation" along their "tubes." At high deformation frequencies, i.e., with a large sliding velocity V, the internal viscosity due to the rotational energy barriers around chain bonds hinders intramolecular mobility. As a result, energy dissipation and the correlated friction force strongly diminish at large V. Derived from a linear differential equation for chain dynamics, our results are basically consistent with the experimental data by Maeda et al. [Science 297, 379 (2002)] on modified polystyrene. Although the bulk polymer is below Tg, we regard the first few chain layers below the surface to be in the liquid state. In particular, the observed maximum of F vs V is consistent with physically reasonable values of the molecular parameters. As a general result, the ratio F/V is a steadily decreasing function of V, tending to V⁻² for large velocities. We evaluate a much smaller friction for a cross-linked polymer under the assumption that the junctions are effectively immobile, also in agreement with the experimental results of Maeda et al. [Science 297, 379 (2002)].
ERIC Educational Resources Information Center
Hajisoteriou, Christina; Neophytou, Lefkios; Angelides, Panayiotis
2015-01-01
Since 2004, the Ministry of Education and Culture in Cyprus has launched an educational reform. The Ministry highlighted Cyprus' participation in the European context and, by extension, the turning-into-multicultural character of the Cypriot society as the most important reasons, which necessitated such a reform. This paper seeks to examine the…
29 CFR 4044.75 - Other lump sum benefits.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sum benefits. The value of a lump sum benefit which is not covered under § 4044.73 or § 4044.74 is equal to— (a) The value under the qualifying bid, if an insurer provides the benefit; or (b) The present value of the benefit as of the date of distribution, determined using reasonable actuarial assumptions...
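The "present value ... determined using reasonable actuarial assumptions" in the excerpt above amounts to discounting each expected payment for both interest and survival. The sketch below is purely illustrative: the benefit amount, interest rate, and survival probabilities are invented, not drawn from the regulation:

```python
# Present value of an annual benefit B paid at the end of each year t,
# contingent on survival, discounted at interest rate i:
#   PV = sum_t  B * p_survive(t) * (1 + i)^-(t+1)

def present_value(benefit, survival_probs, rate):
    """Actuarial present value of a survival-contingent annual benefit."""
    return sum(benefit * p * (1 + rate) ** -(t + 1)
               for t, p in enumerate(survival_probs))

# Hypothetical inputs: $10,000/yr for 3 years, 5% interest, toy survival curve
pv = present_value(10_000, [0.99, 0.97, 0.94], 0.05)
print(round(pv, 2))  # 26346.83
```

Changing the "reasonable assumptions" (the rate or the survival curve) changes the valuation directly, which is why the regulation constrains their selection.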
Code of Federal Regulations, 2010 CFR
2010-04-01
... economic, actuarial, and other assumptions used in the determination are reasonable; (ii) The computations...) Separate accounts. This section does not apply to any contract whose value varies according to the..., Jan. 16, 2009, § 230.151A was added, effective Jan. 12, 2011. ...
ERIC Educational Resources Information Center
Truett, Carol
Women have comprised a declining percentage of public school administrators since 1928 when 55 percent of all elementary principals were female. A heretofore unchallenged reason for this decline has been the assumption that women lack geographic mobility and that consequently neither they themselves nor potential employers should consider them…
Missing Not at Random Models for Latent Growth Curve Analyses
ERIC Educational Resources Information Center
Enders, Craig K.
2011-01-01
The past decade has seen a noticeable shift in missing data handling techniques that assume a missing at random (MAR) mechanism, where the propensity for missing data on an outcome is related to other analysis variables. Although MAR is often reasonable, there are situations where this assumption is unlikely to hold, leading to biased parameter…
Reds, Greens, Yellows Ease the Spelling Blues.
ERIC Educational Resources Information Center
Irwin, Virginia
1971-01-01
This document reports on a color-coding innovation designed to improve the spelling ability of high school seniors. This color-coded system is based on two assumptions: that color will appeal to the students and that there are three principal reasons for misspelling. Two groups were chosen for the experiments. A basic list of spelling demons was…
Wittgenstein and Stage-Setting: Being Brought into the Space of Reasons
ERIC Educational Resources Information Center
Simpson, David
2014-01-01
Wittgenstein constantly invokes teaching, training and learning in his later work. It is therefore interesting to consider what role these notions play for him there. I argue that their use is central to Wittgenstein's attempt to refute cognitivist assumptions, and to show how normative practices can be understood without the threat of…
Modeling Socially Desirable Responding and Its Effects
ERIC Educational Resources Information Center
Ziegler, Matthias; Buehner, Markus
2009-01-01
The impact of socially desirable responding or faking on noncognitive assessments remains an issue of strong debate. One of the main reasons for the controversy is the lack of a statistical method to model such response sets. This article introduces a new way to model faking based on the assumption that faking occurs due to an interaction between…
Discrete Structure-Point Testing: Problems and Alternatives. TESL Reporter, Vol. 9, No. 4.
ERIC Educational Resources Information Center
Aitken, Kenneth G.
This paper presents some reasons for reconsidering the use of discrete structure-point tests of language proficiency, and suggests an alternative basis for designing proficiency tests. Discrete point tests are one of the primary tools of the audio-lingual method of teaching a foreign language and are based on certain assumptions, including the…
Teaching, Rather than Teachers, as a Path toward Improving Classroom Instruction
ERIC Educational Resources Information Center
Hiebert, James; Morris, Anne K.
2012-01-01
For several historical and cultural reasons, the United States has long pursued a strategy of improving teaching by improving teachers. The rarely questioned logic underlying this choice says that by improving the right characteristics of teachers, they will teach more effectively. The authors expose the assumptions on which this logic is built,…
7 CFR 1710.303 - Power cost studies-power supply borrowers.
Code of Federal Regulations, 2010 CFR
2010-01-01
... contracts or revisions to existing contracts, and an analysis of the effects on power costs; (4) Use of sensitivity analyses to determine the vulnerability of the alternatives to a reasonable range of assumptions... conservation alternatives as set forth in §§ 1710.253 and 1710.254; (2) A present-value analysis of the costs...
ERIC Educational Resources Information Center
Lane, Suzanne; And Others
1995-01-01
Over 5,000 students participated in a study of the dimensionality and stability of the item parameter estimates of a mathematics performance assessment developed for the Quantitative Understanding: Amplifying Student Achievement and Reasoning (QUASAR) Project. Results demonstrate the test's dimensionality and illustrate ways to examine use of the…
Expanding the New Design: The NAEP 1985-86 Technical Report.
ERIC Educational Resources Information Center
Beaton, Albert E.; And Others
This report supplies details of the design and data analysis of the 1986 National Assessment of Educational Progress (NAEP) to allow the reader to judge the utility of the design, data quality, reasonableness of assumptions, appropriateness of data analyses, and generalizability of inferences made from the data. After an introduction by A. E.…
Multi-Cultural Literacy in the Composition Classroom: Report on a Pilot Project.
ERIC Educational Resources Information Center
Hoffman, Amy
At the root of the writing problems of most college students is a lack of critical thinking. Students find analyzing an article or essay, writing a review, or arguing persuasively difficult and unpleasant because they have little practice in identifying and evaluating assumptions and reasoning. One solution to this problem, developed by a college…
ERIC Educational Resources Information Center
Kelley, Jane E.
2008-01-01
Reconstructed fairy tales provide a different point of view and challenge the assumptions of a common set of values; for that reason, these stories provide a medium in which to examine power relationships in texts by applying a critical multicultural analysis (Botelho & Rudman, forthcoming, 2008, "A critical multicultural analysis of children's…
Reflections on the surface energy imbalance problem
Ray Leuning; Eva van Gorsela; William J. Massman; Peter R. Isaac
2012-01-01
The 'energy imbalance problem' in micrometeorology arises because at most flux measurement sites the sum of eddy fluxes of sensible and latent heat (H + λE) is less than the available energy (A). Either eddy fluxes are underestimated or A is overestimated. Reasons for the imbalance are: (1) a failure to satisfy the fundamental assumption of one-...
ERIC Educational Resources Information Center
Van Hoof, Jo; Lijnen, Tristan; Verschaffel, Lieven; Van Dooren, Wim
2013-01-01
Rational numbers and particularly fractions are difficult for students. It is often claimed that the "natural number bias" underlies erroneous reasoning about rational numbers. This cross-sectional study investigated the natural number bias in first and fifth year secondary school students. Relying on dual process theory assumptions that…
Trusted computation through biologically inspired processes
NASA Astrophysics Data System (ADS)
Anderson, Gustave W.
2013-05-01
Due to supply chain threats it is no longer a reasonable assumption that traditional protections alone will provide sufficient security for enterprise systems. The proposed cognitive trust model architecture extends the state-of-the-art in enterprise anti-exploitation technologies by providing collective immunity through backup and cross-checking, proactive health monitoring and adaptive/autonomic threat response, and network resource diversity.
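The "collective immunity through backup and cross-checking" idea can be illustrated with the classic majority-vote pattern. This is a generic sketch of redundant cross-checking, not the paper's cognitive trust architecture; the replica functions are invented:

```python
# Generic N-version cross-checking: run redundant replicas of a computation
# and accept the majority answer, masking a single compromised replica.
from collections import Counter

def majority_vote(replicas, *args):
    """Return the strict-majority result of the replica computations."""
    results = [f(*args) for f in replicas]
    value, count = Counter(results).most_common(1)[0]
    if count <= len(replicas) // 2:
        raise RuntimeError("no majority: possible coordinated compromise")
    return value

# Hypothetical replicas: two honest, one altered by a supply-chain implant
def honest(x):
    return x * x

def tampered(x):
    return x * x + 1

print(majority_vote([honest, honest, tampered], 7))  # 49
```

Note the failure mode the guard raises on: voting only masks a minority of compromised replicas, which is why such schemes are paired with diversity so that a single supply-chain implant cannot affect a majority.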
Making Facilitation Work: The Challenges on an International DBA Action Learning Set
ERIC Educational Resources Information Center
OFarrell, Jack
2018-01-01
This account relates my experiences as facilitator of an action learning set on a DBA cohort comprising international students and myself. It outlines the reasons for my selection as facilitator and describes my initial expectations and assumptions of action learning. I chart the difficulty in separating the 'what' of my own research from the…
ERIC Educational Resources Information Center
Smits, Ilse; Soenens, Bart; Vansteenkiste, Maarten; Luyckx, Koen; Goossens, Luc
2010-01-01
Self-determination theory (SDT) distinguishes between autonomous and controlled reasons for people's behavior and essentially states that beneficial effects for individuals' psychosocial adjustment will accrue when behavior is guided by autonomous (rather than controlled) motives. The present study tested this assumption in the area of…
ERIC Educational Resources Information Center
Byars, Alvin Gregg
The objectives of this investigation are to develop, describe, assess, and demonstrate procedures for constructing mastery tests to minimize errors of classification and to maximize decision reliability. The guidelines are based on conditions where item exchangeability is a reasonable assumption and the test constructor can control the number of…
Student Learning and Performance in Information Systems Courses: The Role of Academic Motivation
ERIC Educational Resources Information Center
Herath, Tejaswini C.
2015-01-01
Despite the need for information technology knowledge in the business world today, enrollments in information systems (IS) courses have been consistently declining. Student performance in lower level IS courses and student assumptions about the level of difficulty of the courses seem to be reasons for lower enrollments. To understand how student…
Pre-Service Physics Teachers' Difficulties in Understanding Special Relativity Topics
ERIC Educational Resources Information Center
Ünlü Yavas, Pervin; Kizilcik, Hasan Sahin
2016-01-01
The aim of this study is to identify the reasons why pre-service physics teachers have difficulties related to special relativity topics. In this study conducted with 25 pre-service physics teachers, the case study method, which is a qualitative research method, was used. Interviews were held with the participants about their reasons for…
Patterns of Physics Reasoning in Face-to-Face and Online Forum Collaboration around a Digital Game
ERIC Educational Resources Information Center
Van Eaton, Grant; Clark, Douglas B.; Smith, Blaine E.
2015-01-01
Students playing digital learning games in the classroom rarely play alone, even in digital games that are ostensibly "single-player" games. This study explores the patterns of physics reasoning that emerge in face-to-face and online forum collaboration while students play a physics-focused educational game in their classroom. We…
Can Short Duration Visual Cues Influence Students' Reasoning and Eye Movements in Physics Problems?
ERIC Educational Resources Information Center
Madsen, Adrian; Rouinfar, Amy; Larson, Adam M.; Loschky, Lester C.; Rebello, N. Sanjay
2013-01-01
We investigate the effects of visual cueing on students' eye movements and reasoning on introductory physics problems with diagrams. Participants in our study were randomly assigned to either the cued or noncued conditions, which differed by whether the participants saw conceptual physics problems overlaid with dynamic visual cues. Students in the…
Possibilities: A framework for modeling students' deductive reasoning in physics
NASA Astrophysics Data System (ADS)
Gaffney, Jonathan David Housley
Students often make errors when trying to solve qualitative or conceptual physics problems, and while many successful instructional interventions have been generated to prevent such errors, the process of deduction that students use when solving physics problems has not been thoroughly studied. In an effort to better understand that reasoning process, I have developed a new framework, which is based on the mental models framework in psychology championed by P. N. Johnson-Laird. My new framework models how students search possibility space when thinking about conceptual physics problems and suggests that errors arise from failing to flesh out all possibilities. It further suggests that instructional interventions should focus on making apparent those possibilities, as well as all physical consequences those possibilities would incur. The possibilities framework emerged from the analysis of data from a unique research project specifically invented for the purpose of understanding how students use deductive reasoning. In the selection task, participants were given a physics problem along with three written possible solutions with the goal of identifying which one of the three possible solutions was correct. Each participant was also asked to identify the errors in the incorrect solutions. For the study presented in this dissertation, participants not only performed the selection task individually on four problems, but they were also placed into groups of two or three and asked to discuss with each other the reasoning they used in making their choices and attempt to reach a consensus about which solution was correct. Finally, those groups were asked to work together to perform the selection task on three new problems. The possibilities framework appropriately models the reasoning that students use, and it makes useful predictions about potentially helpful instructional interventions. 
The study reported in this dissertation emphasizes the useful insight the possibilities framework provides. For example, this framework allows us to detect subtle differences in students' reasoning errors, even when those errors result in the same final answer. It also illuminates how simply mentioning overlooked quantities can instigate new lines of student reasoning. It allows us to better understand how well-known psychological biases, such as the belief bias, affect the reasoning process by preventing reasoners from fleshing out all of the possibilities. The possibilities framework also allows us to track student discussions about physics, revealing the need for all parties in communication to use the same set of possibilities in the conversations to facilitate successful understanding. The framework also suggests some of the influences that affect how reasoners choose between possible solutions to a given problem. This new framework for understanding how students reason when solving conceptual physics problems opens the door to a significant field of research. The framework itself needs to be further tested and developed, but it provides substantial suggestions for instructional interventions. If we hope to improve student reasoning in physics, the possibilities framework suggests that we are perhaps best served by teaching students how to fully flesh out the possibilities in every situation. This implies that we need to ensure students have a deep understanding of all of the implied possibilities afforded by the fundamental principles that are the cornerstones of the models we teach in physics classes.
NASA Astrophysics Data System (ADS)
Lowrie, Tom; Jorgensen, Robyn
2018-03-01
Since the early 1970s, there has been recognition of specific differences in mathematics achievement based on variables such as gender and socio-economic background. However, these differences are not unilateral but rather quite specific, and relate strongly to spatial reasoning. This early work has paved the way for thinking critically about who achieves in mathematics and why. This project innovatively combines the strengths of the two Chief Investigators: Lowrie's work in spatial reasoning and Jorgensen's work in equity. The assumptions, approach and theoretical framing used in the study unite quite disparate areas of mathematics education into a cogent research program that seeks to challenge some of the long-held views in the field of mathematics education.
A general method for handling missing binary outcome data in randomized controlled trials.
Jackson, Dan; White, Ian R; Mason, Dan; Sutton, Stephen
2014-12-01
The analysis of randomized controlled trials with incomplete binary outcome data is challenging. We develop a general method for exploring the impact of missing data in such trials, with a focus on abstinence outcomes. We propose a sensitivity analysis where standard analyses, which could include 'missing = smoking' and 'last observation carried forward', are embedded in a wider class of models. We apply our general method to data from two smoking cessation trials, with 489 and 1758 participants respectively; the abstinence outcomes were obtained using telephone interviews. The estimated intervention effects from both trials depend on the sensitivity parameters used. The findings differ considerably in magnitude and statistical significance under quite extreme assumptions about the missing data, but are reasonably consistent under more moderate assumptions. A new method for undertaking sensitivity analyses when handling missing data in trials with binary outcomes allows a wide range of assumptions about the missing data to be assessed. In two smoking cessation trials the results were insensitive to all but extreme assumptions.
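A sensitivity analysis of this general kind can be sketched by indexing the assumption about the missing participants with a single parameter and sweeping it. The parameterization and the counts below are invented for illustration and are not the authors' model:

```python
# Sensitivity analysis for a missing binary outcome: treat the abstinence
# probability among missing participants as a multiple (delta) of that
# among observed participants, then sweep delta across assumptions.
# delta = 0 reproduces the pessimistic 'missing = smoking' analysis.

def estimated_abstinence(n_success, n_observed, n_missing, delta):
    p_obs = n_success / n_observed
    p_miss = min(1.0, delta * p_obs)   # assumed rate among the missing
    n_total = n_observed + n_missing
    return (n_success + p_miss * n_missing) / n_total

# Invented single-arm counts: 60 abstinent of 200 observed, 50 missing
arm = dict(n_success=60, n_observed=200, n_missing=50)
for delta in (0.0, 0.5, 1.0):
    print(delta, round(estimated_abstinence(**arm, delta=delta), 3))
```

If the estimate (or a treatment contrast built from two such arms) is stable over a plausible range of delta, conclusions are robust to the missing-data assumption; if not, the data alone cannot settle the question.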
NASA Astrophysics Data System (ADS)
Besson, Ugo
2010-03-01
This paper presents an analysis of the different types of reasoning and physical explanation used in science, common thought, and physics teaching. It then reflects on the learning difficulties connected with these various approaches, and suggests some possible didactic strategies. Although causal reasoning occurs very frequently in common thought and daily life, it has long been the subject of debate and criticism among philosophers and scientists. In this paper, I begin by providing a description of some general tendencies of common reasoning that have been identified by didactic research. Thereafter, I briefly discuss the role of causality in science, as well as some different types of explanation employed in the field of physics. I then present some results of a study examining the causal reasoning used by students in solid and fluid mechanics. The differences found between the types of reasoning typical of common thought and those usually proposed during instruction can create learning difficulties and impede student motivation. Many students do not seem satisfied by the mere application of formal laws and functional relations. Instead, they express the need for a causal explanation, a mechanism that allows them to understand how a state of affairs has come about. I discuss a few didactic strategies aimed at overcoming these problems, and describe, in general terms, two examples of mechanics teaching sequences which were developed and tested in different contexts. The paper ends with a reflection on the possible role to be played in physics learning by intuitive and imaginative thought, and the use of simple explanatory models based on physical analogies and causal mechanisms.
Key rate for calibration robust entanglement based BB84 quantum key distribution protocol
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gittsovich, O.; Moroder, T.
2014-12-04
We apply the approach of verifying entanglement, which is based on sole knowledge of the dimension of the underlying physical system, to the entanglement-based version of the BB84 quantum key distribution protocol. We show that the familiar one-way key rate formula already holds under the sole assumption that one of the parties is measuring a qubit; no further assumptions about the measurements are needed.
Speed of transverse waves in a string revisited
NASA Astrophysics Data System (ADS)
Rizcallah, Joseph A.
2017-11-01
In many introductory-level physics textbooks, the derivation of the formula for the speed of transverse waves in a string is either omitted altogether or presented under physically overly idealized assumptions about the shape of the considered wave pulse and the related velocity and acceleration distributions. In this paper, we derive the named formula by applying Newton’s second law or the work-energy theorem to a finite element of the string, making no assumptions about the shape of the wave. We argue that the suggested method can help the student gain a deeper insight into the nature of waves and the related process of energy transport, as well as provide a new experience with the fundamental principles of mechanics as applied to extended and deformable bodies.
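The formula whose derivation the paper revisits is the standard result for the transverse wave speed. For orientation, the conventional textbook sketch (which relies on exactly the small-slope idealizations the paper's shape-independent approach avoids) applies Newton's second law to a string element of length $\Delta x$ and mass $\mu\,\Delta x$ under tension $F$:

```latex
% Newton's second law for a string element, assuming small slopes:
\mu \,\Delta x\, \frac{\partial^2 y}{\partial t^2}
  = F\left[\left.\frac{\partial y}{\partial x}\right|_{x+\Delta x}
         - \left.\frac{\partial y}{\partial x}\right|_{x}\right]
\;\xrightarrow{\;\Delta x \to 0\;}\;
\frac{\partial^2 y}{\partial t^2}
  = \frac{F}{\mu}\,\frac{\partial^2 y}{\partial x^2},
\qquad v = \sqrt{F/\mu}.
```

The paper's contribution is to recover $v = \sqrt{F/\mu}$ without assuming anything about the pulse shape behind this limit.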
Status of thermalhydraulic modelling and assessment: Open issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bestion, D.; Barre, F.
1997-07-01
This paper presents the status of the physical modelling in present codes used for Nuclear Reactor Thermalhydraulics (TRAC, RELAP 5, CATHARE, ATHLET, ...) and attempts to list the unresolved or partially resolved issues. First, the capabilities and limitations of present codes are presented. They are mainly known from a synthesis of the assessment calculations performed for both separate effect tests and integral effect tests. It is also interesting to list all the assumptions and simplifications which were made in the establishment of the system of equations and of the constitutive relations. Many of the present limitations are associated with physical situations where these assumptions are not valid. Then, recommendations are proposed to extend the capabilities of these codes.
Interpreting findings from Mendelian randomization using the MR-Egger method.
Burgess, Stephen; Thompson, Simon G
2017-05-01
Mendelian randomization-Egger (MR-Egger) is an analysis method for Mendelian randomization using summarized genetic data. MR-Egger consists of three parts: (1) a test for directional pleiotropy, (2) a test for a causal effect, and (3) an estimate of the causal effect. While conventional analysis methods for Mendelian randomization assume that all genetic variants satisfy the instrumental variable assumptions, the MR-Egger method is able to assess whether genetic variants have pleiotropic effects on the outcome that differ on average from zero (directional pleiotropy), as well as to provide a consistent estimate of the causal effect, under a weaker assumption-the InSIDE (INstrument Strength Independent of Direct Effect) assumption. In this paper, we provide a critical assessment of the MR-Egger method with regard to its implementation and interpretation. While the MR-Egger method is a worthwhile sensitivity analysis for detecting violations of the instrumental variable assumptions, there are several reasons why causal estimates from the MR-Egger method may be biased and have inflated Type 1 error rates in practice, including violations of the InSIDE assumption and the influence of outlying variants. The issues raised in this paper have potentially serious consequences for causal inferences from the MR-Egger approach. We give examples of scenarios in which the estimates from conventional Mendelian randomization methods and MR-Egger differ, and discuss how to interpret findings in such cases.
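The core of the MR-Egger estimator described above is a weighted linear regression of summarized variant-outcome associations on variant-exposure associations with an unconstrained intercept. A minimal sketch follows; the genetic association values are invented for illustration, and a full analysis would also compute standard errors and the pleiotropy test.

```python
# Minimal MR-Egger regression on summarized genetic data (illustrative only).
import numpy as np

def mr_egger(beta_x, beta_y, se_y):
    """Weighted regression of variant-outcome associations (beta_y) on
    variant-exposure associations (beta_x), weighted by 1/se_y**2.
    Returns (intercept, slope): a nonzero intercept suggests directional
    pleiotropy; the slope estimates the causal effect under InSIDE."""
    beta_x = np.asarray(beta_x, float)
    w = 1.0 / np.asarray(se_y, float) ** 2
    X = np.column_stack([np.ones_like(beta_x), beta_x])
    WX = X * w[:, None]
    # Weighted least squares via the normal equations (X' W X) b = X' W y.
    coef = np.linalg.solve(X.T @ WX, WX.T @ np.asarray(beta_y, float))
    return coef[0], coef[1]

# Hypothetical associations for a handful of genetic variants.
beta_x = np.array([0.10, 0.15, 0.22, 0.30, 0.41])
beta_y = np.array([0.06, 0.08, 0.12, 0.16, 0.21])
se_y   = np.array([0.02, 0.02, 0.03, 0.03, 0.04])
intercept, slope = mr_egger(beta_x, beta_y, se_y)
```

The freed intercept is what distinguishes MR-Egger from conventional inverse-variance-weighted Mendelian randomization, which forces the regression through the origin.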
Optimal post-experiment estimation of poorly modeled dynamic systems
NASA Technical Reports Server (NTRS)
Mook, D. Joseph
1988-01-01
Recently, a novel strategy for post-experiment state estimation of discretely-measured dynamic systems has been developed. The method accounts for errors in the system dynamic model equations in a more general and rigorous manner than do filter-smoother algorithms. The dynamic model error terms do not require the usual process noise assumptions of zero-mean, symmetrically distributed random disturbances. Instead, the model error terms require no prior assumptions other than piecewise continuity. The resulting state estimates are more accurate than filters for applications in which the dynamic model error clearly violates the typical process noise assumptions, and the available measurements are sparse and/or noisy. Estimates of the dynamic model error, in addition to the states, are obtained as part of the solution of a two-point boundary value problem, and may be exploited for numerous reasons. In this paper, the basic technique is explained, and several example applications are given. Included among the examples are both state estimation and exploitation of the model error estimates.
Alternatives for discounting in the analysis of noninferiority trials.
Snapinn, Steven M
2004-05-01
Determining the efficacy of an experimental therapy relative to placebo on the basis of an active-control noninferiority trial requires reference to historical placebo-controlled trials. The validity of the resulting comparison depends on two key assumptions: assay sensitivity and constancy. Since the truth of these assumptions cannot be verified, it seems logical to raise the standard of evidence required to declare efficacy; this concept is referred to as discounting. It is not often recognized that two common design and analysis approaches, setting a noninferiority margin and requiring preservation of a fraction of the standard therapy's effect, are forms of discounting. The noninferiority margin is a particularly poor approach, since its degree of discounting depends on an irrelevant factor. Preservation of effect is more reasonable, but it addresses only the constancy assumption, not the issue of assay sensitivity. Gaining consensus on the most appropriate approach to the design and analysis of noninferiority trials will require a common understanding of the concept of discounting.
NASA Astrophysics Data System (ADS)
Makungo, Rachel; Odiyo, John O.
2017-08-01
This study focused on testing the ability of a coupled linear and non-linear system identification model in estimating groundwater levels. System identification provides an alternative approach for estimating groundwater levels in areas that lack data required by physically-based models. It also overcomes the limitations of physically-based models due to approximations, assumptions and simplifications. Daily groundwater levels for 4 boreholes, rainfall and evaporation data covering the period 2005-2014 were used in the study. Seventy and thirty percent of the data were used to calibrate and validate the model, respectively. Correlation coefficient (R), coefficient of determination (R2), root mean square error (RMSE), percent bias (PBIAS), Nash-Sutcliffe coefficient of efficiency (NSE) and graphical fits were used to evaluate the model performance. Values for R, R2, RMSE, PBIAS and NSE ranged from 0.8 to 0.99, 0.63 to 0.99, 0.01 to 2.06 m, -7.18 to 1.16 and 0.68 to 0.99, respectively. Comparisons of observed and simulated groundwater levels for calibration and validation runs showed close agreements. Model performance mostly ranged from satisfactory to excellent. Thus, the model is able to estimate groundwater levels. The calibrated models reasonably capture the relationship between input and output variables and can thus be used to estimate long-term groundwater levels.
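The goodness-of-fit measures quoted in this abstract have standard definitions; a small sketch with made-up observed and simulated groundwater levels is shown below (the sign convention for PBIAS varies across the literature; one common convention is used here).

```python
# Standard model-evaluation metrics for observed vs. simulated series.
import numpy as np

def fit_metrics(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    resid = obs - sim
    rmse = np.sqrt(np.mean(resid ** 2))                  # root mean square error
    pbias = 100.0 * resid.sum() / obs.sum()              # percent bias
    nse = 1.0 - (resid ** 2).sum() / ((obs - obs.mean()) ** 2).sum()  # Nash-Sutcliffe
    r = np.corrcoef(obs, sim)[0, 1]                      # correlation coefficient
    return {"R": r, "R2": r ** 2, "RMSE": rmse, "PBIAS": pbias, "NSE": nse}

# Hypothetical groundwater levels (m above datum).
metrics = fit_metrics([101.2, 101.0, 100.7, 100.9, 101.3],
                      [101.1, 101.0, 100.8, 100.8, 101.2])
```

A perfect simulation gives RMSE = 0, PBIAS = 0 and NSE = 1; NSE below 0 means the model predicts worse than the observed mean.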
On the notion of free will in the Free Will Theorem
NASA Astrophysics Data System (ADS)
Landsman, Klaas
2017-02-01
The (Strong) Free Will Theorem (FWT) of Conway and Kochen (2009) on the one hand follows from uncontroversial parts of modern physics and elementary mathematical and logical reasoning, but on the other hand seems predicated on an undefined notion of free will (allowing physicists to "freely choose" the settings of their experiments). This makes the theorem philosophically vulnerable, especially if it is construed as a proof of indeterminism or even of libertarian free will (as Conway & Kochen suggest). However, Cator and Landsman (Foundations of Physics 44, 781-791, 2014) previously gave a reformulation of the FWT that does not presuppose indeterminism, but rather assumes a mathematically specific form of such "free choices" even in a deterministic world (based on a non-probabilistic independence assumption). In the present paper, which is a philosophical sequel to the one just mentioned, I argue that the concept of free will used in the latter version of the FWT is essentially the one proposed by Lewis (1981), also known as 'local miracle compatibilism' (of which I give a mathematical interpretation that might be of some independent interest also beyond its application to the FWT). As such, the (reformulated) FWT in my view challenges compatibilist free will à la Lewis (albeit in a contrived way via bipartite EPR-type experiments), falling short of supporting libertarian free will.
On the possible onset of the Pioneer anomaly
NASA Astrophysics Data System (ADS)
Feldman, Michael R.; Anderson, John D.
2015-06-01
We explore the possibility that the observed onset of the Pioneer anomaly after Saturn encounter by Pioneer 11 is not necessarily due to mismodeling of solar radiation pressure but instead reflects a physically relevant characteristic of the anomaly itself. We employ the principles of a recently proposed cosmological model termed "the theory of inertial centers" along with an understanding of the fundamental assumptions taken by the Deep Space Network (DSN) to attempt to model this sudden onset. Due to an ambiguity that arises from the difference in the DSN definition of expected light-time with light-time according to the theory of inertial centers, we are forced to adopt a seemingly arbitrary convention to relate DSN-assumed clock-rates to physical clock-rates for this model. We offer a possible reason for adopting the convention employed in our analysis; however, we remain skeptical. Nevertheless, with this convention, one finds that this theory is able to replicate the previously reported Hubble-like behavior of the "clock acceleration" for the Pioneer anomaly as well as the sudden onset of the anomalous acceleration after Pioneer 11 Saturn encounter. While oscillatory behavior with a yearly period is also predicted for the anomalous clock accelerations of both Pioneer 10 and Pioneer 11, the predicted amplitude is an order of magnitude too small when compared with that reported for Pioneer 10.
ERIC Educational Resources Information Center
Hagger, Martin S.; Chatzisarantis, Nikos L. D.; Biddle, Stuart J. H.
2002-01-01
Examined relations between behavior, intentions, attitudes, subjective norms, perceived behavioral control, self-efficacy, and past behaviors using the Theories of Reasoned Action (TRA) and Planned Behavior (TPB) in physical activity. This quantitative integration of the physical activity literature supported the major relationships of the…
ERIC Educational Resources Information Center
Akatugba, Ayo Harriet; Wallace, John
2009-01-01
This study examines students' use of proportional reasoning in high school physics problem-solving in a West African school setting. An in-depth, constructivist, and interpretive case study was carried out with six physics students from a co-educational senior secondary school in Nigeria over a period of five months. The study aimed to elicit…
Diagnosis by integrating model-based reasoning with knowledge-based reasoning
NASA Technical Reports Server (NTRS)
Bylander, Tom
1988-01-01
Our research investigates how observations can be categorized by integrating a qualitative physical model with experiential knowledge. Our domain is diagnosis of pathologic gait in humans, in which the observations are the gait motions, muscle activity during gait, and physical exam data, and the diagnostic hypotheses are the potential muscle weaknesses, muscle mistimings, and joint restrictions. Patients with underlying neurological disorders typically have several malfunctions. Among the problems that need to be faced are: the ambiguity of the observations, the ambiguity of the qualitative physical model, correspondence of the observations and hypotheses to the qualitative physical model, the inherent uncertainty of experiential knowledge, and the combinatorics involved in forming composite hypotheses. Our system divides the work so that the knowledge-based reasoning suggests which hypotheses appear more likely than others, the qualitative physical model is used to determine which hypotheses explain which observations, and another process combines these functionalities to construct a composite hypothesis based on explanatory power and plausibility. We speculate that the reasoning architecture of our system is generally applicable to complex domains in which a less-than-perfect physical model and less-than-perfect experiential knowledge need to be combined to perform diagnosis.
Clendon, Jill; Walker, Léonie
2017-03-01
To examine the dual caregiving and nursing responsibilities of nurses in New Zealand with a view to identifying potential strategies, policies and employment practices that may help to retain nurses with caregiving responsibilities in the workplace. As the nursing workforce ages, child-bearing is delayed and older family members are living longer, family caregiving responsibilities are impacting more on the working life of nurses. This may complicate accurate workforce planning assumptions. An explorative, descriptive design using interviews and focus groups with 28 registered nurses with family caregiving responsibilities. A depth of (largely hidden) experience was exposed revealing considerable guilt, physical, emotional and financial hardship. Regardless of whether the nurse chose to work or had to for financial reasons, family always came first. Demographic and societal changes related to caregiving may have profound implications for nursing. Workplace support is essential to ensure that nurses are able to continue to work. Increased awareness, support, flexibility and specific planning are required to retain nurses with family caregiving responsibilities. © 2016 John Wiley & Sons Ltd.
Atomistic modeling of interphases in spider silk fibers
NASA Astrophysics Data System (ADS)
Fossey, Stephen Andrew
The objective of this work is to create an atomistic model to account for the unusual physical properties of silk fibers. Silk fibers have exceptional mechanical toughness, which makes them of interest as high performance fibers. In order to explain the toughness, a model for the molecular structure based on simple geometric reasoning was formulated. The model consists of very small crystallites, on the order of 5 nm, connected by a noncrystalline interphase. The interphase is a region between the crystalline phase and the amorphous phase, which is defined by the geometry of the system. The interphase is modeled as a very thin (<5 nm) film of noncrystalline polymer constructed using a Monte Carlo, rotational isomeric states approach followed by simulated annealing in order to achieve equilibrium chain configurations and density. No additional assumptions are made about density, orientation, or packing. The mechanical properties of the interphase are calculated using the method of Theodorou and Suter. Finally, observable properties such as wide angle X-ray scattering and methyl rotation rates are calculated and compared with experimental data available in the literature.
Stone circles: form and soil kinematics.
Hallet, Bernard
2013-12-13
Distinct surface patterns are ubiquitous and diverse in soils of polar and alpine regions, where the ground temperature oscillates about 0 degrees C. They constitute some of the most striking examples of clearly visible, abiotic self-organization in nature. This paper outlines the interplay of frost-related physical processes that produce these patterns spontaneously and presents unique data documenting subsurface soil rotational motion and surface displacement spanning 20 years in well-developed circles of soil outlined by gravel ridges. These sorted circles are particularly attractive research targets for a number of reasons that provide focus for this paper: (i) their exceptional geometric regularity captures the attention of any observer; (ii) they are currently forming and evolving, hence the underlying processes can be monitored readily, especially because they are localized near the ground surface on a scale of metres, which facilitates comprehensive characterization; and (iii) a recent, highly successful numerical model of sorted circle development helps to draw attention to particular field observations that can be used to assess the model, its assumptions and parameter choices, and to the considerable potential for synergetic field and modelling studies.
Reliability evaluation of oil pipelines operating in aggressive environment
NASA Astrophysics Data System (ADS)
Magomedov, R. M.; Paizulaev, M. M.; Gebel, E. S.
2017-08-01
In connection with modern, more stringent requirements for ecology and safety, a comprehensive set of diagnostic services is necessary to ensure the reliable operation of the gas transportation infrastructure. Estimation of the technical condition of oil pipelines should be carried out not only to establish the current values of the technological parameters of equipment in operation, but also to predict the dynamics of changes in the physical and mechanical characteristics of the material, the appearance of defects, etc., so as to ensure reliable and safe operation. In this paper, existing Russian and foreign methods for evaluating the reliability of oil pipelines are considered, taking into account one of the main factors leading to the appearance of crevices in the pipeline material, namely corrosion, which changes the shape of the pipe's cross-section. Without loss of generality of the reasoning, uniform corrosion wear of an initially rectangular cross-section has been assumed. As a result, a formula for calculating the probability of failure-free operation was derived. The proposed mathematical model makes it possible to predict emergency situations, as well as to determine optimal operating conditions for oil pipelines.
Energy flow and energy dissipation in a free surface.
NASA Astrophysics Data System (ADS)
Goldburg, Walter; Cressman, John
2005-11-01
Turbulent flows on a free surface are strongly compressible [1] and, unlike bulk fluids, do not conserve energy in the absence of viscosity. Despite violating assumptions essential to Kolmogorov's theory of 1941 (K41) [2, 3], surface flows show strong agreement with Kolmogorov scaling, though intermittency is larger there. Steady-state turbulence is generated in a tank of water, and the spatially averaged energy flux is measured from the four-fifths law at each instant of time. Likewise, the energy dissipation rate, as measured from velocity gradients, is also a random variable in this experiment. The cross-correlation between energy flux and dissipation rate is positive in incompressible bulk flows but strongly negative on the surface. We argue that this discrepancy between surface and bulk flows is due to compressibility effects present on the surface. [1] J. R. Cressman, J. Davoudi, W. I. Goldburg, and J. Schumacher, New Journal of Physics, 6, 53, 2004. [2] U. Frisch, Turbulence: The Legacy of A. N. Kolmogorov, Cambridge University Press, Cambridge, 1995. [3] A. N. Kolmogorov, Doklady Akad. Nauk SSSR, 32, 16, 1941.
Dynamic cross-correlations between entangled biofilaments as they diffuse
Tsang, Boyce; Dell, Zachary E.; Jiang, Lingxiang; Schweizer, Kenneth S.; Granick, Steve
2017-01-01
Entanglement in polymer and biological physics involves a state in which linear interthreaded macromolecules in isotropic liquids diffuse in a spatially anisotropic manner beyond a characteristic mesoscopic time and length scale (tube diameter). The physical reason is that linear macromolecules become transiently localized in directions transverse to their backbone but diffuse with relative ease parallel to it. Within the resulting broad spectrum of relaxation times there is an extended period before the longest relaxation time when filaments occupy a time-averaged cylindrical space of near-constant density. Here we show its implication with experiments based on fluorescence tracking of dilutely labeled macromolecules. The entangled pairs of aqueous F-actin biofilaments diffuse with separation-dependent dynamic cross-correlations that exceed those expected from continuum hydrodynamics up to strikingly large spatial distances of ≈15 µm, which is more than 10⁴ times the size of the solvent water molecules in which they are dissolved, and is more than 50 times the dynamic tube diameter, but is almost equal to the filament length. Modeling this entangled system as a collection of rigid rods, we present a statistical mechanical theory that predicts these long-range dynamic correlations as an emergent consequence of an effective long-range interpolymer repulsion due to the de Gennes correlation hole, which is a combined consequence of chain connectivity and uncrossability. The key physical assumption needed to make theory and experiment agree is that solutions of entangled biofilaments localized in tubes are effectively dynamically incompressible over the relevant intermediate time and length scales. PMID:28283664
45 CFR 605.12 - Reasonable accommodation.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION... to the known physical or mental limitations of an otherwise qualified handicapped applicant or... reasonable accommodation to the physical or mental limitations of the employee or applicant. [47 FR 8573, Mar...
45 CFR 605.12 - Reasonable accommodation.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION... to the known physical or mental limitations of an otherwise qualified handicapped applicant or... reasonable accommodation to the physical or mental limitations of the employee or applicant. [47 FR 8573, Mar...
45 CFR 605.12 - Reasonable accommodation.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION... to the known physical or mental limitations of an otherwise qualified handicapped applicant or... reasonable accommodation to the physical or mental limitations of the employee or applicant. [47 FR 8573, Mar...
45 CFR 605.12 - Reasonable accommodation.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION... to the known physical or mental limitations of an otherwise qualified handicapped applicant or... reasonable accommodation to the physical or mental limitations of the employee or applicant. [47 FR 8573, Mar...
45 CFR 605.12 - Reasonable accommodation.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION... to the known physical or mental limitations of an otherwise qualified handicapped applicant or... reasonable accommodation to the physical or mental limitations of the employee or applicant. [47 FR 8573, Mar...
NASA Astrophysics Data System (ADS)
Kryjevskaia, Mila; Stetzer, MacKenzie R.; Grosz, Nathaniel
2014-12-01
We have applied the heuristic-analytic theory of reasoning to interpret inconsistencies in student reasoning approaches to physics problems. This study was motivated by an emerging body of evidence that suggests that student conceptual and reasoning competence demonstrated on one task often fails to be exhibited on another. Indeed, even after instruction specifically designed to address student conceptual and reasoning difficulties identified by rigorous research, many undergraduate physics students fail to build reasoning chains from fundamental principles even though they possess the required knowledge and skills to do so. Instead, they often rely on a variety of intuitive reasoning strategies. In this study, we developed and employed a methodology that allowed for the disentanglement of student conceptual understanding and reasoning approaches through the use of sequences of related questions. We have shown that the heuristic-analytic theory of reasoning can be used to account for, in a mechanistic fashion, the observed inconsistencies in student responses. In particular, we found that students tended to apply their correct ideas in a selective manner that supported a specific and likely anticipated conclusion while neglecting to employ the same ideas to refute an erroneous intuitive conclusion. The observed reasoning patterns were consistent with the heuristic-analytic theory, according to which reasoners develop a "first-impression" mental model and then construct an argument in support of the answer suggested by this model. We discuss implications for instruction and argue that efforts to improve student metacognition, which serves to regulate the interaction between intuitive and analytical reasoning, are likely to lead to improved student reasoning.
Self-consistent inclusion of space-charge in the traveling wave tube
NASA Technical Reports Server (NTRS)
Freeman, Jon C.
1987-01-01
It is shown how the complete field of the electron beam may be incorporated into the transmission line model theory of the traveling wave tube (TWT). The fact that the longitudinal component of the field due to the bunched beam is not used when formulating the beam-to-circuit coupling equation is not well known. The fundamental partial differential equation for the traveling wave field is developed and compared with the older (now standard) one. The equation can be solved numerically using the same algorithms, but now the coefficients can be updated continuously as the calculation proceeds down the tube. The coefficients in the older equations are primarily derived from preliminary measurements and some trial and error. The newer coefficients can be found by a recursive method, since each has a well-defined physical interpretation and can be calculated once a reasonable first trial solution is postulated. The results of the new expression were compared with those of the older forms, as well as with a field theory model, to show the ease with which a reasonable fit to the field prediction is obtained. A complete summary of the existing transmission line modeling of the TWT is given to explain the somewhat vague ideas and techniques in the general area of drifting carrier-traveling circuit wave interactions. The basic assumptions and inconsistencies of the existing theory, and areas of confusion in the general literature, are examined and, where possible, clarified.
ERIC Educational Resources Information Center
Shen, Bo; Li, Weidong; Sun, Haichun; Rukavina, Paul Bernard
2010-01-01
Guided by Green-Demers, Leagult, Pelletier, and Pelletier's (2008) assumption that amotivation (absence of motivation) is a multidimensional construct, we designed this study to investigate the influence of inadequate teacher-to-student social support on amotivation of high-school physical education students. Five hundred and sixty-six ninth…
Including Overweight or Obese Students in Physical Education: A Social Ecological Constraint Model
ERIC Educational Resources Information Center
Li, Weidong; Rukavina, Paul
2012-01-01
In this review, we propose a social ecological constraint model to study inclusion of overweight or obese students in physical education by integrating key concepts and assumptions from ecological constraint theory in motor development and social ecological models in health promotion and behavior. The social ecological constraint model proposes…
Speed of Transverse Waves in a String Revisited
ERIC Educational Resources Information Center
Rizcallah, Joseph A.
2017-01-01
In many introductory-level physics textbooks, the derivation of the formula for the speed of transverse waves in a string is either omitted altogether or presented under physically overly idealized assumptions about the shape of the considered wave pulse and the related velocity and acceleration distributions. In this paper, we derive the named…
Teaching Perspectives of Pre-Service Physical Education Teachers: The Shanghai Experience
ERIC Educational Resources Information Center
Wang, Lijuan
2014-01-01
Background: In the physical education (PE) domain, teachers are given the freedom to make important educational decisions. Because of the common assumption that the decisions teachers make are based on a set of educational perspectives, a considerable number of studies have addressed the importance of studying the thinking and beliefs of PE…
Inclusion in Physical Education: From the Medical Model to Social Constructionism
ERIC Educational Resources Information Center
Grenier, Michelle
2007-01-01
The purpose of this discussion is to explore assumptions that have informed constructions of disability and to challenge these as socially constituted judgments that influence the way teachers think and act in general physical education. A secondary purpose is to introduce social constructionism as a discourse that potentially reshapes…
Understanding the Relationship between Student Attitudes and Student Learning
ERIC Educational Resources Information Center
Cahill, Michael J.; McDaniel, Mark A.; Frey, Regina F.; Hynes, K. Mairin; Repice, Michelle; Zhao, Jiuqing; Trousil, Rebecca
2018-01-01
Student attitudes, defined as the extent to which one holds expertlike beliefs about and approaches to physics, are a major research topic in physics education research. An implicit but rarely tested assumption underlying much of this research is that student attitudes play a significant part in student learning and performance. The current study…
Comprehensive model for predicting elemental composition of coal pyrolysis products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richards, Andrew P.; Shutt, Tim; Fletcher, Thomas H.
Large-scale coal combustion simulations depend highly on the accuracy and utility of the physical submodels used to describe the various physical behaviors of the system. Coal combustion simulations depend on the particle physics to predict product compositions, temperatures, energy outputs, and other useful information. The focus of this paper is to improve the accuracy of devolatilization submodels, to be used in conjunction with other particle physics models. Many large simulations today rely on inaccurate assumptions about particle compositions, including that the volatiles released during pyrolysis have the same elemental composition as the char particle. Another common assumption is that the char particle can be approximated by pure carbon. These assumptions lead to inaccuracies in the overall simulation. There are many factors that influence pyrolysis product composition, including parent coal composition, pyrolysis conditions (including particle temperature history and heating rate), and others. All of these factors are incorporated into the correlations to predict the elemental composition of the major pyrolysis products, including coal tar, char, and light gases.
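The abstract's point about elemental conservation can be sketched with a hypothetical balance (the compositions and char yield below are illustrative placeholders, not values from the paper's correlations):

```python
# Hypothetical element balance (daf mass basis): if char is NOT pure carbon
# and volatiles do NOT match the parent coal, the volatile composition
# follows from conserving each element between parent coal, char, and the
# released volatiles.
def volatile_composition(parent, char, char_yield):
    vol_yield = 1.0 - char_yield
    return {el: (parent[el] - char_yield * char[el]) / vol_yield
            for el in parent}

# Illustrative ultimate analyses (mass fractions), not values from the paper.
parent = {"C": 0.80, "H": 0.05, "O": 0.10, "N": 0.03, "S": 0.02}
char = {"C": 0.95, "H": 0.01, "O": 0.02, "N": 0.01, "S": 0.01}
vols = volatile_composition(parent, char, char_yield=0.55)
print({el: round(f, 3) for el, f in vols.items()})
```

On these made-up numbers the volatiles come out hydrogen- and oxygen-enriched relative to the parent coal, which is exactly the discrepancy the pure-carbon-char assumption ignores.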
Physical activity alters limb bone structure but not entheseal morphology.
Wallace, Ian J; Winchester, Julia M; Su, Anne; Boyer, Doug M; Konow, Nicolai
2017-06-01
Studies of ancient human skeletal remains frequently proceed from the assumption that individuals with robust limb bones and/or rugose, hypertrophic entheses can be inferred to have been highly physically active during life. Here, we experimentally test this assumption by measuring the effects of exercise on limb bone structure and entheseal morphology in turkeys. Growing females were either treated with a treadmill-running regimen for 10 weeks or served as controls. After the experiment, femoral cortical and trabecular bone structure were quantified with μCT in the mid-diaphysis and distal epiphysis, respectively, and entheseal morphology was quantified in the lateral epicondyle. The results indicate that elevated levels of physical activity affect limb bone structure but not entheseal morphology. Specifically, animals subjected to exercise displayed enhanced diaphyseal and trabecular bone architecture relative to controls, but no significant difference was detected between experimental groups in entheseal surface topography. These findings suggest that diaphyseal and trabecular structure are more reliable proxies than entheseal morphology for inferring ancient human physical activity levels from skeletal remains. Copyright © 2017 Elsevier Ltd. All rights reserved.
Bayerl, Manfred
2015-01-01
Peyronie's disease is a connective tissue disorder in the soft tissue of the penis. The underlying cause of Peyronie's disease is not well understood but is thought to be trauma or injury to the penis during sexual intercourse. The purpose of the interdisciplinary cooperation between urological surgery and physics is the development of a physical simulation tool in order to give a prognosis of possible tunica albuginea fibre rupture at a certain degree of deviation of the penis. For our group the first challenge was to translate the human penis into a physical model. Starting and boundary parameters had to be defined, some of which had to be based on assumptions, as physical data for living human tissue have rarely been measured to date. The algorithm and its dependencies had to be developed. This paper is a first step toward a three-dimensional mathematical-physical simulation, under the assumption of a 100% filled rigid penis. The calculation supports the hypothesis that the fibre-load angle of the penis is less than 12 degrees. Physical simulation is thus able to provide the surgeon with a simple instrument to calculate and forecast the risk for the individual patient. PMID:25648614
Modeling of Heat Transfer in Rooms in the Modelica "Buildings" Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wetter, Michael; Zuo, Wangda; Nouidui, Thierry Stephane
This paper describes the implementation of the room heat transfer model in the free open-source Modelica "Buildings" library. The model can be used as a single room or to compose a multizone building model. We discuss how the model is decomposed into submodels for the individual heat transfer phenomena. We also discuss the main physical assumptions. The room model can be parameterized to use different modeling assumptions, leading to linear or non-linear differential algebraic systems of equations. We present numerical experiments that show how these assumptions affect computing time and accuracy for selected cases of the ANSI/ASHRAE Standard 140-2007 envelope validation tests.
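A lumped single-zone sketch can illustrate the linear-versus-nonlinear distinction (this is not the Modelica "Buildings" room model, and all parameter values are hypothetical):

```python
# Lumped single-zone sketch (hypothetical parameters, not the Modelica
# "Buildings" room model): with a constant overall conductance UA the heat
# balance C*dT/dt = UA*(T_out - T) + Q is linear; letting UA depend on T
# (e.g., temperature-dependent film coefficients) is the kind of modeling
# assumption that makes the system non-linear.
def simulate(T0, T_out, UA, C, Q, dt=60.0, steps=1440):
    T = T0
    for _ in range(steps):
        T += dt / C * (UA * (T_out - T) + Q)  # explicit Euler step
    return T

# 24 h at 1-min steps: the zone relaxes toward the steady state
# T_out + Q/UA = 0 + 500/100 = 5 degC (time constant C/UA ~ 14 h).
T24 = simulate(T0=20.0, T_out=0.0, UA=100.0, C=5e6, Q=500.0)
print(round(T24, 2))
```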
An Economic Evaluation of Food Safety Education Interventions: Estimates and Critical Data Gaps.
Zan, Hua; Lambea, Maria; McDowell, Joyce; Scharff, Robert L
2017-08-01
The economic evaluation of food safety interventions is an important tool that practitioners and policy makers use to assess the efficacy of their efforts. These evaluations are built on models that are dependent on accurate estimation of numerous input variables. In many cases, however, there is no data available to determine input values and expert opinion is used to generate estimates. This study uses a benefit-cost analysis of the food safety component of the adult Expanded Food and Nutrition Education Program (EFNEP) in Ohio as a vehicle for demonstrating how results based on variable values that are not objectively determined may be sensitive to alternative assumptions. In particular, the focus here is on how reported behavioral change is translated into economic benefits. Current gaps in the literature make it impossible to know with certainty how many people are protected by the education (what are the spillover effects?), the length of time education remains effective, and the level of risk reduction from change in behavior. Based on EFNEP survey data, food safety education led 37.4% of participants to improve their food safety behaviors. Under reasonable default assumptions, benefits from this improvement significantly outweigh costs, yielding a benefit-cost ratio of between 6.2 and 10.0. Incorporation of a sensitivity analysis using alternative estimates yields a greater range of estimates (0.2 to 56.3), which highlights the importance of future research aimed at filling these research gaps. Nevertheless, most reasonable assumptions lead to estimates of benefits that justify their costs.
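The sensitivity of such a benefit-cost ratio to unverified inputs can be sketched as follows (only the 37.4% behavior-improvement share comes from the abstract; the costs, per-person benefits, duration, and spillover multiplier are made-up placeholders for the data gaps the abstract identifies):

```python
# Minimal sensitivity sketch with hypothetical inputs: benefits scale with
# the share of participants who improve, the per-person annual benefit, the
# years the education stays effective, and a spillover multiplier.
def benefit_cost_ratio(cost_per_participant, improved_share,
                       annual_benefit_if_improved, years_effective, spillover):
    benefit = (improved_share * annual_benefit_if_improved
               * years_effective * spillover)
    return benefit / cost_per_participant

base = benefit_cost_ratio(100.0, 0.374, 250.0, 5, 1.5)   # default guesses
low = benefit_cost_ratio(100.0, 0.374, 50.0, 1, 1.0)     # pessimistic
high = benefit_cost_ratio(100.0, 0.374, 400.0, 10, 2.0)  # optimistic
print(round(low, 2), round(base, 2), round(high, 2))
```

Even with the behavioral result held fixed, the ratio swings from well below 1 to far above it, mirroring the wide sensitivity range the abstract reports.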
The Object Coordination Class Applied to Wave Pulses: Analyzing Student Reasoning in Wave Physics.
ERIC Educational Resources Information Center
Wittmann, Michael C.
2002-01-01
Analyzes student responses to interview and written questions on wave physics using diSessa and Sherin's coordination class model which suggests that student use of specific reasoning resources is guided by possibly unconscious cues. (Author/MM)
Physical Evaluation of Cleaning Performance: We Are Only Fooling Ourselves
NASA Technical Reports Server (NTRS)
Pratz, Earl; McCool, A. (Technical Monitor)
2000-01-01
Surface cleaning processes are normally evaluated using visual physical properties such as discolorations, streaking, staining and water-break-free conditions. There is an assumption that these physical methods will evaluate all surfaces all the time for all subsequent operations. We have found that these physical methods are lacking in sensitivity and selectivity with regard to surface residues and subsequent process performance. We will report several conditions where evaluations using visual physical properties are lacking. We will identify possible alternative methods and future needs for surface evaluations.
Kalman filter estimation of human pilot-model parameters
NASA Technical Reports Server (NTRS)
Schiess, J. R.; Roland, V. R.
1975-01-01
The parameters of a human pilot-model transfer function are estimated by applying the extended Kalman filter to the corresponding retarded differential-difference equations in the time domain. Use of computer-generated data indicates that most of the parameters, including the implicit time delay, may be reasonably estimated in this way. When applied to two sets of experimental data obtained from a closed-loop tracking task performed by a human, the Kalman filter generated diverging residuals for one of the measurement types, apparently because of model assumption errors. Application of a modified adaptive technique was found to overcome the divergence and to produce reasonable estimates of most of the parameters.
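The idea of casting unknown parameters as filter states can be illustrated with a toy analogue (a sketch only: the paper applies the extended Kalman filter to retarded differential-difference equations of a pilot model, whereas this scalar example estimates a single constant gain with an ordinary Kalman filter, and all numbers are hypothetical):

```python
import random

# Scalar sketch: estimate an unknown constant gain `a` in y = a*u + noise
# by treating `a` itself as the state of a Kalman filter (no process noise,
# since the parameter is constant).
def estimate_gain(us, ys, a0=0.0, p0=10.0, r=0.25):
    a, p = a0, p0                # parameter estimate and its variance
    for u, y in zip(us, ys):
        s = u * p * u + r        # innovation variance
        k = p * u / s            # Kalman gain
        a += k * (y - a * u)     # measurement update
        p *= (1.0 - k * u)       # covariance update
    return a

random.seed(0)
true_a = 2.5
us = [random.uniform(-1.0, 1.0) for _ in range(500)]
ys = [true_a * u + random.gauss(0.0, 0.5) for u in us]
est = estimate_gain(us, ys)
print(round(est, 2))  # settles near the true gain 2.5
```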
Gershman, Samuel J
2018-05-24
Human beliefs have remarkable robustness in the face of disconfirmation. This robustness is often explained as the product of heuristics or motivated reasoning. However, robustness can also arise from purely rational principles when the reasoner has recourse to ad hoc auxiliary hypotheses. Auxiliary hypotheses primarily function as the linking assumptions connecting different beliefs to one another and to observational data, but they can also function as a "protective belt" that explains away disconfirmation by absorbing some of the blame. The present article traces the role of auxiliary hypotheses from philosophy of science to Bayesian models of cognition and a host of behavioral phenomena, demonstrating their wide-ranging implications.
Purpose of the systematic physical assessment in everyday practice: critique of a "sacred cow".
Zambas, Shelaine Iris
2010-06-01
Although considered an essential nursing skill, systematic physical assessment is rarely visible in everyday practice. Some nurses question whether systematic physical assessment is relevant to nursing, and others complain that they do not see it used in practice. Why is this, when these skills are considered so integral to nursing? This article challenges nurse educators to reflect on the purpose of the systematic physical assessment within nursing by analyzing the underlying assumptions of this apparent "sacred cow." Copyright 2010, SLACK Incorporated.
ERIC Educational Resources Information Center
Tatzl, Dietmar
2011-01-01
This article presents an English for Specific Purposes (ESP) task developed for teaching aeronautical engineering students. The task Design-Build-Write rests on the assumption that engineering students are skilled at mathematical reasoning, problem solving, drawing and constructing. In Gardner's 1983 Multiple Intelligences (MI) theory, these…
ERIC Educational Resources Information Center
Garrote, Ariana
2017-01-01
Researchers claim that a lack of social skills might be the main reason why pupils with special educational needs (SEN) in inclusive classrooms often experience difficulties in social participation. However, studies that support this assumption are scarce, and none include pupils with an intellectual disability (ID). This article seeks to make an…
ERIC Educational Resources Information Center
Lee, Hyunju; Chang, Hyunsook; Choi, Kyunghee; Kim, Sung-Won; Zeidler, Dana L.
2012-01-01
Character and values are the essential driving forces that serve as general guides or points of reference for individuals to support decision-making and to act responsibly about global socioscientific issues (SSIs). Based on this assumption, we investigated to what extent pre-service science teachers (PSTs) of South Korea possess character and…
ERIC Educational Resources Information Center
Natriello, Gary; And Others
Studying the process by which disadvantaged and low-achieving high school students are assigned to classes and special programs clarifies how and why disadvantaged students are placed in inappropriate programs. There are reasons to question the assumption that students are assigned to programs rationally on the basis of information about…
ERIC Educational Resources Information Center
Savolainen, Jukka; Mason, W. Alex; Hughes, Lorine A.; Ebeling, Hanna; Hurtig, Tuula M.; Taanila, Anja M.
2015-01-01
There are strong reasons to assume that early onset of puberty accelerates coital debut among adolescent girls. Although many studies support this assumption, evidence regarding the putative causal processes is limited and inconclusive. In this research, longitudinal data from the 1986 Northern Finland Birth Cohort Study (N = 2,596) were used to…
Model C Is Feasible for ESEA Title I Evaluation.
ERIC Educational Resources Information Center
Echternacht, Gary
The assertion that Model C is feasible for Elementary Secondary Education Act Title I evaluation, why it is feasible, and reasons why it is so seldom used are explained. Two assumptions must be made to use the special regression model. First, a strict cut-off must be used on the pretest to assign students to Title I and comparison groups. Second,…
Code of Federal Regulations, 2010 CFR
2010-04-01
... grouping rules of paragraph (c)(2)(iii) of this section. Separate charts are provided for ages 55, 60, and...) Simplified presentations permitted—(A) Grouping of certain optional forms. Two or more optional forms of... starting date, a reasonable assumption for the age of the participant's spouse, or, in the case of a...
Strategy in the Vietnam War: Western Concepts, Eastern Conflict and the Roots of Failure.
ERIC Educational Resources Information Center
Weland, James
1990-01-01
Critiques U.S. military assumptions concerning the war in Vietnam. Discusses the North Vietnamese strategic approach to gaining control of South Vietnam. Traces the history of the Vietnam War, analyzing specific U.S. military operations in Vietnam and reasons for their failure. Contends that U.S. strategic ethnocentrism led to defeat in Vietnam.…
ERIC Educational Resources Information Center
Warren, Keith L.; Doogan, Nathan; De Leon, George; Phillips, Gary S.; Moody, James; Hodge, Ashleigh
2013-01-01
Therapeutic communities (TCs) have a strong record of maintaining high quality social climates in prison units. One possible reason for this is the system of mutual monitoring among TC residents, based on the assumption that peer affirmation of behavior in accord with TC norms, and peer correction of behavior contrary to TC norms, will lead to…
ERIC Educational Resources Information Center
Lim, Leonel
2011-01-01
It is widely held that, by teaching individuals how to reason through and analyse everyday problems, the teaching of critical thinking develops the deliberative capacities essential to the healthy functioning of democracy. Implicit in this view is the assumption that a certain commensurability exists between the problems presented in such…
ERIC Educational Resources Information Center
Pender, Nola J.
The purpose of this research was to investigate developmental changes in encoding processes. It attempted to determine the extent to which children of varying ages utilize semantic (denotative or connotative) and acoustical encoding categories in a short-term memory task. It appears to be a reasonable assumption that as associational hierarchies…
A study of Kapton degradation under simulated shuttle environment
NASA Technical Reports Server (NTRS)
Eck, T. G.; Hoffman, R. W.
1986-01-01
A system was developed which employs a source of low-energy oxygen ions to simulate the shuttle low Earth orbit environment. This source, together with diagnostic tools including surface analysis and mass spectroscopic capability, was used to measure the dependence on ion energy of the oxygen-induced CO signals from pyrolytic graphite and Kapton. For graphite the CO signal was examined at energies ranging from 4.5 to 465 eV and for Kapton from 4.5 to 188 eV. While the relative quantum yields inferred from the data are reasonably precise, there are large uncertainties in the absolute yields because of the assumptions necessary to convert the measured signal strengths to quantum yields. These assumptions are discussed in detail.
Cognition is … Fundamentally Cultural.
Bender, Andrea; Beller, Sieghard
2013-03-01
A prevailing concept of cognition in psychology is inspired by the computer metaphor. Its focus on mental states that are generated and altered by information input, processing, storage and transmission invites a disregard for the cultural dimension of cognition, based on three (implicit) assumptions: cognition is internal, processing can be distinguished from content, and processing is independent of cultural background. Arguing against each of these assumptions, we point out how culture may affect cognitive processes in various ways, drawing on instances from numerical cognition, ethnobiological reasoning, and theory of mind. Given the pervasive cultural modulation of cognition, on all of Marr's levels of description, we conclude that cognition is indeed fundamentally cultural, and that consideration of its cultural dimension is essential for a comprehensive understanding.
Optimal no-go theorem on hidden-variable predictions of effect expectations
NASA Astrophysics Data System (ADS)
Blass, Andreas; Gurevich, Yuri
2018-03-01
No-go theorems prove that, under reasonable assumptions, classical hidden-variable theories cannot reproduce the predictions of quantum mechanics. Traditional no-go theorems proved that hidden-variable theories cannot predict correctly the values of observables. Recent expectation no-go theorems prove that hidden-variable theories cannot predict the expectations of observables. We prove the strongest expectation-focused no-go theorem to date. It is optimal in the sense that the natural weakenings of the assumptions and the natural strengthenings of the conclusion make the theorem fail. The literature on expectation no-go theorems strongly suggests that the expectation-focused approach is more general than the value-focused one. We establish that the expectation approach is not more general.
Useful global-change scenarios: current issues and challenges
NASA Astrophysics Data System (ADS)
Parson, E. A.
2008-10-01
Scenarios are increasingly used to inform global-change debates, but their connection to decisions has been weak and indirect. This reflects the greater number and variety of potential users and scenario needs, relative to other decision domains where scenario use is more established. Global-change scenario needs include common elements, e.g., model-generated projections of emissions and climate change, needed by many users but in different ways and with different assumptions. For these common elements, the limited ability to engage diverse global-change users in scenario development requires extreme transparency in communicating underlying reasoning and assumptions, including probability judgments. Other scenario needs are specific to users, requiring a decentralized network of scenario and assessment organizations to disseminate and interpret common elements and add elements requiring local context or expertise. Such an approach will make global-change scenarios more useful for decisions, but not less controversial. Despite predictable attacks, scenario-based reasoning is necessary for responsible global-change decisions because decision-relevant uncertainties cannot be specified scientifically. The purpose of scenarios is not to avoid speculation, but to make the required speculation more disciplined, more anchored in relevant scientific knowledge when available, and more transparent.
Fast logic?: Examining the time course assumption of dual process theory.
Bago, Bence; De Neys, Wim
2017-01-01
Influential dual process models of human thinking posit that reasoners typically produce a fast, intuitive heuristic (i.e., Type-1) response which might subsequently be overridden and corrected by slower, deliberative processing (i.e., Type-2). In this study we directly tested this time course assumption. We used a two response paradigm in which participants have to give an immediate answer and afterwards are allowed extra time before giving a final response. In four experiments we used a range of procedures (e.g., challenging response deadline, concurrent load) to knock out Type 2 processing and make sure that the initial response was intuitive in nature. Our key finding is that we frequently observe correct, logical responses as the first, immediate response. Response confidence and latency analyses indicate that these initial correct responses are given fast, with high confidence, and in the face of conflicting heuristic responses. Findings suggest that fast and automatic Type 1 processing also cues a correct logical response from the start. We sketch a revised dual process model in which the relative strength of different types of intuitions determines reasoning performance. Copyright © 2016 Elsevier B.V. All rights reserved.
Fluency and belief bias in deductive reasoning: new indices for old effects
Trippas, Dries; Handley, Simon J.; Verde, Michael F.
2014-01-01
Models based on signal detection theory (SDT) have occupied a prominent role in domains such as perception, categorization, and memory. Recent work by Dube et al. (2010) suggests that the framework may also offer important insights in the domain of deductive reasoning. Belief bias in reasoning has traditionally been examined using indices based on raw endorsement rates, indices that critics have claimed are highly problematic. We discuss a new set of SDT indices fit for the investigation of belief bias and apply them to new data examining the effect of perceptual disfluency on belief bias in syllogisms. In contrast to the traditional approach, the SDT indices do not violate important statistical assumptions, resulting in a decreased Type 1 error rate. Based on analyses using these novel indices, we demonstrate that perceptual disfluency leads to decreased reasoning accuracy, contrary to predictions. Disfluency also appears to eliminate the typical link found between cognitive ability and the effect of beliefs on accuracy. Finally, replicating previous work, we demonstrate that cognitive ability leads to an increase in reasoning accuracy and a decrease in the response bias component of belief bias. PMID:25009515
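The standard SDT sensitivity and bias indices of the kind discussed above can be sketched as follows (the formulas for d' and the criterion c are textbook SDT; the endorsement rates below are illustrative, not data from the study):

```python
from statistics import NormalDist

# Textbook SDT indices: sensitivity d' and criterion c computed from
# endorsement rates rather than raw endorsement-rate differences.
z = NormalDist().inv_cdf  # inverse standard-normal CDF

def sdt_indices(hit_rate, false_alarm_rate):
    """hit_rate = P(endorse | valid); false_alarm_rate = P(endorse | invalid)."""
    d_prime = z(hit_rate) - z(false_alarm_rate)      # accuracy
    c = -0.5 * (z(hit_rate) + z(false_alarm_rate))   # response bias
    return d_prime, c

# Believable conclusions are endorsed more often regardless of validity, so
# comparing conditions separates an accuracy change (d') from a bias shift (c).
d_b, c_b = sdt_indices(0.90, 0.40)  # believable problems (illustrative)
d_u, c_u = sdt_indices(0.75, 0.20)  # unbelievable problems (illustrative)
print(round(d_b, 2), round(c_b, 2), round(d_u, 2), round(c_u, 2))
```

On these made-up rates d' is nearly identical across conditions while c shifts, which is the pattern such indices are designed to disentangle.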
The causal structure of utility conditionals.
Bonnefon, Jean-François; Sloman, Steven A
2013-01-01
The psychology of reasoning is increasingly considering agents' values and preferences, achieving greater integration with judgment and decision making, social cognition, and moral reasoning. Some of this research investigates utility conditionals, "if p then q" statements where the realization of p or q or both is valued by some agents. Various approaches to utility conditionals share the assumption that reasoners make inferences from utility conditionals based on the comparison between the utility of p and the expected utility of q. This article introduces a new parameter in this analysis, the underlying causal structure of the conditional. Four experiments showed that causal structure moderated utility-informed conditional reasoning. These inferences were strongly invited when the underlying structure of the conditional was causal, and significantly less so when the underlying structure of the conditional was diagnostic. This asymmetry was only observed for conditionals in which the utility of q was clear, and disappeared when the utility of q was unclear. Thus, an adequate account of utility-informed conditional reasoning requires three components: utility, probability, and causal structure. Copyright © 2012 Cognitive Science Society, Inc.
NASA Astrophysics Data System (ADS)
2002-03-01
UK Awards: Teacher of Physics Awards
Institute Matters: Institute of Physics Education Conference
UK Awards: Top SHAP students win prizes
Competition: International creative essay competition
UK Awards: Kelvin Medal
Particle Physics Resources: New poster from PPARC
Australia: Physics Students' Day at Adventure World
UK Awards: Bragg Medal winners in a FLAP
ASE Annual Meeting: Particle Physics at ASE 2002
UK Grants: PPARC Awards
AAPT Winter Meeting: Physics First - but do you need maths?
UK In-Service Training: The Particle Physics Institutes for A-level teachers
Physics on Stage 2: Not too entertaining this time, please!
Scotland: A reasoned approach wins reasonable funding
Institute Matters: New education manager
Germany: Physics gets real: curriculum change for better teaching
Research Frontiers: Let there be light - if you hang on a minute
Clarifying assumptions to enhance our understanding and assessment of clinical reasoning.
Durning, Steven J; Artino, Anthony R; Schuwirth, Lambert; van der Vleuten, Cees
2013-04-01
Deciding on a diagnosis and treatment is essential to the practice of medicine. Developing competence in these clinical reasoning processes, commonly referred to as diagnostic and therapeutic reasoning, respectively, is required for physician success. Clinical reasoning has been a topic of research for several decades, and much has been learned. However, there still exists no clear consensus regarding what clinical reasoning entails, let alone how it might best be taught, how it should be assessed, and the research and practice implications therein. In this article, the authors first discuss two contrasting epistemological views of clinical reasoning and related conceptual frameworks. They then outline four different theoretical frameworks held by medical educators that the authors believe guide educators' views on the topic, knowingly or not. Within each theoretical framework, the authors begin with a definition of clinical reasoning (from that viewpoint) and then discuss learning, assessment, and research implications. The authors believe these epistemologies and four theoretical frameworks also apply to other concepts (or "competencies") in medical education. The authors also maintain that clinical reasoning encompasses the mental processes and behaviors that are shared (or evolve) between the patient, physician, and the environment (i.e., practice setting). Clinical reasoning thus incorporates components of all three factors (patient, physician, environment). The authors conclude by outlining practical implications and potential future areas for research.
NASA Astrophysics Data System (ADS)
Line, Michael
The field of transiting exoplanet atmosphere characterization has grown considerably over the past decade given the wealth of photometric and spectroscopic data from the Hubble and Spitzer space telescopes. In order to interpret these data, atmospheric models combined with Bayesian approaches are required. From spectra, these approaches permit us to infer fundamental atmospheric properties and how their compositions can relate back to planet formation. However, such approaches must make a wide range of assumptions regarding the physics/parameterizations included in the model atmospheres. There has yet to be a comprehensive investigation exploring how these model assumptions influence our interpretations of exoplanetary spectra. Understanding the impact of these assumptions is especially important since the James Webb Space Telescope (JWST) is expected to invest a substantial portion of its time observing transiting planet atmospheres. It is therefore prudent to optimize and enhance our tools to maximize the scientific return from the revolutionary data to come. The primary goal of the proposed work is to determine the pieces of information we can robustly learn from transiting planet spectra as obtained by JWST and other future, space-based platforms, by investigating commonly overlooked model assumptions. We propose to explore the following effects and how they impact our ability to infer exoplanet atmospheric properties: 1. Stellar/Planetary Uncertainties: Transit/occultation eclipse depths and subsequent planetary spectra are measured relative to their host stars. How do stellar uncertainties, on radius, effective temperature, metallicity, and gravity, as well as uncertainties in the planetary radius and gravity, propagate into the uncertainties on atmospheric composition and thermal structure? Will these uncertainties significantly bias our atmospheric interpretations? 
Is it possible to use the relative measurements of the planetary spectra to provide additional constraints on the stellar properties? 2. The "1D" Assumption: Atmospheres are inherently three-dimensional. Many exoplanet atmosphere models, especially within retrieval frameworks, assume 1D physics and chemistry when interpreting spectra. How does this "1D" atmosphere assumption bias our interpretation of exoplanet spectra? Do we have to consider global temperature variations such as day-night contrasts or hot spots? What about spatially inhomogeneous molecular abundances and clouds? How will this change our interpretations of phase resolved spectra? 3. Clouds/Hazes: Understanding how clouds/hazes impact transit spectra is absolutely critical if we are to obtain proper estimates of basic atmospheric quantities. How do the assumptions in cloud physics bias our inferences of molecular abundances in transmission? What kind of data (wavelengths, signal-to-noise, resolution) do we need to infer cloud composition, vertical extent, spatial distribution (patchy or global), and size distributions? The proposed work is relevant and timely to the scope of the NASA Exoplanet Research program. The proposed work aims to further develop the critical theoretical modeling tools required to rigorously interpret transiting exoplanet atmosphere data in order to maximize the science return from JWST and beyond. This work will serve as a benchmark study for defining the data (wavelength ranges, signal-to-noises, and resolutions) required from a modeling perspective to "characterize exoplanets and their atmospheres in order to inform target and operational choices for current NASA missions, and/or targeting, operational, and formulation data for future NASA observatories". 
Doing so will allow us to better "understand the chemical and physical processes of exoplanets (their atmospheres)" which will ultimately "improve understanding of the origins of exoplanetary systems" through robust planetary elemental abundance determinations.
Abstraction and Assume-Guarantee Reasoning for Automated Software Verification
NASA Technical Reports Server (NTRS)
Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.
2004-01-01
Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software, and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework, and we show how COMFORT outperforms several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
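As a rough illustration of the non-circular assume-guarantee rule underlying such frameworks, components and properties can be modeled as finite sets of traces over a shared alphabet. The sketch below uses illustrative names, not ComFoRT's API, and checks both premises of the rule and its conclusion on a toy example:

```python
# Hypothetical sketch of the non-circular assume-guarantee (AG-NC) rule:
# if A || M1 satisfies P and every trace of M2 is allowed by A, then
# M1 || M2 satisfies P. Components are sets of traces over a shared
# alphabet; composition is intersection and satisfaction is containment.

def compose(m1, m2):
    """Parallel composition over a shared alphabet: the common traces."""
    return m1 & m2

def satisfies(m, p):
    """M |= P iff every trace of M is a trace allowed by P."""
    return m <= p

def assume_guarantee_nc(m1, m2, p, a):
    """AG-NC premises: <A> M1 <P> and <true> M2 <A>."""
    premise1 = satisfies(compose(a, m1), p)
    premise2 = satisfies(m2, a)
    return premise1 and premise2

# Tiny example: traces are tuples of actions.
M1 = {("req", "ack"), ("req", "err")}
M2 = {("req", "ack")}          # M2 never produces the error trace
P  = {("req", "ack")}          # property: only the good trace occurs
A  = {("req", "ack")}          # learned assumption ruling out "err"

assert assume_guarantee_nc(M1, M2, P, A)   # both premises hold
assert satisfies(compose(M1, M2), P)       # so the conclusion holds
```

Soundness of the rule is visible in the sets themselves: M1 ∩ M2 ⊆ M1 ∩ A ⊆ P. In the learning-based instantiations described in the abstract, the assumption A is not written by hand but constructed incrementally by an automata learning algorithm.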
[Digital learning object for diagnostic reasoning in nursing applied to the integumentary system].
da Costa, Cecília Passos Vaz; Luz, Maria Helena Barros Araújo
2015-12-01
To describe the creation of a digital learning object for diagnostic reasoning in nursing applied to the integumentary system at a public university of Piaui. A methodological study applied to technological production based on the pedagogical framework of problem-based learning. The methodology for creating the learning object observed the stages of analysis, design, development, implementation and evaluation recommended for contextualized instructional design. The revised taxonomy of Bloom was used to list the educational goals. The four modules of the developed learning object were inserted into the educational platform Moodle. The theoretical assumptions allowed the design of an important online resource that promotes effective learning in the scope of nursing education. This study should add value to nursing teaching practices through the use of digital learning objects for teaching diagnostic reasoning applied to skin and skin appendages.
Teaching for clinical reasoning - helping students make the conceptual links.
McMillan, Wendy Jayne
2010-01-01
Dental educators complain that students struggle to apply what they have learnt theoretically in the clinical context. This paper is premised on the assumption that there is a relationship between conceptual thinking and clinical reasoning. The paper provides a theoretical framework for understanding the relationship between conceptual learning and clinical reasoning. A review of current literature is used to explain the way in which conceptual understanding influences clinical reasoning and the transfer of theoretical understandings to the clinical context. The paper argues that the connections made between concepts are what is significant about conceptual understanding. From this point of departure the paper describes teaching strategies that facilitate the kinds of learning opportunities that students need in order to develop conceptual understanding and to be able to transfer knowledge from theoretical to clinical contexts. Along with a variety of teaching strategies, the value of concept maps is discussed. The paper provides a framework for understanding the difficulties that students have in developing conceptual networks appropriate for later clinical reasoning. In explaining how students learn for clinical application, the paper provides a theoretical framework that can inform how dental educators facilitate the conceptual learning, and later clinical reasoning, of their students.
Teaching Scientific Reasoning to Liberal Arts Students
NASA Astrophysics Data System (ADS)
Rubbo, Louis
2014-03-01
University courses in conceptual physics and astronomy typically serve as the terminal science experience for the liberal arts student. Within this population, significant content knowledge gains can be achieved by utilizing research-verified pedagogical methods. However, from the standpoint of the university, students are expected to complete these courses not necessarily for the content knowledge but instead for the development of scientific reasoning skills. Results from physics education studies indicate that unless scientific reasoning instruction is made explicit, students do not progress in their reasoning abilities. How do we complement the successful content-based pedagogical methods with instruction that explicitly focuses on the development of scientific reasoning skills? This talk will explore methodologies that actively engage non-science students with the explicit intent of fostering their scientific reasoning abilities.
Cope, F W
1981-01-01
The Weber psychophysical law, which describes much experimental data on perception by man, is derived from the Heisenberg uncertainty principle on the assumption that human perception occurs by energy detection by superconductive microregions within man. This suggests that psychophysical perception by man might be considered merely a special case of physical measurement in general. The reverse derivation, i.e., derivation of the Heisenberg principle from the Weber law, may be of even greater interest. It suggests that physical measurements could be regarded as relative to the perceptions by the detectors within man. Thus one may develop a "human" theory of relativity that could have the advantage of eliminating hidden assumptions by forcing physical theories to conform more completely to the measurements made by man rather than to concepts that might not accurately describe nature.
Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications
NASA Technical Reports Server (NTRS)
Chaki, Sagar; Gurfinkel, Arie
2010-01-01
We develop a learning-based automated Assume-Guarantee (AG) reasoning framework for verifying omega-regular properties of concurrent systems. We study the applicability of non-circular (AG-NC) and circular (AG-C) AG proof rules in the context of systems with infinite behaviors. In particular, we show that AG-NC is incomplete when assumptions are restricted to strictly infinite behaviors, while AG-C remains complete. We present a general formalization, called LAG, of the learning-based automated AG paradigm. We show how existing approaches for automated AG reasoning are special instances of LAG. We develop two learning algorithms for a class of systems, called infinite regular systems, that combine finite and infinite behaviors. We show that for infinite regular systems, both AG-NC and AG-C are sound and complete. Finally, we show how to instantiate LAG to do automated AG reasoning for infinite regular, and omega-regular, systems using both AG-NC and AG-C as proof rules.
Biology meets physics: Reductionism and multi-scale modeling of morphogenesis.
Green, Sara; Batterman, Robert
2017-02-01
A common reductionist assumption is that macro-scale behaviors can be described "bottom-up" if only sufficient details about lower-scale processes are available. The view that an "ideal" or "fundamental" physics would be sufficient to explain all macro-scale phenomena has been met with criticism from philosophers of biology. Specifically, scholars have pointed to the impossibility of deducing biological explanations from physical ones, and to the irreducible nature of distinctively biological processes such as gene regulation and evolution. This paper takes a step back in asking whether bottom-up modeling is feasible even when modeling simple physical systems across scales. By comparing examples of multi-scale modeling in physics and biology, we argue that the "tyranny of scales" problem presents a challenge to reductive explanations in both physics and biology. The problem refers to the scale-dependency of physical and biological behaviors that forces researchers to combine different models relying on different scale-specific mathematical strategies and boundary conditions. Analyzing the ways in which different models are combined in multi-scale modeling also has implications for the relation between physics and biology. Contrary to the assumption that physical science approaches provide reductive explanations in biology, we exemplify how inputs from physics often reveal the importance of macro-scale models and explanations. We illustrate this through an examination of the role of biomechanical modeling in developmental biology. In such contexts, the relation between models at different scales and from different disciplines is neither reductive nor completely autonomous, but interdependent. Copyright © 2016 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Williams, John
2018-01-01
The recently launched Australian Curriculum Health and Physical Education has five propositions, one of which is for students to adopt a critical inquiry approach within this subject area. In particular, students are encouraged to explore issues that relate to social power and taken-for-granted assumptions. This paper problematizes the concept of…
ERIC Educational Resources Information Center
James, Wendy Michelle
2013-01-01
Science and engineering instructors often observe that students have difficulty using or applying prerequisite mathematics knowledge in their courses. This qualitative project uses a case-study method to investigate the instruction in a trigonometry course and a physics course based on a different methodology and set of assumptions about student…
Finding the Equation for a Vibrating Car Antenna.
ERIC Educational Resources Information Center
Newburgh, Ronald; Newburgh, G. Alexander
2000-01-01
Presents the physical assumptions and mathematical expressions necessary to derive a fourth-order differential equation that describes the vibration of a particular car antenna. Contends that while students may not be able to derive or use the equation, they should be able to appreciate a guided derivation as an example of how physics is done.…
ERIC Educational Resources Information Center
Dowd, Jason E.; Araujo, Ives; Mazur, Eric
2015-01-01
Although confusion is generally perceived to be negative, educators dating as far back as Socrates, who asked students to question assumptions and wrestle with ideas, have challenged this notion. Can confusion be productive? How should instructors interpret student expressions of confusion? During two semesters of introductory physics that…
Rethinking the Preparation of HPE Teachers: Ruminations on Knowledge, Identity, and Ways of Thinking
ERIC Educational Resources Information Center
Tinning, Richard
2004-01-01
This paper explores assumptions about essential knowledge in degree programs that have traditionally prepared teachers of physical education, and discusses the question of what sort of teacher education is necessary or desirable to prepare teachers for the new Health & Physical Education (HPE) key learning area. I argue that the curriculum of the…
NASA Astrophysics Data System (ADS)
Scharnagl, Benedikt; Durner, Wolfgang
2013-04-01
Models are inherently imperfect because they simplify processes that are themselves imperfectly known and understood. Moreover, the input variables and parameters needed to run a model are typically subject to various sources of error. As a consequence of these imperfections, model predictions will always deviate from corresponding observations. In most applications in soil hydrology, these deviations are clearly not random but rather show a systematic structure. From a statistical point of view, this systematic mismatch may be a reason for concern because it violates one of the basic assumptions made in inverse parameter estimation: the assumption of independence of the residuals. But what are the consequences of simply ignoring the autocorrelation in the residuals, as is current practice in soil hydrology? Are the parameter estimates still valid even though the statistical foundation they are based on is partially collapsed? Theory and practical experience from other fields of science have shown that violation of the independence assumption will result in overconfident uncertainty bounds and that in some cases it may lead to significantly different optimal parameter values. In our contribution, we present three soil hydrological case studies, in which the effect of autocorrelated residuals on the estimated parameters was investigated in detail. We explicitly accounted for autocorrelated residuals using a formal likelihood function that incorporates an autoregressive model. The inverse problem was posed in a Bayesian framework, and the posterior probability density function of the parameters was estimated using Markov chain Monte Carlo simulation. In contrast to many other studies in related fields of science, and quite surprisingly, we found that the first-order autoregressive model, often abbreviated as AR(1), did not work well in the soil hydrological setting.
We showed that a second-order autoregressive, or AR(2), model performs much better in these applications, leading to parameter and uncertainty estimates that satisfy all the underlying statistical assumptions. For theoretical reasons, these estimates are deemed more reliable than those based on the neglect of autocorrelation in the residuals. In compliance with theory and results reported in the literature, our results showed that parameter uncertainty bounds were substantially wider if autocorrelation in the residuals was explicitly accounted for, and the optimal parameter values were slightly different in this case. We argue that the autoregressive model presented here should be used as a matter of routine in inverse modeling of soil hydrological processes.
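The likelihood comparison at the heart of such a study can be sketched in a few lines: a Gaussian log-likelihood of model-data residuals under an AR(2) error model versus the usual independence assumption. The coefficients and noise scale below are illustrative, not fitted soil-hydrological values:

```python
import numpy as np

# Hedged sketch: log-likelihood of residuals under independence vs. an
# AR(2) error model. phi1, phi2 and sigma are illustrative placeholders.

def loglik_iid(res, sigma):
    """Independent Gaussian residuals."""
    n = len(res)
    return -0.5 * n * np.log(2 * np.pi * sigma**2) - np.sum(res**2) / (2 * sigma**2)

def loglik_ar2(res, phi1, phi2, sigma):
    """Conditional Gaussian likelihood of AR(2) innovations
    (the first two residuals are treated as fixed initial conditions)."""
    innov = res[2:] - phi1 * res[1:-1] - phi2 * res[:-2]
    n = len(innov)
    return -0.5 * n * np.log(2 * np.pi * sigma**2) - np.sum(innov**2) / (2 * sigma**2)

# Simulate strongly autocorrelated residuals, as commonly seen when a
# soil hydrological model systematically misses the observations.
rng = np.random.default_rng(0)
e = np.zeros(500)
for t in range(2, 500):
    e[t] = 0.9 * e[t - 1] - 0.2 * e[t - 2] + rng.normal(scale=0.1)

# The AR(2) model explains these residuals far better than independence.
assert loglik_ar2(e, 0.9, -0.2, 0.1) > loglik_iid(e, 0.1)
```

In a Bayesian setup of the kind described above, `loglik_ar2` would replace the iid likelihood inside the MCMC sampler, with phi1, phi2 and sigma estimated jointly with the soil hydraulic parameters.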
Linking Physical Climate Research and Economic Assessments of Mitigation Policies
NASA Astrophysics Data System (ADS)
Stainforth, David; Calel, Raphael
2017-04-01
Evaluating climate change policies requires economic assessments which balance the costs and benefits of climate action. A certain class of Integrated Assessment Models (IAMs) is widely used for this type of analysis; DICE, PAGE and FUND are three of the most influential. In the economics community there has been much discussion and debate about the economic assumptions implemented within these models. Two aspects in particular have gained much attention: i) the costs of damages resulting from climate change, the so-called damage function, and ii) the choice of discount rate applied to future costs and benefits. There has, however, been rather little attention given to the consequences of the choices made in the physical climate models within these IAMs. Here we discuss the practical aspects of the implementation of the physical models in these IAMs, as well as the implications of choices made in these physical science components for economic assessments [1]. We present a simple breakdown of how these IAMs differently represent the climate system as a consequence of differing underlying physical models, different parametric assumptions (for parameters representing, for instance, feedbacks and ocean heat uptake) and different numerical approaches to solving the models. We present the physical and economic consequences of these differences and reflect on how we might better incorporate the latest physical science understanding in economic models of this type. [1] Calel, R. and Stainforth D.A., "On the Physics of Three Integrated Assessment Models", Bulletin of the American Meteorological Society, in press.
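The physical climate component embedded in such IAMs is typically a low-order energy-balance model. As a hedged sketch of that structure, a two-box model can be written as follows; all parameter values here are illustrative placeholders, not the calibrated values of DICE, PAGE, or FUND:

```python
# Hedged sketch of a two-box energy-balance climate component of the kind
# embedded in DICE-style IAMs. lam is the climate feedback parameter,
# gamma the atmosphere/deep-ocean heat exchange coefficient, C_a and C_o
# the box heat capacities. All values are illustrative placeholders.

def step(T_a, T_o, F, lam=1.2, gamma=0.7, C_a=8.0, C_o=100.0, dt=1.0):
    """One explicit Euler step: a surface/upper-ocean box coupled to a
    deep-ocean box. F is radiative forcing (W m^-2); temperatures are
    anomalies (K)."""
    dT_a = (F - lam * T_a - gamma * (T_a - T_o)) * dt / C_a
    dT_o = gamma * (T_a - T_o) * dt / C_o
    return T_a + dT_a, T_o + dT_o

# Under constant forcing the surface box relaxes toward F / lam,
# i.e. 3.7 / 1.2, roughly 3.1 K, with the deep ocean lagging behind.
T_a, T_o = 0.0, 0.0
for _ in range(3000):
    T_a, T_o = step(T_a, T_o, F=3.7)
```

Much of the inter-IAM spread discussed in the abstract corresponds to different choices of lam, gamma and the heat capacities, and to different time-stepping of exactly this kind of system.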
Limiting assumptions in molecular modeling: electrostatics.
Marshall, Garland R
2013-02-01
Molecular mechanics attempts to represent intermolecular interactions in terms of classical physics. Initial efforts assumed a point charge located at the atom center and Coulombic interactions. It has been recognized over multiple decades that simply representing electrostatics with a charge on each atom fails to reproduce the electrostatic potential surrounding a molecule as estimated by quantum mechanics. Molecular orbitals are not spherically symmetrical, an implicit assumption of monopole electrostatics. This perspective reviews recent evidence that requires the use of multipole electrostatics and polarizability in molecular modeling.
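The monopole approximation being critiqued can be stated in a few lines of code: pairwise atom-centered Coulomb terms and nothing else, with no multipoles or polarizability. Charges and coordinates below are illustrative:

```python
import math

# Minimal sketch of monopole (atom-centered point charge) electrostatics,
# the approximation this perspective critiques. Pairwise Coulomb energy
# only; no multipole moments, no polarizability, no cutoffs.

COULOMB_K = 332.06  # kcal/mol * Angstrom / e^2, a common MM convention

def coulomb_energy(charges, coords):
    """Sum of k * q_i * q_j / r_ij over unique atom pairs (in vacuum)."""
    e = 0.0
    n = len(charges)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(coords[i], coords[j])
            e += COULOMB_K * charges[i] * charges[j] / r
    return e

# A +/- partial-charge pair 3 Angstroms apart: attractive (negative) energy.
E = coulomb_energy([0.5, -0.5], [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0)])
assert E < 0
```

The review's point is precisely that this spherically symmetric picture cannot reproduce quantum-mechanical electrostatic potentials, which is why multipole and polarizable terms are needed on top of it.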
NASA Astrophysics Data System (ADS)
Curci, Gabriele
2017-04-01
The calculation of optical properties from knowledge of the composition and abundance of atmospheric aerosol implies a certain number of assumptions. First, if not known or explicitly simulated, a size distribution must be assigned to each aerosol component (e.g. sulfate-like inorganic ions, organic and black carbon, soil dust, sea salt). Second, physical-chemical properties such as the shape, density, complex refractive index, and hygroscopic factors must be associated with each aerosol species. Third, a representation of how the aerosol species combine together must be made: among those, the most popular are the assumptions of external mixing, in which each particle is assumed to be formed of a single compound and the optical properties may be calculated separately for each species, or of internal core-shell arrangement, in which each particle consists of a water-insoluble core coated with a water-soluble shell and that requires more elaborate calculations for optical properties. Previous work found that the assumption on the mixing state (external or core-shell internal) is the one that introduces the highest uncertainty, quantified at about 30% uncertainty on the calculation of monthly mean aerosol optical depth (AOD) and single-scattering albedo (SSA). The external mixing assumption is generally more reasonable for freshly emitted aerosol, while the internal mixing case is associated with aged aerosol that has had the time to form the coating around the core. Both approximations are thus regarded as valid, but in general a combination of the two mixing states may be expected in a given air mass. In this work, we test a simple empirical parameterization of the fraction of internally mixed particles (F_in) in a generic air mass.
The F_in fraction is calculated in two alternative ways, one exploiting the NOz to NOx ratio (proxy of the photochemical aging), and the other using the relative abundance of black carbon with respect to other aerosol components (proxy of the coating formation). We compare sunphotometer observations from the AERosol RObotic NETwork (AERONET, http://aeronet.gsfc.nasa.gov/) across Europe and North America for the year 2010 with simulations from the Air Quality Modeling Evaluation International Initiative (AQMEII, http://aqmeii.jrc.ec.europa.eu/). The calculation of optical properties from simulated aerosol profiles is carried out using a single post-processing tool (FlexAOD, http://pumpkin.aquila.infn.it/flexaod/) that allows explicit and flexible assignment of the underlying assumptions mentioned above. We found that the combination of externally and internally mixed particles weighted through the F_in fraction gives the best agreement between models and observations, in particular regarding the single-scattering albedo.
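The weighting idea can be sketched as a simple blend of the two mixing-state calculations. The black-carbon-fraction proxy formula and all numbers below are illustrative assumptions, not FlexAOD's actual parameterization:

```python
# Hedged sketch of blending externally and internally mixed optical
# properties by the internally mixed fraction F_in, here derived from a
# black-carbon-abundance proxy. The proxy formula and the scale factor
# are illustrative placeholders.

def f_in_from_bc(bc_mass, total_mass, scale=5.0):
    """Map the BC mass fraction (a coating-formation proxy) to [0, 1]."""
    frac = bc_mass / total_mass
    return min(1.0, scale * frac)

def blended_property(prop_external, prop_internal, f_in):
    """Combine a property (e.g. AOD or SSA) computed under the two
    mixing-state assumptions, weighted by F_in."""
    return (1.0 - f_in) * prop_external + f_in * prop_internal

f_in = f_in_from_bc(bc_mass=0.1, total_mass=1.0)   # 0.5 with these values
ssa = blended_property(prop_external=0.95, prop_internal=0.89, f_in=f_in)
```

In the study itself each weight would come from either the NOz/NOx aging proxy or the BC-abundance proxy, applied per air mass before comparison against the AERONET retrievals.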
Can quantum transition state theory be defined as an exact t = 0+ limit?
NASA Astrophysics Data System (ADS)
Jang, Seogjoo; Voth, Gregory A.
2016-02-01
The definition of the classical transition state theory (TST) as a t → 0+ limit of the flux-side time correlation function relies on the assumption that simultaneous measurement of population and flux is a well defined physical process. However, the noncommutativity of the two measurements in quantum mechanics makes the extension of such a concept to the quantum regime impossible. For this reason, quantum TST (QTST) has been generally accepted as any kind of quantum rate theory reproducing the TST in the classical limit, and there has been a broad consensus that no unique QTST retaining all the properties of TST can be defined. Contrary to this widely held view, Hele and Althorpe (HA) [J. Chem. Phys. 138, 084108 (2013)] recently suggested that a true QTST can be defined as the exact t → 0+ limit of a certain kind of quantum flux-side time correlation function and that it is equivalent to the ring polymer molecular dynamics (RPMD) TST. This work seeks to question and clarify certain assumptions underlying these suggestions and their implications. First, the time correlation function used by HA as a starting expression is not related to the kinetic rate constant by virtue of linear response theory, which is the first important step in relating a t = 0+ limit to a physically measurable rate. Second, a theoretical analysis calls into question a key step in HA's proof which appears not to rely on an exact quantum mechanical identity. The correction of this makes the true t = 0+ limit of HA's QTST different from the RPMD-TST rate expression, but rather equal to the well-known path integral quantum transition state theory rate expression for the case of centroid dividing surface. An alternative quantum rate expression is then formulated starting from the linear response theory and by applying a recently developed formalism of real time dynamics of imaginary time path integrals [S. Jang, A. V. Sinitskiy, and G. A. Voth, J. Chem. Phys. 140, 154103 (2014)]. 
It is shown that the t → 0+ limit of the new rate expression vanishes in the exact quantum limit.
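For context, the classical flux-side correlation function and the two limits at issue can be written as follows (generic notation, not necessarily that of Hele and Althorpe):

```latex
% Classical flux-side correlation function: s is the reaction coordinate
% measured from the dividing surface, h the Heaviside step function, and
% the angle brackets a thermal average over the reactant ensemble.
\begin{align}
  C_{\mathrm{fs}}(t) &= \left\langle \delta\!\left(s(0)\right)\,
      \dot{s}(0)\, h\!\left(s(t)\right) \right\rangle, \\
  k_{\mathrm{TST}}\, Q_r &= \lim_{t \to 0^+} C_{\mathrm{fs}}(t),
  \qquad
  k\, Q_r = \lim_{t \to \infty} C_{\mathrm{fs}}(t).
\end{align}
```

Classically the t → 0+ limit is well defined because the flux and side operators can be evaluated simultaneously; the abstract's argument is that the noncommutativity of the corresponding quantum operators is exactly what blocks a unique quantum analogue of this limit.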
Simulation of the hybrid and steady state advanced operating modes in ITER
NASA Astrophysics Data System (ADS)
Kessel, C. E.; Giruzzi, G.; Sips, A. C. C.; Budny, R. V.; Artaud, J. F.; Basiuk, V.; Imbeaux, F.; Joffrin, E.; Schneider, M.; Murakami, M.; Luce, T.; St. John, Holger; Oikawa, T.; Hayashi, N.; Takizuka, T.; Ozeki, T.; Na, Y.-S.; Park, J. M.; Garcia, J.; Tucillo, A. A.
2007-09-01
Integrated simulations are performed to establish a physics basis, in conjunction with present tokamak experiments, for the operating modes in the International Thermonuclear Experimental Reactor (ITER). Simulations of the hybrid mode are done using both fixed and free-boundary 1.5D transport evolution codes including CRONOS, ONETWO, TSC/TRANSP, TOPICS and ASTRA. The hybrid operating mode is simulated using the GLF23 and CDBM05 energy transport models. The injected powers are limited to the negative ion neutral beam, ion cyclotron and electron cyclotron heating systems. Several plasma parameters and source parameters are specified for the hybrid cases to provide a comparison of 1.5D core transport modelling assumptions, source physics modelling assumptions, as well as numerous peripheral physics modelling. Initial results indicate that very strict guidelines will need to be imposed on the application of GLF23, for example, to make useful comparisons. Some of the variations among the simulations are due to source models which vary widely among the codes used. In addition, there are a number of peripheral physics models that should be examined, some of which include fusion power production, bootstrap current, treatment of fast particles and treatment of impurities. The hybrid simulations project to fusion gains of 5.6-8.3, βN values of 2.1-2.6 and fusion powers ranging from 350 to 500 MW, under the assumptions outlined in section 3. Simulations of the steady state operating mode are done with the same 1.5D transport evolution codes cited above, except the ASTRA code. In these cases the energy transport model is more difficult to prescribe, so that energy confinement models will range from theory based to empirically based. The injected powers include the same sources as used for the hybrid with the possible addition of lower hybrid. 
The simulations of the steady state mode project to fusion gains of 3.5-7, βN values of 2.3-3.0 and fusion powers of 290 to 415 MW, under the assumptions described in section 4. These simulations will be presented and compared with particular focus on the resulting temperature profiles, source profiles and peripheral physics profiles. The steady state simulations are at an early stage and are focused on developing a range of safety factor profiles with 100% non-inductive current.
Cook, Sharon A; Rosser, Robert; Toone, Helen; James, M Ian; Salmon, Peter
2006-01-01
Elective cosmetic surgery is expanding in the UK in both the public and private sectors. Because resources are constrained, many cosmetic procedures are being excluded within the National Health Service. If guidelines on who can receive such surgery are to be evidence-based, information is needed about the level of dysfunction in patients referred for elective surgery and whether this is related to their degree of physical abnormality. Consecutive patients referred to a regional plastic surgery and burns unit for assessment for elective cosmetic surgery completed standardised measures of physical and psychosocial dysfunction, and indicated their perception of the degree of their abnormality and their preoccupation with it. We distinguished between patients referred for physical reasons or appearance reasons only, and compared levels of physical and psychosocial dysfunction in each with published values for community and clinical samples. Surgeons indicated patients' degree of objective abnormality, and we identified the relationship of dysfunction with perceived and objective abnormality and preoccupation. Whether patients sought surgery for physical or appearance reasons, physical function was normal. Those seeking surgery for appearance reasons only had moderate psychosocial dysfunction, but were not as impaired as clinical groups with psychological problems. Patients seeking the correction of minor skin lesions for purely appearance reasons reported excellent physical and psychosocial function. Level of function was related (negatively) to patients' preoccupation with abnormality rather than to their perceived or objective abnormality. In general, patients referred for elective cosmetic surgery did not present with significant levels of dysfunction. Moreover, levels of functioning were related to preoccupation rather than to objective abnormality. Therefore, for most patients, whether surgical treatment is generally appropriate is questionable. 
Future guidelines must seek to identify the small minority who do have a clinical need for surgery.
Scientific Reasoning Abilities of Nonscience Majors in Physics-Based Courses
ERIC Educational Resources Information Center
Moore, J. Christopher; Rubbo, Louis J.
2012-01-01
We have found that non-STEM (science, technology, engineering, and mathematics) majors taking either a conceptual physics or astronomy course at two regional comprehensive institutions score significantly lower preinstruction on Lawson's Classroom Test of Scientific Reasoning (LCTSR) in comparison to the national average for STEM majors. Based on…
Behavioral, Psychological, and Demographic Predictors of Physical Fitness.
ERIC Educational Resources Information Center
Conway, Terry L.
Achieving higher levels of physical fitness has become a goal of many Americans both for personal reasons (e.g., improved health, appearance, and perceived well-being) and for organizational reasons (e.g., corporate cost-savings with healthy employees, operational readiness for the military services). Understanding the factors which have an impact…
Addressing Barriers to Conceptual Understanding in IE Physics Classes
NASA Astrophysics Data System (ADS)
Coletta, Vincent P.; Phillips, Jeffrey A.
2009-11-01
We report on the Thinking in Physics project, which helps students who demonstrate weak scientific reasoning skills, as measured by low preinstruction scores on the Lawson Test of Scientific Reasoning Ability. Without special help, such students are unlikely to achieve a good conceptual understanding of introductory mechanics.
Limiting Conditions of the "Physical Attractiveness Stereotype": Attributions about Divorce.
ERIC Educational Resources Information Center
Brigham, John C.
1980-01-01
Subjects, reading a profile of a couple filing for divorce, made attributions about responsibility, financial settlement, future behavior, and personality traits. Reasons for divorce, physical attractiveness of husband and wife, and sex of subject were varied. Attractiveness strongly influenced personality ratings. Reason for divorce was related…
Questioning Engelhardt's assumptions in Bioethics and Secular Humanism.
Ahmadi Nasab Emran, Shahram
2016-06-01
In Bioethics and Secular Humanism: The Search for a Common Morality, Tristram Engelhardt examines various possibilities for finding common ground for moral discourse among people from different traditions and concludes that they are futile. In this paper I will argue that many of the assumptions on which Engelhardt bases his conclusion about the impossibility of a content-full secular bioethics are problematic. By starting with the notion of moral strangers, there is no possibility, by definition, for a content-full moral discourse among moral strangers. There is thus circularity in starting the inquiry with a definition of moral strangers, which implies that they do not share enough moral background or commitment to an authority to allow for reaching a moral agreement, and concluding that content-full morality is impossible among moral strangers. I argue that treating traditions as solid and immutable structures that insulate people across their boundaries is problematic. Another questionable assumption in Engelhardt's work is the idea that religious and philosophical traditions provide content-full moralities. I analyze his foundationalist account of moral reasoning and knowledge, the cardinal assumption in Engelhardt's review of the various alternatives for a content-full moral discourse among moral strangers, and indicate the possibility of other ways of moral knowledge besides the foundationalist one. Then, I examine Engelhardt's view concerning the futility of attempts at justifying a content-full secular bioethics, and indicate how these assumptions have shaped Engelhardt's critique of the alternatives for the possibility of content-full secular bioethics.
Latent class instrumental variables: A clinical and biostatistical perspective
Baker, Stuart G.; Kramer, Barnett S.; Lindeman, Karen S.
2015-01-01
In some two-arm randomized trials, some participants receive the treatment assigned to the other arm as a result of technical problems, refusal of a treatment invitation, or a choice of treatment in an encouragement design. In some before-and-after studies, the availability of a new treatment changes from one time period to the next. Under assumptions that are often reasonable, the latent class instrumental variable (IV) method estimates the effect of treatment received in the aforementioned scenarios involving all-or-none compliance and all-or-none availability. Key aspects are four initial latent classes (sometimes called principal strata) based on treatment received if in each randomization group or time period, the exclusion restriction assumption (in which randomization group or time period is an instrumental variable), the monotonicity assumption (which drops an implausible latent class from the analysis), and the estimated effect of receiving treatment in one latent class (sometimes called efficacy, the local average treatment effect, or the complier average causal effect). Since its independent formulations in the biostatistics and econometrics literatures, the latent class IV method (which has no well-established name) has gained increasing popularity. We review the latent class IV method from a clinical and biostatistical perspective, focusing on underlying assumptions, methodological extensions, and applications in our fields of obstetrics and cancer research. PMID:26239275
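Under the exclusion restriction and monotonicity assumptions named above, the complier average causal effect reduces to a Wald ratio of intention-to-treat effects. A minimal sketch with illustrative data:

```python
# Hedged sketch of the complier average causal effect (CACE) estimator
# for all-or-none compliance: the ratio of the intention-to-treat effect
# on the outcome to the effect on treatment receipt. Data are illustrative.
# Z = randomization arm, D = treatment received, Y = outcome.

def cace(z, d, y):
    """(E[Y|Z=1] - E[Y|Z=0]) / (E[D|Z=1] - E[D|Z=0]), valid under the
    exclusion restriction and monotonicity (no defiers) assumptions."""
    mean = lambda xs: sum(xs) / len(xs)
    y1 = mean([yi for zi, yi in zip(z, y) if zi == 1])
    y0 = mean([yi for zi, yi in zip(z, y) if zi == 0])
    d1 = mean([di for zi, di in zip(z, d) if zi == 1])
    d0 = mean([di for zi, di in zip(z, d) if zi == 0])
    return (y1 - y0) / (d1 - d0)

# Four assigned to treatment (half comply), four to control (none treated):
Z = [1, 1, 1, 1, 0, 0, 0, 0]
D = [1, 1, 0, 0, 0, 0, 0, 0]
Y = [3, 3, 1, 1, 1, 1, 1, 1]
assert cace(Z, D, Y) == 2.0   # ITT on Y = 1.0, ITT on D = 0.5
```

The division by the compliance rate is what distinguishes this from an intention-to-treat analysis: it attributes the whole outcome difference to the latent class of compliers.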
Kenne, Deric R; Hamilton, Kelsey; Birmingham, Lauren; Oglesby, Willie H; Fischbein, Rebecca L; Delahanty, Douglas L
2017-01-02
Since the early 1990s, the United States has seen a significant increase in the prevalence of prescription opioid misuse. Despite the benefits prescription opioids provide, misuse can be fatal. The current study was designed to investigate the prevalence of prescription opioid misuse, perceived harm of misuse, and reasons for misuse for physical or emotional pain instead of seeking professional medical or mental health treatment. Survey data were collected in the fall of 2013 via an online survey of a random sample of 668 students from a public Midwestern university. Lifetime prevalence of prescription opioid misuse was 9.5%. Misusers of prescription opioid drugs generally reported lower ratings of perceived harm as compared to individuals not reporting misuse of prescription opioid drugs. Primary reasons for misuse of prescription opioid drugs were to relieve pain (33.9%), "to feel good/get high" (23.2%) and experimentation (21.4%). Lifetime misuse of a prescription opioid drug for physical or emotional pain was reported by 8.1% and 2.2% of respondents, respectively. Primary reasons for misuse for physical pain included because pain was temporary, immediate relief was needed, and no health insurance/financial resources. Primary reasons for misuse for emotional pain included not wanting others to find out, embarrassment and fear. Conclusions/Importance: Reasons for misuse of prescription opioid drugs vary by type of prescription opioid drug. Reasons for not seeking treatment that ultimately lead to misuse vary by type of pain being treated and may be important considerations in the effort to stem the misuse of prescription opioid drugs among college students.
ERIC Educational Resources Information Center
Law, James; Reilly, Sheena; Snow, Pamela C.
2013-01-01
Background: Historically speech and language therapy services for children have been framed within a rehabilitative framework with explicit assumptions made about providing therapy to individuals. While this is clearly important in many cases, we argue that this model needs revisiting for a number of reasons. First, our understanding of the nature…
1989-06-01
Measurable goals and milestones are supported by action plans which include underlying assumptions, allocation of responsibility, and resource... military developments... the relationship between hardware and the environment... environmental laws?... reasons for starting where he did... supporting integrated testing... understanding of the concerns of the military services and which would set... Managing test resources... Evaluating system...
Code of Federal Regulations, 2010 CFR
2010-04-01
... representative experience may be used as an assumed retirement age. Different basic assumptions or rates may be used for different classes of risks or different groups where justified by conditions or required by... proper, or except when a change is necessitated by reason of the use of different methods, factors...
More on enrolling female students in science and engineering.
Townley, Cynthia
2010-06-01
This paper investigates reasons for practices and policies that are designed to promote higher levels of enrollment by women in scientific disciplines. It challenges the assumptions and problematic arguments of a recent article questioning their legitimacy. Considering the motivations for and merits of such programs suggests a practical response to the question of whether there should be programs to attract female science and engineering students.
ERIC Educational Resources Information Center
MDRC, 2016
2016-01-01
Many social policy and education programs start from the assumption that people act in their best interest. But behavioral science shows that people often weigh intuition over reason, make inconsistent choices, and put off big decisions. The individuals and families who need services and the staff who provide them are no exception. From city…
Generalizing on Multiple Grounds: Performance Learning in Model-Based Troubleshooting
1989-02-01
Artificial Intelligence, 24, 1984. [Ble88] Guy E. Blelloch. Scan Primitives and Parallel Vector Models. PhD thesis, Artificial Intelligence Laboratory... Diagnostic reasoning based on structure and behavior. Artificial Intelligence, 24, 1984. [dK86] J. de Kleer. An assumption-based truth maintenance system... diagnosis. Artificial Intelligence, 24. [Ham87] Kristian J. Hammond. Learning to anticipate and avoid planning problems
A shielding theory for upward lightning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shindo, Takatoshi; Aihara, Yoshinori
1993-01-01
A new shielding theory is proposed based on the assumption that the occurrence of lightning strokes on the Japan Sea coast in winter is due to the inception of upward leaders from tall structures. Ratios of the numbers of lightning strokes to high structures observed there in winter show reasonable agreement with values calculated by this theory. Shielding characteristics of a high structure in various conditions are predicted.
ERIC Educational Resources Information Center
Sanders, Martie; George, Ann
2017-01-01
This review paper focuses on likely reasons for the rhetoric-reality gap in the use of educational information and communication technology. It is based on the assumption that the present challenges being experienced with educational ICT might be avoided in the future if we look at the current challenges from a different perspective, by revisiting…
ERIC Educational Resources Information Center
Kambon, Kobi K. K.; Hopkins, Reginald
1993-01-01
In "On the Desirability of Own-Group Preference" (1993), Michael L. Penn, Stanley O. Gaines, and Layli Phillips argue that misguided and mythical ideal of racial-social integration in America is the only reasonable and effective foundation for real African empowerment in American society. Serious intellectual battle will be required to…
Thieler, E.R.; Pilkey, O.H.; Young, R.S.; Bush, D.M.; Chai, F.
2000-01-01
A number of assumed empirical relationships (e.g., the Bruun Rule, the equilibrium shoreface profile, longshore transport rate equation, beach length: durability relationship, and the renourishment factor) and deterministic numerical models (e.g., GENESIS, SBEACH) have become important tools for investigating coastal processes and for coastal engineering design in the U.S. They are also used as the basis for making public policy decisions, such as the feasibility of nourishing recreational beaches. A review of the foundations of these relationships and models, however, suggests that they are inadequate for the tasks for which they are used. Many of the assumptions used in analytical and numerical models are not valid in the context of modern oceanographic and geologic principles. We believe the models are oversimplifications of complex systems that are poorly understood. There are several reasons for this, including: (1) poor assumptions and important omissions in model formulation; (2) the use of relationships of questionable validity to predict the morphologic response to physical forcing; (3) the lack of hindsighting and objective evaluation of beach behavior predictions for engineering projects; (4) the incorrect use of model calibration and verification as assertions of model veracity; and (5) the fundamental inability to predict coastal evolution quantitatively at the engineering and planning time and space scales our society assumes and demands. It is essential that coastal geologists, beach designers and coastal modelers understand these model limitations. Each important model assumption must be examined in isolation; incorporating them into a model does not improve their validity. It is our belief that the models reviewed here should not be relied on as a design tool until they have been substantially modified and proven in real-world situations. The 'solution,' however, is not to increase the complexity of a model by increasing the number of variables. 
What is needed is a thoughtful review of what beach behavior questions should or could be answered by modeling. Viable alternatives to the use of models do exist to predict the behavior of beaches. Three such alternatives to models are discussed for nourished beach design.
Supersymmetry: Compactification, flavor, and dualities
NASA Astrophysics Data System (ADS)
Heidenreich, Benjamin Jones
We describe several new research directions in the area of supersymmetry. In the context of low-energy supersymmetry, we show that the assumption of R-parity can be replaced with the minimal flavor violation hypothesis, solving the issue of nucleon decay and the new physics flavor problem in one stroke. The assumption of minimal flavor violation uniquely fixes the form of the baryon number violating vertex, leading to testable predictions. The NLSP is unstable, and decays promptly to jets, evading stringent bounds on vanilla supersymmetry from LHC searches, whereas the gravitino is long-lived, and can be a dark matter component. In the case of a sbottom LSP, neutral mesinos can form and undergo oscillations before decaying, leading to same sign tops, and allowing us to place constraints on the model in this case. We show that this well-motivated phenomenology can be naturally explained by spontaneously breaking a gauged flavor symmetry at a high scale in the presence of additional vector-like quarks, leading to mass mixings which simultaneously generate the flavor structure of the baryon-number violating vertex and the Standard Model Yukawa couplings, explaining their minimal flavor violating structure. We construct a model which is robust against Planck suppressed corrections and which also solves the mu problem. In the context of flux compactifications, we begin a study of the local geometry near a stack of D7 branes supporting a gaugino condensate, an integral component of the KKLT scenario for Kahler moduli stabilization. We obtain an exact solution for the geometry in a certain limit using reasonable assumptions about symmetries, and argue that this solution exhibits BPS domain walls, as expected from field theory arguments. We also begin a larger program of understanding general supersymmetric compactifications of type IIB string theory, reformulating previous results in an SL(2, R ) covariant fashion. 
Finally, we present extensive evidence for a new class of N = 1 gauge theory dualities relating different world-volume gauge theories of D3 branes probing an orientifold singularity. We argue that these dualities originate from the S-duality of type IIB string theory, much like electromagnetic dualities of N = 4 gauge theories.
Was There a Decelerating Past for the Universe?
NASA Astrophysics Data System (ADS)
John, Moncy V.
2006-03-01
Some analyses of the apparent magnitude-redshift data of type Ia supernovas indicate that the suspected dark energy in the universe cannot be regarded as a cosmological constant of general relativistic origin or as the vacuum energy encountered in quantum field theories. If this is the case, our knowledge of the physical world remains deficient since no tested theory involves such a dark energy. Under this circumstance, an equation of state of the form p = wρ is not well-motivated and one is unable to use the Einstein equation in this case as well. I argue that the very method of analysing the data by assuming exotic energy densities with strange equations of state itself is misleading and the reasonable remaining option is to make a model-independent analysis of SNe data, without reference to the energy densities. In this basically kinematic approach, we limit ourselves to the observationally justifiable assumptions of homogeneity and isotropy, i.e., to the assumption that the universe has a RW metric. This cosmographic approach is historically the original one in cosmology. The analysis was performed by expanding the scale factor into a fifth-order polynomial, an assumption that can be further generalized to any order. The values obtained for the present expansion rates h, q0, r0 etc. are relevant, since any cosmological solution would ultimately need to explain them. Using this method, we address an important question relevant to cosmology: Was there a decelerating past for the universe? To answer this, Bayesian probability theory is employed, which is the most appropriate tool for quantifying our knowledge when it changes through the acquisition of new data. The cosmographic approach helps to sort out models which were always accelerating from those which decelerated for at least some time in the period of interest. A Bayesian model comparison technique is used to discriminate these rival hypotheses with the aid of recent releases of supernova data. 
It is argued that the lessons learned using Bayesian theory are extremely valuable to avoid frequent U-turns in cosmology.
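For context, the present expansion rates h, q0, and r0 quoted above are the coefficients of the standard cosmographic Taylor expansion of the scale factor about the present epoch t0 (a textbook expansion, not taken from the paper itself):

```latex
\frac{a(t)}{a(t_0)} = 1 + H_0\,(t-t_0) - \frac{1}{2}\,q_0 H_0^2\,(t-t_0)^2
                    + \frac{1}{6}\,r_0 H_0^3\,(t-t_0)^3 + \cdots
```

with \(H_0 = \dot{a}/a\,|_{t_0}\), \(q_0 = -\ddot{a}a/\dot{a}^2\,|_{t_0}\), and \(r_0 = \dddot{a}a^2/\dot{a}^3\,|_{t_0}\). A decelerating past corresponds to the deceleration parameter q having been positive at some earlier epoch.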
Sharma, Andrea J; Vesco, Kimberly K; Bulkley, Joanna; Callaghan, William M; Bruce, F Carol; Staab, Jenny; Hornbrook, Mark C; Berg, Cynthia J
2016-10-01
Objectives Low gestational weight gain (GWG) in the second and third trimesters has been associated with increased risk of preterm delivery (PTD) among women with a body mass index (BMI) < 25 kg/m(2). However, few studies have examined whether this association differs by the assumptions made for first trimester gain or by the reason for PTD. Methods We examined singleton pregnancies during 2000-2008 among women with a BMI < 25 kg/m(2) who delivered a live-birth ≥28 weeks gestation (n = 12,526). Women received care within one integrated health care delivery system and began prenatal care ≤13 weeks. Using antenatal weights measured during clinic visits, we interpolated GWG at 13 weeks gestation then estimated rate of GWG (GWGrate) during the second and third trimesters of pregnancy. We also estimated GWGrate using the common assumption of a 2-kg gain for all women by 13 weeks. We examined the covariate-adjusted association between quartiles of GWGrate and PTD (28-36 weeks gestation) using logistic regression. We also examined associations by reason for PTD [premature rupture of membranes (PROM), spontaneous labor, or medically indicated]. Results Mean GWGrate did not differ between term and preterm pregnancies regardless of interpolated or assumed GWG at 13 weeks. However, only with GWGrate estimated from interpolated GWG at 13 weeks, we observed a U-shaped relationship where odds of PTD increased with GWGrate in the lowest (OR 1.36, 95 % CI 1.10, 1.69) or highest quartile (OR 1.49, 95 % CI 1.20, 1.85) compared to GWGrate within the second quartile. Further stratifying by reason, GWGrate in the lowest quartile was positively associated with spontaneous PTD while GWGrate in the highest quartile was positively associated with PROM and medically indicated PTD. Conclusions Accurate estimates of first trimester GWG are needed. Common assumptions applied to all pregnancies may obscure the association between GWGrate and PTD. 
Further research is needed to fully understand whether these associations are causal or related to common antecedents.
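The quartile odds ratios reported above come from covariate-adjusted logistic regression; the unadjusted version of such a comparison reduces to a 2x2 odds ratio with a Wald confidence interval. A minimal sketch using hypothetical counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for an exposed group (a cases / b non-cases) versus a
    reference group (c cases / d non-cases), with a Wald 95% confidence
    interval computed on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: preterm vs term deliveries in the lowest GWG-rate
# quartile (exposed) compared with the second quartile (reference).
or_, lo, hi = odds_ratio_ci(a=60, b=940, c=45, d=955)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```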
NASA Technical Reports Server (NTRS)
Willett, J. C.; LeVine, D. M.
2002-01-01
Direct current measurements are available near the attachment point from both natural cloud-to-ground lightning and rocket-triggered lightning, but little is known about the rise time and peak amplitude of return-stroke currents aloft. We present, as functions of height, current amplitudes, rise times, and effective propagation velocities that have been estimated with a novel remote-sensing technique from data on 24 subsequent return strokes in six different lightning flashes that were triggered at the NASA Kennedy Space Center, FL, during 1987. The unique feature of this data set is the stereo pairs of still photographs, from which three-dimensional channel geometries were determined previously. This has permitted us to calculate the fine structure of the electric-field-change (E) waveforms produced by these strokes, using the current waveforms measured at the channel base together with physically reasonable assumptions about the current distributions aloft. The computed waveforms have been compared with observed E waveforms from the same strokes, and our assumptions have been adjusted to maximize agreement. In spite of the non-uniqueness of solutions derived by this technique, several conclusions seem inescapable: 1) The effective propagation speed of the current up the channel is usually significantly (but not unreasonably) faster than the two-dimensional velocity measured by a streak camera for 14 of these strokes. 2) Given the deduced propagation speed, the peak amplitude of the current waveform often must decrease dramatically with height to prevent the electric field from being over-predicted. 3) The rise time of the current wave front must always increase rapidly with height in order to keep the fine structure of the calculated field consistent with the observations.
NASA Astrophysics Data System (ADS)
Deng, Zongyi
2001-05-01
The distinction between key ideas in teaching a high school science and key ideas in the corresponding discipline of science has been largely ignored in scholarly discourse about what science teachers should teach and about what they should know. This article clarifies this distinction through exploring how and why key ideas in teaching high school physics differ from key ideas in the discipline of physics. Its theoretical underpinnings include Dewey's (1902/1990) distinction between the psychological and the logical and Harré's (1986) epistemology of science. It analyzes how and why the key ideas in teaching color, the speed of light, and light interference at the high school level differ from the key ideas at the disciplinary level. The thesis is that key ideas in teaching high school physics can differ from key ideas in the discipline in some significant ways, and that the differences manifest Dewey's distinction. As a result, the article challenges the assumption of equating key ideas in teaching a high school science with key ideas in the corresponding discipline of science, and the assumption that having a college degree in science is sufficient to teach high school science. Furthermore, the article expands the concept of pedagogical content knowledge by arguing that key ideas in teaching high school physics constitute an essential component.
NASA Technical Reports Server (NTRS)
Liu, Xiao-Feng; Thomas, Flint O.; Nelson, Robert C.
2001-01-01
Turbulence kinetic energy (TKE) is a very important quantity for turbulence modeling and the budget of this quantity in its transport equation can provide insight into the flow physics. Turbulence kinetic energy budget measurements were conducted for a symmetric turbulent wake flow subjected to constant zero, favorable and adverse pressure gradients in year three of the research effort. The purpose of this study is to clarify the flow physics issues underlying the demonstrated influence of pressure gradient on wake development and provide experimental support for turbulence modeling. To ensure the reliability of these notoriously difficult measurements, the experimental procedure was carefully designed on the basis of an uncertainty analysis. Four different approaches, based on an isotropic turbulence assumption, a locally axisymmetric homogeneous turbulence assumption, a semi-isotropy assumption and a forced balance of the TKE equation, were applied for the estimate of the dissipation term. The pressure transport term is obtained from a forced balance of the turbulence kinetic energy equation. This report will present the results of the turbulence kinetic energy budget measurement and discuss their implications for the development of strained turbulent wakes.
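Under the isotropic turbulence assumption mentioned above, the dissipation term reduces to a single measurable velocity-derivative variance (a standard result quoted here for context, not the report's exact working):

```latex
\varepsilon_{\mathrm{iso}} = 15\,\nu \left\langle \left( \frac{\partial u'}{\partial x} \right)^{2} \right\rangle
```

where \(\nu\) is the kinematic viscosity and \(u'\) the streamwise velocity fluctuation; the locally axisymmetric and semi-isotropy estimates replace the single factor of 15 with combinations of additional measured derivative variances.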
Lost time: Bindings do not represent temporal order information.
Moeller, Birte; Frings, Christian
2018-06-04
Many accounts of human action control assume bindings between features of stimuli and responses of individual events. One widely accepted assumption about these bindings is that they do not contain temporal-order representations regarding the integrated elements. Even though several theories either explicitly or implicitly include it, this assumption has never been tested directly. One reason for this lack of evidence is likely that typical stimulus-response binding paradigms are inapt for such a test. Adapting a new paradigm of response-response binding to include order switches between response integration and retrieval, we were able to analyze possible representation of order information in bindings for the first time. Binding effects were identical for intact and switched response orders, indicating that bindings indeed include no temporal-order information.
Park, H M; Lee, J S; Kim, T W
2007-11-15
In the analysis of electroosmotic flows, the internal electric potential is usually modeled by the Poisson-Boltzmann equation. The Poisson-Boltzmann equation is derived from the assumption of thermodynamic equilibrium where the ionic distributions are not affected by fluid flows. Although this is a reasonable assumption for steady electroosmotic flows through straight microchannels, there are some important cases where convective transport of ions has nontrivial effects. In these cases, it is necessary to adopt the Nernst-Planck equation instead of the Poisson-Boltzmann equation to model the internal electric field. In the present work, the predictions of the Nernst-Planck equation are compared with those of the Poisson-Boltzmann equation for electroosmotic flows in various microchannels where the convective transport of ions is not negligible.
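For a symmetric binary electrolyte, the two models compared above differ in whether ion transport is coupled to the flow; a standard statement of each is shown here for context (textbook forms, not the paper's exact equations):

```latex
% Poisson-Boltzmann: equilibrium ion distributions, no convective coupling
\nabla^2 \psi = \frac{2 z e n_0}{\varepsilon} \sinh\!\left(\frac{z e \psi}{k_B T}\right)

% Nernst-Planck: ion transport by convection, diffusion, and electromigration
\frac{\partial n_\pm}{\partial t} + \nabla\cdot\!\left(n_\pm \mathbf{u}
  - D_\pm \nabla n_\pm \mp \frac{z e D_\pm}{k_B T}\, n_\pm \nabla\psi\right) = 0
```

When the convective flux \(n_\pm \mathbf{u}\) is negligible and the ions reach equilibrium, the Nernst-Planck description recovers the Boltzmann distribution and hence the Poisson-Boltzmann equation, which is why the two models agree for steady flow in straight microchannels.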
Uncovering Metaethical Assumptions in Bioethical Discourse across Cultures.
Sullivan, Laura Specker
2016-03-01
Much of bioethical discourse now takes place across cultures. This does not mean that cross-cultural understanding has increased. Many cross-cultural bioethical discussions are marked by entrenched disagreement about whether and why local practices are justified. In this paper, I argue that a major reason for these entrenched disagreements is that problematic metaethical commitments are hidden in these cross-cultural discourses. Using the issue of informed consent in East Asia as an example of one such discourse, I analyze two representative positions in the discussion and identify their metaethical commitments. I suggest that the metaethical assumptions of these positions result from their shared method of ethical justification: moral principlism. I then show why moral principlism is problematic in cross-cultural analyses and propose a more useful method for pursuing ethical justification across cultures.
CDMBE: A Case Description Model Based on Evidence
Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing
2015-01-01
By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), suitable for the continental law system, is proposed to describe criminal cases. The model's logic adopts credibility-based logical reasoning and performs evidence-based reasoning quantitatively from the evidence. To be consistent with practical inference rules, five types of relationship and a set of rules are defined to calculate the credibility of assumptions based on the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a figure and that the results calculated from CDMBE are in line with those from a Bayesian model. PMID:26421006
Influence of condensed species on thermo-physical properties of LTE and non-LTE SF6-Cu mixture
NASA Astrophysics Data System (ADS)
Chen, Zhexin; Wu, Yi; Yang, Fei; Sun, Hao; Rong, Mingzhe; Wang, Chunlin
2017-10-01
SF6-Cu mixture is frequently formed in high-voltage circuit breakers due to the electrode erosion and metal vapor diffusion. During the interruption process, the multiphase effect and deviation from local thermal equilibrium (non-LTE assumption) can both affect the thermo-physical properties of the arc plasma and further influence the performance of the circuit breaker. In this paper, thermo-physical properties, namely composition, thermodynamic properties and transport coefficients, are calculated for multiphase SF6-Cu mixture with and without the LTE assumption. The composition is confirmed by combining the classical two-temperature mass action law with a phase equilibrium condition deduced from the second law of thermodynamics. The thermodynamic properties and transport coefficients are calculated using the multiphase composition result. The influence of condensed species on thermo-physical properties is discussed at different temperatures, pressures (0.1-10 atm), non-equilibrium degrees (1-10), and copper molar proportions (0-50%). It is found that the multiphase effect has significant influence on specific enthalpy, specific heat and heavy species thermal conductivity in both LTE and non-LTE SF6-Cu systems. This paper provides a more accurate database for computational fluid dynamics calculations.
Trends and associated uncertainty in the global mean temperature record
NASA Astrophysics Data System (ADS)
Poppick, A. N.; Moyer, E. J.; Stein, M.
2016-12-01
Physical models suggest that the Earth's mean temperature warms in response to changing CO2 concentrations (and hence increased radiative forcing); given physical uncertainties in this relationship, the historical temperature record is a source of empirical information about global warming. A persistent thread in many analyses of the historical temperature record, however, is the reliance on methods that appear to deemphasize both physical and statistical assumptions. Examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for natural variability in nonparametric rather than parametric ways. We show here that methods that deemphasize assumptions can limit the scope of analysis and can lead to misleading inferences, particularly in the setting considered where the data record is relatively short and the scale of temporal correlation is relatively long. A proposed model that is simple but physically informed provides a more reliable estimate of trends and allows a broader array of questions to be addressed. In accounting for uncertainty, we also illustrate how parametric statistical models that are attuned to the important characteristics of natural variability can be more reliable than ostensibly more flexible approaches.
Model-Based Reasoning in the Physics Laboratory: Framework and Initial Results
ERIC Educational Resources Information Center
Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.
2015-01-01
We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable…
Sánchez Tapia, Ingrid; Gelman, Susan A; Hollander, Michelle A; Manczak, Erika M; Mannheim, Bruce; Escalante, Carmen
2016-05-01
Teleological reasoning involves the assumption that entities exist for a purpose (giraffes have long necks for reaching leaves). This study examines how teleological reasoning relates to cultural context, by studying teleological reasoning in 61 Quechua-speaking Peruvian preschoolers (Mage = 5.3 years) and adults in an indigenous community, compared to 72 English-speaking U.S. preschoolers (Mage = 4.9 years) and university students. Data were responses to open-ended "why" questions ("Why is that mountain tall?"). Teleological explanations about nonliving natural kinds were more frequent for children than adults, and for Quechua than for U.S. participants. However, changes with age were importantly distinct from differences corresponding to cultural variation. Developmental and cultural differences in teleological explanations may reflect causal analysis of the features under consideration. © 2016 The Authors. Child Development © 2016 Society for Research in Child Development, Inc.
A simple approach to nonlinear estimation of physical systems
Christakos, G.
1988-01-01
Recursive algorithms for estimating the states of nonlinear physical systems are developed. This requires some key hypotheses regarding the structure of the underlying processes. Members of this class of random processes have several desirable properties for the nonlinear estimation of random signals. An assumption is made about the form of the estimator, which may then take account of a wide range of applications. Under the above assumption, the estimation algorithm is mathematically suboptimal but effective and computationally attractive. It may be compared favorably to Taylor series-type filters, nonlinear filters which approximate the probability density by Edgeworth or Gram-Charlier series, as well as to conventional statistical linearization-type estimators. To link theory with practice, some numerical results for a simulated system are presented, in which the responses from the proposed and the extended Kalman algorithms are compared. ?? 1988.
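For context on the comparison with Taylor series-type filters mentioned above, the extended Kalman filter linearizes the nonlinear state transition about the current estimate via its derivative. A minimal one-dimensional sketch (hypothetical toy system, not the paper's estimator):

```python
import math
import random

def ekf_1d(zs, f, f_prime, h=lambda x: x, h_prime=lambda x: 1.0,
           q=0.01, r=0.5, x0=0.0, p0=1.0):
    """One-dimensional extended Kalman filter: propagates the state with the
    nonlinear map f and its covariance with the linearization f_prime."""
    x, p = x0, p0
    estimates = []
    for z in zs:
        # Predict: propagate the state and the linearized variance.
        x_pred = f(x)
        F = f_prime(x)
        p_pred = F * p * F + q
        # Update: fold in the measurement z.
        H = h_prime(x_pred)
        k = p_pred * H / (H * p_pred * H + r)   # Kalman gain
        x = x_pred + k * (z - h(x_pred))
        p = (1.0 - k * H) * p_pred
        estimates.append(x)
    return estimates

# Toy nonlinear system: x_{t+1} = sin(x_t) + 0.5, observed with noise.
random.seed(1)
truth, x = [], 0.2
for _ in range(200):
    x = math.sin(x) + 0.5
    truth.append(x)
zs = [t + random.gauss(0, 0.7) for t in truth]
est = ekf_1d(zs, f=lambda x: math.sin(x) + 0.5, f_prime=math.cos)

rmse_raw = (sum((z - t) ** 2 for z, t in zip(zs, truth)) / len(truth)) ** 0.5
rmse_ekf = (sum((e - t) ** 2 for e, t in zip(est, truth)) / len(truth)) ** 0.5
print(rmse_ekf < rmse_raw)  # filtering reduces error versus raw measurements
```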
How to (properly) strengthen Bell's theorem using counterfactuals
NASA Astrophysics Data System (ADS)
Bigaj, Tomasz
Bell's theorem in its standard version demonstrates that the joint assumptions of the hidden-variable hypothesis and the principle of local causation lead to a conflict with quantum-mechanical predictions. In his latest counterfactual strengthening of Bell's theorem, Stapp attempts to prove that the locality assumption itself contradicts the quantum-mechanical predictions in the Hardy case. His method relies on constructing a complex, non-truth functional formula which consists of statements about measurements and outcomes in some region R, and whose truth value depends on the selection of a measurement setting in a space-like separated location L. Stapp argues that this fact shows that the information about the measurement selection made in L has to be present in R. I give detailed reasons why this conclusion can and should be resisted. Next I correct and formalize an informal argument by Shimony and Stein showing that the locality condition coupled with Einstein's criterion of reality is inconsistent with quantum-mechanical predictions. I discuss the possibility of avoiding the inconsistency by rejecting Einstein's criterion rather than the locality assumption.
Cointegration and why it works for SHM
NASA Astrophysics Data System (ADS)
Cross, Elizabeth J.; Worden, Keith
2012-08-01
One of the most fundamental problems in Structural Health Monitoring (SHM) is that of projecting out operational and environmental variations from measured feature data. The reason for this is that algorithms used for SHM to detect changes in structural condition should not raise alarms if the structure of interest changes because of benign operational or environmental variations. This is sometimes called the data normalisation problem. Many solutions to this problem have been proposed over the years, but a new approach that uses cointegration, a concept from the field of econometrics, appears to provide a very promising solution. The theory of cointegration is mathematically complex and its use is based on the holding of a number of assumptions on the time series to which it is applied. An interesting observation that has emerged from its applications to SHM data is that the approach works very well even though the aforementioned assumptions do not hold in general. The objective of the current paper is to discuss how the cointegration assumptions break down individually in the context of SHM and to explain why this does not invalidate the application of the algorithm.
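The core idea can be illustrated without the full econometric machinery: two nonstationary series that share a common trend admit a linear combination (the cointegrating residual) that is stationary, so the residual can serve as a trend-free indicator for damage detection. A minimal illustration with simulated data (not SHM measurements):

```python
import random

random.seed(0)

# Common stochastic trend (a random walk), e.g. a shared environmental driver.
n = 2000
trend = [0.0]
for _ in range(n - 1):
    trend.append(trend[-1] + random.gauss(0, 1))

# Two observed "features" that both load on the common trend.
x = [t + random.gauss(0, 0.5) for t in trend]
y = [2.0 * t + random.gauss(0, 0.5) for t in trend]

# Least-squares estimate of the cointegrating coefficient (regress y on x).
mx = sum(x) / n
my = sum(y) / n
beta = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        / sum((xi - mx) ** 2 for xi in x))

# The residual y - beta*x removes the shared trend and is (near-)stationary.
resid = [yi - beta * xi for xi, yi in zip(x, y)]

def std(v):
    m = sum(v) / len(v)
    return (sum((vi - m) ** 2 for vi in v) / len(v)) ** 0.5

print(round(beta, 2))              # close to the true loading ratio of 2
print(std(resid) < 0.1 * std(y))   # residual variation is far below the series'
```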
Questions for Assessing Higher-Order Cognitive Skills: It's Not Just Bloom’s
Lemons, Paula P.; Lemons, J. Derrick
2013-01-01
We present an exploratory study of biologists’ ideas about higher-order cognition questions. We documented the conversations of biologists who were writing and reviewing a set of higher-order cognition questions. Using a qualitative approach, we identified the themes of these conversations. Biologists in our study used Bloom's Taxonomy to logically analyze questions. However, biologists were also concerned with question difficulty, the length of time required for students to address questions, and students’ experience with questions. Finally, some biologists demonstrated an assumption that questions should have one correct answer, not multiple reasonable solutions; this assumption undermined their comfort with some higher-order cognition questions. We generated a framework for further research that provides an interpretation of participants’ ideas about higher-order questions and a model of the relationships among these ideas. Two hypotheses emerge from this framework. First, we propose that biologists look for ways to measure difficulty when writing higher-order questions. Second, we propose that biologists’ assumptions about the role of questions in student learning strongly influence the types of higher-order questions they write. PMID:23463228
On the derivation of approximations to cellular automata models and the assumption of independence.
Davies, K J; Green, J E F; Bean, N G; Binder, B J; Ross, J V
2014-07-01
Cellular automata are discrete agent-based models, generally used in cell-based applications. There is much interest in obtaining continuum models that describe the mean behaviour of the agents in these models. Previously, continuum models have been derived for agents undergoing motility and proliferation processes, however, these models only hold under restricted conditions. In order to narrow down the reason for these restrictions, we explore three possible sources of error in deriving the model. These sources are the choice of limiting arguments, the use of a discrete-time model as opposed to a continuous-time model and the assumption of independence between the state of sites. We present a rigorous analysis in order to gain a greater understanding of the significance of these three issues. By finding a limiting regime that accurately approximates the conservation equation for the cellular automata, we are able to conclude that the inaccuracy between our approximation and the cellular automata is completely based on the assumption of independence. Copyright © 2014 Elsevier Inc. All rights reserved.
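Under the independence assumption the analysis isolates, the expected site occupancy C(t) of a proliferation-only cellular automaton closes into a logistic equation, because the probability that a neighbouring target site is empty is factorized as 1 - C (a standard mean-field step, shown here for context):

```latex
\frac{dC}{dt} = \lambda\, C\,(1 - C)
```

where \(\lambda\) is the per-agent proliferation rate; spatial correlations between occupied sites make the true occupancy deviate from this logistic curve, which is precisely the discrepancy traced above to the independence assumption.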
Fiero, Mallorie H; Hsu, Chiu-Hsieh; Bell, Melanie L
2017-11-20
We extend the pattern-mixture approach to handle missing continuous outcome data in longitudinal cluster randomized trials, which randomize groups of individuals to treatment arms, rather than the individuals themselves. Individuals who drop out at the same time point are grouped into the same dropout pattern. We approach extrapolation of the pattern-mixture model by applying multilevel multiple imputation, which imputes missing values while appropriately accounting for the hierarchical data structure found in cluster randomized trials. To assess parameters of interest under various missing data assumptions, imputed values are multiplied by a sensitivity parameter, k, which increases or decreases imputed values. Using simulated data, we show that estimates of parameters of interest can vary widely under differing missing data assumptions. We conduct a sensitivity analysis using real data from a cluster randomized trial by increasing k until the treatment effect inference changes. By performing a sensitivity analysis for missing data, researchers can assess whether certain missing data assumptions are reasonable for their cluster randomized trial. Copyright © 2017 John Wiley & Sons, Ltd.
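A stripped-down version of the sensitivity scheme can be sketched as follows (a single-imputation stand-in, not the paper's multilevel multiple imputation; the data, missingness pattern, and k values are invented for illustration):

```python
import statistics

def delta_adjusted_effect(y, arm, observed, k):
    """Impute each missing outcome with its arm's observed mean, multiply
    the imputed value by the sensitivity parameter k, and return the
    difference in arm means."""
    arm_means = {a: statistics.mean(v for v, g, o in zip(y, arm, observed)
                                    if g == a and o) for a in (0, 1)}
    completed = [v if o else k * arm_means[g]
                 for v, g, o in zip(y, arm, observed)]
    t1 = statistics.mean(v for v, g in zip(completed, arm) if g == 1)
    t0 = statistics.mean(v for v, g in zip(completed, arm) if g == 0)
    return t1 - t0

# Toy trial: 20 control and 20 treated outcomes, roughly a third missing.
y = [10 + (i % 5) for i in range(20)] + [12 + (i % 5) for i in range(20)]
arm = [0] * 20 + [1] * 20
observed = [i % 3 != 0 for i in range(40)]
effects = {k: delta_adjusted_effect(y, arm, observed, k)
           for k in (1.0, 0.9, 0.8)}
# Shrinking k scales imputed values down in both arms; because the treated
# mean is larger, the estimated treatment effect shrinks with k.
```

Varying k until the inference changes, as in the paper, then asks whether the k at which the conclusion flips corresponds to a plausible missing-data mechanism.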
A capture-recapture survival analysis model for radio-tagged animals
Pollock, K.H.; Bunck, C.M.; Winterstein, S.R.; Chen, C.-L.; North, P.M.; Nichols, J.D.
1995-01-01
In recent years, survival analysis of radio-tagged animals has developed using methods based on the Kaplan-Meier method used in medical and engineering applications (Pollock et al., 1989a,b). An important assumption of this approach is that all tagged animals with a functioning radio can be relocated at each sampling time with probability 1. This assumption may not always be reasonable in practice. In this paper, we show how a general capture-recapture model can be derived which allows for some probability (less than one) for animals to be relocated. This model is not simply a Jolly-Seber model because it is possible to relocate both dead and live animals, unlike when traditional tagging is used. The model can also be viewed as a generalization of the Kaplan-Meier procedure, thus linking the Jolly-Seber and Kaplan-Meier approaches to survival estimation. We present maximum likelihood estimators and discuss testing between submodels. We also discuss model assumptions and their validity in practice. An example is presented based on canvasback data collected by G. M. Haramis of Patuxent Wildlife Research Center, Laurel, Maryland, USA.
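For reference, the ordinary Kaplan-Meier product-limit estimator that the paper generalizes can be written in a few lines; it embeds exactly the assumption discussed above, namely that every animal with a working radio is relocated with probability 1 (the example data are invented):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate: events[i] is True for a death at
    times[i], False for a censored animal (e.g. radio failure), assuming
    every surviving animal is relocated (detection probability 1)."""
    surv, s = [], 1.0
    at_risk = len(times)
    for t in sorted(set(times)):
        deaths = sum(1 for ti, e in zip(times, events) if ti == t and e)
        if deaths:
            s *= 1.0 - deaths / at_risk
        surv.append((t, s))
        at_risk -= sum(1 for ti in times if ti == t)
    return surv

# Five animals: deaths at weeks 2, 3, 5; censoring (lost radios) at 3, 4.
estimate = kaplan_meier([2, 3, 3, 4, 5], [True, True, False, False, True])
# Survival drops to 0.8 after week 2, ~0.6 after week 3, and 0 at week 5.
```

The capture-recapture model in the paper relaxes the detection-probability-1 assumption by estimating relocation probabilities jointly with survival.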
van Trijffel, Emiel; Plochg, Thomas; van Hartingsveld, Frank; Lucas, Cees; Oostendorp, Rob A B
2010-06-01
Passive intervertebral motion (PIVM) assessment is a characterizing skill of manual physical therapists (MPTs) and is important for judgments about impairments in spinal joint function. It is not known why and how MPTs use this mobility testing of spinal motion segments within their clinical reasoning and decision-making. This qualitative study aimed to explore and understand the role and position of PIVM assessment within the manual diagnostic process. Eight semistructured individual interviews with expert MPTs and three subsequent group interviews using manual physical therapy consultation platforms were conducted. Line-by-line coding was performed on the transcribed data, and final main themes were identified from subcategories. Three researchers were involved in the analysis process. Four themes emerged from the data: contextuality, consistency, impairment orientedness, and subjectivity. These themes were interrelated and linked to concepts of professionalism and clinical reasoning. MPTs used PIVM assessment within a multidimensional, biopsychosocial framework incorporating clinical data relating to mechanical dysfunction as well as to personal factors while applying various clinical reasoning strategies. Interpretation of PIVM assessment and subsequent decisions on manipulative treatment were strongly rooted within practitioners' practical knowledge. This study has identified the specific role and position of PIVM assessment as related to other clinical findings within clinical reasoning and decision-making in manual physical therapy in The Netherlands. We recommend future research in manual diagnostics to account for the multivariable character of physical examination of the spine.
Examining the Relationship of Scientific Reasoning with Physics Problem Solving
ERIC Educational Resources Information Center
Fabby, Carol; Koenig, Kathleen
2015-01-01
Recent research suggests students with more formal reasoning patterns are more proficient learners. However, little research has been done to establish a relationship between scientific reasoning and problem solving abilities by novices. In this exploratory study, we compared scientific reasoning abilities of students enrolled in a college level…
Quasineutral plasma expansion into infinite vacuum as a model for parallel ELM transport
NASA Astrophysics Data System (ADS)
Moulton, D.; Ghendrih, Ph; Fundamenski, W.; Manfredi, G.; Tskhakaya, D.
2013-08-01
An analytic solution for the expansion of a plasma into vacuum is assessed for its relevance to the parallel transport of edge localized mode (ELM) filaments along field lines. This solution solves the 1D1V Vlasov-Poisson equations for the adiabatic (instantaneous source), collisionless expansion of a Gaussian plasma bunch into an infinite space in the quasineutral limit. The quasineutral assumption is found to hold as long as λD0/σ0 ≲ 0.01 (where λD0 is the initial Debye length at peak density and σ0 is the parallel length of the Gaussian filament), a condition that is physically realistic. The inclusion of a boundary at x = L and consequent formation of a target sheath is found to have a negligible effect when L/σ0 ≳ 5, a condition that is physically plausible. Under the same condition, the target flux densities predicted by the analytic solution are well approximated by the ‘free-streaming’ equations used in previous experimental studies, strengthening the notion that these simple equations are physically reasonable. Importantly, the analytic solution predicts a zero heat flux density so that a fluid approach to the problem can be used equally well, at least when the source is instantaneous. It is found that, even for JET-like pedestal parameters, collisions can affect the expansion dynamics via electron temperature isotropization, although this is probably a secondary effect. Finally, the effect of a finite duration, τsrc, for the plasma source is investigated. As is found for an instantaneous source, when L/σ0 ≳ 5 the presence of a target sheath has a negligible effect, at least up to the explored range of τsrc = L/cs (where cs is the sound speed at the initial temperature).
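The free-streaming picture invoked above is easy to reproduce with a Monte Carlo sketch (a schematic 1D model with arbitrary normalized parameters, not the paper's Vlasov-Poisson solution):

```python
import random

random.seed(2)

def freestream_arrivals(n=100_000, sigma0=1.0, v_th=1.0, L=5.0):
    """Collisionless 1D expansion: particles start with Gaussian positions
    (std sigma0) and Maxwellian velocities (std v_th); one moving towards
    the target at x = L arrives at time t = (L - x) / v."""
    times = []
    for _ in range(n):
        x = random.gauss(0.0, sigma0)
        v = random.gauss(0.0, v_th)
        if v > 0 and x < L:
            times.append((L - x) / v)
    return times

times = freestream_arrivals()
# About half the particles head towards the target; the arrival-time
# distribution rises sharply and then decays in a long tail, the
# qualitative shape of free-streaming target flux densities.
```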
Wheatley, Catherine M; Davies, Emma L; Dawes, Helen
2018-03-01
The health benefits of exercise in school are recognized, yet physical activity continues to decline during early adolescence despite numerous interventions. In this study, we investigated whether the prototype willingness model, an account of adolescent decision making that includes both reasoned behavioral choices and unplanned responses to social environments, might improve understanding of physical activity in school. We conducted focus groups with British pupils aged 12 to 13 years and used deductive thematic analysis to search for themes relating to the model. Participants described reasoned decisions about physical activity outside school and unplanned choices to be inactive during break, in response to social contexts described as more "judgmental" than in primary school. Social contexts appeared characterized by anxiety about competence, negative peer evaluation, and inactive playground norms. The prototype willingness model might more fully explain physical activity in school than reasoned behavioral models alone, indicating potential for interventions targeting anxieties about playground social environments.
To nap, perchance to DREAM: A factor analysis of college students’ self-reported reasons for napping
Duggan, Katherine A.; McDevitt, Elizabeth A.; Whitehurst, Lauren N.; Mednick, Sara C.
2017-01-01
Although napping has received attention because of its associations with health and use as a method to understand the function of sleep, to our knowledge no study has systematically and statistically assessed reasons for napping. Using factor analysis, we determined the underlying structure of reasons for napping in diverse undergraduates (N=430, 59% female) and examined their relationships with self-reported sleep, psychological, and physical health. The 5 reasons for napping can be summarized using the acronym DREAM (Dysregulative, Restorative, Emotional, Appetitive, and Mindful). Only Emotional reasons for napping were uniformly related to lower well-being. The use of factor analysis raises possibilities for future research, including examining the stability, structure, and psychological and physical health processes related to napping throughout the lifespan. PMID:27347727
Roth, Bradley J.
2002-09-01
Insidious experimental artifacts and invalid theoretical assumptions complicate the comparison of numerical predictions and observed data. Such difficulties are particularly troublesome when studying electrical stimulation of the heart. During unipolar stimulation of cardiac tissue, the artifacts include nonlinearity of membrane dyes, optical signals blocked by the stimulating electrode, averaging of optical signals with depth, lateral averaging of optical signals, limitations of the current source, and the use of excitation-contraction uncouplers. The assumptions involve electroporation, membrane models, electrode size, the perfusing bath, incorrect model parameters, the applicability of a continuum model, and tissue damage. Comparisons of theory and experiment during far-field stimulation are limited by many of these same factors, plus artifacts from plunge and epicardial recording electrodes and assumptions about the fiber angle at an insulating boundary. These pitfalls must be overcome in order to understand quantitatively how the heart responds to an electrical stimulus. (c) 2002 American Institute of Physics.
Galaxy Selection and the Surface Brightness Distribution
NASA Astrophysics Data System (ADS)
McGaugh, Stacy S.; Bothun, Gregory D.; Schombert, James M.
1995-08-01
Optical surveys for galaxies are biased against the inclusion of low surface brightness (LSB) galaxies. Disney [Nature, 263, 573 (1976)] suggested that the constancy of disk central surface brightness noticed by Freeman [ApJ, 160, 811 (1970)] was not a physical result, but instead was an artifact of sample selection. Since LSB galaxies do exist, the pertinent and still controversial issue is whether these newly discovered galaxies constitute a significant percentage of the general galaxy population. In this paper, we address this issue by determining the space density of galaxies as a function of disk central surface brightness. Using the physically reasonable assumption (which is motivated by the data) that central surface brightness is independent of disk scale length, we arrive at a distribution which is roughly flat (i.e., approximately equal numbers of galaxies at each surface brightness) faintwards of the Freeman (1970) value. Brightwards of this, we find a sharp decline in the distribution which is analogous to the turn-down in the luminosity function at L*. An intrinsically sharply peaked "Freeman law" distribution can be completely ruled out, and no Gaussian distribution can fit the data. Low surface brightness galaxies (those with central surface brightness fainter than 22 B mag arcsec^-2) comprise >~ 1/2 of the general galaxy population, so a representative sample of galaxies at z = 0 does not really exist at present, since past surveys have been insensitive to this component of the general galaxy population.
Analytical one-dimensional model for laser-induced ultrasound in planar optically absorbing layer.
Svanström, Erika; Linder, Tomas; Löfqvist, Torbjörn
2014-03-01
Ultrasound generated by laser-based photoacoustic principles is in common use today, with applications in biomedical diagnostics, non-destructive testing, and materials characterisation. For certain measurement applications it could be beneficial to shape the generated ultrasound with regard to spectral properties and temporal profile. To address this, we studied the generation and propagation of laser-induced ultrasound in a planar, layered structure. We derived an analytical expression for the induced pressure wave, including different physical and optical properties of each layer. A Laplace transform approach was employed in analytically solving the resulting set of photoacoustic wave equations. The results correspond to simulations and were compared with experimental results. To enable the comparison between recorded voltage from the experiments and the calculated pressure, we employed a system identification procedure based on physical properties of the ultrasonic transducer to convert the calculated acoustic pressure to voltages. We found reasonable agreement between experimentally obtained voltages and the voltages determined from the calculated acoustic pressure for the samples studied. The system identification procedure was, however, found to be unstable, possibly owing to violations of the material isotropy assumptions by film adhesives and coatings in the experiment. The presented analytical model can serve as a basis when addressing the inverse problem of shaping an acoustic pulse from absorption of a laser pulse in a planar layered structure of elastic materials. Copyright © 2013 Elsevier B.V. All rights reserved.
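The simplest limiting case of such a model, a single absorbing layer in an acoustically uniform lossless medium, can be sketched with a Beer-Lambert initial pressure and a d'Alembert split (illustrative only; the paper's layered solution with distinct properties per layer is more involved, and all parameter values here are assumed):

```python
import math

def initial_pressure(z, mu_a=100.0, grueneisen=1.0, fluence=1.0,
                     thickness=0.01):
    """Initial photoacoustic pressure in one absorbing layer:
    p0(z) = Gamma * mu_a * F * exp(-mu_a * z) for 0 <= z <= thickness."""
    if 0.0 <= z <= thickness:
        return grueneisen * mu_a * fluence * math.exp(-mu_a * z)
    return 0.0

def pressure(z, t, c=1500.0, **kw):
    """Lossless uniform-medium propagation (d'Alembert): the initial
    profile splits into two half-amplitude counter-propagating waves."""
    return 0.5 * (initial_pressure(z - c * t, **kw)
                  + initial_pressure(z + c * t, **kw))
```

At t = 0 the pressure equals the initial profile; later, a forward-travelling replica of the absorption profile passes any observation point, which is the basic mechanism behind shaping the pulse via the layer structure.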
NASA Astrophysics Data System (ADS)
Feng, Wenqiang; Salgado, Abner J.; Wang, Cheng; Wise, Steven M.
2017-04-01
We describe and analyze preconditioned steepest descent (PSD) solvers for fourth and sixth-order nonlinear elliptic equations that include p-Laplacian terms on periodic domains in 2 and 3 dimensions. The highest and lowest order terms of the equations are constant-coefficient, positive linear operators, which suggests a natural preconditioning strategy. Such nonlinear elliptic equations often arise from time discretization of parabolic equations that model various biological and physical phenomena, in particular, liquid crystals, thin film epitaxial growth and phase transformations. The analyses of the schemes involve the characterization of the strictly convex energies associated with the equations. We first give a general framework for PSD in Hilbert spaces. Based on certain reasonable assumptions on the linear preconditioner, a geometric convergence rate is shown for the nonlinear PSD iteration. We then apply the general theory to the fourth and sixth-order problems of interest, making use of Sobolev embedding and regularity results to confirm the appropriateness of our preconditioners for the regularized p-Laplacian problems. Our results include a sharper theoretical convergence result for p-Laplacian systems compared to what may be found in existing works. We demonstrate rigorously how to apply the theory in the finite dimensional setting using finite difference discretization methods. Numerical simulations for some important physical application problems - including thin film epitaxy with slope selection and the square phase field crystal model - are carried out to verify the efficiency of the scheme.
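The PSD iteration itself is simple to sketch on a toy strictly convex energy with a diagonal "constant-coefficient" linear part (a per-component model problem, not the fourth/sixth-order PDE setting; the preconditioner shift of +3 is an arbitrary illustrative choice):

```python
def psd_minimize(a, f, iters=200):
    """Preconditioned steepest descent for the strictly convex energy
    E(u) = sum_i (a_i u_i^2 / 2 + u_i^4 / 4 - f_i u_i), with a_i > 0.
    The preconditioner is the constant-coefficient operator L_i = a_i + 3."""
    n = len(a)
    u = [0.0] * n

    def grad(u):
        return [a[i] * u[i] + u[i] ** 3 - f[i] for i in range(n)]

    def energy(u):
        return sum(a[i] * u[i] ** 2 / 2 + u[i] ** 4 / 4 - f[i] * u[i]
                   for i in range(n))

    for _ in range(iters):
        g = grad(u)
        d = [-g[i] / (a[i] + 3.0) for i in range(n)]  # preconditioned direction
        alpha, e0 = 1.0, energy(u)
        while (energy([u[i] + alpha * d[i] for i in range(n)]) > e0
               and alpha > 1e-12):
            alpha *= 0.5  # backtracking line search
        u = [u[i] + alpha * d[i] for i in range(n)]
    return u

a, f = [1.0, 2.0, 4.0], [1.0, 1.0, 8.0]
u = psd_minimize(a, f)
# At the minimiser the gradient a_i * u_i + u_i^3 - f_i vanishes, and the
# error contracts geometrically, echoing the paper's convergence theory.
```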
Preprocessing Inconsistent Linear System for a Meaningful Least Squares Solution
NASA Technical Reports Server (NTRS)
Sen, Syamal K.; Shaykhian, Gholam Ali
2011-01-01
Mathematical models of many physical/statistical problems are systems of linear equations. Due to measurement and possible human errors/mistakes in modeling/data, as well as to certain assumptions made to reduce complexity, inconsistency (contradiction) is injected into the model, viz. the linear system. While any inconsistent system, irrespective of its degree of inconsistency, always has a least-squares solution, one needs to check whether an equation is too inconsistent or, equivalently, too contradictory. Such an equation will affect/distort the least-squares solution to an extent that renders it unacceptable/unfit for use in a real-world application. We propose an algorithm which (i) prunes numerically redundant linear equations from the system, as these add no new information to the model, (ii) detects contradictory linear equations along with their degree of contradiction (inconsistency index), (iii) removes those equations presumed to be too contradictory, and then (iv) obtains the minimum-norm least-squares solution of the acceptably inconsistent reduced linear system. The algorithm, presented in Matlab, reduces the computational and storage complexities and also improves the accuracy of the solution. It also provides the necessary warning about the existence of too much contradiction in the model. In addition, we suggest a thorough relook into the mathematical modeling to determine why unacceptable contradiction has occurred, thus prompting us to make necessary corrections/modifications to the models - both mathematical and, if necessary, physical.
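A toy version of steps (ii)-(iv) can be sketched in a few lines (Python rather than the paper's Matlab; two unknowns via the normal equations, with the largest residual serving as a crude inconsistency index, whereas the paper's actual algorithm and index are more refined):

```python
def lstsq_2var(rows):
    """Least-squares solution of equations a*x + b*y = c in two unknowns
    via the normal equations (adequate for a tiny demo; prefer QR/SVD)."""
    sxx = sum(a * a for a, b, c in rows)
    sxy = sum(a * b for a, b, c in rows)
    syy = sum(b * b for a, b, c in rows)
    sxc = sum(a * c for a, b, c in rows)
    syc = sum(b * c for a, b, c in rows)
    det = sxx * syy - sxy * sxy
    return ((syy * sxc - sxy * syc) / det,
            (sxx * syc - sxy * sxc) / det)

def prune_contradictory(rows, tol=1.0):
    """Repeatedly remove the equation with the largest residual (a crude
    inconsistency index) until all residuals fall below tol, then return
    the solution of the acceptably inconsistent reduced system."""
    rows = list(rows)
    while len(rows) > 2:
        x, y = lstsq_2var(rows)
        residuals = [abs(a * x + b * y - c) for a, b, c in rows]
        worst = max(range(len(rows)), key=residuals.__getitem__)
        if residuals[worst] <= tol:
            break
        rows.pop(worst)
    return lstsq_2var(rows), rows

# The last equation contradicts the (consistent) first three.
rows = [(1, 0, 1), (0, 1, 2), (1, 1, 3), (1, 1, 30)]
(x, y), kept = prune_contradictory(rows)
# After the contradictory equation is removed, the solution is (1, 2).
```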
The diagnostic capability of iron lines
NASA Astrophysics Data System (ADS)
Giannini, Teresa; Nisini, Brunella; Antoniucci, Simone; Alcala, Juan; Bacciotti, Francesca; Bonito, Rosaria; Podio, Linda; Stelzer, Beate; Whelan, Emma
2013-07-01
We present the VLT/X-Shooter spectra of two jets from young protostars of different luminosity and mass, ESO-Halpha 574 and Par-Lup 3-4. In the covered spectral range (350-2500 nm) we detected more than 100 [FeII] and [FeIII] lines, which are used to precisely probe the key physical parameters of the gas (electron density and temperature, ionization degree, visual extinction). These quantities have been compared with shock-model predictions, which suggest that only the higher-luminosity source (ESO-Halpha 574) is able to drive a high-velocity, dissociative shock. The diagnostic capability of iron, demonstrated on the presented objects, represents a unique tool for the following reasons: 1) the large number of lines in the UV-infrared range makes it possible to trace the physical conditions over a very large range of the parameter space; 2) unlike diagnostics commonly performed with other species, such as oxygen, nitrogen, and sulphur, no assumption about relative abundances is needed, since all the parameters are derived from line ratios of a single species; 3) in the unperturbed ISM, iron is locked on the grain surfaces, while it is released into the gas phase if gas-grain or grain-grain collisions occur within a shock. The iron abundance (derivable from ratios of iron lines to those of other, volatile species) is therefore a direct probe of the presence of dust in the jet beam, information crucial to understanding whether jets originate close to the star or in the circumstellar disk.
NASA Astrophysics Data System (ADS)
Szajewski, B. A.; Hunter, A.; Luscher, D. J.; Beyerlein, I. J.
2018-01-01
Both theoretical and numerical models of dislocations often necessitate the assumption of elastic isotropy to retain analytical tractability and to reduce computational load. As dislocation-based models evolve towards physically realistic material descriptions, the assumption of elastic isotropy becomes increasingly worthy of examination. We present an analytical dislocation model for calculating the full dissociated core structure of dislocations within anisotropic face-centered cubic (FCC) crystals as a function of the degree of material elastic anisotropy, two misfit energy densities on the γ-surface (γ_isf, γ_usf), and the remaining elastic constants. Our solution is independent of any additional features of the γ-surface. Towards this pursuit, we first demonstrate that the dependence of the anisotropic elasticity tensor on the orientation of the dislocation line within the FCC crystalline lattice is small and may reasonably be neglected for typical materials. With this approximation, explicit analytic solutions for the anisotropic elasticity tensor B for both nominally edge and screw dislocations within an FCC crystalline lattice are devised, and employed towards defining a set of effective isotropic elastic constants which reproduce fully anisotropic results but do not retain the bulk modulus. Conversely, Hill-averaged elastic constants, which both retain the bulk modulus and reasonably approximate the dislocation core structure, are employed within subsequent numerical calculations. We examine a wide range of materials within this study, and the features of each partial dislocation core are sufficiently localized that application of discrete linear elasticity accurately describes the separation of the partial dislocation cores. In addition, the local features (the partial dislocation core distribution) are well described by a Peierls-Nabarro dislocation model.
We develop a model for the displacement profile which depends upon two disparate dislocation length scales describing the core structure: (i) the equilibrium stacking fault width between two Shockley partial dislocations, R_eq, and (ii) the maximum slip gradient, χ, of each Shockley partial dislocation. We demonstrate excellent agreement between our analytic predictions, numerical calculations, and values of R_eq computed directly by both ab initio and molecular statics methods found elsewhere in the literature. The results suggest that understanding of various plastic mechanisms, e.g., cross-slip and nucleation, may be augmented by the inclusion of elastic anisotropy.
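The two length scales can be visualized with the classical Peierls-Nabarro arctangent profile for two partials (a textbook form, not the authors' anisotropic solution; b, R_eq, and the half-width zeta are in arbitrary units):

```python
import math

def dissociated_profile(x, b=1.0, r_eq=6.0, zeta=1.0):
    """Slip across the glide plane for a dislocation dissociated into two
    Shockley partials (Burgers vector b/2 each) centred at +/- r_eq/2,
    each carrying a Peierls-Nabarro arctan profile of half-width zeta."""
    return (b / (2 * math.pi)) * (math.atan((x + r_eq / 2) / zeta)
                                  + math.atan((x - r_eq / 2) / zeta)) + b / 2

def max_slip_gradient(b=1.0, r_eq=6.0, zeta=1.0, h=1e-4):
    """Central-difference estimate of chi near a partial core (x = r_eq/2,
    where the slip gradient peaks for well-separated partials)."""
    x = r_eq / 2
    return (dissociated_profile(x + h, b, r_eq, zeta)
            - dissociated_profile(x - h, b, r_eq, zeta)) / (2 * h)

# Total slip rises from 0 to b; for well-separated partials, chi is close
# to the isolated-partial value (b/2) / (pi * zeta).
```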
ERIC Educational Resources Information Center
Goldhaber, Dan; Hansen, Michael
2010-01-01
Reforming teacher tenure is an idea that appears to be gaining traction with the underlying assumption being that one can infer to a reasonable degree how well a teacher will perform over her career based on estimates of her early-career effectiveness. Here we explore the potential for using value-added models to estimate performance and inform…
Risk and value analysis of SETI
NASA Technical Reports Server (NTRS)
Billingham, J.
1990-01-01
This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.
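The utility-theory comparison reduces to a one-line computation; the numbers below are purely illustrative placeholders, not the paper's actual probabilities, values, or costs:

```python
def expected_utility(p_success, value_if_success, cost):
    """Risk-neutral expected utility: E[U] = p * V - C."""
    return p_success * value_if_success - cost

# Made-up figures in arbitrary units, solely to show the structure of the
# argument: a low-probability, very-high-value search can compare well
# with a higher-probability, lower-value experiment.
seti = expected_utility(p_success=0.01, value_if_success=10_000, cost=50)
viking_bio = expected_utility(p_success=0.05, value_if_success=4_000, cost=100)
```

Under any such made-up figures, the comparison hinges entirely on the assumed probability of success and the value placed on a positive result, which is why the paper leans on the Viking analogy rather than on absolute estimates.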
Low-energy multiple rendezvous of main belt asteroids
NASA Technical Reports Server (NTRS)
Penzo, Paul A.; Bender, David F.
1992-01-01
An approach to multiple asteroid rendezvous missions in the main belt region is proposed. In this approach, key information, consisting of a launch date and delta-V, can be generated for all possible pairs of asteroids satisfying specific constraints. This information is made available in a computer file for 1000 numbered asteroids, with reasonable assumptions, limitations, and approximations used to limit the computer requirements and the size of the data file.
An Evaluation of a Combat Conditioning Trial Program
2008-11-20
rests on the weaker assumption that the pretest is correlated with the posttest score. The second reason was that ANCOVA would yield training effects...performance. (c) Significant Unit x Time interaction in ANCOVA. Note. Pretest performance was a significant (p < .001) predictor of posttest performance for... Pretest fitness level was defined by splitting the pretest distribution for CV into quartiles. Posttest performance improved 7.2% for participants in
Modeling of air pollution from the power plant ash dumps
NASA Astrophysics Data System (ADS)
Aleksic, Nenad M.; Balać, Nedeljko
A simple model of air pollution from power plant ash dumps is presented, with emission rates calculated from the Bagnold formula and transport simulated by an ATDL-type model. Moisture effects are accounted for by the assumption that there is no pollution on rain days. Annual mean daily sedimentation rates, calculated for the area around the 'Nikola Tesla' power plants near Belgrade for 1987, show reasonably good agreement with observations.
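Under these modelling choices the emission step is a one-formula calculation (sketch only; the constant C, grain sizes, and the omission of a threshold friction velocity are simplifying assumptions, and the ATDL transport step is not reproduced):

```python
import math

def bagnold_flux(u_star, grain_d=0.00025, c=1.8,
                 rho_air=1.22, g=9.81, d_ref=0.00025):
    """Bagnold saltation mass flux q = C * sqrt(d/D) * (rho/g) * u*^3
    (kg m^-1 s^-1); C depends on sorting, D is a reference grain size."""
    return c * math.sqrt(grain_d / d_ref) * (rho_air / g) * u_star ** 3

def annual_mean_emission(u_star_daily, rain_day):
    """Daily fluxes with the model's moisture rule: zero on rain days."""
    daily = [0.0 if wet else bagnold_flux(u)
             for u, wet in zip(u_star_daily, rain_day)]
    return sum(daily) / len(daily)

q = bagnold_flux(0.5)                                     # u* = 0.5 m/s
mean_q = annual_mean_emission([0.5, 0.5], [False, True])  # one rain day
```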
Reasoning with Incomplete and Uncertain Information
1991-08-01
are rationally compatible (just as is the case in the fundamental computational mechanisms of truth maintenance systems). The logics we construct will...complete, precise, and unvarying. This fundamental assumption is a principal source of the limitation of many diagnostic systems to single fault diagnoses...
The EPR paradox, Bell's inequality, and the question of locality
NASA Astrophysics Data System (ADS)
Blaylock, Guy
2010-01-01
Most physicists agree that the Einstein-Podolsky-Rosen-Bell paradox exemplifies much of the strange behavior of quantum mechanics, but argument persists about what assumptions underlie the paradox. To clarify what the debate is about, we employ a simple and well-known thought experiment involving two correlated photons to help us focus on the logical assumptions needed to construct the EPR and Bell arguments. The view presented in this paper is that the minimal assumptions behind Bell's inequality are locality and counterfactual definiteness but not scientific realism, determinism, or hidden variables as are often suggested. We further examine the resulting constraints on physical theory with an illustration from the many-worlds interpretation of quantum mechanics—an interpretation that we argue is deterministic, local, and realist but that nonetheless violates the Bell inequality.
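The violation discussed above can be made concrete with a short numeric sketch of the CHSH form of Bell's inequality. The singlet-state correlation E(a, b) = -cos(a - b) and the standard optimal angle choice below are textbook material, not details taken from the paper itself.

```python
import math

def correlation(a, b):
    """Quantum correlation E(a, b) for two spin-1/2 particles in the
    singlet state, measured along directions at angles a, b (radians)."""
    return -math.cos(a - b)

def chsh(a, a2, b, b2):
    """CHSH combination S. Any theory obeying locality and
    counterfactual definiteness must satisfy |S| <= 2."""
    return (correlation(a, b) - correlation(a, b2)
            + correlation(a2, b) + correlation(a2, b2))

# Standard angle choice maximizing the quantum-mechanical violation.
S = chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
print(abs(S))  # 2*sqrt(2) ≈ 2.828 > 2: the inequality is violated
```

Quantum mechanics thus exceeds the classical bound of 2, reaching the Tsirelson bound 2√2 at these angles.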
The momentum of an electromagnetic wave inside a dielectric derived from the Snell refraction law
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torchigin, V.P., E-mail: v_torchigin@mail.ru; Torchigin, A.V.
2014-12-15
The author of the paper [M. Testa, Ann. Physics 336 (2013) 1] derived a connection between the Snell refraction law and the Abraham form of the momentum of light in matter. In other words, the author derived the Snell law on the assumption that the momentum of light in matter decreases by a factor of n compared with that in free space. The conclusion is derived under the assumption that the forces exerted on an optical medium by an electromagnetic field do not distinguish between polarization and free charges. We show that, on the contrary, the Minkowski form of the momentum of light in matter follows directly from the Snell law, with no prior assumption required for this purpose.
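The direction of the argument can be sketched with the standard textbook relation between Snell's law and photon momentum; this is a generic illustration of why Snell's law favors the Minkowski form, not a reproduction of the paper's derivation.

```latex
\begin{align}
\text{Snell's law:}\quad & n_1 \sin\theta_1 = n_2 \sin\theta_2, \\
\text{tangential momentum conserved:}\quad & p_1 \sin\theta_1 = p_2 \sin\theta_2.
\end{align}
```

Both relations hold simultaneously for a photon of energy \(\hbar\omega\) only if \(p \propto n\), i.e., the Minkowski momentum \(p_M = n\hbar\omega/c\). The Abraham momentum \(p_A = \hbar\omega/(nc)\) would instead require \(\sin\theta\) to scale inversely with \(n\), contradicting Snell's law unless additional medium forces are invoked.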
A lattice Boltzmann model for the Burgers-Fisher equation.
Zhang, Jianying; Yan, Guangwu
2010-06-01
A lattice Boltzmann model is developed for the one- and two-dimensional Burgers-Fisher equation, based on the method of higher-order moments of the equilibrium distribution functions and a series of partial differential equations on different time scales. To obtain the two-dimensional Burgers-Fisher equation, the vector sigma(j) is used. To overcome the drawback of "error rebound," a new assumption on the additional distribution is presented, in which two additional terms, of first and second order respectively, are used. Comparisons with results obtained by other methods show that the numerical solutions obtained by the proposed method converge to the exact solutions, and the model under the new assumption gives better results than the model with the second-order assumption alone. (c) 2010 American Institute of Physics.
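For reference, a commonly used generalized form of the Burgers-Fisher equation is given below; the abstract does not state the exact form solved, so this is the standard one-dimensional version from the literature.

```latex
\frac{\partial u}{\partial t}
  + \alpha\, u^{\delta}\,\frac{\partial u}{\partial x}
  = \frac{\partial^{2} u}{\partial x^{2}}
  + \beta\, u \left(1 - u^{\delta}\right)
```

With \(\delta = 1\) this reduces to the classical Burgers-Fisher equation, combining Burgers-type nonlinear advection (\(\beta = 0\)) with Fisher-type logistic reaction (\(\alpha = 0\)).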
Quality of reporting statistics in two Indian pharmacology journals.
Jaykaran; Yadav, Preeti
2011-04-01
To evaluate the reporting of statistical methods in articles published in two Indian pharmacology journals, all original articles published since 2002 were downloaded from the journals' websites (Indian Journal of Pharmacology (IJP) and Indian Journal of Physiology and Pharmacology (IJPP)). These articles were evaluated for the appropriateness of their descriptive and inferential statistics. Descriptive statistics were assessed on the basis of the reported method of description and measures of central tendency; inferential statistics were assessed on whether the assumptions of the statistical methods were satisfied and whether the tests used were appropriate. Values are reported as frequencies, percentages, and 95% confidence intervals (CI) around the percentages. Inappropriate descriptive statistics were observed in 150 (78.1%, 95% CI 71.7-83.3%) articles; the most common reason was the use of mean ± SEM in place of "mean (SD)" or "mean ± SD." The most commonly used statistical method was one-way ANOVA (58.4%). Information on checking the assumptions of a statistical test was mentioned in only two articles. An inappropriate statistical test was observed in 61 (31.7%, 95% CI 25.6-38.6%) articles, most commonly the use of a two-group test for three or more groups. Articles published in these two Indian pharmacology journals are not free of statistical errors.
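The most common error found, reporting mean ± SEM in place of mean ± SD, can be made concrete with a short sketch on illustrative data (the numbers are invented for demonstration). SEM = SD/√n describes the precision of the estimated mean, not the spread of the observations, and shrinks as the sample grows.

```python
import math
import statistics

data = [4, 8, 6, 5, 3, 7]  # illustrative measurements, n = 6
n = len(data)

sd = statistics.stdev(data)   # sample SD: variability of the data
sem = sd / math.sqrt(n)       # SEM: uncertainty of the mean estimate

print(f"mean ± SD : {statistics.mean(data):.2f} ± {sd:.2f}")
print(f"mean ± SEM: {statistics.mean(data):.2f} ± {sem:.2f}")
# SEM is smaller than SD by a factor of sqrt(n), so substituting it
# for SD makes the data look less variable than they actually are.
```

This is why guidelines ask authors to state explicitly which of the two is being reported.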
Why you cannot transform your way out of trouble for small counts.
Warton, David I
2018-03-01
While data transformation is a common strategy to satisfy linear modeling assumptions, a theoretical result is used to show that transformation cannot reasonably be expected to stabilize variances for small counts. Under broad assumptions, as counts get smaller, the variance is shown to become proportional to the mean under monotonic transformations g(·) that satisfy g(0) = 0, excepting a few pathological cases. A suggested rule of thumb is that if many predicted counts are less than one, then data transformation cannot reasonably be expected to stabilize variances, even for a well-chosen transformation. This result has clear implications for the analysis of counts as often implemented in the applied sciences, particularly for multivariate analysis in ecology. Multivariate discrete data are often collected in ecology, typically with a large proportion of zeros, and it is currently common to use methods of analysis that account for differences in variance neither across observations nor across responses. Simulations demonstrate that failure to account for the mean-variance relationship can have particularly severe consequences in this context, and also in the univariate context if the sampling design is unbalanced. © 2017 The Authors. Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.
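The failure to stabilize variance at small counts can be demonstrated with a simple simulation. This is a hedged sketch under assumed Poisson sampling with a log(y+1) transformation (one common choice satisfying g(0) = 0), using a pure-Python Knuth sampler rather than any code from the paper.

```python
import math
import random

random.seed(0)

def poisson(lam):
    """Knuth's method for a Poisson variate (adequate for small lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= random.random()
    return k - 1

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# For small mean counts, log1p fails to stabilize the variance:
# the variance of the transformed counts still tracks the mean.
results = {}
for lam in (0.2, 0.5, 1.0):
    transformed = [math.log1p(poisson(lam)) for _ in range(20000)]
    results[lam] = var(transformed)
    print(f"mean count {lam:.1f}: var(log1p(y)) = {results[lam]:.3f}")
```

A variance-stabilizing transformation would produce roughly equal variances across the three means; instead the variance increases with the mean, consistent with the paper's rule of thumb for predicted counts below one.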