ERIC Educational Resources Information Center
Slisko, Josip; Cruz, Adrian Corona
2013-01-01
There is general agreement that critical thinking is an important element of 21st century skills. Although critical thinking is a very complex and controversial conception, many would accept that recognition and evaluation of assumptions is a basic critical-thinking process. When students use a simple mathematical model to reason quantitatively…
ERIC Educational Resources Information Center
Braasch, Jason L. G.; Bråten, Ivar
2017-01-01
Despite the importance of source attention and evaluation for learning from texts, little is known about the particular conditions that encourage sourcing during reading. In this article, basic assumptions of the discrepancy-induced source comprehension (D-ISC) model are presented, which describes the moment-by-moment cognitive processes that…
High Tech Educators Network Evaluation.
ERIC Educational Resources Information Center
O'Shea, Dan
A process evaluation was conducted to assess the High Tech Educators Network's (HTEN's) activities. Four basic components to the evaluation approach were documentation review, program logic model, written survey, and participant interviews. The model mapped the basic goals and objectives, assumptions, activities, outcome expectations, and…
Causality and headache triggers
Turner, Dana P.; Smitherman, Todd A.; Martin, Vincent T.; Penzien, Donald B.; Houle, Timothy T.
2013-01-01
Objective: The objective of this study was to explore the conditions necessary to assign causal status to headache triggers. Background: The term "headache trigger" is commonly used to label any stimulus that is assumed to cause headaches. However, the assumptions required for determining whether a given stimulus in fact has a causal-type relationship in eliciting headaches have not been explicated. Methods: Rubin's Causal Model is synthesized and applied to the context of headache causes. From this application, the conditions necessary to infer that one event (trigger) causes another (headache) are outlined using basic assumptions and examples from the relevant literature. Results: Although many conditions must be satisfied for a causal attribution, three basic assumptions are identified for determining causality in headache triggers: 1) constancy of the sufferer; 2) constancy of the trigger effect; and 3) constancy of the trigger presentation. A valid evaluation of a potential trigger's effect can only be undertaken once these three basic assumptions are satisfied during formal or informal studies of headache triggers. Conclusions: Evaluating these assumptions is extremely difficult or infeasible in clinical practice, and satisfying them during natural experimentation is unlikely. Researchers, practitioners, and headache sufferers are encouraged to avoid natural experimentation to determine the causal effects of headache triggers. Instead, formal experimental designs or retrospective diary studies using advanced statistical modeling techniques provide the best approaches to satisfy the required assumptions and inform causal statements about headache triggers. PMID:23534872
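As a hedged illustration of the potential-outcomes reasoning involved (a toy simulation, not the paper's analysis): when trigger exposure is confounded with another determinant of headache risk, the "natural experiment" comparison of exposed and unexposed days overstates the causal effect. All quantities here are hypothetical.

```python
import random

random.seed(1)

# Potential outcomes for one sufferer: y0 = headache propensity without
# the trigger, y1 = propensity with it. The unit-level causal effect is
# y1 - y0 (Rubin's framework).
days = []
for _ in range(10_000):
    stress = random.random()                 # hypothetical confounder
    y0 = 0.1 + 0.3 * stress                  # baseline risk rises with stress
    y1 = min(1.0, y0 + 0.1)                  # constant true trigger effect
    exposed = random.random() < stress       # stressful days invite the trigger
    days.append((y0, y1, exposed))

true_effect = sum(y1 - y0 for y0, y1, _ in days) / len(days)

# Natural experimentation: compare observed outcomes on exposed versus
# unexposed days. Because exposure tracks stress, the comparison is
# confounded and overstates the trigger's effect.
obs_exposed   = [y1 for y0, y1, e in days if e]
obs_unexposed = [y0 for y0, y1, e in days if not e]
naive = sum(obs_exposed) / len(obs_exposed) - sum(obs_unexposed) / len(obs_unexposed)

print(f"true average causal effect: {true_effect:.3f}")        # ~0.100
print(f"naive natural-experiment estimate: {naive:.3f}")       # inflated, ~0.200
```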
Rocca, Elena; Andersen, Fredrik
2017-08-14
Scientific risk evaluations are constructed from specific evidence, value judgements and biological background assumptions. The latter are the framework-setting suppositions we apply in order to understand some new phenomenon. That background assumptions co-determine choice of methodology, data interpretation, and choice of relevant evidence is an uncontroversial claim in modern basic science. Furthermore, it is commonly accepted that, unless explicated, disagreements in background assumptions can lead to misunderstanding as well as miscommunication. Here, we extend the discussion on background assumptions from basic science to the debate over risk assessment of genetically modified (GM) plants. In this realm, while the different political, social and economic values are often mentioned, the identity and role of the background assumptions at play are rarely examined. We use an example from the debate over risk assessment of stacked genetically modified plants (GM stacks), obtained by applying conventional breeding techniques to GM plants. There are two main regulatory practices for GM stacks: (i) regulate as conventional hybrids and (ii) regulate as new GM plants. We analyzed eight papers representative of these positions and found that, in all cases, additional premises are needed to reach the stated conclusions. We suggest that these premises play the role of biological background assumptions and argue that the most effective way toward a unified framework for risk analysis and regulation of GM stacks is by explicating and examining the biological background assumptions of each position. Once explicated, it is possible to either evaluate which background assumptions best reflect contemporary biological knowledge, or to apply Douglas' 'inductive risk' argument.
Educational Evaluation: Analysis and Responsibility.
ERIC Educational Resources Information Center
Apple, Michael W., Ed.; And Others
This book presents controversial aspects of evaluation and aims at broadening perspectives and insights in the evaluation field. Chapter 1 criticizes modes of evaluation and the basic rationality behind them and focuses on assumptions that have problematic consequences. Chapter 2 introduces concepts of evaluation and examines methods of grading…
Hogan, Thomas J
2012-05-01
The objective was to review recent economic evaluations of influenza vaccination by injection in the US, assess their evidence, and draw conclusions from their collective findings. The literature was searched for economic evaluations of influenza vaccination injection in healthy working adults in the US published since 1995. Ten evaluations described in nine papers were identified. These were synopsized and their results evaluated, the basic structure of all evaluations was ascertained, and the sensitivity of outcomes to changes in parameter values was explored using a decision model. Areas to improve economic evaluations were noted. Eight of nine evaluations with credible economic outcomes were favourable to vaccination, representing a statistically significant result compared with the proportion of 50% that would be expected if vaccination and no vaccination were economically equivalent. Evaluations shared a basic structure, but differed considerably with respect to cost components, assumptions, methods, and parameter estimates. Sensitivity analysis indicated that changes in parameter values within the feasible range, individually or simultaneously, could reverse economic outcomes. Given the stated misgivings, the methods of estimating influenza reduction ascribed to vaccination must be researched to confirm that they produce accurate and reliable estimates. Research is also needed to improve estimates of the costs per case of influenza illness and the costs of vaccination. Based on their assumptions, the reviewed papers collectively appear to support the economic benefits of influenza vaccination of healthy adults. Yet the underlying assumptions, methods and parameter estimates themselves warrant further research to confirm they are accurate, reliable and appropriate to economic evaluation purposes.
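A minimal sketch of the kind of decision-model sensitivity analysis described (parameter values are hypothetical, not those of the reviewed papers): net benefit per vaccinated worker, with one-way variation of the attack rate flipping the sign of the outcome.

```python
# Hypothetical one-way sensitivity analysis: net benefit per vaccinated
# worker = averted illness costs minus vaccination cost.
def net_benefit(attack_rate, vaccine_effectiveness, cost_per_case, vaccination_cost):
    averted = attack_rate * vaccine_effectiveness * cost_per_case
    return averted - vaccination_cost

base = dict(attack_rate=0.07, vaccine_effectiveness=0.70,
            cost_per_case=250.0, vaccination_cost=12.0)
print(f"base case: ${net_benefit(**base):+.2f}")

# Varying one parameter across a feasible range can reverse the economic
# outcome, which is the pattern the review reports.
for ar in (0.02, 0.05, 0.10):
    nb = net_benefit(**{**base, "attack_rate": ar})
    print(f"attack rate {ar:.0%}: net benefit ${nb:+.2f}")
```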
Development and Validation of a Clarinet Performance Adjudication Scale
ERIC Educational Resources Information Center
Abeles, Harold F.
1973-01-01
A basic assumption of this study is that there are generally agreed upon performance standards as evidenced by the use of adjudicators for evaluations at contests and festivals. An evaluation instrument was developed to enable raters to measure effectively those aspects of performance that have common standards of proficiency. (Author/RK)
Graduate Education in Psychology: A Comment on Rogers' Passionate Statement
ERIC Educational Resources Information Center
Brown, Robert C., Jr.; Tedeschi, James T.
1972-01-01
Authors' hope that this critical evaluation can place Carl Rogers' assumptions into perspective; they propose a compromise program meant to satisfy the basic aims of a humanistic psychology program. For Rogers' rejoinder see AA 512 869. (MB)
Trujillo, Caleb; Cooper, Melanie M; Klymkowsky, Michael W
2012-01-01
Biological systems, from the molecular to the ecological, involve dynamic interaction networks. To examine student thinking about networks we used graphical responses, since they are relatively easy to evaluate for implied, but unarticulated, assumptions. Senior college-level molecular biology students were presented with simple molecular-level scenarios; surprisingly, most students failed to articulate the basic assumptions needed to generate reasonable graphical representations, and their graphs often contradicted their explicit assumptions. We then developed a tiered Socratic tutorial characterized by leading questions (prompts) designed to provoke metacognitive reflection. When applied in a group or individual setting, there was clear improvement in targeted areas. Our results highlight the promise of using graphical responses and Socratic prompts in a tutorial context as both a formative assessment for students and an informative feedback system for instructors. Copyright © 2011 Wiley Periodicals, Inc.
Exceptional Children Conference Papers: Behavioral and Emotional Problems.
ERIC Educational Resources Information Center
Council for Exceptional Children, Arlington, VA.
Four of the seven conference papers treating behavioral and emotional problems concern the Conceptual Project, an attempt to provide definition and evaluation of conceptual models of the various theories of emotional disturbance and their basic assumptions, and to provide training packages based on these materials. The project is described in…
Costing interventions in primary care.
Kernick, D
2000-02-01
Against a background of increasing demands on limited resources, studies that relate benefits of health interventions to the resources they consume will be an important part of any decision-making process in primary care, and an accurate assessment of costs will be an important part of any economic evaluation. Although there is no such thing as a gold standard cost estimate, there are a number of basic costing concepts that underlie any costing study. How costs are derived and combined will depend on the assumptions that have been made in their derivation. It is important to be clear what assumptions have been made and why in order to maintain consistency across comparative studies and prevent inappropriate conclusions being drawn. This paper outlines some costing concepts and principles to enable primary care practitioners and researchers to have a basic understanding of costing exercises and their pitfalls.
United States Air Force Agency Financial Report 2014
2014-01-01
45 semester hours in basic sciences and 45 semester hours in humanities and social sciences; this 90 semester hour total comprises 60 percent of the total academic… Test and Evaluation Support $723; F-35 $628; Defense Research Sciences $373; GPS III-Operational Control Segment $373; Long Range Strike Bomber $359… Research, Development, Test & Evaluation; Family Housing & Military Construction; (Less: Earned Revenue); Net Cost before Losses/(Gains) from Actuarial Assumption…
Song, Fujian; Loke, Yoon K; Walsh, Tanya; Glenny, Anne-Marie; Eastwood, Alison J; Altman, Douglas G
2009-04-03
To investigate basic assumptions and other methodological problems in the application of indirect comparison in systematic reviews of competing healthcare interventions. Design: survey of published systematic reviews. Inclusion criteria: systematic reviews published between 2000 and 2007 in which an indirect approach had been explicitly used. Identified reviews were assessed for comprehensiveness of the literature search, method of indirect comparison, and whether assumptions about similarity and consistency were explicitly mentioned. The survey included 88 review reports. In 13 reviews, the indirect comparison was informal. Results from different trials were naively compared without a common control in six reviews. Adjusted indirect comparison was usually done using classic frequentist methods (n=49) or more complex methods (n=18). The key assumption of trial similarity was explicitly mentioned in only 40 of the 88 reviews. The consistency assumption was not explicit in most cases where direct and indirect evidence were compared or combined (18/30). Evidence from head-to-head comparison trials was not systematically searched for, or not included, in nine cases. Identified methodological problems were an unclear understanding of underlying assumptions, inappropriate search and selection of relevant trials, use of inappropriate or flawed methods, lack of objective and validated methods to assess or improve trial similarity, and inadequate comparison or inappropriate combination of direct and indirect evidence. Adequate understanding of the basic assumptions underlying indirect and mixed treatment comparison is crucial to resolving these methodological problems.
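The classic frequentist adjusted indirect comparison the survey refers to is Bucher's method: with a common comparator C, the indirect effect of A versus B is the difference of the two trial effects, and the variances add. A minimal sketch with illustrative numbers (valid only under the similarity assumption the review highlights):

```python
import math

def adjusted_indirect(d_ac, se_ac, d_bc, se_bc):
    """Bucher adjusted indirect comparison of A vs B through a common
    comparator C, on a linear scale such as the log odds ratio."""
    d_ab = d_ac - d_bc
    se_ab = math.sqrt(se_ac**2 + se_bc**2)   # variances of the two estimates add
    ci = (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)
    return d_ab, se_ab, ci

# Hypothetical log odds ratios from two sets of placebo-controlled trials.
d_ab, se_ab, ci = adjusted_indirect(d_ac=-0.50, se_ac=0.15, d_bc=-0.20, se_bc=0.20)
print(f"indirect log OR (A vs B): {d_ab:.2f}, SE {se_ab:.2f}, "
      f"95% CI ({ci[0]:.2f}, {ci[1]:.2f})")
```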
Using LISREL to Evaluate Measurement Models and Scale Reliability.
ERIC Educational Resources Information Center
Fleishman, John; Benson, Jeri
1987-01-01
LISREL program was used to examine measurement model assumptions and to assess reliability of Coopersmith Self-Esteem Inventory for Children, Form B. Data on 722 third-sixth graders from over 70 schools in large urban school district were used. LISREL program assessed (1) nature of basic measurement model for scale, (2) scale invariance across…
ERIC Educational Resources Information Center
Camerer, Rudi
2014-01-01
The testing of intercultural competence has long been regarded as the field of psychometric test procedures, which claim to analyse an individual's personality by specifying and quantifying personality traits with the help of self-answer questionnaires and the statistical evaluation of these. The underlying assumption is that what is analysed and…
ERIC Educational Resources Information Center
Haegele, Justin A.; Hodge, Samuel R.
2015-01-01
Emerging professionals, particularly senior-level undergraduate and graduate students in kinesiology who have an interest in physical education for individuals with and without disabilities, should understand the basic assumptions of the quantitative research paradigm. Knowledge of basic assumptions is critical for conducting, analyzing, and…
1983-05-01
in the presence of fillers or without it. The basic assumption made is that the heat of reaction is proportional to the extent of the reaction… the scanning mechanism will isolate the frequency range falling on the detector; in this manner, the spectrum… There are two basic…
Misleading Theoretical Assumptions in Hypertext/Hypermedia Research.
ERIC Educational Resources Information Center
Tergan, Sigmar-Olaf
1997-01-01
Reviews basic theoretical assumptions of research on learning with hypertext/hypermedia. Focuses on whether the results of research on hypertext/hypermedia-based learning support these assumptions. Results of empirical studies and theoretical analysis reveal that many research approaches have been misled by inappropriate theoretical assumptions on…
ERIC Educational Resources Information Center
Feinberg, Walter
2006-01-01
This essay explores a disciplinary hybrid, called here, philosophical ethnography. Philosophical ethnography is a philosophy of the everyday and ethnography in the context of intercultural discourse about coordinating meaning, evaluation, norms and action. Its basic assumption is that in the affairs of human beings truth, justice and beauty are…
ERIC Educational Resources Information Center
Dimitrova, Radosveta; Ferrer-Wreder, Laura; Galanti, Maria Rosaria
2016-01-01
This study evaluated the factorial structure of the Pedagogical and Social Climate in School (PESOC) questionnaire among 307 teachers in Bulgaria. The teacher edition of PESOC consists of 11 scales (i.e., Expectations for Students, Unity Among Teachers, Approach to Students, Basic Assumptions About Students' Ability to Learn, School-Home…
ERIC Educational Resources Information Center
Cunha, George M.
This Records and Archives Management Programme (RAMP) study is intended to assist in the development of basic training programs and courses in document preservation and restoration, and to promote harmonization of such training both within the archival profession and within the broader information field. Based on the assumption that conservation…
Didactics and History of Mathematics: Knowledge and Self-Knowledge
ERIC Educational Resources Information Center
Fried, Michael N.
2007-01-01
The basic assumption of this paper is that mathematics and history of mathematics are both forms of knowledge and, therefore, represent different ways of knowing. This was also the basic assumption of Fried (2001) who maintained that these ways of knowing imply different conceptual and methodological commitments, which, in turn, lead to a conflict…
Knowledge Discovery from Relations
ERIC Educational Resources Information Center
Guo, Zhen
2010-01-01
A basic and classical assumption in the machine learning research area is the "randomness assumption" (also known as the i.i.d. assumption), which states that data are assumed to be independently and identically generated by some known or unknown distribution. This assumption, which is the foundation of most existing approaches in the literature, simplifies…
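As a hedged, minimal illustration of what the randomness assumption buys (not from this dissertation): a fixed decision rule tuned under i.i.d. sampling keeps its accuracy on new i.i.d. data but degrades when the test distribution shifts. All distributions and the threshold below are hypothetical.

```python
import random

random.seed(0)

# A threshold classifier "trained" under the i.i.d. assumption: class 0
# draws from N(0,1), class 1 from N(2,1); the midpoint rule x > 1 is
# near-optimal when test data come from the same distributions.
def sample(mean, n):
    return [random.gauss(mean, 1.0) for _ in range(n)]

def accuracy(xs0, xs1, threshold=1.0):
    correct = sum(x <= threshold for x in xs0) + sum(x > threshold for x in xs1)
    return correct / (len(xs0) + len(xs1))

print("i.i.d. test:  ", accuracy(sample(0, 5000), sample(2, 5000)))  # ~0.84
# Shift the test distribution (violating the assumption) and the same
# rule degrades, even though nothing about the rule changed.
print("shifted test: ", accuracy(sample(1, 5000), sample(3, 5000)))  # ~0.74
```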
Zipf's word frequency law in natural language: a critical review and future directions.
Piantadosi, Steven T
2014-10-01
The frequency distribution of words has been a key object of study in statistical linguistics for the past 70 years. This distribution approximately follows a simple mathematical form known as Zipf's law. This article first shows that human language has a highly complex, reliable structure in the frequency distribution over and above this classic law, although prior data visualization methods have obscured this fact. A number of empirical phenomena related to word frequencies are then reviewed. These facts are chosen to be informative about the mechanisms giving rise to Zipf's law and are then used to evaluate many of the theoretical explanations of Zipf's law in language. No prior account straightforwardly explains all the basic facts or is supported with independent evaluation of its underlying assumptions. To make progress at understanding why language obeys Zipf's law, studies must seek evidence beyond the law itself, testing assumptions and evaluating novel predictions with new, independent data.
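A sketch of the standard empirical check behind this literature: estimate the Zipf exponent alpha in f(r) ∝ r^(−alpha) by regressing log frequency on log rank. The toy corpus below is a placeholder, far too small to recover the alpha ≈ 1 typical of natural text; the mechanics, not the estimate, are the point.

```python
from collections import Counter
import math

# Any text can be dropped in; this toy corpus is only a placeholder.
text = ("the quick brown fox jumps over the lazy dog " * 50 +
        "the cat sat on the mat and the dog barked at the cat " * 30).split()
freqs = sorted(Counter(text).values(), reverse=True)

# Least-squares slope of log frequency against log rank.
xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
ys = [math.log(f) for f in freqs]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(f"estimated Zipf exponent alpha = {-slope:.2f}")  # near 1 for real corpora
```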
Teaching Critical Literacy across the Curriculum in Multimedia America.
ERIC Educational Resources Information Center
Semali, Ladislaus M.
The teaching of media texts as a form of textual construction is embedded in the assumption that audiences bring individual preexisting dispositions even though the media may contribute to their shaping of basic attitudes, beliefs, values, and behavior. As summed up by D. Lusted, at the core of such textual construction are basic assumptions that…
A Survey of Report of Risk Management for Clay County, Florida.
ERIC Educational Resources Information Center
Florida State Dept. of Education, Tallahassee.
Risk management encompasses far more than an insurance program alone. The basic elements consist of--(1) elimination or reduction of exposure to loss, (2) protection from exposure to loss, (3) assumption of risk loss, and (4) transfer of risk to a professional carrier. This survey serves as a means of evaluating the methods of application of these…
Cable Television and Education: Proceedings of the CATV and Education Conference, May 11-12, 1973.
ERIC Educational Resources Information Center
Cardellino, Earl L., Comp.; Forsythe, Charles G., Comp.
Edited versions of the conference presentations are compiled. The purpose of the meeting was to bring together media specialists and other educators from throughout Pennsylvania to evaluate the basic assumptions underlying the educational use of cable television (CATV) and to share ideas about the ways in which cable could be used to change the…
Performance evaluation of Olympic weightlifters.
Garhammer, J
1979-01-01
The comparison of weights lifted by athletes in different bodyweight categories is a continuing problem for the sport of Olympic weightlifting. An objective mechanical evaluation procedure was developed using basic ideas from a model proposed by Ranta in 1975. This procedure was based on more realistic assumptions than the original model and considered both vertical and horizontal bar movements. Utilization of data obtained from film of national-caliber lifters indicated that the proposed method was workable, and that the evaluative indices ranked lifters in reasonable order relative to other comparative techniques.
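As a hedged illustration of the general idea (not Garhammer's procedure, which uses film-derived bar kinematics): a common way to compare lifters across bodyweight categories is allometric normalization by bodyweight to the 2/3 power, reflecting the scaling of muscle cross-section under geometric similarity. Numbers are hypothetical.

```python
# Allometric comparison of lifters across bodyweight categories: divide
# the lifted load by bodyweight^(2/3). A standard normalization, not the
# paper's exact model; the data are hypothetical.
lifters = [("A", 56.0, 305.0), ("B", 77.0, 370.0), ("C", 105.0, 420.0)]
for name, bodyweight, total in lifters:
    index = total / bodyweight ** (2 / 3)
    print(f"lifter {name}: total {total} kg, allometric index {index:.1f}")
```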
Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.
Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian
2011-01-01
Quantitative risk analysis (QRA) is a systematic approach for evaluating the likelihood, consequences, and risk of adverse events. QRA based on event tree analysis (ETA) and fault tree analysis (FTA) employs two basic assumptions. The first assumption relates to the likelihood values of input events, and the second concerns interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; to deal with uncertainties, however, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by, and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
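A minimal numeric illustration of the second assumption (not the article's fuzzy-set/evidence-theory machinery): an AND gate evaluated under independence versus the Fréchet bounds, which are all one can guarantee when the dependence between events is unknown. Probabilities are hypothetical.

```python
# Top-event probability for a two-input AND gate. The independence
# assumption gives a point value; without it, only the Frechet bounds
# are guaranteed.
p1, p2 = 0.05, 0.10

p_independent = p1 * p2
lower = max(0.0, p1 + p2 - 1.0)   # perfect negative dependence
upper = min(p1, p2)               # perfect positive dependence

print(f"AND gate, independent: {p_independent:.4f}")
print(f"AND gate, bounds without independence: [{lower:.4f}, {upper:.4f}]")

# OR gate under independence, for comparison:
print(f"OR gate, independent: {1 - (1 - p1) * (1 - p2):.4f}")
```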
Supply-demand balance in outward-directed networks and Kleiber's law
Painter, Page R
2005-01-01
Background: Recent theories have attempted to derive the value of the exponent α in the allometric formula for scaling of basal metabolic rate from the properties of distribution network models for arteries and capillaries. It has recently been stated that a basic theorem relating the sum of nutrient currents to the specific nutrient uptake rate, together with a relationship claimed to be required in order to match nutrient supply to nutrient demand in 3-dimensional outward-directed networks, leads to Kleiber's law (b = 3/4). Methods: The validity of the supply-demand matching principle and the assumptions required to prove the basic theorem are assessed. The supply-demand principle is evaluated by examining the supply term and the demand term in outward-directed lattice models of nutrient and water distribution systems and by applying the principle to fractal-like models of mammalian arterial systems. Results: Application of the supply-demand principle to bifurcating fractal-like networks that are outward-directed does not predict 3/4-power scaling, and evaluation of water distribution system models shows that the matching principle does not match supply to demand in such systems. Furthermore, proof of the basic theorem is shown to require that the covariance of nutrient uptake and current path length is 0, an assumption unlikely to be true in mammalian arterial systems. Conclusion: The supply-demand matching principle does not lead to a satisfactory explanation for the approximately 3/4-power scaling of mammalian basal metabolic rate. PMID:16283939
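For readers who want the arithmetic behind the exponent: b in B = B0·M^b is recovered by regression on log-log axes. The sketch below uses synthetic data generated with b = 0.75, purely to show the mechanics; it is not the paper's analysis.

```python
import math

# Recover the allometric exponent b in B = B0 * M^b from (mass, metabolic
# rate) pairs by least-squares regression on log-log axes. Synthetic data,
# generated with b = 0.75 to mirror Kleiber's law, not measured values.
data = [(m, 3.4 * m ** 0.75) for m in (0.02, 0.1, 1.0, 10.0, 70.0, 400.0, 4000.0)]

xs = [math.log(m) for m, _ in data]
ys = [math.log(rate) for _, rate in data]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b_hat = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(f"estimated scaling exponent b = {b_hat:.3f}")  # 0.750 by construction
```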
L.H. Pardo; P. Semaoune; P.G. Schaberg; C. Eagar; M. Sebilo
2013-01-01
Stable isotopes of nitrogen (N) in plants are increasingly used to evaluate ecosystem N cycling patterns. A basic assumption in this research is that plant δ15N reflects the δ15N of the N source. Recent evidence suggests that plants may fractionate on uptake, transport, or transformation of N. If the...
Haegele, Justin A; Hodge, Samuel Russell
2015-10-01
There are basic philosophical and paradigmatic assumptions that guide scholarly research endeavors, including the methods used and the types of questions asked. Through this article, kinesiology faculty and students with interests in adapted physical activity are encouraged to understand the basic assumptions of applied behavior analysis (ABA) methodology for conducting, analyzing, and presenting research of high quality in this paradigm. The purposes of this viewpoint paper are to present information fundamental to understanding the assumptions undergirding research methodology in ABA, describe key aspects of single-subject research designs, and discuss common research designs and data-analysis strategies used in single-subject studies.
ERIC Educational Resources Information Center
Ramseyer, Gary C.; Tcheng, Tse-Kia
The present study was directed at determining the extent to which the Type I Error rate is affected by violations in the basic assumptions of the q statistic. Monte Carlo methods were employed, and a variety of departures from the assumptions were examined. (Author)
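The Monte Carlo recipe the abstract describes can be sketched in a few lines. This example substitutes the pooled-variance t test for the q statistic (an assumption made for illustration) and shows the familiar result that unequal variances combined with unequal group sizes distort the Type I error rate.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Monte Carlo estimate of the Type I error rate of the pooled-variance
# t test when the equal-variance assumption is violated. Both groups
# share the same mean, so every rejection is a Type I error.
def type1_rate(sd1, sd2, n1, n2, reps=20_000, alpha=0.05):
    rejections = 0
    for _ in range(reps):
        a = rng.normal(0.0, sd1, n1)
        b = rng.normal(0.0, sd2, n2)
        _, p = stats.ttest_ind(a, b, equal_var=True)
        if p < alpha:
            rejections += 1
    return rejections / reps

print("assumptions met:      ", type1_rate(1.0, 1.0, 10, 10))  # ~0.05
print("unequal SD, unequal n:", type1_rate(3.0, 1.0, 5, 25))   # inflated
```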
Schultze-Lutter, F
2016-12-01
The early detection of psychoses has become increasingly relevant in research and clinical practice. Next to the ultra-high risk (UHR) approach, which targets an immediate risk of developing frank psychosis, the basic symptom approach, which targets the earliest possible detection of the developing disorder, is being increasingly used worldwide. The present review gives an introduction to the development and basic assumptions of the basic symptom concept, summarizes the results of studies on the specificity of basic symptoms for psychoses in different age groups as well as studies of their psychosis-predictive value, and gives an outlook on future directions. Moreover, a brief introduction is given to recent imaging studies that support one of the main assumptions of the basic symptom concept, i.e., that basic symptoms are the most immediate phenomenological expression of the cerebral aberrations underlying the development of psychosis. From this, it is concluded that basic symptoms may provide important information for future neurobiological research on the etiopathology of psychoses. © Georg Thieme Verlag KG Stuttgart · New York.
Can Basic Research on Children and Families Be Useful for the Policy Process?
ERIC Educational Resources Information Center
Moore, Kristin A.
Based on the assumption that basic science is the crucial building block for technological and biomedical progress, this paper examines the relevance for public policy of basic demographic and behavioral sciences research on children and families. The characteristics of basic research as they apply to policy making are explored. First, basic…
NA-241_Quarterly Report_SBLibby - 12.31.2017_v2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Libby, Stephen B.
This is an evaluation of candidate navigation solutions for GPS-free inspection tools that can be used in tours of large building interiors. In principle, COTS portable inertial measurement unit (IMU) sensors with satisfactory accuracy, SWaP (size, weight, and power), and low error and bias drift can provide sufficiently accurate dead-reckoning navigation in a large building in the absence of GPS. To explore this assumption, the capabilities of representative IMU navigation sensors to meet these requirements will be evaluated, starting with a market survey and then carrying out a basic analysis of these sensors using LLNL's navigation codes.
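A back-of-envelope sketch of why bias drift is the binding requirement: a constant accelerometer bias, integrated twice, produces a position error of roughly one half times the bias times time squared, so even a milli-g bias swamps building-scale navigation within minutes. Values are illustrative, not from the evaluation.

```python
# Dead-reckoning error growth from a small constant accelerometer bias:
# integrating acceleration twice turns a bias eps into a position error
# of about 0.5 * eps * t^2.
dt = 0.01                      # s, sample interval
bias = 1e-3 * 9.81             # m/s^2, a 1 milli-g accelerometer bias
velocity = position = 0.0
for step in range(1, 60_001):  # ten minutes of walking
    velocity += bias * dt
    position += velocity * dt
    if step % 12_000 == 0:
        print(f"t = {step * dt:5.0f} s: position error = {position:8.1f} m")
```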
Sawan, Mouna; Jeon, Yun-Hee; Chen, Timothy F
2018-04-01
Psychotropic medicines have limited efficacy in the management of behavioural and psychological disturbances, yet they are commonly used in nursing homes. Organisational culture is an important consideration influencing the use of psychotropic medicines. Schein's theory elucidates that organisational culture is underpinned by basic assumptions, which are the taken-for-granted beliefs driving organisational members' behaviour and practices. By exploring the basic assumptions of culture we are able to find explanations for why psychotropic medicines are prescribed contrary to standards. A qualitative study guided by Schein's theory was conducted using semi-structured interviews with 40 staff representing a broad range of roles from eight nursing homes. Findings from the study suggest that two basic assumptions influenced the use of psychotropic medicines: locus of control and necessity for efficiency or comprehensiveness. Locus of control pertained to whether staff believed they could control decisions when facing negative work experiences. Necessity for efficiency or comprehensiveness concerned how much time and effort was spent on a given task. Participants arrived at decisions to use psychotropic medicines that were inconsistent with ideal standards when they believed they were helpless to do the right thing by the resident and it was necessary to restrict time on a given task. Basic assumptions tended to provide the rationale for staff to use psychotropic medicines when this was not compatible with standards. Organisational culture is an important factor that should be addressed to optimise psychotropic medicine use. Copyright © 2018 Elsevier Ltd. All rights reserved.
On the accuracy of personality judgment: a realistic approach.
Funder, D C
1995-10-01
The "accuracy paradigm" for the study of personality judgment provides an important, new complement to the "error paradigm" that dominated this area of research for almost 2 decades. The present article introduces a specific approach within the accuracy paradigm called the Realistic Accuracy Model (RAM). RAM begins with the assumption that personality traits are real attributes of individuals. This assumption entails the use of a broad array of criteria for the evaluation of personality judgment and leads to a model that describes accuracy as a function of the availability, detection, and utilization of relevant behavioral cues. RAM provides a common explanation for basic moderators of accuracy, sheds light on how these moderators interact, and outlines a research agenda that includes the reintegration of the study of error with the study of accuracy.
Basic statistics (the fundamental concepts).
Lim, Eric
2014-12-01
An appreciation and understanding of statistics is important to all practising clinicians, not simply researchers, because mathematics is the fundamental basis on which we base clinical decisions, usually with reference to benefit in relation to risk. Unless a clinician has a basic understanding of statistics, he or she will never be in a position to question healthcare management decisions that have been handed down from generation to generation, will not be able to conduct research effectively, nor evaluate the validity of published evidence (usually making the assumption that most published work is either all good or all bad). This article provides a brief introduction to basic statistical methods and illustrates their use in common clinical scenarios. In addition, pitfalls of incorrect usage are highlighted. However, it is not meant to be a substitute for formal training or consultation with a qualified and experienced medical statistician prior to starting any research project.
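As one concrete instance of the benefit-relative-to-risk arithmetic the article points to (a sketch with hypothetical counts, using the standard Katz log method): relative risk from a 2x2 table with a 95% confidence interval.

```python
import math

# Relative risk with a 95% confidence interval from a 2x2 table.
# Counts are hypothetical.
a, b = 30, 970    # treated: events, non-events
c, d = 60, 940    # control: events, non-events

rr = (a / (a + b)) / (c / (c + d))
se_log_rr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")  # RR = 0.50, CI excludes 1
```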
[Introduction to Exploratory Factor Analysis (EFA)].
Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón
2012-03-01
Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration its main strengths and weaknesses. The aims were to present in a clear and concise manner the main applications of this technique, to determine the basic requirements for its use with a step-by-step description of its methodology, and to establish the elements that must be taken into account during its preparation so as not to produce erroneous results and interpretations. The study is a narrative review. It identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology needed to achieve factor derivation, global fit evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
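A compact sketch of the workflow the review describes, using scikit-learn's FactorAnalysis (an assumed tooling choice; the review itself is tool-agnostic). The data are simulated from two latent factors purely to show the mechanics, so the loadings mean nothing in themselves.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Simulate 8 items driven by two latent factors plus noise.
latent = rng.normal(size=(300, 2))
loadings = rng.normal(size=(2, 8))
items = latent @ loadings + 0.5 * rng.normal(size=(300, 8))

# Standardize, extract two factors with varimax rotation, inspect loadings.
X = StandardScaler().fit_transform(items)
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
print(np.round(fa.components_.T, 2))   # item-by-factor loading matrix
```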
Artificial Intelligence: Underlying Assumptions and Basic Objectives.
ERIC Educational Resources Information Center
Cercone, Nick; McCalla, Gordon
1984-01-01
Presents perspectives on methodological assumptions underlying research efforts in artificial intelligence (AI) and charts activities, motivations, methods, and current status of research in each of the major AI subareas: natural language understanding; computer vision; expert systems; search, problem solving, planning; theorem proving and logic…
Teaching Practices: Reexamining Assumptions.
ERIC Educational Resources Information Center
Spodek, Bernard, Ed.
This publication contains eight papers, selected from papers presented at the Bicentennial Conference on Early Childhood Education, that discuss different aspects of teaching practices. The first two chapters reexamine basic assumptions underlying the organization of curriculum experiences for young children. Chapter 3 discusses the need to…
Pressure losses and heat transfer in non-circular channels with hydraulically smooth walls
NASA Technical Reports Server (NTRS)
Malak, J.
1982-01-01
The influence of channel geometry on pressure losses and heat transfer in noncircular channels with hydraulically smooth walls was studied. As a basic assumption for the description of this influence, integral geometrical criteria, selected according to experimental experience, were introduced. Using these geometrical criteria, a large set of experimental data for pressure losses and heat transfer in circular and annular channels with longitudinal fins was evaluated. In this way it was empirically proved that the criteria describe channel geometry fairly well.
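The simplest integral geometrical criterion of this kind is the hydraulic diameter, D_h = 4A/P. A short sketch (dimensions hypothetical; the report's actual criteria are more elaborate):

```python
import math

# Hydraulic diameter D_h = 4A/P folds channel shape into friction and
# heat-transfer correlations developed for circular pipes.
def hydraulic_diameter(area, wetted_perimeter):
    return 4.0 * area / wetted_perimeter

# Annulus, outer radius 20 mm, inner radius 12 mm:
r_o, r_i = 0.020, 0.012
d_h_annulus = hydraulic_diameter(math.pi * (r_o**2 - r_i**2),
                                 2 * math.pi * (r_o + r_i))
print(f"annulus D_h = {d_h_annulus * 1000:.1f} mm")   # = 2*(r_o - r_i) = 16 mm

# Rectangular duct, 40 mm x 10 mm:
w, h = 0.040, 0.010
print(f"rectangular duct D_h = {hydraulic_diameter(w * h, 2 * (w + h)) * 1000:.1f} mm")
```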
Sampling Assumptions in Inductive Generalization
ERIC Educational Resources Information Center
Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.
2012-01-01
Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…
Belief Structures about People Held by Selected Graduate Students.
ERIC Educational Resources Information Center
Dole, Arthur A.; And Others
Wrightsman has established that assumptions about human nature distinguish religious, occupational, political, gender, and other groups, and that they predict behavior in structured situations. Hjelle and Ziegler proposed a set of nine basic bipolar assumptions about the nature of people: freedom-determinism; rationality-irrationality;…
Dynamic Self-Consistent Field Theories for Polymer Blends and Block Copolymers
NASA Astrophysics Data System (ADS)
Kawakatsu, Toshihiro
Understanding the behavior of phase-separated domain structures and the rheological properties of multi-component polymeric systems requires detailed information on the dynamics of domains and on the conformations of the constituent polymer chains. Self-consistent field (SCF) theory is a useful tool for treating such problems because the conformational entropy of polymer chains in inhomogeneous systems can be evaluated quantitatively with this theory. However, when we turn our attention to dynamic properties in a non-equilibrium state, the basic assumption of the SCF theory, i.e. the assumption of equilibrium chain conformation, breaks down. In order to avoid this difficulty, dynamic SCF theories were developed. In this chapter, we give a brief review of recent developments in dynamic SCF theories and discuss where the cutting edge of the theory lies.
Valentine, Julie L
2014-01-01
An evaluation of the Integrated Practice Model for Forensic Nursing Science is presented utilizing methods outlined by Meleis. A brief review of nursing theory basics and evaluation methods by Meleis is provided to enhance understanding of the ensuing theoretical evaluation and critique. The Integrated Practice Model for Forensic Nursing Science, created by forensic nursing pioneer Virginia Lynch, captures the theories, assumptions, concepts, and propositions inherent in forensic nursing practice and science. The historical background of the theory is explored, as Lynch's model launched the role development of forensic nursing practice as both a nursing and forensic science specialty. It is derived from a combination of nursing, sociological, and philosophical theories to reflect the grounding of forensic nursing in the nursing, legal, psychological, and scientific communities. As Lynch's model is the first inception of forensic nursing theory, it is representative of a conceptual framework, although the title implies a practice theory. The clarity and consistency displayed in the theory's structural components of assumptions, concepts, and propositions are analyzed. The model is described and evaluated. A summary of the strengths and limitations of the model is compiled, followed by application to practice, education, and research with suggestions for ongoing theory development.
Thin Skin, Deep Damage: Addressing the Wounded Writer in the Basic Writing Course
ERIC Educational Resources Information Center
Boone, Stephanie D.
2010-01-01
How do institutions and their writing faculties see basic writers? What assumptions about these writers drive writing curricula, pedagogies and assessments? How do writing programs enable or frustrate these writers? How might course design facilitate the outcomes we envision? This article argues that, in order to teach basic writers to enter…
Writing Partners: Service Learning as a Route to Authority for Basic Writers
ERIC Educational Resources Information Center
Gabor, Catherine
2009-01-01
This article looks at best practices in basic writing instruction in terms of non-traditional audiences and writerly authority. Much conventional wisdom discourages participation in service-learning projects for basic writers because of the assumption that their writing is not yet ready to "go public." Countering this line of thinking, the author…
Introduction to the Application of Web-Based Surveys.
ERIC Educational Resources Information Center
Timmerman, Annemarie
This paper discusses some basic assumptions and issues concerning web-based surveys. Discussion includes: assumptions regarding cost and ease of use; disadvantages of web-based surveys, concerning the inability to compensate for four common errors of survey research: coverage error, sampling error, measurement error and nonresponse error; and…
School, Cultural Diversity, Multiculturalism, and Contact
ERIC Educational Resources Information Center
Pagani, Camilla; Robustelli, Francesco; Martinelli, Cristina
2011-01-01
The basic assumption of this paper is that school's potential to improve cross-cultural relations, as well as interpersonal relations in general, is enormous. This assumption is supported by a number of theoretical considerations and by the analysis of data we obtained from a study we conducted on the attitudes toward diversity and…
Nuclear Reactions in Micro/Nano-Scale Metal Particles
NASA Astrophysics Data System (ADS)
Kim, Y. E.
2013-03-01
Low-energy nuclear reactions in micro/nano-scale metal particles are described based on the theory of Bose-Einstein condensation nuclear fusion (BECNF). The BECNF theory is based on a single basic assumption capable of explaining the observed LENR phenomena: deuterons in metals undergo Bose-Einstein condensation. The BECNF theory is also a quantitative, predictive physical theory. Experimental tests of the basic assumption and theoretical predictions are proposed. Potential application to energy generation by ignition at low temperatures is described. A generalized theory of BECNF is used to carry out theoretical analyses of recently reported experimental results for the hydrogen-nickel system.
On Cognitive Constraints and Learning Progressions: The Case of "Structure of Matter"
ERIC Educational Resources Information Center
Talanquer, Vicente
2009-01-01
Based on the analysis of available research on students' alternative conceptions about the particulate nature of matter, we identified basic implicit assumptions that seem to constrain students' ideas and reasoning on this topic at various learning stages. Although many of these assumptions are interrelated, some of them seem to change or…
Rationality as the Basic Assumption in Explaining Japanese (or Any Other) Business Culture.
ERIC Educational Resources Information Center
Koike, Shohei
Economic analysis, with its explicit assumption that people are rational, is applied to the Japanese and American business cultures to illustrate how the approach is useful for understanding cultural differences. Specifically, differences in cooperative behavior among Japanese and American workers are examined. Economic analysis goes beyond simple…
Standardization of Selected Semantic Differential Scales with Secondary School Children.
ERIC Educational Resources Information Center
Evans, G. T.
A basic assumption of this study is that the meaning continuum registered by an adjective pair remains relatively constant over a large universe of concepts and over subjects within a relatively homogeneous population. An attempt was made to validate this assumption by showing the invariance of the factor structure across different types of…
What's Love Got to Do with It? Rethinking Common Sense Assumptions
ERIC Educational Resources Information Center
Trachman, Matthew; Bluestone, Cheryl
2005-01-01
One of the most basic tasks in introductory social science classes is to get students to reexamine their common sense assumptions concerning human behavior. This article introduces a shared assignment developed for a learning community that paired an introductory sociology and psychology class. The assignment challenges students to rethink the…
Hagler, Megan M.; Freeman, Mary C.; Wenger, Seth J.; Freeman, Byron J.; Rakes, Patrick L.; Shute, J.R.
2011-01-01
Rarely encountered animals may be present but undetected, potentially leading to incorrect assumptions about the persistence of a local population or the conservation priority of a particular area. The federally endangered and narrowly endemic Conasauga logperch (Percina jenkinsi) is a good example of a rarely encountered fish species of conservation concern, for which basic population statistics are lacking. We evaluated the occurrence frequency for this species using surveys conducted with a repeat-observation sampling approach during the summer of 2008. We also analyzed museum records since the late 1980s to evaluate the trends in detected status through time. The results of these analyses provided support for a declining trend in this species over a portion of its historical range, despite low estimated detection probability. We used the results to identify the expected information return for a given level of monitoring where the sampling approach incorporates incomplete detection. The method applied here may be of value where historic occurrence records are available, provided that the assumption of constant capture efficiency is reasonable.
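The arithmetic that motivates repeat-observation designs, as a sketch with a hypothetical per-visit detection probability p (not the study's estimate): the chance of at least one detection across k visits is 1 − (1 − p)^k, so a species can be present yet missed entirely when p is low and visits are few.

```python
# With per-visit detection probability p, the chance of missing a species
# that is actually present on all k visits is (1 - p)^k.
p = 0.3                      # hypothetical per-visit detection probability
for k in (1, 2, 4, 8):
    miss = (1 - p) ** k
    print(f"{k} visit(s): P(at least one detection) = {1 - miss:.2f}")
```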
Intellectualizing Adult Basic Literacy Education: A Case Study
ERIC Educational Resources Information Center
Bradbury, Kelly S.
2012-01-01
At a time when accusations of American ignorance and anti-intellectualism are ubiquitous, this article challenges problematic assumptions about intellectualism that overlook the work of adult basic literacy programs and proposes an expanded view of intellectualism. It is important to recognize and to challenge narrow views of intellectualism…
Adult Literacy Programs: Guidelines for Effectiveness.
ERIC Educational Resources Information Center
Lord, Jerome E.
This report is a summary of information from both research and experience about the assumptions and practices that guide successful basic skills programs. The 31 guidelines are basic to building a solid foundation on which effective instructional programs for adults can be developed. The first six guidelines address some important characteristics…
Social Studies Curriculum Guidelines.
ERIC Educational Resources Information Center
Manson, Gary; And Others
These guidelines, which set standards for social studies programs K-12, can be used to update existing programs or may serve as a baseline for further innovation. The first section, "A Basic Rationale for Social Studies Education," identifies the theoretical assumptions basic to the guidelines as knowledge, thinking, valuing, social participation,…
NASA Technical Reports Server (NTRS)
Hamrock, B. J.; Dowson, D.
1981-01-01
Lubricants, usually Newtonian fluids, are assumed to experience laminar flow. The basic equations used to describe the flow are the Navier-Stokes equations of motion. The study of hydrodynamic lubrication is, from a mathematical standpoint, the application of a reduced form of these Navier-Stokes equations in association with the continuity equation. The Reynolds equation can also be derived from first principles, provided of course that the same basic assumptions are adopted in each case. Both methods are used in deriving the Reynolds equation, and the assumptions inherent in reducing the Navier-Stokes equations are specified. Because the Reynolds equation contains viscosity and density terms, and these properties depend on temperature and pressure, it is often necessary to couple the Reynolds equation with the energy equation. The lubricant properties and the energy equation are presented. Film thickness, a parameter of the Reynolds equation, is a function of the elastic behavior of the bearing surface. The governing elasticity equation is therefore also presented.
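For reference, the reduced equation such a derivation arrives at can be written down; this is the standard steady, incompressible, isoviscous textbook form (not transcribed from this report), with p the film pressure, h the film thickness, mu the viscosity, and U the relative sliding velocity of the surfaces:

```latex
\frac{\partial}{\partial x}\left(h^{3}\frac{\partial p}{\partial x}\right)
+ \frac{\partial}{\partial z}\left(h^{3}\frac{\partial p}{\partial z}\right)
= 6\,\mu\,U\,\frac{\partial h}{\partial x}
```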
Ultrasound assisted evaluation of chest pain in the emergency department.
Colony, M Deborah; Edwards, Frank; Kellogg, Dylan
2018-04-01
Chest pain is a commonly encountered emergency department complaint, with a broad differential that includes several life-threatening conditions. Ultrasound-assisted evaluation can potentially be used to rapidly and accurately arrive at the correct diagnosis. We propose an organized, ultrasound-assisted evaluation of the patient with chest pain using a combination of ultrasound, echocardiography, and clinical parameters. Basic echo techniques that residents can master in a short time are used, plus standardized clinical questions and examination. Information is kept on a checklist. We hypothesize that this will result in a quicker, more accurate evaluation of chest pain in the ED, leading to timely treatment and disposition of the patient, less provider anxiety, a reduction in the number of diagnostic errors, and the removal of false assumptions from the diagnostic process. Copyright © 2017 Elsevier Inc. All rights reserved.
McConnachie, Matthew M; Romero, Claudia; Ferraro, Paul J; van Wilgen, Brian W
2016-04-01
The fundamental challenge of evaluating the impact of conservation interventions is that researchers must estimate the difference between the outcome after an intervention occurred and what the outcome would have been without it (counterfactual). Because the counterfactual is unobservable, researchers must make an untestable assumption that some units (e.g., organisms or sites) that were not exposed to the intervention can be used as a surrogate for the counterfactual (control). The conventional approach is to make a point estimate (i.e., single number along with a confidence interval) of impact, using, for example, regression. Point estimates provide powerful conclusions, but in nonexperimental contexts they depend on strong assumptions about the counterfactual that often lack transparency and credibility. An alternative approach, called partial identification (PI), is to first estimate what the counterfactual bounds would be if the weakest possible assumptions were made. Then, one narrows the bounds by using stronger but credible assumptions based on an understanding of why units were selected for the intervention and how they might respond to it. We applied this approach and compared it with conventional approaches by estimating the impact of a conservation program that removed invasive trees in part of the Cape Floristic Region. Even when we used our largest PI impact estimate, the program's control costs were 1.4 times higher than previously estimated. PI holds promise for applications in conservation science because it encourages researchers to better understand and account for treatment selection biases; can offer insights into the plausibility of conventional point-estimate approaches; could reduce the problem of advocacy in science; might be easier for stakeholders to agree on a bounded estimate than a point estimate where impacts are contentious; and requires only basic arithmetic skills. © 2015 Society for Conservation Biology.
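A minimal sketch of partial identification in its "weakest possible assumptions" form: Manski's no-assumption bounds for an average treatment effect with an outcome known to lie in [0, 1]. Numbers are hypothetical; the paper's analysis then adds stronger, credible assumptions to narrow such bounds.

```python
# Manski no-assumption bounds on an average treatment effect when the
# outcome lies in [0, 1]. Unobserved counterfactuals are set to their
# worst and best cases instead of being guessed.
def manski_bounds(y_treated_mean, y_control_mean, p_treated):
    p_control = 1.0 - p_treated
    # Lower bound: missing potential outcomes take their worst values.
    lower = (y_treated_mean * p_treated + 0.0 * p_control) \
          - (y_control_mean * p_control + 1.0 * p_treated)
    # Upper bound: missing potential outcomes take their best values.
    upper = (y_treated_mean * p_treated + 1.0 * p_control) \
          - (y_control_mean * p_control + 0.0 * p_treated)
    return lower, upper

lo, hi = manski_bounds(y_treated_mean=0.6, y_control_mean=0.5, p_treated=0.4)
print(f"ATE bounded in [{lo:.2f}, {hi:.2f}]")  # width is 1 with no assumptions
```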
[A reflection about organizational culture according to psychoanalysis' view].
Cardoso, Maria Lúcia Alves Pereira
2008-01-01
This article offers a reflection on the universal presuppositions of human culture proposed by Freud, as a support for analyzing the presuppositions of organizational culture according to Schein. In an article published in 1984, the latter claims that in order to decipher organizational culture one cannot rely on (visible) artifacts or on (perceptible) values, but should take a deeper plunge and identify the basic assumptions underlying organizational culture. Such presuppositions spread into the field of study concerning the individual inner self, within the sphere of Psychoanalysis. We have therefore examined Freud's basic assumptions of human culture in order to ascertain their conformity with the paradigms of organizational culture as proposed by Schein.
Thermodynamic Properties of Low-Density {}^{132}Xe Gas in the Temperature Range 165-275 K
NASA Astrophysics Data System (ADS)
Akour, Abdulrahman
2018-01-01
The method of static fluctuation approximation was used to calculate selected thermodynamic properties (internal energy, entropy, heat capacity, and pressure) for xenon in a particularly low temperature range (165-270 K) under different conditions. This integrated microscopic study starts from a single basic assumption as its main input: the local field operator is replaced by its mean value, after which a closed set of nonlinear equations is solved numerically by an iterative method, with the Hartree-Fock B2-type dispersion potential taken as the most appropriate potential for xenon. The results are in very good agreement with those of an ideal gas.
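The "closed set of nonlinear equations solved iteratively" pattern, reduced to its simplest form: a toy fixed-point iteration, not the actual static-fluctuation equations for xenon.

```python
import math

# Fixed-point iteration x = f(x), repeated until successive values agree.
def solve_fixed_point(f, x0, tol=1e-10, max_iter=1000):
    x = x0
    for _ in range(max_iter):
        x_new = f(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("did not converge")

# Toy stand-in equation: x = cos(x).
root = solve_fixed_point(lambda x: math.cos(x), x0=1.0)
print(f"x = cos(x) at x = {root:.6f}")  # 0.739085
```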
ERIC Educational Resources Information Center
Ngai, Courtney; Sevian, Hannah; Talanquer, Vicente
2014-01-01
Given the diversity of materials in our surroundings, one should expect scientifically literate citizens to have a basic understanding of the core ideas and practices used to analyze chemical substances. In this article, we use the term 'chemical identity' to encapsulate the assumptions, knowledge, and practices upon which chemical…
NASA Astrophysics Data System (ADS)
Rusli, Aloysius
2016-08-01
Until the 1980s, it was well known and practiced in Indonesian Basic Physics courses to present physics through its effective technicalities: the ideally elastic spring, the pulley and moving blocks, the thermodynamics of ideal engine models, theoretical electrostatics and electrodynamics with model capacitors and inductors, wave behavior and its various superpositions, hopefully closed with a modern physics description. A different approach was then experimented with, using the Hobson and Moore texts, stressing the alternative aim of fostering awareness, not just mastery, of science and the scientific method. This is hypothesized to be more in line with the changed attitude of the so-called Millennials cohort, who are less attentive if not less interested, and are more used to multi-tasking, which suits their shorter span of attention. The upside is increased awareness of science and the scientific method. The downside is that they get less experience of the scientific method, which bases itself intensely on critical observation, analytic thinking to set up conclusions or hypotheses, and checking the consistency of hypotheses with measured data. Another aspect is recognition that the human person encompasses both the reasoning capacity and the mental-spiritual-cultural capacity. This is considered essential as the world grows ever smaller due to increased communication capacity, causing strong interactions and nonlinear effects, and showing that value systems become more challenging and challenged by physics/science and its cosmology, which is successfully based on the scientific method. So students should be made aware of the common basis of these two capacities: their assumptions, the reasoning capacity, and the consistency assumption. This shows that the limits of science are its set of basic quantifiable assumptions, and the limits of the mental-spiritual-cultural aspects of life are their set of basic metaphysical (non-quantifiable) assumptions. Bridging these two human aspects of life can lead to a "why" of science and a "meaning" of life. A progress report on these efforts is presented, consisting essentially of results indicated by an extended format of the usual weekly reporting used previously in Basic Physics lectures.
Genital Measures: Comments on Their Role in Understanding Human Sexuality
ERIC Educational Resources Information Center
Geer, James H.
1976-01-01
This paper discusses the use of genital measures in the study of both applied and basic work in human sexuality. Some of the advantages of psychophysiological measures are considered along with cautions concerning unwarranted assumptions. Some of the advances that are possible in both applied and basic work are examined. (Author)
39 Questionable Assumptions in Modern Physics
NASA Astrophysics Data System (ADS)
Volk, Greg
2009-03-01
The growing body of anomalies in new energy, low energy nuclear reactions, astrophysics, atomic physics, and entanglement, combined with the failure of the Standard Model and string theory to predict many of the most basic fundamental phenomena, all point to a need for major new paradigms. Not Band-Aids, but revolutionary new ways of conceptualizing physics, in the spirit of Thomas Kuhn's The Structure of Scientific Revolutions. This paper identifies a number of long-held, but unproven assumptions currently being challenged by an increasing number of alternative scientists. Two common themes, both with venerable histories, keep recurring in the many alternative theories being proposed: (1) Mach's Principle, and (2) toroidal, vortex particles. Matter-based Mach's Principle differs from both space-based universal frames and observer-based Einsteinian relativity. Toroidal particles, in addition to explaining electron spin and the fundamental constants, satisfy the basic requirement of Gauss's misunderstood B Law, that motion itself circulates. Though a comprehensive theory is beyond the scope of this paper, it will suggest alternatives to the long list of assumptions in context.
1982-03-01
to preference types, and uses capacity estimation; therefore, it is basically a good system for recreation and resource inventory and classification...quantity, and distribution of recreational resources. Its basic unit of inventory is landform, or the homogeneity of physical features used to...by Clark and Stankey, "the basic assumption underlying the ROS is that quality recreational experiences are best assured by providing a diverse set of
The basic aerodynamics of floatation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davies, M.J.; Wood, D.H.
1983-09-01
The original derivation of the basic theory governing the aerodynamics of both hovercraft and modern floatation ovens requires the validity of some extremely crude assumptions. However, the basic theory is surprisingly accurate. It is shown that this accuracy occurs because the final expression of the basic theory can be derived by approximating the full Navier-Stokes equations in a manner that clearly shows the limitations of the theory. These limitations are used in discussing the relatively small discrepancies between theory and experiment, which may not be significant for practical purposes.
Code of Federal Regulations, 2010 CFR
2010-01-01
... EMPLOYEES RETIREMENT SYSTEM-GENERAL ADMINISTRATION Employee Deductions and Government Contributions § 841... standards (using dynamic assumptions) and expressed as a level percentage of aggregate basic pay. Normal...
Experimental investigation of two-phase heat transfer in a porous matrix.
NASA Technical Reports Server (NTRS)
Von Reth, R.; Frost, W.
1972-01-01
One-dimensional two-phase flow transpiration cooling through porous metal is studied experimentally. The experimental data are compared with a previous one-dimensional analysis. Good agreement with the calculated temperature distribution is obtained as long as the basic assumptions of the analytical model are satisfied. Deviations from the basic assumptions are caused by nonhomogeneous and oscillating flow conditions. A preliminary derivation of nondimensional parameters that characterize the stable and unstable flow conditions is given. Superheated liquid droplets observed sputtering from the heated surface indicate incomplete evaporation at heat fluxes well in excess of the latent energy transport. A parameter is developed to account for the nonequilibrium thermodynamic effects. Measured and calculated pressure drops show contradicting trends, which are attributed to capillary forces.
An Extension of the Partial Credit Model with an Application to the Measurement of Change.
ERIC Educational Resources Information Center
Fischer, Gerhard H.; Ponocny, Ivo
1994-01-01
An extension of the partial credit model, the linear partial credit model, is considered under the assumption of a certain linear decomposition of the item × category parameters into basic parameters. A conditional maximum likelihood algorithm for estimating the basic parameters is presented and illustrated with a simulation and an empirical study. (SLD)
On the Basis of the Basic Variety.
ERIC Educational Resources Information Center
Schwartz, Bonnie D.
1997-01-01
Considers the interplay between source and target language in relation to two points made by Klein and Perdue: (1) the argument that the analysis of the target language should not be used as the model for analyzing interlanguage data; and (2) the theoretical claim that under the technical assumptions of minimalism, the Basic Variety is a "perfect"…
The Not So Common Sense: Differences in How People Judge Social and Political Life.
ERIC Educational Resources Information Center
Rosenberg, Shawn W.
This interdisciplinary book challenges two basic assumptions that orient much contemporary social scientific thinking. Offering theory and empirical research, the book rejects the classic liberal view that people share a basic common sense or rationality; while at the same time, it questions the view of contemporary social theory that meaning is…
Network-level reproduction number and extinction threshold for vector-borne diseases.
Xue, Ling; Scoglio, Caterina
2015-06-01
The basic reproduction number of deterministic models is an essential quantity for predicting whether an epidemic will spread or not. Thresholds for disease extinction contribute crucial knowledge for disease control, elimination, and mitigation of infectious diseases. Relationships between the basic reproduction numbers of two deterministic network-based ordinary differential equation vector-host models and the extinction thresholds of the corresponding stochastic continuous-time Markov chain models are derived under some assumptions. Numerical simulation results for malaria and Rift Valley fever transmission on heterogeneous networks agree with the analytical results even without those assumptions, suggesting that the relationships may always exist and posing the mathematical problem of proving their existence in general. Moreover, numerical simulations show that the basic reproduction number does not monotonically increase or decrease with the extinction threshold. Consistent trends in extinction probability observed through numerical simulations provide novel insights into mitigation strategies that increase the disease extinction probability. These findings may improve understanding of thresholds for disease persistence and thereby help control vector-borne diseases.
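For orientation, the basic reproduction number of a compartmental vector-host ODE model is commonly computed as the spectral radius of the next-generation matrix F V^(-1). The sketch below uses invented two-compartment matrices, not the network models of the paper.

```python
import numpy as np

# Hypothetical two-compartment vector-host example (made-up rates).
F = np.array([[0.0, 0.6],   # new infections: vectors -> hosts
              [0.9, 0.0]])  # new infections: hosts -> vectors
V = np.diag([0.2, 0.5])     # removal/recovery rates

# R0 = spectral radius of the next-generation matrix F V^{-1}.
R0 = max(abs(np.linalg.eigvals(F @ np.linalg.inv(V))))
print(f"basic reproduction number R0 = {R0:.3f}")  # epidemic spreads if R0 > 1
```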
Conclusion: Agency in the face of complexity and the future of assumption-aware evaluation practice.
Morrow, Nathan; Nkwake, Apollo M
2016-12-01
This final chapter in the volume pulls together common themes from the diverse set of articles by a group of eight authors in this issue and presents some reflections on the next steps for improving the ways in which evaluators work with assumptions. Collectively, the authors provide a broad overview of existing and emerging approaches to the articulation and use of assumptions in evaluation theory and practice. The authors reiterate the rationale and key terminology as a common basis for working with assumptions in program design and evaluation. They highlight some useful concepts and categorizations to promote more rigorous treatment of assumptions in evaluation. A three-tier framework for fostering agency for assumption-aware evaluation practice is proposed: agency for themselves (evaluators); agency for others (stakeholders); and agency for standards and principles. Copyright © 2016 Elsevier Ltd. All rights reserved.
The Federal Role and Chapter 1: Rethinking Some Basic Assumptions.
ERIC Educational Resources Information Center
Kirst, Michael W.
In the 20 years since the major Federal program for the disadvantaged began, surprisingly little has changed from its original vision. It is now time to question some of the basic policies of Chapter 1 of the Education Consolidation and Improvement Act in view of the change in conceptions about the Federal role and the recent state and local…
ERIC Educational Resources Information Center
Radtke, Jean, Ed.
Developed as a result of an institute on rehabilitation issues, this document is a guide to assistive technology as it affects successful competitive employment outcomes for people with disabilities. Chapter 1 offers basic information on assistive technology including basic assumptions, service provider approaches, options for technology…
Code of Federal Regulations, 2010 CFR
2010-01-01
... for valuation of the System, based on dynamic assumptions. The present value factors are unisex... EMPLOYEES RETIREMENT SYSTEM-BASIC ANNUITY Alternative Forms of Annuities § 842.702 Definitions. In this...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandler, S.I.
1986-01-01
The objective of the work is to use the generalized van der Waals theory, as derived earlier ("The Generalized van der Waals Partition Function I. Basic Theory" by S.I. Sandler, Fluid Phase Equilibria 19, 233 (1985)), to: (1) understand the molecular-level assumptions inherent in current thermodynamic models; (2) use theory and computer simulation studies to test these assumptions; and (3) develop new, improved thermodynamic models based on better molecular-level assumptions. From such a fundamental study, thermodynamic models will be developed that are applicable to mixtures of molecules of widely different size and functionality, as occur in the processing of heavy oils, coal liquids and other synthetic fuels. An important aspect of our work is to reduce our fundamental theoretical developments to engineering practice through extensive testing and evaluation with experimental data on real mixtures. During the first year of this project, important progress was made in the areas specified in the original proposal, as well as in several subsidiary areas identified as the work progressed. Some of this work has been written up and submitted for publication. Manuscripts acknowledging DOE support, together with a very brief description, are listed herein.
NASA Astrophysics Data System (ADS)
Brenner, Howard
2011-10-01
Linear irreversible thermodynamic principles are used to demonstrate, by counterexample, the existence of a fundamental incompleteness in the basic pre-constitutive mass, momentum, and energy equations governing fluid mechanics and transport phenomena in continua. The demonstration is effected by addressing the elementary case of steady-state heat conduction (and transport processes in general) occurring in quiescent fluids. The counterexample questions the universal assumption of equality of the four physically different velocities entering into the basic pre-constitutive mass, momentum, and energy conservation equations. Explicitly, it is argued that such equality is an implicit constitutive assumption rather than an established empirical fact of unquestioned authority. Such equality, if indeed true, would require formal proof of its validity, currently absent from the literature. In fact, our counterexample shows the assumption of equality to be false. As the current set of pre-constitutive conservation equations appearing in textbooks are regarded as applicable both to continua and noncontinua (e.g., rarefied gases), our elementary counterexample negating belief in the equality of all four velocities impacts on all aspects of fluid mechanics and transport processes, continua and noncontinua alike.
NASA Technical Reports Server (NTRS)
Timofeyev, Y. M.
1979-01-01
In order to estimate the error introduced by assumed values of the transmission function for Soviet and American radiometers used to sound the atmosphere thermally from orbiting satellites, the assumptions of the transmission calculation are varied with respect to atmospheric CO2 content, transmission frequency, and atmospheric absorption. The error arising from variations of these assumptions from the standard basic model is calculated.
ERIC Educational Resources Information Center
Bentz, Robert P.; And Others
The commuter institute is one to which students commute. The two basic assumptions of this study are: (1) the Chicago Circle campus of the University of Illinois will remain a commuter institution during the decade ahead; and (2) the campus will increasingly serve a more heterogeneous student body. These assumptions have important implications for…
Ling, Julia; Templeton, Jeremy Alan
2015-08-04
Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher-fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. As a result, feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.
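A minimal sketch of the classification idea, using one of the three evaluated algorithms (a random forest). The features and the labeling rule are invented stand-ins for the paper's flow features and assumption-breakdown labels.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 4))             # stand-ins for per-point flow features
y = (X[:, 0] * X[:, 1] > 0.5).astype(int)  # 1 = eddy viscosity assumption violated (toy rule)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```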
ERIC Educational Resources Information Center
Falk, Ruma; Kendig, Keith
2013-01-01
Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to an explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
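The crux of the debate is that the answer depends on how the family was selected; a short enumeration makes the two textbook scenarios explicit (a sketch of the standard analysis, not the article's own treatment).

```python
from itertools import product

pairs = list(product("BG", repeat=2))  # equally likely sibling pairs

# Scenario 1: we learn only that at least one child is a boy.
with_boy = [p for p in pairs if "B" in p]
print(sum(p == ("B", "B") for p in with_boy) / len(with_boy))   # 1/3

# Scenario 2: we meet one specific child (say the first) and it is a boy.
first_boy = [p for p in pairs if p[0] == "B"]
print(sum(p == ("B", "B") for p in first_boy) / len(first_boy)) # 1/2
```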
Social factors in space station interiors
NASA Technical Reports Server (NTRS)
Cranz, Galen; Eichold, Alice; Hottes, Klaus; Jones, Kevin; Weinstein, Linda
1987-01-01
Using the example of the chair, which is often written into space station planning but which serves no non-cultural function in zero gravity, difficulties in overcoming cultural assumptions are discussed. An experimental approach is called for which would allow designers to separate cultural assumptions from logistic, social and psychological necessities. Simulations, systematic doubt and monitored brainstorming are recommended as part of basic research so that the designer will approach the problems of space module design with a complete program.
The Eleventh Quadrennial Review of Military Compensation. Supporting Research Papers
2012-06-01
value. 4. BAH + BAS is roughly equal to expenditures for housing and food for servicemembers. In the first phase of the formal model, we further...assume that taxes, housing, and food are the only basic living expenses. Then, in the next phase, we include estimates of noncash benefits not included...assumption 4 with assumption 2 implies that civilian housing and food expenses are also equal to military BAH and BAS. However, civilian housing and food
Riddles of masculinity: gender, bisexuality, and thirdness.
Fogel, Gerald I
2006-01-01
Clinical examples are used to illuminate several riddles of masculinity (ambiguities, enigmas, and paradoxes in relation to gender, bisexuality, and thirdness) frequently seen in male patients. Basic psychoanalytic assumptions about male psychology are examined in the light of advances in female psychology, using ideas from feminist and gender studies as well as important and now widely accepted trends in contemporary psychoanalytic theory. By reexamining basic assumptions about heterosexual men, as has been done with ideas concerning women and homosexual men, complexity and nuance come to the fore to aid the clinician in treating the complex characterological pictures seen in men today. In a context of rapid historical and theoretical change, the use of persistent gender stereotypes and unnecessarily limiting theoretical formulations, though often unintended, may mask subtle countertransference and theoretical blind spots and limit optimal clinical effectiveness.
Interpretation of the results of statistical measurements. [search for basic probability model
NASA Technical Reports Server (NTRS)
Olshevskiy, V. V.
1973-01-01
For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional that defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters of a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.
NASA Astrophysics Data System (ADS)
Cannizzo, John K.
2017-01-01
We utilize the time-dependent accretion disk model described by Ichikawa & Osaki (1992) to explore two basic ideas for the outbursts in SU UMa systems: Osaki's thermal-tidal model and the basic accretion disk limit cycle model. We explore a range of possible input parameters and model assumptions to delineate under what conditions each model may be preferred.
NASA Astrophysics Data System (ADS)
Reid, J.; Polasky, S.; Hawthorne, P.
2014-12-01
Sustainable development requires providing for human well-being by meeting basic demands for food, energy, and consumer goods and services, all while maintaining an environment capable of sustaining the provisioning of those demands for future generations. Failure to meet the basic needs of human well-being is not an ethically viable option, and strategies exist for doubling agricultural production and providing energy and goods for a growing population. The question, however, is: at what cost to environmental quality? We developed an integrated modeling approach to test strategies for meeting multiple objectives within the limits of the earth system. We use scenarios to explore a range of assumptions on socio-economic factors such as population growth, per capita income, and technological change; food-system factors such as food waste, production intensification and expansion, and meat demand; and technological developments in energy efficiency and wastewater treatment. We use these scenarios to test the conditions under which the simultaneous goals of sustainable development can be met.
Investigation of new techniques for aircraft navigation using the omega navigation
NASA Technical Reports Server (NTRS)
Baxa, E. G., Jr.
1978-01-01
An OMEGA navigation receiver with a microprocessor as the computational component was investigated. A version of the INTEL 4004 microprocessor macroassembler suitable for use on the CDC-6600 system and a FORTRAN IV simulator program for the microprocessor were developed. Supporting studies included the development and evaluation of navigation algorithms to generate relative position information from OMEGA VLF phase measurements. Simulation studies were used to evaluate assumptions made in developing a navigation equation in OMEGA Line of Position (LOP) coordinates. Included in the navigation algorithms was a procedure for calculating a position in latitude/longitude given an OMEGA LOP fix. Implementation of a digital phase-locked loop (DPLL) was evaluated on the basis of phase response characteristics over a range of input phase variations. Also included is an analytical evaluation, on the basis of error probability, of an algorithm for automatic time synchronization of the receiver to the OMEGA broadcast format. The use of actual OMEGA phase data and published propagation prediction corrections to determine phase velocity estimates is discussed.
Determining Global Population Distribution: Methods, Applications and Data
Balk, D.L.; Deichmann, U.; Yetman, G.; Pozzi, F.; Hay, S.I.; Nelson, A.
2011-01-01
Evaluating the total numbers of people at risk from infectious disease in the world requires not just tabular population data, but data that are spatially explicit and global in extent at a moderate resolution. This review describes the basic methods for constructing estimates of global population distribution with attention to recent advances in improving both spatial and temporal resolution. To evaluate the optimal resolution for the study of disease, the native resolution of the data inputs as well as that of the resulting outputs are discussed. Assumptions used to produce different population data sets are also described, with their implications for the study of infectious disease. Lastly, the application of these population data sets in studies to assess disease distribution and health impacts is reviewed. The data described in this review are distributed in the accompanying DVD. PMID:16647969
Behavioral health at-risk contracting--a rate development and financial reporting guide.
Zinser, G R
1994-01-01
The process of developing rates for behavioral capitation contracts can seem mysterious and intimidating. The following article explains several key features of the method used to develop capitation rates. These include: (1) a basic understanding of the mechanics of rate calculation; (2) awareness of the variables to be considered and assumptions to be made; (3) a source of information to use as a basis for these assumptions; and (4) a system to collect detailed actual experience data.
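Feature (1), the basic mechanics of rate calculation, typically reduces to utilization times unit cost, converted to a per-member-per-month (PMPM) figure. The utilization and cost numbers below are invented placeholders, purely for illustration.

```python
# Assumed (hypothetical) utilization and unit-cost figures per 1,000 members.
annual_admits_per_1000 = 60
avg_cost_per_admit = 4500.0
outpatient_visits_per_1000 = 900
avg_cost_per_visit = 110.0

annual_cost_per_member = (
    annual_admits_per_1000 / 1000 * avg_cost_per_admit
    + outpatient_visits_per_1000 / 1000 * avg_cost_per_visit
)
pmpm_rate = annual_cost_per_member / 12  # per member per month
print(f"capitation rate: ${pmpm_rate:.2f} PMPM")
```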
1986-09-01
Throughout this study the following assumptions have been made. First, it is assumed that the reader has a basic familiarity with aircraft. Therefore...of the weapons acquisition process. Third, the assumption is made that most readers are familiar with U.S. procedures involving the sale of
A comparison between EGS4 and MCNP computer modeling of an in vivo X-ray fluorescence system.
Al-Ghorabie, F H; Natto, S S; Al-Lyhiani, S H
2001-03-01
The Monte Carlo computer codes EGS4 and MCNP were used to develop a theoretical model of a 180-degree geometry in vivo X-ray fluorescence system for the measurement of platinum concentration in head and neck tumors. The model included specification of the photon source, collimators, phantoms and detector. Theoretical results were compared and evaluated against X-ray fluorescence data obtained experimentally from an existing system developed by the Swansea In Vivo Analysis and Cancer Research Group. The EGS4 results agreed well with the MCNP results. However, agreement between the measured spectral shape obtained using the experimental X-ray fluorescence system and the simulated spectral shape obtained using the two Monte Carlo codes was relatively poor. The main reason for the disagreement arises from the basic assumptions the two codes use in their calculations: both codes assume a "free" electron model for Compton interactions. This assumption underestimates the results and invalidates comparisons of predicted and experimental spectra.
Ketcham, Jonathan D; Kuminoff, Nicolai V; Powers, Christopher A
2016-12-01
Consumers' enrollment decisions in Medicare Part D can be explained by Abaluck and Gruber’s (2011) model of utility maximization with psychological biases or by a neoclassical version of their model that precludes such biases. We evaluate these competing hypotheses by applying nonparametric tests of utility maximization and model validation tests to administrative data. We find that 79 percent of enrollment decisions from 2006 to 2010 satisfied basic axioms of consumer theory under the assumption of full information. The validation tests provide evidence against widespread psychological biases. In particular, we find that precluding psychological biases improves the structural model's out-of-sample predictions for consumer behavior.
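A minimal sketch of the kind of nonparametric revealed-preference test referred to above: a WARP-style pairwise check on observed prices and chosen bundles. The two-observation data set is invented; the paper's actual axioms, tests, and data are more elaborate.

```python
import numpy as np
from itertools import permutations

prices = np.array([[1.0, 2.0], [2.0, 1.0]])   # prices at two observations
bundles = np.array([[3.0, 1.0], [1.0, 3.0]])  # bundles chosen at those prices

def revealed_preferred(i, j):
    # Bundle i is directly revealed preferred to j if j was affordable
    # when i was chosen.
    return prices[i] @ bundles[j] <= prices[i] @ bundles[i]

# WARP violation: two distinct bundles each revealed preferred to the other.
violation = any(
    revealed_preferred(i, j) and revealed_preferred(j, i)
    and not np.allclose(bundles[i], bundles[j])
    for i, j in permutations(range(len(bundles)), 2)
)
print("WARP violated:", violation)
```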
The Central Registry for Child Abuse Cases: Rethinking Basic Assumptions
ERIC Educational Resources Information Center
Whiting, Leila
1977-01-01
Class data pools on abused and neglected children and their families are found desirable for program planning, but identification by name is of questionable value and possibly a dangerous invasion of civil liberties. (MS)
Self-transcendent positive emotions increase spirituality through basic world assumptions.
Van Cappellen, Patty; Saroglou, Vassilis; Iweins, Caroline; Piovesana, Maria; Fredrickson, Barbara L
2013-01-01
Spirituality has mostly been studied in psychology as implied in the process of overcoming adversity, being triggered by negative experiences, and providing positive outcomes. By reversing this pathway, we investigated whether spirituality may also be triggered by self-transcendent positive emotions, which are elicited by stimuli appraised as demonstrating higher good and beauty. In two studies, elevation and/or admiration were induced using different methods. These emotions were compared to two control groups, a neutral state and a positive emotion (mirth). Self-transcendent positive emotions increased participants' spirituality (Studies 1 and 2), especially for the non-religious participants (Study 1). Two basic world assumptions, i.e., belief in life as meaningful (Study 1) and in the benevolence of others and the world (Study 2) mediated the effect of these emotions on spirituality. Spirituality should be understood not only as a coping strategy, but also as an upward spiralling pathway to and from self-transcendent positive emotions.
Alfadl, Abubakr Abdelraouf; Ibrahim, Mohamed Izham Mohamed; Maraghi, Fatima Abdulla; Mohammad, Khadijah Shhab
2016-09-01
There are limited studies on consumer behaviour toward counterfeit products and the determining factors that motivate willingness to purchase counterfeit items. This study aimed to fill this literature gap by studying differences in individual ethical evaluations of counterfeit drug purchase and whether that ethical evaluation is affected by differences in income. It is hypothesized that individuals with lower/higher income make a more/less permissive evaluation of ethical responsibility regarding counterfeit drug purchase. To empirically test the research assumption, a comparison was made between people who live in the low-income country Sudan and people who live in the high-income country Qatar. The study employed a face-to-face structured interview survey methodology to collect data from 1,170 subjects, and the Sudanese and Qatari samples were compared using an independent t-test at an alpha level of 0.05, employing SPSS version 22.0. Sudanese and Qatari individuals were significantly different on all items. Sudanese individuals scored below 3 for all Awareness of Societal Consequences (ASC) items, indicating that they make a more permissive evaluation of ethical responsibility regarding counterfeit drug purchase. Both groups shared a basic positive moral agreement regarding subjective norm, indicating that an influence of income is not evident. Findings indicate that low-income individuals make a more permissive evaluation of ethical responsibility regarding counterfeit drug purchase when awareness of societal consequences is highlighted as a deterrent tool, while both low- and high-income individuals share a basic positive moral agreement when the subjective norm dimension is exploited to discourage unethical buying behaviour.
Li, Qiuping; Lin, Yi; Hu, Caiping; Xu, Yinghua; Zhou, Huiya; Yang, Liping; Xu, Yongyong
2016-12-01
The Hospital Anxiety and Depression Scale (HADS) is one of the most frequently used self-reported measures in cancer practice. The evidence for the construct validity of the HADS, however, remains inconclusive. The objective of this study is to evaluate the psychometric properties of the Chinese version of the HADS (C-HADS) in terms of construct validity, internal consistency reliability, and concurrent validity in dyads of Chinese cancer patients and their family caregivers. This was a cross-sectional study conducted in multiple centers: one hospital in each of the seven different administrative regions in China, from October 2014 to May 2015. A total of 641 dyads, consisting of cancer patients and family caregivers, completed a survey assessing their demographic and background information, anxiety and depression using the C-HADS, and quality of life (QOL) using the Chinese version of the SF-12. Data analysis methods included descriptive statistics, confirmatory factor analysis (CFA), and Pearson correlations. The two-factor model offered the best fit to the data in cancer patients, and the one-factor model an adequate fit in family caregivers. The comparison of the two-factor and single-factor models supports the basic assumption of a two-factor construct for the C-HADS. The overall scale and two subscales of the C-HADS had good internal consistency and acceptable concurrent validity in both cancer patients and family caregivers. The Chinese version of the HADS may be a reliable and valid screening tool, as indicated by its original two-factor structure. The finding supports the basic assumption of a two-factor construct of the HADS. Copyright © 2016 Elsevier Ltd. All rights reserved.
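For readers unfamiliar with the internal-consistency metric invoked here, a small self-contained sketch of Cronbach's alpha on simulated item responses. The item count and data are invented; the study itself used CFA and Pearson correlations on the actual C-HADS items.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents x items matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(4)
latent = rng.normal(size=(300, 1))                      # shared anxiety factor
responses = latent + rng.normal(0, 0.8, size=(300, 4))  # 4 correlated items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```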
Velocity Measurement by Scattering from Index of Refraction Fluctuations Induced in Turbulent Flows
NASA Technical Reports Server (NTRS)
Lading, Lars; Saffman, Mark; Edwards, Robert
1996-01-01
Induced phase screen scattering is defined as the scattering of light from weak index-of-refraction fluctuations induced by turbulence. The basic assumptions and requirements for induced phase screen scattering, including scale requirements, are presented.
Is Tissue the Issue? A Critique of SOMPA's Models and Tests.
ERIC Educational Resources Information Center
Goodman, Joan F.
1979-01-01
A critical view of the underlying theoretical rationale of the System of Multicultural Pluralistic Assessment (SOMPA) model for student assessment is presented. The critique is extensive and questions the basic assumptions of the model. (JKS)
Undergraduate Cross Registration.
ERIC Educational Resources Information Center
Grupe, Fritz H.
This report discusses various aspects of undergraduate cross-registration procedures, including the dimensions, values, roles and functions, basic assumptions, and facilitating and encouragment of cross-registration. Dimensions of cross-registration encompass financial exchange, eligibility, program limitations, type of grade and credit; extent of…
The Peace Movement: An Exercise in Micro-Macro Linkages.
ERIC Educational Resources Information Center
Galtung, Johan
1988-01-01
Contends that the basic assumption of the peace movement is the abuse of military power by the state. Argues that the peace movement is most effective through linkages with cultural, political, and economic forces in society. (BSR)
Assumptions at the philosophical and programmatic levels in evaluation.
Mertens, Donna M
2016-12-01
Stakeholders and evaluators hold a variety of assumptions at the philosophical, methodological, and programmatic levels. The use of a transformative philosophical framework is presented as a way for evaluators to become more aware of the implications of various assumptions made by themselves and program stakeholders. It is argued and demonstrated that evaluators who are aware of the assumptions that underlie their evaluation choices are able to provide useful support for stakeholders in examining the assumptions they hold with regard to the nature of the problem being addressed, the program designed to solve the problem, and the approach to evaluation that is appropriate in that context. Such an informed approach has the potential to lead to more appropriate and culturally responsive programs being implemented in ways that achieve the desired impacts, as well as to evaluation approaches that support effective solutions to intransigent social problems. These arguments are illustrated through examples of evaluations from multiple sectors; additional challenges are also identified. Copyright © 2016 Elsevier Ltd. All rights reserved.
Evaluating scaling models in biology using hierarchical Bayesian approaches
Price, Charles A; Ogle, Kiona; White, Ethan P; Weitz, Joshua S
2009-01-01
Theoretical models for allometric relationships between organismal form and function are typically tested by comparing a single predicted relationship with empirical data. Several prominent models, however, predict more than one allometric relationship, and comparisons among alternative models have not taken this into account. Here we evaluate several different scaling models of plant morphology within a hierarchical Bayesian framework that simultaneously fits multiple scaling relationships to three large allometric datasets. The scaling models include: inflexible universal models derived from biophysical assumptions (e.g. elastic similarity or fractal networks), a flexible variation of a fractal network model, and a highly flexible model constrained only by basic algebraic relationships. We demonstrate that variation in intraspecific allometric scaling exponents is inconsistent with the universal models, and that more flexible approaches that allow for biological variability at the species level outperform universal models, even when accounting for relative increases in model complexity. PMID:19453621
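For orientation, a single allometric relationship y = a * x^b is usually estimated by ordinary least squares on log-transformed data; the hierarchical Bayesian approach in the paper generalizes this by fitting multiple such relationships jointly across species. The toy data and exponent below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(1, 100, size=200)               # e.g. stem diameter
y = 2.0 * x**0.75 * rng.lognormal(0, 0.1, 200)  # e.g. mass, true exponent 0.75

# Fit log y = b log x + log a by least squares.
b, log_a = np.polyfit(np.log(x), np.log(y), 1)
print(f"estimated exponent b = {b:.3f} (true value 0.75)")
```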
Should antibacterials be deregulated?
Rovira, J; Figueras, M; Segú, J L
1998-05-01
Deregulation of antibacterials is a recurrent topic in the debate on pharmaceutical policy. This article focuses on one aspect of pharmaceutical regulation, namely the requirement of a medical prescription for purchasing antibacterials. However, a strategy of deregulation should not only concern the switch from prescription-only status to nonprescription status for a given drug, but should consider some complementary measures to minimise potentially harmful effects on health and costs. Risk-benefit and economic evaluations, which are possible approaches to assess the convenience of antibacterial deregulation, force the empirical evidence, the assumptions, as well as the value judgements on which the options are evaluated, to be made explicit. We outline the basic traits of an economic-evaluation approach to assess the issues related to the public interest and the feasibility of a deregulation policy. However, the answer cannot be a generic one, but should address the question for each particular country, and for each antibacterial and indication. Given the limitations of existing evidence on that issue, a tentative research agenda is also proposed.
A radiosity-based model to compute the radiation transfer of soil surface
NASA Astrophysics Data System (ADS)
Zhao, Feng; Li, Yuguang
2011-11-01
A good understanding of the interactions of electromagnetic radiation with the soil surface is important for further improvement of remote sensing methods. In this paper, a radiosity-based analytical model for soil Directional Reflectance Factor (DRF) distributions was developed and evaluated. The model was specifically dedicated to the study of radiation transfer for soil surfaces under tillage practices. The soil was abstracted as two-dimensional U-shaped or V-shaped geometric structures with periodic macroscopic variations. The roughness of the simulated surfaces was expressed as the ratio of height to width for the U- and V-shaped structures. The assumption was made that the shadowing of the soil surface, simulated by U- or V-shaped grooves, has a greater influence on the soil reflectance distribution than the scattering properties of basic soil particles of silt and clay. Another assumption was that the soil is a perfectly diffuse reflector at the microscopic level, which is a prerequisite for the application of the radiosity method. This radiosity-based analytical model was evaluated against a forward Monte Carlo ray-tracing model under the same structural scenes and identical spectral parameters. The statistics of the two models' BRF fitting results for several soil structures under the same conditions showed good agreement. Using the model, the physical mechanism of the soil bidirectional reflectance pattern was revealed.
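The radiosity method at the heart of the model reduces to a linear system: each facet's radiosity B equals its direct irradiance E plus the reflected flux received from the other facets, B = E + rho * F B, where F holds the form factors. The three-facet numbers below are invented for illustration, not the paper's groove geometry.

```python
import numpy as np

E = np.array([1.0, 0.2, 0.2])   # direct irradiance per facet (sunlit vs. shaded)
rho = 0.3                       # perfectly diffuse (Lambertian) soil reflectance
F = np.array([[0.0, 0.3, 0.3],  # form factors: fraction of light leaving facet i
              [0.3, 0.0, 0.4],  # that reaches facet j (hypothetical values)
              [0.3, 0.4, 0.0]])

B = np.linalg.solve(np.eye(3) - rho * F, E)  # solve (I - rho F) B = E
print("facet radiosities:", B.round(3))
```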
SW-846 Test Method 1340: In Vitro Bioaccessibility Assay for Lead in Soil
Describes assay procedures written on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.
Memory Errors in Alibi Generation: How an Alibi Can Turn Against Us.
Crozier, William E; Strange, Deryn; Loftus, Elizabeth F
2017-01-01
Alibis play a critical role in the criminal justice system. Yet research on the process of alibi generation and evaluation is still nascent. Indeed, similar to other widely investigated psychological phenomena in the legal system, such as false confessions, historical claims of abuse, and eyewitness memory, the basic assumptions underlying alibi generation and evaluation require closer empirical scrutiny. To date, the majority of alibi research investigates the social psychological aspects of the process. We argue that applying our understanding of basic human memory is critical to a complete understanding of the alibi process. Specifically, we challenge the use of alibi inconsistency as an indication of guilt by outlining the "cascading effects" that can put innocents at risk for conviction. We discuss how normal encoding and storage processes can pose problems at retrieval, particularly for innocent suspects, that can result in alibi inconsistencies over time. Those inconsistencies are typically misunderstood as intentional deception, first by law enforcement, affecting the investigation; then by prosecutors, affecting prosecution decisions; and finally by juries, ultimately affecting guilt judgments. Put differently, despite the universal nature of memory inconsistencies, a single error can produce a cascading effect, rendering an innocent individual's alibi, ironically, proof of guilt. Copyright © 2017 John Wiley & Sons, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, Curtis M.
2005-09-19
This report describes a screening and ranking framework (SRF) developed to evaluate potential geologic carbon dioxide (CO2) storage sites on the basis of health, safety, and environmental (HSE) risk arising from possible CO2 leakage. The approach is based on the assumption that HSE risk due to CO2 leakage is dependent on three basic characteristics of a geologic CO2 storage site: (1) the potential for primary containment by the target formation; (2) the potential for secondary containment if the primary formation leaks; and (3) the potential for attenuation and dispersion of leaking CO2 if the primary formation leaks and secondary containment fails. The framework is implemented in a spreadsheet in which users enter numerical scores representing expert opinions or general information available from published materials, along with estimates of uncertainty, to evaluate the three basic characteristics in order to screen and rank candidate sites. Application of the framework to the Rio Vista Gas Field, Ventura Oil Field, and Mammoth Mountain demonstrates the approach. Refinements and extensions are possible through the use of more detailed data or model results in place of property proxies. Revisions and extensions to improve the approach are anticipated in the near future as it is used and tested by colleagues and collaborators.
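A schematic of the spreadsheet-style scoring the framework describes: expert scores with uncertainties for the three basic site characteristics, combined into a single value used for screening and ranking. The weights and site numbers are invented placeholders, not values from the report.

```python
import numpy as np

# site: ([primary, secondary, attenuation] mean scores 0-10, +/- uncertainties)
sites = {
    "Site A": ([8, 6, 7], [1, 2, 1]),
    "Site B": ([5, 9, 4], [2, 1, 2]),
}
weights = np.array([0.5, 0.3, 0.2])  # hypothetical relative importance

for name, (scores, unc) in sites.items():
    mean = weights @ np.array(scores, dtype=float)
    spread = weights @ np.array(unc, dtype=float)
    print(f"{name}: score {mean:.1f} +/- {spread:.1f}")
```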
Multispectral processing based on groups of resolution elements
NASA Technical Reports Server (NTRS)
Richardson, W.; Gleason, J. M.
1975-01-01
Several nine-point rules are defined and compared with previously studied rules. One of the rules performed well in boundary areas, but with reduced efficiency in field interiors; another combined best performance on field interiors with good sensitivity to boundary detail. The basic threshold gradient and some modifications were investigated as a means of boundary point detection. The hypothesis testing methods of closed-boundary formation were also tested and evaluated. An analysis of the boundary detection problem was initiated, employing statistical signal detection and parameter estimation techniques to analyze various formulations of the problem. These formulations permit the atmospheric and sensor system effects on the data to be thoroughly analyzed. Various boundary features and necessary assumptions can also be investigated in this manner.
Perkel, R L
1996-03-01
Managed care presents physicians with potential ethical dilemmas different from dilemmas in traditional fee-for-service practice. The ethical assumptions of managed care are explored, with special attention to the evolving dual responsibilities of physicians as patient advocates and as entrepreneurs. A number of proposals are described that delineate issues in support of and in opposition to managed care. Through an understanding of how to apply basic ethics principles to managed care participation, physicians may yet hold on to the basic ethic of the fiduciary doctor-patient relationship.
1981-09-01
corresponds to the same square footage that consumed the electrical energy. 3. The basic assumptions of multiple linear regression, as enumerated in...7. Data related to the sample of bases is assumed to be representative of bases in the population. Limitations Basic limitations on this research were...Ratemaking--Overview. Rand Report R-5894, Santa Monica CA, May 1977. Chatterjee, Samprit, and Bertram Price. Regression Analysis by Example. New York: John
Hazards and occupational risk in hard coal mines - a critical analysis of legal requirements
NASA Astrophysics Data System (ADS)
Krause, Marcin
2017-11-01
This publication concerns the problems of occupational safety and health in hard coal mines, the basic elements of which are the mining hazards and the occupational risk. The work includes a comparative analysis of selected provisions of general and industry-specific law regarding the analysis of hazards and occupational risk assessment. Based on a critical analysis of legal requirements, basic assumptions regarding the practical guidelines for occupational risk assessment in underground coal mines have been proposed.
ERIC Educational Resources Information Center
Collins, Michael
1989-01-01
Describes a Canadian curriculum development project; analyzes underlying policy assumptions. Advocates involvement of prison educators and inmates in the process if curriculum is to meet the educational needs of inmates. (Author/LAM)
Computer Applications in Teaching and Learning.
ERIC Educational Resources Information Center
Halley, Fred S.; And Others
Some examples of the usage of computers in teaching and learning are examination generation, automatic exam grading, student tracking, problem generation, computational examination generators, program packages, simulation, and programing skills for problem solving. These applications are non-trivial and do fulfill the basic assumptions necessary…
Probabilistic Simulation of Territorial Seismic Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baratta, Alessandro; Corbi, Ileana
2008-07-08
The paper is focused on a stochastic process for the prevision of seismic scenarios on the territory, developed by means of some basic assumptions in the procedure and by elaborating the fundamental parameters recorded during ground motions that occurred in a seismic area.
Elements of a Research Report.
ERIC Educational Resources Information Center
Schurter, William J.
This guide for writing research or technical reports discusses eleven basic elements of such reports and provides examples of "good" and "bad" wordings. These elements are the title, problem statement, purpose statement, need statement, hypothesis, assumptions, procedures, limitations, terminology, conclusion and recommendations. This guide is…
The Case for a Hierarchical Cosmology
ERIC Educational Resources Information Center
Vaucouleurs, G. de
1970-01-01
The development of modern theoretical cosmology is presented and some questionable assumptions of orthodox cosmology are pointed out. Suggests that recent observations indicate that hierarchical clustering is a basic factor in cosmology. The implications of hierarchical models of the universe are considered. Bibliography. (LC)
The Estimation Theory Framework of Data Assimilation
NASA Technical Reports Server (NTRS)
Cohn, S.; Atlas, Robert (Technical Monitor)
2002-01-01
Lecture 1. The Estimation Theory Framework of Data Assimilation: 1. The basic framework: dynamical and observation models; 2. Assumptions and approximations; 3. The filtering, smoothing, and prediction problems; 4. Discrete Kalman filter and smoother algorithms; and 5. Example: A retrospective data assimilation system
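Item 4 of the outline names the discrete Kalman filter; a minimal scalar sketch of its predict/update cycle follows, using a toy random-walk state observed with noise rather than an example from the lecture itself.

```python
import numpy as np

F, H = 1.0, 1.0    # dynamical and observation models (scalar)
Q, R = 1e-3, 1e-1  # process and observation noise variances

rng = np.random.default_rng(2)
truth = np.cumsum(rng.normal(0, np.sqrt(Q), 100))  # random-walk state
obs = truth + rng.normal(0, np.sqrt(R), 100)       # noisy observations

x_hat, P = 0.0, 1.0  # initial estimate and its variance
for z in obs:
    x_hat, P = F * x_hat, F * P * F + Q            # predict
    K = P * H / (H * P * H + R)                    # Kalman gain
    x_hat, P = x_hat + K * (z - H * x_hat), (1 - K * H) * P  # update

print(f"final estimate {x_hat:.3f}, truth {truth[-1]:.3f}")
```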
Duarte, Adam; Adams, Michael J.; Peterson, James T.
2018-01-01
Monitoring animal populations is central to wildlife and fisheries management, and the use of N-mixture models toward these efforts has markedly increased in recent years. Nevertheless, relatively little work has evaluated estimator performance when basic assumptions are violated. Moreover, diagnostics to identify when bias in parameter estimates from N-mixture models is likely is largely unexplored. We simulated count data sets using 837 combinations of detection probability, number of sample units, number of survey occasions, and type and extent of heterogeneity in abundance or detectability. We fit Poisson N-mixture models to these data, quantified the bias associated with each combination, and evaluated if the parametric bootstrap goodness-of-fit (GOF) test can be used to indicate bias in parameter estimates. We also explored if assumption violations can be diagnosed prior to fitting N-mixture models. In doing so, we propose a new model diagnostic, which we term the quasi-coefficient of variation (QCV). N-mixture models performed well when assumptions were met and detection probabilities were moderate (i.e., ≥0.3), and the performance of the estimator improved with increasing survey occasions and sample units. However, the magnitude of bias in estimated mean abundance with even slight amounts of unmodeled heterogeneity was substantial. The parametric bootstrap GOF test did not perform well as a diagnostic for bias in parameter estimates when detectability and sample sizes were low. The results indicate the QCV is useful to diagnose potential bias and that potential bias associated with unidirectional trends in abundance or detectability can be diagnosed using Poisson regression. This study represents the most thorough assessment to date of assumption violations and diagnostics when fitting N-mixture models using the most commonly implemented error distribution. Unbiased estimates of population state variables are needed to properly inform management decision making. Therefore, we also discuss alternative approaches to yield unbiased estimates of population state variables using similar data types, and we stress that there is no substitute for an effective sample design that is grounded upon well-defined management objectives.
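The data-generating process that N-mixture models assume, latent site abundances observed through imperfect detection, is easy to simulate, which also clarifies where unmodeled heterogeneity enters. The sketch below adds a crude CV-style site summary in the spirit of the proposed QCV; the actual QCV definition is given in the paper, so the formula here is only an assumed illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sites, n_visits, lam, p = 200, 4, 5.0, 0.4

N = rng.poisson(lam, n_sites)                              # latent abundance per site
counts = rng.binomial(N[:, None], p, (n_sites, n_visits))  # repeated counts

# A CV-like summary of site-level count variability relative to the mean
# (illustrative stand-in for the paper's quasi-coefficient of variation).
site_means = counts.mean(axis=1)
print(f"CV-style summary: {site_means.std() / site_means.mean():.2f}")
```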
Sawan, Mouna; Jeon, Yun-Hee; Chen, Timothy F
2018-03-01
Psychotropic medicines are commonly used in nursing homes, despite marginal clinical benefits and association with harm in the elderly. Organizational culture is proposed as a factor explaining the high-level use of psychotropic medicines. Schein describes three levels of culture: artifacts, espoused values, and basic assumptions. This integrative review aimed to investigate the facets and role of organizational culture in the use of psychotropic medicines in nursing homes. Five databases were searched for qualitative, quantitative, and mixed method empirical studies up to 13 February 2017. Articles were included if they examined an aspect of organizational culture according to Schein's theory and the use of psychotropic medicines in nursing homes for the management of behavioral and sleep disturbances in residents. Article screening and data extraction were performed independently by one reviewer and checked by the research team. The integrative review method, an approach similar to the method of constant comparison analysis was utilized for data analysis. Twenty-four studies met the inclusion criteria: 13 used quantitative methods, 9 used qualitative methods, 1 was quasi-qualitative, and 1 used mixed methods. Included studies were found to only address two aspects of organizational culture in relation to the use of psychotropic medicines: artifacts and espoused values. No studies addressed the basic assumptions, the unsaid taken-for-granted beliefs, which provide explanations for in/consistencies between the ideal use of psychotropic medicines and the actual use of psychotropic medicines. Previous studies suggest that organizational culture influences the use of psychotropic medicines in nursing homes; however, what is known is descriptive of culture only at the surface level, that is the artifacts and espoused values. Hence, future research that explains the impact of the basic assumptions of culture on the use of psychotropic medicines is important.
Development of state and transition model assumptions used in National Forest Plan revision
Eric B. Henderson
2008-01-01
State and transition models are being utilized in forest management analysis processes to evaluate assumptions about disturbances and succession. These models assume valid information about seral class successional pathways and timing. The Forest Vegetation Simulator (FVS) was used to evaluate seral class succession assumptions for the Hiawatha National Forest in...
Science for managing ecosystem services: Beyond the Millennium Ecosystem Assessment
Carpenter, Stephen R.; Mooney, Harold A.; Agard, John; Capistrano, Doris; DeFries, Ruth S.; Díaz, Sandra; Dietz, Thomas; Duraiappah, Anantha K.; Oteng-Yeboah, Alfred; Pereira, Henrique Miguel; Perrings, Charles; Reid, Walter V.; Sarukhan, José; Scholes, Robert J.; Whyte, Anne
2009-01-01
The Millennium Ecosystem Assessment (MA) introduced a new framework for analyzing social–ecological systems that has had wide influence in the policy and scientific communities. Studies after the MA are taking up new challenges in the basic science needed to assess, project, and manage flows of ecosystem services and effects on human well-being. Yet, our ability to draw general conclusions remains limited by focus on discipline-bound sectors of the full social–ecological system. At the same time, some polices and practices intended to improve ecosystem services and human well-being are based on untested assumptions and sparse information. The people who are affected and those who provide resources are increasingly asking for evidence that interventions improve ecosystem services and human well-being. New research is needed that considers the full ensemble of processes and feedbacks, for a range of biophysical and social systems, to better understand and manage the dynamics of the relationship between humans and the ecosystems on which they rely. Such research will expand the capacity to address fundamental questions about complex social–ecological systems while evaluating assumptions of policies and practices intended to advance human well-being through improved ecosystem services. PMID:19179280
Spectral properties of blast-wave models of gamma-ray burst sources
NASA Technical Reports Server (NTRS)
Meszaros, P.; Rees, M. J.; Papathanassiou, H.
1994-01-01
We calculate the spectrum of blast-wave models of gamma-ray burst sources, for various assumptions about the magnetic field density and the relativistic particle acceleration efficiency. For a range of physically plausible models we find that the radiation efficiency is high and leads to nonthermal spectra with breaks at various energies comparable to those observed in the gamma-ray range. Radiation is also predicted at other wavebands, in particular at X-ray, optical/UV, and GeV/TeV energies. We discuss the spectra as a function of duration for three basic types of models, and for cosmological, halo, and galactic disk distances. We also evaluate the gamma-ray fluences and the spectral characteristics for a range of external densities. Impulsive burst models at cosmological distances can satisfy the conventional X-ray paucity constraint S_X/S_gamma less than a few percent over a wide range of durations, but galactic models can do so only for bursts shorter than a few seconds, unless additional assumptions are made. The emissivity is generally larger for bursts in a denser external environment, with the efficiency increasing up to the point where all the energy input is radiated away.
Model-based analysis of keratin intermediate filament assembly
NASA Astrophysics Data System (ADS)
Martin, Ines; Leitner, Anke; Walther, Paul; Herrmann, Harald; Marti, Othmar
2015-09-01
The cytoskeleton of epithelial cells consists of three types of filament systems: microtubules, actin filaments and intermediate filaments (IFs). Here, we took a closer look at type I and type II IF proteins, i.e. keratins. They are hallmark constituents of epithelial cells and are responsible for the generation of stiffness, the cellular response to mechanical stimuli and the integrity of entire cell layers. Thereby, keratin networks constitute an important instrument for cells to adapt to their environment. In particular, we applied models to characterize the assembly of keratin K8 and K18 into elongated filaments as a means for network formation. For this purpose, we measured the length of in vitro assembled keratin K8/K18 filaments by transmission electron microscopy at different time points. We evaluated the experimental data of the longitudinal annealing reaction using two models from polymer chemistry: the Schulz-Zimm model and the condensation polymerization model. In both scenarios one has to make assumptions about the reaction process. We compare how well the models fit the measured data and thus determine which assumptions fit best. Based on mathematical modelling of experimental filament assembly data we define basic mechanistic properties of the elongation reaction process.
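One of the two evaluated models, condensation polymerization, has a classical closed form (Flory's most probable distribution): with extent of reaction p, the number fraction of n-mers is (1-p)p^(n-1) and the mean length is 1/(1-p). A short numerical check, with an illustrative p rather than a value fitted to the keratin data:

```python
import numpy as np

p = 0.95                                 # fraction of reacted ends (hypothetical)
n = np.arange(1, 200)
number_fraction = (1 - p) * p**(n - 1)   # most probable distribution

mean_length = (n * number_fraction).sum() / number_fraction.sum()
print(f"mean degree of polymerization ~ {mean_length:.1f} (theory: {1/(1-p):.1f})")
```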
ERIC Educational Resources Information Center
Resnick, Lauren B.; And Others
This paper discusses a radically different set of assumptions to improve educational outcomes for disadvantaged students. It is argued that disadvantaged children, when exposed to carefully organized thinking-oriented instruction, can acquire the traditional basic skills in the process of reasoning and solving problems. The paper is presented in…
Measurement of Inequality: The Gini Coefficient and School Finance Studies.
ERIC Educational Resources Information Center
Lows, Raymond L.
1984-01-01
Discusses application of the "Lorenz Curve" (a graphical representation of the concentration of wealth) with the "Gini Coefficient" (an index of inequality) to measure social inequality in school finance studies. Examines the basic assumptions of these measures and suggests a minor reconception. (MCG)
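The two measures named here are easy to compute directly; a minimal sketch, with invented per-pupil expenditure figures:

```python
# Sketch: Lorenz curve ordinates and Gini coefficient for school-finance
# equity data. The district spending figures are invented for illustration.
import numpy as np

def gini(values):
    """Gini on sorted data: G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n."""
    x = np.sort(np.asarray(values, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)
    return 2.0 * np.sum(i * x) / (n * x.sum()) - (n + 1.0) / n

districts = [4200, 4800, 5100, 6000, 9500]   # per-pupil spending ($), invented
x = np.sort(np.array(districts, dtype=float))
lorenz = np.cumsum(x) / x.sum()              # Lorenz curve ordinates
print("Lorenz:", np.round(lorenz, 3))
print(f"Gini = {gini(districts):.3f}")       # 0 = perfect equality
```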
Beyond the Virtues-Principles Debate.
ERIC Educational Resources Information Center
Keat, Marilyn S.
1992-01-01
Indicates basic ontological assumptions in the virtues-principles debate in moral philosophy, noting Aristotle's and Kant's fundamental ideas about morality and considering a hermeneutic synthesis of theories. The article discusses what acceptance of the synthesis might mean in the theory and practice of moral pedagogy, offering examples of…
The Structuring Principle: Political Socialization and Belief Systems
ERIC Educational Resources Information Center
Searing, Donald D.; And Others
1973-01-01
Assesses the significance of data on childhood political learning to political theory by testing the "structuring principle," considered one of the central assumptions of political socialization research. This principle asserts that "basic orientations acquired during childhood structure the later learning of specific issue beliefs." The…
ERIC Educational Resources Information Center
Hastings, Elizabeth
1981-01-01
The author outlines the experiences of disability and demonstrates that generally unpleasant experiences are the direct result of a basic and false assumption on the part of society. Experiences of the disabled are discussed in areas the author categorizes as exclusion or segregation, deprivation, prejudice, poverty, frustration, and…
Some Remarks on the Theory of Political Education. German Studies Notes.
ERIC Educational Resources Information Center
Holtmann, Antonius
This theoretical discussion explores pedagogical assumptions of political education in West Germany. Three major methodological orientations are discussed: the normative-ontological, empirical-analytical, and dialectical-historical. The author recounts the aims, methods, and basic presuppositions of each of these approaches. Topics discussed…
Assessment of the Natural Environment.
ERIC Educational Resources Information Center
Cantrell, Mary Lynn; Cantrell, Robert P.
1985-01-01
Basic assumptions of an ecological-behavioral view of assessing behavior disordered students are reviewed along with a proposed method for ecological analysis and specific techniques for measuring ecological variables (such as environmental units, behaviors of significant others, and behavioral expectations). The use of such information in program…
Sherlock Holmes as a Social Scientist.
ERIC Educational Resources Information Center
Ward, Veronica; Orbell, John
1988-01-01
Presents a way of teaching the scientific method through studying the adventures of Sherlock Holmes. Asserting that Sherlock Holmes used the scientific method to solve cases, the authors construct Holmes' method through excerpts from novels featuring his adventures. Discusses basic assumptions, paradigms, theory building, and testing. (SLM)
Basic principles of respiratory function monitoring in ventilated newborns: A review.
Schmalisch, Gerd
2016-09-01
Respiratory monitoring during mechanical ventilation provides a real-time picture of patient-ventilator interaction and is a prerequisite for lung-protective ventilation. Nowadays, measurements of airflow, tidal volume and applied pressures are standard in neonatal ventilators. The measurement of lung volume during mechanical ventilation by tracer gas washout techniques is still under development. The clinical use of capnography, although well established in adults, has not been embraced by neonatologists because of technical and methodological problems in very small infants. While the ventilatory parameters are well defined, the calculation of other physiological parameters is based upon specific assumptions that are difficult to verify. Incomplete knowledge of the theoretical background of these calculations and their limitations can lead to incorrect interpretations with clinical consequences. Therefore, the aim of this review was to describe the basic principles and the underlying assumptions of currently used methods for respiratory function monitoring in ventilated newborns and to highlight methodological limitations. Copyright © 2016 Elsevier Ltd. All rights reserved.
Lectures on Dark Matter Physics
NASA Astrophysics Data System (ADS)
Lisanti, Mariangela
Rotation curve measurements from the 1970s provided the first strong indication that a significant fraction of matter in the Universe is non-baryonic. In the intervening years, a tremendous amount of progress has been made on both the theoretical and experimental fronts in the search for this missing matter, which we now know constitutes nearly 85% of the Universe's matter density. This series of lectures provides an introduction to the basics of dark matter physics. It is geared toward the advanced undergraduate or graduate student interested in pursuing research in high-energy physics. The primary goal is to build an understanding of how observations constrain the assumptions that can be made about the astro- and particle physics properties of dark matter. The lectures begin by delineating the basic assumptions that can be inferred about dark matter from rotation curves. A detailed discussion of thermal dark matter follows, motivating Weakly Interacting Massive Particles, as well as lighter-mass alternatives. As an application of these concepts, the phenomenology of direct and indirect detection experiments is discussed in detail.
Testing the basic assumption of the hydrogeomorphic approach to assessing wetland functions.
Hruby, T
2001-05-01
The hydrogeomorphic (HGM) approach for developing "rapid" wetland function assessment methods stipulates that the variables used are to be scaled based on data collected at sites judged to be the best at performing the wetland functions (reference standard sites). A critical step in the process is to choose the least altered wetlands in a hydrogeomorphic subclass to use as a reference standard against which other wetlands are compared. The basic assumption made in this approach is that wetlands judged to have had the least human impact have the highest level of sustainable performance for all functions. The levels at which functions are performed in these least altered wetlands are assumed to be "characteristic" for the subclass and "sustainable." Results from data collected in wetlands in the lowlands of western Washington suggest that the assumption may not be appropriate for this region. Teams developing methods for assessing wetland functions did not find that the least altered wetlands in a subclass had a range of performance levels that could be identified as "characteristic" or "sustainable." Forty-four wetlands in four hydrogeomorphic subclasses (two depressional subclasses and two riverine subclasses) were rated by teams of experts on the severity of their human alterations and on the level of performance of 15 wetland functions. An ordinal scale of 1-5 was used to quantify alterations in water regime, soils, vegetation, buffers, and contributing basin. Performance of functions was judged on an ordinal scale of 1-7. Relatively unaltered wetlands were judged to perform individual functions at levels that spanned all of the seven possible ratings in all four subclasses. The basic assumption of the HGM approach, that the least altered wetlands represent "characteristic" and "sustainable" levels of functioning that are different from those found in altered wetlands, was not confirmed. Although the intent of the HGM approach is to use level of functioning as a metric to assess the ecological integrity or "health" of the wetland ecosystem, the metric does not seem to work in western Washington for that purpose.
Nechtelberger, Andrea; Renner, Walter; Nechtelberger, Martin; Supeková, Soňa Chovanová; Hadjimarkou, Maria; Offurum, Chino; Ramalingam, Panchalan; Senft, Birgit; Redfern, Kylie
2017-01-01
The United Nations Academic Impact (UNAI) Initiative has set forth 10 Basic Principles for higher education. In the present study, a 10-item self-report questionnaire measuring personal endorsement of these principles was tested with university and post-graduate students from Austria, China, Cyprus, India, Nigeria, and Slovakia (total N = 976, N = 627 female, mean age 24.7 years, s = 5.7). Starting from the assumptions of Moral Foundations Theory (MFT), we expected that personal attitudes toward the UNAI Basic Principles would be predicted by endorsement of various moral foundations as suggested by MFT and by the individual's degree of globalization. Whereas for the Austrian, Cypriot, and Nigerian sub-samples this assumption was largely confirmed, for the Chinese, Indian, and Slovak sub-samples only small amounts of the variance could be explained by regression models. All six sub-samples differed substantially with regard to their overall questionnaire responses: by five discriminant functions 83.6% of participants were classified correctly. We conclude that implementation of UNAI principles should adhere closely to the cultural requirements of the respective society and, where necessary, should be accompanied by thorough informational campaigns about UN educational goals. PMID:29180977
Aspects of fluency in writing.
Uppstad, Per Henning; Solheim, Oddny Judith
2007-03-01
The notion of 'fluency' is most often associated with spoken-language phenomena such as stuttering. The present article investigates the relevance of considering fluency in writing. The basic argument for raising this question is empirical: it follows from a focus on difficulties in written and spoken language as manifestations of different problems which should be investigated separately on the basis of their symptoms. Key-logging instruments provide new possibilities for the study of writing. The obvious use of this new technology is to study writing as it unfolds in real time, instead of focusing only on aspects of the end product. A more sophisticated application is to exploit the key-logging instrument in order to test basic assumptions of contemporary theories of spelling. The present study is a dictation task involving words and non-words, intended to investigate spelling in nine-year-old pupils with regard to their mastery of the doubling of consonants in Norwegian. In this study, we report on differences with regard to temporal measures between a group of strong writers and a group of poor ones. On the basis of these pupils' writing behavior, the relevance of the concept of 'fluency' in writing is highlighted. The interpretation of the results questions basic assumptions of the cognitive hypothesis about spelling; the article concludes by hypothesizing a different conception of spelling.
Calculation of Temperature Rise in Calorimetry.
ERIC Educational Resources Information Center
Canagaratna, Sebastian G.; Witt, Jerry
1988-01-01
Gives a simple but fuller account of the basis for accurately calculating temperature rise in calorimetry. Points out some misconceptions regarding these calculations. Describes two basic methods, the extrapolation to zero time and the equal area method. Discusses the theoretical basis of each and their underlying assumptions. (CW)
Helicopter Toy and Lift Estimation
ERIC Educational Resources Information Center
Shakerin, Said
2013-01-01
A $1 plastic helicopter toy (called a Wacky Whirler) can be used to demonstrate lift. Students can make basic measurements of the toy, use reasonable assumptions and, with the lift formula, estimate the lift, and verify that it is sufficient to overcome the toy's weight. (Contains 1 figure.)
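A back-of-the-envelope sketch of the estimate described, in which every measurement (rotor size, spin rate, mass, lift coefficient) is an assumed value rather than one taken from the article:

```python
# Sketch: estimate the lift of a spinning toy rotor with L = 1/2 rho v^2 A C_L
# and compare it to the toy's weight. All inputs are assumed values.
import math

rho = 1.2          # air density, kg/m^3
R = 0.10           # rotor radius, m (assumed)
chord = 0.02       # blade chord, m (assumed)
n_blades = 2
omega = 150.0      # spin rate, rad/s (assumed)
C_L = 0.8          # lift coefficient (assumed)
mass = 0.005       # toy mass, kg (assumed)

v = 0.7 * R * omega                  # representative blade speed at ~70% radius
A = n_blades * chord * R             # total blade planform area
lift = 0.5 * rho * v**2 * A * C_L    # lift formula
weight = mass * 9.81

print(f"lift = {lift:.3f} N vs weight = {weight:.3f} N")
# With these assumed values: lift ~ 0.21 N, weight ~ 0.05 N, so the
# estimate is indeed sufficient to overcome the toy's weight.
```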
The Rural School Principalship: Unique Challenges, Opportunities.
ERIC Educational Resources Information Center
Hill, Jonathan
1993-01-01
Presents findings based on author's research and experience as principal in California's Mojave Desert. Five basic characteristics distinguish the rural principalship: lack of an assistant principal or other support staff; assumption of other duties, including central office tasks, teaching, or management of another site; less severe student…
Teacher Education: Of the People, by the People, and for the People.
ERIC Educational Resources Information Center
Clinton, Hillary Rodham
1985-01-01
Effective inservice programs are necessary to ensure that current reforms in education are properly implemented. Inservice programs must meet the needs of both the educational system and educators. Six basic policy assumptions dealing with what is needed in inservice education are discussed. (DF)
School Discipline Disproportionality: Culturally Competent Interventions for African American Males
ERIC Educational Resources Information Center
Simmons-Reed, Evette A.; Cartledge, Gwendolyn
2014-01-01
Exclusionary policies are practiced widely in schools despite being associated with extremely poor outcomes for culturally and linguistically diverse students, particularly African American males with and without disabilities. This article discusses zero tolerance policies, the related research questioning their basic assumptions, and the negative…
General Nature of Multicollinearity in Multiple Regression Analysis.
ERIC Educational Resources Information Center
Liu, Richard
1981-01-01
Discusses multiple regression, a very popular statistical technique in the field of education. One of the basic assumptions in regression analysis requires that independent variables in the equation should not be highly correlated. The problem of multicollinearity and some of the solutions to it are discussed. (Author)
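As an illustration of the problem described, a minimal sketch that diagnoses multicollinearity with variance inflation factors on simulated data (variable names and values are invented):

```python
# Sketch: VIF(j) = 1 / (1 - R^2) from regressing predictor j on the others.
# Large VIFs (rule of thumb: > 10) signal a multicollinearity problem.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.05 * rng.normal(size=n)   # nearly collinear with x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

def vif(X, j):
    """Variance inflation factor of column j."""
    y = X[:, j]
    Z = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    r2 = 1.0 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

for j in range(X.shape[1]):
    print(f"VIF(x{j+1}) = {vif(X, j):.1f}")
```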
Feminism, Communication and the Politics of Knowledge.
ERIC Educational Resources Information Center
Gallagher, Margaret
Recent retrieval of pre-nineteenth century feminist thought provides a telling lesson in the politics of knowledge creation and control. From a feminist perspective, very little research carried out within the critical research paradigm questions the "basic assumptions, conventional wisdom, media myths and the accepted way of doing…
A Neo-Kohlbergian Approach to Morality Research.
ERIC Educational Resources Information Center
Rest, James R.; Narvaez, Darcia; Thoma, Stephen J.; Bebeau, Muriel J.
2000-01-01
Proposes a model of moral judgment that builds on Lawrence Kohlberg's core assumptions. Addresses the concerns that have surfaced related to Kohlberg's work in moral judgment. Presents an overview of this model using Kohlberg's basic starting points, ideas from cognitive science, and developments in moral philosophy. (CMK)
Reconciling Time, Space and Function: A New Dorsal-Ventral Stream Model of Sentence Comprehension
ERIC Educational Resources Information Center
Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias
2013-01-01
We present a new dorsal-ventral stream framework for language comprehension which unifies basic neurobiological assumptions (Rauschecker & Scott, 2009) with a cross-linguistic neurocognitive sentence comprehension model (eADM; Bornkessel & Schlesewsky, 2006). The dissociation between (time-dependent) syntactic structure-building and…
Qualitative Research in Counseling Psychology: Conceptual Foundations
ERIC Educational Resources Information Center
Morrow, Susan L.
2007-01-01
Beginning with calls for methodological diversity in counseling psychology, this article addresses the history and current state of qualitative research in counseling psychology. It identifies the historical and disciplinary origins as well as basic assumptions and underpinnings of qualitative research in general, as well as within counseling…
Intergenerational resource transfers with random offspring numbers
Arrow, Kenneth J.; Levin, Simon A.
2009-01-01
A problem common to biology and economics is the transfer of resources from parents to children. We consider the issue under the assumption that the number of offspring is unknown and can be represented as a random variable. There are 3 basic assumptions. The first assumption is that a given body of resources can be divided into consumption (yielding satisfaction) and transfer to children. The second assumption is that the parents' welfare includes a concern for the welfare of their children; this is recursive in the sense that the children's welfares include concern for their children and so forth. However, the welfare of a child from a given consumption is counted somewhat differently (generally less) than that of the parent (the welfare of a child is “discounted”). The third assumption is that resources transferred may grow (or decline). In economic language, investment, including that in education or nutrition, is productive. Under suitable restrictions, precise formulas for the resulting allocation of resources are found, demonstrating that, depending on the shape of the utility curve, uncertainty regarding the number of offspring may or may not favor increased consumption. The results imply that wealth (stock of resources) will ultimately have a log-normal distribution. PMID:19617553
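The recursive welfare structure described above can be rendered schematically. The notation below is ours, not the authors', and the functional form is a sketch under assumed ingredients (utility u, discount factor beta < 1, growth factor r on transfers, random offspring number N):

```latex
% Schematic only: a parent with resources w splits them between consumption c
% and transfers; transfers grow by r, are divided among N (random) children,
% and children's welfare enters recursively with discount beta.
\[
  V(w) \;=\; \max_{0 \le c \le w}\;
  u(c) \;+\; \beta\, \mathbb{E}_{N}\!\left[\, N\, V\!\left(\frac{r\,(w - c)}{N}\right) \right]
\]
```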
Telepresence for space: The state of the concept
NASA Technical Reports Server (NTRS)
Smith, Randy L.; Gillan, Douglas J.; Stuart, Mark A.
1990-01-01
The purpose here is to examine the concept of telepresence critically. To accomplish this goal, first, the assumptions that underlie telepresence and its applications are examined, and second, the issues raised by that examination are discussed. Also, these assumptions and issues are used as a means of shifting the focus in telepresence from development to user-based research. The most basic assumption of telepresence is that the information being provided to the human must be displayed in a natural fashion, i.e., the information should be displayed to the same human sensory modalities, and in the same fashion, as if the person were actually at the remote site. A further fundamental assumption for the functional use of telepresence is that a sense of being present in the work environment will produce superior performance. In other words, that sense of being there would allow the human operator of a distant machine to take greater advantage of his or her considerable perceptual, cognitive, and motor capabilities in the performance of a task than would more limited task-related feedback. Finally, a third fundamental assumption of functional telepresence is that the distant machine under the operator's control must substantially resemble a human in dexterity.
ERIC Educational Resources Information Center
Berenson, Mark L.
2013-01-01
There is consensus in the statistical literature that severe departures from its assumptions invalidate the use of regression modeling for purposes of inference. The assumptions of regression modeling are usually evaluated subjectively through visual, graphic displays in a residual analysis but such an approach, taken alone, may be insufficient…
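In the spirit of the argument that visual residual analysis alone may be insufficient, here is a minimal sketch that pairs the residual plot's job with formal tests; the data are simulated and the choice of tests is ours:

```python
# Sketch: supplement visual residual analysis with formal checks of two
# regression assumptions (normal errors, constant error variance).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 100)
y = 2.0 + 0.5 * x + rng.normal(0, 1, 100)

# Ordinary least squares fit and residuals
slope, intercept, *_ = stats.linregress(x, y)
resid = y - (intercept + slope * x)

# Shapiro-Wilk: tests the normality-of-errors assumption
print("Shapiro-Wilk p =", stats.shapiro(resid).pvalue)

# Spearman correlation of |residuals| with x: a simple heteroscedasticity check
print("|resid| vs x, Spearman p =", stats.spearmanr(np.abs(resid), x).pvalue)
```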
An evaluation of complementary relationship assumptions
NASA Astrophysics Data System (ADS)
Pettijohn, J. C.; Salvucci, G. D.
2004-12-01
Complementary relationship (CR) models, based on Bouchet's (1963) somewhat heuristic CR hypothesis, are advantageous in their sole reliance on readily available climatological data. While Bouchet's CR hypothesis requires a number of questionable assumptions, CR models have been evaluated on variable time and length scales with relative success. Bouchet's hypothesis is grounded on the assumption that a change in potential evapotranspiration (Ep) is equal and opposite in sign to a change in actual evapotranspiration (Ea), i.e., -dEp / dEa = 1. In his mathematical rationalization of the CR, Morton (1965) similarly assumes that a change in potential sensible heat flux (Hp) is equal and opposite in sign to a change in actual sensible heat flux (Ha), i.e., -dHp / dHa = 1. CR models have maintained these assumptions while focusing on defining Ep and equilibrium evapotranspiration (Epo). We question Bouchet and Morton's aforementioned assumptions by revisiting CR derivation in light of a proposed variable, φ = -dEp/dEa. We evaluate φ in a simplified Monin-Obukhov surface similarity framework and demonstrate how previous error in the application of CR models may be explained in part by the assumption that φ = 1. Finally, we discuss the various time and length scales to which φ may be evaluated.
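A toy construction (ours) showing what the assumption φ = 1 buys: integrating -dEp/dEa = φ from the saturated state makes Ep + Ea constant at 2·Epo, which is the complementary relationship used to recover Ea from climatological Ep and Epo:

```python
# Sketch of Bouchet's complementary relationship. All values are assumed.
import numpy as np

Epo = 3.0                          # equilibrium ET, mm/day (assumed)
Ea = np.linspace(0.5, 3.0, 6)      # actual ET, dry to saturated (assumed)
phi = 1.0                          # Bouchet/Morton assumption: -dEp/dEa = 1

# Integrating dEp = -phi * dEa from the saturated state (Ea = Ep = Epo):
Ep = (1.0 + phi) * Epo - phi * Ea

print(Ep + Ea)                                  # constant 2*Epo only if phi = 1
print("Ea estimate:", ((1.0 + phi) * Epo - Ep) / phi)
```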
Three regularities of recognition memory: the role of bias.
Hilford, Andrew; Maloney, Laurence T; Glanzer, Murray; Kim, Kisok
2015-12-01
A basic assumption of Signal Detection Theory is that decisions are made on the basis of likelihood ratios. In a preceding paper, Glanzer, Hilford, and Maloney (Psychonomic Bulletin & Review, 16, 431-455, 2009) showed that the likelihood ratio assumption implies that three regularities will occur in recognition memory: (1) the Mirror Effect, (2) the Variance Effect, (3) the normalized Receiver Operating Characteristic (z-ROC) Length Effect. The paper offered formal proofs and computational demonstrations that decisions based on likelihood ratios produce the three regularities. A survey of data based on group ROCs from 36 studies validated the likelihood ratio assumption by showing that its three implied regularities are ubiquitous. The study noted, however, that bias, another basic factor in Signal Detection Theory, can obscure the Mirror Effect. In this paper we examine how bias affects the regularities at the theoretical level. The theoretical analysis shows: (1) how bias obscures the Mirror Effect, not the other two regularities, and (2) four ways to counter that obscuring. We then report the results of five experiments that support the theoretical analysis. The analyses and the experimental results also demonstrate: (1) that the three regularities govern individual, as well as group, performance, (2) alternative explanations of the regularities are ruled out, and (3) that Signal Detection Theory, correctly applied, gives a simple and unified explanation of recognition memory data.
Plant uptake of elements in soil and pore water: field observations versus model assumptions.
Raguž, Veronika; Jarsjö, Jerker; Grolander, Sara; Lindborg, Regina; Avila, Rodolfo
2013-09-15
Contaminant concentrations in various edible plant parts transfer hazardous substances from polluted areas to animals and humans. Thus, the accurate prediction of plant uptake of elements is of significant importance. The processes involved contain many interacting factors and are, as such, complex. In contrast, the most common way to currently quantify element transfer from soils into plants is relatively simple, using an empirical soil-to-plant transfer factor (TF). This practice is based on theoretical assumptions that have been previously shown to not generally be valid. Using field data on concentrations of 61 basic elements in spring barley, soil and pore water at four agricultural sites in mid-eastern Sweden, we quantify element-specific TFs. Our aim is to investigate to which extent observed element-specific uptake is consistent with TF model assumptions and to which extent TF's can be used to predict observed differences in concentrations between different plant parts (root, stem and ear). Results show that for most elements, plant-ear concentrations are not linearly related to bulk soil concentrations, which is congruent with previous studies. This behaviour violates a basic TF model assumption of linearity. However, substantially better linear correlations are found when weighted average element concentrations in whole plants are used for TF estimation. The highest number of linearly-behaving elements was found when relating average plant concentrations to soil pore-water concentrations. In contrast to other elements, essential elements (micronutrients and macronutrients) exhibited relatively small differences in concentration between different plant parts. Generally, the TF model was shown to work reasonably well for micronutrients, whereas it did not for macronutrients. The results also suggest that plant uptake of elements from sources other than the soil compartment (e.g. from air) may be non-negligible. Copyright © 2013 Elsevier Ltd. All rights reserved.
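A minimal sketch of the transfer-factor model the study evaluates, with invented concentrations and the linearity check the authors discuss:

```python
# Sketch: soil-to-plant transfer factors TF = C_plant / C_soil, plus a crude
# check of the implicit linearity assumption. All numbers are invented.
import numpy as np

c_soil = np.array([12.0, 25.0, 40.0, 61.0])   # mg/kg at four sites (assumed)
c_plant = np.array([0.9, 2.1, 2.8, 4.9])      # mg/kg, whole plant (assumed)

tf = c_plant / c_soil                         # site-specific TFs
print("TFs:", np.round(tf, 3))

# Linearity check: the TF model implies c_plant grows linearly with c_soil
r = np.corrcoef(c_soil, c_plant)[0, 1]
print(f"r = {r:.2f}  (high r supports using a single linear TF)")
```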
Production process stability - core assumption of INDUSTRY 4.0 concept
NASA Astrophysics Data System (ADS)
Chromjakova, F.; Bobak, R.; Hrusecka, D.
2017-06-01
Today’s industrial enterprises confront a basic problem when implementing the INDUSTRY 4.0 concept: stabilising manufacturing and supporting processes. Through such stabilisation they can achieve effective digital management of processes and continuous throughput. Structural stability of both horizontal (business) and vertical (digitised) manufacturing processes is required, supported by the digital technologies of the INDUSTRY 4.0 concept. The results presented in this paper are based on research and a survey conducted in several industrial companies. A basic model for structural process stabilisation in a manufacturing environment is described.
Kinetic concepts of thermally stimulated reactions in solids
NASA Astrophysics Data System (ADS)
Vyazovkin, Sergey
Historical analysis suggests that the basic kinetic concepts of reactions in solids were inherited from homogeneous kinetics. These concepts rest upon the assumption of a single-step reaction, which disagrees with the multiple-step nature of solid-state processes. The inadequate concepts inspire unjustified expectations of kinetic analysis, such as evaluating a constant activation energy and/or deriving a single-step reaction mechanism for the overall process. A more adequate concept is that of the effective activation energy, which may vary with temperature and extent of conversion. The adequacy of this concept is illustrated by literature data as well as by experimental data on the thermal dehydration of calcium oxalate monohydrate and the thermal decomposition of calcium carbonate, ammonium nitrate and 1,3,5,7-tetranitro-1,3,5,7-tetrazocine.
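A small synthetic illustration (ours) of the isoconversional logic behind an effective activation energy: at a fixed extent of conversion, regressing ln(rate) on 1/T across runs at different temperatures recovers the barrier, which can then be tracked as conversion varies:

```python
# Sketch of a Friedman-type isoconversional step on synthetic data:
# ln(rate) = const - (E/R) * (1/T), so the slope gives -E/R at that conversion.
import numpy as np

R = 8.314                                    # J/(mol K)
T = np.array([500.0, 520.0, 540.0, 560.0])   # run temperatures, K (assumed)

E_true = 120e3                               # synthetic barrier, J/mol
rate = 1e9 * np.exp(-E_true / (R * T))       # Arrhenius rates at fixed conversion

slope, intercept = np.polyfit(1.0 / T, np.log(rate), 1)
print(f"recovered E = {-slope * R / 1e3:.1f} kJ/mol")   # ~120 kJ/mol
# Repeating this at each conversion level yields E(alpha), which need not
# be constant for a multiple-step process.
```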
Economic Theory and Broadcasting.
ERIC Educational Resources Information Center
Bates, Benjamin J.
Focusing on access to audience through broadcast time, this paper examines the status of research into the economics of broadcasting. The paper first discusses the status of theory in the study of broadcast economics, both as described directly and as it exists in the statement of the basic assumptions generated by prior work and general…
Dewey and Schon: An Analysis of Reflective Thinking.
ERIC Educational Resources Information Center
Bauer, Norman J.
The challenge to the dominance of rationality in educational philosophy presented by John Dewey and Donald Schon is examined in this paper. The paper identifies basic assumptions of their perspective and explains concepts of reflective thinking, which include biography, context of uncertainty, and "not-yet." A model of reflective thought…
Tiedeman's Approach to Career Development.
ERIC Educational Resources Information Center
Harren, Vincent A.
Basic to Tiedeman's approach to career development and decision making is the assumption that one is responsible for one's own behavior because one has the capacity for choice and lives in a world which is not deterministic. Tiedeman, a cognitive-developmental theorist, views continuity of development as internal or psychological while…
Linking Educational Philosophy with Micro-Level Technology: The Search for a Complete Method.
ERIC Educational Resources Information Center
Januszewski, Alan
Traditionally, educational technologists have not been concerned with social or philosophical questions, and the field does not have a basic educational philosophy. Instead, it is dominated by a viewpoint characterized as "technical rationality" or "technicism"; the most important assumption of this viewpoint is that science…
Network Analysis in Comparative Social Sciences
ERIC Educational Resources Information Center
Vera, Eugenia Roldan; Schupp, Thomas
2006-01-01
This essay describes the pertinence of Social Network Analysis (SNA) for the social sciences in general, and discusses its methodological and conceptual implications for comparative research in particular. The authors first present a basic summary of the theoretical and methodological assumptions of SNA, followed by a succinct overview of its…
Conservatism in America--What Does it Mean for Teacher Education?
ERIC Educational Resources Information Center
Dolce, Carl J.
The current conflict among opposing sets of cultural ideals is illustrated by several interrelated conditions. The conservative phenomenon is more complex than the traditional liberal-conservative dichotomy would suggest. Changes in societal conditions invite a reexamination of basic assumptions across the broad spectrum of political ideology.…
Variable thickness transient ground-water flow model. Volume 1. Formulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reisenauer, A.E.
1979-12-01
Mathematical formulation for the variable thickness transient (VTT) model of an aquifer system is presented. The basic assumptions are described. Specific data requirements for the physical parameters are discussed. The boundary definitions and solution techniques of the numerical formulation of the system of equations are presented.
A SYSTEMS ANALYSIS OF SCHOOL BOARD ACTION.
ERIC Educational Resources Information Center
SCRIBNER, JAY D.
The basic assumption of the functional-systems theory is that structures fulfill functions in systems and that subsystems operate separately within any type of structure. Relying mainly on Gabriel Almond's paradigm, the author attempts to determine the usefulness of the functional-systems theory in conducting empirical research of school boards.…
Distance-Based and Distributed Learning: A Decision Tool for Education Leaders.
ERIC Educational Resources Information Center
McGraw, Tammy M.; Ross, John D.
This decision tool presents a progression of data collection and decision-making strategies that can increase the effectiveness of distance-based or distributed learning instruction. A narrative and flow chart cover the following steps: (1) basic assumptions, including purpose of instruction, market scan, and financial resources; (2) needs…
ERIC Educational Resources Information Center
Fischer, Gerhard H.
1987-01-01
A natural parameterization and formalization of the problem of measuring change in dichotomous data is developed. Mathematically-exact definitions of specific objectivity are presented, and the basic structures of the linear logistic test model and the linear logistic model with relaxed assumptions are clarified. (SLD)
A Guide to Curriculum Planning in Mathematics. Bulletin No. 6284.
ERIC Educational Resources Information Center
Chambers, Donald L.; And Others
This guide was written under the basic assumptions that the mathematics curriculum must continuously change and that mathematics is most effectively learned through a spiral approach. Further, it is assumed that the audience will be members of district mathematics curriculum committees. Instructional objectives have been organized to reveal the…
Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.
Document-Oriented E-Learning Components
ERIC Educational Resources Information Center
Piotrowski, Michael
2009-01-01
This dissertation questions the common assumption that e-learning requires a "learning management system" (LMS) such as Moodle or Blackboard. Based on an analysis of the current state of the art in LMSs, we come to the conclusion that the functionality of conventional e-learning platforms consists of basic content management and…
Moral Development in Higher Education
ERIC Educational Resources Information Center
Liddell, Debora L.; Cooper, Diane L.
2012-01-01
In this article, the authors lay out the basic foundational concepts and assumptions that will guide the reader through the chapters to come as the chapter authors explore "how" moral growth can be facilitated through various initiatives on the college campus. This article presents a brief review of the theoretical frameworks that provide the…
Measuring Protein Interactions by Optical Biosensors
Zhao, Huaying; Boyd, Lisa F.; Schuck, Peter
2017-01-01
This unit gives an introduction to the basic techniques of optical biosensing for measuring equilibrium and kinetics of reversible protein interactions. Emphasis is given to the description of robust approaches that will provide reliable results with few assumptions. How to avoid the most commonly encountered problems and artifacts is also discussed. PMID:28369667
A "View from Nowhen" on Time Perception Experiments
ERIC Educational Resources Information Center
Riemer, Martin; Trojan, Jorg; Kleinbohl, Dieter; Holzl, Rupert
2012-01-01
Systematic errors in time reproduction tasks have been interpreted as a misperception of time and therefore seem to contradict basic assumptions of pacemaker-accumulator models. Here we propose an alternative explanation of this phenomenon based on methodological constraints regarding the direction of time, which cannot be manipulated in…
Teaching Literature: Some Honest Doubts.
ERIC Educational Resources Information Center
Rutledge, Donald G.
1968-01-01
The possibility that many English teachers take their subject too seriously should be considered. The assumption that literature can to any degree either improve or adversely affect students is doubtful, but the exclusive study of "great literature" in our secondary schools may invite basic reflections too early: a year's steady diet of "King…
East Europe Report, Political, Sociological and Military Affairs, No. 2219
1983-10-24
takes place in training booths and classrooms. On the way to warrant officer one must take sociology, Russian, basic construction, materials...polemics. I admit that I like this much more than the obligatory hearty kiss on both cheeks along with, of course, the assumption that polemicists have
The Binding Properties of Quechua Suffixes.
ERIC Educational Resources Information Center
Weber, David
This paper sketches an explicitly non-lexicalist application of grammatical theory to Huallaga (Huanuco) Quechua (HgQ). The advantages of applying binding theory to many suffixes that have previously been treated only as objects of the morphology are demonstrated. After an introduction, section 2 outlines basic assumptions about the nature of HgQ…
Creating a Healthy Camp Community: A Nurse's Role.
ERIC Educational Resources Information Center
Lishner, Kris Miller; Bruya, Margaret Auld
This book provides an organized, systematic overview of the basic aspects of health program management, nursing practice, and human relations issues in camp nursing. A foremost assumption is that health care in most camps needs improvement. Good health is dependent upon interventions involving social, environmental, and lifestyle factors that…
Fatherless America: Confronting Our Most Urgent Social Problem.
ERIC Educational Resources Information Center
Blankenhorn, David
The United States is rapidly becoming a fatherless society. Fatherlessness is the leading cause of declining child well-being, providing the impetus behind social problems such as crime, domestic violence, and adolescent pregnancy. Challenging the basic assumptions of opinion leaders in academia and in the media, this book debunks the prevailing…
Teaching Strategy: A New Planet.
ERIC Educational Resources Information Center
O'Brien, Edward L.
1998-01-01
Presents a lesson for middle and secondary school students in which they respond to a hypothetical scenario that enables them to develop a list of basic rights. Expounds that students compare their list of rights to the Universal Declaration of Human Rights in order to explore the assumptions about human rights. (CMK)
Session overview: forest ecosystems
John J. Battles; Robert C. Heald
2004-01-01
The core assumption of this symposium is that science can provide insight to management. Nowhere is this link more formally established than in regard to the science and management of forest ecosystems. The basic questions addressed are integral to our understanding of nature; the applications of this understanding are crucial to effective stewardship of natural...
A Comprehensive Real-World Distillation Experiment
ERIC Educational Resources Information Center
Kazameas, Christos G.; Keller, Kaitlin N.; Luyben, William L.
2015-01-01
Most undergraduate mass transfer and separation courses cover the design of distillation columns, and many undergraduate laboratories have distillation experiments. In many cases, the treatment is restricted to simple column configurations and simplifying assumptions are made so as to convey only the basic concepts. In industry, the analysis of a…
Human Praxis: A New Basic Assumption for Art Educators of the Future.
ERIC Educational Resources Information Center
Hodder, Geoffrey S.
1980-01-01
After analyzing Vincent Lanier's five characteristic roles of art education, the article briefly explains the pedagogy of Paulo Freire, based on human praxis, and applies it to the existing "oppressive" art education system. The article reduces Lanier's roles to resemble a single Freirean model. (SB)
ERIC Educational Resources Information Center
Ifenthaler, Dirk; Seel, Norbert M.
2013-01-01
In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…
Alternate hosts of Blepharipa pratensis (Meigen)
Paul A. Godwin; Thomas M. Odell
1977-01-01
A current tactic for biological control of the gypsy moth, Lymantria dispar Linnaeus, is to release its parasites in forests susceptible to gypsy moth damage before the gypsy moth arrives. The basic assumption in these anticipatory releases is that the parasites can find and utilize native insects as hosts in the interim. Blepharipa...
Children and Adolescents: Should We Teach Them or Let Them Learn?
ERIC Educational Resources Information Center
Rohwer, William D., Jr.
Research to date has provided too few answers for vital educational questions concerning teaching children or letting them learn. A basic problem is that experimentation usually begins by accepting conventional assumptions about schooling, ignoring experiments that would entail disturbing the ordering of current educational priorities.…
Chatterji, Madhabi
2016-12-01
This paper explores avenues for navigating evaluation design challenges posed by complex social programs (CSPs) and their environments when conducting studies that call for generalizable, causal inferences on the intervention's effectiveness. A definition is provided of a CSP drawing on examples from different fields, and an evaluation case is analyzed in depth to derive seven (7) major sources of complexity that typify CSPs, threatening assumptions of textbook-recommended experimental designs for performing impact evaluations. Theoretically-supported, alternative methodological strategies are discussed to navigate assumptions and counter the design challenges posed by the complex configurations and ecology of CSPs. Specific recommendations include: sequential refinement of the evaluation design through systems thinking, systems-informed logic modeling; and use of extended term, mixed methods (ETMM) approaches with exploratory and confirmatory phases of the evaluation. In the proposed approach, logic models are refined through direct induction and interactions with stakeholders. To better guide assumption evaluation, question-framing, and selection of appropriate methodological strategies, a multiphase evaluation design is recommended. Copyright © 2016 Elsevier Ltd. All rights reserved.
Hartemink, Nienke; Cianci, Daniela; Reiter, Paul
2015-03-01
Mathematical modeling and notably the basic reproduction number R0 have become popular tools for the description of vector-borne disease dynamics. We compare two widely used methods to calculate the probability of a vector to survive the extrinsic incubation period. The two methods are based on different assumptions for the duration of the extrinsic incubation period; one method assumes a fixed period and the other method assumes a fixed daily rate of becoming infectious. We conclude that the outcomes differ substantially between the methods when the average life span of the vector is short compared to the extrinsic incubation period.
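The two formulations compared in the study fit in a few lines; the parameter values below are assumed for illustration:

```python
# Sketch: probability that a vector survives the extrinsic incubation period
# (EIP) under the two assumptions compared above. Parameter values assumed.
import math

mu = 0.1   # vector daily mortality rate (assumed)
n = 10.0   # EIP length, days (assumed)

# Method 1: fixed EIP, so the vector must survive exactly n days
p_fixed = math.exp(-mu * n)

# Method 2: fixed daily rate 1/n of becoming infectious, i.e. the EIP is
# exponentially distributed with mean n
gamma = 1.0 / n
p_rate = gamma / (gamma + mu)

print(f"fixed period: {p_fixed:.3f}   fixed rate: {p_rate:.3f}")
# With mu = 0.1 and n = 10: 0.368 vs 0.500, substantially different when the
# vector's life span is short relative to the EIP, as the abstract notes.
```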
Likelihood ratio decisions in memory: three implied regularities.
Glanzer, Murray; Hilford, Andrew; Maloney, Laurence T
2009-06-01
We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so. We then show that the regularities appear in data from a range of recognition studies. The analyses and data in our study support the following generalization: Individuals make efficient recognition decisions on the basis of likelihood ratios.
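A toy calculation (our construction, not the authors' analysis) of how a likelihood-ratio criterion yields the mirror effect in equal-variance Gaussian signal detection:

```python
# Sketch: with new items ~ N(0, 1), old items ~ N(d', 1), the likelihood
# ratio equals 1 at d'/2. Placing the criterion there makes a stronger
# condition raise hits AND lower false alarms: the mirror effect.
from math import sqrt, erf

def norm_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

for label, d in [("weak", 1.0), ("strong", 2.0)]:
    c = d / 2.0                     # criterion where likelihood ratio = 1
    hit = 1.0 - norm_cdf(c - d)     # P("old" | old item)
    fa = 1.0 - norm_cdf(c)          # P("old" | new item)
    print(f"{label}: hit = {hit:.3f}, false alarms = {fa:.3f}")
# Output: weak 0.691/0.309, strong 0.841/0.159, the mirror-effect ordering.
```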
Shielding of substations against direct lightning strokes by shield wires
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chowdhuri, P.
1994-01-01
A new analysis for shielding outdoor substations against direct lightning strokes by shield wires is proposed. The basic assumption of this proposed method is that any lightning stroke which penetrates the shields will cause damage. The second assumption is that a certain level of risk of failure must be accepted, such as one or two failures per 100 years. The proposed method, using electrogeometric model, was applied to design shield wires for two outdoor substations: (1) 161-kV/69-kV station, and (2) 500-kV/161-kV station. The results of the proposed method were also compared with the shielding data of two other substations.
A Study of Crowd Ability and its Influence on Crowdsourced Evaluation of Design Concepts
2014-05-01
identifies the experts from the crowd, under the assumptions that (1) experts do exist and (2) only experts have consistent evaluations. These assumptions...for design evaluation tasks. Keywords: crowdsourcing, design evaluation, sparse evaluation ability, machine learning ..."intelligence" of a much larger crowd of people with diverse backgrounds [1]. Crowdsourced evaluation, or the delegation of an evaluation task to a
Academic Public Relations Curricula: How They Compare with the Bateman-Cutlip Commission Standards.
ERIC Educational Resources Information Center
McCartney, Hunter P.
To see what effect the 1975 Bateman-Cutlip Commission's recommendations have had on improving public relations education in the United States, 173 questionnaires were sent to colleges or universities with accredited or comprehensive programs in public relations. Responding to five basic assumptions underlying the commission's recommendations,…
Faculty and Student Attitudes about Transfer of Learning
ERIC Educational Resources Information Center
Lightner, Robin; Benander, Ruth; Kramer, Eugene F.
2008-01-01
Transfer of learning is using previous knowledge in novel contexts. While this is a basic assumption of the educational process, students may not always perceive all the options for using what they have learned in different, novel situations. Within the framework of transfer of learning, this study outlines an attitudinal survey concerning faculty…
New Directions in Teacher Education: Foundations, Curriculum, Policy.
ERIC Educational Resources Information Center
Denton, Jon, Ed.; And Others
This publication includes presentations made at the Aikin-Stinnett Lecture Series and follow-up papers sponsored by the Instructional Research Laboratory at Texas A&M University. The papers in this collection focus upon the basic assumptions and conceptual bases of teacher education and the use of research in providing a foundation for…
Perspective Making: Constructivism as a Meaning-Making Structure for Simulation Gaming
ERIC Educational Resources Information Center
Lainema, Timo
2009-01-01
Constructivism has recently gained popularity, although it is not a completely new learning paradigm. Much of the work within e-learning, for example, uses constructivism as a reference "discipline" (explicitly or implicitly). However, some of the work done within the simulation gaming (SG) community discusses what the basic assumptions and…
Spiral Growth in Plants: Models and Simulations
ERIC Educational Resources Information Center
Allen, Bradford D.
2004-01-01
The analysis and simulation of spiral growth in plants integrates algebra and trigonometry in a botanical setting. When the ideas presented here are used in a mathematics classroom/computer lab, students can better understand how basic assumptions about plant growth lead to the golden ratio and how the use of circular functions leads to accurate…
Dynamic Assessment and Its Implications for RTI Models
ERIC Educational Resources Information Center
Wagner, Richard K.; Compton, Donald L.
2011-01-01
Dynamic assessment refers to assessment that combines elements of instruction for the purpose of learning something about an individual that cannot be learned as easily or at all from conventional assessment. The origins of dynamic assessment can be traced to Thorndike (1924), Rey (1934), and Vygotsky (1962), who shared three basic assumptions.…
Looking for Skinner and Finding Freud
ERIC Educational Resources Information Center
Overskeid, Geir
2007-01-01
Sigmund Freud and B. F. Skinner are often seen as psychology's polar opposites. It seems this view is fallacious. Indeed, Freud and Skinner had many things in common, including basic assumptions shaped by positivism and determinism. More important, Skinner took a clear interest in psychoanalysis and wanted to be analyzed but was turned down. His…
Student Teachers' Beliefs about the Teacher's Role in Inclusive Education
ERIC Educational Resources Information Center
Domovic, Vlatka; Vidovic Vlasta, Vizek; Bouillet, Dejana
2017-01-01
The main aim of this research is to examine the basic features of student teachers' professional beliefs about the teacher's role in relation to teaching mainstream pupils and pupils with developmental disabilities. The starting assumption of this analysis is that teacher professional development is largely dependent upon teachers' beliefs about…
United States Air Force Training Line Simulator. Final Report.
ERIC Educational Resources Information Center
Nauta, Franz; Pierce, Michael B.
This report describes the technical aspects and potential applications of a computer-based model simulating the flow of airmen through basic training and entry-level technical training. The objective of the simulation is to assess the impacts of alternative recruit classification and training policies under a wide variety of assumptions regarding…
Cable in Boston; A Basic Viability Report.
ERIC Educational Resources Information Center
Hauben, Jan Ward; And Others
The viability of urban cable television (CATV) as an economic phenomenon is examined via a case study of its feasibility in Boston, a microcosm of general urban environment. To clarify cable's economics, a unitary concept of viability is used in which all local characteristics, cost assumptions, and growth estimates are structured dynamically as a…
"I Fell off [the Mothering] Track": Barriers to "Effective Mothering" among Prostituted Women
ERIC Educational Resources Information Center
Dalla, Rochelle
2004-01-01
Ecological theory and basic assumptions for the promotion of effective mothering among low-income and working-poor women are applied in relation to a particularly vulnerable population: street-level prostitution-involved women. Qualitative data from 38 street-level prostituted women shows barriers to effective mothering at the individual,…
ERIC Educational Resources Information Center
Pickel, Andreas
2012-01-01
The social sciences rely on assumptions of a unified self for their explanatory logics. Recent work in the new multidisciplinary field of social neuroscience challenges precisely this unproblematic character of the subjective self as basic, well-defined entity. If disciplinary self-insulation is deemed unacceptable, the philosophical challenge…
ERIC Educational Resources Information Center
Pavlik, John V.
2015-01-01
Emerging technologies are fueling a third paradigm of education. Digital, networked and mobile media are enabling a disruptive transformation of the teaching and learning process. This paradigm challenges traditional assumptions that have long characterized educational institutions and processes, including basic notions of space, time, content,…
What Are We Looking For?--Pro Critical Realism in Text Interpretation
ERIC Educational Resources Information Center
Siljander, Pauli
2011-01-01
A visible role in the theoretical discourses on education has been played in the last couple of decades by the constructivist epistemologies, which have questioned the basic assumptions of realist epistemologies. The increased popularity of interpretative approaches especially has put the realist epistemologies on the defensive. Basing itself on…
The Hidden Reason Behind Children's Misbehavior.
ERIC Educational Resources Information Center
Nystul, Michael S.
1986-01-01
Discusses hidden reason theory based on the assumptions that: (1) the nature of people is positive; (2) a child's most basic psychological need is involvement; and (3) a child has four possible choices in life (good somebody, good nobody, bad somebody, or severely mentally ill). A three-step approach for implementing hidden reason theory is…
78 FR 26269 - Connect America Fund; High-Cost Universal Service Support
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-06
... the model platform, which is the basic framework for the model consisting of key assumptions about the... combination of competitive bidding and a new forward-looking model of the cost of constructing modern multi-purpose networks.'' Using the cost model to ``estimate the support necessary to serve areas where costs...
ERIC Educational Resources Information Center
Burnett, I. Emett, Jr.; Pankake, Anita M.
Although much of the current school reform movement relies on the basic assumption of effective elementary school administration, insufficient effort has been made to synthesize key concepts found in organizational theory and management studies with relevant effective schools research findings. This paper attempts such a synthesis to help develop…
Response: Training Doctoral Students to Be Scientists
ERIC Educational Resources Information Center
Pollio, David E.
2012-01-01
The purpose of this article is to begin framing doctoral training for a science of social work. This process starts by examining two seemingly simple questions: "What is a social work scientist?" and "How do we train social work scientists?" In answering the first question, some basic assumptions and concepts about what constitutes a "social work…
ERIC Educational Resources Information Center
Lotan, Gurit; Ells, Carolyn
2010-01-01
In this article, the authors challenge professionals to re-examine assumptions about basic concepts and their implications in supporting adults with intellectual and developmental disabilities. The authors focus on decisions with significant implications, such as planning transition from school to adult life, changing living environments, and…
A Convergence of Two Cultures in the Implementation of P.L. 94-142.
ERIC Educational Resources Information Center
Haas, Toni J.
The Education for All Handicapped Children Act (PL 94-142) demanded basic changes in the practices, purposes, and institutional structures of schools to accommodate handicapped students, but did not adequately address the differences between general and special educators in expectations, training, or assumptions about the functions of schooling…
From Earth to Space--Advertising Films Created in a Computer-Based Primary School Task
ERIC Educational Resources Information Center
Öman, Anne
2017-01-01
Today, teachers orchestrate computer-based tasks in software applications in Swedish primary schools. Meaning is made through various modes, and multimodal perspectives on literacy have the basic assumption that meaning is made through many representational and communicational resources. The case study presented in this paper has analysed pupils'…
Child Sexual Abuse: Intervention and Treatment Issues. The User Manual Series.
ERIC Educational Resources Information Center
Faller, Kathleen Coulborn
This manual describes professional practices in intervention and treatment of sexual abuse and discusses how to address the problems of sexually abused children and their families. It makes an assumption that the reader has basic information about sexual abuse. The discussion focuses primarily on the child's guardian as the abuser. The manual…
A Comparative Analysis of Selected Mechanical Aspects of the Ice Skating Stride.
ERIC Educational Resources Information Center
Marino, G. Wayne
This study quantitatively analyzes selected aspects of the skating strides of above-average and below-average ability skaters. Subproblems were to determine how stride length and stride rate are affected by changes in skating velocity, to ascertain whether the basic assumption that stride length accurately approximates horizontal movement of the…
Implementing a Redesign Strategy: Lessons from Educational Change.
ERIC Educational Resources Information Center
Basom, Richard E., Jr.; Crandall, David P.
The effective implementation of school redesign, based on a social systems approach, is discussed in this paper. A basic assumption is that the interdependence of system elements has implications for a complex change process. Seven barriers to redesign and five critical issues for successful redesign strategy are presented. Seven linear steps for…
ERIC Educational Resources Information Center
Bossard, James H. S.
2017-01-01
The basic assumption underlying this article is that the really significant changes in human history are those that occur, not in the mechanical gadgets which men use nor in the institutionalized arrangements by which they live, but in their attitudes and in the values which they accept. The revolutions of the past that have had the greatest…
Civility in Politics and Education. Routledge Studies in Contemporary Philosophy
ERIC Educational Resources Information Center
Mower, Deborah, Ed.; Robison, Wade L., Ed.
2011-01-01
This book examines the concept of civility and the conditions of civil disagreement in politics and education. Although many assume that civility is merely polite behavior, it functions to aid rational discourse. Building on this basic assumption, the book offers multiple accounts of civility and its contribution to citizenship, deliberative…
Improving Clinical Teaching: The ADN Experience. Pathways to Practice.
ERIC Educational Resources Information Center
Haase, Patricia T.; And Others
Three Florida associate degree in nursing (ADN) demonstration projects of the Nursing Curriculum Project (NCP) are described, and the history of the ADN program and current controversies are reviewed. In 1976, the NCP of the Southern Regional Education Board issued basic assumptions about the role of the ADN graduate, relating them to client…
Organize Your School for Improvement
ERIC Educational Resources Information Center
Truby, William F.
2017-01-01
W. Edwards Deming has suggested that 96% of an organization's performance is a function of the organization's structure. He contends that only about 4% of an organization's performance is attributable to its people. This is a fundamental difference, as most school leaders work with the basic assumption that 80% of a school's performance is related to staff and…
ERIC Educational Resources Information Center
Schultz, Katherine
Although the National Workplace Literacy Program is relatively new, a new orthodoxy of program development based on particular understandings of literacy and learning has emerged. Descriptions of two model workplace education programs are the beginning points for an examination of the assumptions contained in most reports of workplace education…
ERIC Educational Resources Information Center
Elleven, Russell K.
2007-01-01
The article examines a relatively new tool to increase the effectiveness of organizations and people. The recent development and background of Appreciative Inquiry (AI) is reviewed. Basic assumptions of the model are discussed. Implications for departments and divisions of student affairs are analyzed. Finally, suggested readings and workshop…
Resegregation in Norfolk, Virginia. Does Restoring Neighborhood Schools Work?
ERIC Educational Resources Information Center
Meldrum, Christina; Eaton, Susan E.
This report reviews school department data and interviews with officials and others involved in the Norfolk (Virginia) school resegregation plan designed to stem White flight and increase parental involvement. The report finds that all the basic assumptions the local community and the court had about the potential benefits of undoing the city's…
An Economic Theory of School Governance.
ERIC Educational Resources Information Center
Rada, Roger D.
Working from the basic assumption that the primary motivation for those involved in school governance is self-interest, this paper develops and discusses 15 hypotheses that form the essential elements of an economic theory of school governance. The paper opens with a review of previous theories of governance and their origins in social science…
ERIC Educational Resources Information Center
Cameron, Kim S.
A way to assess and improve organizational effectiveness is discussed, with a focus on factors that inhibit successful organizational performance. The basic assumption is that it is easier, more accurate, and more beneficial for individuals and organizations to identify criteria of ineffectiveness (faults and weaknesses) than to identify criteria…
Describes procedures written on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.
Lifeboat Counseling: The Issue of Survival Decisions
ERIC Educational Resources Information Center
Dowd, E. Thomas; Emener, William G.
1978-01-01
Rehabilitation counseling, as a profession, needs to look at future world possibilities, especially in light of overpopulation, and be aware that the need may arise for adjusting basic assumptions about human life--from the belief that every individual has a right to a meaningful life to the notion of selecting who shall live. (DTT)
Challenges of Adopting Constructive Alignment in Action Learning Education
ERIC Educational Resources Information Center
Remneland Wikhamn, Björn
2017-01-01
This paper will critically examine how the two influential pedagogical approaches of action-based learning and constructive alignment relate to each other, and how they may differ in focus and basic assumptions. From the outset, they are based on similar underpinnings, with the student and the learning outcomes at the center. Drawing from…
ERIC Educational Resources Information Center
Engstrom, Cathy McHugh
2008-01-01
The pedagogical assumptions and teaching practices of learning community models reflect exemplary conditions for learning, so using these models with unprepared students seems desirable and worthy of investigation. This chapter describes the key role of faculty in creating active, integrative learning experiences for students in basic skills…
Education in Conflict and Crisis for National Security.
ERIC Educational Resources Information Center
McClelland, Charles A.
A basic assumption is that the level of conflict within and between nations will escalate over the next 50 years. Trying to "muddle through" using the tools and techniques of organized violence may yield national suicide. Therefore, complex conflict resolution skills need to be developed and used by some part of society to quell disorder…
Textbooks as a Possible Influence on Unscientific Ideas about Evolution
ERIC Educational Resources Information Center
Tshuma, Tholani; Sanders, Martie
2015-01-01
While school textbooks are assumed to be written for and used by students, it is widely acknowledged that they also serve a vital support function for teachers, particularly in times of curriculum change. A basic assumption is that biology textbooks are scientifically accurate. Furthermore, because of the negative impact of…
A basic review on the inferior alveolar nerve block techniques.
Khalil, Hesham
2014-01-01
The inferior alveolar nerve block is the most common injection technique used in dentistry, and many modifications of the conventional nerve block have recently been described in the literature. Selecting the best technique depends on many factors, including the success rate and the complications associated with the selected technique. Dentists should be aware of the current modifications of the inferior alveolar nerve block techniques in order to choose effectively among them. Some operators may encounter difficulty in identifying the anatomical landmarks which are useful in applying the inferior alveolar nerve block and rely instead on assumptions as to where the needle should be positioned. Such assumptions can lead to failure; the failure rate of the inferior alveolar nerve block has been reported to be 20-25%, which is considered very high. In this basic review, the anatomical details of the inferior alveolar nerve are given together with a description of both its conventional and modified blocking techniques, along with an overview of the complications that may result from the application of this important technique. PMID:25886095
A Markov chain model for reliability growth and decay
NASA Technical Reports Server (NTRS)
Siegrist, K.
1982-01-01
A mathematical model is developed to describe a complex system undergoing a sequence of trials in which there is interaction between the internal states of the system and the outcomes of the trials. For example, the model might describe a system undergoing testing that is redesigned after each failure. The basic assumptions for the model are that the state of the system after a trial depends probabilistically only on the state before the trial and on the outcome of the trial, and that the outcome of a trial depends probabilistically only on the state of the system before the trial. It is shown that under these basic assumptions, the successive states form a Markov chain and the successive states and outcomes jointly form a Markov chain. General results are obtained for the transition probabilities, steady-state distributions, etc. A special case studied in detail describes a system that has two possible states ('repaired' and 'unrepaired') undergoing trials that have three possible outcomes ('inherent failure', 'assignable-cause failure', and 'success'). For this model, the reliability function is computed explicitly and an optimal repair policy is obtained.
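The two-state special case lends itself to a brief simulation. The sketch below is illustrative only: the outcome probabilities and the repair rule (an assignable-cause failure triggers redesign) are invented here, not taken from the paper.

    import numpy as np

    # Hypothetical two-state, three-outcome version of the model described above.
    # States: 0 = 'unrepaired', 1 = 'repaired'.
    # Outcomes: 0 = inherent failure, 1 = assignable-cause failure, 2 = success.
    # All probabilities are invented for illustration.
    rng = np.random.default_rng(0)
    outcome_probs = np.array([[0.05, 0.25, 0.70],   # P[outcome | unrepaired]
                              [0.05, 0.05, 0.90]])  # P[outcome | repaired]

    state, n_trials, successes = 0, 100_000, 0
    for _ in range(n_trials):
        outcome = rng.choice(3, p=outcome_probs[state])
        successes += outcome == 2
        if outcome == 1:          # assignable-cause failure triggers a redesign
            state = 1
    print(f"long-run success rate: {successes / n_trials:.3f}")

Because the next state depends only on the current state and outcome, the (state, outcome) pairs form the Markov chain described in the abstract, and the long-run success rate approximates the steady-state reliability.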
Sandwich mapping of schistosomiasis risk in Anhui Province, China.
Hu, Yi; Bergquist, Robert; Lynn, Henry; Gao, Fenghua; Wang, Qizhi; Zhang, Shiqing; Li, Rui; Sun, Liqian; Xia, Congcong; Xiong, Chenglong; Zhang, Zhijie; Jiang, Qingwu
2015-06-03
Schistosomiasis mapping using data obtained from parasitological surveys is frequently used in planning and evaluation of disease control strategies. The available geostatistical approaches are, however, subject to the assumption of stationarity, i.e., that the joint probability distribution of the underlying stochastic process does not change when shifted in space. As this is impractical for large areas, we introduce here the sandwich method, the basic idea of which is to divide the study area (with its attributes) into homogeneous subareas and estimate the values for the reporting units using spatial stratified sampling. The sandwich method was applied to map the county-level prevalence of schistosomiasis japonica in Anhui Province, China, based on parasitological data collected from sample villages and land use data. We first mapped the county-level prevalence using the sandwich method, then compared our findings with block Kriging. The sandwich estimates ranged from 0.17 to 0.21% with a lower level of uncertainty, while the Kriging estimates varied from 0 to 0.97% with a higher level of uncertainty, indicating that the former is more smoothed and stable compared to the latter. Aside from supporting various forms of reporting units, the sandwich method has the particular merit of simple model assumptions coupled with full utilization of sample data. It performs well when a disease presents stratified heterogeneity over space.
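A toy numerical sketch of the sandwich idea follows; the strata, sample values, and county-to-stratum area weights are all invented for illustration, and the real method's variance estimators are more elaborate.

    import numpy as np

    # Stratum-level means from (invented) village prevalence samples (%).
    strata_samples = {
        "plain": [0.10, 0.22, 0.15],
        "hills": [0.30, 0.18, 0.25],
        "marsh": [0.45, 0.40, 0.38],
    }
    stratum_mean = {s: np.mean(v) for s, v in strata_samples.items()}
    stratum_se2 = {s: np.var(v, ddof=1) / len(v) for s, v in strata_samples.items()}

    # Fraction of each reporting unit (county) falling in each stratum.
    county_weights = {
        "county_A": {"plain": 0.7, "hills": 0.3},
        "county_B": {"plain": 0.2, "marsh": 0.8},
    }
    for county, w in county_weights.items():
        est = sum(f * stratum_mean[s] for s, f in w.items())
        se = sum(f**2 * stratum_se2[s] for s, f in w.items()) ** 0.5
        print(f"{county}: prevalence about {est:.3f}% (SE {se:.3f})")

The estimate for each reporting unit is an area-weighted combination of the homogeneous-stratum means, which is why the sandwich estimates are smoother than the Kriging surface.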
PET image reconstruction: a robust state space approach.
Liu, Huafeng; Tian, Yi; Shi, Pengcheng
2005-01-01
Statistical iterative reconstruction algorithms have shown improved image quality over conventional nonstatistical methods in PET by using accurate system response models and measurement noise models. Strictly speaking, however, PET measurements, pre-corrected for accidental coincidences, are neither Poisson nor Gaussian distributed and thus do not meet the basic assumptions of these algorithms. In addition, the difficulty in determining the proper system response model also greatly affects the quality of the reconstructed images. In this paper, we explore the use of state space principles for the estimation of the activity map in tomographic PET imaging. The proposed strategy formulates the organ activity distribution through tracer kinetics models, and the photon-counting measurements through observation equations, thus making it possible to unify the dynamic and static reconstruction problems into a general framework. Further, it coherently treats the uncertainties of the statistical model of the imaging system and the noisy nature of measurement data. Since the H-infinity filter seeks minimax-error estimates without any assumptions on the system and data noise statistics, it is particularly suited for PET image reconstruction, where the statistical properties of measurement data and the system model are very complicated. The performance of the proposed framework is evaluated using Shepp-Logan simulated phantom data and real phantom data with favorable results.
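The abstract does not spell out the model equations, but a generic discrete-time state space formulation of the kind described would pair a kinetics-driven state equation with a photon-counting observation equation, for example

\[
x_{k+1} = f(x_k) + w_k, \qquad y_k = H x_k + v_k,
\]

where \(x_k\) is the activity distribution evolving under the tracer kinetics model \(f\), \(y_k\) the measured coincidence counts, \(H\) the system response, and \(w_k, v_k\) disturbance terms. The appeal of the H-infinity criterion is that, unlike Kalman-type filters, it bounds the worst-case estimation error without requiring \(w_k\) and \(v_k\) to have known (e.g., Gaussian) statistics. The notation here is generic, not the authors' own.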
Morrow, Nathan; Nkwake, Apollo M
2016-12-01
Like artisans in a professional guild, we evaluators create tools to suit our ever evolving practice. The tools we use as evaluators are the primary artifacts of our profession, reflect our practice and embody an amalgamation of paradigms and assumptions. With the increasing shifts in evaluation purposes from judging program worth to understanding how programs work, the evaluator's role is changing to that of facilitating stakeholders in a learning process. This involves clarifying purposes and choices, as well as unearthing critical assumptions. In such a role, evaluators become major tool-users and begin to innovate with small refinements or produce completely new tools to fit a specific challenge or context. We interrogate the form and function of 12 tools used by evaluators when working with complex evaluands and complex contexts. The form is described in terms of traditional qualitative techniques and particular characteristics of the elements, use and presentation of each tool. Then the function of each tool is analyzed with respect to articulating assumptions and affecting the agency of evaluators and stakeholders in complex contexts. Copyright © 2016 Elsevier Ltd. All rights reserved.
Moritz, Max A.; Keeley, Jon E.; Johnson, Edward A.; Schaffner, Andrew A.
2004-01-01
This year's catastrophic wildfires in southern California highlight the need for effective planning and management for fire-prone landscapes. Fire frequency analysis of several hundred wildfires over a broad expanse of California shrublands reveals that there is generally not, as is commonly assumed, a strong relationship between fuel age and fire probabilities. Instead, the hazard of burning in most locations increases only moderately with time since the last fire, and a marked age effect of fuels is observed only in limited areas. Results indicate a serious need for a re-evaluation of current fire management and policy, which is based largely on eliminating older stands of shrubland vegetation. In many shrubland ecosystems exposed to extreme fire weather, large and intense wildfires may need to be factored in as inevitable events.
[Application of State Space model in the evaluation of the prevention and control for mumps].
Luo, C; Li, R Z; Xu, Q Q; Xiong, P; Liu, Y X; Xue, F Z; Xu, Q; Li, X J
2017-09-10
Objective: To analyze the epidemiological characteristics of mumps in 2012 and 2014, and to explore the preventive effect of the second dose of mumps-containing vaccine (MuCV) on mumps in Shandong province. Methods: On the basis of certain model assumptions, a State Space model was formulated. Iterated filtering was applied to the epidemic model to estimate the parameters. Results: The basic reproduction number (R0) for children in schools was 4.49 (95% CI: 4.30-4.67) and 2.50 (95% CI: 2.38-2.61) for 2012 and 2014, respectively. Conclusions: The State Space model seems suitable for describing mumps prevalence. The policy of 2-dose MuCV can effectively reduce the total number of patients. Children in schools are the key group for reducing mumps.
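The abstract does not give the transmission model, but for orientation, in the simplest SIR-type formulation the basic reproduction number reported above is the ratio of the transmission rate to the recovery rate,

\[
R_0 = \frac{\beta}{\gamma},
\]

so the drop from 4.49 to 2.50 between 2012 and 2014 is consistent with a substantially reduced effective transmission rate among school children after the second MuCV dose; the actual State Space model may parameterize this differently.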
Statistical foundations of liquid-crystal theory
Seguin, Brian; Fried, Eliot
2013-01-01
We develop a mechanical theory for systems of rod-like particles. Central to our approach is the assumption that the external power expenditure for any subsystem of rods is independent of the underlying frame of reference. This assumption is used to derive the basic balance laws for forces and torques. By considering inertial forces on par with other forces, these laws hold relative to any frame of reference, inertial or noninertial. Finally, we introduce a simple set of constitutive relations to govern the interactions between rods and find restrictions necessary and sufficient for these laws to be consistent with thermodynamics. Our framework provides a foundation for a statistical mechanical derivation of the macroscopic balance laws governing liquid crystals. PMID:23772091
Luna, L H
2015-10-01
The use of dental metrics as a reliable tool for the assessment of biological distances has diversified dramatically in the last decades. In this paper some of the basic assumptions on this issue and the potential of cervical measurements in biodistance protocols are discussed. A sample of 1173 permanent teeth from 57 male and female individuals, recovered in Chenque I site (western Pampas, central Argentina), a Late Holocene hunter-gatherer cemetery, is examined in order to test the impact of exogenous factors that may have influenced the phenotypic manifestation and affected dental crown sizes. The statistical association between dental metric data, obtained by measuring the mesiodistal and buccolingual diameters of the crown and cervix, and the quantification of hypoplastic defects as a measure to evaluate the influence of the environment in the dental phenotypic expression is evaluated. The results show that socioenvironmental stress did not affect dental metrics and that only the more stable teeth (first incisors, canines, first premolars and first molars) and three variables (buccolingual diameter of the crown and both mesiodistal and buccolingual measurements of the cervix) should be included in multivariate analyses. These suggestions must be strengthened with additional studies of other regional samples to identify factors of variation among populations, so as to develop general guidelines for dental survey and biodistance analysis, but they are a first step for discussing assumptions usually used and maximizing the available information for low-density hunter-gatherer societies. Copyright © 2015 Elsevier GmbH. All rights reserved.
Power and Method: Political Activism and Educational Research. Critical Social Thought Series.
ERIC Educational Resources Information Center
Gitlin, Andrew, Ed.
This book scrutinizes some basic assumptions about educational research with the aim that such research may act more powerfully on those persistent and important problems of our schools surrounding issues of race, class, and gender. In particular, the 13 essays in this book examine how power is infused in research by addressing such questions as…
The Future of Family Business Education in UK Business Schools
ERIC Educational Resources Information Center
Collins, Lorna; Seaman, Claire; Graham, Stuart; Stepek, Martin
2013-01-01
Purpose: This practitioner paper aims to question basic assumptions about management education and to argue that a new paradigm is needed for UK business schools which embraces an oft neglected, yet economically vital, stakeholder group, namely family businesses. It seeks to pose the question of why we have forgotten to teach about family business…
ERIC Educational Resources Information Center
Olympia, Daniel; Farley, Megan; Christiansen, Elizabeth; Pettersson, Hollie; Jenson, William; Clark, Elaine
2004-01-01
While much of the current focus in special education remains on reauthorization of the Individuals with Disabilities Act of 1997, disparities in the identification of children with serious emotional disorders continue to plague special educators and school psychologists. Several years after the issue of social maladjustment and its relationship to…
Locations of Racism in Education: A Speech Act Analysis of a Policy Chain
ERIC Educational Resources Information Center
Arneback, Emma; Quennerstedt, Ann
2016-01-01
This article explores how racism is located in an educational policy chain and identifies how its interpretation changes throughout the chain. A basic assumption is that the policy formation process can be seen as a chain in which international, national and local policies are "links"--separate entities yet joined. With Sweden as the…
The Education System in Greece. [Revised.
ERIC Educational Resources Information Center
EURYDICE Central Unit, Brussels (Belgium).
The education policy of the Greek government rests on the basic assumption that effective education is a social goal and that every citizen has a right to an education. A brief description of the Greek education system and of the adjustments made to give practical meaning to the provisions on education in the Constitution is presented in the…
Experiences in Rural Mental Health II: Organizing a Low Budget Program.
ERIC Educational Resources Information Center
Hollister, William G.; And Others
Based on a North Carolina feasibility study (1967-73) which focused on development of a pattern for providing comprehensive mental health services to rural people, this second program guide deals with organization of a low-income program budget. Presenting the basic assumptions utilized in the development of a low-budget program in Franklin and…
ERIC Educational Resources Information Center
Gunthorpe, Sydney
2006-01-01
From the assumption that matching a student's learning style with the learning method best suited for the student, it follows that developing courses that correlate learning method with learning style would be more successful for students. Albuquerque Technical Vocational Institute (TVI) in New Mexico has attempted to provide students with more…
Reds, Greens, Yellows Ease the Spelling Blues.
ERIC Educational Resources Information Center
Irwin, Virginia
1971-01-01
This document reports on a color-coding innovation designed to improve the spelling ability of high school seniors. This color-coded system is based on two assumptions: that color will appeal to the students and that there are three principal reasons for misspelling. Two groups were chosen for the experiments. A basic list of spelling demons was…
The Politics and Coverage of Terror: From Media Images to Public Consciousness.
ERIC Educational Resources Information Center
Wittebols, James H.
This paper presents a typology of terrorism which is grounded in how media differentially cover each type. The typology challenges some of the basic assumptions, such as that the media "allow" themselves to be exploited by terrorists and "encourage" terrorism, and the conventional wisdom about the net effects of the media's…
The Past as Prologue: Examining the Consequences of Business as Usual. Center Paper 01-93.
ERIC Educational Resources Information Center
Jones, Dennis P.; And Others
This study examined the ability of California to meet increased demand for postsecondary education without significantly altering the basic historical assumptions and policies that have governed relations between the state and its institutions of higher learning. Results of a series of analyses that estimated projected enrollments and costs under…
The Spouse and Familial Incest: An Adlerian Perspective.
ERIC Educational Resources Information Center
Quinn, Kathleen L.
A major component of Adlerian psychology concerns the belief in responsibility to self and others. In both incest perpetrator and spouse the basic underlying assumption of responsibility to self and others is often not present. Activities and behaviors occur in a social context and as such need to be regarded within a social context that may serve…
Initial Comparison of Single Cylinder Stirling Engine Computer Model Predictions with Test Results
NASA Technical Reports Server (NTRS)
Tew, R. C., Jr.; Thieme, L. G.; Miao, D.
1979-01-01
A Stirling engine digital computer model developed at NASA Lewis Research Center was configured to predict the performance of the GPU-3 single-cylinder rhombic drive engine. Revisions to the basic equations and assumptions are discussed. Model predictions are compared with the early results of the Lewis Research Center GPU-3 tests.
Effects of Problem Scope and Creativity Instructions on Idea Generation and Selection
ERIC Educational Resources Information Center
Rietzschel, Eric F.; Nijstad, Bernard A.; Stroebe, Wolfgang
2014-01-01
The basic assumption of brainstorming is that increased quantity of ideas results in increased generation as well as selection of creative ideas. Although previous research suggests that idea quantity correlates strongly with the number of good ideas generated, quantity has been found to be unrelated to the quality of selected ideas. This article…
The Role of the Social Studies in Public Education.
ERIC Educational Resources Information Center
Byrne, T. C.
This paper was prepared for a social studies curriculum conference in Alberta in June, 1967. It provides a point of view on curriculum building which could be useful in establishing a national service in this field. The basic assumption is that the social studies should in some measure change the behavior of the students (a sharp departure from…
Twisting of thin walled columns perfectly restrained at one end
NASA Technical Reports Server (NTRS)
Lazzarino, Lucio
1938-01-01
Proceeding from the basic assumptions of the Batho-Bredt theory of twisting failure in thin-walled columns, the discrepancies most frequently encountered are analyzed. A generalized approximate method is suggested for determining the disturbances in the stress condition of the column induced by the constrained warping in one of the end sections.
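For reference, the Batho-Bredt theory of a single-cell thin-walled closed section is usually summarized by the standard relations below (textbook form; the report's own notation may differ):

\[
q = \frac{T}{2A}, \qquad \tau = \frac{T}{2At}, \qquad \frac{d\varphi}{dz} = \frac{T}{4A^{2}G}\oint \frac{ds}{t},
\]

where \(T\) is the applied torque, \(A\) the area enclosed by the median line of the wall, \(t\) the local wall thickness, \(G\) the shear modulus, \(q\) the shear flow, and \(\tau\) the shear stress. Constrained warping at a built-in end violates the free-warping assumption behind these formulas, which is the source of the disturbances the report analyzes.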
Adolescent Literacy in Europe--An Urgent Call for Action
ERIC Educational Resources Information Center
Sulkunen, Sari
2013-01-01
This article focuses on the literacy of the adolescents who, in most European countries, are about to leave or have recently left basic education with the assumption that they have the command of functional literacy as required in and for further studies, citizenship, work life and a fulfilling life as individuals. First, the overall performance…
Is the European (Active) Citizenship Ideal Fostering Inclusion within the Union? A Critical Review
ERIC Educational Resources Information Center
Milana, Marcella
2008-01-01
This article reviews: (1) the establishment and functioning of EU citizenship: (2) the resulting perception of education for European active citizenship; and (3) the question of its adequacy for enhancing democratic values and practices within the Union. Key policy documents produced by the EU help to unfold the basic assumptions on which…
Improving Child Management Practices of Parents and Teachers. Maxi I Practicum. Final Report.
ERIC Educational Resources Information Center
Adreani, Arnold J.; McCaffrey, Robert
The practicum design reported in this document was based on one basic assumption: that adult perceptions of children influence adult behavior toward children, which in turn influences the child's behavior. Therefore, behavior changes by children could best be effected by changing the adult perception of, and behavior toward, the child.…
The Importance of Woody Twig Ends to Deer in the Southeast
Charles T. Cushwa; Robert L. Downing; Richard F. Harlow; David F. Urbston
1970-01-01
One of the basic assumptions underlying research on wildlife habitat in the five Atlantic states of the Southeast is that white-tailed deer (Odocoileus virginianus) rely heavily on the ends of woody twigs during the winter. Considerable research has been undertaken to determine methods for increasing and measuring the availability of woody twigs to...
Going off the Grid: Re-Examining Technology in the Basic Writing Classroom
ERIC Educational Resources Information Center
Clay-Buck, Holly; Tuberville, Brenda
2015-01-01
The notion that today's students are constantly exposed to information technology has become so pervasive that it seems the academic conversation assumes students are "tech savvy." The proliferation of apps and smart phones aimed at the traditional college-aged population feeds into this assumption, aided in no small part by a growing…
Model for Developing an In-Service Teacher Workshop To Help Multilingual and Multicultural Students.
ERIC Educational Resources Information Center
Kachaturoff, Grace; Romatowski, Jane A.
This is a model for designing an inservice teacher workshop to assist teachers working with multicultural students. The basic assumption underlying the model is that universities and schools need to work cooperatively to provide experiences for improving the quality of teaching by increasing awareness of educational issues and situations and by…
NASA Technical Reports Server (NTRS)
1976-01-01
Assumptions made and techniques used in modeling the power network to the 480 volt level are discussed. Basic computational techniques used in the short circuit program are described along with a flow diagram of the program and operational procedures. Procedures for incorporating network changes are included in this user's manual.
Challenging Freedom: Neoliberalism and the Erosion of Democratic Education
ERIC Educational Resources Information Center
Karaba, Robert
2016-01-01
Goodlad, et al. (2002) rightly point out that a culture can either resist or support change. Schein's (2010) model of culture indicates observable behaviors of a culture can be explained by exposing underlying shared values and basic assumptions that give meaning to the performance. Yet culture is many-faceted and complex. So Schein advised a…
ERIC Educational Resources Information Center
Liskin-Gasparro, Judith E., Ed.
This collection of papers is divided into three parts. Part 1, "Changing Patterns: Curricular Implications," includes "Basic Assumptions Revisited: Today's French and Spanish Students at a Large Metropolitan University" (Gail Guntermann, Suzanne Hendrickson, and Carmen de Urioste) and "Le Français est Mort, Vive le…
The Immoral Assumption Effect: Moralization Drives Negative Trait Attributions.
Meindl, Peter; Johnson, Kate M; Graham, Jesse
2016-04-01
Jumping to negative conclusions about other people's traits is judged as morally bad by many people. Despite this, across six experiments (total N = 2,151), we find that multiple types of moral evaluations--even evaluations related to open-mindedness, tolerance, and compassion--play a causal role in these potentially pernicious trait assumptions. Our results also indicate that moralization affects negative--but not positive--trait assumptions, and that the effect of morality on negative assumptions cannot be explained merely by people's general (nonmoral) preferences or other factors that distinguish moral and nonmoral traits, such as controllability or desirability. Together, these results suggest that one of the more destructive human tendencies--making negative assumptions about others--can be caused by the better angels of our nature. © 2016 by the Society for Personality and Social Psychology, Inc.
Assumptions Underlying the Use of Different Types of Simulations.
ERIC Educational Resources Information Center
Cunningham, J. Barton
1984-01-01
Clarifies appropriateness of certain simulation approaches by distinguishing between different types of simulations--experimental, predictive, evaluative, and educational--on the basis of purpose, assumptions, procedures, and criteria for evaluating. The kinds of questions each type best responds to are discussed. (65 references) (MBR)
ERIC Educational Resources Information Center
Wang, Yan; Rodríguez de Gil, Patricia; Chen, Yi-Hsin; Kromrey, Jeffrey D.; Kim, Eun Sook; Pham, Thanh; Nguyen, Diep; Romano, Jeanine L.
2017-01-01
Various tests to check the homogeneity of variance assumption have been proposed in the literature, yet there is no consensus as to their robustness when the assumption of normality does not hold. This simulation study evaluated the performance of 14 tests for the homogeneity of variance assumption in one-way ANOVA models in terms of Type I error…
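A sketch of this kind of robustness simulation for a single test (Levene's, via scipy) is shown below; the group sizes, the skewed distribution, and the replication count are illustrative choices, not the study's design.

    import numpy as np
    from scipy import stats

    # Empirical Type I error of Levene's test when data are skewed:
    # variances are equal across groups, so every rejection is a Type I error.
    rng = np.random.default_rng(1)
    n_reps, alpha, rejections = 2000, 0.05, 0
    for _ in range(n_reps):
        groups = [rng.exponential(scale=1.0, size=30) for _ in range(3)]
        _, p = stats.levene(*groups, center="mean")
        rejections += p < alpha
    print(f"empirical Type I error: {rejections / n_reps:.3f} (nominal {alpha})")

Repeating this across many distributions, group-size patterns, and candidate tests is essentially the design the abstract describes.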
Optimal post-experiment estimation of poorly modeled dynamic systems
NASA Technical Reports Server (NTRS)
Mook, D. Joseph
1988-01-01
Recently, a novel strategy for post-experiment state estimation of discretely-measured dynamic systems has been developed. The method accounts for errors in the system dynamic model equations in a more general and rigorous manner than do filter-smoother algorithms. The dynamic model error terms do not require the usual process noise assumptions of zero-mean, symmetrically distributed random disturbances. Instead, the model error terms require no prior assumptions other than piecewise continuity. The resulting state estimates are more accurate than those of filters for applications in which the dynamic model error clearly violates the typical process noise assumptions and the available measurements are sparse and/or noisy. Estimates of the dynamic model error, in addition to the states, are obtained as part of the solution of a two-point boundary value problem and may be exploited for numerous purposes. In this paper, the basic technique is explained, and several example applications are given. Included among the examples are both state estimation and exploitation of the model error estimates.
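In such minimum-model-error formulations, the estimate is typically obtained by minimizing a cost that penalizes both the measurement residuals and the model error term; one common way to write it (the notation is generic, not necessarily the paper's) is

\[
J = \sum_{k}\left[\tilde{y}_k - h(\hat{x}(t_k))\right]^{\mathsf T} R^{-1}\left[\tilde{y}_k - h(\hat{x}(t_k))\right]
+ \int_{t_0}^{t_f} d^{\mathsf T}(t)\, W\, d(t)\, dt,
\qquad \dot{\hat{x}} = f(\hat{x}, t) + d(t),
\]

where \(d(t)\) is the unknown model error, constrained only to be piecewise continuous, and \(W\) weights model-error magnitude against measurement fit. The necessary conditions for the minimum yield the two-point boundary value problem mentioned above.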
Hong, Sehee; Kim, Soyoung
2018-01-01
There are basically two modeling approaches applicable to analyzing an actor-partner interdependence model: multilevel modeling (the hierarchical linear model) and structural equation modeling. This article explains how to use these two models in analyzing an actor-partner interdependence model and how the two approaches work differently. As an empirical example, marital conflict data were used to fit an actor-partner interdependence model. The multilevel modeling and structural equation modeling approaches produced virtually identical estimates for a basic model. However, the structural equation modeling approach allowed more realistic assumptions about measurement errors and factor loadings, yielding better model fit indices.
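A minimal sketch of the multilevel specification on simulated dyadic data follows; the variable names, effect sizes, and random-intercept structure are invented for illustration and do not reproduce the article's analysis.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulate dyads: each partner's outcome depends on their own predictor
    # (actor effect), their partner's predictor (partner effect), and a
    # shared dyad-level random effect capturing non-independence.
    rng = np.random.default_rng(2)
    rows = []
    for d in range(200):
        x = rng.normal(size=2)
        u = rng.normal(scale=0.5)                      # shared dyad effect
        for i in (0, 1):
            rows.append({"dyad": d, "actor_x": x[i], "partner_x": x[1 - i],
                         "y": 0.4 * x[i] + 0.2 * x[1 - i] + u + rng.normal()})
    df = pd.DataFrame(rows)

    # Random intercept per dyad = the multilevel (hierarchical linear) approach.
    fit = smf.mixedlm("y ~ actor_x + partner_x", df, groups=df["dyad"]).fit()
    print(fit.summary())

The SEM version of the same model would instead treat the two partners' scores as separate observed variables with correlated disturbances, which is what permits the more flexible measurement-error assumptions the article notes.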
Status of the Space Station environmental control and life support system design concept
NASA Technical Reports Server (NTRS)
Ray, C. D.; Humphries, W. R.
1986-01-01
The current status of the Space Station (SS) environmental control and life support system (ECLSS) design is outlined. The concept has been defined at the subsystem level. Data supporting these definitions are provided which identify general configurations for all modules. Requirements, guidelines, and assumptions used in generating these configurations are detailed. The basic two-US-module 'core' Space Station is addressed, along with system synergism issues and early man-tended and future growth considerations. Along with these basic studies, options related to variation in the 'core' module makeup and more austere Station concepts, such as commonality, automation, and design to cost, are also addressed.
Refraction effects of atmosphere on geodetic measurements to celestial bodies
NASA Technical Reports Server (NTRS)
Joshi, C. S.
1973-01-01
The problem of obtaining accurate values of refraction corrections for geodetic measurements of celestial bodies is considered. The basic principles of optics governing the phenomenon of refraction are defined, and differential equations are derived for the refraction corrections. The corrections fall into two main categories: (1) refraction effects due to change in the direction of propagation, and (2) refraction effects mainly due to change in the velocity of propagation. The various assumptions made by earlier investigators are reviewed along with the basic principles of improved models designed by investigators of the twentieth century. The accuracy problem for various quantities is discussed, and the conclusions and recommendations are summarized.
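For orientation, refraction corrections of the first category are classically expanded in the apparent zenith distance \(z\); a standard two-term form (not necessarily the exact series derived in this report) is

\[
R = A \tan z - B \tan^{3} z,
\]

where the coefficients \(A\) and \(B\) depend on the surface pressure and temperature of the adopted atmospheric model; near the horizon the expansion breaks down and the full differential equations must be integrated.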
NASA Technical Reports Server (NTRS)
Hoover, D. Q.
1976-01-01
Electric power plant costs and efficiencies are presented for three basic open-cycle MHD systems: (1) a direct coal-fired system, (2) a system with a separately fired air heater, and (3) a system burning low-Btu gas from an integrated gasifier. Power plant designs corresponding to the basic cases, with variation of major parameters, were developed, and the major system components were sized and costed. Flow diagrams describing each design are presented. The limitations of each design are discussed within the framework of the assumptions made.
Intospace a European industrial initiative to commercialise space
NASA Astrophysics Data System (ADS)
von der Lippe, Juergen K.; Sprenger, Heinz J.
2005-07-01
Intospace, founded in 1985, was the response to the government's request to provide evidence for the industrial promise of commercial utilisation of space systems such as Spacelab and the already planned space station. The company was set up with an exceptional structure comprising 95 shareholders from all over western Europe, drawn from space and non-space industry and financial institutions. The companies joined as shareholders and committed, beyond the basic capital, to cover financial losses up to a given limit, allowing the company to invest in market development. Compared to other commercial initiatives in the European space scenario, the product that Intospace was supposed to offer was without doubt the most demanding one regarding its market prospects. The primary product of Intospace was to provide services to commercial customers for using microgravity for research and production in space. This was based on the assumption that an effective operational infrastructure with frequent flights of Spacelab and Eureca would be available, leading finally to the space station with Columbus. A further assumption had been that basic research projects of the agencies would provide sufficient data as a basis for commercial project planning. The conflict with these assumptions is best illustrated by the fact that the lifetime of Intospace is framed by the two shuttle disasters: the Challenger accident, a couple of months after the foundation of Intospace, and the Columbia accident, with Spacehab on board, which led to the liquidation of the company. The paper presents the background behind the foundation of the Intospace initiative and describes the objectives and major strategic steps taken to develop the market.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This Feasibility Analysis covers a wide range of studies and evaluations. The Report is divided into five parts. Section 1 contains all material relating to the Institutional Assessment, including consideration of the requirements and position of the Potomac Electric Co. as they relate to cogeneration at Georgetown in parallel with the utility (Task 1). Sections 2 through 7 contain all technical information relating to the Alternative Subsystems Analysis (Task 4). This includes the energy demand profiles upon which the evaluations were based (Task 3). It further includes the results of the Life-Cycle-Cost Analyses (Task 5), which are developed in detail in the Appendix for evaluation in the Technical Report. Also included is the material relating to Incremental Savings and Optimization (Task 6) and the Conceptual Design for candidate alternate subsystems (Task 7). Section 8 contains all material relating to the Environmental Impact Assessment (Task 2). The Appendix contains supplementary material, including the budget cost estimates used in the life-cycle-cost analyses, the basic assumptions upon which the life-cycle analyses were developed, and the detailed life-cycle-cost analysis for each subsystem considered in detail.
Evaluation of computed tomography numbers for treatment planning of lung cancer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mira, J.G.; Fullerton, G.D.; Ezekiel, J.
1982-09-01
Computerized tomography numbers (CTN) were evaluated in 32 computerized tomography scans performed on patients with carcinoma of the lung, with the aim of assessing CTN in normal (lung, blood, muscle, etc.) and pathologic tissues (tumor, atelectasis, effusion, post-radiation fibrosis). Our main findings are: 1. Large individual CTN variations are encountered in both normal and pathologic tissues, above and below mean values. Hence, absolute numbers are meaningless; measurements of any abnormal intrathoracic structure should be compared with normal tissue CTN values in the same scan. 2. Tumor and complete atelectasis have CTN basically similar to soft tissue; hence, these numbers are not useful for differential diagnosis. 3. Effusions usually have lower CTN and can be distinguished from the previous situations. 4. Dosimetry based on uniform lung density assumptions (i.e., 300 mg/cm^3) might produce substantial dose errors, as lung CTN exhibit variations indicating densities well above and below this value. 5. Preliminary information indicates that partial atelectasis and incipient post-radiation fibrosis can have very low CTN; hence, they can be differentiated from solid tumors in certain cases, helping in the differential diagnosis of post-radiation recurrence within the radiotherapy field versus fibrosis.
Neural models on temperature regulation for cold-stressed animals
NASA Technical Reports Server (NTRS)
Horowitz, J. M.
1975-01-01
The present review evaluates several assumptions common to a variety of current models for thermoregulation in cold-stressed animals. Three areas covered by the models are discussed: signals to and from the central nervous system (CNS), portions of the CNS involved, and the arrangement of neurons within networks. Assumptions in each of these categories are considered. The evaluation of the models is based on the experimental foundations of the assumptions. Regions of the nervous system concerned here include the hypothalamus, the skin, the spinal cord, the hippocampus, and the septal area of the brain.
Marom, Gil; Bluestein, Danny
2016-01-01
This paper evaluated the influence of various numerical implementation assumptions on predicting blood damage in cardiovascular devices using Lagrangian methods with Eulerian computational fluid dynamics. The implementation assumptions that were tested included various seeding patterns, a stochastic walk model, and simplified trajectory calculations with pathlines. Post-processing implementation options that were evaluated included single-passage and repeated-passage stress accumulation and time averaging. This study demonstrated that the implementation assumptions can significantly affect the resulting stress accumulation, i.e., the blood damage model predictions. Careful consideration should be given to the use of Lagrangian models. Ultimately, the appropriate assumptions should be chosen based on the physics of the specific case, and sensitivity analyses, similar to the ones presented here, should be employed.
The unique world of the Everett version of quantum theory
NASA Astrophysics Data System (ADS)
Squires, Euan J.
1988-03-01
We ask whether the basic Everett assumption, that there are no changes of the wavefunction other than those given by the Schrödinger equation, is compatible with experience. We conclude that it is, provided we allow the world of observation to be partially a creation of consciousness. The model suggests the possible existence of quantum paranormal effects.
ERIC Educational Resources Information Center
Pressman, Harvey
This paper outlines several schemes for developing quality private schools for inner city students. The basic assumption justifying the proposal that such schools be independently managed is that the urban public school systems have patently failed to educate poor children. Therefore, a new national network of independent schools should be…
Enrichment Program for Academically Talented Junior High School Students from Low Income Families.
ERIC Educational Resources Information Center
Pressman, Harvey
A proposal for an enrichment program for academically talented junior high school students from low-income families in certain areas of Boston is presented. Basic assumptions are that there is an obvious and pressing need to give extra help to the able student from a disadvantaged background, and that a relatively brief enrichment experience for…
Redwoods—responsibilities for a long-lived species/resource
Robert Ewing
2017-01-01
What responsibilities do humans have to ensure that redwoods survive? And what values and strategies are required to accomplish such a purpose? A basic assumption is that the saving of a species, or more broadly of an ecosystem, is ultimately about human survival and that there is a responsibility to use all tools available to this end. To date, our actions to sustain...
Comments on ""Contact Diffusion Interaction of Materials with Cladding''
NASA Technical Reports Server (NTRS)
Morris, J. F.
1972-01-01
A Russian paper by A. A. Babad-Zakhryapina contributes much to the understanding of fuel-clad interactions, and thus to nuclear thermionic technology. In that publication the basic diffusion expression is a simple one. The present work derives a more general but complicated equation for this mass transport. With appropriate assumptions, however, the new relation reduces to Babad-Zakhryapina's version.
First order ball bearing kinematics
NASA Technical Reports Server (NTRS)
Kingbury, E.
1984-01-01
Two first order equations are given connecting geometry and internal motions in an angular contact ball bearing. Total speed, kinematic equivalence, basic speed ratio, and modal speed ratio are defined and discussed; charts are given for the speed ratios covering all bearings and all rotational modes. Instances where specific first order assumptions might fail are discussed, and the resulting effects on bearing performance reviewed.
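For a stationary outer race, the standard first-order kinematic relations (textbook pure-rolling form; the report's two equations are not quoted in the abstract and may be organized differently) are

\[
\frac{\omega_c}{\omega_i} = \frac{1 - \gamma}{2}, \qquad
\frac{\omega_b}{\omega_i} = \frac{d_m}{2d}\left(1 - \gamma^{2}\right), \qquad
\gamma = \frac{d \cos \alpha}{d_m},
\]

where \(\omega_i\), \(\omega_c\), and \(\omega_b\) are the inner-ring, cage, and ball-spin speeds, \(d\) the ball diameter, \(d_m\) the pitch diameter, and \(\alpha\) the contact angle. First-order assumptions (pure rolling, rigid geometry, negligible gyroscopic effects) fail under gross skidding or heavy ball spin, the kinds of instances the abstract alludes to.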
Christoph Kleinn; Goran Stahl
2009-01-01
Current research in forest inventory focuses very much on technical-statistical problems geared mainly to the optimization of data collection and information generation. The basic assumption is that better information leads to better decisions and, therefore, to better forest management and forest policy. Not many studies, however, strive to explicitly establish the...
Four Scenarios for Determining the Size and Reusability of Learning Objects
ERIC Educational Resources Information Center
Schoonenboom, Judith
2012-01-01
The best method for determining the size of learning objects (LOs) so as to optimise their reusability has been a topic of debate for years now. Although there appears to be agreement on basic assumptions, developed guidelines and principles are often in conflict. This study shows that this confusion stems from the fact that in the literature,…
Data-Driven Leadership: Determining Your Indicators and Building Your Dashboards
ERIC Educational Resources Information Center
Copeland, Mo
2016-01-01
For years, schools have tended to approach budgets with some basic assumptions and aspirations and general wish lists but with scant data to drive the budget conversation. Suppose there were a better way? What if the conversation started with a review of the last five to ten years of data on three key mission- and strategy-driven indicators:…
The "Cause" of Low Self-Control: The Influence of Maternal Self-Control
ERIC Educational Resources Information Center
Nofziger, Stacey
2008-01-01
Self-control theory is one of the most tested theories within the field of criminology. However, one of the basic assumptions of the theory has remained largely ignored. Gottfredson and Hirschi stated that the focus of their general theory of crime is the "connection between the self-control of the parent and the subsequent self-control of the…
Consumption of Mass Communication--Construction of a Model on Information Consumption Behaviour.
ERIC Educational Resources Information Center
Sepstrup, Preben
A general conceptual model on the consumption of information is introduced. Information as the output of the mass media is treated as a product, and a model on the consumption of this product is developed by merging elements from consumer behavior theory and mass communication theory. Chapter I gives basic assumptions about the individual and the…
ERIC Educational Resources Information Center
Sells, Scott P.
A model for treating difficult adolescents and their families is presented. Part 1 offers six basic assumptions about the causes of severe behavioral problems and presents the treatment model with guidelines necessary to address each of these six causes. Case examples highlight and clarify major points within each of the 15 procedural steps of the…
The Nature of Living Systems: An Exposition of the Basic Concepts in General Systems Theory.
ERIC Educational Resources Information Center
Miller, James G.
General systems theory is a set of related definitions, assumptions, and propositions which deal with reality as an integrated hierarchy of organizations of matter and energy. In this paper, the author defines the concepts of space, time, matter, energy, and information in terms of their meaning in general systems theory. He defines a system as a…
Giaquinto, Marcus
2017-02-19
How can we acquire a grasp of cardinal numbers, even the first very small positive cardinal numbers, given that they are abstract mathematical entities? That problem of cognitive access is the main focus of this paper. All the major rival views about the nature and existence of cardinal numbers face difficulties; and the view most consonant with our normal thought and talk about numbers, the view that cardinal numbers are sizes of sets, runs into the cognitive access problem. The source of the problem is the plausible assumption that cognitive access to something requires causal contact with it. It is argued that this assumption is in fact wrong, and that in this and similar cases, we should accept that a certain recognize-and-distinguish capacity is sufficient for cognitive access. We can then go on to solve the cognitive access problem, and thereby support the set-size view of cardinal numbers, by paying attention to empirical findings about basic number abilities. To this end, some selected studies of infants, pre-school children and a trained chimpanzee are briefly discussed.This article is part of a discussion meeting issue 'The origins of numerical abilities'. © 2017 The Author(s).
On Assisting a Visual-Facial Affect Recognition System with Keyboard-Stroke Pattern Information
NASA Astrophysics Data System (ADS)
Stathopoulou, I.-O.; Alepis, E.; Tsihrintzis, G. A.; Virvou, M.
Towards realizing a multimodal affect recognition system, we are considering the advantages of assisting a visual-facial expression recognition system with keyboard-stroke pattern information. Our work is based on the assumption that the visual-facial and keyboard modalities are complementary to each other and that their combination can significantly improve the accuracy in affective user models. Specifically, we present and discuss the development and evaluation process of two corresponding affect recognition subsystems, with emphasis on the recognition of 6 basic emotional states, namely happiness, sadness, surprise, anger and disgust as well as the emotion-less state which we refer to as neutral. We find that emotion recognition by the visual-facial modality can be aided greatly by keyboard-stroke pattern information and the combination of the two modalities can lead to better results towards building a multimodal affect recognition system.
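A toy sketch of the complementarity assumption using weighted late fusion of the two modalities' class probabilities follows; the class set matches the six states above, but the probability values, the weight, and the fusion rule itself are illustrative assumptions, not the authors' published combination scheme.

    import numpy as np

    classes = ["happiness", "sadness", "surprise", "anger", "disgust", "neutral"]
    p_face = np.array([0.30, 0.05, 0.35, 0.10, 0.05, 0.15])   # visual-facial
    p_keys = np.array([0.10, 0.05, 0.10, 0.45, 0.10, 0.20])   # keyboard-stroke

    w = 0.5                                  # hypothetical equal channel weighting
    p_fused = w * p_face + (1 - w) * p_keys  # weighted late fusion
    print("fused:", dict(zip(classes, p_fused.round(3))))
    print("predicted state:", classes[int(np.argmax(p_fused))])

Here the facial channel alone would output 'surprise' while the fused distribution favors 'anger', illustrating how a second modality can override an ambiguous facial read.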
NASA Astrophysics Data System (ADS)
Jalbout, Abraham F.; Roy, Amlan K.; Shipar, Abul Haider; Ahmed, M. Samsuddin
Theoretical energy changes of various intermediates leading to the formation of the Amadori rearrangement products (ARPs) under different mechanistic assumptions have been calculated, using open-chain glucose (O-Glu)/closed-chain glucose (A-Glu and B-Glu) and glycine (Gly) as a model for the Maillard reaction. Density functional theory (DFT) computations have been applied to the proposed mechanisms under different pH conditions. Thus, the possibility of the formation of different compounds and the electronic energy changes for different steps in the proposed mechanisms have been evaluated. B-Glu has been found to be more efficient than A-Glu, and A-Glu more efficient than O-Glu in the reaction. The reaction under basic conditions is the most favorable for the formation of ARPs. Other reaction pathways have been computed and discussed in this work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, Curtis M.
2006-11-27
A screening and ranking framework (SRF) has been developed to evaluate potential geologic carbon dioxide (CO2) storage sites on the basis of health, safety, and environmental (HSE) risk arising from CO2 leakage. The approach is based on the assumption that CO2 leakage risk is dependent on three basic characteristics of a geologic CO2 storage site: (1) the potential for primary containment by the target formation; (2) the potential for secondary containment if the primary formation leaks; and (3) the potential for attenuation and dispersion of leaking CO2 if the primary formation leaks and secondary containment fails. The framework is implemented in a spreadsheet in which users enter numerical scores representing expert opinions or published information, along with estimates of uncertainty. Applications to three sites in California demonstrate the approach. Refinements and extensions are possible through the use of more detailed data or model results in place of property proxies.
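The scoring-with-uncertainty idea can be sketched in a few lines; the attributes mirror the three characteristics above, but the scores, uncertainty ranges, and the multiplicative aggregation rule are hypothetical stand-ins for the SRF's actual proxies and arithmetic.

    import numpy as np

    rng = np.random.default_rng(3)
    # (mean score on [0, 1], uncertainty half-width) per characteristic;
    # 1 = favorable. All values are invented for two fictitious sites.
    sites = {
        "site_A": {"primary": (0.8, 0.1), "secondary": (0.6, 0.2), "attenuation": (0.7, 0.1)},
        "site_B": {"primary": (0.5, 0.3), "secondary": (0.4, 0.2), "attenuation": (0.9, 0.1)},
    }

    def draws(mean, half, n=10_000):
        # propagate the stated uncertainty by uniform sampling, clipped to [0, 1]
        return np.clip(rng.uniform(mean - half, mean + half, n), 0.0, 1.0)

    for name, attrs in sites.items():
        fav = np.prod([draws(m, h) for m, h in attrs.values()], axis=0)
        lo, hi = np.percentile(fav, [5, 95])
        print(f"{name}: favorability {fav.mean():.2f} (5th-95th pct {lo:.2f}-{hi:.2f})")

The multiplicative rule encodes the idea that leakage risk stays low only if primary containment, secondary containment, and attenuation all perform; a site with one weak characteristic scores poorly even if the others are strong.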
Hoffman, Steve G
2015-04-01
Some scholars dismiss the distinction between basic and applied science as passé, yet substantive assumptions about this boundary remain obdurate in research policy, popular rhetoric, the sociology and philosophy of science, and, indeed, at the level of bench practice. In this article, I draw on a multiple ontology framework to provide a more stable affirmation of a constructivist position in science and technology studies that cannot be reduced to a matter of competing perspectives on a single reality. The analysis is grounded in ethnographic research in the border zone of Artificial Intelligence science. I translate in-situ moments in which members of neighboring but differently situated labs engage in three distinct repertoires that render the reality of basic and applied science: partitioning, flipping, and collapsing. While the essences of scientific objects are nowhere to be found, the boundary between basic and applied is neither illusion nor mere propaganda. Instead, distinctions among scientific knowledge are made real as a matter of course.
On the physical parameters for Centaurus X-3 and Hercules X-1.
NASA Technical Reports Server (NTRS)
Mccluskey, G. E., Jr.; Kondo, Y.
1972-01-01
It is shown how upper and lower limits on the physical parameters of the X-ray sources in Centaurus X-3 and Hercules X-1 may be determined from a reasonably simple and straightforward consideration. The basic assumption is that component A (the non-X-ray-emitting component) is not a star collapsing toward its Schwarzschild radius (i.e., a black hole). This assumption appears reasonable since component A, the central occulting star, appears to physically occult component X. If component A is a 'normal' star, both observation and theory indicate that its mass is not greater than about 60 solar masses. The possibility that component X is either a neutron star or a white dwarf is considered.
Links between causal effects and causal association for surrogacy evaluation in a gaussian setting.
Conlon, Anna; Taylor, Jeremy; Li, Yun; Diaz-Ordaz, Karla; Elliott, Michael
2017-11-30
Two paradigms for the evaluation of surrogate markers in randomized clinical trials have been proposed: the causal effects paradigm and the causal association paradigm. Each of these paradigms relies on assumptions that must be made to proceed with estimation and to validate a candidate surrogate marker (S) for the true outcome of interest (T). We consider the setting in which S and T are Gaussian and are generated from structural models that include an unobserved confounder. Under the assumed structural models, we relate the quantities used to evaluate surrogacy within both the causal effects and causal association frameworks. We review some of the common assumptions made to aid in estimating these quantities and show that assumptions made within one framework can imply strong assumptions within the alternative framework. We demonstrate that there is a similarity, but not exact correspondence, between the quantities used to evaluate surrogacy within each framework, and show that the conditions for identifiability of the surrogacy parameters are different from the conditions that lead to a correspondence of these quantities. Copyright © 2017 John Wiley & Sons, Ltd.
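As a concrete, hedged illustration of the class of models described (the notation is invented here, not the authors'), a Gaussian structural model with randomized treatment Z and an unobserved confounder U might read

$$
S = \alpha_0 + \alpha_1 Z + \alpha_2 U + \varepsilon_S, \qquad
T = \beta_0 + \beta_1 Z + \beta_2 S + \beta_3 U + \varepsilon_T,
$$

with $Z \perp U$ by randomization and Gaussian errors. The causal-effects and causal-association quantities are then functions of these coefficients, and the identifiability conditions discussed above amount to restrictions on the confounding paths $\alpha_2$ and $\beta_3$.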
Mel'nikova, Ye B
2017-05-01
Night-time changes in bioluminescence intensity in the coastal area of the Black Sea were recorded. It was noted that the biomass of luminous organisms is closely correlated with the biomass of plankton and other pelagic organisms, including commercial pelagic fish. The parameters of plankton communities' basic biological rhythms were determined using the discrete Fourier transform method. These rhythms were manifest as spatial and temporal changes in the bioluminescence intensity. It was shown that changes in the bioluminescence intensity over a 14.0-h period were due to the duration of the light/dark cycles. By contrast, changes in bioluminescence intensity with periods of 4.7 and 2.8 h were due to the endogenous rhythms of the plankton community (feeding and cell division). An original method for evaluating errors in the calculated periods of the biological rhythms was proposed. A strong correlation (r = 0.906) was observed between the measured and calculated values for the bioluminescence intensity, which provided support for the assumptions made. Copyright © 2016 John Wiley & Sons, Ltd.
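A minimal sketch of the period-extraction step, assuming a uniformly sampled intensity record; the sampling step and the synthetic signal below are invented for illustration, not the Black Sea data.

```python
# Recover dominant rhythm periods from a time series with the discrete
# Fourier transform; synthetic signal contains the 14.0, 4.7, 2.8 h modes.
import numpy as np

dt_hours = 0.1                      # assumed sampling step
t = np.arange(0, 14, dt_hours)      # one night-time record
x = (np.sin(2*np.pi*t/14.0) + 0.5*np.sin(2*np.pi*t/4.7)
     + 0.3*np.sin(2*np.pi*t/2.8) + 0.1*np.random.randn(t.size))

spec = np.abs(np.fft.rfft(x - x.mean()))**2      # power spectrum
freq = np.fft.rfftfreq(x.size, d=dt_hours)       # cycles per hour
peaks = freq[np.argsort(spec)[::-1][:3]]         # three strongest bins
print("dominant periods (h):", np.sort(1.0 / peaks[peaks > 0]))
```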
How rational should bioethics be? The value of empirical approaches.
Alvarez, A A
2001-10-01
Rational justification of claims with empirical content calls for empirical and not only normative philosophical investigation. Empirical approaches to bioethics are epistemically valuable, i.e., such methods may be necessary in providing and verifying basic knowledge about cultural values and norms. Our assumptions in moral reasoning can be verified or corrected using these methods. Moral arguments can be initiated or adjudicated by data drawn from empirical investigation. One may argue that individualistic informed consent, for example, is not compatible with the Asian communitarian orientation. But this normative claim uses an empirical assumption that may be contrary to the fact that some Asians do value and argue for informed consent. Is it necessary and factual to neatly characterize some cultures as individualistic and some as communitarian? Empirical investigation can provide a reasonable way to inform such generalizations. In a multi-cultural context, such as in the Philippines, there is a need to investigate the nature of the local ethos before making any appeal to authenticity. Otherwise we may succumb to the same ethical imperialism we are trying hard to resist. Normative claims that involve empirical premises cannot be reasonably verified or evaluated without utilizing empirical methods along with philosophical reflection. The integration of empirical methods into the standard normative approach to moral reasoning should be reasonably guided by the epistemic demands of claims arising from cross-cultural discourse in bioethics.
Training clinicians in cultural psychiatry: a Canadian perspective.
Kirmayer, Laurence J; Rousseau, Cécile; Guzder, Jaswant; Jarvis, G Eric
2008-01-01
The authors summarize the pedagogical approaches and curriculum used in the training of clinicians in cultural psychiatry at the Division of Social and Transcultural Psychiatry, McGill University. We reviewed available published and unpublished reports on the history and development of training in cultural psychiatry at McGill to identify the main orientations, teaching methods, curriculum, and course content. Student evaluations of teaching were reviewed. The training strategies and curriculum are related to the larger social context of Canadian society including the history of migration, current demography, and policies of multiculturalism. The McGill program includes core teaching, clinical rotations, an intensive summer program, and annual Advanced Study Institutes. The interdisciplinary training setting emphasizes general knowledge rather than specific ethnocultural groups, including: understanding the cultural assumptions implicit in psychiatric theory and practice; exploring the clinician's personal and professional identity and social position; evidence-based conceptual frameworks for understanding the interaction of culture and psychopathology; learning to use an expanded version of the cultural formulation in DSM-IV for diagnostic assessment and treatment planning; and developing skills for working with interpreters and culture-brokers, who mediate and interpret the cultural meaning and assumptions of patient and clinician. An approach to cultural psychiatry grounded in basic social science perspectives and in trainees' appreciation of their own background can prepare clinicians to respond effectively to the changing configurations of culture, ethnicity, and identity in contemporary health care settings.
Marketing and population problems.
Farley, J U; Leavitt, H J
1971-07-01
There are many elements in population programs that are more familiar to marketing men than to some population experts. Advertising is essential to reach the target population, and advertising evaluation techniques (e.g., surrogate indexes or audience measures) might be useful for evaluating both population information activities and the impact of the entire program. Fundamental research on basic demand for fertility control is needed, and a marketer's experience with planning and evaluating test markets can be useful in assessing potential selling targets and evaluating alternative promotional and distributional strategies. Special family planning clinics have certain disadvantages: expensive and scarce personnel are needed; red tape may be present; the network is based on the assumption that the client is willing to travel relatively great distances repeatedly; and clinics lack anonymity, which may scare potential acceptors away. Most developing cultures have an intensively functioning distribution structure which delivers basic commodities to the most remote areas, providing relatively anonymous outlets that are physically close to the customers. Materials requiring a prescription might be distributed in exchange for scrip issued at and ultimately redeemed by clinics, thus requiring only an occasional visit to a clinic. Mail-order service can be used to supplement a clinic's distribution of some contraceptives. It should be remembered that population administrators often have an antipathetic view toward business and marketing and "suspect" the profit motive.
2018-01-01
Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Despite being applied to over 1,000 fish populations, and an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluate the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, Danny H; Elwood Jr, Robert H
2011-01-01
Analysis of the material protection, control, and accountability (MPC&A) system is necessary to understand the limits and vulnerabilities of the system to internal threats. A self-appraisal helps the facility be prepared to respond to internal threats and reduce the risk of theft or diversion of nuclear material. The material control and accountability (MC&A) system effectiveness tool (MSET) fault tree was developed to depict the failure of the MPC&A system as a result of poor practices and random failures in the MC&A system. It can also be employed as a basis for assessing deliberate threats against a facility. MSET uses fault tree analysis, which is a top-down approach to examining system failure. The analysis starts with identifying a potential undesirable event called a 'top event' and then determining the ways it can occur (e.g., 'Fail To Maintain Nuclear Materials Under The Purview Of The MC&A System'). The analysis proceeds by determining how the top event can be caused by individual or combined lower level faults or failures. These faults, which are the causes of the top event, are 'connected' through logic gates. The MSET model uses AND-gates and OR-gates and propagates the effect of event failure using Boolean algebra. To enable the fault tree analysis calculations, the basic events in the fault tree are populated with probability risk values derived by conversion of questionnaire data to numeric values. The basic events are treated as independent variables. This assumption affects the Boolean algebraic calculations used to calculate results. All the necessary calculations are built into the fault tree codes, but it is often useful to estimate the probabilities manually as a check on code functioning. The probability of failure of a given basic event is the probability that the basic event primary question fails to meet the performance metric for that question. The failure probability is related to how well the facility performs the task identified in that basic event over time (not just one performance or exercise). Fault tree calculations provide a failure probability for the top event in the fault tree. The basic fault tree calculations establish a baseline relative risk value for the system. This probability depicts relative risk, not absolute risk. Subsequent calculations are made to evaluate the change in relative risk that would occur if system performance is improved or degraded. During the development effort of MSET, the fault tree analysis program used was SAPHIRE. SAPHIRE is an acronym for 'Systems Analysis Programs for Hands-on Integrated Reliability Evaluations.' Version 1 of the SAPHIRE code was sponsored by the Nuclear Regulatory Commission in 1987 as an innovative way to draw, edit, and analyze graphical fault trees primarily for safe operation of nuclear power reactors. When the fault tree calculations are performed, the fault tree analysis program will produce several reports that can be used to analyze the MPC&A system. SAPHIRE produces reports showing risk importance factors for all basic events in the operational MC&A system. The risk importance information is used to examine the potential impacts when performance of certain basic events increases or decreases. The initial results produced by the SAPHIRE program are considered relative risk values.
None of the results can be interpreted as absolute risk values since the basic event probability values represent estimates of risk associated with the performance of MPC&A tasks throughout the material balance area (MBA). The RRR (risk reduction ratio) for a basic event represents the decrease in total system risk that would result from improvement of that one event to a perfect performance level. Improvement of the basic event with the greatest RRR value produces a greater decrease in total system risk than improvement of any other basic event. Basic events with the greatest potential for system risk reduction are assigned performance improvement values, and new fault tree calculations show the improvement in total system risk. The operational impact or cost-effectiveness from implementing the performance improvements can then be evaluated. The improvements being evaluated can be system performance improvements, or they can be potential, or actual, upgrades to the system. The RIR (risk increase ratio) for a basic event represents the increase in total system risk that would result from failure of that one event. Failure of the basic event with the greatest RIR value produces a greater increase in total system risk than failure of any other basic event. Basic events with the greatest potential for system risk increase are assigned failure performance values, and new fault tree calculations show the increase in total system risk. This evaluation shows the importance of preventing performance degradation of the basic events. SAPHIRE identifies combinations of basic events where concurrent failure of the events results in failure of the top event.
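A minimal sketch of the Boolean propagation and the RRR/RIR importance calculations described above, under the same independence assumption; the three-event tree and its probabilities are invented placeholders, not MSET data.

```python
# Fault tree propagation for independent basic events, plus RRR/RIR-style
# importance measures computed by perfecting or failing one event at a time.
def or_gate(*p):   # P(A or B) = 1 - prod(1 - p_i) for independent events
    out = 1.0
    for q in p:
        out *= (1.0 - q)
    return 1.0 - out

def and_gate(*p):  # P(A and B) = prod(p_i) for independent events
    out = 1.0
    for q in p:
        out *= q
    return out

def top_event(basic):
    # hypothetical tree: top fails if (records AND measurement) fail, or inventory fails
    return or_gate(and_gate(basic["records"], basic["measurement"]),
                   basic["inventory"])

basic = {"records": 0.05, "measurement": 0.10, "inventory": 0.02}
base = top_event(basic)

for name in basic:
    perfect = dict(basic, **{name: 0.0})   # RRR-style: event made perfect
    failed = dict(basic, **{name: 1.0})    # RIR-style: event failed
    print(name,
          "RRR:", round(base / top_event(perfect), 2),
          "RIR:", round(top_event(failed) / base, 2))
```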
Is the hypothesis of preimplantation genetic screening (PGS) still supportable? A review.
Gleicher, Norbert; Orvieto, Raoul
2017-03-27
The hypothesis of preimplantation genetic screening (PGS) was first proposed 20 years ago, suggesting that elimination of aneuploid embryos prior to transfer will improve implantation rates of remaining embryos during in vitro fertilization (IVF), increase pregnancy and live birth rates and reduce miscarriages. The aforementioned improved outcome was based on 5 essential assumptions: (i) Most IVF cycles fail because of aneuploid embryos. (ii) Their elimination prior to embryo transfer will improve IVF outcomes. (iii) A single trophectoderm biopsy (TEB) at blastocyst stage is representative of the whole TE. (iv) TE ploidy reliably represents the inner cell mass (ICM). (v) Ploidy does not change (i.e., self-correct) downstream from blastocyst stage. We aim to offer a review of the aforementioned assumptions and challenge the general hypothesis of PGS. We reviewed 455 publications, which as of January 20, 2017 were listed in PubMed under the search phrase 'preimplantation genetic screening (PGS) for aneuploidy'. The literature review was performed by both authors, who agreed on the final 55 references. Various reports over the last 18 months have raised significant questions not only about the basic clinical utility of PGS but the biological underpinnings of the hypothesis and the technical ability of a single trophectoderm (TE) biopsy to accurately assess an embryo's ploidy, and suggested that PGS actually negatively affects IVF outcomes while not affecting miscarriage rates. Moreover, due to high rates of false positive diagnoses as a consequence of high mosaicism rates in TE, PGS leads to the discarding of large numbers of normal embryos with potential for normal euploid pregnancies if transferred rather than disposed of. We found all 5 basic assumptions underlying the hypothesis of PGS to be unsupported: (i) The association of embryo aneuploidy with IVF failure has to be reevaluated in view of how much more common TE mosaicism is than has until recently been appreciated. (ii) Reliable elimination of presumed aneuploid embryos prior to embryo transfer appears unrealistic. (iii) Mathematical models demonstrate that a single TEB cannot provide reliable information about the whole TE. (iv) TE does not reliably reflect the ICM. (v) Embryos, likely, still have strong innate ability to self-correct downstream from blastocyst stage, with ICM doing so better than TE. The hypothesis of PGS, therefore, no longer appears supportable. With all 5 basic assumptions underlying the hypothesis of PGS demonstrated to have been mistaken, the hypothesis of PGS, itself, appears to be discredited. Clinical use of PGS for the purpose of IVF outcome improvements should, therefore, going forward be restricted to research studies.
Marom, Gil; Bluestein, Danny
2016-01-01
This paper evaluated the influence of various numerical implementation assumptions on predicting blood damage in cardiovascular devices using Lagrangian methods with Eulerian computational fluid dynamics. The implementation assumptions that were tested included various seeding patterns, stochastic walk model, and simplified trajectory calculations with pathlines. Post-processing implementation options that were evaluated included single-passage and repeated-passage stress accumulation and time averaging. This study demonstrated that the implementation assumptions can significantly affect the resulting stress accumulation, i.e., the blood damage model predictions. Careful consideration should be given to the use of Lagrangian models. Ultimately, the appropriate assumptions should be chosen based on the physics of the specific case, and sensitivity analyses, similar to the ones presented here, should be employed. PMID:26679833
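A minimal sketch of single-passage stress accumulation along pathlines, one of the post-processing options mentioned; the linear accumulation rule and the sample values are illustrative assumptions rather than the paper's specific blood damage model.

```python
# Accumulate scalar shear stress along each Lagrangian pathline:
# SA = sum(tau_i * dt_i). Data below are invented placeholders.
import numpy as np

def stress_accumulation(tau, dt):
    """tau: stresses [Pa] sampled along one pathline; dt: time steps [s]."""
    return float(np.sum(np.asarray(tau) * np.asarray(dt)))

# three hypothetical pathlines through a device; the repeated-passage
# option would concatenate successive passages before accumulating
pathlines = [
    (np.array([5.0, 20.0, 8.0]),  np.array([1e-3, 5e-4, 1e-3])),
    (np.array([3.0, 15.0, 4.0]),  np.array([1e-3, 1e-3, 1e-3])),
    (np.array([9.0, 40.0, 12.0]), np.array([5e-4, 5e-4, 5e-4])),
]
sa = [stress_accumulation(tau, dt) for tau, dt in pathlines]
print("per-pathline SA:", sa, "ensemble mean:", np.mean(sa))
```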
Archibald, Thomas; Sharrock, Guy; Buckley, Jane; Cook, Natalie
2016-12-01
Unexamined and unjustified assumptions are the Achilles' heel of development programs. In this paper, we describe an evaluation capacity building (ECB) approach designed to help community development practitioners work more effectively with assumptions through the intentional infusion of evaluative thinking (ET) into the program planning, monitoring, and evaluation process. We focus specifically on one component of our ET promotion approach involving the creation and analysis of theory of change (ToC) models. We describe our recent efforts to pilot this ET ECB approach with Catholic Relief Services (CRS) in Ethiopia and Zambia. The use of ToC models, plus the addition of ET, is a way to encourage individual and organizational learning and adaptive management that supports more reflective and responsive programming. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Xin; Tu, Chuanyi; Marsch, Eckart; He, Jiansen; Wang, Linghua
2016-01-01
Turbulence in the solar wind was recently reported to be anisotropic, with the average power spectral index close to -2 when sampling parallel to the local mean magnetic field B0 and close to -5/3 when sampling perpendicular to the local B0. This result was widely considered to be observational evidence for the critical balance theory (CBT), which is derived by making the assumption that the turbulence strength is close to one. However, this basic assumption has not yet been checked carefully with observational data. Here we present for the first time the scale-dependent magnetic-field fluctuation amplitude, which is normalized by the local B0 and evaluated for both parallel and perpendicular sampling directions, using two 30-day intervals of Ulysses data. From our results, the turbulence strength is evaluated as much less than one at small scales in the parallel direction. An even stricter criterion is imposed when selecting the wavelet coefficients for a given sampling direction, so that the time stationarity of the local B0 is better ensured during the local sampling interval. The spectral index for the parallel direction is then found to be -1.75, whereas the spectral index in the perpendicular direction remains close to -1.65. These two new results, namely that the value of the turbulence strength is much less than one in the parallel direction and that the angle dependence of the spectral index is weak, cannot be explained by existing turbulence theories, like CBT, and thus will require new theoretical considerations and promote further observations of solar-wind turbulence.
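A minimal sketch of the core diagnostic, a scale-dependent fluctuation amplitude normalized by the local mean field, computed here from increments of a synthetic field; the data, scales, and the standard spectral-index relation $P(k) \propto k^{-(2\alpha+1)}$ for $\delta b \propto \ell^{\alpha}$ are assumptions for illustration, not the Ulysses analysis.

```python
# Scale-dependent normalized amplitude (turbulence strength proxy) and a
# spectral-index estimate from its scaling; synthetic random-walk field.
import numpy as np

rng = np.random.default_rng(0)
b = np.cumsum(rng.standard_normal((100000, 3)), axis=0) * 1e-3 + [5.0, 0.0, 0.0]

def normalized_amplitude(b, lag):
    db = b[lag:] - b[:-lag]                      # field increments at this scale
    b0 = 0.5 * (b[lag:] + b[:-lag])              # local mean field proxy
    return np.mean(np.linalg.norm(db, axis=1) /
                   np.linalg.norm(b0, axis=1))

lags = np.array([10, 30, 100, 300, 1000])
amp = np.array([normalized_amplitude(b, l) for l in lags])
alpha = np.polyfit(np.log(lags), np.log(amp), 1)[0]   # delta b ~ lag^alpha
print("spectral index estimate:", -(2 * alpha + 1))   # P(k) ~ k^-(2*alpha+1)
```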
ERIC Educational Resources Information Center
Field, John; Schemmann, Michael
2017-01-01
The article analyses how citizenship is conceptualised in policy documents of four key international organisations. The basic assumption is that public policy has not turned away from adult learning for active citizenship, but that there are rather new ways in which international governmental organisations conceptualise and in some cases seek to…
Code of Federal Regulations, 2010 CFR
2010-04-01
... representative experience may be used as an assumed retirement age. Different basic assumptions or rates may be used for different classes of risks or different groups where justified by conditions or required by... proper, or except when a change is necessitated by reason of the use of different methods, factors...
ERIC Educational Resources Information Center
Hodgson, Ann; Steer, Richard; Spours, Ken; Edward, Sheila; Coffield, Frank; Finlay, Ian; Gregson, Maggie
2007-01-01
The English Learning and Skills Sector (LSS) contains a highly diverse range of learners and covers all aspects of post-16 learning with the exception of higher education. In the research on which this paper is based we are concerned with the effects of policy on three types of learners--unemployed adults attempting to improve their basic skills…
ERIC Educational Resources Information Center
Zigarmi, Drea; Roberts, Taylor Peyton
2017-01-01
Purpose: This study aims to test the following three assertions underlying the Situational Leadership® II (SLII) Model: all four leadership styles are received by followers; all four leadership styles are needed by followers; and if there is a fit between the leadership style a follower receives and needs, that follower will demonstrate favorable…
The Trouble with Levels: A Reexamination of Craik and Lockhart's Framework for Memory Research
ERIC Educational Resources Information Center
Baddeley, Alan D.
1978-01-01
Begins by discussing a number of problems in applying a levels-of-processing approach to memory as proposed in the late 1960s and then revised in 1972 by Craik and Lockhart, suggests that some of the basic assumptions are false, and argues for information-processing models devised to study working memory and reading, which aim to explore the…
Modernism, Postmodernism, or Neither? A Fresh Look at "Fine Art"
ERIC Educational Resources Information Center
Kamhi, Michelle Marder
2006-01-01
Numerous incidents have been reported in recent years wherein a work of art is mistaken as trash. The question is, how have people reached the point in the civilized world where a purported work of art cannot be distinguished from a pile of rubbish or a grid of condensation pipes? The answer to that question lies in the basic assumption of nearly…
ERIC Educational Resources Information Center
El-Sherbini, Magda; Wilson, Amanda J
2007-01-01
The focus of this paper is to examine the current library practice of processing and delivering information and to introduce alternative scenarios that may keep librarians relevant in the technological era. In the scenarios presented here, the authors will attempt to challenge basic assumptions about the usefulness of and need for OPAC systems,…
ERIC Educational Resources Information Center
Hovardas, Tasos
2013-01-01
The aim of the paper is to make a critical reading of ecocentrism and its meta-scientific use of ecology. First, basic assumptions of ecocentrism will be examined, which involve nature's intrinsic value, postmodern and modern positions in ecocentrism, and the subject-object dichotomy under the lenses of ecocentrism. Then, we will discuss…
ERIC Educational Resources Information Center
Abrams, Joan
Based on the assumption that the content and symbolism of nursery rhymes reflect the particular needs of those who respond to them, this paper analyzes Mother Goose rhymes in relation to the psychological stages of child development. Each basic need of the child, as defined in Bruno Bettelheim's "The Uses of Enchantment," is applied to…
ERIC Educational Resources Information Center
Rienties, Bart; Lewis, Tim; McFarlane, Ruth; Nguyen, Quan; Toetenel, Lisette
2018-01-01
Language education has a rich history of research and scholarship focusing on the effectiveness of learning activities and the impact these have on student behaviour and outcomes. One of the basic assumptions in foreign language pedagogy and CALL in particular is that learners want to be able to communicate effectively with native speakers of…
Changing the Culture of Fuel Efficiency: A Change in Attitude
2014-05-09
(2011, September). Organizational Culture: Assessment and Transformation. Journal of Change Management, 11(3), 305-328. Bandura, A. (1986). Social ... describes that "organizational culture is a set of basic assumptions that a group has invented, discovered or developed in learning to cope with its ..." change. In the first category, the most influential factors found are leadership, attraction-selection-attrition, socialization, and reward systems.
1975-01-01
Studies Program. The results of AGARD work are reported to the member nations and the NATO Authorities through the AGARD series of publications ... calculated based on a low-altitude mission profile. 2. GROUND RULES AND BASIC ASSUMPTIONS. Base Design: All aircraft synthesized for this study are ... In this study, manoeuvrability is defined in terms of specific excess power (as shown in Fig. 5) at specified Mach number, altitude, and load
ERIC Educational Resources Information Center
Allison, Derek J.
Focusing on the problem of authority, an analysis of the theories of Max Weber, James D. Thompson, and Elliott Jaques forms the basis for this proposal for improved organizational effectiveness in public schools. Basic assumptions are that modern organizations are established and operated under rational principles and subject to rational analysis,…
High Voltage Testing. Volume 2. Specifications and Test Procedures
1982-08-01
the greatest impact on the initial assumptions and criteria developed in the published criteria documents include: dielectric withstanding voltage ... 3382-75 Measurement of Energy and Integrated Charge Transfer Due to Partial Discharges (Corona) Using Bridge Techniques. ASTM-D 3426 - Dielectric ... Energy (NEMA Publication No. WC 7-1971). NEMA Publication No. 109 - AIEE-EEI-NEMA Standard Basic Insulation Level. 092-57 - Method of Test for Flash and
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simonen, E.P.; Johnson, K.I.; Simonen, F.A.
The Vessel Integrity Simulation Analysis (VISA-II) code was developed to allow calculations of the failure probability of a reactor pressure vessel subject to defined pressure/temperature transients. A version of the code, revised by Pacific Northwest Laboratory for the US Nuclear Regulatory Commission, was used to evaluate the sensitivities of calculated through-wall flaw probability to material, flaw, and calculational assumptions. Probabilities were more sensitive to flaw assumptions than to material or calculational assumptions. Alternative flaw assumptions changed the probabilities by one to two orders of magnitude, whereas alternative material assumptions typically changed the probabilities by a factor of two or less. Flaw shape, flaw through-wall position, and flaw inspection were sensitivities examined. Material property sensitivities included the assumed distributions in copper content and fracture toughness. Methods of modeling flaw propagation that were evaluated included arrest/reinitiation toughness correlations, multiple toughness values along the length of a flaw, flaw jump distance for each computer simulation, and added error in estimating irradiated properties caused by the trend curve correlation error.
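A minimal sketch of the sensitivity-study pattern described, assuming toy distributions and a toy failure criterion; the real code models flaw geometry, transients, and crack arrest, none of which is attempted here.

```python
# Monte Carlo comparison of failure probabilities under alternative
# material assumptions; distributions and criterion are invented.
import numpy as np

rng = np.random.default_rng(1)

def failure_probability(n, cu_mean):
    cu = rng.normal(cu_mean, 0.05, n).clip(min=0)    # copper content, wt%
    k_ic = rng.lognormal(np.log(100.0), 0.2, n)      # fracture toughness
    k_applied = 60.0 + 150.0 * cu                    # toy embrittlement effect
    return np.mean(k_applied > k_ic)                 # through-wall flaw proxy

base = failure_probability(200_000, cu_mean=0.20)
alt = failure_probability(200_000, cu_mean=0.25)
print(f"baseline: {base:.3f}, alternative copper assumption: {alt:.3f}, "
      f"ratio: {alt / base:.2f}")
```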
Some important considerations in the development of stress corrosion cracking test methods.
NASA Technical Reports Server (NTRS)
Wei, R. P.; Novak, S. R.; Williams, D. P.
1972-01-01
A discussion is presented of some of the precautions that the development of fracture-mechanics-based test methods for studying stress corrosion cracking involves. Following a review of pertinent analytical fracture mechanics considerations and of basic test methods, the implications for stress corrosion cracking studies of the kinetics of crack growth, which determine time to failure and life, are examined. It is shown that the basic assumptions of the linear-elastic fracture mechanics analyses must be clearly recognized and satisfied in experimentation, and that the effects of incubation and nonsteady-state crack growth must also be properly taken into account in determining the crack growth kinetics, if valid data are to be obtained from fracture-mechanics-based test methods.
Acoustic Absorption in Porous Materials
NASA Technical Reports Server (NTRS)
Kuczmarski, Maria A.; Johnston, James C.
2011-01-01
An understanding of both the areas of materials science and acoustics is necessary to successfully develop materials for acoustic absorption applications. This paper presents the basic knowledge and approaches for determining the acoustic performance of porous materials in a manner that will help materials researchers new to this area gain the understanding and skills necessary to make meaningful contributions to this field of study. Beginning with the basics and making as few assumptions as possible, this paper reviews relevant topics in the acoustic performance of porous materials, which are often used to make acoustic bulk absorbers, moving from the physics of sound wave interactions with porous materials to measurement techniques for flow resistivity, characteristic impedance, and wavenumber.
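For concreteness, one widely used empirical starting point relating flow resistivity $\sigma$ to the characteristic impedance $Z_c$ and wavenumber $k$ of a porous absorber is the Delany-Bazley model, quoted here as an illustration (coefficients should be checked against the original reference):

$$
Z_c = \rho_0 c_0 \left( 1 + 0.0571\,X^{-0.754} - \mathrm{j}\,0.087\,X^{-0.732} \right), \qquad
k = \frac{\omega}{c_0} \left( 1 + 0.0978\,X^{-0.700} - \mathrm{j}\,0.189\,X^{-0.595} \right),
$$

where $X = \rho_0 f / \sigma$, $\rho_0$ and $c_0$ are the density and sound speed of air, and the fit is commonly taken as valid roughly for $0.01 < X < 1$.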
Hydrogen donors and acceptors and basic amino acids jointly contribute to carcinogenesis.
Tang, Man; Zhou, Yanchao; Li, Yiqi; Zou, Juntong; Yang, Beicheng; Cai, Li; Zhang, Xuelan; Liu, Qiuyun
2017-01-01
A hypothesis is postulated that high content of hydrogen donors and acceptors, and basic amino acids, cause the intracellular trapping of the H+ and Cl− ions, which increases cancer risks as local formation of HCl is mutagenic to DNA. Other cations such as Ca2+, and weak acids such as short-chain organic acids, may attenuate the intracellular gathering of the H+ and Cl−, two of the most abundant ions in the cells. Current data on increased cancer risks in diabetic and obese patients are consistent with the assumption that hydrogen bonding propensity on glucose, triglycerides and other molecules is among the causative factors. Copyright © 2016 Elsevier Ltd. All rights reserved.
MHD processes in the outer heliosphere
NASA Technical Reports Server (NTRS)
Burlaga, L. F.
1984-01-01
The magnetic field measurements from Voyager and the magnetohydrodynamic (MHD) processes in the outer heliosphere are reviewed. A bibliography of the experimental and theoretical work concerning magnetic fields and plasmas observed in the outer heliosphere is given. Emphasis in this review is on basic concepts and dynamical processes involving the magnetic field. The theory that serves to explain and unify the interplanetary magnetic field and plasma observations is magnetohydrodynamics. Basic physical processes and observations that relate directly to solutions of the MHD equations are emphasized, but obtaining solutions of this complex system of equations involves various assumptions and approximations. The spatial and temporal complexity of the outer heliosphere and some approaches for dealing with this complexity are discussed.
The Equations of Oceanic Motions
NASA Astrophysics Data System (ADS)
Müller, Peter
2006-10-01
Modeling and prediction of oceanographic phenomena and climate is based on the integration of dynamic equations. The Equations of Oceanic Motions derives and systematically classifies the most common dynamic equations used in physical oceanography, from large scale thermohaline circulations to those governing small scale motions and turbulence. After establishing the basic dynamical equations that describe all oceanic motions, Müller then derives approximate equations, emphasizing the assumptions made and physical processes eliminated. He distinguishes between geometric, thermodynamic and dynamic approximations and between the acoustic, gravity, vortical and temperature-salinity modes of motion. Basic concepts and formulae of equilibrium thermodynamics, vector and tensor calculus, curvilinear coordinate systems, and the kinematics of fluid motion and wave propagation are covered in appendices. Providing the basic theoretical background for graduate students and researchers of physical oceanography and climate science, this book will serve as both a comprehensive text and an essential reference.
NASA Astrophysics Data System (ADS)
Lansard, Erick; Frayssinhes, Eric; Palmade, Jean-Luc
Basically, the problem of designing a multisatellite constellation involves a large number of parameters with many possible combinations: total number of satellites, orbital parameters of each individual satellite, number of orbital planes, number of satellites in each plane, spacings between satellites of each plane, spacings between orbital planes, relative phasings between consecutive orbital planes. Fortunately, some authors have theoretically solved this complex problem under simplified assumptions: the permanent (or continuous) coverage by a single and multiple satellites of the whole Earth and zonal areas has been entirely solved from a pure geometrical point of view. These solutions exhibit strong symmetry properties (e.g. Walker, Ballard, Rider, Draim constellations): altitude and inclination are identical, orbital planes and satellites are regularly spaced, etc. The problem with such constellations is their oversimplified and restricted geometrical assumptions. In fact, the evaluation function which is used implicitly only takes into account the point-to-point visibility between users and satellites and does not deal with very important constraints and considerations that become mandatory when designing a real satellite system (e.g. robustness to satellite failures, total system cost, common view between satellites and ground stations, service availability and satellite reliability, launch and early operations phase, production constraints, etc.). An original and global methodology relying on a powerful optimization tool based on genetic algorithms has been developed at ALCATEL ESPACE. In this approach, symmetrical constellations can be used as initial conditions of the optimization process together with specific evaluation functions. A multi-criteria performance analysis is conducted and presented here in a parametric way in order to identify and evaluate the main sensitive parameters. Quantitative results are given for three examples in the fields of navigation, telecommunication and multimedia satellite systems. In particular, a new design pattern with very efficient properties in terms of robustness to satellite failures is presented and compared with classical Walker patterns.
Experimental measurement of binding energy, selectivity, and allostery using fluctuation theorems.
Camunas-Soler, Joan; Alemany, Anna; Ritort, Felix
2017-01-27
Thermodynamic bulk measurements of binding reactions rely on the validity of the law of mass action and the assumption of a dilute solution. Yet, important biological systems such as allosteric ligand-receptor binding, macromolecular crowding, or misfolded molecules may not follow these assumptions and may require a particular reaction model. Here we introduce a fluctuation theorem for ligand binding and an experimental approach using single-molecule force spectroscopy to determine binding energies, selectivity, and allostery of nucleic acids and peptides in a model-independent fashion. A similar approach could be used for proteins. This work extends the use of fluctuation theorems beyond unimolecular folding reactions, bridging the thermodynamics of small systems and the basic laws of chemical equilibrium. Copyright © 2017, American Association for the Advancement of Science.
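The basic relations being extended are the Crooks and Jarzynski fluctuation theorems, sketched here in generic form ($W$ is the work measured in repeated pulling experiments; the paper's application to ligand binding involves more structure than shown):

$$
\frac{P_F(W)}{P_R(-W)} = e^{(W - \Delta G)/k_B T}, \qquad
\left\langle e^{-W/k_B T} \right\rangle = e^{-\Delta G/k_B T},
$$

so the binding free energy $\Delta G$ can be recovered from distributions of irreversible work rather than from bulk equilibrium assumptions such as the law of mass action.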
The limits of discipline: towards interdisciplinary food studies.
Wilk, Richard
2012-11-05
While the number of scholars working on the broad topic of food has never been greater, the topic is still divided among numerous disciplines and specialists who do not often communicate with each other. This paper discusses some of the deep differences between disciplinary approaches, and concludes that food scientists differ in some of their basic assumptions about human nature. After outlining some of the institutional issues standing in the way of interdisciplinary work, the paper argues for a more synthetic and empirical approach, grounded in the study of everyday life. True interdisciplinary collaboration will have to go beyond assembling multidisciplinary teams. Instead we must accept the limitations of the classic disciplinary paradigms, and be willing to question and test our methods and assumptions. Copyright © 2012 Elsevier Inc. All rights reserved.
PTSD as Meaning Violation: Testing a Cognitive Worldview Perspective.
Park, Crystal L; Mills, Mary Alice; Edmondson, Donald
2012-01-01
The cognitive perspective on post-traumatic stress disorder (PTSD) has been successful in explaining many PTSD-related phenomena and in developing effective treatments, yet some of its basic assumptions remain surprisingly under-examined. The present study tested two of these assumptions: (1) situational appraisals of the event as violating global meaning (i.e., beliefs and goals) are related to PTSD symptomatology, and (2) the effect of situational appraisals of violation on PTSD symptomatology is mediated by global meaning (i.e., views of self and world). We tested these assumptions in a cross-sectional study of 130 college students who had experienced a DSM-IV level trauma. Structural equation modeling showed that appraisals of the extent to which the trauma violated one's beliefs and goals related fairly strongly to PTSD. In addition, the effects of appraisals of belief and goal violations on PTSD symptoms were fully mediated through negative global beliefs about both the self and the world. These findings support the cognitive worldview perspective, highlighting the importance of the meaning individuals assign to traumatic events, particularly the role of meaning violation.
Adaptive control: Myths and realities
NASA Technical Reports Server (NTRS)
Athans, M.; Valavani, L.
1984-01-01
It was found that all currently existing globally stable adaptive algorithms have three basic properties in common: positive realness of the error equation, square-integrability of the parameter adjustment law, and need for sufficient excitation for asymptotic parameter convergence. Of the three, the first property is of primary importance since it satisfies a sufficient condition for stability of the overall system, which is a baseline design objective. The second property has been instrumental in the proof of asymptotic error convergence to zero, while the third addresses the issue of parameter convergence. Positive-real error dynamics can be generated only if the relative degree (excess of poles over zeroes) of the process to be controlled is known exactly; this, in turn, implies perfect modeling. This and other assumptions, such as absence of nonminimum phase plant zeros on which the mathematical arguments are based, do not necessarily reflect properties of real systems. As a result, it is natural to inquire what happens to the designs under less than ideal assumptions. The issues arising from violation of the exact modeling assumption, which is extremely restrictive in practice and impacts the most important system property, stability, are discussed.
d’Uva, Teresa Bago; Lindeboom, Maarten; O’Donnell, Owen; van Doorslaer, Eddy
2011-01-01
We propose tests of the two assumptions under which anchoring vignettes identify heterogeneity in reporting of categorical evaluations. Systematic variation in the perceived difference between any two vignette states is sufficient to reject vignette equivalence. Response consistency – the respondent uses the same response scale to evaluate the vignette and herself – is testable given sufficiently comprehensive objective indicators that independently identify response scales. Both assumptions are rejected for reporting of cognitive and physical functioning in a sample of older English individuals, although a weaker test resting on less stringent assumptions does not reject response consistency for cognition. PMID:22184479
Vrzheshch, P V
2015-01-01
Quantitative evaluation of the accuracy of the rapid equilibrium assumption in steady-state enzyme kinetics was obtained for an arbitrary mechanism of an enzyme-catalyzed reaction. This evaluation depends only on the structure and properties of the equilibrium segment, but does not depend on the structure and properties of the rest (the stationary part) of the kinetic scheme. The smaller the values of the edges leaving the equilibrium segment relative to the values of the edges within the equilibrium segment, the higher the accuracy of determination of intermediate concentrations and reaction velocity in the case of the rapid equilibrium assumption.
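A minimal single-substrate illustration of the rapid equilibrium assumption (the classic textbook case, not the paper's general result): for $E + S \rightleftharpoons ES \xrightarrow{k_2} E + P$ with binding much faster than catalysis,

$$
[ES] = \frac{[E]_0 [S]}{K_S + [S]}, \qquad
v = \frac{k_2 [E]_0 [S]}{K_S + [S]}, \qquad
K_S = \frac{k_{-1}}{k_1},
$$

and the approximation is accurate precisely when the edge leaving the equilibrium segment ($k_2$) is small compared with the edges within it ($k_1[S]$ and $k_{-1}$), which is the general criterion quantified above.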
ERIC Educational Resources Information Center
Wineburg, Sam
What ways of thinking, writing, and questioning would be lost if we eliminated history from the curriculum? The essays in this book begin with the basic assumption that history teaches people a way to make choices, to balance opinions, to tell stories, and to become uneasy--when necessary--about the stories that are told. The book is concerned…
Recombination-generation currents in degenerate semiconductors
NASA Technical Reports Server (NTRS)
Von Roos, O.
1978-01-01
The classical Shockley-Read-Hall theory of free carrier recombination and generation via traps is extended to degenerate semiconductors. A concise and simple expression is found which avoids completely the concept of a Fermi level, a concept which is alien to nonequilibrium situations. Assumptions made in deriving the recombination-generation current are carefully delineated and are found to be basically identical to those made in the original theory applicable to nondegenerate semiconductors.
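For reference, the classical nondegenerate Shockley-Read-Hall rate that the paper generalizes has the well-known form

$$
U = \frac{pn - n_i^2}{\tau_p\,(n + n_1) + \tau_n\,(p + p_1)}, \qquad
n_1 = n_i\,e^{(E_t - E_i)/kT}, \quad p_1 = n_i\,e^{(E_i - E_t)/kT},
$$

where $E_t$ is the trap level, $E_i$ the intrinsic level, and $\tau_n$, $\tau_p$ the capture lifetimes; roughly speaking, the degenerate extension replaces the Boltzmann-statistics factors above with expressions valid under Fermi-Dirac statistics.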
French NATO Policy: The Next Five Years
1990-06-01
tradeoffs on the ambitious French modernization programs. Most dramatic have been the projected strategic consequences of perestroika: France, like ... project power into areas of French influence in the Third World. In the mid-1980s, France was spending roughly 3.9 percent of gross domestic product on ... policy environment and its effects on the basic assumptions underpinning French policy. He concludes that in the future, France will be easier to work
[Medical service marketing at the time of medical insurance].
Polyakov, I V; Uvarov, S A; Mikhaylova, L S; Lankin, K A
1997-01-01
Presents the approaches to applying the fundamentals of marketing to public health. Medical insurance organizations may effectively work as arbitrators and marketing agents; the basic assumptions of marketing theory underlie their activity. The concept of marketing implies investigation of the requirements of the users of medical services and the development of measures aimed at meeting people's requirements for health services and health maintenance.
Techniques for the computation in demographic projections of health manpower.
Horbach, L
1979-01-01
Some basic principles and algorithms are presented which can be used for projective calculations of medical staff on the basis of demographic data. The effects of modifications of the input data, such as by health policy measures concerning training capacity, can be demonstrated by repeated calculations under alternative assumptions. Such models give a variety of results and may highlight the probable future balance between health manpower supply and requirements.
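A minimal sketch of such a projective calculation, assuming a single-stock model with constant rates; all numbers and rate names are invented placeholders.

```python
# Roll the physician stock forward year by year, adding graduates and
# removing attrition (retirement, death, emigration).
def project_supply(initial_stock, annual_graduates, attrition_rate, years):
    stock = float(initial_stock)
    for _ in range(years):
        stock = stock * (1.0 - attrition_rate) + annual_graduates
    return round(stock)

# repeated calculations under alternative training-capacity assumptions
for graduates in (400, 600):
    print(graduates, "graduates/year ->",
          project_supply(8500, graduates, attrition_rate=0.03, years=10))
```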
Forging a Combat Mobility Culture
2006-04-01
values and beliefs, and basic assumptions. Artifacts are the most visible aspects of an organization. They include physical environment ... Leadership, Command, and Communication Studies Academic Year 2006 Coursebook (edited by Sharon McBride, Maxwell AFB, AL: Air Command and Staff ... "Air Force Doing it Right?" In Leadership, Command, and Communication Studies Academic Year 2006 Coursebook. Edited by Sharon McBride, Maxwell AFB, AL: Air Command and Staff College, October 2005.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wen, Xinyu; Liu, Zhengyu; Chen, Zhongxiao
2016-11-06
Water isotopes in precipitation have played a key role in the reconstruction of past climate on millennial timescales and longer. But, for midlatitude regions like East Asia with complex terrain, the reliability behind the basic assumptions of the temperature effect and amount effect is based on modern observational data and still remains unclear for past climate. In the present work, we reexamine the two basic effects on seasonal, interannual, and millennial timescales in a set of time slice experiments for the period 22–0 ka using an isotope-enabled atmospheric general circulation model (AGCM). Our study confirms the robustness of the temperature and amount effects on the seasonal cycle over China in the present climatic conditions, with the temperature effect dominating in northern China and the amount effect dominating in the far south of China but no distinct effect in the transition region of central China. However, our analysis shows that neither temperature nor amount effect is significantly dominant over China on millennial and interannual timescales, which is a challenge to those classic assumptions in past climate reconstruction. This work helps shed light on the interpretation of the proxy record of δ18O from a modeling point of view.
Dynamics of an HIV-1 infection model with cell mediated immunity
NASA Astrophysics Data System (ADS)
Yu, Pei; Huang, Jianing; Jiang, Jiao
2014-10-01
In this paper, we study the dynamics of an improved mathematical model on HIV-1 virus with cell mediated immunity. This new 5-dimensional model is based on the combination of a basic 3-dimensional HIV-1 model and a 4-dimensional immunity response model, which more realistically describes dynamics between the uninfected cells, infected cells, virus, the CTL response cells and CTL effector cells. Our 5-dimensional model may be reduced to the 4-dimensional model by applying a quasi-steady state assumption on the variable of virus. However, it is shown in this paper that virus is necessary to be involved in the modeling, and that a quasi-steady state assumption should be applied carefully, which may miss some important dynamical behavior of the system. Detailed bifurcation analysis is given to show that the system has three equilibrium solutions, namely the infection-free equilibrium, the infectious equilibrium without CTL, and the infectious equilibrium with CTL, and a series of bifurcations including two transcritical bifurcations and one or two possible Hopf bifurcations occur from these three equilibria as the basic reproduction number is varied. The mathematical methods applied in this paper include characteristic equations, Routh-Hurwitz condition, fluctuation lemma, Lyapunov function and computation of normal forms. Numerical simulation is also presented to demonstrate the applicability of the theoretical predictions.
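A minimal sketch of how such a 5-dimensional model can be integrated numerically; the right-hand side below is a generic uninfected/infected/virus/CTL-response/CTL-effector system with invented parameters, not the paper's exact equations.

```python
# Integrate a generic 5-component HIV model with cell-mediated immunity.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, u, lam=1.0, d=0.1, beta=0.5, a=0.2, k=10.0, c=3.0,
        b=0.01, q=0.5, h=0.1, p=1.0):
    x, y, v, w, z = u
    dx = lam - d*x - beta*x*v          # supply, death, infection
    dy = beta*x*v - a*y - p*y*z        # infection, death, CTL killing
    dv = k*y - c*v                     # virus production and clearance
    dw = b*x*y*w - q*w                 # CTL response activation
    dz = q*w - h*z                     # differentiation into effectors
    return [dx, dy, dv, dw, dz]

sol = solve_ivp(rhs, (0, 500), [10, 0.1, 0.1, 0.01, 0.0], max_step=0.5)
print("final state:", np.round(sol.y[:, -1], 3))
```

Note that the virus equation dv is retained rather than replaced by its quasi-steady state v = k*y/c, in line with the paper's caution that eliminating v can miss dynamical behavior.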
NASA Astrophysics Data System (ADS)
Rhodes, Russel E.; Byrd, Raymond J.
1998-01-01
This paper presents a "back of the envelope" technique for fast, timely, on-the-spot assessment of affordability (profitability) of commercial space transportation architectural concepts. The tool presented here is not intended to replace conventional, detailed costing methodology. The process described enables "quick look" estimations and assumptions to effectively determine whether an initial concept (with its attendant cost estimating line items) provides focus for major leapfrog improvement. The Cost Charts Users Guide provides a generic sample tutorial, building an approximate understanding of the basic launch system cost factors and their representative magnitudes. This process will enable the user to develop a net "cost (and price) per payload-mass unit to orbit" incorporating a variety of significant cost drivers, supplemental to basic vehicle cost estimates. If acquisition cost and recurring cost factors (as a function of cost per payload-mass unit to orbit) do not meet the predetermined system-profitability goal, the concept in question will be clearly seen as non-competitive. Multiple analytical approaches, and applications of a variety of interrelated assumptions, can be examined in a quick, on-the-spot cost approximation analysis, as this tool has inherent flexibility. The technique will allow determination of concept conformance to system objectives.
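A minimal sketch of the bottom-line arithmetic such a "quick look" reduces to; the amortization scheme and all numbers are invented placeholders, not the paper's cost charts.

```python
# Net cost per payload-mass unit to orbit from amortized acquisition
# cost plus recurring cost per flight.
def cost_per_kg(acquisition_cost, flights_amortized,
                recurring_cost_per_flight, payload_kg_per_flight):
    amortized = acquisition_cost / flights_amortized
    return (amortized + recurring_cost_per_flight) / payload_kg_per_flight

# hypothetical concept: $2B development amortized over 200 flights,
# $30M recurring per flight, 20 t to orbit per flight
print(f"${cost_per_kg(2e9, 200, 30e6, 20000):,.0f} per kg")
```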
Evaluating growth assumptions using diameter or radial increments in natural even-aged longleaf pine
John C. Gilbert; Ralph S. Meldahl; Jyoti N. Rayamajhi; John S. Kush
2010-01-01
When using increment cores to predict future growth, one often assumes future growth is identical to past growth for individual trees. Once this assumption is accepted, a decision has to be made as to which growth estimate should be used: constant diameter growth or constant basal area growth. Often, the assumption of constant diameter growth is used due to the ease...
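The two assumptions differ in a simple, checkable way. With basal area $BA = \pi d^2/4$,

$$
\Delta BA = \frac{\pi}{4}\left[(d + \Delta d)^2 - d^2\right] = \frac{\pi}{4}\,\Delta d\,(2d + \Delta d),
$$

so a constant diameter increment $\Delta d$ implies a basal area increment that grows with diameter, while a constant $\Delta BA$ implies a diameter increment $\Delta d \approx 2\,\Delta BA/(\pi d)$ that shrinks as the tree grows; the two extrapolations therefore diverge fastest for large trees.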
Programmatic Re-Evaluation of Ion Exchange as a 1st Generation ITP Replacement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, A.B.
This re-evaluation differs from previous work in that (1) the Ion Exchange option was evaluated from a standpoint assuming that ITP would never start up, thus Ion Exchange was the only viable option; (2) the DOE-prescribed balanced assumptions were quite different from the WSRC assumptions used previously; and (3) other Site events and changes within HLWM have tended to reduce the disadvantages of Ion Exchange relative to ITP as the first-generation salt decontamination process.
Marra, Carlo A; Bansback, Nick; Anis, Aslam H; Shojania, Kamran
2011-03-01
Rheumatoid arthritis (RA) is a chronic, debilitating inflammatory, progressive musculoskeletal disease that affects 0.5-1.0% of the adult population in Western countries. The joint destruction and progressive functional disability associated with uncontrolled RA result in tremendous impacts on health-related quality of life, ability to work, and mortality. In addition, the treatment of the disease and associated complications exact a substantial economic burden to the patients, their families, and society. In the last decade, several biological agents (biologics) have been approved for use in RA, revolutionizing treatment. These biologics, which target cytokines such as tumor necrosis factor or lymphocytes such as B or T cells, reduce functional disability and substantially slow the progression of joint damage. However, because these agents typically cost ten to 100 times more than existing available older drug therapies, there has been worldwide concern regarding their impact on healthcare budgets. As such, there has been increased attention towards economic evaluation as a means to determine whether, and in which subgroup of patients, these newer, more expensive agents confer appropriate value for their additional cost. Indeed, evaluations have guided coverage decisions for both private and public health insurance agencies such as the National Institute for Health and Clinical Excellence in the UK. The use of economic evaluations to determine value for money for these agents has attracted both debate and controversy. Some of the controversy is related to the appropriateness of the structure of, and assumptions underlying, the decision models employed to estimate the long-term costs and benefits of these agents over existing therapies. To fully appreciate the debate, one must first understand the basic principles of economic evaluation and the necessity for using decision models to evaluate cost effectiveness. To understand the basic principles of economic evaluation, we refer the reader to an introductory article aimed at clinical rheumatologists. This paper attempts to explain the rationale for the use of economic modeling approaches to assess the value of biologics for RA using specific examples from the literature.
On the validity of time-dependent AUC estimators.
Schmid, Matthias; Kestler, Hans A; Potapov, Sergej
2015-01-01
Recent developments in molecular biology have led to the massive discovery of new marker candidates for the prediction of patient survival. To evaluate the predictive value of these markers, statistical tools for measuring the performance of survival models are needed. We consider estimators of discrimination measures, which are a popular approach to evaluate survival predictions in biomarker studies. Estimators of discrimination measures are usually based on regularity assumptions such as the proportional hazards assumption. Based on two sets of molecular data and a simulation study, we show that violations of the regularity assumptions may lead to over-optimistic estimates of prediction accuracy and may therefore result in biased conclusions regarding the clinical utility of new biomarkers. In particular, we demonstrate that biased medical decision making is possible even if statistical checks indicate that all regularity assumptions are satisfied. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
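A minimal sketch of a cumulative/dynamic time-dependent AUC estimator of the kind discussed: cases are subjects with an observed event by the horizon, controls those still at risk after it. Censoring adjustment (e.g., inverse probability of censoring weighting) is deliberately omitted, and the data are invented.

```python
# Time-dependent AUC at horizon t, estimated as the probability that a
# random case's marker outranks a random control's (ties count half).
import numpy as np

def cumulative_dynamic_auc(time, event, marker, horizon):
    cases = (time <= horizon) & (event == 1)
    controls = time > horizon
    diff = marker[cases][:, None] - marker[controls][None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()

rng = np.random.default_rng(2)
marker = rng.normal(size=300)
time = rng.exponential(scale=np.exp(-marker))   # higher marker -> earlier event
event = np.ones(300, dtype=int)                 # no censoring in this toy
print("AUC(t=1):", round(cumulative_dynamic_auc(time, event, marker, 1.0), 3))
```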
Twisk, Divera; Vlakveld, Willem; Mesken, Jolieke; Shope, Jean T; Kok, Gerjo
2013-06-01
Road injuries are a prime cause of death in early adolescence. Often road safety education (RSE) is used to target risky road behaviour in this age group. These RSE programmes are frequently based on the assumption that deliberate risk taking rather than lack of competency underlies risk behaviour. This study tested the competency of 10-13 year olds, by examining their decisions - as pedestrians and cyclists - in dealing with blind spot areas around lorries. Also, the effects of an awareness programme and a competency programme on these decisions were evaluated. Table-top models were used, representing seven scenarios that differed in complexity: one basic scenario to test the identification of blind spot areas, and six traffic scenarios to test behaviour in traffic situations of low or high task complexity. Using a quasi-experimental design (pre-test and post-test reference group design without randomization), the programme effects were assessed by requiring participants (n=62) to show, for each table-top traffic scenario, how they would act if they were in that traffic situation. On the basic scenario, at pre-test 42% of the youngsters identified all blind spots correctly, but only 27% showed safe behaviour in simple scenarios and 5% in complex scenarios. The competency programme yielded improved performance on the basic scenario but not on the traffic scenarios, whereas the awareness programme did not result in any improvements. The correlation between improvements on the basic scenarios and the traffic scenarios was not significant. Young adolescents have not yet mastered the necessary skills for safe performance in simple and complex traffic situations, thus underlining the need for effective prevention programmes. RSE may improve the understanding of blind spot areas but this does not 'automatically' transfer to performance in traffic situations. Implications for the design of RSE are discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.
Discussion of examination of a cored hydraulic fracture in a deep gas well
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nolte, K.G.
Warpinski et al. document information found from a core through a formation after a hydraulic fracture treatment. As they indicate, the core provides the first detailed evaluation of an actual propped hydraulic fracture away from the well and at a significant depth, and this evaluation leads to findings that deviate substantially from the assumptions incorporated into current fracturing models. In this discussion, a defense of current fracture design assumptions is developed. The affirmation of current assumptions, for general industry applications, is based on an assessment of the global impact of the local complexity found in the core. The assessment leads to recommendations for the evolution of fracture design practice.
Spitzer, R L
2001-06-01
It is widely acknowledged that the approach taken in the development of a classification of mental disorders is guided by various values and assumptions. The author, who played a central role in the development of DSM-III (American Psychiatric Association [1980] Diagnostic and statistical manual of mental disorders, 3rd ed. Washington, DC: Author) and DSM-III-R (American Psychiatric Association [1987] Diagnostic and statistical manual of mental disorders, 3rd ed, rev. Washington, DC: Author) will explicate the basic values and assumptions that guided the development of these two diagnostic manuals. In so doing, the author will respond to the critique of DSM-III and DSM-III-R made by Sadler et al. in their 1994 paper (Sadler JZ, Hulgus YF, Agich GJ [1994] On values in recent American psychiatric classification. J Med Phil 19:261-277). The author will attempt to demonstrate that the stated goals of DSM-III and DSM-III-R are not inherently in conflict and are easily explicated by appealing to widely held values and assumptions, most of which appeared in the literature during the development of the manuals. Furthermore, the author will demonstrate that it is not true that DSM-III places greater emphasis on reliability over validity and is covertly committed to a biological approach to explaining psychiatric disturbance.
Approximations of Two-Attribute Utility Functions
1976-09-01
preferred to") be a bina-zy relation on the set • of simple probability measures or ’gambles’ defined on a set T of consequences. Throughout this study it...simplifying independence assumptions. Although there are several approaches to this problem, the21 present study will focus on approximations of u... study will elicit additional interest in the topic. 2. REMARKS ON APPROXIMATION THEORY This section outlines a few basic ideas of approximation theory
WalkThrough Example Procedures for MAMA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruggiero, Christy E.; Gaschen, Brian Keith; Bloch, Jeffrey Joseph
This documentation is a growing set of walk-through examples of analyses using the MAMA V2.0 software. It does not cover all the features or possibilities of the MAMA software, but addresses the use of many of the basic analysis tools to quantify particle size and shape in an image. This document will continue to evolve as additional procedures and examples are added. The starting assumption is that the MAMA software has been successfully installed.
1978-01-01
South Carolina ... JANUARY 1978 ... FEASIBILITY REPORT: REVIEW OF REPORTS ... ADOPTED JANUARY 28, 1958 ... ENVIRONMENTAL ASSESSMENT ... REVIEW OF REPORTS ... review. As a result of this review, it was judged that some of the basic assumptions presented in the draft report were no longer applicable and that
Predictability of currency market exchange
NASA Astrophysics Data System (ADS)
Ohira, Toru; Sazuka, Naoya; Marumo, Kouhei; Shimizu, Tokiko; Takayasu, Misako; Takayasu, Hideki
2002-05-01
We analyze tick data of the yen-dollar exchange rate with a focus on its up and down movements. We show that there exists a rather particular conditional probability structure in such high-frequency data. This result provides evidence to question one of the basic assumptions of traditional market theory, in which such bias in high-frequency price movements is regarded as absent. We also systematically construct a random walk model reflecting this probability structure.
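A minimal version of the up/down analysis described here can be phrased as estimating conditional probabilities over the preceding k moves; the sketch below does this in Python on a synthetic tick series (real yen-dollar data would replace it). For a memoryless random walk all printed probabilities should hover near 0.5, which is exactly the null hypothesis the study questions.

```python
# Sketch: estimating the conditional probability of an "up" tick given the
# previous k up/down moves. The price series here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
prices = np.cumsum(rng.choice([-1, 1], size=100_000))  # toy tick series
moves = (np.diff(prices) > 0).astype(int)              # 1 = up, 0 = down

k = 2  # condition on the last two moves
counts = {}
for i in range(k, len(moves)):
    history = tuple(moves[i - k:i])
    ups, total = counts.get(history, (0, 0))
    counts[history] = (ups + moves[i], total + 1)

for history, (ups, total) in sorted(counts.items()):
    print(f"P(up | last {k} moves = {history}) = {ups / total:.3f}")
```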
ERIC Educational Resources Information Center
New York City Board of Education, Brooklyn, NY. Office of Bilingual Education.
The report presents a conceptual framework and related strategies designed to help policymakers and practitioners re-examine, and when necessary, rework the basic assumptions and practices defining the educational experiences of bilingual/English-as-a-Second-Language (ESL) learners in New York City (New York) public schools. The report consists of…
Miller, David A. W.; Bailey, Larissa L.; Grant, Evan H. Campbell; McClintock, Brett T.; Weir, Linda A.; Simons, Theodore R.
2015-01-01
Our results demonstrate that even small probabilities of misidentification and among-site detection heterogeneity can have severe effects on estimator reliability if ignored. We challenge researchers to place greater attention on both heterogeneity and false positives when designing and analysing occupancy studies. We provide 9 specific recommendations for the design, implementation and analysis of occupancy studies to better meet this challenge.
Conditioned Limit Theorems for Some Null Recurrent Markov Processes
1976-08-01
Chapter 1 INTRODUCTION. 1.1 Summary of Results. Let (V_k, k ≥ 0) be a discrete-time Markov process with state space E ⊆ (−∞, ∞) and let S be ... explain our results in some detail. We begin by stating our three basic assumptions: (i) (V_k, k ≥ 0) is a Markov process with state space E ⊆ (−∞, ∞); (ii) ... 3. CONDITIONING ON T > n. 3.1 Preliminary Results.
Factors Affecting Post-Service Wage Growth for Veterans
1991-12-01
Labor economics is primarily concerned with how employers and employees respond to changes in wages, prices, profits, and the non-pecuniary aspects ... of the employment relationship [Ref. 4, pg. 3]. Two of the basic assumptions underlying labor economics are that resources are scarce, and that people ... Retirees' Post-Service Earnings and Employment, February 1981, Rand Corporation. 4. Ehrenberg, R. G. and Smith, R. S., Modern Labor Economics, 3rd Edition.
Anthropometric Source Book. Volume 1: Anthropometry for Designers
1978-07-01
diet initiates replacement of the tissue loss incurred in the first day or two of flight. Any further caloric excess or deficit would be superimposed...the Skylab missions, a calorically inadequate basic diet was supplied as a result of the assumption that in-flight requirements were less than those...from one-g to weightlessness conditions or vice versa, any remaining volume changes are probably tissue changes. If a diet is calorically inadequate
The Assumption of Adequacy: Operation Safe Haven, A Chaplain’s View.
1999-06-04
poverty, their ignorance regarding everything from literacy to the most basic hygiene was overwhelming. One chaplain assistant from Fort Carson ... perspective, the Panamanians, ninety percent of whom lived in absolute poverty, were less than enamored with this state of affairs. The Canal Zone ... was soon discovered that the entire adult population on the island of Cuba is addicted to nicotine), and a brand new pair of running shoes. While going
In vivo stationary flux analysis by 13C labeling experiments.
Wiechert, W; de Graaf, A A
1996-01-01
Stationary flux analysis is an invaluable tool for metabolic engineering. In recent years, the metabolite balancing technique has become well established in the bioengineering community. On the other hand, metabolic tracer experiments using 13C isotopes have long been used for intracellular flux determination. Only recently have both techniques been fully combined to form a considerably more powerful flux analysis method. This paper concentrates on modeling and data analysis for the evaluation of such stationary 13C labeling experiments. After reviewing recent experimental developments, the basic equations for modeling carbon labeling in metabolic systems, i.e. metabolite, carbon label and isotopomer balances, are introduced and discussed in some detail. Then the basics of flux estimation from measured extracellular fluxes combined with carbon labeling data are presented and, finally, the method is illustrated using an example from C. glutamicum. The main emphasis is on the extra information that can be obtained with tracer experiments compared with the metabolite balancing technique alone. As a principal result, it is shown that the combined flux analysis method can dispense with some rather doubtful assumptions on energy balancing, and that the forward and backward flux rates of bidirectional reaction steps can be simultaneously determined in certain situations. Finally, it is demonstrated that the variant of fractional isotopomer measurement is even more powerful than fractional labeling measurement but requires much greater numerical effort to solve the balance equations.
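As a toy illustration of the metabolite balancing part of the method, the sketch below solves the stationary balances S v = 0 together with measured extracellular fluxes by least squares; the three-reaction network is hypothetical, and the 13C labeling (isotopomer) balances that give the combined method its extra power are omitted.

```python
# Sketch: metabolite balancing as a linear problem, solved by least squares.
# The toy network is hypothetical: v1: uptake -> A, v2: A -> B, v3: B -> out.
import numpy as np

# Rows: intracellular metabolites A, B; columns: fluxes v1, v2, v3.
S = np.array([[ 1, -1,  0],   # balance around A
              [ 0,  1, -1]])  # balance around B

# Measured extracellular fluxes enter as extra equations: v1 = 10, v3 = 10.
M = np.array([[1, 0, 0],
              [0, 0, 1]])
meas = np.array([10.0, 10.0])

A = np.vstack([S, M])
b = np.concatenate([np.zeros(S.shape[0]), meas])

v, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated fluxes:", v)   # expect approximately [10, 10, 10]
```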
Rosta, Edina; Warshel, Arieh
2012-01-01
Understanding the relationship between the adiabatic free energy profiles of chemical reactions and the underlying diabatic states is central to the description of chemical reactivity. The diabatic states form the theoretical basis of Linear Free Energy Relationships (LFERs) and thus play a major role in physical organic chemistry and related fields. However, the theoretical justification for some of the implicit LFER assumptions has not been fully established by quantum mechanical studies. This study follows our earlier works and uses the ab initio frozen density functional theory (FDFT) method to evaluate both the diabatic and adiabatic free energy surfaces and to determine the corresponding off-diagonal coupling matrix elements for a series of SN2 reactions. It is found that the off-diagonal coupling matrix elements are almost the same regardless of the nucleophile and the leaving group but change upon changing the central group. It is also found that the off-diagonal elements are basically the same in the gas phase and in solution, even when the solvent is explicitly included in the ab initio calculations. Furthermore, our study establishes that the FDFT diabatic profiles are parabolic to a good approximation, thus providing first-principles support for the origin of LFERs. These findings further support the basic approximation of the EVB treatment. PMID:23329895
Theory, modelling and calibration of passive samplers used in water monitoring: Chapter 7
Booij, K.; Vrana, B.; Huckins, James N.; Greenwood, Richard B.; Mills, Graham; Vrana, B.
2007-01-01
This chapter discusses contaminant uptake by a passive sampling device (PSD) that consists of a central sorption phase surrounded by a membrane. A variety of models has been used over the past few years to better understand the kinetics of contaminant transfer to passive samplers. These models are essential for understanding how the amounts of absorbed contaminants relate to ambient concentrations, as well as for the design and evaluation of calibration experiments. Models differ in the number of phases and in the simplifying assumptions that are taken into consideration, such as the existence of (pseudo-)steady-state conditions, the presence or absence of linear concentration gradients within the membrane phase, the way in which transport within the water boundary layer (WBL) is modeled, and whether or not the aqueous concentration is constant during the sampler exposure. The chapter introduces the basic concepts and models used in the literature on passive samplers for the special case of triolein-containing semipermeable membrane devices (SPMDs); these can easily be extended to samplers with more or fewer sorption phases. It also discusses the transport of chemicals through the various phases constituting PSDs and the implications of these models for designing and evaluating calibration studies.
Korennoy, F I; Gulenkin, V M; Gogin, A E; Vergne, T; Karaulov, A K
2017-12-01
In 1977, Ukraine experienced a local epidemic of African swine fever (ASF) in the Odessa region. A total of 20 settlements were affected during the course of the epidemic, including both large farms and backyard households. Thanks to timely interventions, virus circulation was successfully eradicated within 6 months, with no additional outbreaks. A detailed report of the outbreak investigation has been publicly available since 2014. The report contains quantitative data that allow the ASF spread dynamics to be studied over the course of the epidemic. In our study, we used this historical epidemic to estimate the basic reproductive number of the ASF virus both within and between farms. The basic reproductive number (R0) represents the average number of secondary infections caused by one infectious unit during its infectious period in a susceptible population. Calculations were made under the assumption of exponential initial growth by fitting an approximating curve to the initial segments of the epidemic curves. R0 within farms and between farms was estimated at 7.46 (95% confidence interval: 5.68-9.21) and 1.65 (1.42-1.88), respectively. The corresponding daily transmission rates were estimated at 1.07 (0.81-1.32) and 0.09 (0.07-0.10). These estimates based on historical data are consistent with those using data generated by the recent epidemic currently affecting eastern Europe. Such results contribute to the published knowledge on ASF transmission dynamics under natural conditions and could be used to model and predict the spread of ASF in affected and non-affected regions and to evaluate the effectiveness of different control measures. © 2016 Blackwell Verlag GmbH.
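A minimal version of the estimation step is sketched below: fit a log-linear curve to the initial segment of an epidemic curve to obtain the growth rate, then convert it to R0 via a simple SIR-type relation. The case counts and the assumed infectious period are illustrative only, not values from the Odessa report.

```python
# Sketch (illustrative data): estimating R0 from exponential initial growth.
import numpy as np

days = np.arange(10)
cases = np.array([1, 2, 3, 6, 10, 17, 29, 48, 80, 133])  # toy cumulative counts

# Fit log(cases) = intercept + r * t; the slope r is the growth rate.
r, intercept = np.polyfit(days, np.log(cases), 1)

T = 7.0            # assumed mean infectious period in days (an assumption)
R0 = 1.0 + r * T   # one common SIR-type relation between r and R0
print(f"growth rate r = {r:.3f}/day, R0 estimate = {R0:.2f}")
```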
Dindo, Maria Luisa; Vandicke, Jonas; Marchetti, Elisa; Spranghers, Thomas; Bonte, Jochem; De Clercq, Patrick
2016-04-01
The effect of supplementing hemolymph of the black soldier fly, Hermetia illucens (L.), or the Chinese oak silkworm, Antheraea pernyi (Guérin-Méneville), to a basic insect-free artificial medium for the tachinid Exorista larvarum (L.) was investigated. The supplementation (20% w/w) was based on the assumption that insect additives may optimize the media for this parasitoid. Egg hatch, pupal and adult yields, and sex ratio did not differ among the enriched and basic media. Preimaginal development was faster on both hemolymph-enriched media than on the basic medium. Although development was shorter on the medium supplemented with H. illucens hemolymph than on the basic medium, puparium weights on the two media were comparable. The female flies reared on the medium enriched with H. illucens hemolymph did not lay more eggs than control females, but their eggs yielded significantly more puparia. Conversely, the medium enriched with A. pernyi hemolymph yielded lower female puparium weights than the basic medium and produced only one ovipositing female out of the five female adults obtained. These results indicate that the in vitro development of E. larvarum improved when the basic artificial medium was enriched with H. illucens hemolymph, whereas supplementation with A. pernyi hemolymph negatively affected the quality of the in vitro-reared females.
Juvenile Radio-Tag Study: Lower Granite Dam, 1985 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stuehrenberg, Lowell C.
The concept of using mass releases of juvenile radio tags represents a new and potentially powerful research tool that could be effectively applied to juvenile salmonid passage problems at dams on the Columbia and Snake Rivers. A system of detector antennas, strategically located, would automatically detect and record individually tagged juvenile salmonids as they pass through the spillway, powerhouse, bypass system, or tailrace areas below the dam. Accurate measurements of spill effectiveness, fish guiding efficiency (FGE), collection efficiency (CE), spillway survival, powerhouse survival, and bypass survival would be possible without handling large numbers of unmarked fish. A prototype juvenile radio-tag system was developed and tested by the National Marine Fisheries Service (NMFS) and Bonneville Power Administration (BPA) at John Day Dam and at Lower Granite Dam. This report summarizes research to (1) evaluate the effectiveness of the prototype juvenile radio-tag system in a field situation and (2) test the basic assumptions inherent in using the juvenile radio tag as a research tool.
Ghasemizadeh, Reza; Hellweger, Ferdinand; Butscher, Christoph; Padilla, Ingrid; Vesper, Dorothy; Field, Malcolm; Alshawabkeh, Akram
2013-01-01
Karst systems have a high degree of heterogeneity and anisotropy, which makes them behave very differently from other aquifers. Slow seepage through the rock matrix and fast flow through conduits and fractures result in a high variation in spring response to precipitation events. Contaminant storage occurs in the rock matrix and epikarst, but contaminant transport occurs mostly along preferential pathways that are typically inaccessible locations, which makes modeling of karst systems challenging. Computer models for understanding and predicting hydraulics and contaminant transport in aquifers make assumptions about the distribution and hydraulic properties of geologic features that may not always apply to karst aquifers. This paper reviews the basic concepts, mathematical descriptions, and modeling approaches for karst systems. The North Coast Limestone aquifer system of Puerto Rico (USA) is introduced as a case study to illustrate and discuss the application of groundwater models in karst aquifer systems to evaluate aquifer contamination. PMID:23645996
NASA Astrophysics Data System (ADS)
Park, J.; Lim, Y. J.; Sung, J. H.; Kang, H. S.
2017-12-01
The widely used meteorological drought index, the Standardized Precipitation Index (SPI), basically assumes stationarity, but recent changes in the climate have led to a need to review this hypothesis. In this study, a new non-stationary SPI that considers not only the modified probability distribution parameters but also the return period under the non-stationary process has been proposed. The results are evaluated for two severe drought cases during the last 10 years in South Korea. As a result, SPIs under the non-stationary hypothesis underestimated the drought severity compared with the stationary SPI, even though these past two droughts were recognized as significantly severe droughts. This may be because the variances of summer and autumn precipitation become larger over time, which can make the shape of the probability distribution function wider than before. This understanding implies that drought severity expressed by a statistical index such as the SPI can be distorted by the stationarity assumption, and that a cautious approach is needed when deciding drought levels under a changing climate.
Calculation of thermomechanical fatigue life based on isothermal behavior
NASA Technical Reports Server (NTRS)
Halford, Gary R.; Saltsman, James F.
1987-01-01
The isothermal and thermomechanical fatigue (TMF) crack initiation response of a hypothetical material was analyzed. Expected thermomechanical behavior was evaluated numerically based on simple, isothermal, cyclic stress-strain - time characteristics and on strainrange versus cyclic life relations that have been assigned to the material. The attempt was made to establish basic minimum requirements for the development of a physically accurate TMF life-prediction model. A worthy method must be able to deal with the simplest of conditions: that is, those for which thermal cycling, per se, introduces no damage mechanisms other than those found in isothermal behavior. Under these assumed conditions, the TMF life should be obtained uniquely from known isothermal behavior. The ramifications of making more complex assumptions will be dealt with in future studies. Although analyses are only in their early stages, considerable insight has been gained in understanding the characteristics of several existing high-temperature life-prediction methods. The present work indicates that the most viable damage parameter is based on the inelastic strainrange.
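The inelastic strainrange parameter favored above is usually tied to cyclic life through a Manson-Coffin type power law; in generic notation (C and beta are fitted material constants, not values from this study):

```latex
\Delta\varepsilon_{\mathrm{in}} = C\, N_f^{\beta}
\qquad\Longrightarrow\qquad
N_f = \left( \frac{\Delta\varepsilon_{\mathrm{in}}}{C} \right)^{1/\beta}
```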
NASA Astrophysics Data System (ADS)
Park, Junehyeong; Sung, Jang Hyun; Lim, Yoon-Jin; Kang, Hyun-Suk
2018-05-01
The widely used meteorological drought index, the Standardized Precipitation Index (SPI), basically assumes stationarity, but recent changes in the climate have led to a need to review this hypothesis. In this study, a new non-stationary SPI that considers not only the modified probability distribution parameters but also the return period under the non-stationary process was proposed. The results were evaluated for two severe drought cases during the last 10 years in South Korea. As a result, SPIs under the non-stationary hypothesis underestimated the drought severity compared with the stationary SPI, even though these past two droughts were recognized as significantly severe droughts. This may be because the variances of summer and autumn precipitation become larger over time, which can make the probability distribution wider than before. This implies that drought severity expressed by a statistical index such as the SPI can be distorted by the stationarity assumption, and that a cautious approach is needed when deciding drought levels under a changing climate.
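For reference, the stationary SPI that both versions of this study take as the baseline can be computed as below: fit a gamma distribution to accumulated precipitation and map the fitted CDF to standard normal quantiles. The series is synthetic, the usual adjustment for zero-precipitation months is omitted, and a non-stationary variant would let the gamma parameters vary with time.

```python
# Sketch: stationary SPI from an accumulation-period precipitation series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
precip = rng.gamma(shape=2.0, scale=30.0, size=50)  # toy 3-month totals, mm

# Stationary gamma fit (location fixed at 0, conventional for precipitation).
shape, loc, scale = stats.gamma.fit(precip, floc=0)

cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
spi = stats.norm.ppf(cdf)       # SPI = standard normal quantile of the CDF
print(f"most negative SPI (driest period): {spi.min():.2f}")
```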
Validity of the mockwitness paradigm: testing the assumptions.
McQuiston, Dawn E; Malpass, Roy S
2002-08-01
Mockwitness identifications are used to provide a quantitative measure of lineup fairness. Some theoretical and practical assumptions of this paradigm have not been studied in terms of mockwitnesses' decision processes and procedural variation (e.g., instructions, lineup presentation method), and the current experiment was conducted to empirically evaluate these assumptions. Four hundred and eighty mockwitnesses were given physical information about a culprit, received 1 of 4 variations of lineup instructions, and were asked to identify the culprit from either a fair or unfair sequential lineup containing 1 of 2 targets. Lineup bias estimates varied as a result of lineup fairness and the target presented. Mockwitnesses generally reported that the target's physical description was their main source of identifying information. Our findings support the use of mockwitness identifications as a useful technique for sequential lineup evaluation, but only for mockwitnesses who selected only 1 lineup member. Recommendations for the use of this evaluation procedure are discussed.
Evaluation of assumptions in soil moisture triple collocation analysis
USDA-ARS?s Scientific Manuscript database
Triple collocation analysis (TCA) enables estimation of error variances for three or more products that retrieve or estimate the same geophysical variable using mutually-independent methods. Several statistical assumptions regarding the statistical nature of errors (e.g., mutual independence and ort...
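The classical covariance-based TCA estimator referred to above can be written in a few lines; the sketch below uses synthetic data and assumes zero-mean, mutually independent errors and identical scaling of the three products (the assumptions the manuscript evaluates).

```python
# Sketch: covariance-based triple collocation error variances for three
# products measuring the same variable. Data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
truth = rng.normal(0.25, 0.05, 5000)           # "true" soil moisture
x = truth + rng.normal(0, 0.02, truth.size)    # product 1
y = truth + rng.normal(0, 0.03, truth.size)    # product 2
z = truth + rng.normal(0, 0.04, truth.size)    # product 3

C = np.cov(np.vstack([x, y, z]))
err_x = C[0, 0] - C[0, 1] * C[0, 2] / C[1, 2]  # Var of product-1 error
err_y = C[1, 1] - C[0, 1] * C[1, 2] / C[0, 2]
err_z = C[2, 2] - C[0, 2] * C[1, 2] / C[0, 1]
print("estimated error std devs:", np.sqrt([err_x, err_y, err_z]).round(3))
```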
He, Xin; Frey, Eric C
2006-08-01
Previously, we have developed a decision model for three-class receiver operating characteristic (ROC) analysis based on decision theory. The proposed decision model maximizes the expected decision utility under the assumption that incorrect decisions have equal utilities under the same hypothesis (equal error utility assumption). This assumption reduced the dimensionality of the "general" three-class ROC analysis and provided a practical figure-of-merit to evaluate three-class task performance. However, it also limits the generality of the resulting model because the equal error utility assumption will not apply for all clinical three-class decision tasks. The goal of this study was to investigate the optimality of the proposed three-class decision model with respect to several other decision criteria. In particular, besides the maximum expected utility (MEU) criterion used in the previous study, we investigated the maximum-correctness (MC) (or minimum-error), maximum likelihood (ML), and Neyman-Pearson (N-P) criteria. We found that by making assumptions for both the MEU and N-P criteria, all decision criteria lead to the previously proposed three-class decision model. As a result, this model maximizes the expected utility under the equal error utility assumption, maximizes the probability of making correct decisions, satisfies the N-P criterion in the sense that it maximizes the sensitivity of one class given the sensitivities of the other two classes, and the resulting ROC surface contains the maximum likelihood decision operating point. While the proposed three-class ROC analysis model is not optimal in the general sense due to the use of the equal error utility assumption, the range of criteria for which it is optimal increases its applicability for evaluating and comparing a range of diagnostic systems.
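In generic decision-theoretic notation (not the paper's exact symbols), the MEU rule underlying the model picks, for observed data x, the class decision that maximizes the utility-weighted posterior, with U_ij the utility of deciding class i when class j is true:

```latex
\hat{D}(x) = \arg\max_{i \in \{1,2,3\}} \; \sum_{j=1}^{3} U_{ij}\, P(H_j \mid x)
```

The equal error utility assumption constrains the incorrect-decision utilities within each true class j to be equal, which is what collapses the general three-class problem to the tractable model described above.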
Koopmeiners, Joseph S; Hobbs, Brian P
2018-05-01
Randomized, placebo-controlled clinical trials are the gold standard for evaluating a novel therapeutic agent. In some instances, it may not be considered ethical or desirable to complete a placebo-controlled clinical trial and, instead, the placebo is replaced by an active comparator with the objective of showing either superiority or non-inferiority to the active comparator. In a non-inferiority trial, the experimental treatment is considered non-inferior if it retains a pre-specified proportion of the effect of the active comparator as represented by the non-inferiority margin. A key assumption required for valid inference in the non-inferiority setting is the constancy assumption, which requires that the effect of the active comparator in the non-inferiority trial is consistent with the effect that was observed in previous trials. It has been shown that violations of the constancy assumption can result in a dramatic increase in the rate of incorrectly concluding non-inferiority in the presence of ineffective or even harmful treatment. In this paper, we illustrate how Bayesian hierarchical modeling can be used to facilitate multi-source smoothing of the data from the current trial with the data from historical studies, enabling direct probabilistic evaluation of the constancy assumption. We then show how this result can be used to adapt the non-inferiority margin when the constancy assumption is violated and present simulation results illustrating that our method controls the type-I error rate when the constancy assumption is violated, while retaining the power of the standard approach when the constancy assumption holds. We illustrate our adaptive procedure using a non-inferiority trial of raltegravir, an antiretroviral drug for the treatment of HIV.
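A deliberately minimal caricature of the multi-source idea is sketched below: a normal-normal model pools historical estimates of the active-comparator effect and asks how surprising the effect observed in the current trial would be. All numbers, and the fixed between-trial standard deviation, are hypothetical; the authors' actual procedure is a fuller Bayesian hierarchical model with margin adaptation.

```python
# Sketch (hypothetical numbers): probabilistic check of the constancy
# assumption by pooling historical active-comparator effects.
import numpy as np
from scipy import stats

hist_effect = np.array([0.42, 0.38, 0.50])  # historical AC effect estimates
hist_se = np.array([0.08, 0.10, 0.09])      # their standard errors
tau = 0.05                                  # assumed between-trial SD

# Precision-weighted prediction of the AC effect in the new trial.
w = 1.0 / (hist_se**2 + tau**2)
pred_mean = np.sum(w * hist_effect) / np.sum(w)
pred_sd = np.sqrt(1.0 / np.sum(w) + tau**2)

current_effect, current_se = 0.20, 0.09     # AC effect seen in the NI trial

# Predictive tail probability of an effect this small or smaller;
# a small value flags a likely constancy violation.
z = (current_effect - pred_mean) / np.sqrt(pred_sd**2 + current_se**2)
print(f"P(effect <= observed) = {stats.norm.cdf(z):.3f}")
```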
A control method for bilateral teleoperating systems
NASA Astrophysics Data System (ADS)
Strassberg, Yesayahu
1992-01-01
The thesis focuses on control of bilateral master-slave teleoperators. The bilateral control issue of teleoperators is studied and a new scheme that overcomes basic unsolved problems is proposed. A performance measure, based on the multiport modeling method, is introduced in order to evaluate and understand the limitations of earlier published bilateral control laws. Based on the study evaluating the different methods, the objective of the thesis is stated. The proposed control law is then introduced, its ideal performance is demonstrated, and conditions for stability and robustness are derived. It is shown that stability, desired performance, and robustness can be obtained under the assumption that the deviation of the model from the actual system satisfies certain norm inequalities and the measurement uncertainties are bounded. The proposed scheme is validated by numerical simulation. The simulated system is based on the configuration of the RAL (Robotics and Automation Laboratory) telerobot. From the simulation results it is shown that good tracking performance can be obtained. In order to verify the performance of the proposed scheme when applied to a real hardware system, an experimental setup of a three-degree-of-freedom master-slave teleoperator (i.e., a three-degree-of-freedom master and a three-degree-of-freedom slave robot) was built. Three basic experiments were conducted to verify the performance of the proposed control scheme. The first experiment verified the master control law and its contribution to the robustness and performance of the entire system. The second experiment demonstrated the actual performance of the system while performing a free motion teleoperating task. From the experimental results, it is shown that the control law has good performance and is robust to uncertainties in the models of the master and slave.
On knowing the unconscious: lessons from the epistemology of geometry and space.
Brakel, L A
1994-02-01
Concepts involving unconscious processes and contents are central to any understanding of psychoanalysis. Indeed, the dynamic unconscious is familiar as a necessary assumption of the psychoanalytic method. Using the manner of knowing the geometry of space, including non-ordinary sized space, this paper attempts to demonstrate by analogy the possibility of knowing (and knowing the nature of) unconscious mentation: that of which by definition we cannot be aware, and yet that which constitutes a basic assumption of psychoanalysis. As an assumption of the psychoanalytic method, no amount of data from within the psychoanalytic method can ever provide evidence for the existence of the unconscious, nor for knowing its nature; hence the need for this sort of illustration by analogy. Along the way, three claims are made: (1) the Freudian 'secondary process' operating during everyday adult, normal, logical thought can be considered a modernised version of the Kantian categories. (2) Use of models facilitates the generation of outside-the-Kantian-categories possibilities, and also provides a conserving function, as outside-the-categories possibilities can be assimilated. (3) Transformations are different from translations; knowledge of transformations can provide non-trivial knowledge about various substrates that are otherwise difficult to know.
NASA Astrophysics Data System (ADS)
Fontaine, G.; Dufour, P.; Chayer, P.; Dupuis, J.; Brassard, P.
2015-06-01
The accretion-diffusion picture is the model par excellence for describing the presence of planetary debris polluting the atmospheres of relatively cool white dwarfs. Inferences on the process based on diffusion timescale arguments make the implicit assumption that the concentration gradient of a given metal at the base of the convection zone is negligible. This assumption is, in fact, not rigorously valid, but it allows the decoupling of the surface abundance from the evolving distribution of a given metal in deeper layers. A better approach is a full time-dependent calculation of the evolution of the abundance profile of an accreting-diffusing element. We used the same approach as that developed by Dupuis et al. to model accretion episodes involving many more elements than those considered by these authors. Our calculations incorporate the improvements to diffusion physics mentioned in Paper I. The basic assumption in the Dupuis et al. approach is that the accreted metals are trace elements, i.e., that they have no effects on the background (DA or non-DA) stellar structure. This allows us to consider an arbitrary number of accreting elements.
Area, length and thickness conservation: Dogma or reality?
NASA Astrophysics Data System (ADS)
Moretti, Isabelle; Callot, Jean Paul
2012-08-01
The basic assumption of quantitative structural geology is the preservation of material during deformation. However the hypothesis of volume conservation alone does not help to predict past or future geometries and so this assumption is usually translated into bed length in 2D (or area in 3D) and thickness conservation. When subsurface data are missing, geologists may extrapolate surface data to depth using the kink-band approach. These extrapolations, preserving both thicknesses and dips, lead to geometries which are restorable but often erroneous, due to both disharmonic deformation and internal deformation of layers. First, the Bolivian Sub-Andean Zone case is presented to highlight the evolution of the concepts on which balancing is based, and the important role played by a decoupling level in enhancing disharmony. Second, analogue models are analyzed to test the validity of the balancing techniques. Chamberlin's excess area approach is shown to be on average valid. However, neither the length nor the thicknesses are preserved. We propose that in real cases, the length preservation hypothesis during shortening could also be a wrong assumption. If the data are good enough to image the decollement level, the Chamberlin excess area method could be used to compute the bed length changes.
General solutions for the oxidation kinetics of polymers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillen, K.T.; Clough, R.L.; Wise, J.
1996-08-01
The simplest general kinetic schemes applicable to the oxidation of polymers are presented, discussed and analyzed in terms of the underlying kinetic assumptions. For the classic basic autoxidation scheme (BAS), which involves three bimolecular termination steps and is applicable mainly to unstabilized polymers, typical assumptions used singly or in groups include (1) long kinetic chain length, (2) a specific ratio of the termination rate constants and (3) insensitivity to the oxygen concentration (e.g., domination by a single termination step). Steady-state solutions for the rate of oxidation are given in terms of one, two, three, or four parameters, corresponding respectively to three, two, one, or zero kinetic assumptions. The recently derived four-parameter solution predicts conditions yielding unusual dependencies of the oxidation rate on oxygen concentration and on initiation rate, as well as conditions leading to some unusual diffusion-limited oxidation profile shapes. For stabilized polymers, unimolecular termination schemes are typically more appropriate than bimolecular. Kinetics incorporating unimolecular termination reactions are shown to result in very simple oxidation expressions which have been experimentally verified for both radiation-initiated oxidation of an EPDM and thermoxidative degradation of nitrile and chloroprene elastomers.
Classical geometric resolution of the Einstein—Podolsky—Rosen paradox
Ne'eman, Yuval
1983-01-01
I show that, in the geometry of a fiber bundle describing a gauge theory, curvature and parallel transport ensure and impose nonseparability. The “Einstein—Podolsky—Rosen paradox” is thus resolved “classically.” I conjecture that the ostentatiously “implausible” features of the quantum treatment are due to the fact that space—time separability, a basic assumption of single-particle nonrelativistic quantum mechanics, does not fit the bundle geometry of the complete physics. PMID:16593392
Boundary layer transition: A review of theory, experiment and related phenomena
NASA Technical Reports Server (NTRS)
Kistler, E. L.
1971-01-01
The overall problem of boundary layer flow transition is reviewed. Evidence indicates a need for new, basic physical hypotheses in classical fluid mechanics math models based on the Navier-Stokes equations. The Navier-Stokes equations are challenged as inadequate for the investigation of fluid transition, since they are based on several assumptions which should be expected to alter significantly the stability characteristics of the resulting math model. Strong prima facie evidence is presented to this effect.
NASA Technical Reports Server (NTRS)
Perez-Peraza, J.; Alvarez, M.; Laville, A.; Gallegos, A.
1985-01-01
The study of charge changing cross sections of fast ions colliding with matter provides the fundamental basis for the analysis of the charge states produced in such interactions. Given the high degree of complexity of the phenomena, no theoretical treatment is able to give a comprehensive description. In fact, the processes involved depend strongly on the basic parameters of the projectile, such as velocity, charge state, and atomic number, and on the target parameters, namely the physical state (molecular, atomic or ionized matter) and density. The target velocity may also influence the process, through the temperature of the traversed medium. In addition, multiple electron transfer in single collisions complicates the phenomena further. In simplified cases, such as protons moving through atomic hydrogen, considerable agreement has been obtained between theory and experiment. However, in general the available theoretical approaches have only limited validity in restricted regions of the basic parameters. Since most measurements of charge changing cross sections are performed in atomic matter at ambient temperature, models are commonly based on the assumption of targets at rest; however, at astrophysical scales, temperature displays a wide range in atomic and ionized matter. Therefore, due to the lack of experimental data, an attempt is made here to quantify temperature-dependent cross sections on the basis of somewhat arbitrary, but physically reasonable, assumptions.
Curvilinear steel elements in load-bearing structures of high-rise building spatial frames
NASA Astrophysics Data System (ADS)
Ibragimov, Alexander; Danilov, Alexander
2018-03-01
The application of curvilinear elements in load-bearing metal structures of high-rise buildings presupposes ensuring their bearing capacity and serviceability. A great variety of shapes and orientations of such structural elements may exist. In particular, these may be various flat curves of an open or closed oval profile, such as a circular or parabolic arch or an ellipse. The considered approach implies creating vast internal volumes without loss in the load-bearing capacity of the frame. The basic concept makes possible a wide variety of layout and design solutions. The presence of free internal spaces of large volume in "skyscraper"-type buildings contributes to resolving a great number of problems, including those of a communicative nature. The calculation results confirm the basic assumptions.
Dendrite and Axon Specific Geometrical Transformation in Neurite Development
Mironov, Vasily I.; Semyanov, Alexey V.; Kazantsev, Victor B.
2016-01-01
We propose a model of neurite growth to explain the differences in dendrite and axon specific neurite development. The model implements basic molecular kinetics, e.g., building protein synthesis and transport to the growth cone, and includes explicit dependence of the building kinetics on the geometry of the neurite. The basic assumption was that the radius of the neurite decreases with length. We found that the neurite dynamics crucially depended on the relationship between the rate of active transport and the rate of morphological changes. If these rates were in the balance, then the neurite displayed axon specific development with a constant elongation speed. For dendrite specific growth, the maximal length was rapidly saturated by degradation of building protein structures or limited by proximal part expansion reaching the characteristic cell size. PMID:26858635
Soft Robotics: New Perspectives for Robot Bodyware and Control
Laschi, Cecilia; Cianchetti, Matteo
2014-01-01
The remarkable advances of robotics in the last 50 years, which represent an incredible wealth of knowledge, are based on the fundamental assumption that robots are chains of rigid links. The use of soft materials in robotics, driven not only by new scientific paradigms (biomimetics, morphological computation, and others), but also by many applications (biomedical, service, rescue robots, and many more), is going to overcome these basic assumptions and makes the well-known theories and techniques poorly applicable, opening new perspectives for robot design and control. The current examples of soft robots represent a variety of solutions for actuation and control. Though these are only first steps, they have the potential for a radical technological change. Soft robotics is not just a new direction of technological development, but a novel approach to robotics, unhinging its fundamentals, with the potential to produce a new generation of robots, in the support of humans in our natural environments. PMID:25022259
Spreading dynamics on complex networks: a general stochastic approach.
Noël, Pierre-André; Allard, Antoine; Hébert-Dufresne, Laurent; Marceau, Vincent; Dubé, Louis J
2014-12-01
Dynamics on networks is considered from the perspective of Markov stochastic processes. We partially describe the state of the system through network motifs and infer any missing data using the available information. This versatile approach is especially well adapted for modelling spreading processes and/or population dynamics. In particular, the generality of our framework and the fact that its assumptions are explicitly stated suggest that it could be used as a common ground for comparing existing epidemics models too complex for direct comparison, such as agent-based computer simulations. We provide many examples for the special cases of susceptible-infectious-susceptible and susceptible-infectious-removed dynamics (e.g., epidemics propagation) and we observe multiple situations where accurate results may be obtained at low computational cost. Our perspective reveals a subtle balance between the complex requirements of a realistic model and its basic assumptions.
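As a concrete special case of the framework, the sketch below runs a discrete-time stochastic SIS process on an Erdos-Renyi graph; all parameters are illustrative, and the paper's motif-based description is not reproduced here.

```python
# Sketch: discrete-time stochastic SIS dynamics on a random graph.
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
G = nx.erdos_renyi_graph(n=500, p=0.02, seed=3)
beta, mu = 0.1, 0.05          # per-contact infection prob., recovery prob.

infected = set(int(i) for i in rng.choice(500, size=5, replace=False))
for step in range(100):
    new_infected = set(infected)
    for node in infected:
        # each susceptible neighbor is infected with probability beta
        for nb in G.neighbors(node):
            if nb not in infected and rng.random() < beta:
                new_infected.add(nb)
        if rng.random() < mu:   # recovery returns the node to susceptible
            new_infected.discard(node)
    infected = new_infected
print("prevalence after 100 steps:", len(infected) / 500)
```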
NASA Technical Reports Server (NTRS)
Hornstein, Rhoda Shaller; Willoughby, John K.
1991-01-01
Traditional practice of systems engineering management assumes that requirements can be precisely determined and unambiguously defined prior to system design and implementation; practice further assumes that requirements are held static during implementation. Human-computer decision support systems for service planning and scheduling applications do not conform well to these assumptions. Adaptations to the traditional practice of systems engineering management are required. Basic technology exists to support these adaptations. Additional innovations must be encouraged and nurtured. Continued partnership between the programmatic and technical perspectives assures a proper balance of the impossible with the possible. Past problems have the following origins: not recognizing the unusual and perverse nature of the requirements for planning and scheduling; not recognizing the best starting-point assumptions for the design; not understanding the type of system being built; and not understanding the design consequences of the operations concept selected.
Shafir, Eldar; LeBoeuf, Robyn A
2002-01-01
This chapter reviews selected findings in research on reasoning, judgment, and choice and considers the systematic ways in which people violate basic requirements of the corresponding normative analyses. Recent objections to the empirical findings are then considered; these objections question the findings' relevance to assumptions about rationality. These objections address the adequacy of the tasks used in the aforementioned research and the appropriateness of the critical interpretation of participants' responses, as well as the justifiability of some of the theoretical assumptions made by experimenters. The objections are each found not to seriously impinge on the general conclusion that people often violate tenets of rationality in inadvisable ways. In the process, relevant psychological constructs, ranging from cognitive ability and need for cognition, to dual process theories and the role of incentives, are discussed. It is proposed that the rationality critique is compelling and rightfully gaining influence in the social sciences in general.
NASA Astrophysics Data System (ADS)
Chung, Kun-Jen
2013-09-01
An inventory problem involves many factors influencing inventory decisions. The traditional economic production quantity (EPQ) model plays a rather important role in inventory analysis. Although traditional EPQ models are still widely used in industry, practitioners frequently question the validity of their assumptions, so that their use encounters challenges and difficulties. This article therefore presents a new inventory model that considers two levels of trade credit, a finite replenishment rate and limited storage capacity together, relaxing the basic assumptions of the traditional EPQ model to broaden its applicability. Keeping the cost-minimisation strategy in mind, four easy-to-use theorems are developed to characterise the optimal solution. Finally, sensitivity analyses are executed to investigate the effects of the various parameters on ordering policies and the annual total relevant costs of the inventory system.
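For orientation, the classical EPQ that the article generalises balances setup and holding costs under a finite production rate; in standard textbook notation (demand rate D, production rate P, setup cost K, unit holding cost h, not the article's extended parameters):

```latex
Q^{*} = \sqrt{\frac{2\,D\,K}{h\,\left(1 - D/P\right)}}
```

Trade credit and the storage limit modify this cost structure, which is presumably why the article characterises the optimum through theorems rather than a single closed form.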
Meyer, Gitte
2016-04-01
There is widespread agreement that the potential of gene therapy was oversold in the early 1990s. This study, however, comparing written material from the British, Danish and German gene therapy discourses of the period finds significant differences: Over-optimism was not equally strong everywhere; gene therapy was not universally hyped. Against that background, attention is directed towards another area of variation in the material: different basic assumptions about science and scientists. Exploring such culturally rooted assumptions and beliefs and their possible significance to science communication practices, it is argued that deep beliefs may constitute drivers of hype that are particularly difficult to deal with. To participants in science communication, the discouragement of hype, viewed as a practical-ethical challenge, can be seen as a learning exercise that includes critical attention to internalised beliefs. © The Author(s) 2014.
Expanding Advanced Civilizations in the Universe
NASA Astrophysics Data System (ADS)
Gros, C.
The 1950 lunch-table remark by Enrico Fermi, 'Where is everybody?', has started intensive scientific and philosophical discussions about what we nowadays call the 'Fermi paradox': if there had ever been a single advanced civilization in the cosmological history of our galaxy, dedicated to expansion, it would have had plenty of time to colonize the entire galaxy via exponential growth. No evidence of present or past alien visits to Earth is known to us, leading to the standard conclusion that no advanced expanding civilization has ever existed in the Milky Way. This conclusion rests fundamentally on the ad-hoc assumption that any alien civilization dedicated to expansion at one time would remain dedicated to expansion forever. Considering our limited knowledge about alien civilizations, we need, however, to relax this basic assumption. Here we show that a substantial and stable population of expanding advanced civilizations might consequently exist in our galaxy.
Guikema, Seth
2012-07-01
Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. © 2011 Society for Risk Analysis.
Applications of non-parametric statistics and analysis of variance on sample variances
NASA Technical Reports Server (NTRS)
Myers, R. H.
1981-01-01
Nonparametric methods that are available for NASA-type applications are discussed. An attempt is made here to survey what can be used, to offer recommendations as to when each method would be applicable, and to compare the methods, when possible, with the usual normal-theory procedures that are available for the Gaussian analog. It is important here to point out the hypotheses that are being tested, the assumptions that are being made, and the limitations of the nonparametric procedures. The appropriateness of doing analysis of variance on sample variances is also discussed and studied. This procedure is followed in several NASA simulation projects. On the surface this would appear to be a reasonably sound procedure. However, the difficulties involved center around the normality problem and the basic homogeneous-variance assumption that is made in usual analysis of variance problems. These difficulties are discussed and guidelines are given for using the methods.
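As one concrete pairing in the spirit of this survey, the sketch below uses the Kruskal-Wallis test in place of one-way ANOVA on means, and Levene's median-centered test as a robust alternative to running an analysis of variance on sample variances directly; the data are simulated.

```python
# Sketch: nonparametric/robust analogues of normal-theory procedures.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
g1 = rng.normal(10.0, 1.0, 30)   # three simulated treatment groups
g2 = rng.normal(10.5, 1.0, 30)
g3 = rng.normal(10.0, 2.5, 30)   # inflated variance in group 3

H, p_kw = stats.kruskal(g1, g2, g3)                   # location, no normality needed
W, p_lev = stats.levene(g1, g2, g3, center="median")  # variance homogeneity
print(f"Kruskal-Wallis p = {p_kw:.3f}, Levene p = {p_lev:.3f}")
```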
Causal Mediation Analysis: Warning! Assumptions Ahead
ERIC Educational Resources Information Center
Keele, Luke
2015-01-01
In policy evaluations, interest may focus on why a particular treatment works. One tool for understanding why treatments work is causal mediation analysis. In this essay, I focus on the assumptions needed to estimate mediation effects. I show that there is no "gold standard" method for the identification of causal mediation effects. In…
Tip of the Tongue States Increase Under Evaluative Observation.
James, Lori E; Schmank, Christopher J; Castro, Nichol; Buchanan, Tony W
2018-02-01
We tested the frequent assumption that the difficulty of word retrieval increases when a speaker is being observed and evaluated. We modified the Trier Social Stress Test (TSST) so that participants believed that its evaluative observation components continued throughout the duration of a subsequent word retrieval task, and measured participants' reported tip of the tongue (TOT) states. Participants in this TSST condition experienced more TOTs than participants in a comparable, placebo TSST condition in which there was no suggestion of evaluative observation. This experiment provides initial evidence confirming the assumption that evaluative observation by a third party can be disruptive to word retrieval. We interpret our findings by proposing an extension to a well-supported theoretical model of TOTs.
Basic Emotions in Human Neuroscience: Neuroimaging and Beyond.
Celeghin, Alessia; Diano, Matteo; Bagnis, Arianna; Viola, Marco; Tamietto, Marco
2017-01-01
The existence of so-called 'basic emotions' and their defining attributes represents a long-lasting and yet unsettled issue in psychology. Recently, neuroimaging evidence, especially related to the advent of neuroimaging meta-analytic methods, has revitalized this debate in the endeavor of systems and human neuroscience. The core theme focuses on the existence of unique neural bases that are specific and characteristic for each instance of basic emotion. Here we review this evidence, outlining contradictory findings, strengths and limits of different approaches. Constructionism dismisses the existence of dedicated neural structures for basic emotions, considering that the assumption of a one-to-one relationship between neural structures and their functions is central to basic emotion theories. While these critiques are useful to pinpoint current limitations of basic emotions theories, we argue that they do not always appear equally generative in fostering new testable accounts on how the brain relates to affective functions. We then consider evidence beyond PET and fMRI, including results concerning the relation between basic emotions and awareness and data from neuropsychology on patients with focal brain damage. Evidence from lesion studies is indeed particularly informative, as such studies are able to bring correlational evidence typical of neuroimaging studies to causation, thereby characterizing which brain structures are necessary for, rather than simply related to, basic emotion processing. These other studies shed light on attributes often ascribed to basic emotions, such as automaticity of perception, quick onset, and brief duration. Overall, we consider that evidence in favor of the neurobiological underpinnings of basic emotions outweighs dismissive approaches. In fact, the concept of basic emotions can still be fruitful, if updated to current neurobiological knowledge that overcomes traditional one-to-one localization of functions in the brain. In particular, we propose that the structure-function relationship between brain and emotions is better described in terms of pluripotentiality, which refers to the fact that one neural structure can fulfill multiple functions, depending on the functional network and pattern of co-activations displayed at any given moment.
Robust multi-atlas label propagation by deep sparse representation
Zu, Chen; Wang, Zhengxia; Zhang, Daoqiang; Liang, Peipeng; Shi, Yonghong; Shen, Dinggang; Wu, Guorong
2016-01-01
Recently, multi-atlas patch-based label fusion has achieved many successes in the medical imaging area. The basic assumption in the current state-of-the-art approaches is that the image patch at the target image point can be represented by a patch dictionary consisting of atlas patches from registered atlas images. Therefore, the label at the target image point can be determined by fusing the labels of atlas image patches with similar anatomical structures. However, such an assumption about image patch representation does not always hold in label fusion since (1) the image content within the patch may be corrupted due to noise and artifact; and (2) the distribution of morphometric patterns among atlas patches might be unbalanced, such that the majority patterns can dominate the label fusion result over other minority patterns. The violation of the above basic assumptions could significantly undermine the label fusion accuracy. To overcome these issues, we first consider forming a label-specific group for the atlas patches with the same label. Then, we alter the conventional flat and shallow dictionary to a deep multi-layer structure, where the top layer (label-specific dictionaries) consists of groups of representative atlas patches and the subsequent layers (residual dictionaries) hierarchically encode the patchwise residual information at different scales. Thus, the label fusion follows the representation consensus across representative dictionaries. The representation of the target patch in each group is iteratively optimized by using the representative atlas patches in each label-specific dictionary exclusively to match the principal patterns, and also by using all residual patterns across groups collaboratively to overcome the issue that some groups might lack certain variation patterns present in the target image patch. Promising segmentation results have been achieved in labeling the hippocampus on the ADNI dataset, as well as basal ganglia and brainstem structures, compared to other counterpart label fusion methods. PMID:27942077
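For orientation, the conventional baseline that this line of work improves upon can be sketched as similarity-weighted label voting over atlas patches. The minimal Python sketch below is illustrative only; the patch size, Gaussian weighting, and sigma value are assumptions on our part, not the paper's deep sparse representation method:

```python
import numpy as np

def fuse_labels(target_patch, atlas_patches, atlas_labels, sigma=0.5):
    """Similarity-weighted label voting over atlas patches.

    Simplified conventional baseline: each atlas patch votes for its
    label with weight exp(-||target - atlas||^2 / sigma^2).
    """
    diffs = atlas_patches - target_patch          # (n_atlas, patch_dim)
    dist2 = np.sum(diffs ** 2, axis=1)
    weights = np.exp(-dist2 / sigma ** 2)
    votes = {}
    for label, w in zip(atlas_labels, weights):
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)

# Toy example: 3-voxel patches, two candidate labels.
rng = np.random.default_rng(0)
atlas_patches = rng.normal(size=(20, 3))
atlas_labels = rng.integers(0, 2, size=20)
print(fuse_labels(atlas_patches[0] + 0.01, atlas_patches, atlas_labels))
```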
An epidemic model to evaluate the homogeneous mixing assumption
NASA Astrophysics Data System (ADS)
Turnes, P. P.; Monteiro, L. H. A.
2014-11-01
Many epidemic models are written in terms of ordinary differential equations (ODE). This approach relies on the homogeneous mixing assumption; that is, the topological structure of the contact network established by the individuals of the host population is not relevant to predict the spread of a pathogen in this population. Here, we propose an epidemic model based on ODE to study the propagation of contagious diseases conferring no immunity. The state variables of this model are the percentages of susceptible individuals, infectious individuals and empty space. We show that this dynamical system can experience transcritical and Hopf bifurcations. Then, we employ this model to evaluate the validity of the homogeneous mixing assumption by using real data related to the transmission of gonorrhea, hepatitis C virus, human immunodeficiency virus, and obesity.
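As an illustration of the kind of ODE formulation described (not the authors' exact equations), a minimal SIS-type sketch with an explicit empty-space variable might look as follows; all functional forms and parameter values here are assumptions:

```python
import numpy as np
from scipy.integrate import odeint

# Hypothetical SIS-type model with explicit empty space e = 1 - s - i:
# births occupy empty space, infection is mass-action, recovery confers
# no immunity, and deaths free up space. Parameter values are illustrative.
def model(y, t, b, beta, gamma, m, alpha):
    s, i = y
    e = 1.0 - s - i                      # fraction of empty space
    ds = b * s * e - beta * s * i + gamma * i - m * s
    di = beta * s * i - gamma * i - (m + alpha) * i
    return [ds, di]

t = np.linspace(0, 200, 2001)
sol = odeint(model, [0.6, 0.01], t, args=(0.5, 1.2, 0.2, 0.1, 0.05))
print("endemic state (s, i):", sol[-1])
```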
Galactic Cosmic Ray Event-Based Risk Model (GERM) Code
NASA Technical Reports Server (NTRS)
Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.
2013-01-01
This software models the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients in cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly, without unverified assumptions about the shape of the probability density function. The prior art of transport codes calculates the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe the temporal and microspatial density functions needed to correlate DNA and oxidative damage with non-targeted effects such as signaling and bystander effects; these are ignored or impossible to treat in the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of the beam line, shielding of target samples, and sample holders; and estimation of basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution of particle hits for a specified cellular area, cell survival curves, and DNA damage yields per cell. Also, the GERM code calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes numerical estimates of basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at the NASA Space Radiation Laboratory (NSRL) for the purpose of simulating space radiation biological effects. In the first option, properties of monoenergetic beams are treated. In the second option, the transport of beams in different materials is treated, and biophysical properties similar to those in the first option are evaluated for the primary ion and its secondary particles. Additional properties related to the nuclear fragmentation of the beam are evaluated. The GERM code is a computationally efficient Monte-Carlo heavy-ion-beam model. It includes accurate models of LET, range, residual energy, and straggling, and the quantum multiple scattering fragmentation (QMSGRG) nuclear database.
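One of the listed quantities, the Poisson distribution of particle traversals for a specified cellular area, follows directly from the product of fluence and area. The sketch below is illustrative; the fluence and nuclear-area values are hypothetical, not NSRL beam settings:

```python
from scipy.stats import poisson

# Mean number of traversals = fluence (particles/um^2) x nuclear area (um^2).
fluence = 0.02            # hypothetical beam fluence
area = 100.0              # hypothetical cell-nucleus cross-section
mean_hits = fluence * area

for n in range(5):
    print(f"P({n} hits) = {poisson.pmf(n, mean_hits):.3f}")
print("P(at least one hit) =", 1 - poisson.pmf(0, mean_hits))
```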
Evaluation of thyroid radioactivity measurement data from Hanford workers, 1944--1946
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ikenberry, T.A.
1991-05-01
This report describes the preliminary results of an evaluation conducted in support of the Hanford Environmental Dose Reconstruction (HEDR) Project. The primary objective of the HEDR Project is to estimate the radiation doses that populations could have received from nuclear operations at the Hanford Site since 1944. A secondary objective is to make the information that HEDR staff members used in estimating radiation doses available to the public. The objectives of this report are to make available thyroid measurement data from Hanford workers for the years 1944 through 1946, and to investigate the suitability of those data for use in the HEDR dose estimation process. An important part of this investigation was to provide a description of the uncertainty associated with the data. Lack of documentation on thyroid measurements from this period required that assumptions be made to perform data evaluations. These assumptions introduce uncertainty into the evaluations that could be significant. It is important to recognize the nature of these assumptions, the inherent uncertainty, and the propagation of this uncertainty through the data evaluations to any conclusions that can be made by using the data. 15 refs., 1 fig., 5 tabs.
Photometry and spectroscopy of a newly discovered polar - Nova Cygni 1975 (V1500 CYG)
NASA Technical Reports Server (NTRS)
Kaluzny, Janusz; Chlebowski, Tomasz
1988-01-01
The paper reports photometric and spectroscopic observations which led to the conclusion that Nova Cygni 1975 (V1500 Cyg) is a polar (of AM Her type). The CCD photometry confirms the constancy of the photometric period, which is again interpreted as an orbital cycle. The time-resolved MMT spectra make it possible to reconstruct, under several assumptions, the basic system parameters: M1 = 0.9 and M2 = 0.31 solar masses.
A beginner's guide to belief revision and truth maintenance systems
NASA Technical Reports Server (NTRS)
Mason, Cindy L.
1992-01-01
This brief note is intended to familiarize the non-TMS audience with some of the basic ideas surrounding classic TMS's (truth maintenance systems), namely the justification-based TMS and the assumption-based TMS. Topics of further interest include the relation between non-monotonic logics and TMS's, efficiency and search issues, complexity concerns, as well as the variety of TMS systems that have surfaced in the past decade or so. These include probabilistic-based TMS systems, fuzzy TMS systems, tri-valued belief systems, and so on.
The Nonlinear Dynamic Response of an Elastic-Plastic Thin Plate under Impulsive Loading,
1987-06-11
Among those numerical methods, the finite element method is the most effective one. The method presented in this paper is an "influence function" numerical… computational time is much less than with the finite element method. Its precision is also higher. II. Basic Assumption and the Influence Function of a Simply… calculation. (Fig. 1) 2. The Influence Function of a Simply Supported Plate. The motion differential equation of a thin plate can be written as D∇⁴w + ρh∂²w/∂t² = q(t) (1)
Dark energy cosmology with tachyon field in teleparallel gravity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Motavalli, H., E-mail: Motavalli@Tabrizu.ac.ir; Akbarieh, A. Rezaei; Nasiry, M.
2016-07-15
We construct a tachyon teleparallel dark energy model for a homogeneous and isotropic flat universe in which a tachyon as a non-canonical scalar field is non-minimally coupled to gravity in the framework of teleparallel gravity. The explicit forms of the potential and coupling functions are obtained under the assumption that the Lagrangian admits the Noether symmetry approach. The dynamical behavior of the basic cosmological observables is compared to recent observational data, which implies that the tachyon field may serve as a candidate for dark energy.
An interactive quality of work life model applied to organizational transition.
Knox, S; Irving, J A
1997-01-01
Most healthcare organizations in the United States are in the process of some type of organizational change or transition. Professional nurses and other healthcare providers practicing in U.S. healthcare delivery organizations are very aware of the dramatic effects of restructuring processes. A phenomenal amount of change and concern is occurring with organizational redesign, generating many questions and uncertainties. These transitions challenge the basic assumptions and principles guiding the practice of clinical and management roles in healthcare.
Principles of cost-benefit analysis for ERTS experiments, volumes 1 and 2
NASA Technical Reports Server (NTRS)
1973-01-01
The basic elements of a cost-benefit study are discussed along with special considerations for ERTS experiments. Elements required for a complete economic analysis of ERTS are considered to be: statement of objectives, specification of assumptions, enumeration of system alternatives, benefit analysis, cost analysis, nonefficiency considerations, and final system selection. A hypothetical cost-benefit example is presented with the assumed objective of an increase in remote sensing surveys of grazing lands to better utilize available forage to lower meat prices.
Understanding Business Models in Health Care.
Sharan, Alok D; Schroeder, Gregory D; West, Michael E; Vaccaro, Alexander R
2016-05-01
The increasing focus on the costs of care is forcing health care organizations to critically look at their basic set of processes and activities, to determine what type of value they can deliver. A business model describes the resources, processes, and cost assumptions that an organization makes that will lead to the delivery of a unique value proposition to a customer. As health care organizations are beginning to transform their structure in preparation for a value-based delivery system, understanding business model theory can help in the redesign process.
Alien plants confront expectations of climate change impacts.
Hulme, Philip E
2014-09-01
The success of alien plants in novel environments questions basic assumptions about the fate of native species under climate change. Aliens generally spread faster than the velocity of climate change, display considerable phenotypic plasticity as well as adaptation to new selection pressures, and their ranges are often shaped by biotic rather than climatic factors. Given that many native species also exhibit these attributes, their risk of extinction as a result of climate change might be overestimated. Copyright © 2014 Elsevier Ltd. All rights reserved.
2006-07-01
…and methamphetamine. Our basic assumption is that protective treatments alter both post-translational and translational events so as to reduce the… impact of voluntary running on trophic factor levels and the neurotoxic effects of 6-OHDA. Reportable Outcomes: • Like exercise, GDNF protects DA… also protects against the increased vulnerability to toxins caused by other stressors; and (4) the generality of our results with 6-OHDA to other…
The Effects of Military Assignments and Duties on the Marital Status of Navy Officers
2003-03-01
by phone or e-mail, by sending Officer Preference and Personal Information cards (NAVPERS 1301/1), by Super JASS (Job Advertising and Selection… basic assumptions: (1) each person tries to find a mate who maximizes his or her well-being, with well-being measured by the consumption of…
1987-08-01
HVAC duct hanger system over an extensive frequency range. The finite element, component mode synthesis, and statistical energy analysis methods are… (800-5,000 Hz) analysis was conducted with Statistical Energy Analysis (SEA) coupled with a closed-form harmonic beam analysis program. These… resonances may be obtained by using a finer frequency increment. Statistical Energy Analysis: The basic assumption used in SEA is that within each band
Automatic item generation implemented for measuring artistic judgment aptitude.
Bezruczko, Nikolaus
2014-01-01
Automatic item generation (AIG) is a broad class of methods that are being developed to address psychometric issues arising from internet and computer-based testing. In general, issues emphasize efficiency, validity, and diagnostic usefulness of large scale mental testing. Rapid prominence of AIG methods and their implicit perspective on mental testing is bringing painful scrutiny to many sacred psychometric assumptions. This report reviews basic AIG ideas, then presents conceptual foundations, image model development, and operational application to artistic judgment aptitude testing.
Gathara, David; Malla, Lucas; Ayieko, Philip; Karuri, Stella; Nyamai, Rachel; Irimu, Grace; van Hensbroek, Michael Boele; Allen, Elizabeth; English, Mike
2017-04-05
Hospital mortality data can inform planning for health interventions and may help optimize resource allocation if they are reliable and appropriately interpreted. However, such data are often not available in low-income countries, including Kenya. Data from the Clinical Information Network covering 12 county hospitals' paediatric admissions aged 2-59 months for the period September 2013 to March 2015 were used to describe mortality across differing contexts and to explore whether simple clinical characteristics used to classify severity of illness in common treatment guidelines are consistently associated with inpatient mortality. Regression models accounting for hospital identity and malaria prevalence (low or high) were used. Multiple imputation for missing data was based on a missing-at-random assumption, with sensitivity analyses based on pattern-mixture missing-not-at-random assumptions. The overall cluster-adjusted crude mortality rate across hospitals was 6.2%, with an almost 5-fold variation across sites (95% CI 4.9 to 7.8; range 2.1%-11.0%). Hospital identity was significantly associated with mortality. Clinical features included in guidelines for common diseases to assess severity of illness were consistently associated with mortality in multivariable analyses (AROC = 0.86). All-cause mortality is highly variable across hospitals and associated with clinical risk factors identified in disease-specific guidelines. A panel of these clinical features may provide a basic common data framework as part of improved health information systems to support evaluations of quality and outcomes of care at scale and inform health system strengthening efforts.
Numerical distance effect size is a poor metric of approximate number system acuity.
Chesney, Dana
2018-04-12
Individual differences in the ability to compare and evaluate nonsymbolic numerical magnitudes-approximate number system (ANS) acuity-are emerging as an important predictor in many research areas. Unfortunately, recent empirical studies have called into question whether a historically common ANS-acuity metric-the size of the numerical distance effect (NDE size)-is an effective measure of ANS acuity. NDE size has been shown to frequently yield divergent results from other ANS-acuity metrics. Given these concerns and the measure's past popularity, it behooves us to question whether the use of NDE size as an ANS-acuity metric is theoretically supported. This study seeks to address this gap in the literature by using modeling to test the basic assumption underpinning use of NDE size as an ANS-acuity metric: that larger NDE size indicates poorer ANS acuity. This assumption did not hold up under test. Results demonstrate that the theoretically ideal relationship between NDE size and ANS acuity is not linear, but rather resembles an inverted J-shaped distribution, with the inflection points varying based on precise NDE task methodology. Thus, depending on specific methodology and the distribution of ANS acuity in the tested population, positive, negative, or null correlations between NDE size and ANS acuity could be predicted. Moreover, peak NDE sizes would be found for near-average ANS acuities on common NDE tasks. This indicates that NDE size has limited and inconsistent utility as an ANS-acuity metric. Past results should be interpreted on a case-by-case basis, considering both specifics of the NDE task and expected ANS acuity of the sampled population.
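The modeling logic can be illustrated with a toy scalar-variability model (an assumption on our part, not necessarily the study's exact parameterization): representations are Gaussian with SD proportional to numerosity, and NDE size is the error-rate gap between close and far comparisons. Sweeping the Weber fraction w shows the inverted-U relation described, with small NDEs at both very good and very poor acuity:

```python
import numpy as np
from scipy.stats import norm

def p_error(n1, n2, w):
    """Error rate comparing n1 vs n2 under Gaussian representations
    with scalar variability (SD = w * n)."""
    return norm.cdf(-abs(n2 - n1) / (w * np.hypot(n1, n2)))

# NDE size: error-rate difference between close (7 vs 8) and far (5 vs 10).
for w in [0.05, 0.15, 0.3, 0.6, 1.2, 2.5]:
    nde = p_error(7, 8, w) - p_error(5, 10, w)
    print(f"w = {w:4.2f}  NDE size = {nde:.3f}")
```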
A robust two-way semi-linear model for normalization of cDNA microarray data
Wang, Deli; Huang, Jian; Xie, Hehuang; Manzella, Liliana; Soares, Marcelo Bento
2005-01-01
Background: Normalization is a basic step in microarray data analysis. A proper normalization procedure ensures that the intensity ratios provide meaningful measures of relative expression values. Methods: We propose a robust semiparametric method in a two-way semi-linear model (TW-SLM) for normalization of cDNA microarray data. This method does not make the usual assumptions underlying some of the existing methods. For example, it does not assume that: (i) the percentage of differentially expressed genes is small; or (ii) the numbers of up- and down-regulated genes are about the same, as required in the LOWESS normalization method. We conduct simulation studies to evaluate the proposed method and use a real data set from a specially designed microarray experiment to compare the performance of the proposed method with that of the LOWESS normalization approach. Results: The simulation results show that the proposed method performs better than the LOWESS normalization method in terms of mean square errors for estimated gene effects. The results of analysis of the real data set also show that the proposed method yields more consistent results between the direct and the indirect comparisons and also can detect more differentially expressed genes than the LOWESS method. Conclusions: Our simulation studies and the real data example indicate that the proposed robust TW-SLM method works at least as well as the LOWESS method and works better when the underlying assumptions for the LOWESS method are not satisfied. Therefore, it is a powerful alternative to the existing normalization methods. PMID:15663789
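For reference, the LOWESS baseline the authors compare against fits an intensity-dependent trend to the MA-plot and subtracts it. A minimal sketch using statsmodels follows; the simulated dye bias and all parameter choices are assumptions for illustration:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
# Simulated two-channel intensities with an intensity-dependent dye bias.
G = rng.lognormal(mean=7, sigma=1, size=2000)
R = G * np.exp(0.3 * np.sin(np.log2(G) / 2)) * rng.lognormal(0, 0.1, 2000)

M = np.log2(R / G)                      # log-ratio
A = 0.5 * np.log2(R * G)                # average log-intensity

# LOWESS normalization: subtract the fitted intensity-dependent trend.
trend = sm.nonparametric.lowess(M, A, frac=0.3, return_sorted=False)
M_norm = M - trend
print("mean |M| before:", np.mean(np.abs(M)), "after:", np.mean(np.abs(M_norm)))
```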
Sliding friction between polymer surfaces: A molecular interpretation
NASA Astrophysics Data System (ADS)
Allegra, Giuseppe; Raos, Guido
2006-04-01
For two contacting rigid bodies, the friction force F is proportional to the normal load and independent of the macroscopic contact area and relative velocity V (Amontons' law). With two mutually sliding polymer samples, the surface irregularities transmit deformation to the underlying material. Energy loss along the deformation cycles is responsible for the friction force, which now appears to depend strongly on V [see, e.g., N. Maeda et al., Science 297, 379 (2002)]. We base our theoretical interpretation on the assumption that polymer chains are mainly subjected to oscillatory "reptation" along their "tubes." At high deformation frequencies—i.e., with a large sliding velocity V—the internal viscosity due to the rotational energy barriers around chain bonds hinders intramolecular mobility. As a result, energy dissipation and the correlated friction force strongly diminish at large V. Derived from a linear differential equation for chain dynamics, our results are basically consistent with the experimental data by Maeda et al. [Science 297, 379 (2002)] on modified polystyrene. Although the bulk polymer is below Tg, we regard the first few chain layers below the surface to be in the liquid state. In particular, the observed maximum of F vs V is consistent with physically reasonable values of the molecular parameters. As a general result, the ratio F/V is a steadily decreasing function of V, tending to V⁻² for large velocities. We evaluate a much smaller friction for a cross-linked polymer under the assumption that the junctions are effectively immobile, also in agreement with the experimental results of Maeda et al. [Science 297, 379 (2002)].
NASA Technical Reports Server (NTRS)
Bast, Callie Corinne Scheidt
1994-01-01
This thesis presents the on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.
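To make the idea of a randomized multifactor equation concrete, here is a generic Monte Carlo sketch of multiplicative strength degradation with normally distributed empirical exponents. The functional form and every numerical value are assumptions for illustration; they are not the calibrated Inconel 718 constants or the exact PROMISS formulation:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000

# Generic multiplicative strength-degradation model (illustrative form):
# S/S0 = prod_i [(x_iF - x_i) / (x_iF - x_i0)]^(a_i), with the empirical
# exponents a_i treated as normally distributed random variables.
def lifetime_strength(x, x0, xF, a_mean, a_sd):
    a = rng.normal(a_mean, a_sd, size=(N, len(x)))
    ratio = (np.asarray(xF) - np.asarray(x)) / (np.asarray(xF) - np.asarray(x0))
    return np.prod(ratio ** a, axis=1)

# Three effects (e.g., temperature, fatigue cycles, creep time); all
# current values, reference values, and exponents are hypothetical.
S = lifetime_strength(x=[800, 1e4, 50], x0=[20, 1, 0],
                      xF=[1300, 1e7, 1000], a_mean=[0.2, 0.1, 0.15],
                      a_sd=[0.02, 0.01, 0.02])
print("median S/S0:", np.median(S), " 5th percentile:", np.percentile(S, 5))
```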
Newsome, R; Tran, N; Paoli, G M; Jaykus, L A; Tompkin, B; Miliotis, M; Ruthman, T; Hartnett, E; Busta, F F; Petersen, B; Shank, F; McEntire, J; Hotchkiss, J; Wagner, M; Schaffner, D W
2009-03-01
Through a cooperative agreement with the U.S. Food and Drug Administration, the Institute of Food Technologists developed a risk-ranking framework prototype to enable comparison of microbiological and chemical hazards in foods and to assist policy makers, risk managers, risk analysts, and others in determining the relative public health impact of specific hazard-food combinations. The prototype is a bottom-up system based on assumptions that incorporate expert opinion/insight with a number of exposure and hazard-related risk criteria variables, which are propagated forward with food intake data to produce risk-ranking determinations. The prototype produces a semi-quantitative comparative assessment of food safety hazards and the impacts of hazard control measures. For a specific hazard-food combination the prototype can produce a single metric: a final risk value expressed as annual pseudo-disability adjusted life years (pDALY). The pDALY is a harmonization of the very different dose-response relationships observed for chemicals and microbes. The prototype was developed on 2 platforms, a web-based user interface and an Analytica(R) model (Lumina Decision Systems, Los Gatos, Calif., U.S.A.). Comprising Visual Basic language, the web-based platform facilitates data input and allows use concurrently from multiple locations. The Analytica model facilitates visualization of the logic flow, interrelationship of input and output variables, and calculations/algorithms comprising the prototype. A variety of sortable risk-ranking reports and summary information can be generated for hazard-food pairs, showing hazard and dose-response assumptions and data, per capita consumption by population group, and the annual pDALY.
Collins, Scott F.; Marcarelli, Amy M.; Baxter, Colden V.; Wipfli, Mark S.
2015-01-01
We critically evaluate some of the key ecological assumptions underpinning the use of nutrient replacement as a means of recovering salmon populations and a range of other organisms thought to be linked to productive salmon runs. These assumptions include: (1) nutrient mitigation mimics the ecological roles of salmon, (2) mitigation is needed to replace salmon-derived nutrients and stimulate primary and invertebrate production in streams, and (3) food resources in rearing habitats limit populations of salmon and resident fishes. First, we call into question assumption one because an array of evidence points to the multi-faceted role played by spawning salmon, including disturbance via redd-building, nutrient recycling by live fish, and consumption by terrestrial consumers. Second, we show that assumption two may require qualification based upon a more complete understanding of nutrient cycling and productivity in streams. Third, we evaluate the empirical evidence supporting food limitation of fish populations and conclude it has been only weakly tested. On the basis of this assessment, we urge caution in the application of nutrient mitigation as a management tool. Although applications of nutrients and other materials intended to mitigate for lost or diminished runs of Pacific salmon may trigger ecological responses within treated ecosystems, contributions of these activities toward actual mitigation may be limited.
Evaluation of 2D shallow-water model for spillway flow with a complex geometry
USDA-ARS?s Scientific Manuscript database
Although the two-dimensional (2D) shallow water model is formulated based on several assumptions, such as a hydrostatic pressure distribution and negligible vertical velocity, as a simple alternative to the complex 3D model it has been used to compute water flows in which these assumptions may be …
ERIC Educational Resources Information Center
Diaz, Juan Jose; Handa, Sudhanshu
2006-01-01
Not all policy questions can be addressed by social experiments. Nonexperimental evaluation methods provide an alternative to experimental designs but their results depend on untestable assumptions. This paper presents evidence on the reliability of propensity score matching (PSM), which estimates treatment effects under the assumption of…
"Touch Me, Like Me": Testing an Encounter Group Assumption
ERIC Educational Resources Information Center
Boderman, Alvin; And Others
1972-01-01
An experiment to test an encounter group assumption that touching increases interpersonal attraction was conducted. College women were randomly assigned to a touch or no-touch condition. A comparison of total evaluation scores verified the hypothesis: subjects who touched the accomplice perceived her as a more attractive person than those who did…
Own-Language Use in Language Teaching and Learning
ERIC Educational Resources Information Center
Hall, Graham; Cook, Guy
2012-01-01
Until recently, the assumption of the language-teaching literature has been that new languages are best taught and learned monolingually, without the use of the students' own language(s). In recent years, however, this monolingual assumption has been increasingly questioned, and a re-evaluation of teaching that relates the language being taught to…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-25
… actuarial and economic assumptions and methods by which Trustees might more accurately project health… The Panel will discuss the long-range (75-year) projection methods and assumptions in… making recommendations to the Medicare Trustees on how the Trustees might more accurately project health…
Biological control agents elevate hantavirus by subsidizing deer mouse populations
Dean E. Pearson; Ragan M. Callaway
2006-01-01
Biological control of exotic invasive plants using exotic insects is practiced under the assumption that biological control agents are safe if they do not directly attack non-target species. We tested this assumption by evaluating the potential for two host-specific biological control agents (Urophora spp.), widely established in North America for spotted...
ERIC Educational Resources Information Center
Mazor, Kathleen M.; Ockene, Judith K.; Rogers, H. Jane; Carlin, Michele M.; Quirk, Mark E.
2005-01-01
Many efforts to teach and evaluate physician-patient communication are based on two assumptions: first, that communication can be conceptualized as consisting of specific observable behaviors, and second, that physicians who exhibit certain behaviors are more effective in communicating with patients. These assumptions are usually implicit, and are…
E-Basics: Online Basic Training in Program Evaluation
ERIC Educational Resources Information Center
Silliman, Ben
2016-01-01
E-Basics is an online training in program evaluation concepts and skills designed for youth development professionals, especially those working in nonformal science education. Ten hours of online training in seven modules is designed to prepare participants for mentoring and applied practice, mastery, and/or team leadership in program evaluation.…
Interpretation of correlations in clinical research.
Hung, Man; Bounsanga, Jerry; Voss, Maren Wright
2017-11-01
Critically analyzing research is a key skill in evidence-based practice and requires knowledge of research methods, results interpretation, and applications, all of which rely on a foundation based in statistics. Evidence-based practice makes high demands on trained medical professionals to interpret an ever-expanding array of research evidence. As clinical training emphasizes medical care rather than statistics, it is useful to review the basics of statistical methods and what they mean for interpreting clinical studies. We reviewed the basic concepts of correlational associations, violations of normality, unobserved variable bias, sample size, and alpha inflation. The foundations of causal inference were discussed and sound statistical analyses were examined. We discuss four ways in which correlational analysis is misused, including causal inference overreach, over-reliance on significance, alpha inflation, and sample size bias. Recently published studies in the medical field provide evidence of causal assertion overreach drawn from correlational findings. The findings present a primer on the assumptions and nature of correlational methods of analysis and urge clinicians to exercise appropriate caution as they critically analyze the evidence before them and evaluate evidence that supports practice. Critically analyzing new evidence requires statistical knowledge in addition to clinical knowledge. Studies can overstate relationships, expressing causal assertions when only correlational evidence is available. Failure to account for the effect of sample size in the analyses tends to overstate the importance of predictive variables. It is important not to overemphasize statistical significance without consideration of effect size and whether differences could be considered clinically meaningful.
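The sample-size point can be demonstrated in a few lines: with a negligible true correlation (about 0.03, a value assumed here for illustration), statistical significance appears once n is large enough, even though the effect size remains clinically meaningless. Exact numbers vary with the random seed:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
for n in [50, 500, 5000, 50_000]:
    x = rng.normal(size=n)
    y = 0.03 * x + rng.normal(size=n)   # true r is about 0.03: negligible
    r, p = pearsonr(x, y)
    print(f"n = {n:6d}  r = {r:+.3f}  p = {p:.4f}  'significant': {p < 0.05}")
```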
Radiology applications of financial accounting.
Leibenhaut, Mark H
2005-03-01
A basic knowledge of financial accounting can help radiologists analyze business opportunities and examine the potential impacts of new technology or predict the adverse consequences of new competitors entering their service area. The income statement, balance sheet, and cash flow statement are the three basic financial statements that document the current financial position of the radiology practice and allow managers to monitor the ongoing financial operations of the enterprise. Pro forma, or hypothetical, financial statements can be generated to predict the financial impact of specific business decisions or investments on the profitability of the practice. Sensitivity analysis, or what-if scenarios, can be performed to determine the potential impact of changing key revenue, investment, operating cost or financial assumptions. By viewing radiology as both a profession and a business, radiologists can optimize their use of scarce economic resources and maximize the return on their financial investments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Lijuan; Gonder, Jeff; Burton, Evan
This study evaluates the costs and benefits associated with the use of a plug-in hybrid electric bus and determines the cost effectiveness relative to a conventional bus and a hybrid electric bus. A sensitivity sweep analysis was performed over a number of different battery sizes, charging powers, and charging stations. The net present value was calculated for each vehicle design and provided the basis for the design evaluation. In all cases, given present-day economic assumptions, the conventional bus achieved the lowest net present value, while the optimal plug-in hybrid electric bus scenario reached lower lifetime costs than the hybrid electric bus. The study also performed parameter sensitivity analysis under low market potential assumptions and high market potential assumptions. The net present value of the plug-in hybrid electric bus is close to that of the conventional bus.
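The net-present-value comparison at the core of such a study reduces to discounting annual cash flows. A minimal sketch follows; the purchase and operating costs and the 5% discount rate are hypothetical placeholders, not the study's inputs:

```python
def npv(rate, cash_flows):
    """Net present value of a list of annual cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical 12-year bus ownership costs (negative = expense).
years = 12
designs = {
    "conventional": [-500_000] + [-60_000] * years,
    "hybrid":       [-700_000] + [-45_000] * years,
    "plug-in":      [-750_000] + [-38_000] * years,   # includes charging
}
for name, flows in designs.items():
    print(f"{name:12s} NPV at 5%: {npv(0.05, flows):,.0f}")
```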
Siemann, Julia; Petermann, Franz
2018-01-01
This review reconciles past findings on numerical processing with key assumptions of the most predominant model of arithmetic in the literature, the Triple Code Model (TCM). This is implemented by reporting diverse findings in the literature ranging from behavioral studies on basic arithmetic operations over neuroimaging studies on numerical processing to developmental studies concerned with arithmetic acquisition, with a special focus on developmental dyscalculia (DD). We evaluate whether these studies corroborate the model and discuss possible reasons for contradictory findings. A separate section is dedicated to the transfer of TCM to arithmetic development and to alternative accounts focusing on developmental questions of numerical processing. We conclude with recommendations for future directions of arithmetic research, raising questions that require answers in models of healthy as well as abnormal mathematical development. This review assesses the leading model in the field of arithmetic processing (Triple Code Model) by presenting knowledge from interdisciplinary research. It assesses the observed contradictory findings and integrates the resulting opposing viewpoints. The focus is on the development of arithmetic expertise as well as abnormal mathematical development. The original aspect of this article is that it points to a gap in research on these topics and provides possible solutions for future models. Copyright © 2017 Elsevier Ltd. All rights reserved.
Probability in reasoning: a developmental test on conditionals.
Barrouillet, Pierre; Gauffroy, Caroline
2015-04-01
Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related with the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
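The Equation is easy to state numerically. Given hypothetical truth-table frequencies, P(if p then q) under the probabilistic reading is the conditional probability P(q|p), which generally differs from the probability of the material conditional:

```python
# Truth-table frequencies for "if p then q" over 100 hypothetical cases:
# (p, q): 30, (p, not-q): 10, (not-p, q): 20, (not-p, not-q): 40
pq, p_notq, notp_q, notp_notq = 30, 10, 20, 40
total = pq + p_notq + notp_q + notp_notq

cond_prob = pq / (pq + p_notq)                   # the Equation: P(q|p)
material = (pq + notp_q + notp_notq) / total     # P(not-p or q)
print(f"P(q|p) = {cond_prob:.2f}  vs  P(material conditional) = {material:.2f}")
```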
The "common good" phenomenon: Why similarities are positive and differences are negative.
Alves, Hans; Koch, Alex; Unkelbach, Christian
2017-04-01
Positive attributes are more prevalent than negative attributes in the social environment. From this basic assumption, 2 implications follow that have been overlooked thus far: Positive compared with negative attributes are more likely to be shared by individuals, and people's shared attributes (similarities) are more positive than their unshared attributes (differences). Consequently, similarity-based comparisons should lead to more positive evaluations than difference-based comparisons. We formalized our probabilistic reasoning in a model and tested its predictions in a simulation and 8 experiments (N = 1,181). When participants generated traits about 2 target persons, positive compared with negative traits were more likely to be shared by the targets (Experiment 1a) and by other participants' targets (Experiment 1b). Conversely, searching for targets' shared traits resulted in more positive traits than searching for unshared traits (Experiments 2, 4a, and 4b). In addition, positive traits were more accessible than negative traits among shared traits but not among unshared traits (Experiment 3). Finally, shared traits were only more positive when positive traits were indeed prevalent (Experiments 5 and 6). The current framework has a number of implications for comparison processes and provides a new interpretation of well-known evaluative asymmetries such as intergroup bias and self-superiority effects. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
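The probabilistic core of the argument can be reproduced in a short simulation. If positive traits are simply more prevalent (possessed with higher probability q), traits shared by two targets are disproportionately positive, because sharing scales with q² while differing scales with 2q(1-q). All prevalence values below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
n_pos, n_neg = 30, 30          # equal-sized pools of candidate traits
q_pos, q_neg = 0.6, 0.3        # positive traits are more prevalent (assumed)

shared = np.zeros(2)           # counts: [negative, positive] among similarities
unshared = np.zeros(2)         # counts among differences
for _ in range(20_000):
    for is_pos, n, q in [(1, n_pos, q_pos), (0, n_neg, q_neg)]:
        a = rng.random(n) < q  # traits possessed by target A
        b = rng.random(n) < q  # traits possessed by target B
        shared[is_pos] += np.sum(a & b)
        unshared[is_pos] += np.sum(a ^ b)

print("share of positive traits among similarities:", shared[1] / shared.sum())
print("share of positive traits among differences: ", unshared[1] / unshared.sum())
```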
The Development and Validation of a Rapid Assessment Tool of Primary Care in China
Mei, Jie; Liang, Yuan; Shi, LeiYu; Zhao, JingGe; Wang, YuTan; Kuang, Li
2016-01-01
Introduction. With Chinese health care reform increasingly emphasizing the importance of primary care, the need for a tool to evaluate primary care performance and service delivery is clear. This study presents a methodology for a rapid assessment of primary care organizations and service delivery in China. Methods. The study translated and adapted the Primary Care Assessment Tool-Adult Edition (PCAT-AE) into a Chinese version to measure core dimensions of primary care, namely, first contact, continuity, comprehensiveness, and coordination. A cross-sectional survey was conducted to assess the validity and reliability of the Chinese Rapid Primary Care Assessment Tool (CR-PCAT). Eight community health centers in Guangdong province have been selected to participate in the survey. Results. A total of 1465 effective samples were included for data analysis. Eight items were eliminated following principal component analysis and reliability testing. The principal component analysis extracted five multiple-item scales (first contact utilization, first contact accessibility, ongoing care, comprehensiveness, and coordination). The tests of scaling assumptions were basically met. Conclusion. The standard psychometric evaluation indicates that the scales have achieved relatively good reliability and validity. The CR-PCAT provides a rapid and reliable measure of four core dimensions of primary care, which could be applied in various scenarios. PMID:26885509
Discrete Neural Signatures of Basic Emotions.
Saarimäki, Heini; Gotsopoulos, Athanasios; Jääskeläinen, Iiro P; Lampinen, Jouko; Vuilleumier, Patrik; Hari, Riitta; Sams, Mikko; Nummenmaa, Lauri
2016-06-01
Categorical models of emotions posit neurally and physiologically distinct human basic emotions. We tested this assumption by using multivariate pattern analysis (MVPA) to classify brain activity patterns of 6 basic emotions (disgust, fear, happiness, sadness, anger, and surprise) in 3 experiments. Emotions were induced with short movies or mental imagery during functional magnetic resonance imaging. MVPA accurately classified emotions induced by both methods, and the classification generalized from one induction condition to another and across individuals. Brain regions contributing most to the classification accuracy included medial and inferior lateral prefrontal cortices, frontal pole, precentral and postcentral gyri, precuneus, and posterior cingulate cortex. Thus, specific neural signatures across these regions hold representations of different emotional states in multimodal fashion, independently of how the emotions are induced. Similarity of subjective experiences between emotions was associated with similarity of neural patterns for the same emotions, suggesting a direct link between activity in these brain regions and the subjective emotional experience. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
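The analysis logic (train a multivariate classifier on activity patterns, then test generalization across individuals) can be sketched with scikit-learn. Synthetic features stand in for voxel patterns here; the subject grouping, classifier choice, and all sizes are illustrative assumptions, not the study's pipeline:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, GroupKFold

# Synthetic stand-in for voxel patterns: 120 trials x 500 voxels, 6 emotions.
X, y = make_classification(n_samples=120, n_features=500, n_informative=60,
                           n_classes=6, n_clusters_per_class=1, random_state=0)
subjects = np.repeat(np.arange(10), 12)   # 10 'subjects', 12 trials each

clf = make_pipeline(StandardScaler(), LinearSVC(max_iter=5000))
# Leave-one-subject-out style cross-validation tests whether the emotion
# classifier generalizes across individuals.
scores = cross_val_score(clf, X, y, groups=subjects, cv=GroupKFold(n_splits=10))
print(f"accuracy = {scores.mean():.2f} (chance = {1/6:.2f})")
```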
Neutrino masses and mixings as an evidence of GUT, and the impact to (flavor changing) nucleon decay
NASA Astrophysics Data System (ADS)
Maekawa, Nobuhiro; Muramatsu, Yu
2017-11-01
First, we see that the observed data on quark and lepton masses and mixings, which has been completed by adding neutrino data, can be a qualitative signature of SU(5) grand unified theory (GUT). Actually, a single assumption, that 10 fields induce a stronger hierarchy in Yukawa couplings than 5̄ fields, can explain all hierarchical structures of quark and lepton masses and mixings. Second, we see the attractiveness of E6 GUT, in which the above assumption in SU(5) GUT can be derived and, as a result, various Yukawa hierarchies of quarks and leptons can be obtained from only one basic hierarchy. Third, we compare the predictions for nucleon decay among several GUTs with SU(5), SO(10), and E6 unification groups which satisfy the above important assumption for the Yukawa hierarchy, since this understanding of Yukawa structures reduces the ambiguities in the prediction of nucleon decay via superheavy gauge boson exchange. We stress the importance of observations for several decay modes. One of them is flavor-changing nucleon decay, for example, p → π⁰μ⁺, which is the decay mode for which Super-Kamiokande has reported two events in the signal region. This article is based on our works in Refs. [1, 2].
An eco-epidemiological system with infected prey and predator subject to the weak Allee effect.
Sasmal, Sourav Kumar; Chattopadhyay, Joydev
2013-12-01
In this article, we propose a general prey–predator model with disease in prey and predator subject to the weak Allee effects. We make the following assumptions: (i) infected prey competes for resources but does not contribute to reproduction; and (ii) in comparison to the consumption of the susceptible prey, consumption of infected prey would contribute less or negatively to the growth of predator. Based on these assumptions, we provide basic dynamic properties for the full model and corresponding submodels with and without the Allee effects. By comparing the disease free submodels (susceptible prey–predator model) with and without the Allee effects, we conclude that the Allee effects can create or destroy the interior attractors. This enables us to obtain the complete dynamics of the full model and conclude that the model has only one attractor (only susceptible prey survives or susceptible-infected coexist), or two attractors (bi-stability with only susceptible prey and susceptible prey–predator coexist or susceptible prey-infected prey coexists and susceptible prey–predator coexist). This model does not support the coexistence of susceptible-infected-predator, which is caused by the assumption that infected population contributes less or are harmful to the growth of predator in comparison to the consumption of susceptible prey.
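A numerical sketch of this class of model is given below. The weak Allee effect enters as a multiplicative S/(S + theta) term, and the stated assumptions are encoded as: infected prey do not reproduce, and they convert to predator growth less efficiently (e2 < e1). The functional forms and every parameter value are our illustrative assumptions, not the authors' exact system:

```python
import numpy as np
from scipy.integrate import odeint

# Susceptible prey (S), infected prey (I), predator (P); all illustrative.
def model(y, t, r, K, theta, lam, a1, a2, e1, e2, mu, d):
    S, I, P = y
    dS = r * S * (1 - (S + I) / K) * (S / (S + theta)) - lam * S * I - a1 * S * P
    dI = lam * S * I - a2 * I * P - mu * I          # infected do not reproduce
    dP = e1 * a1 * S * P + e2 * a2 * I * P - d * P  # e2 < e1 per assumption (ii)
    return [dS, dI, dP]

t = np.linspace(0, 500, 5001)
sol = odeint(model, [0.5, 0.1, 0.1], t,
             args=(1.0, 1.0, 0.1, 0.8, 0.5, 0.4, 0.5, 0.05, 0.2, 0.15))
print("final state (S, I, P):", sol[-1])
```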
Equivalence of binormal likelihood-ratio and bi-chi-squared ROC curve models
Hillis, Stephen L.
2015-01-01
A basic assumption for a meaningful diagnostic decision variable is that there is a monotone relationship between it and its likelihood ratio. This relationship, however, generally does not hold for a decision variable that results in a binormal ROC curve. As a result, receiver operating characteristic (ROC) curve estimation based on the assumption of a binormal ROC-curve model produces improper ROC curves that have “hooks,” are not concave over the entire domain, and cross the chance line. Although in practice this “improperness” is usually not noticeable, sometimes it is evident and problematic. To avoid this problem, Metz and Pan proposed basing ROC-curve estimation on the assumption of a binormal likelihood-ratio (binormal-LR) model, which states that the decision variable is an increasing transformation of the likelihood-ratio function of a random variable having normal conditional diseased and nondiseased distributions. However, their development is not easy to follow. I show that the binormal-LR model is equivalent to a bi-chi-squared model in the sense that the families of corresponding ROC curves are the same. The bi-chi-squared formulation provides an easier-to-follow development of the binormal-LR ROC curve and its properties in terms of well-known distributions. PMID:26608405
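The improperness is easy to exhibit numerically. Under the binormal model the ROC curve is TPR = Φ(a + b·Φ⁻¹(FPR)), and the slope at any operating point equals the likelihood ratio, which must decrease monotonically for a proper curve. With b ≠ 1 the slope turns back upward, producing the "hook" (the a and b values below are illustrative):

```python
import numpy as np
from scipy.stats import norm

a, b = 1.0, 0.5          # illustrative binormal parameters (b != 1)
fpr = np.linspace(1e-4, 1 - 1e-4, 9)
z = norm.ppf(fpr)
tpr = norm.cdf(a + b * z)
# Slope of the ROC curve equals the likelihood ratio at that threshold;
# for a proper ROC it must decrease monotonically as FPR grows.
slope = b * norm.pdf(a + b * z) / norm.pdf(z)
for f, tp, s in zip(fpr, tpr, slope):
    print(f"FPR = {f:.4f}  TPR = {tp:.4f}  slope(LR) = {s:.3f}")
```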
Stochastic analysis of surface roughness models in quantum wires
NASA Astrophysics Data System (ADS)
Nedjalkov, Mihail; Ellinghaus, Paul; Weinbub, Josef; Sadi, Toufik; Asenov, Asen; Dimov, Ivan; Selberherr, Siegfried
2018-07-01
We present a signed particle computational approach for the Wigner transport model and use it to analyze the electron state dynamics in quantum wires focusing on the effect of surface roughness. Usually surface roughness is considered as a scattering model, accounted for by the Fermi Golden Rule, which relies on approximations like statistical averaging and in the case of quantum wires incorporates quantum corrections based on the mode space approach. We provide a novel computational approach to enable physical analysis of these assumptions in terms of phase space and particles. Utilized is the signed particles model of Wigner evolution, which, besides providing a full quantum description of the electron dynamics, enables intuitive insights into the processes of tunneling, which govern the physical evolution. It is shown that the basic assumptions of the quantum-corrected scattering model correspond to the quantum behavior of the electron system. Of particular importance is the distribution of the density: Due to the quantum confinement, electrons are kept away from the walls, which is in contrast to the classical scattering model. Further quantum effects are retardation of the electron dynamics and quantum reflection. Far from equilibrium the assumption of homogeneous conditions along the wire breaks even in the case of ideal wire walls.
Whose drag is it anyway? Drag kings and monarchy in the UK.
Willox, Annabelle
2002-01-01
This chapter will show that the term "drag" in drag queen has a different meaning, history and value to the term "drag" in drag king. By exposing this basic, yet fundamental, difference this paper will expose the problems inherent in the assumption of parity between the two forms of drag. An exposition of how camp has been used to comprehend and theorise drag queens will facilitate an understanding of the parasitic interrelationship between camp and drag queen performances, while a critique of "Towards a Butch-Femme Aesthetic," by Sue Ellen Case, will point out the problematic assumptions made about camp when attributed to a cultural location different from that of the drag queen. By interrogating the historical, cultural and theoretical similarities and differences between drag kings, butches, drag queens and femmes this paper will expose the flawed assumption that camp can be attributed to all of the above without proviso, and hence expose why drag has a fundamentally different contextual meaning for kings and queens. This chapter will conclude by examining the work of both Judith Halberstam and Biddy Martin and the practical examples of drag king and queen performances provided at the UK drag contest held at The Fridge in Brixton, London on 23 June 1999.
Why is metal bioaccumulation so variable? Biodynamics as a unifying concept
Luoma, Samuel N.; Rainbow, Philip S.
2005-01-01
Ecological risks from metal contaminants are difficult to document because responses differ among species, threats differ among metals, and environmental influences are complex. Unifying concepts are needed to better tie together such complexities. Here we suggest that a biologically based conceptualization, the biodynamic model, provides the necessary unification for a key aspect in risk: metal bioaccumulation (internal exposure). The model is mechanistically based, but empirically considers geochemical influences, biological differences, and differences among metals. Forecasts from the model agree closely with observations from nature, validating its basic assumptions. The biodynamic metal bioaccumulation model combines targeted, high-quality geochemical analyses from a site of interest with parametrization of key physiological constants for a species from that site. The physiological parameters include metal influx rates from water, influx rates from food, rate constants of loss, and growth rates (when high). We compiled results from 15 publications that forecast species-specific bioaccumulation, and compare the forecasts to bioaccumulation data from the field. These data consider concentrations that cover 7 orders of magnitude. They include 7 metals and 14 species of animals from 3 phyla and 11 marine, estuarine, and freshwater environments. The coefficient of determination (R2) between forecasts and independently observed bioaccumulation from the field was 0.98. Most forecasts agreed with observations within 2-fold. The agreement suggests that the basic assumptions of the biodynamic model are tenable. A unified explanation of metal bioaccumulation sets the stage for a realistic understanding of toxicity and ecological effects of metals in nature.
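At steady state, the biodynamic model's prediction reduces to a balance of uptake from water and food against efflux and growth dilution: Css = (ku·Cw + AE·IR·Cf) / (ke + kg). The sketch below uses this standard formulation with invented parameter values for a hypothetical organism, not the parameter sets from the 15 compiled studies:

```python
def steady_state_metal(ku, Cw, AE, IR, Cf, ke, kg=0.0):
    """Biodynamic steady-state tissue concentration (ug/g):
    uptake from water + uptake from food, balanced by efflux and growth
    dilution: Css = (ku*Cw + AE*IR*Cf) / (ke + kg)."""
    return (ku * Cw + AE * IR * Cf) / (ke + kg)

# Illustrative values for a hypothetical bivalve and metal:
Css = steady_state_metal(ku=0.5,    # L/g/day, uptake rate from water
                         Cw=0.02,   # ug/L, dissolved metal concentration
                         AE=0.4,    # assimilation efficiency from food
                         IR=0.3,    # g/g/day, ingestion rate
                         Cf=10.0,   # ug/g, metal concentration in food
                         ke=0.02,   # 1/day, efflux rate constant
                         kg=0.005)  # 1/day, growth dilution
print(f"predicted steady-state concentration: {Css:.1f} ug/g")
```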
Phillips, Steven; Wilson, William H.
2011-01-01
A complete theory of cognitive architecture (i.e., the basic processes and modes of composition that together constitute cognitive behaviour) must explain the systematicity property—why our cognitive capacities are organized into particular groups of capacities, rather than some other, arbitrary collection. The classical account supposes: (1) syntactically compositional representations; and (2) processes that are sensitive to—compatible with—their structure. Classical compositionality, however, does not explain why these two components must be compatible; they are only compatible by the ad hoc assumption (convention) of employing the same mode of (concatenative) compositionality (e.g., prefix/postfix, where a relation symbol is always prepended/appended to the symbols for the related entities). Architectures employing mixed modes do not support systematicity. Recently, we proposed an alternative explanation without ad hoc assumptions, using category theory. Here, we extend our explanation to domains that are quasi-systematic (e.g., aspects of most languages), where the domain includes some but not all possible combinations of constituents. The central category-theoretic construct is an adjunction involving pullbacks, where the primary focus is on the relationship between processes modelled as functors, rather than the representations. A functor is a structure-preserving map (or construction, for our purposes). An adjunction guarantees that the only pairings of functors are the systematic ones. Thus, (quasi-)systematicity is a necessary consequence of a categorial cognitive architecture whose basic processes are functors that participate in adjunctions. PMID:21857816
Jackson, Charlotte; Mangtani, Punam; Hawker, Jeremy; Olowokure, Babatunde; Vynnycky, Emilia
2014-01-01
School closure is a potential intervention during an influenza pandemic and has been investigated in many modelling studies. To systematically review the effects of school closure on influenza outbreaks as predicted by simulation studies. We searched Medline and Embase for relevant modelling studies published by the end of October 2012, and handsearched key journals. We summarised the predicted effects of school closure on the peak and cumulative attack rates and the duration of the epidemic. We investigated how these predictions depended on the basic reproduction number, the timing and duration of closure and the assumed effects of school closures on contact patterns. School closures were usually predicted to be most effective if they caused large reductions in contact, if transmissibility was low (e.g. a basic reproduction number <2), and if attack rates were higher in children than in adults. The cumulative attack rate was expected to change less than the peak, but quantitative predictions varied (e.g. reductions in the peak were frequently 20-60% but some studies predicted >90% reductions or even increases under certain assumptions). This partly reflected differences in model assumptions, such as those regarding population contact patterns. Simulation studies suggest that school closure can be a useful control measure during an influenza pandemic, particularly for reducing peak demand on health services. However, it is difficult to accurately quantify the likely benefits. Further studies of the effects of reactive school closures on contact patterns are needed to improve the accuracy of model predictions.
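The qualitative finding, that closures mainly blunt the peak when transmissibility is modest, can be reproduced with a minimal SIR model in which the contact rate is temporarily reduced during a closure window. The 40% contact reduction, the timing, and all other values are illustrative assumptions:

```python
import numpy as np
from scipy.integrate import odeint

R0, gamma = 1.8, 1 / 3.0         # transmissibility below 2, 3-day recovery
beta = R0 * gamma

def sir(y, t, reduction, t_close, t_open):
    s, i, r = y
    # Contact rate is cut by 'reduction' while schools are closed.
    b = beta * (1 - reduction) if t_close <= t <= t_open else beta
    return [-b * s * i, b * s * i - gamma * i, gamma * i]

t = np.linspace(0, 300, 3001)
y0 = [0.999, 0.001, 0.0]
no_closure = odeint(sir, y0, t, args=(0.0, 0, 0))
closure = odeint(sir, y0, t, args=(0.4, 40, 100))   # 40% cut, days 40-100
print("peak prevalence without closure:", no_closure[:, 1].max())
print("peak prevalence with closure:   ", closure[:, 1].max())
```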
Theory and interpretation in qualitative studies from general practice: Why and how?
Malterud, Kirsti
2016-03-01
In this article, I want to promote theoretical awareness and commitment among qualitative researchers in general practice and suggest adequate and feasible theoretical approaches. I discuss different theoretical aspects of qualitative research and present the basic foundations of the interpretative paradigm. Associations between paradigms, philosophies, methodologies and methods are examined and different strategies for theoretical commitment presented. Finally, I discuss the impact of theory on interpretation and the development of general practice knowledge. A scientific theory is a consistent and soundly based set of assumptions about a specific aspect of the world, predicting or explaining a phenomenon. Qualitative research is situated in an interpretative paradigm where notions about particular human experiences in context are recognized from different subject positions. Basic theoretical features from the philosophy of science explain why and how this is different from positivism. Reflexivity, including theoretical awareness and consistency, demonstrates interpretative assumptions, accounting for situated knowledge. Different types of theoretical commitment in qualitative analysis are presented, emphasizing substantive theories to sharpen the interpretative focus. Such approaches are clearly within reach for a general practice researcher who wants to contribute to clinical practice by doing more than summarizing what the participants talked about, without trying to become a philosopher. Qualitative studies from general practice deserve stronger theoretical awareness and commitment than is currently established. Persistent attention to and respect for the distinctive domain of knowledge and practice where the research deliverables are targeted is necessary in choosing adequate theoretical endeavours. © 2015 the Nordic Societies of Public Health.
Heterogeneity, Mixing, and the Spatial Scales of Mosquito-Borne Pathogen Transmission
Perkins, T. Alex; Scott, Thomas W.; Le Menach, Arnaud; Smith, David L.
2013-01-01
The Ross-Macdonald model has dominated theory for mosquito-borne pathogen transmission dynamics and control for over a century. The model, like many other basic population models, makes the mathematically convenient assumption that populations are well mixed; i.e., that each mosquito is equally likely to bite any vertebrate host. This assumption raises questions about the validity and utility of current theory because it is in conflict with preponderant empirical evidence that transmission is heterogeneous. Here, we propose a new dynamic framework that is realistic enough to describe biological causes of heterogeneous transmission of mosquito-borne pathogens of humans, yet tractable enough to provide a basis for developing and improving general theory. The framework is based on the ecological context of mosquito blood meals and the fine-scale movements of individual mosquitoes and human hosts that give rise to heterogeneous transmission. Using this framework, we describe pathogen dispersion in terms of individual-level analogues of two classical quantities: vectorial capacity and the basic reproductive number, R0. Importantly, this framework explicitly accounts for three key components of overall heterogeneity in transmission: heterogeneous exposure, poor mixing, and finite host numbers. Using these tools, we propose two ways of characterizing the spatial scales of transmission—pathogen dispersion kernels and the evenness of mixing across scales of aggregation—and demonstrate the consequences of a model's choice of spatial scale for epidemic dynamics and for estimation of R0, both by a priori model formulas and by inference of the force of infection from time-series data. PMID:24348223
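For reference, the two classical well-mixed quantities that the framework generalizes can be computed directly; the expressions below are the textbook Ross-Macdonald formulas, with hypothetical parameter values:

```python
# m = mosquitoes per host, a = bites per mosquito per day, p = daily
# mosquito survival probability, n = extrinsic incubation period (days),
# b, c = mosquito-to-human and human-to-mosquito transmission efficiencies,
# r = human recovery rate (1/day).
import math

def vectorial_capacity(m, a, p, n):
    return m * a**2 * p**n / (-math.log(p))

def basic_reproductive_number(m, a, p, n, b, c, r):
    return vectorial_capacity(m, a, p, n) * b * c / r

print(vectorial_capacity(m=10, a=0.3, p=0.9, n=10))
print(basic_reproductive_number(m=10, a=0.3, p=0.9, n=10, b=0.5, c=0.5, r=1/20))
```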
Tanaka, Hiroyoshi
Under the basic tenet that syntactic derivation offers an optimal solution to both the phonological realization and the semantic interpretation of linguistic expressions, the recent minimalist framework of syntactic theory claims that the basic unit of derivation is a syntactic propositional element called a phase. In this analysis, syntactic derivation is assumed to proceed at phasal projections, which include Complementizer Phrases (CP). However, some empirical problems have been pointed out concerning the failure of multiple discourse-related elements to occur in the CP domain. This problem can be easily overcome if an alternative approach within the recent minimalist perspective, the Cartographic CP analysis, is adopted, but this raises a theoretical issue about the tension between phasality and the four functional projections assumed in that analysis (Force Phrase (ForceP), Finite Phrase (FinP), Topic Phrase (TopP), and Focus Phrase (FocP)). This paper argues that a hybrid analysis combining these two influential approaches can be proposed under the reasonable assumption that the syntactically requisite projections (i.e., ForceP and FinP) are phases and independently constitute phasehood with the relevant heads in the derivation. This enables us to capture various syntactic properties of the Topicalization construction in English. Our proposed analysis, coupled with some additional assumptions and observations in recent minimalist studies, can be extended to incorporate peculiar properties of temporal/conditional adverbials and imperatives.
ERIC Educational Resources Information Center
Straubhaar, Rolf
2017-01-01
The purpose of this article is to ethnographically document the market-based ideological assumptions of Rio de Janeiro's educational policymakers, and the ways in which those assumptions have informed these policymakers' decision to implement value-added modeling-based teacher evaluation policies. Drawing on the anthropological literature on…
An Evaluation of Normal versus Lognormal Distribution in Data Description and Empirical Analysis
ERIC Educational Resources Information Center
Diwakar, Rekha
2017-01-01
Many existing methods of statistical inference and analysis rely heavily on the assumption that the data are normally distributed. However, the normality assumption is not fulfilled when dealing with data that cannot take negative values or are otherwise skewed--a common occurrence in diverse disciplines such as finance, economics, political…
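A quick way to see the problem is to fit both distributions to skewed, non-negative data and compare log-likelihoods; this sketch uses simulated data and standard SciPy fits:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
data = rng.lognormal(mean=1.0, sigma=0.8, size=500)   # non-negative, right-skewed

mu, sd = stats.norm.fit(data)                          # MLE normal fit
norm_ll = stats.norm.logpdf(data, mu, sd).sum()
shape, loc, scale = stats.lognorm.fit(data, floc=0)    # lognormal fit, loc pinned at 0
lognorm_ll = stats.lognorm.logpdf(data, shape, loc, scale).sum()
print(norm_ll, lognorm_ll)   # the lognormal log-likelihood should be much higher
```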
Consistent Tolerance Bounds for Statistical Distributions
NASA Technical Reports Server (NTRS)
Mezzacappa, M. A.
1983-01-01
Assumption that sample comes from population with particular distribution is made with confidence C if data lie between certain bounds. These "confidence bounds" depend on C and assumption about distribution of sampling errors around regression line. Graphical test criteria using tolerance bounds are applied in industry where statistical analysis influences product development and use. Applied to evaluate equipment life.
Turning great strategy into great performance.
Mankins, Michael C; Steele, Richard
2005-01-01
Despite the enormous time and energy that goes into strategy development, many companies have little to show for their efforts. Indeed, research by the consultancy Marakon Associates suggests that companies on average deliver only 63% of the financial performance their strategies promise. In this article, Michael Mankins and Richard Steele of Marakon present the findings of this research. They draw on their experience with high-performing companies like Barclays, Cisco, Dow Chemical, 3M, and Roche to establish some basic rules for setting and delivering strategy: Keep it simple, make it concrete. Avoid long, drawn-out descriptions of lofty goals and instead stick to clear language describing what your company will and won't do. Debate assumptions, not forecasts. Create cross-functional teams drawn from strategy, marketing, and finance to ensure the assumptions underlying your long-term plans reflect both the real economics of your company's markets and its actual performance relative to competitors. Use a rigorous analytic framework. Ensure that the dialogue between the corporate center and the business units about market trends and assumptions is conducted within a rigorous framework, such as that of "profit pools". Discuss resource deployments early. Create more realistic forecasts and more executable plans by discussing up front the level and timing of critical deployments. Clearly identify priorities. Prioritize tactics so that employees have a clear sense of where to direct their efforts. Continuously monitor performance. Track resource deployment and results against plan, using continuous feedback to reset assumptions and reallocate resources. Reward and develop execution capabilities. Motivate and develop staff. Following these rules strictly can help narrow the strategy-to-performance gap.
NASA Astrophysics Data System (ADS)
Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.
2017-12-01
Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and likely applied even more widely in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined the impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus improved data and enhanced assumptions, on model outcomes and thus, ultimately, on study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study regions. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.
Characterization of plasma current quench during disruptions at HL-2A
NASA Astrophysics Data System (ADS)
Zhu, Jinxia; Zhang, Yipo; Dong, Yunbo; HL-2A Team
2017-05-01
The characteristics of the plasma current quench, including the quench rate and its waveform, are the most essential physics inputs for evaluating electromagnetic forces on plasma-facing components due to disruption-induced eddy currents. The characteristics of plasma current quenches at HL-2A have been analyzed during spontaneous disruptions. Both linear decay and exponential decay are found in the disruptions with the fastest current quenches. However, there are two stages of current quench in the slow current quench case: the first stage shows an exponential decay, and the second a rapid linear decay. Faster current quench rates correspond to faster plasma displacement. The parameter regimes of the current quench time and the current quench rates have been obtained from disruption statistics at HL-2A. No remarkable difference is found between the distributions obtained for the limiter and the divertor configurations. These HL-2A results provide basic data for deriving design criteria for a large-sized machine during the current decay phase of disruptions.
NASA Technical Reports Server (NTRS)
Bonnice, W. F.; Wagner, E.; Motyka, P.; Hall, S. R.
1985-01-01
The performance of the detection filter in detecting and isolating aircraft control surface and actuator failures is evaluated. The basic detection filter theory assumption of no direct input-output coupling is violated in this application due to the use of acceleration measurements for detecting and isolating failures. With this coupling, residuals produced by control surface failures may only be constrained to a known plane rather than to a single direction. A detection filter design with such planar failure signatures is presented, with the design issues briefly addressed. In addition, a modification to constrain the residual to a single known direction even with direct input-output coupling is also presented. Both the detection filter and the modification are tested using a nonlinear aircraft simulation. While no thresholds were selected, both filters demonstrated an ability to detect control surface and actuator failures. Failure isolation may be a problem if there are several control surfaces which produce similar effects on the aircraft. In addition, the detection filter was sensitive to wind turbulence and modeling errors.
Rubin, David C.; Berntsen, Dorthe; Johansen, Malene Klindt
2009-01-01
In the mnemonic model of PTSD, the current memory of a negative event, not the event itself, determines symptoms. The model is an alternative to the current event-based etiology of PTSD represented in the DSM. The model accounts for important and reliable findings that are often inconsistent with the current diagnostic view and that have been neglected by theoretical accounts of the disorder, including the following observations. The diagnosis needs objective information about the trauma and peritraumatic emotions, but uses retrospective memory reports that can have substantial biases. Negative events and emotions that do not satisfy the current diagnostic criteria for a trauma can be followed by symptoms that would otherwise qualify for PTSD. Predisposing factors that affect the current memory have large effects on symptoms. The inability-to-recall-an-important-aspect-of-the-trauma symptom does not correlate with other symptoms. Loss or enhancement of the trauma memory affects PTSD symptoms in predictable ways. Special mechanisms that apply only to traumatic memories are not needed, increasing parsimony and the knowledge that can be applied to understanding PTSD. PMID:18954211
Development of an automated ammunition processing system for battlefield use
DOE Office of Scientific and Technical Information (OSTI.GOV)
Speaks, D.M.; Chesser, J.B.; Lloyd, P.D.
1995-03-01
The Future Armored Resupply Vehicle (FARV) will be the companion ammunition resupply vehicle to the Advanced Field Artillery System (AFAS). These systems are currently being investigated by the US Army for future acquisition. The FARV will sustain the AFAS with ammunition and fuel and will significantly increase capabilities over current resupply vehicles. Currently ammunition is transferred to field artillery almost entirely by hand. The level of automation to be included in the FARV is still under consideration. At the request of the US Army's Project Manager, AFAS/FARV, Oak Ridge National Laboratory (ORNL) identified and evaluated various concepts for the automated upload, processing, storage, and delivery equipment for the FARV. ORNL, working with the sponsor, established basic requirements and assumptions for concept development and the methodology for concept selection. A preliminary concept has been selected, and the associated critical technologies have been identified. ORNL has provided technology demonstrations of many of these critical technologies. A technology demonstrator which incorporates all individual components into a total process demonstration is planned for late FY 1995.
Keegan, Lindsay; Dushoff, Jonathan
2014-05-01
The basic reproductive number, R0, provides a foundation for evaluating how various factors affect the incidence of infectious diseases. Recently, it has been suggested that, particularly for vector-transmitted diseases, R0 should be modified to account for the effects of finite host population within a single disease transmission generation. Here, we use a transmission factor approach to calculate such "finite-population reproductive numbers," under the assumption of homogeneous mixing, for both vector-borne and directly transmitted diseases. In the case of vector-borne diseases, we estimate finite-population reproductive numbers for both host-to-host and vector-to-vector generations, assuming that the vector population is effectively infinite. We find simple, interpretable formulas for all three of these quantities. In the direct case, we find that finite-population reproductive numbers diverge from R0 before R0 reaches half of the population size. In the vector-transmitted case, we find that the host-to-host number diverges at even lower values of R0, while the vector-to-vector number diverges very little over realistic parameter ranges.
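As an illustrative sketch (not necessarily the authors' closed form), homogeneous mixing gives one simple saturating correction: if R0 infectious contacts land uniformly at random among N hosts, the expected number of distinct new infections is N(1 − (1 − 1/N)^R0):

```python
# Expected number of *distinct* hosts infected when R0 infectious contacts
# fall uniformly at random on N hosts. This balls-in-bins form is an
# assumption made here for illustration, not the paper's derived formula.

def finite_population_R(R0, N):
    return N * (1 - (1 - 1 / N) ** R0)

for R0 in (1, 5, 25, 50):
    print(R0, round(finite_population_R(R0, N=100), 2))
# The corrected value diverges from R0 well before R0 reaches half of N.
```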
NASA Technical Reports Server (NTRS)
Moxson, V. S.; Moracz, D. J.; Bhat, B. N.; Dolan, F. J.; Thom, R.
1987-01-01
Traditionally, vacuum melted 440C stainless steel is used for high performance bearings in aerospace cryogenic systems where corrosion due to condensation is a major concern. For the Space Shuttle Main Engine (SSME), however, 440C performance in the high-pressure turbopumps has been marginal. A basic assumption of this study was that powder metallurgy, rather than cast/wrought, processing would provide the finest, most homogeneous bearing alloy structure. Preliminary testing of P/M alloys (hardness, corrosion resistance, wear resistance, fatigue resistance, and fracture toughness) was used to 'de-select' alloys which did not perform as well as baseline 440C. Five of eleven candidate materials (14-4/6V, X-405, MRC-2001, T-440V, and D-5), selected on the basis of the preliminary screening, advanced to the rolling-sliding five-ball test. The results of this test were compared with those for high-performance vacuum-melted M50 bearing steel. The testing indicated outstanding performance of two P/M alloys, X-405 and MRC-2001, which will eventually be further evaluated by full-scale bearing testing.
Adaptive hidden Markov model with anomaly States for price manipulation detection.
Cao, Yi; Li, Yuhua; Coleman, Sonya; Belatreche, Ammar; McGinnity, Thomas Martin
2015-02-01
Price manipulation refers to the activities of traders who use carefully designed trading behaviors to artificially push the underlying equity prices up or down for profit. With increasing volumes and frequency of trading, price manipulation can be extremely damaging to the proper functioning and integrity of capital markets. The existing literature focuses on either empirical studies of market abuse cases or analysis of particular manipulation types based on certain assumptions. Effective approaches for analyzing and detecting price manipulation in real time are yet to be developed. This paper proposes a novel approach, called adaptive hidden Markov model with anomaly states (AHMMAS), for modeling and detecting price manipulation activities. Together with wavelet transformations and gradients as the feature extraction methods, the AHMMAS model caters to price manipulation detection and basic manipulation type recognition. Evaluation experiments conducted on tick data for seven stocks from NASDAQ and the London Stock Exchange, and on 10 stock price series simulated by stochastic differential equations, show that the proposed AHMMAS model can effectively detect price manipulation patterns and outperforms the selected benchmark models.
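A much-simplified stand-in for this idea, not the authors' AHMMAS: fit a plain Gaussian HMM to price-derived features and flag windows whose log-likelihood is anomalously low. The log-return feature, window size, and threshold are assumptions here; the paper's wavelet-based features and explicit anomaly states are not reproduced:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # third-party: pip install hmmlearn

def flag_anomalies(prices, n_states=3, window=50, z_thresh=-3.0):
    feats = np.diff(np.log(prices)).reshape(-1, 1)       # log returns
    model = GaussianHMM(n_components=n_states).fit(feats)
    # Log-likelihood of each sliding window under the fitted HMM
    scores = np.array([model.score(feats[i:i + window])
                       for i in range(len(feats) - window)])
    z = (scores - scores.mean()) / scores.std()
    return np.where(z < z_thresh)[0]                      # suspicious window starts
```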
Comparison of main-shock and aftershock fragility curves developed for New Zealand and US buildings
Uma, S.R.; Ryu, H.; Luco, N.; Liel, A.B.; Raghunandan, M.
2011-01-01
Seismic risk assessment involves the development of fragility functions to express the relationship between ground motion intensity and damage potential. In evaluating the risk associated with the building inventory in a region, it is essential to capture 'actual' characteristics of the buildings and group them so that 'generic building types' can be generated for further analysis of their damage potential. Variations in building characteristics across regions/countries largely influence the resulting fragility functions, such that building models developed for one region are unsuitable for risk assessment in another region with a different building stock. In this paper, for a given building type (represented in terms of height and structural system), typical New Zealand and US building models are considered to illustrate the differences in structural model parameters and their effects on the resulting fragility functions for a set of main-shocks and aftershocks. From this study, the general conclusion is that the methodology and assumptions used to derive basic capacity curve parameters have a considerable influence on fragility curves.
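Fragility functions of the kind discussed here are commonly expressed in lognormal form, P(damage | IM = im) = Φ(ln(im/θ)/β); the sketch below uses that standard form with hypothetical median capacities θ and dispersions β for a main-shock curve and an aftershock (damaged-building) curve:

```python
import numpy as np
from scipy.stats import norm

def fragility(im, theta, beta):
    """Probability of exceeding a damage state at intensity measure im;
    theta = median capacity, beta = lognormal dispersion."""
    return norm.cdf(np.log(im / theta) / beta)

im = np.linspace(0.05, 2.0, 50)                      # e.g., Sa(T1) in g
p_mainshock = fragility(im, theta=0.9, beta=0.5)
p_aftershock = fragility(im, theta=0.7, beta=0.55)   # damaged state: lower median capacity
```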
Micrometeoroid and Orbital Debris (MMOD) Shield Ballistic Limit Analysis Program
NASA Technical Reports Server (NTRS)
Ryan, Shannon
2013-01-01
This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of no penetration (PNP) of the spacecraft pressure hull. Calculating the risk requires spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for the installed shielding configurations. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, such software is unsuitable for shield design and preliminary analysis studies. This software instead defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of its validity range and guidelines for its application. Recommendations are based on preliminary reviews of fundamental assumptions and on accuracy in predicting experimental impact test results. The software is programmed in Visual Basic for Applications for installation as a simple add-in for Microsoft Excel. The user is directed to a graphical user interface (GUI) that requires user inputs and provides solutions directly in Microsoft Excel workbooks.
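Under the Poisson impact assumption that such tools typically make, PNP follows directly from the expected number of penetrating impacts; a minimal sketch, illustrative only and not the BUMPERII or add-in calculation:

```python
import math

def probability_no_penetration(expected_hits):
    """PNP under the usual Poisson assumption: exp(-N), where N is the
    expected number of penetrating impacts summed over surfaces and
    debris size bins (values here are hypothetical)."""
    return math.exp(-sum(expected_hits))

print(probability_no_penetration([1e-4, 5e-5, 2.5e-4]))
```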
Farmer, Richard F; Goldberg, Lewis R
2008-09-01
In this reply we address comments by Cloninger (this issue) related to our report (Farmer & Goldberg, this issue) on the psychometric properties of the revised Temperament and Character Inventory (TCI-R) and a short inventory derivative, the TCI-140. Even though Cloninger's psychobiological model has undergone substantial theoretical modifications, the relevance of these changes for the evaluation and use of the TCI-R remains unclear. Aspects of TCI-R assessment also appear to be theoretically and empirically incongruent with Cloninger's assertion that TCI-R personality domains are non-linear and dynamic in nature. Several other core assumptions from the psychobiological model, including this most recent iteration, are non-falsifiable, inconsistently supported, or have no apparent empirical basis. Although researchers using the TCI and TCI-R have frequently accepted the temperament/character distinction and associated theoretical ramifications, for example, we find little overall support for the differentiation of TCI-R domains into these two basic categories. The implications of these observations for TCI-R assessment are briefly discussed.
Using Ecosystem Experiments to Improve Vegetation Models
Medlyn, Belinda; Zaehle, S; DeKauwe, Martin G.; ...
2015-05-21
Ecosystem responses to rising CO2 concentrations are a major source of uncertainty in climate change projections. Data from ecosystem-scale Free-Air CO2 Enrichment (FACE) experiments provide a unique opportunity to reduce this uncertainty. The recent FACE Model-Data Synthesis project aimed to use the information gathered in two forest FACE experiments to assess and improve land ecosystem models. A new 'assumption-centred' model intercomparison approach was used, in which participating models were evaluated against experimental data based on the ways in which they represent key ecological processes. By identifying and evaluating the main assumptions that caused differences among models, the assumption-centred approach produced a clear roadmap for reducing model uncertainty. We explain this approach and summarize the resulting research agenda. We encourage the application of this approach in other model intercomparison projects to fundamentally improve predictive understanding of the Earth system.
Statistical Issues for Uncontrolled Reentry Hazards
NASA Technical Reports Server (NTRS)
Matney, Mark
2008-01-01
A number of statistical tools have been developed over the years for assessing the risk of reentering objects to human populations. These tools make use of the characteristics (e.g., mass, shape, size) of debris that are predicted by aerothermal models to survive reentry. The statistical tools use this information to compute the probability that one or more of the surviving debris might hit a person on the ground and cause one or more casualties. The statistical portion of the analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and how to use that information to arrive at realistic risk numbers. This inevitably involves assumptions that simplify the problem and make it tractable, but it is often difficult to test the accuracy and applicability of these assumptions. This paper looks at a number of these theoretical assumptions, examining the mathematical basis for the hazard calculations, and outlining the conditions under which the simplifying assumptions hold. In addition, this paper outlines some new tools for assessing ground hazard risk in useful ways. This study is also able to make use of a database of known uncontrolled reentry locations measured by the United States Department of Defense. By using data from objects that were in orbit more than 30 days before reentry, sufficient time is allowed for the orbital parameters to be randomized in the way the models assume. The predicted ground footprint distributions of these objects are based on the theory that their orbits behave basically like simple Kepler orbits. However, there are a number of factors - including the effects of gravitational harmonics, the effects of the Earth's equatorial bulge on the atmosphere, and the rotation of the Earth and atmosphere - that could cause them to diverge from simple Kepler orbit behavior and change the ground footprints. The measured latitude and longitude distributions of these objects provide data that can be directly compared with the predicted distributions, providing a fundamental empirical test of the model assumptions.
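The "simple Kepler orbit" baseline mentioned above has a closed-form latitude density against which measured reentry latitudes can be compared: for a circular orbit of inclination i, the fraction of time spent near latitude φ is cos φ / (π √(sin²i − sin²φ)) for |φ| < i. A sketch of that baseline (the DoD comparison data themselves are, of course, not reproduced here):

```python
import numpy as np

def kepler_latitude_density(lat_deg, incl_deg):
    """Time-averaged latitude density for a circular orbit of inclination i,
    valid for |lat| < i; under the randomized-orbit assumption, reentry
    latitudes should follow this curve."""
    phi, i = np.radians(lat_deg), np.radians(incl_deg)
    return np.cos(phi) / (np.pi * np.sqrt(np.sin(i)**2 - np.sin(phi)**2))

lats = np.linspace(-50, 50, 11)
print(kepler_latitude_density(lats, incl_deg=51.6))  # density piles up toward +/-i
```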
Practical Stereology Applications for the Pathologist.
Brown, Danielle L
2017-05-01
Qualitative histopathology is the gold standard for routine examination of morphological tissue changes in the regulatory or academic environment. The human eye is exceptional for pattern recognition but often cannot detect small changes in quantity. In cases where detection of subtle quantitative changes is critical, more sensitive methods are required. Two-dimensional histomorphometry can provide additional quantitative information and is quite useful in many cases. However, the provided data may not be referent to the entire tissue and, as such, it makes several assumptions, which are sources of bias. In contrast, stereology is design based rather than assumption based and uses stringent sampling methods to obtain accurate and precise 3-dimensional information using geometrical and statistical principles. Recent advances in technology have made stereology more approachable and practical for the pathologist in both regulatory and academic environments. This review introduces pathologists to the basic principles of stereology and walks the reader through some real-world examples for the application of these principles in the workplace.
Modeling of the illumination driven coma of 67P/Churyumov-Gerasimenko
NASA Astrophysics Data System (ADS)
Bieler, André
2015-04-01
In this paper we present results from modeling the neutral coma properties of 67P/Churyumov-Gerasimenko (C-G), observed by the Rosetta ROSINA experiment, with three different model approaches. The basic assumption for all models is that the outgassing properties of C-G are mainly illumination driven. With this assumption, all models are capable of reproducing most features of the neutral coma signature as detected by the ROSINA-COPS instrument over several months. The models use a realistic shape model of the nucleus to calculate the illumination conditions over time, which define the boundary conditions for the hydrodynamic (BATS-R-US code) and Direct Simulation Monte Carlo (AMPS code) simulations. The third model computes the projection of the total illumination of the comet surface towards the spacecraft. Our results indicate that at large heliocentric distances (3.5 to 2.8 AU), most gas coma structures observed by the in-situ instruments can be explained by uniformly distributed activity regions spread over the whole nucleus surface.
Anderson, Christine A; Whall, Ann L
2013-10-01
Opinion leaders are informal leaders who have the ability to influence others' decisions about adopting new products, practices or ideas. In the healthcare setting, the importance of translating new research evidence into practice has led to interest in understanding how opinion leaders could be used to speed this process. Despite continued interest, gaps in understanding opinion leadership remain. Agent-based models are computer models that have proven to be useful for representing dynamic and contextual phenomena such as opinion leadership. The purpose of this paper is to describe the work conducted in preparation for the development of an agent-based model of nursing opinion leadership. The aim of this phase of the model development project was to clarify basic assumptions about opinions, the individual attributes of opinion leaders and characteristics of the context in which they are effective. The process used to clarify these assumptions was the construction of a preliminary nursing opinion leader model, derived from philosophical theories about belief formation. © 2013 John Wiley & Sons Ltd.
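A minimal agent-based sketch of the kind of dynamic such a model is meant to capture, assuming a simple copy-a-random-colleague update in which opinion leaders are encountered in proportion to an influence weight; all parameters are hypothetical and this is not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(0)

def step(opinions, weights, adopt_p=0.1):
    """One update: each nurse agent may adopt the opinion of a randomly
    encountered colleague, with opinion leaders encountered (and trusted)
    in proportion to their influence weight."""
    n = len(opinions)
    for agent in range(n):
        if rng.random() < adopt_p:
            other = rng.choice(n, p=weights / weights.sum())
            opinions[agent] = opinions[other]
    return opinions

n = 100
opinions = rng.integers(0, 2, n).astype(float)  # 0 = old practice, 1 = new evidence
weights = np.ones(n); weights[:5] = 10.0        # five high-influence opinion leaders
for _ in range(200):
    opinions = step(opinions, weights)
print(opinions.mean())                          # fraction adopting the new practice
```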
Notes on power of normality tests of error terms in regression models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Střelec, Luboš
2015-03-10
Normality is one of the basic assumptions in applying statistical procedures. For example, in linear regression most of the inferential procedures are based on the assumption of normality, i.e. the disturbance vector is assumed to be normally distributed. Failure to assess non-normality of the error terms may lead to incorrect results of usual statistical inference techniques such as the t-test or F-test. Thus, error terms should be normally distributed in order to allow us to make exact inferences. As a consequence, normally distributed stochastic errors are necessary for inferences that are not misleading, which explains the necessity and importance of robust tests of normality. Therefore, the aim of this contribution is to discuss normality testing of error terms in regression models. In this contribution, we introduce the general RT class of robust tests for normality, and present and discuss the trade-off between power and robustness of selected classical and robust normality tests of error terms in regression models.
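The RT class introduced in the contribution is not part of standard libraries, but the basic workflow (fit the regression, then test the residuals for normality) can be sketched with classical tests; here heavy-tailed errors are simulated so the tests should reject:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)
y = 2.0 + 0.5 * x + rng.standard_t(df=3, size=200)  # heavy-tailed disturbances

# OLS fit and residuals via least squares
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta

print(stats.shapiro(resid))       # Shapiro-Wilk test of the residuals
print(stats.jarque_bera(resid))   # Jarque-Bera test of the residuals
```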
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kline, Keith L; Oladosu, Gbadebo A; Dale, Virginia H
2011-01-01
Vigorous debate on the effects of biofuels derives largely from the changes in land use estimated using economic models designed mainly for the analysis of agricultural trade and markets. The models referenced for land-use change (LUC) analysis in the U.S. Environmental Protection Agency Final Rule on the Renewable Fuel Standard include GTAP, FAPRI-CARD, and FASOM. To address bioenergy impacts, these models were expanded and modified to facilitate simulations of hypothesized LUC. However, even when models use similar basic assumptions and data, the range of LUC results can vary by ten-fold or more. While the market dynamics simulated in these models include processes that are important in estimating effects of biofuel policies, the models have not been validated for estimating land-use changes and employ crucial assumptions and simplifications that contradict empirical evidence.
Photonics and terahertz technologies: part 1
NASA Astrophysics Data System (ADS)
Romaniuk, Ryszard S.
2011-10-01
This digest paper discusses basic features of the terahertz frequency band. Fundamental characteristics are presented of a basic terahertz system consisting of a THz source, a propagation medium, transmission lines, THz signal processing, and detectors. Such a system finds research applications, but also practical ones in two main areas: terahertz imaging (transmissive and reflective), including close-range THz radar, and sensory systems, mainly for molecular sensing. A few THz research projects concerning THz sources, detectors, and their applications have been launched in this country. Among these projects is an infrastructural one called FOTEH, opened at the WUT. The details of this project and the consequences of its realization in this country are discussed. The first part of the paper is an introduction characterizing the THz band and comparing it with the photonic one. The second part presents the assumptions of the infrastructural FOTEH project on Photonics and Terahertz Technologies.
A comparison of experimental and calculated thin-shell leading-edge buckling due to thermal stresses
NASA Technical Reports Server (NTRS)
Jenkins, Jerald M.
1988-01-01
High-temperature thin-shell leading-edge buckling test data are analyzed using NASA structural analysis (NASTRAN) as a finite element tool for predicting thermal buckling characteristics. Buckling points are predicted for several combinations of edge boundary conditions. The problem of relating the appropriate plate area to the edge stress distribution and the stress gradient is addressed in terms of analysis assumptions. Local plasticity was found to occur on the specimen analyzed, and this tended to simplify the basic problem since it effectively equalized the stress gradient from loaded edge to loaded edge. The initial loading was found to be difficult to select for the buckling analysis because of the transient nature of thermal stress. Multiple initial model loadings are likely required for complicated thermal stress time histories before a pertinent finite element buckling analysis can be achieved. The basic mode shapes determined from experimentation were correctly identified from computation.
Thermodynamics and Diffusion Coupling in Alloys—Application-Driven Science
NASA Astrophysics Data System (ADS)
Ågren, John
2012-10-01
As emphasized by Stokes (1997), the common assumption of a linear progression from basic research (science), via applied research, to technological innovations (engineering) should be questioned. In fact, society would gain much by supporting long-term research that stems from practical problems and has usefulness as a key word. Such research may be fundamental, and often it could not be distinguished from "basic" research were it not for its different motivation. The development of the Calphad method and the more recent development of accompanying kinetic approaches for diffusion serve as excellent examples and are the themes of this symposium. The drivers are, e.g., the development of new materials, processes, and lifetime predictions. Many challenges of the utmost practical importance require long-term fundamental research. This presentation will address some of them, e.g., the effect of various ordering phenomena on activation barriers, and the strength and practical importance of correlation effects.
Semi-supervised Learning for Phenotyping Tasks.
Dligach, Dmitriy; Miller, Timothy; Savova, Guergana K
2015-01-01
Supervised learning is the dominant approach to automatic electronic health records-based phenotyping, but it is expensive due to the cost of manual chart review. Semi-supervised learning takes advantage of both scarce labeled and plentiful unlabeled data. In this work, we study a family of semi-supervised learning algorithms based on Expectation Maximization (EM) in the context of several phenotyping tasks. We first experiment with the basic EM algorithm. When the modeling assumptions are violated, basic EM leads to inaccurate parameter estimation. Augmented EM attenuates this shortcoming by introducing a weighting factor that downweights the unlabeled data. Cross-validation does not always lead to the best setting of the weighting factor and other heuristic methods may be preferred. We show that accurate phenotyping models can be trained with only a few hundred labeled (and a large number of unlabeled) examples, potentially providing substantial savings in the amount of the required manual chart review.
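A compact sketch of the weighting idea for a two-class, one-dimensional Gaussian model; the factor lam multiplies the unlabeled responsibilities in the M-step (lam = 1 recovers basic EM, lam = 0 ignores unlabeled data). This illustrates the mechanism only, not the phenotyping pipeline itself:

```python
import numpy as np
from scipy.stats import norm

def augmented_em(xl, yl, xu, lam=0.1, iters=50):
    """Semi-supervised EM: labeled data (xl, yl) plus unlabeled data xu,
    with unlabeled contributions downweighted by lam in the M-step."""
    mu = np.array([xl[yl == k].mean() for k in (0, 1)])
    sd = np.array([xl[yl == k].std() + 1e-6 for k in (0, 1)])
    pi = np.array([(yl == k).mean() for k in (0, 1)])
    x = np.concatenate([xl, xu])
    for _ in range(iters):
        # E-step: class responsibilities for the unlabeled examples
        lik = np.stack([pi[k] * norm.pdf(xu, mu[k], sd[k]) for k in (0, 1)])
        resp = lik / lik.sum(axis=0)
        # M-step: labeled counts plus lam-weighted expected unlabeled counts
        for k in (0, 1):
            w = np.concatenate([(yl == k).astype(float), lam * resp[k]])
            pi[k] = w.sum()
            mu[k] = (w * x).sum() / w.sum()
            sd[k] = np.sqrt((w * (x - mu[k])**2).sum() / w.sum()) + 1e-6
        pi = pi / pi.sum()
    return pi, mu, sd
```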
An uncertainty analysis of the flood-stage upstream from a bridge.
Sowiński, M
2006-01-01
The paper begins with the formulation of the problem in the form of a general performance function. Next, the Latin hypercube sampling (LHS) technique, a modified version of the Monte Carlo method, is briefly described. The core uncertainty analysis of the flood stage upstream from a bridge starts with a description of the hydraulic model. The model concept is based on the HEC-RAS model, developed for subcritical flow under a bridge without piers, in which the energy equation is applied. The next section characterizes the basic variables, including a specification of their statistics (means and variances). Next, the problem of correlated variables is discussed and assumptions concerning correlation among the basic variables are formulated. The analysis of results is based on LHS ranking lists obtained from the computer package UNCSAM. Results for two examples are given: one for independent and the other for correlated variables.
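A minimal LHS sketch using SciPy's quasi-Monte Carlo module; the three basic variables and their statistics are hypothetical stand-ins for the hydraulic model's inputs, and the correlation-induction step (e.g., Iman-Conover) needed for the correlated case is omitted:

```python
import numpy as np
from scipy.stats import qmc, norm

# Hypothetical basic variables: Manning's n, discharge, bridge loss coefficient.
sampler = qmc.LatinHypercube(d=3, seed=42)
u = sampler.random(n=200)                     # stratified uniforms in [0, 1)^3

n_manning = norm.ppf(u[:, 0], loc=0.035, scale=0.005)
discharge = norm.ppf(u[:, 1], loc=450.0, scale=60.0)
loss_coef = norm.ppf(u[:, 2], loc=0.3, scale=0.05)
# Each of the 200 input vectors feeds the hydraulic model once; ranking the
# computed flood stages against the inputs yields the LHS ranking lists.
```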
1989-10-01
… actions in the construction of the plan. This last piece of information is provided through the use of … maximum compatible sets (deleting subsets otherwise); for every plan fragment pf for g, the first goal in goals, if pf does not exceed resource … deletes a non-default assumption. 4.5.3.2 Data Structures. The MATMS is a frame-based system in which there are five basic types of objects: beliefs
Inverting seismic data for rock physical properties; Mathematical background and application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farfour, Mohammed; Yoon, Wang Jung; Kim, Jinmo
2016-06-08
The basic concept behind seismic inversion is that mathematical assumptions can be established to relate seismic responses to the geological formation properties that caused them. In this presentation we address some widely used seismic inversion methods for hydrocarbon reservoir identification and characterization. A successful application of inversion to a real example from a gas sand reservoir in the Boonsville field, North Central Texas, is presented. Seismic data alone were not an unambiguous indicator of the reservoir facies distribution. The use of inversion removed this ambiguity and revealed clear information about the target.
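Most post-stack inversion methods assume the convolutional forward model, trace = reflectivity * wavelet, and invert it; the sketch below builds the forward model from hypothetical impedances, with a low-impedance layer standing in for a gas sand:

```python
import numpy as np

def reflectivity(impedance):
    z = np.asarray(impedance, dtype=float)
    return (z[1:] - z[:-1]) / (z[1:] + z[:-1])   # normal-incidence coefficients

def ricker(f=30.0, dt=0.002, length=0.128):
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f * t) ** 2
    return (1 - 2 * a) * np.exp(-a)              # Ricker wavelet, peak f Hz

z = [6.0e6, 6.2e6, 4.8e6, 6.5e6]    # hypothetical impedances; layer 3 is low (gas sand)
r = np.zeros(200)
r[[60, 90, 130]] = reflectivity(z)  # interfaces placed at arbitrary time samples
trace = np.convolve(r, ricker(), mode="same")    # synthetic seismic trace
```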
[Medical errors from positions of mutual relations of patient-lawyer-doctor].
Radysh, Ia F; Tsema, Ie V; Mehed', V P
2013-01-01
The basic theoretical and practical aspects of the problem of malpractice in the Ukrainian health protection system are presented in the article. The essence of the term "malpractice" is expounded through specific examples. Types of malpractice, the conditions under which it arises, and the kinds of liability that follow from it are considered. Special attention is paid to the legal, mental, and ethical aspects of the problem from the standpoint of protecting the rights of both the patient and the medical worker. The necessity of classifying malpractice as intentional or unintentional, and as permissible or impermissible, is substantiated.
Application Note: Power Grid Modeling With Xyce.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sholander, Peter E.
This application note describes how to model steady-state power flows and transient events in electric power grids with the SPICE-compatible Xyce(TM) Parallel Electronic Simulator developed at Sandia National Labs. The note provides a brief tutorial on the basic devices (branches, bus shunts, transformers, and generators) found in power grids. The focus is on the features supported and the assumptions made by the Xyce models for power grid elements. It then provides a detailed explanation, including working Xyce netlists, for simulating some simple power grid examples such as the IEEE 14-bus test case.
Mediation skills for conflict resolution in nursing education.
Cheng, Fung Kei
2015-07-01
Encountering conflicts among family members in the hospital produces burnout among nurses, implying a need for alternative dispute resolution training. However, current nursing education pays more attention to training in counselling skills than in mediation. The present report examines the fundamental concepts of mediation, including its nature, basic assumptions, and values, and compares them with those of counselling. Its implications may open a discussion on enhancing contemporary nursing education by providing workplace mediation training to nurses so that they can deal more effectively with disputes. Copyright © 2015 Elsevier Ltd. All rights reserved.
Surface Oscillations of a Free-Falling Droplet of an Ideal Fluid
NASA Astrophysics Data System (ADS)
Kistovich, A. V.; Chashechkin, Yu. D.
2018-03-01
According to observations, drops freely falling in the air under the action of gravity are deformed and oscillate over a wide range of frequencies and scales. A technique is proposed for calculating axisymmetric surface oscillations of a deformed droplet in the linear approximation, under the assumption that the amplitude and wavelength are small compared to the droplet diameter. The basic form of the axisymmetric droplet is chosen from observations. The calculated surface oscillations agree with recorded data on the varying shape of water droplets falling in the air.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corbett, J.E.
1996-02-01
This report documents the completion of a preliminary design review for the Rotary Mode Core Sample Truck (RMCST) modifications for flammable gas tanks. The RMCST modifications are intended to support core sampling operations in waste tanks requiring flammable gas controls. The objective of this review was to validate basic design assumptions and concepts to support a path forward leading to a final design. The conclusion reached by the review committee was that the design was acceptable and efforts should continue toward a final design review.
A Novel Quantum Solution to Privacy-Preserving Nearest Neighbor Query in Location-Based Services
NASA Astrophysics Data System (ADS)
Luo, Zhen-yu; Shi, Run-hua; Xu, Min; Zhang, Shun
2018-04-01
We present a cheating-sensitive quantum protocol for Privacy-Preserving Nearest Neighbor Query based on Oblivious Quantum Key Distribution and Quantum Encryption. Compared with related classical protocols, our proposed protocol has higher security, because its security rests on basic physical principles of quantum mechanics instead of computational hardness assumptions. In particular, our protocol takes single photons as quantum resources and only needs to perform single-photon projective measurements. Therefore, it is feasible to implement this protocol with present technologies.
Survey of decentralized control methods. [for large scale dynamic systems
NASA Technical Reports Server (NTRS)
Athans, M.
1975-01-01
An overview is presented of the types of problems that are being considered by control theorists in the area of dynamic large scale systems with emphasis on decentralized control strategies. Approaches that deal directly with decentralized decision making for large scale systems are discussed. It is shown that future advances in decentralized system theory are intimately connected with advances in the stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools associated with the latter are summarized, and recommendations concerning future research are presented.
Analyses of group sequential clinical trials.
Koepcke, W
1989-12-01
In the first part of this article the methodology of group sequential plans is reviewed. After the basic definition of such plans is introduced, their main properties are shown. At the end of this section three different plans (Pocock, O'Brien-Fleming, Koepcke) are compared. In the second part of the article some unresolved issues and recent developments in the application of group sequential methods to long-term controlled clinical trials are discussed. These include deviations from the assumptions, life table methods, multiple-arm clinical trials, multiple outcome measures, and confidence intervals.
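The defining property of such plans, that the per-look boundaries hold the overall type I error at the nominal level, can be checked by simulation; the boundary constants below (2.413 for Pocock and 2.040 for O'Brien-Fleming at K = 5, two-sided alpha = 0.05) are commonly tabulated values assumed here from memory:

```python
import numpy as np

rng = np.random.default_rng(0)

def crossing_prob(bounds, K=5, sims=200_000):
    """Monte Carlo overall type I error of a K-look group sequential test:
    cumulative-sum Z statistics under H0, rejecting when |Z_k| > bounds[k]."""
    inc = rng.standard_normal((sims, K))
    z = np.cumsum(inc, axis=1) / np.sqrt(np.arange(1, K + 1))
    return (np.abs(z) > bounds).any(axis=1).mean()

K = 5
pocock = np.full(K, 2.413)                       # flat boundary at every look
obf = 2.040 * np.sqrt(K / np.arange(1, K + 1))   # strict early, lenient late
print(crossing_prob(pocock), crossing_prob(obf)) # both should be close to 0.05
```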
A Metric to Evaluate Mobile Satellite Systems
NASA Technical Reports Server (NTRS)
Young, Elizabeth L.
1997-01-01
The concept of a "cost per billable minute" methodology to analyze mobile satellite systems is reviewed. Certain assumptions, notably those about the marketplace and regulatory policies, may need to be revisited. Fading and power control assumptions need to be tested. Overall, the metric would seem to have value in the design phase of a system and for comparisons between and among alternative systems.
ERIC Educational Resources Information Center
Jang, Hyesuk
2014-01-01
This study aims to evaluate a multidimensional latent trait model to determine how well the model works in various empirical contexts. Contrary to the assumption of these latent trait models that the traits are normally distributed, situations may occur in which the latent trait is not normally distributed (Sass et al, 2008; Woods…
Do forest community types provide a sufficient basis to evaluate biological diversity?
Samuel A. Cushman; Kevin S. McKelvey; Curtis H. Flather; Kevin McGarigal
2008-01-01
Forest communities, defined by the size and configuration of cover types and stand ages, have commonly been used as proxies for the abundance or viability of wildlife populations. However, for community types to succeed as proxies for species abundance, several assumptions must be met. We tested these assumptions for birds in an Oregon forest environment. Measured...