Sample records for basic design assumptions

  1. The Applied Behavior Analysis Research Paradigm and Single-Subject Designs in Adapted Physical Activity Research.

    PubMed

    Haegele, Justin A; Hodge, Samuel Russell

    2015-10-01

    There are basic philosophical and paradigmatic assumptions that guide scholarly research endeavors, including the methods used and the types of questions asked. Through this article, kinesiology faculty and students with interests in adapted physical activity are encouraged to understand the basic assumptions of applied behavior analysis (ABA) methodology for conducting, analyzing, and presenting research of high quality in this paradigm. The purposes of this viewpoint paper are to present information fundamental to understanding the assumptions undergirding research methodology in ABA, describe key aspects of single-subject research designs, and discuss common research designs and data-analysis strategies used in single-subject studies.

  2. Social factors in space station interiors

    NASA Technical Reports Server (NTRS)

    Cranz, Galen; Eichold, Alice; Hottes, Klaus; Jones, Kevin; Weinstein, Linda

    1987-01-01

    Using the example of the chair, which is often written into space station planning but which serves no non-cultural function in zero gravity, difficulties in overcoming cultural assumptions are discussed. An experimental approach is called for which would allow designers to separate cultural assumptions from logistic, social and psychological necessities. Simulations, systematic doubt and monitored brainstorming are recommended as part of basic research so that the designer will approach the problems of space module design with a complete program.

  3. Thin Skin, Deep Damage: Addressing the Wounded Writer in the Basic Writing Course

    ERIC Educational Resources Information Center

    Boone, Stephanie D.

    2010-01-01

    How do institutions and their writing faculties see basic writers? What assumptions about these writers drive writing curricula, pedagogies and assessments? How do writing programs enable or frustrate these writers? How might course design facilitate the outcomes we envision? This article argues that, in order to teach basic writers to enter…

  4. Causality and headache triggers

    PubMed Central

    Turner, Dana P.; Smitherman, Todd A.; Martin, Vincent T.; Penzien, Donald B.; Houle, Timothy T.

    2013-01-01

    Objective: The objective of this study was to explore the conditions necessary to assign causal status to headache triggers. Background: The term “headache trigger” is commonly used to label any stimulus that is assumed to cause headaches. However, the assumptions required for determining whether a given stimulus in fact has a causal-type relationship in eliciting headaches have not been explicated. Methods: Rubin’s Causal Model is synthesized and applied to the context of headache causes. From this application, the conditions necessary to infer that one event (trigger) causes another (headache) are outlined using basic assumptions and examples from the relevant literature. Results: Although many conditions must be satisfied for a causal attribution, three basic assumptions are identified for determining causality in headache triggers: 1) constancy of the sufferer; 2) constancy of the trigger effect; and 3) constancy of the trigger presentation. A valid evaluation of a potential trigger’s effect can only be undertaken once these three basic assumptions are satisfied during formal or informal studies of headache triggers. Conclusions: Evaluating these assumptions is extremely difficult or infeasible in clinical practice, and satisfying them during natural experimentation is unlikely. Researchers, practitioners, and headache sufferers are therefore encouraged to avoid natural experimentation to determine the causal effects of headache triggers. Instead, formal experimental designs or retrospective diary studies using advanced statistical modeling techniques provide the best approaches to satisfy the required assumptions and inform causal statements about headache triggers. PMID:23534872
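
    The three constancy assumptions can be made concrete with a small simulation. The sketch below is illustrative and not from the paper: a hypothetical "stress" variable violates constancy of the trigger presentation by making the candidate trigger more likely while also raising headache risk, so the naive natural-experiment contrast overstates the causal effect, while holding the confounder constant recovers it.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical scenario (not from the paper): "stress" raises both the chance
# of encountering the candidate trigger and the baseline headache risk, so
# constancy of the trigger presentation is violated.
stress = rng.random(n) < 0.5
trigger = rng.random(n) < np.where(stress, 0.7, 0.2)   # confounded presentation
base = np.where(stress, 0.4, 0.1)                      # stress alone causes headaches
headache = rng.random(n) < base + 0.1 * trigger        # true causal effect = +0.10

# Naive "natural experiment": headache rate with vs. without the trigger
naive = headache[trigger].mean() - headache[~trigger].mean()

# Holding the confounder constant (stratifying) recovers the causal effect
strata = [headache[trigger & (stress == s)].mean()
          - headache[~trigger & (stress == s)].mean() for s in (True, False)]
adjusted = float(np.mean(strata))
```

The naive contrast lands well above the true +0.10 effect, while the stratified estimate does not, which is the paper's point about why informal natural experimentation misleads.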

  5. Energy Conversion Alternatives Study (ECAS), Westinghouse phase 1. Volume 8: Open-cycle MHD. [energy conversion efficiency and design analysis of electric power plants employing magnetohydrodynamics

    NASA Technical Reports Server (NTRS)

    Hoover, D. Q.

    1976-01-01

    Electric power plant costs and efficiencies are presented for three basic open-cycle MHD systems: (1) a direct coal-fired system, (2) a system with a separately fired air heater, and (3) a system burning low-Btu gas from an integrated gasifier. Power plant designs corresponding to the basic cases, with variation of the major parameters, were developed, and the major system components were sized and costed. Flow diagrams describing each design are presented. The limitations of each design are discussed within the framework of the assumptions made.

  6. Status of the Space Station environmental control and life support system design concept

    NASA Technical Reports Server (NTRS)

    Ray, C. D.; Humphries, W. R.

    1986-01-01

    The current status of the Space Station (SS) environmental control and life support system (ECLSS) design is outlined. The concept has been defined at the subsystem level. Data supporting these definitions are provided which identify general configurations for all modules. The requirements, guidelines, and assumptions used in generating these configurations are detailed. The basic two-US-module 'core' Space Station is addressed along with system synergism issues and early man-tended and future growth considerations. Also addressed are options related to variation in the 'core' module makeup and more austere Station concepts, such as commonality, automation, and design to cost.

  7. A Basic Literacy Project for the Correctional Service of Canada: Curriculum Design as a Strategy for Staff Development.

    ERIC Educational Resources Information Center

    Collins, Michael

    1989-01-01

    Describes a Canadian curriculum development project; analyzes underlying policy assumptions. Advocates involvement of prison educators and inmates in the process if curriculum is to meet the educational needs of inmates. (Author/LAM)

  8. Using graph-based assessments within socratic tutorials to reveal and refine students' analytical thinking about molecular networks.

    PubMed

    Trujillo, Caleb; Cooper, Melanie M; Klymkowsky, Michael W

    2012-01-01

    Biological systems, from the molecular to the ecological, involve dynamic interaction networks. To examine student thinking about networks, we used graphical responses, since they are relatively easy to evaluate for implied but unarticulated assumptions. Senior college-level molecular biology students were presented with simple molecular-level scenarios; surprisingly, most students failed to articulate the basic assumptions needed to generate reasonable graphical representations, and their graphs often contradicted their explicit assumptions. We then developed a tiered Socratic tutorial built around leading questions (prompts) designed to provoke metacognitive reflection. When applied in a group or individual setting, there was clear improvement in the targeted areas. Our results highlight the promise of using graphical responses and Socratic prompts in a tutorial context as both a formative assessment for students and an informative feedback system for instructors. Copyright © 2011 Wiley Periodicals, Inc.

  9. Design review report for rotary mode core sample truck (RMCST) modifications for flammable gas tanks, preliminary design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corbett, J.E.

    1996-02-01

    This report documents the completion of a preliminary design review for the Rotary Mode Core Sample Truck (RMCST) modifications for flammable gas tanks. The RMCST modifications are intended to support core sampling operations in waste tanks requiring flammable gas controls. The objective of this review was to validate basic design assumptions and concepts to support a path forward leading to a final design. The conclusion reached by the review committee was that the design was acceptable and efforts should continue toward a final design review.

  10. A Comprehensive Real-World Distillation Experiment

    ERIC Educational Resources Information Center

    Kazameas, Christos G.; Keller, Kaitlin N.; Luyben, William L.

    2015-01-01

    Most undergraduate mass transfer and separation courses cover the design of distillation columns, and many undergraduate laboratories have distillation experiments. In many cases, the treatment is restricted to simple column configurations and simplifying assumptions are made so as to convey only the basic concepts. In industry, the analysis of a…

  11. Concepts, requirements, and design approaches for building successful planning and scheduling systems

    NASA Technical Reports Server (NTRS)

    Hornstein, Rhoda Shaller; Willoughby, John K.

    1991-01-01

    The traditional practice of systems engineering management assumes that requirements can be precisely determined and unambiguously defined prior to system design and implementation; it further assumes that requirements are held static during implementation. Human-computer decision support systems for service planning and scheduling applications do not conform well to these assumptions, so adaptations to the traditional practice of systems engineering management are required. Basic technology exists to support these adaptations; additional innovations must be encouraged and nurtured. Continued partnership between the programmatic and technical perspectives assures a proper balance of the impossible with the possible. Past problems have the following origins: not recognizing the unusual and perverse nature of the requirements for planning and scheduling; not recognizing the best starting-point assumptions for the design; not understanding the type of system being built; and not understanding the design consequences of the operations concept selected.

  12. Resegregation in Norfolk, Virginia. Does Restoring Neighborhood Schools Work?

    ERIC Educational Resources Information Center

    Meldrum, Christina; Eaton, Susan E.

    This report reviews school department data and interviews with officials and others involved in the Norfolk (Virginia) school resegregation plan designed to stem White flight and increase parental involvement. The report finds that all the basic assumptions the local community and the court had about the potential benefits of undoing the city's…

  13. Performance of species occurrence estimators when basic assumptions are not met: a test using field data where true occupancy status is known

    USGS Publications Warehouse

    Miller, David A. W.; Bailey, Larissa L.; Grant, Evan H. Campbell; McClintock, Brett T.; Weir, Linda A.; Simons, Theodore R.

    2015-01-01

    Our results demonstrate that even small probabilities of misidentification and among-site detection heterogeneity can have severe effects on estimator reliability if ignored. We challenge researchers to place greater attention on both heterogeneity and false positives when designing and analysing occupancy studies. We provide 9 specific recommendations for the design, implementation and analysis of occupancy studies to better meet this challenge.
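
    The misidentification effect described above is easy to reproduce. In this minimal sketch the occupancy, detection, and false-positive rates are assumed for illustration (they are not the study's values): even a 5% per-survey false-positive probability makes the naive "detected at least once" occupancy estimate noticeably exceed true occupancy.

```python
import numpy as np

rng = np.random.default_rng(1)
sites, surveys = 5000, 4
psi, p, fp = 0.4, 0.5, 0.05   # illustrative true occupancy, detection, false-positive rates

occupied = rng.random(sites) < psi
# Detection histories: true detections at occupied sites,
# misidentifications (false positives) at unoccupied sites
det = np.where(occupied[:, None],
               rng.random((sites, surveys)) < p,
               rng.random((sites, surveys)) < fp)

# Naive estimator: a site is "occupied" if it was detected at least once
naive_psi = det.any(axis=1).mean()
```

With these rates the naive estimate comes out near 0.49 against a true occupancy of 0.40, illustrating why the authors urge attention to false positives at the design stage.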

  14. Reds, Greens, Yellows Ease the Spelling Blues.

    ERIC Educational Resources Information Center

    Irwin, Virginia

    1971-01-01

    This document reports on a color-coding innovation designed to improve the spelling ability of high school seniors. This color-coded system is based on two assumptions: that color will appeal to the students and that there are three principal reasons for misspelling. Two groups were chosen for the experiments. A basic list of spelling demons was…

  15. Improving Child Management Practices of Parents and Teachers. Maxi I Practicum. Final Report.

    ERIC Educational Resources Information Center

    Adreani, Arnold J.; McCaffrey, Robert

    The practicum design reported in this document was based on one basic assumption: that adult perceptions of children influence adult behavior toward children, which in turn influences the child's behavior. Therefore, behavior changes by children could best be effected by changing the adult perception of, and behavior toward, the child.…

  16. Model for Developing an In-Service Teacher Workshop To Help Multilingual and Multicultural Students.

    ERIC Educational Resources Information Center

    Kachaturoff, Grace; Romatowski, Jane A.

    This is a model for designing an inservice teacher workshop to assist teachers working with multicultural students. The basic assumption underlying the model is that universities and schools need to work cooperatively to provide experiences for improving the quality of teaching by increasing awareness of educational issues and situations and by…

  17. New Schools for the Cities: Designs for Equality and Excellence. A Working Paper prepared for the Citizens' Crusade Against Poverty.

    ERIC Educational Resources Information Center

    Pressman, Harvey

    This paper outlines several schemes for developing quality private schools for inner city students. The basic assumption justifying the proposal that such schools be independently managed is that the urban public school systems have patently failed to educate poor children. Therefore, a new national network of independent schools should be…

  18. Shielding of substations against direct lightning strokes by shield wires

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chowdhuri, P.

    1994-01-01

    A new analysis for shielding outdoor substations against direct lightning strokes by shield wires is proposed. The basic assumption of the proposed method is that any lightning stroke which penetrates the shields will cause damage. The second assumption is that a certain level of risk of failure must be accepted, such as one or two failures per 100 years. The proposed method, using an electrogeometric model, was applied to design shield wires for two outdoor substations: (1) a 161-kV/69-kV station and (2) a 500-kV/161-kV station. The results of the proposed method were also compared with the shielding data of two other substations.
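
    The electrogeometric approach ties shielding geometry to stroke current through a striking distance. The sketch below uses one commonly cited power-law form; the constants are generic assumptions, not the paper's values.

```python
import math

def striking_distance_m(i_ka: float, k: float = 8.0, exponent: float = 0.65) -> float:
    """Striking distance r = k * I**exponent (r in metres, I in kA).

    k = 8 and exponent = 0.65 are one commonly cited choice in
    electrogeometric shielding models; the paper's exact constants
    are not reproduced here.
    """
    return k * i_ka ** exponent

# Smaller stroke currents have shorter striking distances, so they can slip
# between shield wires; shields are laid out so that only currents too weak
# to damage the insulation can penetrate -- the accepted-risk assumption.
rs_small = striking_distance_m(5.0)    # ≈ 23 m
rs_large = striking_distance_m(20.0)   # ≈ 56 m
```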

  19. Quantitative Methodology: A Guide for Emerging Physical Education and Adapted Physical Education Researchers

    ERIC Educational Resources Information Center

    Haegele, Justin A.; Hodge, Samuel R.

    2015-01-01

    Emerging professionals, particularly senior-level undergraduate and graduate students in kinesiology who have an interest in physical education for individuals with and without disabilities, should understand the basic assumptions of the quantitative research paradigm. Knowledge of basic assumptions is critical for conducting, analyzing, and…

  20. Misleading Theoretical Assumptions in Hypertext/Hypermedia Research.

    ERIC Educational Resources Information Center

    Tergan, Sigmar-Olaf

    1997-01-01

    Reviews basic theoretical assumptions of research on learning with hypertext/hypermedia. Focuses on whether the results of research on hypertext/hypermedia-based learning support these assumptions. Results of empirical studies and theoretical analysis reveal that many research approaches have been misled by inappropriate theoretical assumptions on…

  1. Didactics and History of Mathematics: Knowledge and Self-Knowledge

    ERIC Educational Resources Information Center

    Fried, Michael N.

    2007-01-01

    The basic assumption of this paper is that mathematics and history of mathematics are both forms of knowledge and, therefore, represent different ways of knowing. This was also the basic assumption of Fried (2001) who maintained that these ways of knowing imply different conceptual and methodological commitments, which, in turn, lead to a conflict…

  2. The Discrepancy-Induced Source Comprehension (D-ISC) Model: Basic Assumptions and Preliminary Evidence

    ERIC Educational Resources Information Center

    Braasch, Jason L. G.; Bråten, Ivar

    2017-01-01

    Despite the importance of source attention and evaluation for learning from texts, little is known about the particular conditions that encourage sourcing during reading. In this article, basic assumptions of the discrepancy-induced source comprehension (D-ISC) model are presented, which describes the moment-by-moment cognitive processes that…

  3. Adaptive control: Myths and realities

    NASA Technical Reports Server (NTRS)

    Athans, M.; Valavani, L.

    1984-01-01

    It was found that all currently existing globally stable adaptive algorithms have three basic properties in common: positive realness of the error equation, square-integrability of the parameter adjustment law, and the need for sufficient excitation for asymptotic parameter convergence. Of the three, the first property is of primary importance, since it satisfies a sufficient condition for stability of the overall system, which is a baseline design objective. The second property has been instrumental in the proof of asymptotic error convergence to zero, while the third addresses the issue of parameter convergence. Positive-real error dynamics can be generated only if the relative degree (excess of poles over zeros) of the process to be controlled is known exactly; this, in turn, implies perfect modeling. This and other assumptions, such as the absence of nonminimum-phase plant zeros, on which the mathematical arguments are based, do not necessarily reflect properties of real systems. As a result, it is natural to inquire what happens to the designs under less-than-ideal assumptions. The issues arising from violation of the exact-modeling assumption, which is extremely restrictive in practice and impacts the most important system property, stability, are discussed.
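
    As a concrete illustration of a parameter adjustment law and the excitation requirement, the textbook MIT-rule adaptation of a single feedforward gain (a standard toy problem, not the paper's method) converges when the command signal is persistently exciting:

```python
# MIT-rule gain adaptation for an unknown static plant gain k_plant.
# Goal: theta -> k0 / k_plant so that y = k_plant * theta * uc tracks
# the reference model ym = k0 * uc. A constant nonzero command is
# sufficient excitation for this single parameter.
k_plant, k0 = 2.0, 1.0   # unknown plant gain, reference-model gain
gamma, dt = 0.5, 0.01    # adaptation gain, Euler integration step
theta, uc = 0.0, 1.0     # initial parameter estimate, command signal

for _ in range(5000):                 # simulate 50 time units
    y = k_plant * theta * uc          # plant output
    ym = k0 * uc                      # reference-model output
    e = y - ym                        # tracking error
    theta -= gamma * e * ym * dt      # MIT rule: dtheta/dt = -gamma * e * ym
```

With sufficient excitation theta settles at k0 / k_plant = 0.5; with uc = 0 (no excitation) the parameter would never move, mirroring the third property listed in the abstract.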

  4. Analytics in Online and Offline Language Learning Environments: The Role of Learning Design to Understand Student Online Engagement

    ERIC Educational Resources Information Center

    Rienties, Bart; Lewis, Tim; McFarlane, Ruth; Nguyen, Quan; Toetenel, Lisette

    2018-01-01

    Language education has a rich history of research and scholarship focusing on the effectiveness of learning activities and the impact these have on student behaviour and outcomes. One of the basic assumptions in foreign language pedagogy and CALL in particular is that learners want to be able to communicate effectively with native speakers of…

  5. Take-off and Landing

    DTIC Science & Technology

    1975-01-01

    …calculated based on a low altitude mission profile. 2. GROUND RULES AND BASIC ASSUMPTIONS. Base Design: All aircraft synthesized for this study are… In this study manoeuverability is defined in terms of specific excess power (as shown in Fig. 5) at specified Mach number, altitude, and load…

  6. Knowledge Discovery from Relations

    ERIC Educational Resources Information Center

    Guo, Zhen

    2010-01-01

    A basic and classical assumption in the machine learning research area is the "randomness assumption" (also known as the i.i.d. assumption), which states that data are assumed to be independently and identically generated by some known or unknown distribution. This assumption, which is the foundation of most existing approaches in the literature, simplifies…

  7. RX: a nonimaging concentrator.

    PubMed

    Miñano, J C; Benítez, P; González, J C

    1995-05-01

    A detailed description of the design procedure for a new concentrator, the RX, and some examples of its use are given. The method of design is basically the same as that used in the design of two other concentrators, the RR and the XR [Appl. Opt. 31, 3051 (1992)]. The RX is ideal in two-dimensional geometry. The performance of the rotational RX is good when the average angular spread of the input bundle is small: up to 95% of the power of the input bundle can be transferred to the output bundle (with the assumption of a constant radiance for the rays of the input bundle).

  8. Teaching Critical Literacy across the Curriculum in Multimedia America.

    ERIC Educational Resources Information Center

    Semali, Ladislaus M.

    The teaching of media texts as a form of textual construction is embedded in the assumption that audiences bring individual preexisting dispositions even though the media may contribute to their shaping of basic attitudes, beliefs, values, and behavior. As summed up by D. Lusted, at the core of such textual construction are basic assumptions that…

  9. Refraction effects of atmosphere on geodetic measurements to celestial bodies

    NASA Technical Reports Server (NTRS)

    Joshi, C. S.

    1973-01-01

    The problem of obtaining accurate values of refraction corrections for geodetic measurements of celestial bodies is considered. The basic principles of optics governing the phenomenon of refraction are defined, and differential equations are derived for the refraction corrections. The corrections fall into two main categories: (1) refraction effects due to change in the direction of propagation, and (2) refraction effects mainly due to change in the velocity of propagation. The various assumptions made by earlier investigators are reviewed along with the basic principles of the improved models designed by investigators of the twentieth century. The accuracy problem for the various quantities is discussed, and the conclusions and recommendations are summarized.
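
    For small to moderate zenith angles, the direction-of-propagation correction is often approximated by a two-term tangent series. The sketch below uses typical sea-level constants as an assumption; the improved models discussed in the report refine exactly such coefficients.

```python
import math

def refraction_arcsec(zenith_deg: float, a: float = 58.2, b: float = 0.067) -> float:
    """Classic two-term refraction series R ≈ a·tan(z) − b·tan³(z), in arcseconds.

    Valid away from the horizon; a and b are typical sea-level values
    (assumed here for illustration, not taken from the report).
    """
    t = math.tan(math.radians(zenith_deg))
    return a * t - b * t ** 3

r45 = refraction_arcsec(45.0)   # ≈ 58 arcseconds at 45° zenith distance
r60 = refraction_arcsec(60.0)   # grows rapidly toward the horizon
```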

  10. Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.

    PubMed

    Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian

    2011-01-01

    Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
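
    A minimal sketch of the interval-uncertainty end of this idea, assuming independence between events (the paper's fuzzy-set, evidence-theory, and dependency-coefficient formulations are considerably richer): interval-valued basic-event likelihoods propagated through two-input AND/OR gates.

```python
# Interval arithmetic for two-input FTA gates under independence.
# Event likelihoods are (lower, upper) probability intervals, standing in
# for the hard-to-obtain crisp probabilities mentioned in the abstract.

def and_gate(a, b):
    """P(A and B) for independent interval probabilities a=(lo, hi), b=(lo, hi)."""
    return (a[0] * b[0], a[1] * b[1])

def or_gate(a, b):
    """P(A or B) = 1 - (1-A)(1-B) for independent interval probabilities."""
    return (1 - (1 - a[0]) * (1 - b[0]), 1 - (1 - a[1]) * (1 - b[1]))

pump_fails = (0.01, 0.03)     # illustrative basic-event intervals
valve_fails = (0.02, 0.05)

top = or_gate(pump_fails, valve_fails)   # either failure triggers the top event
```

The top-event likelihood comes out as an interval rather than a point value, carrying the incompleteness of the inputs through to the result.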

  11. Helping Students to Recognize and Evaluate an Assumption in Quantitative Reasoning: A Basic Critical-Thinking Activity with Marbles and Electronic Balance

    ERIC Educational Resources Information Center

    Slisko, Josip; Cruz, Adrian Corona

    2013-01-01

    There is general agreement that critical thinking is an important element of 21st-century skills. Although critical thinking is a very complex and controversial conception, many would accept that the recognition and evaluation of assumptions is a basic critical-thinking process. When students use a simple mathematical model to reason quantitatively…

  12. Conceptual second-generation lunar equipment

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The spring 1990 Introduction to Design class was asked to conceptually design second-generation lunar vehicles and equipment as a semester design project. The basic assumption made in designing second-generation lunar vehicles and equipment was that a network of permanent lunar bases already existed. The designs were to facilitate the transportation of personnel and materials. The eight topics to choose from included flying vehicles, ground-based vehicles, robotic arms, and life support systems. Two teams of two or three members competed on each topic and results were exhibited at a formal presentation. A clean-propellant powered lunar flying transport vehicle, an extra-vehicular activity life support system, a pressurized lunar rover for greater distances, and a robotic arm design project are discussed.

  13. New Beginnings: Ensuring Quality Bilingual/ESL Instruction in New York City Public Schools. Executive Summary [and] Report of the Chancellor's Bilingual/ESL Education Practitioners' Workgroup and Policy/Research Panels.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Office of Bilingual Education.

    The report presents a conceptual framework and related strategies designed to help policymakers and practitioners re-examine, and when necessary, rework the basic assumptions and practices defining the educational experiences of bilingual/English-as-a-Second-Language (ESL) learners in New York City (New York) public schools. The report consists of…

  14. Anthropometric Source Book. Volume 1: Anthropometry for Designers

    DTIC Science & Technology

    1978-07-01

    … diet initiates replacement of the tissue loss incurred in the first day or two of flight. Any further caloric excess or deficit would be superimposed … the Skylab missions, a calorically inadequate basic diet was supplied as a result of the assumption that in-flight requirements were less than those … from one-g to weightlessness conditions or vice versa, any remaining volume changes are probably tissue changes. If a diet is calorically inadequate …

  15. The Robustness of the Studentized Range Statistic to Violations of the Normality and Homogeneity of Variance Assumptions.

    ERIC Educational Resources Information Center

    Ramseyer, Gary C.; Tcheng, Tse-Kia

    The present study was directed at determining the extent to which the Type I Error rate is affected by violations in the basic assumptions of the q statistic. Monte Carlo methods were employed, and a variety of departures from the assumptions were examined. (Author)
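
    The Monte Carlo approach can be sketched as follows, with illustrative parameters rather than the study's actual design: estimate the null distribution of the studentized range q under the standard assumptions, then re-apply the resulting critical value when one group's variance is inflated.

```python
import numpy as np

rng = np.random.default_rng(42)
k, n, reps = 4, 10, 10_000        # groups, per-group n, Monte Carlo replications

def q_stat(groups):
    """Studentized range: (max mean - min mean) / sqrt(MSE / n)."""
    means = groups.mean(axis=1)
    mse = groups.var(axis=1, ddof=1).mean()   # pooled within-group variance
    return (means.max() - means.min()) / np.sqrt(mse / n)

# Empirical 5% critical value under the assumptions (normal, equal variances)
null_q = np.array([q_stat(rng.standard_normal((k, n))) for _ in range(reps)])
crit = np.quantile(null_q, 0.95)

# Re-apply that critical value when one group's SD is tripled
sds = np.array([3.0, 1.0, 1.0, 1.0])[:, None]
het_q = np.array([q_stat(rng.standard_normal((k, n)) * sds) for _ in range(reps)])
type1_error = (het_q > crit).mean()   # empirical Type I error at nominal 5%
```

Comparing `type1_error` with the nominal 0.05 level across different departures is exactly the kind of question the study posed; the parameters above are assumptions for illustration only.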

  16. [Introduction to Exploratory Factor Analysis (EFA)].

    PubMed

    Martínez, Carolina Méndez; Sepúlveda, Martín Alonso Rondón

    2012-03-01

    Exploratory Factor Analysis (EFA) has become one of the most frequently used statistical techniques, especially in the medical and social sciences. Given its popularity, it is essential to understand the basic concepts necessary for its proper application and to take into consideration the main strengths and weaknesses of the technique. The objectives are to present the main applications of the technique in a clear and concise manner, to determine the basic requirements for its use while providing a step-by-step description of its methodology, and to establish the elements that must be taken into account during its preparation so as not to arrive at erroneous results and interpretations. Narrative review. This review identifies the basic concepts and briefly describes the objectives, design, assumptions, and methodology used to achieve factor derivation, global fit evaluation, and adequate interpretation of results. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
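
    The factor-derivation step can be sketched numerically. This uses principal-component extraction with the Kaiser eigenvalue-greater-than-one rule as a simple stand-in for the full EFA workflow described in the review, on simulated data with a known two-factor structure:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate 6 observed variables driven by 2 latent factors (illustrative data)
n = 2000
factors = rng.standard_normal((n, 2))
loadings_true = np.array([[0.8, 0.0], [0.7, 0.0], [0.6, 0.1],
                          [0.0, 0.8], [0.1, 0.7], [0.0, 0.6]])
X = factors @ loadings_true.T + 0.5 * rng.standard_normal((n, 6))

R = np.corrcoef(X, rowvar=False)            # correlation matrix of the items
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]           # sort eigenvalues, largest first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_factors = int((eigvals > 1.0).sum())      # Kaiser criterion: eigenvalue > 1
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
```

On this simulated data the Kaiser rule recovers the two underlying factors; real applications would follow with rotation, fit evaluation, and interpretation, as the review describes.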

  17. [The Basic-Symptom Concept and its Influence on Current International Research on the Prediction of Psychoses].

    PubMed

    Schultze-Lutter, F

    2016-12-01

    The early detection of psychoses has become increasingly relevant in research and clinical practice. Alongside the ultra-high-risk (UHR) approach, which targets an immediate risk of developing frank psychosis, the basic symptom approach, which targets the earliest possible detection of the developing disorder, is being increasingly used worldwide. The present review gives an introduction to the development and basic assumptions of the basic symptom concept, summarizes the results of studies on the specificity of basic symptoms for psychoses in different age groups as well as studies of their psychosis-predictive value, and gives an outlook on future developments. Moreover, a brief introduction is given to recent imaging studies that support one of the main assumptions of the basic symptom concept, i.e., that basic symptoms are the most immediate phenomenological expression of the cerebral aberrations underlying the development of psychosis. From this, it is concluded that basic symptoms may be able to provide important information for future neurobiological research on the etiopathology of psychoses. © Georg Thieme Verlag KG Stuttgart · New York.

  18. Affordable passive solar homes - low-cost, compact designs. [Glossary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowther, R.L.

    1984-01-01

    The designs and plans in this book present total, integrative energy design. They carefully integrate site, architecture, and interior for various population segments while meeting a frugal budget. The book is divided into two sections. The first part gives data concerning design, construction, site, climatic factors, materials, interiors, financing, and other home-ownership factors that enhance affordability. Basic information on the design assumptions and considerations incorporated into the homes is presented, along with descriptions of the passive solar systems. The second part presents designs and plans with a brief review of the considerations that serve defined human living needs, as well as single-family, attached, or multiple residential configurations. The plans are based on a dimensional grid using 4-foot and 2-foot (1.2-meter and 0.61-meter) increments compatible with economical standard lumber and material sizes.

  19. Can Basic Research on Children and Families Be Useful for the Policy Process?

    ERIC Educational Resources Information Center

    Moore, Kristin A.

    Based on the assumption that basic science is the crucial building block for technological and biomedical progress, this paper examines the relevance for public policy of basic demographic and behavioral sciences research on children and families. The characteristics of basic research as they apply to policy making are explored. First, basic…

  20. Shaping the use of psychotropic medicines in nursing homes: A qualitative study on organisational culture.

    PubMed

    Sawan, Mouna; Jeon, Yun-Hee; Chen, Timothy F

    2018-04-01

    Psychotropic medicines have limited efficacy in the management of behavioural and psychological disturbances, yet they are commonly used in nursing homes. Organisational culture is an important consideration influencing the use of psychotropic medicines. Schein's theory elucidates that organisational culture is underpinned by basic assumptions: the taken-for-granted beliefs driving organisational members' behaviour and practices. By exploring the basic assumptions of culture, we are able to find explanations for why psychotropic medicines are prescribed contrary to standards. A qualitative study guided by Schein's theory was conducted using semi-structured interviews with 40 staff representing a broad range of roles from eight nursing homes. Findings from the study suggest two basic assumptions influenced the use of psychotropic medicines: locus of control, and the necessity for efficiency or comprehensiveness. Locus of control pertained to whether staff believed they could control decisions when facing negative work experiences. Necessity for efficiency or comprehensiveness concerned how much time and effort was spent on a given task. Participants arrived at decisions to use psychotropic medicines that were inconsistent with ideal standards when they believed they were helpless to do the right thing by the resident and that it was necessary to restrict time spent on a given task. Basic assumptions tended to provide the rationale for staff to use psychotropic medicines even when doing so was not compatible with standards. Organisational culture is an important factor that should be addressed to optimise psychotropic medicine use. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Artificial Intelligence: Underlying Assumptions and Basic Objectives.

    ERIC Educational Resources Information Center

    Cercone, Nick; McCalla, Gordon

    1984-01-01

    Presents perspectives on methodological assumptions underlying research efforts in artificial intelligence (AI) and charts activities, motivations, methods, and current status of research in each of the major AI subareas: natural language understanding; computer vision; expert systems; search, problem solving, planning; theorem proving and logic…

  2. Teaching Practices: Reexamining Assumptions.

    ERIC Educational Resources Information Center

    Spodek, Bernard, Ed.

    This publication contains eight papers, selected from papers presented at the Bicentennial Conference on Early Childhood Education, that discuss different aspects of teaching practices. The first two chapters reexamine basic assumptions underlying the organization of curriculum experiences for young children. Chapter 3 discusses the need to…

  3. Curvilinear steel elements in load-bearing structures of high-rise building spatial frames

    NASA Astrophysics Data System (ADS)

    Ibragimov, Alexander; Danilov, Alexander

    2018-03-01

    The application of curvilinear elements in load-bearing metal structures of high-rise buildings supposes ensuring of their bearing capacity and serviceability. There may exist a great variety of shapes and orientations of such structural elements. In particular, it may be various flat curves of an open or closed oval profile such as circular or parabolic arch or ellipse. The considered approach implies creating vast internal volumes without loss in the load-bearing capacity of the frame. The basic concept makes possible a wide variety of layout and design solutions. The presence of free internal spaces of large volume in "skyscraper" type buildings contributes to resolving a great number of problems, including those of communicative nature. The calculation results confirm the basic assumptions.

  4. Small Molecule Docking from Theoretical Structural Models

    NASA Astrophysics Data System (ADS)

    Novoa, Eva Maria; de Pouplana, Lluis Ribas; Orozco, Modesto

Structural approaches to rational drug design rely on the basic assumption that pharmacological activity requires, as a necessary but not sufficient condition, the binding of a drug to one or several cellular targets, proteins in most cases. The traditional paradigm assumes that drugs that interact only with a single cellular target are specific and accordingly have few secondary effects, while promiscuous molecules are more likely to generate undesirable side effects. However, current examples indicate that efficient drugs are often able to interact with several biological targets [1], and in fact some dirty drugs, such as chlorpromazine, dextromethorphan, and ibogaine, exhibit desired pharmacological properties [2]. These considerations highlight the tremendous difficulty of designing small molecules that have both satisfactory ADME properties and the ability to interact with a limited set of target proteins with high affinity, while avoiding undesirable interactions with other proteins. In this complex and challenging scenario, computer simulations emerge as the basic tool to guide medicinal chemists during the drug discovery process.

  5. How biological background assumptions influence scientific risk evaluation of stacked genetically modified plants: an analysis of research hypotheses and argumentations.

    PubMed

    Rocca, Elena; Andersen, Fredrik

    2017-08-14

Scientific risk evaluations are constructed from specific evidence, value judgements and biological background assumptions. The latter are the framework-setting suppositions we apply in order to understand some new phenomenon. That background assumptions co-determine choice of methodology, data interpretation, and choice of relevant evidence is an uncontroversial claim in modern basic science. Furthermore, it is commonly accepted that, unless explicated, disagreements in background assumptions can lead to misunderstanding as well as miscommunication. Here, we extend the discussion on background assumptions from basic science to the debate over risk assessment of genetically modified (GM) plants. In this realm, while the different political, social and economic values are often mentioned, the identity and role of the background assumptions at play are rarely examined. We use an example from the debate over risk assessment of stacked genetically modified plants (GM stacks), obtained by applying conventional breeding techniques to GM plants. There are two main regulatory practices for GM stacks: (i) regulate as conventional hybrids and (ii) regulate as new GM plants. We analyzed eight papers representative of these positions and found that, in all cases, additional premises are needed to reach the stated conclusions. We suggest that these premises play the role of biological background assumptions and argue that the most effective way toward a unified framework for risk analysis and regulation of GM stacks is by explicating and examining the biological background assumptions of each position. Once explicated, it is possible either to evaluate which background assumptions best reflect contemporary biological knowledge, or to apply Douglas' 'inductive risk' argument.

  6. Sampling Assumptions in Inductive Generalization

    ERIC Educational Resources Information Center

    Navarro, Daniel J.; Dry, Matthew J.; Lee, Michael D.

    2012-01-01

    Inductive generalization, where people go beyond the data provided, is a basic cognitive capability, and it underpins theoretical accounts of learning, categorization, and decision making. To complete the inductive leap needed for generalization, people must make a key "sampling" assumption about how the available data were generated.…
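The "sampling assumption" this abstract refers to can be illustrated with a minimal sketch. The distinction below between strong and weak sampling is a standard one in Bayesian models of inductive generalization; the function names and toy numbers are illustrative, not taken from the paper:

```python
# Two common sampling assumptions in Bayesian accounts of generalization.

def strong_sampling_likelihood(n_observations, hypothesis_size):
    # Strong sampling: observations are assumed drawn uniformly from the
    # true concept, so each consistent observation contributes a factor
    # 1/|h| ("size principle") -- smaller hypotheses gain likelihood fast.
    return (1.0 / hypothesis_size) ** n_observations

def weak_sampling_likelihood(n_observations, hypothesis_size):
    # Weak sampling: observations are generated independently of the
    # concept, so every consistent hypothesis has the same likelihood.
    return 1.0

# After 3 consistent observations, a small hypothesis (size 10) is favored
# over a large one (size 100) by a factor of 10**3 under strong sampling,
# but the two are indistinguishable under weak sampling.
strong_ratio = (strong_sampling_likelihood(3, 10)
                / strong_sampling_likelihood(3, 100))
weak_ratio = (weak_sampling_likelihood(3, 10)
              / weak_sampling_likelihood(3, 100))
```

The contrast shows why the "key sampling assumption" matters: the same data license very different inductive leaps depending on how the learner assumes the data were generated.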

  7. Application of the Deming management method to equipment-inspection processes.

    PubMed

    Campbell, C A

    1996-01-01

The Biomedical Engineering staff at the Washington Hospital Center has designed an inspection process that optimizes timely completion of scheduled equipment inspections. The method used to revise the process was primarily Deming's, though it also incorporates the re-engineering concept of questioning the basic assumptions around which the original process was designed. This effort involved a review of the existing process in its entirety by task groups made up of representatives from all involved departments. Complete success in all areas has remained elusive. However, the lower variability of inspection completion ratios follows Deming's description of a successfully revised process. Further CQI efforts targeted at specific areas with low completion ratios will decrease this variability even further.

  8. Soft Robotics: New Perspectives for Robot Bodyware and Control

    PubMed Central

    Laschi, Cecilia; Cianchetti, Matteo

    2014-01-01

    The remarkable advances of robotics in the last 50 years, which represent an incredible wealth of knowledge, are based on the fundamental assumption that robots are chains of rigid links. The use of soft materials in robotics, driven not only by new scientific paradigms (biomimetics, morphological computation, and others), but also by many applications (biomedical, service, rescue robots, and many more), is going to overcome these basic assumptions and makes the well-known theories and techniques poorly applicable, opening new perspectives for robot design and control. The current examples of soft robots represent a variety of solutions for actuation and control. Though very first steps, they have the potential for a radical technological change. Soft robotics is not just a new direction of technological development, but a novel approach to robotics, unhinging its fundamentals, with the potential to produce a new generation of robots, in the support of humans in our natural environments. PMID:25022259

  9. Belief Structures about People Held by Selected Graduate Students.

    ERIC Educational Resources Information Center

    Dole, Arthur A.; And Others

    Wrightsman has established that assumptions about human nature distinguish religious, occupational, political, gender, and other groups, and that they predict behavior in structured situations. Hjelle and Ziegler proposed a set of nine basic bipolar assumptions about the nature of people: freedom-determinism; rationality-irrationality;…

  10. Writing Partners: Service Learning as a Route to Authority for Basic Writers

    ERIC Educational Resources Information Center

    Gabor, Catherine

    2009-01-01

    This article looks at best practices in basic writing instruction in terms of non-traditional audiences and writerly authority. Much conventional wisdom discourages participation in service-learning projects for basic writers because of the assumption that their writing is not yet ready to "go public." Countering this line of thinking, the author…

  11. Introduction to the Application of Web-Based Surveys.

    ERIC Educational Resources Information Center

    Timmerman, Annemarie

    This paper discusses some basic assumptions and issues concerning web-based surveys. Discussion includes: assumptions regarding cost and ease of use; disadvantages of web-based surveys, concerning the inability to compensate for four common errors of survey research: coverage error, sampling error, measurement error and nonresponse error; and…

  12. School, Cultural Diversity, Multiculturalism, and Contact

    ERIC Educational Resources Information Center

    Pagani, Camilla; Robustelli, Francesco; Martinelli, Cristina

    2011-01-01

    The basic assumption of this paper is that school's potential to improve cross-cultural relations, as well as interpersonal relations in general, is enormous. This assumption is supported by a number of theoretical considerations and by the analysis of data we obtained from a study we conducted on the attitudes toward diversity and…

  13. Nuclear Reactions in Micro/Nano-Scale Metal Particles

    NASA Astrophysics Data System (ADS)

    Kim, Y. E.

    2013-03-01

Low-energy nuclear reactions in micro/nano-scale metal particles are described based on the theory of Bose-Einstein condensation nuclear fusion (BECNF). The BECNF theory is based on a single basic assumption capable of explaining the observed LENR phenomena: deuterons in metals undergo Bose-Einstein condensation. The BECNF theory is also a quantitative, predictive physical theory. Experimental tests of the basic assumption and of the theoretical predictions are proposed. Potential application to energy generation by ignition at low temperatures is described. The generalized theory of BECNF is used to carry out theoretical analyses of recently reported experimental results for the hydrogen-nickel system.

  14. High Tech Educators Network Evaluation.

    ERIC Educational Resources Information Center

    O'Shea, Dan

    A process evaluation was conducted to assess the High Tech Educators Network's (HTEN's) activities. Four basic components to the evaluation approach were documentation review, program logic model, written survey, and participant interviews. The model mapped the basic goals and objectives, assumptions, activities, outcome expectations, and…

  15. On Cognitive Constraints and Learning Progressions: The Case of "Structure of Matter"

    ERIC Educational Resources Information Center

    Talanquer, Vicente

    2009-01-01

    Based on the analysis of available research on students' alternative conceptions about the particulate nature of matter, we identified basic implicit assumptions that seem to constrain students' ideas and reasoning on this topic at various learning stages. Although many of these assumptions are interrelated, some of them seem to change or…

  16. Rationality as the Basic Assumption in Explaining Japanese (or Any Other) Business Culture.

    ERIC Educational Resources Information Center

    Koike, Shohei

    Economic analysis, with its explicit assumption that people are rational, is applied to the Japanese and American business cultures to illustrate how the approach is useful for understanding cultural differences. Specifically, differences in cooperative behavior among Japanese and American workers are examined. Economic analysis goes beyond simple…

  17. Standardization of Selected Semantic Differential Scales with Secondary School Children.

    ERIC Educational Resources Information Center

    Evans, G. T.

    A basic assumption of this study is that the meaning continuum registered by an adjective pair remains relatively constant over a large universe of concepts and over subjects within a relatively homogeneous population. An attempt was made to validate this assumption by showing the invariance of the factor structure across different types of…

  18. What's Love Got to Do with It? Rethinking Common Sense Assumptions

    ERIC Educational Resources Information Center

    Trachman, Matthew; Bluestone, Cheryl

    2005-01-01

    One of the most basic tasks in introductory social science classes is to get students to reexamine their common sense assumptions concerning human behavior. This article introduces a shared assignment developed for a learning community that paired an introductory sociology and psychology class. The assignment challenges students to rethink the…

  19. Design data needs modular high-temperature gas-cooled reactor. Revision 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1987-03-01

The Design Data Needs (DDNs) provide, for program management, summary statements of the designer's need for experimental data to confirm or validate assumptions made in the design. These assumptions were developed using the Integrated Approach and are tabulated in the Functional Analysis Report. The assumptions were also necessary in the analyses or trade studies (A/TS) used to develop selections of hardware design or design requirements. Each DDN includes statements providing traceability to the function and to the associated assumption that gives rise to the need.

  20. How to Design Buildings, Housing Estates and Towns So That Their Impact On the Environment Will Be Acceptable?

    NASA Astrophysics Data System (ADS)

    Majerska-Pałubicka, Beata

    2017-10-01

Currently, there is a tendency in architecture to search for solutions implementing the assumptions of the sustainable development paradigm. A number of them are components of architecture that will certainly affect urban planning and architecture to a much greater extent in the future. On the one hand, an issue of great significance is the need to integrate sustainable system elements with the spatial structure of environmentally friendly architectural facilities and complexes, and to determine their influence on design solutions as well as on implementation, operation and recycling. On the other hand, it is very important to solve the problem of how to design buildings, housing estates and towns so that their impact on the environment will be acceptable, i.e. will not exceed the capacity of the natural environment to regenerate, and how interdisciplinary design teams can cooperate to reach agreement and acceptance, so as to achieve harmony between the built and natural environment, which is a basis of sustainable development. In this broad interdisciplinary context, increasing importance is being attached to design strategies, systems for evaluating designs and buildings, and tools to support integrated activities in the field of architectural design. These topics are the subject of the research presented in this paper. The basic research aim of the paper is to look for a current method of solving design tasks within the framework of the Integrated Design Process (IDP), using modern design tools and technical possibilities in the context of the sustainable development imperative, including the optimisation of IDP design strategies regarding the assumptions of conscious creation of a sustainable built environment, adjusted to Polish conditions. Examples of Scandinavian housing settlements, sustainable in a broad context, are used as case studies.

  1. Generalizing the Network Scale-Up Method: A New Estimator for the Size of Hidden Populations*

    PubMed Central

    Feehan, Dennis M.; Salganik, Matthew J.

    2018-01-01

    The network scale-up method enables researchers to estimate the size of hidden populations, such as drug injectors and sex workers, using sampled social network data. The basic scale-up estimator offers advantages over other size estimation techniques, but it depends on problematic modeling assumptions. We propose a new generalized scale-up estimator that can be used in settings with non-random social mixing and imperfect awareness about membership in the hidden population. Further, the new estimator can be used when data are collected via complex sample designs and from incomplete sampling frames. However, the generalized scale-up estimator also requires data from two samples: one from the frame population and one from the hidden population. In some situations these data from the hidden population can be collected by adding a small number of questions to already planned studies. For other situations, we develop interpretable adjustment factors that can be applied to the basic scale-up estimator. We conclude with practical recommendations for the design and analysis of future studies. PMID:29375167
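The "basic scale-up estimator" the abstract contrasts with the generalized one can be sketched in a few lines. This is a minimal illustration under idealized assumptions (random mixing, perfect awareness); the function name and toy data are hypothetical, and the paper's generalized estimator adds adjustment factors not shown here:

```python
# Basic network scale-up estimator: N_H ≈ (Σ y_i / Σ d_i) · N_F, where
# y_i = respondent i's reported ties to the hidden population,
# d_i = respondent i's total personal network size, and
# N_F = size of the frame population.

def basic_scale_up(reported_hidden, network_sizes, frame_population):
    total_reports = sum(reported_hidden)   # Σ y_i
    total_degree = sum(network_sizes)      # Σ d_i
    return frame_population * total_reports / total_degree

# Toy survey of 4 respondents from a frame population of 1,000,000:
y = [2, 0, 1, 3]          # ties each respondent reports to the hidden group
d = [100, 150, 120, 130]  # each respondent's personal network size
estimate = basic_scale_up(y, d, frame_population=1_000_000)
```

The intuition is that the fraction of respondents' social ties falling in the hidden population estimates the hidden population's share of the frame population; the modeling assumptions the abstract calls "problematic" enter precisely in treating that fraction as unbiased.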

  2. Intellectualizing Adult Basic Literacy Education: A Case Study

    ERIC Educational Resources Information Center

    Bradbury, Kelly S.

    2012-01-01

    At a time when accusations of American ignorance and anti-intellectualism are ubiquitous, this article challenges problematic assumptions about intellectualism that overlook the work of adult basic literacy programs and proposes an expanded view of intellectualism. It is important to recognize and to challenge narrow views of intellectualism…

  3. Adult Literacy Programs: Guidelines for Effectiveness.

    ERIC Educational Resources Information Center

    Lord, Jerome E.

    This report is a summary of information from both research and experience about the assumptions and practices that guide successful basic skills programs. The 31 guidelines are basic to building a solid foundation on which effective instructional programs for adults can be developed. The first six guidelines address some important characteristics…

  4. Social Studies Curriculum Guidelines.

    ERIC Educational Resources Information Center

    Manson, Gary; And Others

    These guidelines, which set standards for social studies programs K-12, can be used to update existing programs or may serve as a baseline for further innovation. The first section, "A Basic Rationale for Social Studies Education," identifies the theoretical assumptions basic to the guidelines as knowledge, thinking, valuing, social participation,…

  5. Basic lubrication equations

    NASA Technical Reports Server (NTRS)

    Hamrock, B. J.; Dowson, D.

    1981-01-01

Lubricants, usually Newtonian fluids, are assumed to experience laminar flow. The basic equations used to describe the flow are the Navier-Stokes equations of motion. The study of hydrodynamic lubrication is, from a mathematical standpoint, the application of a reduced form of these Navier-Stokes equations in association with the continuity equation. The Reynolds equation can also be derived from first principles, provided of course that the same basic assumptions are adopted in each case. Both methods are used in deriving the Reynolds equation, and the assumptions inherent in reducing the Navier-Stokes equations are specified. Because the Reynolds equation contains viscosity and density terms, and these properties depend on temperature and pressure, it is often necessary to couple the Reynolds equation with the energy equation. The lubricant properties and the energy equation are presented. Film thickness, a parameter of the Reynolds equation, is a function of the elastic behavior of the bearing surface. The governing elasticity equation is therefore also presented.
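For orientation, the Reynolds equation referred to above is commonly written, in the textbook reduction for a steady state with an incompressible, isoviscous lubricant and a single surface moving at speed $u$ in the $x$-direction (a simplified form, not necessarily the paper's exact notation), as:

```latex
\frac{\partial}{\partial x}\!\left(h^{3}\,\frac{\partial p}{\partial x}\right)
+ \frac{\partial}{\partial y}\!\left(h^{3}\,\frac{\partial p}{\partial y}\right)
= 6\,\eta\,u\,\frac{\partial h}{\partial x}
```

where $p$ is the film pressure, $h$ the film thickness, and $\eta$ the viscosity. Restoring the density and viscosity terms inside the derivatives recovers the general compressible, variable-viscosity form that motivates coupling with the energy equation.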

  6. Practical Stereology Applications for the Pathologist.

    PubMed

    Brown, Danielle L

    2017-05-01

    Qualitative histopathology is the gold standard for routine examination of morphological tissue changes in the regulatory or academic environment. The human eye is exceptional for pattern recognition but often cannot detect small changes in quantity. In cases where detection of subtle quantitative changes is critical, more sensitive methods are required. Two-dimensional histomorphometry can provide additional quantitative information and is quite useful in many cases. However, the provided data may not be referent to the entire tissue and, as such, it makes several assumptions, which are sources of bias. In contrast, stereology is design based rather than assumption based and uses stringent sampling methods to obtain accurate and precise 3-dimensional information using geometrical and statistical principles. Recent advances in technology have made stereology more approachable and practical for the pathologist in both regulatory and academic environments. This review introduces pathologists to the basic principles of stereology and walks the reader through some real-world examples for the application of these principles in the workplace.
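The design-based sampling the abstract describes can be illustrated with the Cavalieri method, a standard stereological volume estimator. The sketch below is a hedged illustration; the function name and toy numbers are hypothetical, not from the paper:

```python
# Cavalieri estimator: V ≈ T · (a/p) · ΣP, where
# T   = distance between systematic uniform random sections,
# a/p = area associated with one point of the counting grid, and
# ΣP  = total number of grid points hitting the structure of interest.

def cavalieri_volume(section_spacing, area_per_point, points_per_section):
    return section_spacing * area_per_point * sum(points_per_section)

# Toy example: 5 sections spaced 0.5 mm apart, counted with a grid in
# which each point represents 0.01 mm² of section area.
v_mm3 = cavalieri_volume(
    section_spacing=0.5,            # mm
    area_per_point=0.01,            # mm² per grid point
    points_per_section=[12, 18, 22, 17, 9],
)
```

Because the sections and grid positions are placed by a randomized, systematic design rather than chosen by the observer, the estimate is unbiased without assumptions about the shape of the structure, which is the contrast with assumption-based 2-D histomorphometry drawn in the abstract.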

  7. Scientific analysis is essential to assess biofuel policy effects: in response to the paper by Kim and Dale on "Indirect land use change for biofuels: Testing predictions and improving analytical methodologies"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kline, Keith L; Oladosu, Gbadebo A; Dale, Virginia H

    2011-01-01

Vigorous debate on the effects of biofuels derives largely from the changes in land use estimated using economic models designed mainly for the analysis of agricultural trade and markets. The models referenced for land-use change (LUC) analysis in the U.S. Environmental Protection Agency Final Rule on the Renewable Fuel Standard include GTAP, FAPRI-CARD, and FASOM. To address bioenergy impacts, these models were expanded and modified to facilitate simulations of hypothesized LUC. However, even when models use similar basic assumptions and data, the range of LUC results can vary by ten-fold or more. While the market dynamics simulated in these models include processes that are important in estimating effects of biofuel policies, the models have not been validated for estimating land-use changes and employ crucial assumptions and simplifications that contradict empirical evidence.

  8. [A reflection about organizational culture according to psychoanalysis' view].

    PubMed

    Cardoso, Maria Lúcia Alves Pereira

    2008-01-01

    This article aims at submitting a reflection on the universal presuppositions of human culture proposed by Freud, as a prop for analyzing presuppositions of organizational culture according to Schein. In an article published in 1984, the latter claims that in order to decipher organizational culture one cannot rely upon the (visible) artifacts or to (perceptible) values, but should take a deeper plunge and identify the basic assumptions underlying organizational culture. Such pressupositions spread into the field of sttudy concerning the individual inner self, within the sphere of Psychoanalysis. We have therefore examined Freud's basic assumptions of human culture in order to ascertain its conformity with the paradigms of organizational culture as proposed by Schein.

  9. Thermodynamic Properties of Low-Density {}^{132}Xe Gas in the Temperature Range 165-275 K

    NASA Astrophysics Data System (ADS)

    Akour, Abdulrahman

    2018-01-01

The method of static fluctuation approximation was used to calculate selected thermodynamic properties (internal energy, entropy, heat capacity, and pressure) for xenon in a particularly low temperature range (165-270 K) under different conditions. This integrated microscopic study started from a single basic assumption as its main input: the local field operator is replaced with its mean value. A closed set of nonlinear equations is then solved numerically by an iterative method, taking the Hartree-Fock B2-type dispersion potential as the most appropriate potential for xenon. The results are in very good agreement with those for an ideal gas.

  10. What Is This Substance? What Makes It Different? Mapping Progression in Students' Assumptions about Chemical Identity

    ERIC Educational Resources Information Center

    Ngai, Courtney; Sevian, Hannah; Talanquer, Vicente

    2014-01-01

    Given the diversity of materials in our surroundings, one should expect scientifically literate citizens to have a basic understanding of the core ideas and practices used to analyze chemical substances. In this article, we use the term 'chemical identity' to encapsulate the assumptions, knowledge, and practices upon which chemical…

  11. PCDAS Version 2. 2: Remote network control and data acquisition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fishbaugher, M.J.

    1987-09-01

This manual is intended for both technical and non-technical people who want to use the PCDAS remote network control and data acquisition software. If you are unfamiliar with remote data collection hardware systems designed at Pacific Northwest Laboratory (PNL), this introduction should answer your basic questions. Even if you have some experience with the PNL-designed Field Data Acquisition Systems (FDAS), it would be wise to review this material before attempting to set up a network. This manual was written based on the assumption that you have a rudimentary understanding of personal computer (PC) operations using Disk Operating System (DOS) version 2.0 or greater (IBM 1984). You should know how to create subdirectories and get around the subdirectory tree.

  12. Science Awareness and Science Literacy through the Basic Physics Course: Physics with a bit of Metaphysics?

    NASA Astrophysics Data System (ADS)

    Rusli, Aloysius

    2016-08-01

Until the 1980s, it was well known and common practice in Indonesian Basic Physics courses to present physics through its effective technicalities: the ideally elastic spring, the pulley and moving blocks, the thermodynamics of ideal engine models, theoretical electrostatics and electrodynamics with model capacitors and inductors, and wave behavior and its various superpositions, hopefully closing with a description of modern physics. A different approach was then experimented with, using the Hobson and Moore texts, stressing the alternative aim of fostering awareness, not just mastery, of science and the scientific method. This is hypothesized to be more in line with the changed attitude of the so-called Millennials cohort, who are less attentive, if not less interested, and are more used to multitasking, which suits their shorter span of attention. The upside is increased awareness of science and the scientific method. The downside is that students get less experience of the scientific method, which bases itself intensely on critical observation, analytic thinking to set up conclusions or hypotheses, and checking the consistency of hypotheses with measured data. Another aspect is the recognition that the human person encompasses both the reasoning capacity and the mental-spiritual-cultural capacity. This is considered essential as the world grows ever smaller through increased communication capacity, causing strong interactions and nonlinear effects, and as value systems become more challenging and challenged by physics/science and its cosmology, which is successfully based on the scientific method. Students should therefore be made aware of the common basis of these two capacities: the assumptions, the reasoning capacity, and the consistency assumption. This shows that the limits of science are its set of basic quantifiable assumptions, and the limits of the mental-spiritual-cultural aspects of life are their set of basic metaphysical (non-quantifiable) assumptions. The bridging between these two human aspects of life can lead to a "why" of science and a "meaning" of life. A progress report on these efforts is presented, consisting essentially of the results indicated by an extended format of the usual weekly reporting used previously in Basic Physics lectures.

  13. Genital Measures: Comments on Their Role in Understanding Human Sexuality

    ERIC Educational Resources Information Center

    Geer, James H.

    1976-01-01

    This paper discusses the use of genital measures in the study of both applied and basic work in human sexuality. Some of the advantages of psychophysiological measures are considered along with cautions concerning unwarranted assumptions. Some of the advances that are possible in both applied and basic work are examined. (Author)

  14. An assessment of finite-element modeling techniques for thick-solid/thin-shell joints analysis

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Androlake, S. G.

    1993-01-01

The subject of finite-element modeling has long been of critical importance to the practicing designer/analyst, who is often faced with obtaining an accurate and cost-effective structural analysis of a particular design. Typically, these two goals are in conflict. The purpose here is to discuss the topic of finite-element modeling for solid/shell connections (joints), which is significant for the practicing modeler. Several approaches are currently in use, but various assumptions frequently restrict their applicability. Techniques currently used in practical applications were tested, especially to see which technique is best suited for the computer aided design (CAD) environment. Some basic thoughts regarding each technique are also discussed. As a consequence, some suggestions based on the results are given to help obtain reliable results in geometrically complex joints where the deformation and stress behavior are complicated.

  15. Small low mass advanced PBR's for propulsion

    NASA Astrophysics Data System (ADS)

    Powell, J. R.; Todosow, M.; Ludewig, H.

    1993-10-01

The advanced Particle Bed Reactor (PBR) described in this paper is characterized by relatively low power and low cost, while still maintaining competitive values for thrust/weight, specific impulse, and operating time. In order to retain a competitive thrust/weight ratio while reducing the reactor size, it is necessary to change the basic reactor layout by incorporating new concepts. The new reactor design concept is termed SIRIUS (Small Lightweight Reactor Integral Propulsion System). The following modifications are proposed for the reactor design discussed in this paper: Pre-heater (U-235 included in Moderator); Hy-C (Hydride/De-hydride for Reactor Control); Afterburner (U-235 impregnated into Hot Frit); and Hy-S (Hydride Spike Inside Hot Frit). Each of the modifications is briefly discussed, with benefits, technical issues, design approach, and risk levels addressed. The paper discusses conceptual assumptions, feasibility analysis, mass estimates, and information needs.

  16. 39 Questionable Assumptions in Modern Physics

    NASA Astrophysics Data System (ADS)

    Volk, Greg

    2009-03-01

    The growing body of anomalies in new energy, low energy nuclear reactions, astrophysics, atomic physics, and entanglement, combined with the failure of the Standard Model and string theory to predict many of the most basic fundamental phenomena, all point to a need for major new paradigms. Not Band-Aids, but revolutionary new ways of conceptualizing physics, in the spirit of Thomas Kuhn's The Structure of Scientific Revolutions. This paper identifies a number of long-held, but unproven assumptions currently being challenged by an increasing number of alternative scientists. Two common themes, both with venerable histories, keep recurring in the many alternative theories being proposed: (1) Mach's Principle, and (2) toroidal, vortex particles. Matter-based Mach's Principle differs from both space-based universal frames and observer-based Einsteinian relativity. Toroidal particles, in addition to explaining electron spin and the fundamental constants, satisfy the basic requirement of Gauss's misunderstood B Law, that motion itself circulates. Though a comprehensive theory is beyond the scope of this paper, it will suggest alternatives to the long list of assumptions in context.

  17. Application of the Recreation Opportunity Spectrum for Outdoor Recreation Planning on Army Installations.

    DTIC Science & Technology

    1982-03-01

    to preference types, and uses capacity estimation; therefore, it is basically a good system for recreation and resource inventory and classification...quan- tity, and distribution of recreational resources. Its basic unit of inventory is landform, or the homogeneity of physical features used to...by Clark and Stankey, "the basic assumption underlying the ROS is that quality recreational experiences are best assured by providing a diverse set of

  18. Assessment and Communication for People with Disorders of Consciousness.

    PubMed

    Ortner, Rupert; Allison, Brendan Z; Pichler, Gerald; Heilinger, Alexander; Sabathiel, Nikolaus; Guger, Christoph

    2017-08-01

    In this experiment, we demonstrate a suite of hybrid Brain-Computer Interface (BCI)-based paradigms that are designed for two applications: assessing the level of consciousness of people unable to provide motor response and, in a second stage, establishing a communication channel for these people that enables them to answer questions with either 'yes' or 'no'. The suite of paradigms is designed to test basic responses in the first step and to continue to more comprehensive tasks if the first tests are successful. The latter tasks require more cognitive functions, but they could provide communication, which is not possible with the basic tests. All assessment tests produce accuracy plots that show whether the algorithms were able to detect the patient's brain's response to the given tasks. If the accuracy level is beyond the significance level, we assume that the subject understood the task and was able to follow the sequence of commands presented via earphones to the subject. The tasks require users to concentrate on certain stimuli or to imagine moving either the left or right hand. All tasks are designed around the assumption that the user is unable to use the visual modality, and thus, all stimuli presented to the user (including instructions, cues, and feedback) are auditory or tactile.

  19. Methodological problems in the use of indirect comparisons for evaluating healthcare interventions: survey of published systematic reviews.

    PubMed

    Song, Fujian; Loke, Yoon K; Walsh, Tanya; Glenny, Anne-Marie; Eastwood, Alison J; Altman, Douglas G

    2009-04-03

    To investigate basic assumptions and other methodological problems in the application of indirect comparison in systematic reviews of competing healthcare interventions. Survey of published systematic reviews. Inclusion criteria: systematic reviews published between 2000 and 2007 in which an indirect approach had been explicitly used. Identified reviews were assessed for comprehensiveness of the literature search, method for indirect comparison, and whether assumptions about similarity and consistency were explicitly mentioned. The survey included 88 review reports. In 13 reviews, indirect comparison was informal. Results from different trials were naively compared without using a common control in six reviews. Adjusted indirect comparison was usually done using classic frequentist methods (n=49) or more complex methods (n=18). The key assumption of trial similarity was explicitly mentioned in only 40 of the 88 reviews. The consistency assumption was not explicit in most cases where direct and indirect evidence were compared or combined (18/30). Evidence from head-to-head comparison trials was not systematically searched for, or not included, in nine cases. Identified methodological problems were an unclear understanding of underlying assumptions, inappropriate search and selection of relevant trials, use of inappropriate or flawed methods, lack of objective and validated methods to assess or improve trial similarity, and inadequate comparison or inappropriate combination of direct and indirect evidence. Adequate understanding of the basic assumptions underlying indirect and mixed treatment comparison is crucial to resolving these methodological problems.
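    The "adjusted indirect comparison" referred to in this abstract is commonly attributed to Bucher et al.: treatments A and B, each compared against a common comparator C in separate trials, are compared indirectly while preserving within-trial randomization. A minimal sketch of the calculation (the effect estimates and standard errors below are hypothetical, chosen only for illustration):

    ```python
    import math

    def bucher_indirect(d_ac, se_ac, d_bc, se_bc):
        """Adjusted indirect comparison of A vs B via a common comparator C.

        d_ac, d_bc are trial effect estimates (e.g., log odds ratios) of
        A vs C and B vs C; the indirect A-vs-B effect is their difference,
        and the variances add because the trials are independent.
        """
        d_ab = d_ac - d_bc
        se_ab = math.sqrt(se_ac**2 + se_bc**2)
        return d_ab, se_ab

    # hypothetical inputs: A vs C and B vs C log odds ratios with SEs
    d_ab, se_ab = bucher_indirect(-0.5, 0.2, -0.3, 0.25)
    ```

    Note that the indirect estimate is always less precise than either direct input, which is one reason the similarity and consistency assumptions discussed in the review matter so much.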

  20. The basic aerodynamics of floatation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davies, M.J.; Wood, D.H.

    1983-09-01

    The original derivation of the basic theory governing the aerodynamics of both hovercraft and modern floatation ovens requires the validity of some extremely crude assumptions. However, the basic theory is surprisingly accurate. It is shown that this accuracy occurs because the final expression of the basic theory can be derived by approximating the full Navier-Stokes equations in a manner that clearly shows the limitations of the theory. These limitations are used in discussing the relatively small discrepancies between theory and experiment, which may not be significant for practical purposes.
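    The "basic theory" for hovercraft-type floatation is usually presented as a simple plenum-chamber balance: the cushion pressure supports the weight, and air escapes under the peripheral gap at a velocity given by Bernoulli's equation. A sketch under exactly those crude assumptions (the numbers are hypothetical, and this is the generic textbook plenum model, not necessarily the precise formulation analysed in the paper):

    ```python
    import math

    def plenum_hover(weight, area, gap, perimeter, rho=1.225):
        """Simple plenum-chamber estimate for a hovercraft cushion.

        weight    -- supported weight (N)
        area      -- cushion plan area (m^2)
        gap       -- clearance height under the skirt (m)
        perimeter -- cushion perimeter (m)
        rho       -- air density (kg/m^3)
        """
        p_c = weight / area                # cushion pressure balancing weight (Pa)
        v_e = math.sqrt(2.0 * p_c / rho)   # escape velocity under the skirt (Bernoulli)
        q = v_e * gap * perimeter          # volumetric leakage flow (m^3/s)
        power = p_c * q                    # ideal lift power (W)
        return p_c, v_e, q, power

    # hypothetical small craft: 5000 N on a 10 m^2 cushion, 10 mm gap, 13 m perimeter
    p_c, v_e, q, power = plenum_hover(5000.0, 10.0, 0.01, 13.0)
    ```

    The crude assumptions are visible in the code: uniform cushion pressure, lossless escape flow, and no momentum carried by the supply jet, which is what the paper's Navier-Stokes approximation makes precise.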

  1. 5 CFR 841.502 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... EMPLOYEES RETIREMENT SYSTEM-GENERAL ADMINISTRATION Employee Deductions and Government Contributions § 841... standards (using dynamic assumptions) and expressed as a level percentage of aggregate basic pay. Normal...

  2. Experimental investigation of two-phase heat transfer in a porous matrix.

    NASA Technical Reports Server (NTRS)

    Von Reth, R.; Frost, W.

    1972-01-01

    One-dimensional two-phase flow transpiration cooling through porous metal is studied experimentally. The experimental data are compared with a previous one-dimensional analysis. Good agreement with the calculated temperature distribution is obtained as long as the basic assumptions of the analytical model are satisfied. Deviations from the basic assumptions are caused by nonhomogeneous and oscillating flow conditions. A preliminary derivation of nondimensional parameters which characterize the stable and unstable flow conditions is given. Superheated liquid droplets observed sputtering from the heated surface indicated incomplete evaporation at heat fluxes well in excess of the latent energy transport. A parameter is developed to account for the nonequilibrium thermodynamic effects. Measured and calculated pressure drops show contradictory trends, which are attributed to capillary forces.

  3. An Extension of the Partial Credit Model with an Application to the Measurement of Change.

    ERIC Educational Resources Information Center

    Fischer, Gerhard H.; Ponocny, Ivo

    1994-01-01

    An extension to the partial credit model, the linear partial credit model, is considered under the assumption of a certain linear decomposition of the item x category parameters into basic parameters. A conditional maximum likelihood algorithm for estimating basic parameters is presented and illustrated with simulation and an empirical study. (SLD)
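    The linear decomposition mentioned in this abstract can be made concrete: each item-by-category parameter is written as a weighted sum of a smaller set of basic parameters, β = Wη, where the weight matrix W is fixed by the design. A minimal numeric sketch (the weight matrix and parameter values here are hypothetical, not taken from the study):

    ```python
    import numpy as np

    # Hypothetical design: 2 items x 2 categories give 4 item-by-category
    # parameters, decomposed into just 2 basic parameters eta.
    W = np.array([[1.0, 0.0],
                  [2.0, 0.0],
                  [1.0, 1.0],
                  [2.0, 2.0]])   # known weight matrix (design-determined)
    eta = np.array([0.5, -0.3])  # hypothetical basic parameters

    beta = W @ eta               # reconstructed item-by-category parameters
    ```

    In the measurement-of-change application, rows of W can encode occasions or treatments, so that one basic parameter captures a change effect shared across items; conditional maximum likelihood then estimates η directly rather than each β.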

  4. On the Basis of the Basic Variety.

    ERIC Educational Resources Information Center

    Schwartz, Bonnie D.

    1997-01-01

    Considers the interplay between source and target language in relation to two points made by Klein and Perdue: (1) the argument that the analysis of the target language should not be used as the model for analyzing interlanguage data; and (2) the theoretical claim that under the technical assumptions of minimalism, the Basic Variety is a "perfect"…

  5. The Not So Common Sense: Differences in How People Judge Social and Political Life.

    ERIC Educational Resources Information Center

    Rosenberg, Shawn W.

    This interdisciplinary book challenges two basic assumptions that orient much contemporary social scientific thinking. Offering theory and empirical research, the book rejects the classic liberal view that people share a basic common sense or rationality; while at the same time, it questions the view of contemporary social theory that meaning is…

  6. Engine Development Design Margins Briefing Charts

    NASA Technical Reports Server (NTRS)

    Bentz, Chuck

    2006-01-01

    New engines experience durability problems after entering service. The most prevalent and costly involve the hot section, particularly the high-pressure turbine. The origin of durability problems can be traced back to: 1) the basic aero-mechanical design systems, assumptions, and design margins used by the engine designers, 2) the available materials systems, and 3) to a large extent, aggressive marketing in a highly competitive environment that pushes engine components beyond the demonstrated capability of the basic technology available for the hardware designs. Unfortunately, the engine must be operated in the service environment before the actual thrust loading and the time at maximum-effort take-off conditions, which determine hot section life, become known. Several hundred thousand hours of operational service will be required before the demonstrated reliability of a fleet of engines, or the design deficiencies of the engine hot section parts, can be determined. Also, it may take three to four engine shop visits for heavy maintenance on the gas path hardware to establish cost-effective build standards. Spare parts drive the operator's engine maintenance costs, but spare parts also make a great deal of money for the engine manufacturer during the service life of an engine. Unless competition prevails for follow-on engine buys, there is really no motivation for an OEM to spend internal money to improve parts durability and reduce earnings derived from a lucrative spare parts business. If the hot section life is below design goals or promised values, the OEM might argue that the engine is being operated beyond its basic design intent. On the other hand, the airframer and the operator will continue to remind the OEM that his engine was selected based on a lot of promises to deliver spec thrust with little impact on engine service life if higher thrust is used intermittently. In the end, a standoff prevails and nothing gets fixed.
This briefing will propose ways to hold competing engine manufacturers more accountable for engine hot section design margins during the entire Engine Development process as well as provide tools to assess the design temperature margins in the hot section parts of Service Engines.

  7. Network-level reproduction number and extinction threshold for vector-borne diseases.

    PubMed

    Xue, Ling; Scoglio, Caterina

    2015-06-01

    The basic reproduction number of deterministic models is an essential quantity to predict whether an epidemic will spread or not. Thresholds for disease extinction contribute crucial knowledge of disease control, elimination, and mitigation of infectious diseases. Relationships between basic reproduction numbers of two deterministic network-based ordinary differential equation vector-host models, and extinction thresholds of corresponding stochastic continuous-time Markov chain models are derived under some assumptions. Numerical simulation results for malaria and Rift Valley fever transmission on heterogeneous networks are in agreement with analytical results without any assumptions, reinforcing that the relationships may always exist and proposing a mathematical problem for proving existence of the relationships in general. Moreover, numerical simulations show that the basic reproduction number does not monotonically increase or decrease with the extinction threshold. Consistent trends of extinction probability observed through numerical simulations provide novel insights into mitigation strategies to increase the disease extinction probability. Research findings may improve understandings of thresholds for disease persistence in order to control vector-borne diseases.
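    For deterministic compartmental models of the kind described here, the basic reproduction number is conventionally computed as the spectral radius of the next-generation matrix FV⁻¹, where F collects new-infection rates and V transition/removal rates. A sketch for a single-patch vector-host model (all rate values are hypothetical and are not taken from the paper's network models):

    ```python
    import numpy as np

    def r0_next_generation(F, V):
        """Basic reproduction number: spectral radius of F V^{-1}."""
        ngm = F @ np.linalg.inv(V)
        return float(max(abs(np.linalg.eigvals(ngm))))

    # Hypothetical single-patch vector-host model with two infected compartments
    # (infectious hosts, infectious vectors).
    F = np.array([[0.0, 0.3],    # host infections caused by infectious vectors
                  [0.2, 0.0]])   # vector infections caused by infectious hosts
    V = np.diag([0.1, 0.05])     # host recovery rate, vector mortality rate

    r0 = r0_next_generation(F, V)
    ```

    For this two-compartment structure the spectral radius reduces to the familiar square-root form sqrt(β_hv β_vh / (γ_h μ_v)), reflecting the two-step host-vector-host transmission cycle; the network models in the paper generalize F and V to block matrices over patches.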

  8. Justification for, and design of, an economical programmable multiple flight simulator

    NASA Technical Reports Server (NTRS)

    Kreifeldt, J. G.; Wittenber, J.; Macdonald, G.

    1981-01-01

    The considered research interests in air traffic control (ATC) studies revolve around the concept of distributed ATC management, based on the assumption that the pilot has a cockpit display of traffic and navigation information (CDTI) via CRT graphics. The basic premise is that a CDTI-equipped pilot can, in coordination with a controller, manage part of his local traffic situation, thereby improving important aspects of ATC performance. A modularly designed programmable flight simulator system is prototyped as a means of providing an economical facility of up to eight simulators to interface with a mainframe/graphics system for ATC experimentation, particularly CDTI-distributed management, in which pilot-pilot interaction can have a determining effect on system performance. The need for a multiman simulator facility is predicated on results from an earlier three-simulator facility.

  9. Modelling Framework and Assistive Device for Peripheral Intravenous Injections

    NASA Astrophysics Data System (ADS)

    Kam, Kin F.; Robinson, Martin P.; Gilbert, Mathew A.; Pelah, Adar

    2016-02-01

    Intravenous access for blood sampling or drug administration that requires peripheral venepuncture is perhaps the most common invasive procedure practiced in hospitals, clinics and general practice surgeries. We describe an idealised mathematical framework for modelling the dynamics of the peripheral venepuncture process. Basic assumptions of the model are confirmed through motion analysis of needle trajectories during venepuncture, taken from video recordings of a skilled practitioner injecting into a practice kit. The framework is also applied to the design and construction of a proposed device for accurate needle guidance during venepuncture administration, which was assessed as consistent and repeatable in application and did not lead to over-puncture. The study provides insights into the ubiquitous peripheral venepuncture process and may contribute to applications in training and in the design of new devices, including for use in robotic automation.

  10. The Federal Role and Chapter 1: Rethinking Some Basic Assumptions.

    ERIC Educational Resources Information Center

    Kirst, Michael W.

    In the 20 years since the major Federal program for the disadvantaged began, surprisingly little has changed from its original vision. It is now time to question some of the basic policies of Chapter 1 of the Education Consolidation and Improvement Act in view of the change in conceptions about the Federal role and the recent state and local…

  11. Achieving Successful Employment Outcomes with the Use of Assistive Technology. Report from the Study Group, Institute on Rehabilitation Issues (24th, Washington, DC, May 1998).

    ERIC Educational Resources Information Center

    Radtke, Jean, Ed.

    Developed as a result of an institute on rehabilitation issues, this document is a guide to assistive technology as it affects successful competitive employment outcomes for people with disabilities. Chapter 1 offers basic information on assistive technology including basic assumptions, service provider approaches, options for technology…

  12. 5 CFR 842.702 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... for valuation of the System, based on dynamic assumptions. The present value factors are unisex... EMPLOYEES RETIREMENT SYSTEM-BASIC ANNUITY Alternative Forms of Annuities § 842.702 Definitions. In this...

  13. Applying Human Factors Principles to Mitigate Usability Issues Related to Embedded Assumptions in Health Information Technology Design

    PubMed Central

    Lowry, Svetlana Z; Patterson, Emily S

    2014-01-01

    Background There is growing recognition that design flaws in health information technology (HIT) lead to increased cognitive work, impact workflows, and produce other undesirable user experiences that contribute to usability issues and, in some cases, patient harm. These usability issues may in turn contribute to HIT utilization disparities and patient safety concerns, particularly among “non-typical” HIT users and their health care providers. Health care disparities are associated with poor health outcomes, premature death, and increased health care costs. HIT has the potential to reduce these disparate outcomes. In the computer science field, it has long been recognized that embedded cultural assumptions can reduce the usability, usefulness, and safety of HIT systems for populations whose characteristics differ from “stereotypical” users. Among these non-typical users, inappropriate embedded design assumptions may contribute to health care disparities. It is unclear how to address potentially inappropriate embedded HIT design assumptions once detected. Objective The objective of this paper is to explain HIT universal design principles derived from the human factors engineering literature that can help to overcome potential usability and/or patient safety issues that are associated with unrecognized, embedded assumptions about cultural groups when designing HIT systems. Methods Existing best practices, guidance, and standards in software usability and accessibility were subjected to a 5-step expert review process to identify and summarize those best practices, guidance, and standards that could help identify and/or address embedded design assumptions in HIT that could negatively impact patient safety, particularly for non-majority HIT user populations. An iterative consensus-based process was then used to derive evidence-based design principles from the data to address potentially inappropriate embedded cultural assumptions. 
Results Design principles that may help identify and address embedded HIT design assumptions are available in the existing literature. Conclusions Evidence-based HIT design principles derived from existing human factors and informatics literature can help HIT developers identify and address embedded cultural assumptions that may underlie HIT-associated usability and patient safety concerns as well as health care disparities. PMID:27025349

  14. Applying Human Factors Principles to Mitigate Usability Issues Related to Embedded Assumptions in Health Information Technology Design.

    PubMed

    Gibbons, Michael C; Lowry, Svetlana Z; Patterson, Emily S

    2014-12-18

    There is growing recognition that design flaws in health information technology (HIT) lead to increased cognitive work, impact workflows, and produce other undesirable user experiences that contribute to usability issues and, in some cases, patient harm. These usability issues may in turn contribute to HIT utilization disparities and patient safety concerns, particularly among "non-typical" HIT users and their health care providers. Health care disparities are associated with poor health outcomes, premature death, and increased health care costs. HIT has the potential to reduce these disparate outcomes. In the computer science field, it has long been recognized that embedded cultural assumptions can reduce the usability, usefulness, and safety of HIT systems for populations whose characteristics differ from "stereotypical" users. Among these non-typical users, inappropriate embedded design assumptions may contribute to health care disparities. It is unclear how to address potentially inappropriate embedded HIT design assumptions once detected. The objective of this paper is to explain HIT universal design principles derived from the human factors engineering literature that can help to overcome potential usability and/or patient safety issues that are associated with unrecognized, embedded assumptions about cultural groups when designing HIT systems. Existing best practices, guidance, and standards in software usability and accessibility were subjected to a 5-step expert review process to identify and summarize those best practices, guidance, and standards that could help identify and/or address embedded design assumptions in HIT that could negatively impact patient safety, particularly for non-majority HIT user populations. An iterative consensus-based process was then used to derive evidence-based design principles from the data to address potentially inappropriate embedded cultural assumptions. 
Design principles that may help identify and address embedded HIT design assumptions are available in the existing literature. Evidence-based HIT design principles derived from existing human factors and informatics literature can help HIT developers identify and address embedded cultural assumptions that may underlie HIT-associated usability and patient safety concerns as well as health care disparities.

  15. Steady-state heat conduction in quiescent fluids: Incompleteness of the Navier-Stokes-Fourier equations

    NASA Astrophysics Data System (ADS)

    Brenner, Howard

    2011-10-01

    Linear irreversible thermodynamic principles are used to demonstrate, by counterexample, the existence of a fundamental incompleteness in the basic pre-constitutive mass, momentum, and energy equations governing fluid mechanics and transport phenomena in continua. The demonstration is effected by addressing the elementary case of steady-state heat conduction (and transport processes in general) occurring in quiescent fluids. The counterexample questions the universal assumption of equality of the four physically different velocities entering into the basic pre-constitutive mass, momentum, and energy conservation equations. Explicitly, it is argued that such equality is an implicit constitutive assumption rather than an established empirical fact of unquestioned authority. Such equality, if indeed true, would require formal proof of its validity, currently absent from the literature. In fact, our counterexample shows the assumption of equality to be false. As the current set of pre-constitutive conservation equations appearing in textbooks are regarded as applicable both to continua and noncontinua (e.g., rarefied gases), our elementary counterexample negating belief in the equality of all four velocities impacts on all aspects of fluid mechanics and transport processes, continua and noncontinua alike.

  16. The effect of errors in the assignment of the transmission functions on the accuracy of the thermal sounding of the atmosphere

    NASA Technical Reports Server (NTRS)

    Timofeyev, Y. M.

    1979-01-01

    To test the error introduced by assumed values of the transmission function for Soviet and American radiometers sounding the atmosphere thermally from orbiting satellites, the assumptions of the transmission calculation are varied with respect to atmospheric CO2 content, transmission frequency, and atmospheric absorption. The error arising from variations of these assumptions from the standard basic model is calculated.

  17. Student Services: Programs and Functions. A Report on the Administration of Selected Student and Campus Services of the University of Illinois at Chicago Circle. Part 1 and 2.

    ERIC Educational Resources Information Center

    Bentz, Robert P.; And Others

    The commuter institute is one to which students commute. The two basic assumptions of this study are: (1) the Chicago Circle campus of the University of Illinois will remain a commuter institution during the decade ahead; and (2) the campus will increasingly serve a more heterogeneous student body. These assumptions have important implications for…

  18. The Concept and Control Capabilities of Universal Electric Vehicle Prototype using LabView Software

    NASA Astrophysics Data System (ADS)

    Skowronek, Hubert; Waszczuk, Kamil; Kowalski, Maciej; Karolczak, Paweł; Baral, Bivek

    2016-10-01

    This article presents the drive-control concept for a prototype electric car intended for sale in developing-country markets, mainly in South Asia. The basic requirements for this type of vehicle and the possibility of rapidly prototyping onboard equipment for preliminary tests are presented. The control system consists of a PC and a myRIO measurement card and has two operating modes. The first mode simulates changes in each component's parameters and checks that the program functions properly; in the second mode, the real object is controlled instead of the simulation.

  19. Satellite servicing mission preliminary cost estimation model

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The cost model presented is a preliminary methodology for determining a rough order-of-magnitude cost for implementing a satellite servicing mission. Mission implementation, in this context, encompasses all activities associated with mission design and planning, including both flight and ground crew training and systems integration (payload processing) of servicing hardware with the Shuttle. A basic assumption made in developing this cost model is that a generic set of servicing hardware was developed and flight tested, is inventoried, and is maintained by NASA. This implies that all hardware physical and functional interfaces are well known and therefore recurring CITE testing is not required. The development of the cost model algorithms and examples of their use are discussed.

  20. The Culture-Transmission Motive in Immigrants: A World-Wide Internet Survey

    PubMed Central

    Mchitarjan, Irina; Reisenzein, Rainer

    2015-01-01

    A world-wide internet survey was conducted to test central assumptions of a recent theory of cultural transmission in minorities proposed by the authors. 844 1st to 2nd generation immigrants from a wide variety of countries recruited on a microjob platform completed a questionnaire designed to test eight hypotheses derived from the theory. Support was obtained for all hypotheses. In particular, evidence was obtained for the continued presence, in the immigrants, of the culture-transmission motive postulated by the theory: the desire to maintain the culture of origin and transmit it to the next generation. Support was also obtained for the hypothesized anchoring of the culture-transmission motive in more basic motives fulfilled by cultural groups, the relative intra- and intergenerational stability of the culture-transmission motive, and its motivating effects for action tendencies and desires that support cultural transmission under the difficult conditions of migration. Furthermore, the findings suggest that the assumption that people have a culture-transmission motive belongs to the folk psychology of sociocultural groups, and that immigrants regard the fulfillment of this desire as a moral right. PMID:26529599

  1. The Culture-Transmission Motive in Immigrants: A World-Wide Internet Survey.

    PubMed

    Mchitarjan, Irina; Reisenzein, Rainer

    2015-01-01

    A world-wide internet survey was conducted to test central assumptions of a recent theory of cultural transmission in minorities proposed by the authors. 844 1st to 2nd generation immigrants from a wide variety of countries recruited on a microjob platform completed a questionnaire designed to test eight hypotheses derived from the theory. Support was obtained for all hypotheses. In particular, evidence was obtained for the continued presence, in the immigrants, of the culture-transmission motive postulated by the theory: the desire to maintain the culture of origin and transmit it to the next generation. Support was also obtained for the hypothesized anchoring of the culture-transmission motive in more basic motives fulfilled by cultural groups, the relative intra- and intergenerational stability of the culture-transmission motive, and its motivating effects for action tendencies and desires that support cultural transmission under the difficult conditions of migration. Furthermore, the findings suggest that the assumption that people have a culture-transmission motive belongs to the folk psychology of sociocultural groups, and that immigrants regard the fulfillment of this desire as a moral right.

  2. Inexperience and risky decisions of young adolescents, as pedestrians and cyclists, in interactions with lorries, and the effects of competency versus awareness education.

    PubMed

    Twisk, Divera; Vlakveld, Willem; Mesken, Jolieke; Shope, Jean T; Kok, Gerjo

    2013-06-01

    Road injuries are a prime cause of death in early adolescence. Often road safety education (RSE) is used to target risky road behaviour in this age group. These RSE programmes are frequently based on the assumption that deliberate risk taking rather than lack of competency underlies risk behaviour. This study tested the competency of 10-13 year olds, by examining their decisions - as pedestrians and cyclists - in dealing with blind spot areas around lorries. Also, the effects of an awareness programme and a competency programme on these decisions were evaluated. Table-top models were used, representing seven scenarios that differed in complexity: one basic scenario to test the identification of blind spot areas, and 6 traffic scenarios to test behaviour in traffic situations of low or high task complexity. Using a quasi-experimental design (pre-test and post-test reference group design without randomization), the programme effects were assessed by requiring participants (n=62) to show, for each table-top traffic scenario, how they would act if they were in that traffic situation. On the basic scenario, at pre-test 42% of the youngsters identified all blind spots correctly, but only 27% showed safe behaviour in simple scenarios and 5% in complex scenarios. The competency programme yielded improved performance on the basic scenario but not on the traffic scenarios, whereas the awareness programme did not result in any improvements. The correlation between improvements on the basic scenarios and the traffic scenarios was not significant. Young adolescents have not yet mastered the necessary skills for safe performance in simple and complex traffic situations, thus underlining the need for effective prevention programmes. RSE may improve the understanding of blind spot areas but this does not 'automatically' transfer to performance in traffic situations. Implications for the design of RSE are discussed.

  3. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
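    The classic version of this problem turns entirely on the conditioning scenario, which a direct enumeration makes explicit. Assuming equally likely sexes and birth orders, learning "at least one child is a girl" gives a different answer than learning "the older child is a girl":

    ```python
    from itertools import product

    # All equally likely two-child families, ordered (older, younger)
    families = list(product("BG", repeat=2))

    # Scenario 1: we learn only that at least one child is a girl
    at_least_one_girl = [f for f in families if "G" in f]
    p1 = sum(f == ("G", "G") for f in at_least_one_girl) / len(at_least_one_girl)

    # Scenario 2: we learn specifically that the older child is a girl
    older_is_girl = [f for f in families if f[0] == "G"]
    p2 = sum(f == ("G", "G") for f in older_is_girl) / len(older_is_girl)
    ```

    The enumeration yields 1/3 for the first scenario and 1/2 for the second: the same question about "both girls" receives different answers because the two pieces of information rule out different sets of equally likely families.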

  4. Kronos Observatory Operations Challenges in a Lean Environment

    NASA Astrophysics Data System (ADS)

    Koratkar, Anuradha; Peterson, Bradley M.; Polidan, Ronald S.

    2003-02-01

    Kronos is a multiwavelength observatory designed to map the accretion disks and environments of supermassive black holes in various environments using the natural intrinsic variability of accretion-driven sources. Kronos is envisaged as a Medium Explorer mission to the NASA Office of Space Science under the Structure and Evolution of the Universe theme. We will achieve the Kronos science objectives by developing cost-effective techniques for obtaining data from the research spacecraft and assimilating it in subsequent work on the ground. The science operations assumptions for the mission are: (1) need for flexible scheduling due to the variable nature of targets, (2) large data volumes but minimal ground station contact, and (3) a very small staff for operations. Our first assumption implies that we will have to consider an effective strategy to dynamically reprioritize the observing schedule to maximize science data acquisition. The flexibility we seek greatly increases the science return of the mission, because variability events can be properly captured. Our second assumption implies that we will have to develop some basic on-board analysis strategies to determine which data get downloaded. The small size of the operations staff implies that we need to "automate" as many routine processes of science operations as possible. In this paper we will discuss the various solutions that we are considering to optimize our operations and maximize science returns on the observatory.

  5. Study Quality in SLA: A Cumulative and Developmental Assessment of Designs, Analyses, Reporting Practices, and Outcomes in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2011-01-01

    I began this study with two assumptions. Assumption 1: Study quality matters. If the means by which researchers design, carry out, and report on their studies lack in rigor or transparency, theory and practice are likely to be misguided or at least decelerated. Assumption 2 is an implication of Assumption 1: Quality should be measured rather than…

  6. Baroclinic instability with variable gravity: A perturbation analysis

    NASA Technical Reports Server (NTRS)

    Giere, A. C.; Fowliss, W. W.; Arias, S.

    1980-01-01

    Solutions for a quasigeostrophic baroclinic stability problem in which gravity is a function of height were obtained. Curvature and horizontal shear of the basic state flow were omitted and the vertical and horizontal temperature gradients of the basic state were taken as constant. The effect of a variable dielectric body force, analogous to gravity, on baroclinic instability for the design of a spherical, baroclinic model for Spacelab was determined. Such modeling could not be performed in a laboratory on the Earth's surface because the body force could not be made strong enough to dominate terrestrial gravity. A consequence of the body force variation and the preceding assumptions was that the potential vorticity gradient of the basic state vanished. The problem was solved using a perturbation method. The solution gives results which are qualitatively similar to Eady's results for constant gravity; a short wavelength cutoff and a wavelength of maximum growth rate were observed. The averaged values of the basic state indicate that both the wavelength range of the instability and the growth rate at maximum instability are increased. Results indicate that the presence of the variable body force will not significantly alter the dynamics of the Spacelab experiment. The solutions are also relevant to other geophysical fluid flows where gravity is constant but the static stability or Brunt-Vaisala frequency is a function of height.

  7. Monitored Geologic Repository Project Description Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. M. Curry

    2001-01-30

    The primary objective of the Monitored Geologic Repository Project Description Document (PDD) is to allocate the functions, requirements, and assumptions to the systems at Level 5 of the Civilian Radioactive Waste Management System (CRWMS) architecture identified in Section 4. It provides traceability of the requirements to those contained in Section 3 of the ''Monitored Geologic Repository Requirements Document'' (MGR RD) (YMP 2000a) and other higher-level requirements documents. In addition, the PDD allocates design related assumptions to work products of non-design organizations. The document provides Monitored Geologic Repository (MGR) technical requirements in support of design and performance assessment in preparing for the Site Recommendation (SR) and License Application (LA) milestones. The technical requirements documented in the PDD are to be captured in the System Description Documents (SDDs) which address each of the systems at Level 5 of the CRWMS architecture. The design engineers obtain the technical requirements from the SDDs and by reference from the SDDs to the PDD. The design organizations and other organizations will obtain design related assumptions directly from the PDD. These organizations may establish additional assumptions for their individual activities, but such assumptions are not to conflict with the assumptions in the PDD. The PDD will serve as the primary link between the technical requirements captured in the SDDs and the design requirements captured in US Department of Energy (DOE) documents. The approved PDD is placed under Level 3 baseline control by the CRWMS Management and Operating Contractor (M and O) and the following portions of the PDD constitute the Technical Design Baseline for the MGR: the design characteristics listed in Table 1-1, the MGR Architecture (Section 4.1), the Technical Requirements (Section 5), and the Controlled Project Assumptions (Section 6).

  8. The Eleventh Quadrennial Review of Military Compensation. Supporting Research Papers

    DTIC Science & Technology

    2012-06-01

    value. 4. BAH + BAS is roughly equal to expenditures for housing and food for servicemembers. In the first phase of the formal model, we further...assume that taxes, housing, and food are the only basic living expenses. Then, in the next phase, we include estimates of noncash benefits not included...assumption 4 with assumption 2 implies that civilian housing and food expenses are also equal to military BAH and BAS. However, civilian housing and food

  9. Holographic aids for internal combustion engine flow studies

    NASA Technical Reports Server (NTRS)

    Regan, C.

    1984-01-01

    Worldwide interest in improving the fuel efficiency of internal combustion (I.C.) engines has sparked research efforts designed to learn more about the flow processes of these engines. The flow fields must be understood prior to fuel injection in order to design efficient valves, piston geometries, and fuel injectors. Knowledge of the flow field is also necessary to determine the heat transfer to combustion chamber surfaces. Computational codes can predict velocity and turbulence patterns, but experimental verification is mandatory to justify their basic assumptions. Due to their nonintrusive nature, optical methods are ideally suited to provide the necessary velocity verification data. Optical systems such as Schlieren photography, laser velocimetry, and illuminated particle visualization are used in I.C. engines, and now their versatility is improved by employing holography. These holographically enhanced optical techniques are described with emphasis on their applications in I.C. engines.

  10. Riddles of masculinity: gender, bisexuality, and thirdness.

    PubMed

    Fogel, Gerald I

    2006-01-01

    Clinical examples are used to illuminate several riddles of masculinity-ambiguities, enigmas, and paradoxes in relation to gender, bisexuality, and thirdness-frequently seen in male patients. Basic psychoanalytic assumptions about male psychology are examined in the light of advances in female psychology, using ideas from feminist and gender studies as well as important and now widely accepted trends in contemporary psychoanalytic theory. By reexamining basic assumptions about heterosexual men, as has been done with ideas concerning women and homosexual men, complexity and nuance come to the fore to aid the clinician in treating the complex characterological pictures seen in men today. In a context of rapid historical and theoretical change, the use of persistent gender stereotypes and unnecessarily limiting theoretical formulations, though often unintended, may mask subtle countertransference and theoretical blind spots, and limit optimal clinical effectiveness.

  11. Costing interventions in primary care.

    PubMed

    Kernick, D

    2000-02-01

    Against a background of increasing demands on limited resources, studies that relate benefits of health interventions to the resources they consume will be an important part of any decision-making process in primary care, and an accurate assessment of costs will be an important part of any economic evaluation. Although there is no such thing as a gold standard cost estimate, there are a number of basic costing concepts that underlie any costing study. How costs are derived and combined will depend on the assumptions that have been made in their derivation. It is important to be clear what assumptions have been made and why in order to maintain consistency across comparative studies and prevent inappropriate conclusions being drawn. This paper outlines some costing concepts and principles to enable primary care practitioners and researchers to have a basic understanding of costing exercises and their pitfalls.

  12. Interpretation of the results of statistical measurements. [search for basic probability model

    NASA Technical Reports Server (NTRS)

    Olshevskiy, V. V.

    1973-01-01

    For random processes, the calculated probability characteristic and the measured statistical estimate are used in a quality functional that defines the difference between the two functions. Based on the assumption that the statistical measurement procedure is organized so that the parameters for a selected model are optimized, it is shown that the interpretation of experimental research is a search for a basic probability model.

  13. The Disk Instability Model for SU UMa systems - a Comparison of the Thermal-Tidal Model and Plain Vanilla Model

    NASA Astrophysics Data System (ADS)

    Cannizzo, John K.

    2017-01-01

    We utilize the time dependent accretion disk model described by Ichikawa & Osaki (1992) to explore two basic ideas for the outbursts in the SU UMa systems, Osaki's Thermal-Tidal Model, and the basic accretion disk limit cycle model. We explore a range in possible input parameters and model assumptions to delineate under what conditions each model may be preferred.

  14. Behavioral health at-risk contracting--a rate development and financial reporting guide.

    PubMed

    Zinser, G R

    1994-01-01

    The process of developing rates for behavioral capitation contracts can seem mysterious and intimidating. The following article explains several key features of the method used to develop capitation rates. These include: (1) a basic understanding of the mechanics of rate calculation; (2) awareness of the variables to be considered and assumptions to be made; (3) a source of information to use as a basis for these assumptions; and (4) a system to collect detailed actual experience data.

  15. An Examination of Brazil and the United States as Potential Partners in a Joint Supersonic Military Fighter Aircraft Codevelopment and Production Program.

    DTIC Science & Technology

    1986-09-01

    Brazilian-American Chamber of Commerce Mr. Frank J. Devine, Executive Director Embraer, Empresa Brasileira De Aeronautica Mr. Salo Roth Vice President...Throughout this study the following assumptions have been made. First, it is assumed that the reader has a basic familiarity with aircraft. Therefore...of the weapons acquisition process. Third, the assumption is made that most readers are familiar with U.S. procedures involving the sale of

  16. Quasi-experimental study designs series-paper 7: assessing the assumptions.

    PubMed

    Bärnighausen, Till; Oldenburg, Catherine; Tugwell, Peter; Bommer, Christian; Ebert, Cara; Barreto, Mauricio; Djimeu, Eric; Haber, Noah; Waddington, Hugh; Rockers, Peter; Sianesi, Barbara; Bor, Jacob; Fink, Günther; Valentine, Jeffrey; Tanner, Jeffrey; Stanley, Tom; Sierra, Eduardo; Tchetgen, Eric Tchetgen; Atun, Rifat; Vollmer, Sebastian

    2017-09-01

    Quasi-experimental designs are gaining popularity in epidemiology and health systems research-in particular for the evaluation of health care practice, programs, and policy-because they allow strong causal inferences without randomized controlled experiments. We describe the concepts underlying five important quasi-experimental designs: Instrumental Variables, Regression Discontinuity, Interrupted Time Series, Fixed Effects, and Difference-in-Differences designs. We illustrate each of the designs with an example from health research. We then describe the assumptions required for each of the designs to ensure valid causal inference and discuss the tests available to examine the assumptions.
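
    The identifying assumption behind one of these designs, Difference-in-Differences, can be made concrete with a minimal sketch (hypothetical numbers, not from any study in the series):

```python
# Minimal Difference-in-Differences sketch. The identifying assumption is
# parallel trends: absent treatment, the treated group's outcome would have
# moved by the same amount as the control group's.
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical example: treated clinics improve from 50 to 70 while
# untreated clinics drift from 48 to 53; the 5-point drift is netted out.
effect = diff_in_diff(50, 70, 48, 53)   # (70 - 50) - (53 - 48)
```

    The subtraction of the control-group change is what turns a before/after comparison into a causal estimate, and it is valid only insofar as the parallel-trends assumption holds.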

  17. Integrating the dimensions of sex and gender into basic life sciences research: methodologic and ethical issues.

    PubMed

    Holdcroft, Anita

    2007-01-01

    The research process -- from study design and selecting a species and its husbandry, through the experiment, analysis, peer review, and publication -- is rarely subject to questions about sex or gender differences in mainstream life sciences research. However, the impact of sex and gender on these processes is important in explaining biological variations and presentation of symptoms and diseases. This review aims to challenge assumptions and to develop opportunities to mainstream sex and gender in basic scientific research. Questions about the mechanisms of sex and gender effects were reviewed in relation to biological, environmental, social, and psychological interactions. Gender variations, in respect to aging, socializing, and reproduction, that are present in human populations but are rarely featured in laboratory research were considered to more effectively translate animal research into clinical health care. Methodologic approaches to address the present lack of a gender dimension in research include actively reducing variations through attention to physical factors, biological rhythms, and experimental design. In addition, through genomic and acute nongenomic activity, hormones may compound effects through multiple small sex differences that occur during the course of an acute pathologic event. Furthermore, the many exogenous sex steroid hormones and their congeners used in medicine (eg, in contraception and cancer therapies) may add to these effects. The studies reviewed provide evidence that sex and gender are determinants of many outcomes in life science research. To embed the gender dimension into basic scientific research, a broad approach -- gender mainstreaming -- is warranted. One example is the use of review boards (eg, animal ethical review boards and journal peer-review boards) in which gender-related standardized questions can be asked about study design and analysis. 
A more fundamental approach is to question the relevance of present-day laboratory models to design methods to best represent the age-related changes, comorbidity, and variations experienced by each sex in clinical medicine.

  18. A simplified rotor system mathematical model for piloted flight dynamics simulation

    NASA Technical Reports Server (NTRS)

    Chen, R. T. N.

    1979-01-01

    The model was developed for real-time pilot-in-the-loop investigation of helicopter flying qualities. The mathematical model included the tip-path plane dynamics and several primary rotor design parameters, such as flapping hinge restraint, flapping hinge offset, blade Lock number, and pitch-flap coupling. The model was used in several exploratory studies of the flying qualities of helicopters with a variety of rotor systems. The basic assumptions used and the major steps involved in the development of the set of equations listed are described. The equations consisted of the tip-path plane dynamic equation, the equations for the main rotor forces and moments, and the equation for control phasing required to achieve decoupling in pitch and roll due to cyclic inputs.

  19. A strategy for detecting the conservation of folding-nucleus residues in protein superfamilies.

    PubMed

    Michnick, S W; Shakhnovich, E

    1998-01-01

    Nucleation-growth theory predicts that fast-folding peptide sequences fold to their native structure via structures in a transition-state ensemble that share a small number of native contacts (the folding nucleus). Experimental and theoretical studies of proteins suggest that residues participating in folding nuclei are conserved among homologs. We attempted to determine if this is true in proteins with highly diverged sequences but identical folds (superfamilies). We describe a strategy based on comparisons of residue conservation in natural superfamily sequences with simulated sequences (generated with a Monte-Carlo sequence design strategy) for the same proteins. The basic assumptions of the strategy were that natural sequences will conserve residues needed for folding and stability plus function, the simulated sequences contain no functional conservation, and nucleus residues make native contacts with each other. Based on these assumptions, we identified seven potential nucleus residues in ubiquitin superfamily members. Non-nucleus conserved residues were also identified; these are proposed to be involved in stabilizing native interactions. We found that all superfamily members conserved the same potential nucleus residue positions, except those for which the structural topology is significantly different. Our results suggest that the conservation of the nucleus of a specific fold can be predicted by comparing designed simulated sequences with natural highly diverged sequences that fold to the same structure. We suggest that such a strategy could be used to help plan protein folding and design experiments, to identify new superfamily members, and to subdivide superfamilies further into classes having a similar folding mechanism.

  20. The Central Registry for Child Abuse Cases: Rethinking Basic Assumptions

    ERIC Educational Resources Information Center

    Whiting, Leila

    1977-01-01

    Class data pools on abused and neglected children and their families are found desirable for program planning, but identification by name is of questionable value and possibly a dangerous invasion of civil liberties. (MS)

  1. Self-transcendent positive emotions increase spirituality through basic world assumptions.

    PubMed

    Van Cappellen, Patty; Saroglou, Vassilis; Iweins, Caroline; Piovesana, Maria; Fredrickson, Barbara L

    2013-01-01

    Spirituality has mostly been studied in psychology as implied in the process of overcoming adversity, being triggered by negative experiences, and providing positive outcomes. By reversing this pathway, we investigated whether spirituality may also be triggered by self-transcendent positive emotions, which are elicited by stimuli appraised as demonstrating higher good and beauty. In two studies, elevation and/or admiration were induced using different methods. These emotions were compared to two control groups, a neutral state and a positive emotion (mirth). Self-transcendent positive emotions increased participants' spirituality (Studies 1 and 2), especially for the non-religious participants (Study 1). Two basic world assumptions, i.e., belief in life as meaningful (Study 1) and in the benevolence of others and the world (Study 2) mediated the effect of these emotions on spirituality. Spirituality should be understood not only as a coping strategy, but also as an upward spiralling pathway to and from self-transcendent positive emotions.

  2. Statistical Issues for Uncontrolled Reentry Hazards

    NASA Technical Reports Server (NTRS)

    Matney, Mark

    2008-01-01

    A number of statistical tools have been developed over the years for assessing the risk of reentering objects to human populations. These tools make use of the characteristics (e.g., mass, shape, size) of debris that are predicted by aerothermal models to survive reentry. The statistical tools use this information to compute the probability that one or more of the surviving debris might hit a person on the ground and cause one or more casualties. The statistical portion of the analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and how to use that information to arrive at realistic risk numbers. This inevitably involves assumptions that simplify the problem and make it tractable, but it is often difficult to test the accuracy and applicability of these assumptions. This paper looks at a number of these theoretical assumptions, examining the mathematical basis for the hazard calculations, and outlining the conditions under which the simplifying assumptions hold. In addition, this paper will also outline some new tools for assessing ground hazard risk in useful ways. Also, this study is able to make use of a database of known uncontrolled reentry locations measured by the United States Department of Defense. By using data from objects that were in orbit more than 30 days before reentry, sufficient time is allowed for the orbital parameters to be randomized in the way the models are designed to compute. The predicted ground footprint distributions of these objects are based on the theory that their orbits behave basically like simple Kepler orbits. However, there are a number of factors - including the effects of gravitational harmonics, the effects of the Earth's equatorial bulge on the atmosphere, and the rotation of the Earth and atmosphere - that could cause them to diverge from simple Kepler orbit behavior and change the ground footprints. 
The measured latitude and longitude distributions of these objects provide data that can be directly compared with the predicted distributions, providing a fundamental empirical test of the model assumptions.
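
    The "simple Kepler orbit" footprint model can be illustrated with a short simulation (an illustrative sketch under a circular-orbit assumption; the 51.6-degree inclination is just an ISS-like example, not from the paper):

```python
import math
import random

random.seed(1)

def reentry_latitudes(inclination_deg, n=50_000):
    """Sample sub-satellite latitudes for a randomized circular Kepler orbit.

    For a circular orbit the argument of latitude u is uniform in time, and
    the latitude is asin(sin(i) * sin(u)). The resulting distribution is
    confined to +/- the inclination and piles up near the band edges, which
    is the ground-footprint shape the reentry models predict.
    """
    inc = math.radians(inclination_deg)
    return [math.degrees(math.asin(math.sin(inc) *
                                   math.sin(random.uniform(0.0, 2.0 * math.pi))))
            for _ in range(n)]

lats = reentry_latitudes(51.6)
```

    Comparing a histogram of such samples against the measured reentry locations is, in essence, the empirical test of the model assumptions the paper describes.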

  3. Deep Borehole Field Test Requirements and Controlled Assumptions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, Ernest

    2015-07-01

    This document presents design requirements and controlled assumptions intended for use in the engineering development and testing of: 1) prototype packages for radioactive waste disposal in deep boreholes; 2) a waste package surface handling system; and 3) a subsurface system for emplacing and retrieving packages in deep boreholes. Engineering development and testing is being performed as part of the Deep Borehole Field Test (DBFT; SNL 2014a). This document presents parallel sets of requirements for a waste disposal system and for the DBFT, showing the close relationship. In addition to design, it will also inform planning for drilling, construction, and scientific characterization activities for the DBFT. The information presented here follows typical preparations for engineering design. It includes functional and operating requirements for handling and emplacement/retrieval equipment, waste package design and emplacement requirements, borehole construction requirements, sealing requirements, and performance criteria. Assumptions are included where they could impact engineering design. Design solutions are avoided in the requirements discussion. This set of requirements and assumptions has benefited greatly from reviews by Gordon Appel, Geoff Freeze, Kris Kuhlman, Bob MacKinnon, Steve Pye, David Sassani, Dave Sevougian, and Jiann Su.

  4. Small area estimation for estimating the number of infant mortality in West Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Anggreyani, Arie; Indahwati, Kurnia, Anang

    2016-02-01

    Demographic and Health Survey Indonesia (DHSI) is a nationally designed survey to provide information regarding birth rate, mortality rate, family planning and health. DHSI was conducted by BPS in cooperation with the National Population and Family Planning Institution (BKKBN), the Indonesia Ministry of Health (KEMENKES) and USAID. Based on the publication of DHSI 2012, the infant mortality rate for the five-year period before the survey was conducted is 32 per 1000 live births. In this paper, Small Area Estimation (SAE) is used to estimate the number of infant mortality in districts of West Java. SAE is a special model of Generalized Linear Mixed Models (GLMM). In this case, the incidence of infant mortality is modeled as a Poisson distribution, which carries an equidispersion assumption (variance equal to the mean). The methods considered to handle overdispersion are the negative binomial and quasi-likelihood models. Based on the results of the analysis, the quasi-likelihood model is the best model to overcome the overdispersion problem. The small area estimation used the basic area-level model. Mean square error (MSE) based on a resampling method is used to measure the accuracy of the small area estimates.
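
    The equidispersion assumption, and how area-level heterogeneity breaks it, can be illustrated with a stdlib-only sketch (hypothetical simulated counts, not the DHSI data; the gamma parameters are arbitrary):

```python
import math
import random

random.seed(2)

def poisson(lam):
    # Knuth's multiplication algorithm (stdlib-only; fine for modest lam).
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Counts whose underlying rate varies across areas (gamma-distributed here)
# are overdispersed: variance exceeds the mean, violating the Poisson
# equidispersion assumption and motivating negative binomial or
# quasi-likelihood models.
counts = [poisson(random.gammavariate(2.0, 5.0)) for _ in range(5000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / (len(counts) - 1)
dispersion = var / mean   # ~1 for a pure Poisson; well above 1 here
```

    A dispersion statistic near 1 supports the Poisson model; values well above 1, as here, signal the overdispersion the paper addresses.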

  5. Quid pro quo: a mechanism for fair collaboration in networked systems.

    PubMed

    Santos, Agustín; Fernández Anta, Antonio; López Fernández, Luis

    2013-01-01

    Collaboration may be understood as the execution of coordinated tasks (in the most general sense) by groups of users, who cooperate for achieving a common goal. Collaboration is a fundamental assumption and requirement for the correct operation of many communication systems. The main challenge when creating collaborative systems in a decentralized manner is dealing with the fact that users may behave in selfish ways, trying to obtain the benefits of the tasks but without participating in their execution. In this context, Game Theory has been instrumental to model collaborative systems and the task allocation problem, and to design mechanisms for optimal allocation of tasks. In this paper, we revise the classical assumptions of these models and propose a new approach to this problem. First, we establish a system model based on heterogeneous nodes (users, players), and propose a basic distributed mechanism so that, when a new task appears, it is assigned to the most suitable node. The classical technique for compensating a node that executes a task is the use of payments (which in most networks are hard or impossible to implement). Instead, we propose a distributed mechanism for the optimal allocation of tasks without payments. We prove this mechanism to be robust even in the presence of independent selfish or rationally limited players. Additionally, our model is based on very weak assumptions, which makes the proposed mechanisms amenable to implementation in networked systems (e.g., the Internet).

  6. Maximization, learning, and economic behavior

    PubMed Central

    Erev, Ido; Roth, Alvin E.

    2014-01-01

    The rationality assumption that underlies mainstream economic theory has proved to be a useful approximation, despite the fact that systematic violations to its predictions can be found. That is, the assumption of rational behavior is useful in understanding the ways in which many successful economic institutions function, although it is also true that actual human behavior falls systematically short of perfect rationality. We consider a possible explanation of this apparent inconsistency, suggesting that mechanisms that rest on the rationality assumption are likely to be successful when they create an environment in which the behavior they try to facilitate leads to the best payoff for all agents on average, and most of the time. Review of basic learning research suggests that, under these conditions, people quickly learn to maximize expected return. This review also shows that there are many situations in which experience does not increase maximization. In many cases, experience leads people to underweight rare events. In addition, the current paper suggests that it is convenient to distinguish between two behavioral approaches to improve economic analyses. The first, and more conventional approach among behavioral economists and psychologists interested in judgment and decision making, highlights violations of the rational model and proposes descriptive models that capture these violations. The second approach studies human learning to clarify the conditions under which people quickly learn to maximize expected return. The current review highlights one set of conditions of this type and shows how the understanding of these conditions can facilitate market design. PMID:25024182
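
    The underweighting of rare events in decisions from experience can be sketched with a small-sample learner (a hypothetical illustration in the spirit of the review; the payoffs and sample size are invented):

```python
import random

random.seed(3)

# An agent judging a risky option (-10 with prob 0.1, else +1; EV = -0.1)
# from a handful of its own draws usually never observes the rare loss,
# so experience leads it to prefer the risky option over a safe 0.
def risky_draw():
    return -10 if random.random() < 0.1 else 1

def small_sample_choice(k=5):
    sample_mean = sum(risky_draw() for _ in range(k)) / k
    return "risky" if sample_mean > 0 else "safe"

choices = [small_sample_choice() for _ in range(10_000)]
risky_rate = choices.count("risky") / len(choices)   # well above 1/2
```

    Despite the risky option's negative expected value, a majority of small-sample agents choose it, which is the "underweighting of rare events" pattern the review describes.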


  8. Optimizing Experimental Design for Comparing Models of Brain Function

    PubMed Central

    Daunizeau, Jean; Preuschoff, Kerstin; Friston, Karl; Stephan, Klaas

    2011-01-01

    This article presents the first attempt to formalize the optimization of experimental design with the aim of comparing models of brain function based on neuroimaging data. We demonstrate our approach in the context of Dynamic Causal Modelling (DCM), which relates experimental manipulations to observed network dynamics (via hidden neuronal states) and provides an inference framework for selecting among candidate models. Here, we show how to optimize the sensitivity of model selection by choosing among experimental designs according to their respective model selection accuracy. Using Bayesian decision theory, we (i) derive the Laplace-Chernoff risk for model selection, (ii) disclose its relationship with classical design optimality criteria and (iii) assess its sensitivity to basic modelling assumptions. We then evaluate the approach when identifying brain networks using DCM. Monte-Carlo simulations and empirical analyses of fMRI data from a simple bimanual motor task in humans serve to demonstrate the relationship between network identification and the optimal experimental design. For example, we show that deciding whether there is a feedback connection requires shorter epoch durations, relative to asking whether there is experimentally induced change in a connection that is known to be present. Finally, we discuss limitations and potential extensions of this work. PMID:22125485

  9. The organizational culture of emergency departments and the effect on care of older adults: a modified scoping study.

    PubMed

    Skar, Pål; Bruce, Anne; Sheets, Debra

    2015-04-01

    How does the organizational micro culture in emergency departments (EDs) impact the care of older adults presenting with a complaint or condition perceived as non-acute? This scoping study reviews the literature and maps three levels of ED culture (artifacts, values and beliefs, and assumptions). Findings on the artifact level indicate that EDs are poorly designed for the needs of older adults. Findings on the ED value and belief level indicate that EDs are for urgent cases (not geriatric care), that older adults do not receive the care and respect they should be given, that older adults require too much time, and that the basic nursing needs of older adults are not a priority for ED nurses. Finally, findings on the assumptions level underpinning ED behaviors suggest that older adults do not belong in the ED, most older adults in the ED are not critically ill and therefore can wait, and staff need to be available for acute cases at all times. A systematic review on the effect of ED micro culture on the quality of geriatric care is warranted.

  10. Self-definition of women experiencing a nontraditional graduate fellowship program

    NASA Astrophysics Data System (ADS)

    Buck, Gayle A.; Leslie-Pelecky, Diandra L.; Lu, Yun; Plano Clark, Vicki L.; Creswell, John W.

    2006-10-01

    Women continue to be underrepresented in the fields of science, technology, engineering, and mathematics (STEM). One factor contributing to this underrepresentation is the graduate school experience. Graduate programs in STEM fields are constructed around assumptions that ignore the reality of women's lives; however, emerging opportunities may lead to experiences that are more compatible for women. One such opportunity is the Graduate Teaching Fellows in K-12 Education (GK-12) Program, which was introduced by the National Science Foundation in 1999. Although this nontraditional graduate program was not designed explicitly for women, it provided an unprecedented context in which to research how changing some of the basic assumptions upon which a graduate school operates may impact women in science. This exploratory case study examines the self-definition of 8 women graduate students who participated in a GK-12 program at a major research university. The findings from this case study contribute to higher education's understanding of the terrain women graduate students in the STEM areas must navigate as they participate in programs that are thought to be more conducive to their modes of self-definition while they continue to seek to be successful in the historically Eurocentric, masculine STEM fields.

  11. Velocity Measurement by Scattering from Index of Refraction Fluctuations Induced in Turbulent Flows

    NASA Technical Reports Server (NTRS)

    Lading, Lars; Saffman, Mark; Edwards, Robert

    1996-01-01

    Induced phase screen scattering is defined as the scattering of light from weak index-of-refraction fluctuations induced by turbulence. The basic assumptions and requirements for induced phase screen scattering, including scale requirements, are presented.

  12. Is Tissue the Issue? A Critique of SOMPA's Models and Tests.

    ERIC Educational Resources Information Center

    Goodman, Joan F.

    1979-01-01

    A critical view of the underlying theoretical rationale of the System of Multicultural Pluralistic Assessment (SOMPA) model for student assessment is presented. The critique is extensive and questions the basic assumptions of the model. (JKS)

  13. Undergraduate Cross Registration.

    ERIC Educational Resources Information Center

    Grupe, Fritz H.

    This report discusses various aspects of undergraduate cross-registration procedures, including the dimensions, values, roles and functions, basic assumptions, and the facilitation and encouragement of cross-registration. Dimensions of cross-registration encompass financial exchange, eligibility, program limitations, type of grade and credit; extent of…

  14. The Peace Movement: An Exercise in Micro-Macro Linkages.

    ERIC Educational Resources Information Center

    Galtung, Johan

    1988-01-01

    Contends that the basic assumption of the peace movement is the abuse of military power by the state. Argues that the peace movement is most effective through linkages with cultural, political, and economic forces in society. (BSR)

  15. Graduate Education in Psychology: A Comment on Rogers' Passionate Statement

    ERIC Educational Resources Information Center

    Brown, Robert C., Jr.; Tedeschi, James T.

    1972-01-01

    Authors' hope that this critical evaluation can place Carl Rogers' assumptions into perspective; they propose a compromise program meant to satisfy the basic aims of a humanistic psychology program. For Rogers' rejoinder see AA 512 869. (MB)

  16. Effects of real fluid properties on axial turbine meanline design and off-design analysis

    NASA Astrophysics Data System (ADS)

    MacLean, Cameron

    The effects of real fluid properties on axial turbine meanline analysis have been investigated employing two meanline analysis codes, namely Turbine Meanline Design (TMLD) and Turbine Meanline Off-Design (TMLO). The previously developed TMLD code assumed the working fluid was an ideal gas; it was therefore modified to use real fluid properties. TMLO was then developed from TMLD. Both codes can be run using either the ideal gas assumption or real fluid properties. TMLD was employed for the meanline design of several axial turbines for a range of inlet conditions, using both the ideal gas assumption and real fluid properties. The resulting designs were compared to see the effects of real fluid properties. Meanline designs, generated using the ideal gas assumption, were then analysed with TMLO using real fluid properties. This was done over a range of inlet conditions that correspond to varying degrees of departure from ideal gas conditions. The goal was to show how machines designed with the ideal gas assumption would perform with the real working fluid. The working fluid used in both investigations was supercritical carbon dioxide. Results from the investigation show that real fluid properties had a strong effect on the gas path areas of the turbine designs as well as the performance of turbines designed using the ideal gas assumption. Specifically, power output and the velocities of the working fluid were affected. It was found that accounting for losses tended to lessen the effects of the real fluid properties.

  17. Learning to Predict Combinatorial Structures

    NASA Astrophysics Data System (ADS)

    Vembu, Shankar

    2009-12-01

    The major challenge in designing a discriminative learning algorithm for predicting structured data is to address the computational issues arising from the exponential size of the output space. Existing algorithms make different assumptions to ensure efficient, polynomial time estimation of model parameters. For several combinatorial structures, including cycles, partially ordered sets, permutations and other graph classes, these assumptions do not hold. In this thesis, we address the problem of designing learning algorithms for predicting combinatorial structures by introducing two new assumptions: (i) The first assumption is that a particular counting problem can be solved efficiently. The consequence is a generalisation of the classical ridge regression for structured prediction. (ii) The second assumption is that a particular sampling problem can be solved efficiently. The consequence is a new technique for designing and analysing probabilistic structured prediction models. These results can be applied to solve several complex learning problems including but not limited to multi-label classification, multi-category hierarchical classification, and label ranking.

  18. SW-846 Test Method 1340: In Vitro Bioaccessibility Assay for Lead in Soil

    EPA Pesticide Factsheets

    Describes assay procedures written on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  19. From Generating in the Lab to Tutoring Systems in Classrooms.

    PubMed

    McNamara, Danielle S; Jacovina, Matthew E; Snow, Erica L; Allen, Laura K

    2015-01-01

    Work in cognitive and educational psychology examines a variety of phenomena related to the learning and retrieval of information. Indeed, Alice Healy, our honoree, and her colleagues have conducted a large body of groundbreaking research on this topic. In this article we discuss how 3 learning principles (the generation effect, deliberate practice and feedback, and antidotes to disengagement) discussed in Healy, Schneider, and Bourne (2012) have influenced the design of 2 intelligent tutoring systems that attempt to incorporate principles of skill and knowledge acquisition. Specifically, this article describes iSTART-2 and the Writing Pal, which provide students with instruction and practice using comprehension and writing strategies. iSTART-2 provides students with training to use effective comprehension strategies while self-explaining complex text. The Writing Pal provides students with instruction and practice to use basic writing strategies when writing persuasive essays. Underlying these systems are the assumptions that students should be provided with initial instruction that breaks down the tasks into component skills and that deliberate practice should include active generation with meaningful feedback, all while remaining engaging. The implementation of these assumptions is complicated by the ill-defined natures of comprehension and writing and supported by the use of various natural language processing techniques. We argue that there is value in attempting to integrate empirically supported learning principles into educational activities, even when there is imperfect alignment between them. Examples from the design of iSTART-2 and Writing Pal guide this argument.

  20. Big Numbers about Small Children: Estimating the Economic Benefits of Addressing Undernutrition.

    PubMed

    Alderman, Harold; Behrman, Jere R; Puett, Chloe

    2017-02-01

    Different approaches have been used to estimate the economic benefits of reducing undernutrition and to estimate the costs of investing in such programs on a global scale. While many of these studies are ultimately based on evidence from well-designed efficacy trials, all require a number of assumptions to project the impact of such trials to larger populations and to translate the value of the expected improvement in nutritional status into economic terms. This paper provides a short critique of some approaches to estimating the benefits of investments in child nutrition and then presents an alternative set of estimates based on different core data. These new estimates reinforce the basic conclusions of the existing literature: the economic value from reducing undernutrition in undernourished populations is likely to be substantial.

  1. Ethics and managed care.

    PubMed

    Perkel, R L

    1996-03-01

    Managed care presents physicians with potential ethical dilemmas different from dilemmas in traditional fee-for-service practice. The ethical assumptions of managed care are explored, with special attention to the evolving dual responsibilities of physicians as patient advocates and as entrepreneurs. A number of proposals are described that delineate issues in support of and in opposition to managed care. Through an understanding of how to apply basic ethics principles to managed care participation, physicians may yet hold on to the basic ethic of the fiduciary doctor-patient relationship.

  2. Development of a Multiple Linear Regression Model to Forecast Facility Electrical Consumption at an Air Force Base.

    DTIC Science & Technology

    1981-09-01

    corresponds to the same square footage that consumed the electrical energy. 3. The basic assumptions of multiple linear regression, as enumerated in... 7. Data related to the sample of bases is assumed to be representative of bases in the population. Limitations: Basic limitations on this research were... Ratemaking--Overview. Rand Report R-5894, Santa Monica CA, May 1977. Chatterjee, Samprit, and Bertram Price. Regression Analysis by Example. New York: John
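    The record above survives only as OCR fragments, but the method it names can be sketched: forecasting facility electrical consumption from a predictor such as square footage by least-squares regression. Below is a minimal single-predictor sketch; all data values and variable names are invented for illustration and do not come from the report:

```python
def ols_fit(x, y):
    """Ordinary least squares for one predictor: returns (intercept, slope)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)                       # sum of squares of x
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))    # cross products
    slope = sxy / sxx
    return my - slope * mx, slope

# Hypothetical facility data: square footage vs. monthly kWh consumed
sqft = [10_000, 20_000, 30_000, 40_000]
kwh = [52_000, 101_000, 150_500, 202_000]

b0, b1 = ols_fit(sqft, kwh)
forecast = b0 + b1 * 25_000  # forecast consumption for a 25,000 sq ft facility
```

With more predictors (weather, occupancy, mission changes), the same idea extends to full multiple linear regression, subject to the assumptions the report enumerates.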

  3. Hazards and occupational risk in hard coal mines - a critical analysis of legal requirements

    NASA Astrophysics Data System (ADS)

    Krause, Marcin

    2017-11-01

    This publication concerns the problems of occupational safety and health in hard coal mines, the basic elements of which are the mining hazards and the occupational risk. The work includes a comparative analysis of selected provisions of general and industry-specific law regarding the analysis of hazards and occupational risk assessment. Based on a critical analysis of legal requirements, basic assumptions regarding the practical guidelines for occupational risk assessment in underground coal mines have been proposed.

  4. Global design of satellite constellations: a multi-criteria performance comparison of classical walker patterns and new design patterns

    NASA Astrophysics Data System (ADS)

    Lansard, Erick; Frayssinhes, Eric; Palmade, Jean-Luc

    The problem of designing a multisatellite constellation involves many parameters with many possible combinations: total number of satellites, orbital parameters of each individual satellite, number of orbital planes, number of satellites in each plane, spacings between satellites of each plane, spacings between orbital planes, and relative phasings between consecutive orbital planes. Fortunately, some authors have theoretically solved this complex problem under simplified assumptions: the permanent (or continuous) coverage of the whole Earth and of zonal areas by single and multiple satellites has been entirely solved from a pure geometrical point of view. These solutions exhibit strong symmetry properties (e.g. Walker, Ballard, Rider, Draim constellations): altitudes and inclinations are identical, orbital planes and satellites are regularly spaced, etc. The problem with such constellations is their oversimplified and restricted geometrical assumptions. In fact, the evaluation function that is implicitly used takes into account only the point-to-point visibility between users and satellites and does not deal with very important constraints and considerations that become mandatory when designing a real satellite system (e.g. robustness to satellite failures, total system cost, common view between satellites and ground stations, service availability and satellite reliability, launch and early operations phase, production constraints, etc.). An original and global methodology relying on a powerful optimization tool based on genetic algorithms has been developed at ALCATEL ESPACE. In this approach, symmetrical constellations can be used as initial conditions of the optimization process together with specific evaluation functions. A multi-criteria performance analysis is conducted and presented here in a parametric way in order to identify and evaluate the main sensitive parameters. Quantitative results are given for three examples in the fields of navigation, telecommunication and multimedia satellite systems. In particular, a new design pattern with very efficient properties in terms of robustness to satellite failures is presented and compared with classical Walker patterns.

  5. Computer Applications in Teaching and Learning.

    ERIC Educational Resources Information Center

    Halley, Fred S.; And Others

    Some examples of the usage of computers in teaching and learning are examination generation, automatic exam grading, student tracking, problem generation, computational examination generators, program packages, simulation, and programing skills for problem solving. These applications are non-trivial and do fulfill the basic assumptions necessary…

  6. Probabilistic Simulation of Territorial Seismic Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baratta, Alessandro; Corbi, Ileana

    2008-07-08

    The paper is focused on a stochastic process for the prediction of seismic scenarios on the territory, developed by means of some basic assumptions in the procedure and by elaborating the fundamental parameters recorded during ground motions that occurred in a seismic area.

  7. Elements of a Research Report.

    ERIC Educational Resources Information Center

    Schurter, William J.

    This guide for writing research or technical reports discusses eleven basic elements of such reports and provides examples of "good" and "bad" wordings. These elements are the title, problem statement, purpose statement, need statement, hypothesis, assumptions, procedures, limitations, terminology, conclusion and recommendations. This guide is…

  8. The Case for a Hierarchical Cosmology

    ERIC Educational Resources Information Center

    Vaucouleurs, G. de

    1970-01-01

    The development of modern theoretical cosmology is presented and some questionable assumptions of orthodox cosmology are pointed out. Suggests that recent observations indicate that hierarchical clustering is a basic factor in cosmology. The implications of hierarchical models of the universe are considered. Bibliography. (LC)

  9. The Estimation Theory Framework of Data Assimilation

    NASA Technical Reports Server (NTRS)

    Cohn, S.; Atlas, Robert (Technical Monitor)

    2002-01-01

    Lecture 1. The Estimation Theory Framework of Data Assimilation: 1. The basic framework: dynamical and observation models; 2. Assumptions and approximations; 3. The filtering, smoothing, and prediction problems; 4. Discrete Kalman filter and smoother algorithms; and 5. Example: A retrospective data assimilation system
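    Among the topics listed, the discrete Kalman filter admits a compact scalar illustration. The sketch below assumes a random-walk state observed with additive noise; the model and all parameter values are illustrative assumptions, not drawn from the lecture itself:

```python
def kalman_1d(observations, x0=0.0, p0=1.0, q=0.01, r=0.25):
    """Scalar discrete Kalman filter for a random-walk state observed with
    noise: x_k = x_{k-1} + w_k (variance q), z_k = x_k + v_k (variance r)."""
    x, p = x0, p0
    estimates = []
    for z in observations:
        p = p + q                  # predict: random-walk dynamics inflate variance
        k = p / (p + r)            # Kalman gain: weight given to the new observation
        x = x + k * (z - x)        # update the state estimate toward the observation
        p = (1.0 - k) * p          # updated (reduced) estimate variance
        estimates.append(x)
    return estimates

estimates = kalman_1d([1.0, 1.0, 1.0, 1.0])  # converges toward the true level 1.0
```

The smoother and prediction problems in the outline reuse the same predict/update machinery, run backward over stored estimates or forward without observations.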

  10. A clinical trial design using the concept of proportional time using the generalized gamma ratio distribution.

    PubMed

    Phadnis, Milind A; Wetmore, James B; Mayo, Matthew S

    2017-11-20

    Traditional methods of sample size and power calculations in clinical trials with a time-to-event end point are based on the logrank test (and its variations), Cox proportional hazards (PH) assumption, or comparison of means of 2 exponential distributions. Of these, sample size calculation based on PH assumption is likely the most common and allows adjusting for the effect of one or more covariates. However, when designing a trial, there are situations when the assumption of PH may not be appropriate. Additionally, when it is known that there is a rapid decline in the survival curve for a control group, such as from previously conducted observational studies, a design based on the PH assumption may confer only a minor statistical improvement for the treatment group that is neither clinically nor practically meaningful. For such scenarios, a clinical trial design that focuses on improvement in patient longevity is proposed, based on the concept of proportional time using the generalized gamma ratio distribution. Simulations are conducted to evaluate the performance of the proportional time method and to identify the situations in which such a design will be beneficial as compared to the standard design using a PH assumption, piecewise exponential hazards assumption, and specific cases of a cure rate model. A practical example in which hemorrhagic stroke patients are randomized to 1 of 2 arms in a putative clinical trial demonstrates the usefulness of this approach by drastically reducing the number of patients needed for study enrollment. Copyright © 2017 John Wiley & Sons, Ltd.

  11. CFD Analysis of Hypersonic Flowfields With Surface Thermochemistry and Ablation

    NASA Technical Reports Server (NTRS)

    Henline, W. D.

    1997-01-01

    In the past forty years much progress has been made in computational methods applied to the solution of problems in spacecraft hypervelocity flow and heat transfer. Although the basic thermochemical and physical modeling techniques have changed little in this time, several orders of magnitude increase in the speed of numerically solving the Navier-Stokes and associated energy equations have been achieved. The extent to which this computational power can be applied to the design of spacecraft heat shields is dependent on the proper coupling of the external flow equations to the boundary conditions and governing equations representing the thermal protection system in-depth conduction, pyrolysis and surface ablation phenomena. A discussion of the techniques used to do this in past problems as well as the current state-of-art is provided. Specific examples, including past missions such as Galileo, together with the more recent case studies of ESA/Rosetta Sample Comet Return, Mars Pathfinder and X-33 will be discussed. Modeling assumptions, design approach and computational methods and results are presented.

  12. The effectiveness of wastewater treatment in nuclear medicine: Performance data and radioecological considerations.

    PubMed

    Sudbrock, F; Schomäcker, K; Drzezga, A

    2017-01-01

    For planned and ongoing storage of liquid radioactive waste in a designated plant for a nuclear medicine therapy ward (decontamination system/decay system), detailed knowledge of basic parameters such as the amount of radioactivity and the necessary decay time in the plant is required. The design of the plant at the Department of Nuclear Medicine of the University of Cologne, built in 2001, was based on assumptions about the individual discharge of activity from patients, which we can now retrospectively validate. The decontamination factor of the plant is at present on the order of 10⁻⁹ for ¹³¹I. The annual discharges have been continuously reduced over the period of operation and are now in the region of a few kilobecquerels. This work emphasizes the high efficacy of the decontamination plant to reduce the amount of radioactivity released from the nuclear medicine ward into the environment to almost negligible levels. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Causal inferences on the effectiveness of complex social programs: Navigating assumptions, sources of complexity and evaluation design challenges.

    PubMed

    Chatterji, Madhabi

    2016-12-01

    This paper explores avenues for navigating evaluation design challenges posed by complex social programs (CSPs) and their environments when conducting studies that call for generalizable, causal inferences on the intervention's effectiveness. A definition is provided of a CSP drawing on examples from different fields, and an evaluation case is analyzed in depth to derive seven (7) major sources of complexity that typify CSPs, threatening assumptions of textbook-recommended experimental designs for performing impact evaluations. Theoretically-supported, alternative methodological strategies are discussed to navigate assumptions and counter the design challenges posed by the complex configurations and ecology of CSPs. Specific recommendations include: sequential refinement of the evaluation design through systems thinking, systems-informed logic modeling; and use of extended term, mixed methods (ETMM) approaches with exploratory and confirmatory phases of the evaluation. In the proposed approach, logic models are refined through direct induction and interactions with stakeholders. To better guide assumption evaluation, question-framing, and selection of appropriate methodological strategies, a multiphase evaluation design is recommended. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.

  15. Remotely Telling Humans and Computers Apart: An Unsolved Problem

    NASA Astrophysics Data System (ADS)

    Hernandez-Castro, Carlos Javier; Ribagorda, Arturo

    The ability to tell humans and computers apart is imperative to protect many services from misuse and abuse. For this purpose, tests called CAPTCHAs or HIPs have been designed and put into production. Recent history shows that most (if not all) can be broken given enough time and commercial interest: CAPTCHA design seems to be a much more difficult problem than previously thought. The assumption that difficult AI problems can be easily converted into valid CAPTCHAs is misleading. There are also some extrinsic problems that do not help, especially the large number of in-house designs that are put into production without any prior public critique. In this paper we present a state-of-the-art survey of current HIPs, including proposals that are now in production. We classify them according to their basic design ideas. We discuss current attacks as well as future attack paths, and we also present common errors in design, and how many implementation flaws can transform a not necessarily bad idea into a weak CAPTCHA. We present examples of these flaws, using specific well-known CAPTCHAs. In a more theoretical way, we discuss the threat model: confronted risks and countermeasures. Finally, we introduce and discuss some desirable properties that new HIPs should have, concluding with some proposals for future work, including methodologies for design, implementation and security assessment.

  16. A Spreadsheet Simulation Tool for Terrestrial and Planetary Balloon Design

    NASA Technical Reports Server (NTRS)

    Raquea, Steven M.

    1999-01-01

    During the early stages of new balloon design and development, it is necessary to conduct many trade studies. These trade studies are required to determine the design space, and aid significantly in determining overall feasibility. Numerous point designs then need to be generated as details of payloads, materials, mission, and manufacturing are determined. To accomplish these numerous designs, transient models are both unnecessary and time intensive. A steady state model that uses appropriate design inputs to generate system-level descriptive parameters can be very flexible and fast. Just such a steady state model has been developed and has been used during both the MABS 2001 Mars balloon study and the Ultra Long Duration Balloon Project. The model was built using Microsoft Excel's built-in iteration routine. Separate sheets were used for performance, structural design, materials, and thermal analysis as well as input and output sheets. As can be seen from figure 1, the model takes basic performance requirements, weight estimates, design parameters, and environmental conditions and generates a system-level balloon design. Figure 2 shows a sample output of the model. By changing the inputs and a few of the equations in the model, balloons on Earth or other planets can be modeled. There are currently several variations of the model for terrestrial and Mars balloons, as well as versions of the model that perform crude material design based on strength and weight requirements. To perform trade studies, the Visual Basic language built into Excel was used to create an automated matrix of designs. This trade study module allows a three-dimensional trade surface to be generated by using a series of values for any two design variables. Once the fixed and variable inputs are defined, the model automatically steps through the input matrix and fills a spreadsheet with the resulting point designs. 
The proposed paper will describe the model in detail, including current variations. The assumptions, governing equations, and capabilities will be addressed. Detailed examples of the model in practice will also be used.

  17. Relationship between Organizational Culture and the Use of Psychotropic Medicines in Nursing Homes: A Systematic Integrative Review.

    PubMed

    Sawan, Mouna; Jeon, Yun-Hee; Chen, Timothy F

    2018-03-01

    Psychotropic medicines are commonly used in nursing homes, despite marginal clinical benefits and association with harm in the elderly. Organizational culture is proposed as a factor explaining the high-level use of psychotropic medicines. Schein describes three levels of culture: artifacts, espoused values, and basic assumptions. This integrative review aimed to investigate the facets and role of organizational culture in the use of psychotropic medicines in nursing homes. Five databases were searched for qualitative, quantitative, and mixed method empirical studies up to 13 February 2017. Articles were included if they examined an aspect of organizational culture according to Schein's theory and the use of psychotropic medicines in nursing homes for the management of behavioral and sleep disturbances in residents. Article screening and data extraction were performed independently by one reviewer and checked by the research team. The integrative review method, an approach similar to the method of constant comparison analysis was utilized for data analysis. Twenty-four studies met the inclusion criteria: 13 used quantitative methods, 9 used qualitative methods, 1 was quasi-qualitative, and 1 used mixed methods. Included studies were found to only address two aspects of organizational culture in relation to the use of psychotropic medicines: artifacts and espoused values. No studies addressed the basic assumptions, the unsaid taken-for-granted beliefs, which provide explanations for in/consistencies between the ideal use of psychotropic medicines and the actual use of psychotropic medicines. Previous studies suggest that organizational culture influences the use of psychotropic medicines in nursing homes; however, what is known is descriptive of culture only at the surface level, that is the artifacts and espoused values. Hence, future research that explains the impact of the basic assumptions of culture on the use of psychotropic medicines is important.

  18. Design Life Level: Quantifying risk in a changing climate

    NASA Astrophysics Data System (ADS)

    Rootzén, Holger; Katz, Richard W.

    2013-09-01

    In the past, the concepts of return levels and return periods have been standard and important tools for engineering design. However, these concepts are based on the assumption of a stationary climate and do not apply to a changing climate, whether local or global. In this paper, we propose a refined concept, Design Life Level, which quantifies risk in a nonstationary climate and can serve as the basis for communication. In current practice, typical hydrologic risk management focuses on a standard (e.g., in terms of a high quantile corresponding to the specified probability of failure for a single year). Nevertheless, the basic information needed for engineering design should consist of (i) the design life period (e.g., the next 50 years, say 2015-2064); and (ii) the probability (e.g., 5% chance) of a hazardous event (typically, in the form of the hydrologic variable exceeding a high level) occurring during the design life period. Capturing both of these design characteristics, the Design Life Level is defined as an upper quantile (e.g., 5%) of the distribution of the maximum value of the hydrologic variable (e.g., water level) over the design life period. We relate this concept and variants of it to existing literature and illustrate how they, and some useful complementary plots, may be computed and used. One practically important consideration concerns quantifying the statistical uncertainty in estimating a high quantile under nonstationarity.
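    The Design Life Level defined above lends itself to a direct Monte Carlo sketch: simulate the annual maximum over the design life under a nonstationary model, record each simulation's maximum, and read off an upper quantile. The toy Gumbel annual-maximum model with a linear trend below is an assumption for illustration only, not the authors' model:

```python
import math
import random

def design_life_level(n_sims=20000, n_years=50, loc0=10.0, trend=0.02,
                      scale=1.0, exceed_prob=0.05, seed=1):
    """Monte Carlo estimate of the Design Life Level: the (1 - exceed_prob)
    quantile of the distribution of the maximum annual value (e.g., water
    level) over the design life period, under a toy nonstationary Gumbel
    annual-maximum model whose location parameter drifts linearly."""
    rng = random.Random(seed)
    maxima = []
    for _ in range(n_sims):
        m = float("-inf")
        for year in range(n_years):
            # Inverse-CDF sample from Gumbel(loc0 + trend * year, scale)
            u = rng.random()
            x = (loc0 + trend * year) - scale * math.log(-math.log(u))
            m = max(m, x)
        maxima.append(m)           # maximum over the whole design life
    maxima.sort()
    return maxima[int((1.0 - exceed_prob) * n_sims) - 1]
```

A design would then be sized so that this level is not exceeded; under stationarity (trend = 0) the estimate reduces to the classical return-level calculation.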

  19. Thinking in Arithmetic.

    ERIC Educational Resources Information Center

    Resnick, Lauren B.; And Others

    This paper discusses a radically different set of assumptions to improve educational outcomes for disadvantaged students. It is argued that disadvantaged children, when exposed to carefully organized thinking-oriented instruction, can acquire the traditional basic skills in the process of reasoning and solving problems. The paper is presented in…

  20. Measurement of Inequality: The Gini Coefficient and School Finance Studies.

    ERIC Educational Resources Information Center

    Lows, Raymond L.

    1984-01-01

    Discusses application of the "Lorenz Curve" (a graphical representation of the concentration of wealth) with the "Gini Coefficient" (an index of inequality) to measure social inequality in school finance studies. Examines the basic assumptions of these measures and suggests a minor reconception. (MCG)
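The Gini coefficient referenced above has a standard closed-form computation from sorted values (equivalent to the area between the Lorenz curve and the line of equality). A minimal sketch, with hypothetical school-finance-style inputs:

```python
def gini(values):
    """Gini coefficient: 0 = perfect equality; values near 1 mean the
    total is concentrated in a single unit. Uses the standard formula
    G = sum_i (2i - n - 1) * x_i / (n * total) over ascending-sorted x."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    weighted = sum((2 * (i + 1) - n - 1) * x for i, x in enumerate(xs))
    return weighted / (n * total)

# hypothetical per-pupil expenditures across districts
equal = gini([10, 10, 10])          # 0.0: perfect equality
skewed = gini([0, 0, 0, 0, 100])    # 0.8: one district holds everything
```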

  1. Beyond the Virtues-Principles Debate.

    ERIC Educational Resources Information Center

    Keat, Marilyn S.

    1992-01-01

    Indicates basic ontological assumptions in the virtues-principles debate in moral philosophy, noting Aristotle's and Kant's fundamental ideas about morality and considering a hermeneutic synthesis of theories. The article discusses what acceptance of the synthesis might mean in the theory and practice of moral pedagogy, offering examples of…

  2. The Structuring Principle: Political Socialization and Belief Systems

    ERIC Educational Resources Information Center

    Searing, Donald D.; And Others

    1973-01-01

Assesses the significance of data on childhood political learning to political theory by testing the "structuring principle," considered one of the central assumptions of political socialization research. This principle asserts that "basic orientations acquired during childhood structure the later learning of specific issue beliefs." The…

  3. The Experience of Disability.

    ERIC Educational Resources Information Center

    Hastings, Elizabeth

    1981-01-01

    The author outlines the experiences of disability and demonstrates that generally unpleasant experiences are the direct result of a basic and false assumption on the part of society. Experiences of the disabled are discussed in areas the author categorizes as exclusion or segregation, deprivation, prejudice, poverty, frustration, and…

  4. Some Remarks on the Theory of Political Education. German Studies Notes.

    ERIC Educational Resources Information Center

    Holtmann, Antonius

    This theoretical discussion explores pedagogical assumptions of political education in West Germany. Three major methodological orientations are discussed: the normative-ontological, empirical-analytical, and dialectical-historical. The author recounts the aims, methods, and basic presuppositions of each of these approaches. Topics discussed…

  5. Assessment of the Natural Environment.

    ERIC Educational Resources Information Center

    Cantrell, Mary Lynn; Cantrell, Robert P.

    1985-01-01

    Basic assumptions of an ecological-behavioral view of assessing behavior disordered students are reviewed along with a proposed method for ecological analysis and specific techniques for measuring ecological variables (such as environmental units, behaviors of significant others, and behavioral expectations). The use of such information in program…

  6. Sherlock Holmes as a Social Scientist.

    ERIC Educational Resources Information Center

    Ward, Veronica; Orbell, John

    1988-01-01

    Presents a way of teaching the scientific method through studying the adventures of Sherlock Holmes. Asserting that Sherlock Holmes used the scientific method to solve cases, the authors construct Holmes' method through excerpts from novels featuring his adventures. Discusses basic assumptions, paradigms, theory building, and testing. (SLM)

  7. Basic principles of respiratory function monitoring in ventilated newborns: A review.

    PubMed

    Schmalisch, Gerd

    2016-09-01

Respiratory monitoring during mechanical ventilation provides a real-time picture of patient-ventilator interaction and is a prerequisite for lung-protective ventilation. Nowadays, measurements of airflow, tidal volume and applied pressures are standard in neonatal ventilators. The measurement of lung volume during mechanical ventilation by tracer gas washout techniques is still under development. The clinical use of capnography, although well established in adults, has not been embraced by neonatologists because of technical and methodological problems in very small infants. While the ventilatory parameters are well defined, the calculation of other physiological parameters is based upon specific assumptions that are difficult to verify. Incomplete knowledge of the theoretical background of these calculations and their limitations can lead to incorrect interpretations with clinical consequences. Therefore, the aim of this review was to describe the basic principles and the underlying assumptions of currently used methods for respiratory function monitoring in ventilated newborns and to highlight methodological limitations.

  8. Lectures on Dark Matter Physics

    NASA Astrophysics Data System (ADS)

    Lisanti, Mariangela

Rotation curve measurements from the 1970s provided the first strong indication that a significant fraction of matter in the Universe is non-baryonic. In the intervening years, a tremendous amount of progress has been made on both the theoretical and experimental fronts in the search for this missing matter, which we now know constitutes nearly 85% of the Universe's matter density. This series of lectures provides an introduction to the basics of dark matter physics. The lectures are geared toward the advanced undergraduate or graduate student interested in pursuing research in high-energy physics. The primary goal is to build an understanding of how observations constrain the assumptions that can be made about the astro- and particle physics properties of dark matter. The lectures begin by delineating the basic assumptions that can be inferred about dark matter from rotation curves. A detailed discussion of thermal dark matter follows, motivating Weakly Interacting Massive Particles, as well as lighter-mass alternatives. As an application of these concepts, the phenomenology of direct and indirect detection experiments is discussed in detail.

  9. Testing the basic assumption of the hydrogeomorphic approach to assessing wetland functions.

    PubMed

    Hruby, T

    2001-05-01

The hydrogeomorphic (HGM) approach for developing "rapid" wetland function assessment methods stipulates that the variables used are to be scaled based on data collected at sites judged to be the best at performing the wetland functions (reference standard sites). A critical step in the process is to choose the least altered wetlands in a hydrogeomorphic subclass to use as a reference standard against which other wetlands are compared. The basic assumption made in this approach is that wetlands judged to have had the least human impact have the highest level of sustainable performance for all functions. The levels at which functions are performed in these least altered wetlands are assumed to be "characteristic" for the subclass and "sustainable." Results from data collected in wetlands in the lowlands of western Washington suggest that the assumption may not be appropriate for this region. Teams developing methods for assessing wetland functions did not find that the least altered wetlands in a subclass had a range of performance levels that could be identified as "characteristic" or "sustainable." Forty-four wetlands in four hydrogeomorphic subclasses (two depressional subclasses and two riverine subclasses) were rated by teams of experts on the severity of their human alterations and on the level of performance of 15 wetland functions. An ordinal scale of 1-5 was used to quantify alterations in water regime, soils, vegetation, buffers, and contributing basin. Performance of functions was judged on an ordinal scale of 1-7. Relatively unaltered wetlands were judged to perform individual functions at levels that spanned all of the seven possible ratings in all four subclasses. The basic assumption of the HGM approach, that the least altered wetlands represent "characteristic" and "sustainable" levels of functioning that are different from those found in altered wetlands, was not confirmed. Although the intent of the HGM approach is to use level of functioning as a metric to assess the ecological integrity or "health" of the wetland ecosystem, the metric does not seem to work in western Washington for that purpose.

  10. Effects of fish movement assumptions on the design of a marine protected area to protect an overfished stock.

    PubMed

    Cornejo-Donoso, Jorge; Einarsson, Baldvin; Birnir, Bjorn; Gaines, Steven D

    2017-01-01

Marine Protected Areas (MPA) are important management tools shown to protect marine organisms, restore biomass, and increase fisheries yields. While MPAs have been successful in meeting these goals for many relatively sedentary species, highly mobile organisms may get few benefits from this type of spatial protection due to their frequent movement outside the protected area. The use of a large MPA can compensate for extensive movement, but testing this empirically is challenging, as it requires both large areas and sufficient time series to draw conclusions. To overcome this limitation, MPA models have been used to identify designs and predict potential outcomes, but these simulations are highly sensitive to the assumptions describing the organism's movements. Due to recent improvements in computational simulations, it is now possible to include very complex movement assumptions in MPA models (e.g. Individual Based Model). These have renewed interest in MPA simulations, which implicitly assume that increasing the detail in fish movement overcomes the sensitivity to the movement assumptions. Nevertheless, a systematic comparison of the designs and outcomes obtained under different movement assumptions has not been done. In this paper, we use an individual based model, interconnected to population and fishing fleet models, to explore the value of increasing the detail of the movement assumptions using four scenarios of increasing behavioral complexity: a) random, diffusive movement, b) aggregations, c) aggregations that respond to environmental forcing (e.g. sea surface temperature), and d) aggregations that respond to environmental forcing and are transported by currents. We then compare these models to determine how the assumptions affect MPA design, and therefore the effective protection of the stocks. Our results show that the optimal MPA size to maximize fisheries benefits increases as movement complexity increases from ~10% for the diffusive assumption to ~30% when full environment forcing was used. We also found that in cases of limited understanding of the movement dynamics of a species, simplified assumptions can be used to provide a guide for the minimum MPA size needed to effectively protect the stock. However, using oversimplified assumptions can produce suboptimal designs and lead to a density underestimation of ca. 30%; therefore, the main value of detailed movement dynamics is to provide more reliable MPA design and predicted outcomes. Large MPAs can be effective in recovering overfished stocks, protect pelagic fish, and provide significant increases in fisheries yields. Our models provide a means to empirically test this spatial management tool, which theoretical evidence consistently suggests as an effective alternative for managing highly mobile pelagic stocks.

  11. Linking parasite populations in hosts to parasite populations in space through Taylor's law and the negative binomial distribution

    PubMed Central

    Poulin, Robert; Lagrue, Clément

    2017-01-01

    The spatial distribution of individuals of any species is a basic concern of ecology. The spatial distribution of parasites matters to control and conservation of parasites that affect human and nonhuman populations. This paper develops a quantitative theory to predict the spatial distribution of parasites based on the distribution of parasites in hosts and the spatial distribution of hosts. Four models are tested against observations of metazoan hosts and their parasites in littoral zones of four lakes in Otago, New Zealand. These models differ in two dichotomous assumptions, constituting a 2 × 2 theoretical design. One assumption specifies whether the variance function of the number of parasites per host individual is described by Taylor's law (TL) or the negative binomial distribution (NBD). The other assumption specifies whether the numbers of parasite individuals within each host in a square meter of habitat are independent or perfectly correlated among host individuals. We find empirically that the variance–mean relationship of the numbers of parasites per square meter is very well described by TL but is not well described by NBD. Two models that posit perfect correlation of the parasite loads of hosts in a square meter of habitat approximate observations much better than two models that posit independence of parasite loads of hosts in a square meter, regardless of whether the variance–mean relationship of parasites per host individual obeys TL or NBD. We infer that high local interhost correlations in parasite load strongly influence the spatial distribution of parasites. Local hotspots could influence control and conservation of parasites. PMID:27994156
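Taylor's law, one of the two variance functions compared above, states that variance scales as a power of the mean (var ≈ a·mean^b), so it is linear on log-log axes and the exponent b is the slope. A minimal ordinary-least-squares sketch on synthetic data (the data and parameters are invented for illustration, not taken from the study):

```python
import math

def fit_taylors_law(means, variances):
    """Fit log(var) = log(a) + b*log(mean) by ordinary least squares.
    Returns (a, b); Taylor's law holds when this log-log fit is linear."""
    xs = [math.log(m) for m in means]
    ys = [math.log(v) for v in variances]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    log_a = my - b * mx
    return math.exp(log_a), b

# synthetic check: pairs constructed so that var = 2 * mean^1.5 exactly
means = [1.0, 2.0, 5.0, 10.0, 50.0]
variances = [2.0 * m ** 1.5 for m in means]
a, b = fit_taylors_law(means, variances)  # recovers a = 2, b = 1.5
```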

  12. Adopting Basic Principles of the United Nations Academic Impact Initiative (UNAI): Can Cultural Differences Be Predicted from Value Orientations and Globalization?

    PubMed Central

    Nechtelberger, Andrea; Renner, Walter; Nechtelberger, Martin; Supeková, Soňa Chovanová; Hadjimarkou, Maria; Offurum, Chino; Ramalingam, Panchalan; Senft, Birgit; Redfern, Kylie

    2017-01-01

The United Nations Academic Impact (UNAI) Initiative has set forth 10 Basic Principles for higher education. In the present study, a 10-item self-report questionnaire measuring personal endorsement of these principles was tested with university and post-graduate students from Austria, China, Cyprus, India, Nigeria, and Slovakia (total N = 976, N = 627 female, mean age 24.7 years, s = 5.7). Starting from the assumptions of Moral Foundations Theory (MFT), we expected that personal attitudes toward the UNAI Basic Principles would be predicted by endorsement of various moral foundations as suggested by MFT and by the individual's degree of globalization. Whereas for the Austrian, Cypriot, and Nigerian sub-samples this assumption was largely confirmed, for the Chinese, Indian, and Slovak sub-samples only small amounts of the variance could be explained by regression models. All six sub-samples differed substantially with regard to their overall questionnaire responses: by five discriminant functions 83.6% of participants were classified correctly. We conclude that implementation of UNAI principles should adhere closely to the cultural requirements of the respective society and, where necessary, should be accompanied by thorough informational campaigns about UN educational goals. PMID:29180977

  13. Adopting Basic Principles of the United Nations Academic Impact Initiative (UNAI): Can Cultural Differences Be Predicted from Value Orientations and Globalization?

    PubMed

    Nechtelberger, Andrea; Renner, Walter; Nechtelberger, Martin; Supeková, Soňa Chovanová; Hadjimarkou, Maria; Offurum, Chino; Ramalingam, Panchalan; Senft, Birgit; Redfern, Kylie

    2017-01-01

The United Nations Academic Impact (UNAI) Initiative has set forth 10 Basic Principles for higher education. In the present study, a 10-item self-report questionnaire measuring personal endorsement of these principles was tested with university and post-graduate students from Austria, China, Cyprus, India, Nigeria, and Slovakia (total N = 976, N = 627 female, mean age 24.7 years, s = 5.7). Starting from the assumptions of Moral Foundations Theory (MFT), we expected that personal attitudes toward the UNAI Basic Principles would be predicted by endorsement of various moral foundations as suggested by MFT and by the individual's degree of globalization. Whereas for the Austrian, Cypriot, and Nigerian sub-samples this assumption was largely confirmed, for the Chinese, Indian, and Slovak sub-samples only small amounts of the variance could be explained by regression models. All six sub-samples differed substantially with regard to their overall questionnaire responses: by five discriminant functions 83.6% of participants were classified correctly. We conclude that implementation of UNAI principles should adhere closely to the cultural requirements of the respective society and, where necessary, should be accompanied by thorough informational campaigns about UN educational goals.

  14. Aspects of fluency in writing.

    PubMed

    Uppstad, Per Henning; Solheim, Oddny Judith

    2007-03-01

The notion of 'fluency' is most often associated with spoken-language phenomena such as stuttering. The present article investigates the relevance of considering fluency in writing. The basic argument for raising this question is empirical: it follows from a focus on difficulties in written and spoken language as manifestations of different problems which should be investigated separately on the basis of their symptoms. Key-logging instruments provide new possibilities for the study of writing. The obvious use of this new technology is to study writing as it unfolds in real time, instead of focusing only on aspects of the end product. A more sophisticated application is to exploit the key-logging instrument in order to test basic assumptions of contemporary theories of spelling. The present study is a dictation task involving words and non-words, intended to investigate spelling in nine-year-old pupils with regard to their mastery of the doubling of consonants in Norwegian. In this study, we report on differences with regard to temporal measures between a group of strong writers and a group of poor ones. On the basis of these pupils' writing behavior, the relevance of the concept of 'fluency' in writing is highlighted. The interpretation of the results questions basic assumptions of the cognitive hypothesis about spelling; the article concludes by hypothesizing a different conception of spelling.

  15. Calculation of Temperature Rise in Calorimetry.

    ERIC Educational Resources Information Center

    Canagaratna, Sebastian G.; Witt, Jerry

    1988-01-01

    Gives a simple but fuller account of the basis for accurately calculating temperature rise in calorimetry. Points out some misconceptions regarding these calculations. Describes two basic methods, the extrapolation to zero time and the equal area method. Discusses the theoretical basis of each and their underlying assumptions. (CW)

  16. Helicopter Toy and Lift Estimation

    ERIC Educational Resources Information Center

    Shakerin, Said

    2013-01-01

    A $1 plastic helicopter toy (called a Wacky Whirler) can be used to demonstrate lift. Students can make basic measurements of the toy, use reasonable assumptions and, with the lift formula, estimate the lift, and verify that it is sufficient to overcome the toy's weight. (Contains 1 figure.)

  17. The Rural School Principalship: Unique Challenges, Opportunities.

    ERIC Educational Resources Information Center

    Hill, Jonathan

    1993-01-01

    Presents findings based on author's research and experience as principal in California's Mojave Desert. Five basic characteristics distinguish the rural principalship: lack of an assistant principal or other support staff; assumption of other duties, including central office tasks, teaching, or management of another site; less severe student…

  18. Teacher Education: Of the People, by the People, and for the People.

    ERIC Educational Resources Information Center

    Clinton, Hillary Rodham

    1985-01-01

    Effective inservice programs are necessary to ensure that current reforms in education are properly implemented. Inservice programs must meet the needs of both the educational system and educators. Six basic policy assumptions dealing with what is needed in inservice education are discussed. (DF)

  19. School Discipline Disproportionality: Culturally Competent Interventions for African American Males

    ERIC Educational Resources Information Center

    Simmons-Reed, Evette A.; Cartledge, Gwendolyn

    2014-01-01

    Exclusionary policies are practiced widely in schools despite being associated with extremely poor outcomes for culturally and linguistically diverse students, particularly African American males with and without disabilities. This article discusses zero tolerance policies, the related research questioning their basic assumptions, and the negative…

  20. Educational Evaluation: Analysis and Responsibility.

    ERIC Educational Resources Information Center

    Apple, Michael W., Ed.; And Others

    This book presents controversial aspects of evaluation and aims at broadening perspectives and insights in the evaluation field. Chapter 1 criticizes modes of evaluation and the basic rationality behind them and focuses on assumptions that have problematic consequences. Chapter 2 introduces concepts of evaluation and examines methods of grading…

  1. General Nature of Multicollinearity in Multiple Regression Analysis.

    ERIC Educational Resources Information Center

    Liu, Richard

    1981-01-01

    Discusses multiple regression, a very popular statistical technique in the field of education. One of the basic assumptions in regression analysis requires that independent variables in the equation should not be highly correlated. The problem of multicollinearity and some of the solutions to it are discussed. (Author)
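The multicollinearity problem flagged above is commonly diagnosed with the variance inflation factor (VIF); in the two-predictor case, VIF = 1 / (1 - r²), where r is the Pearson correlation between the predictors. A minimal sketch with invented, illustrative data (the variable names and the "above ~10" rule of thumb are conventions, not from this article):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def vif_two_predictors(x1, x2):
    """Variance inflation factor for a two-predictor regression:
    VIF = 1 / (1 - r^2). Values above ~10 are a common red flag."""
    r = pearson_r(x1, x2)
    return 1.0 / (1.0 - r ** 2)

# hypothetical predictors: age tracks years of experience almost exactly
years_experience = [1, 2, 3, 4, 5, 6, 7, 8]
age = [23, 24, 26, 27, 28, 30, 31, 33]
vif_two_predictors(years_experience, age)  # large: the predictors overlap
```

A large VIF means the regression cannot separate the two predictors' effects, which is exactly the violation the abstract describes.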

  2. Feminism, Communication and the Politics of Knowledge.

    ERIC Educational Resources Information Center

    Gallagher, Margaret

    Recent retrieval of pre-nineteenth century feminist thought provides a telling lesson in the politics of knowledge creation and control. From a feminist perspective, very little research carried out within the critical research paradigm questions the "basic assumptions, conventional wisdom, media myths and the accepted way of doing…

  3. A Neo-Kohlbergian Approach to Morality Research.

    ERIC Educational Resources Information Center

    Rest, James R.; Narvaez, Darcia; Thoma, Stephen J.; Bebeau, Muriel J.

    2000-01-01

    Proposes a model of moral judgment that builds on Lawrence Kohlberg's core assumptions. Addresses the concerns that have surfaced related to Kohlberg's work in moral judgment. Presents an overview of this model using Kohlberg's basic starting points, ideas from cognitive science, and developments in moral philosophy. (CMK)

  4. Reconciling Time, Space and Function: A New Dorsal-Ventral Stream Model of Sentence Comprehension

    ERIC Educational Resources Information Center

    Bornkessel-Schlesewsky, Ina; Schlesewsky, Matthias

    2013-01-01

    We present a new dorsal-ventral stream framework for language comprehension which unifies basic neurobiological assumptions (Rauschecker & Scott, 2009) with a cross-linguistic neurocognitive sentence comprehension model (eADM; Bornkessel & Schlesewsky, 2006). The dissociation between (time-dependent) syntactic structure-building and…

  5. Qualitative Research in Counseling Psychology: Conceptual Foundations

    ERIC Educational Resources Information Center

    Morrow, Susan L.

    2007-01-01

    Beginning with calls for methodological diversity in counseling psychology, this article addresses the history and current state of qualitative research in counseling psychology. It identifies the historical and disciplinary origins as well as basic assumptions and underpinnings of qualitative research in general, as well as within counseling…

  6. Intergenerational resource transfers with random offspring numbers

    PubMed Central

    Arrow, Kenneth J.; Levin, Simon A.

    2009-01-01

    A problem common to biology and economics is the transfer of resources from parents to children. We consider the issue under the assumption that the number of offspring is unknown and can be represented as a random variable. There are 3 basic assumptions. The first assumption is that a given body of resources can be divided into consumption (yielding satisfaction) and transfer to children. The second assumption is that the parents' welfare includes a concern for the welfare of their children; this is recursive in the sense that the children's welfares include concern for their children and so forth. However, the welfare of a child from a given consumption is counted somewhat differently (generally less) than that of the parent (the welfare of a child is “discounted”). The third assumption is that resources transferred may grow (or decline). In economic language, investment, including that in education or nutrition, is productive. Under suitable restrictions, precise formulas for the resulting allocation of resources are found, demonstrating that, depending on the shape of the utility curve, uncertainty regarding the number of offspring may or may not favor increased consumption. The results imply that wealth (stock of resources) will ultimately have a log-normal distribution. PMID:19617553

  7. Telepresence for space: The state of the concept

    NASA Technical Reports Server (NTRS)

    Smith, Randy L.; Gillan, Douglas J.; Stuart, Mark A.

    1990-01-01

The purpose here is to examine the concept of telepresence critically. To accomplish this goal, first, the assumptions that underlie telepresence and its applications are examined, and second, the issues raised by that examination are discussed. Also, these assumptions and issues are used as a means of shifting the focus in telepresence from development to user-based research. The most basic assumption of telepresence is that the information being provided to the human must be displayed in a natural fashion, i.e., the information should be displayed to the same human sensory modalities, and in the same fashion, as if the person were actually at the remote site. A further fundamental assumption for the functional use of telepresence is that a sense of being present in the work environment will produce superior performance. In other words, that sense of being there would allow the human operator of a distant machine to take greater advantage of his or her considerable perceptual, cognitive, and motor capabilities in the performance of a task than would more limited task-related feedback. Finally, a third fundamental assumption of functional telepresence is that the distant machine under the operator's control must substantially resemble a human in dexterity.

  8. Communication failure: basic components, contributing factors, and the call for structure.

    PubMed

    Dayton, Elizabeth; Henriksen, Kerm

    2007-01-01

    Communication is a taken-for-granted human activity that is recognized as important once it has failed. Communication failures are a major contributor to adverse events in health care. The components and processes of communication converge in an intricate manner, creating opportunities for misunderstanding along the way. When a patient's safety is at risk, providers should speak up (that is, initiate a message) to draw attention to the situation before harm is caused. They should also clearly explain (encode) and understand (decode) each other's diagnosis and recommendations to ensure well coordinated delivery of care. Beyond basic dyadic communication exchanges, an intricate web of individual, group, and organizational factors--more specifically, cognitive workload, implicit assumptions, authority gradients, diffusion of responsibility, and transitions of care--complicate communication. More structured and explicitly designed forms of communication have been recommended to reduce ambiguity, enhance clarity, and send an unequivocal signal, when needed, that a different action is required. Read-backs, Situation-Background-Assessment-Recommendation, critical assertions, briefings, and debriefings are seeing increasing use in health care. CODA: Although structured forms of communication have good potential to enhance clarity, they are not fail-safe. Providers need to be sensitive to unexpected consequences regarding their use.

  9. ["Dieu et cerveau, rien que Dieu et cerveau!" Johann Gottfried von Herder (1744-1803) and the neurosciences of this time].

    PubMed

    Stahnisch, Frank

    2007-01-01

The impact of Johann Gottfried von Herder on the broad spectrum of the history of ideas can hardly be captured by categories derived from individual disciplines. It transcends the spheres of philosophy, theology, historiography and even medical anthropology, not least because Herder, unlike many of his contemporary philosophers and hommes de lettres, was particularly interested in the neurophysiological and neuroanatomical investigations of his time. Herder's universal interest in human learning is reflected in numerous personal contacts to contemporary academic scholars and natural scientists, such as the Swiss theologian Johann Caspar Lavater, whose physiognomic doctrine mapped out a comprehensive research programme on character analysis, or the Mainz anatomist Samuel Thomas von Soemmering. Herder closely engaged with the latter's assumption about the interplay between the human soul and the anatomy of the brain. In this article, it shall be demonstrated that Herder's neurophilosophy was primarily influenced by a "pandynamic assumption of nature" and that it designated the brain centrally as a "working tool of God", situated between the human faculties of rationality, feeling and bodily development. The attractiveness of this concept to both basic brain research and clinical neurology was a result of his anthropological approach, which combined the latest developments in the natural sciences with a central perspective on the human sciences.

  10. Three regularities of recognition memory: the role of bias.

    PubMed

    Hilford, Andrew; Maloney, Laurence T; Glanzer, Murray; Kim, Kisok

    2015-12-01

    A basic assumption of Signal Detection Theory is that decisions are made on the basis of likelihood ratios. In a preceding paper, Glanzer, Hilford, and Maloney (Psychonomic Bulletin & Review, 16, 431-455, 2009) showed that the likelihood ratio assumption implies that three regularities will occur in recognition memory: (1) the Mirror Effect, (2) the Variance Effect, (3) the normalized Receiver Operating Characteristic (z-ROC) Length Effect. The paper offered formal proofs and computational demonstrations that decisions based on likelihood ratios produce the three regularities. A survey of data based on group ROCs from 36 studies validated the likelihood ratio assumption by showing that its three implied regularities are ubiquitous. The study noted, however, that bias, another basic factor in Signal Detection Theory, can obscure the Mirror Effect. In this paper we examine how bias affects the regularities at the theoretical level. The theoretical analysis shows: (1) how bias obscures the Mirror Effect, not the other two regularities, and (2) four ways to counter that obscuring. We then report the results of five experiments that support the theoretical analysis. The analyses and the experimental results also demonstrate: (1) that the three regularities govern individual, as well as group, performance, (2) alternative explanations of the regularities are ruled out, and (3) that Signal Detection Theory, correctly applied, gives a simple and unified explanation of recognition memory data.

  11. D-OPTIMAL EXPERIMENTAL DESIGNS TO TEST FOR DEPARTURE FROM ADDITIVITY IN A FIXED-RATIO MIXTURE RAY.

    EPA Science Inventory

    Humans are exposed to mixtures of environmental compounds. A regulatory assumption is that the mixtures of chemicals act in an additive manner. However, this assumption requires experimental validation. Traditional experimental designs (full factorial) require a large number of e...

  12. Plant uptake of elements in soil and pore water: field observations versus model assumptions.

    PubMed

    Raguž, Veronika; Jarsjö, Jerker; Grolander, Sara; Lindborg, Regina; Avila, Rodolfo

    2013-09-15

Contaminant concentrations in various edible plant parts transfer hazardous substances from polluted areas to animals and humans. Thus, the accurate prediction of plant uptake of elements is of significant importance. The processes involved contain many interacting factors and are, as such, complex. In contrast, the most common current way to quantify element transfer from soils into plants is relatively simple, using an empirical soil-to-plant transfer factor (TF). This practice is based on theoretical assumptions that have previously been shown not to be generally valid. Using field data on concentrations of 61 basic elements in spring barley, soil and pore water at four agricultural sites in mid-eastern Sweden, we quantify element-specific TFs. Our aim is to investigate to what extent observed element-specific uptake is consistent with TF model assumptions and to what extent TFs can be used to predict observed differences in concentrations between different plant parts (root, stem and ear). Results show that for most elements, plant-ear concentrations are not linearly related to bulk soil concentrations, which is congruent with previous studies. This behaviour violates a basic TF model assumption of linearity. However, substantially better linear correlations are found when weighted average element concentrations in whole plants are used for TF estimation. The highest number of linearly-behaving elements was found when relating average plant concentrations to soil pore-water concentrations. In contrast to other elements, essential elements (micronutrients and macronutrients) exhibited relatively small differences in concentration between different plant parts. Generally, the TF model was shown to work reasonably well for micronutrients, whereas it did not for macronutrients. The results also suggest that plant uptake of elements from sources other than the soil compartment (e.g. from air) may be non-negligible. Copyright © 2013 Elsevier Ltd. All rights reserved.
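The empirical transfer-factor model the study tests can be sketched in a few lines. The element, site count, and concentration values below are hypothetical illustrations chosen only to show the linearity check: a TF that stays roughly constant across sites is what the model's linearity assumption requires.

```python
def transfer_factor(c_plant, c_soil):
    # Empirical soil-to-plant transfer factor: TF = C_plant / C_soil.
    # The TF model assumes C_plant is linear in C_soil, i.e. TF is constant.
    return c_plant / c_soil

# Hypothetical concentrations (mg/kg) of one element at four sites.
c_soil = [10.0, 20.0, 40.0, 80.0]
c_plant = [1.2, 2.2, 4.5, 8.8]  # roughly linear in soil concentration

tfs = [transfer_factor(p, s) for p, s in zip(c_plant, c_soil)]
print(tfs)  # near-constant TFs across sites support the linearity assumption
```

A large spread in the per-site TFs for a given element would indicate the nonlinearity the study reports for plant-ear concentrations versus bulk soil.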

  13. Quality Control and Nondestructive Evaluation Techniques for Composites. Part 2. Physiochemical Characterization Techniques - A State-of-the Art Review

    DTIC Science & Technology

    1983-05-01

in the presence of fillers or without it. The basic assumption made is that the heat of reaction is proportional to the extent of the reaction...the scanning mechanism will isolate the frequency range falling on the detector. In this manner, the spectrum...There are two basic

  14. Production process stability - core assumption of INDUSTRY 4.0 concept

    NASA Astrophysics Data System (ADS)

    Chromjakova, F.; Bobak, R.; Hrusecka, D.

    2017-06-01

Today’s industrial enterprises confront a basic problem in implementing the INDUSTRY 4.0 concept: stabilised manufacturing and supporting processes. Through such stabilisation, they can achieve positive digital management of both kinds of processes and continuous throughput. Structural stability of horizontal (business) and vertical (digitized) manufacturing processes is required, supported by the digitalised technologies of the INDUSTRY 4.0 concept. The results presented in this paper are based on research and a survey conducted in several industrial companies. A basic model for structural process stabilisation in the manufacturing environment is then described.

  15. National Launch System comparative economic analysis

    NASA Technical Reports Server (NTRS)

    Prince, A.

    1992-01-01

Results are presented from an analysis of economic benefits (or losses), in the form of the life cycle cost savings, resulting from the development of the National Launch System (NLS) family of launch vehicles. The analysis was carried out by comparing various NLS-based architectures with the current Shuttle/Titan IV fleet. The basic methodology behind this NLS analysis was to develop a set of annual payload requirements for the Space Station Freedom and LEO, to design launch vehicle architectures around these requirements, and to perform life-cycle cost analyses on all of the architectures. An SEI requirement was included. Launch failure costs were estimated and combined with the relative reliability assumptions to measure the effects of losses. Based on the analysis, a Shuttle/NLS architecture evolving into a pressurized-logistics-carrier/NLS architecture appears to offer the best long-term cost benefit.

  16. Theory and applications survey of decentralized control methods

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1975-01-01

    A nonmathematical overview is presented of trends in the general area of decentralized control strategies which are suitable for hierarchical systems. Advances in decentralized system theory are closely related to advances in the so-called stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools pertaining to the classical stochastic control problem are outlined. Particular attention is devoted to pitfalls in the mathematical problem formulation for decentralized control. Major conclusions are that any purely deterministic approach to multilevel hierarchical dynamic systems is unlikely to lead to realistic theories or designs, that the flow of measurements and decisions in a decentralized system should not be instantaneous and error-free, and that delays in information exchange in a decentralized system lead to reasonable approaches to decentralized control. A mathematically precise notion of aggregating information is not yet available.

  17. Why is it Doing That? - Assumptions about the FMS

    NASA Technical Reports Server (NTRS)

    Feary, Michael; Immanuel, Barshi; Null, Cynthia H. (Technical Monitor)

    1998-01-01

In the glass cockpit, it's not uncommon to hear exclamations such as "why is it doing that?". Sometimes pilots ask "what were they thinking when they set it this way?" or "why doesn't it tell me what it's going to do next?". Pilots may hold a conceptual model of the automation that is the result of fleet lore, which may or may not be consistent with what the engineers had in mind. But what did the engineers have in mind? In this study, we present some of the underlying assumptions surrounding the glass cockpit. Engineers and designers make assumptions about the nature of the flight task; at the other end, instructor and line pilots make assumptions about how the automation works and how it was intended to be used. These underlying assumptions are seldom recognized or acknowledged. This study is an attempt to explicitly articulate such assumptions to better inform design and training developments. This work is part of a larger project to support training strategies for automation.

  18. Economic Theory and Broadcasting.

    ERIC Educational Resources Information Center

    Bates, Benjamin J.

    Focusing on access to audience through broadcast time, this paper examines the status of research into the economics of broadcasting. The paper first discusses the status of theory in the study of broadcast economics, both as described directly and as it exists in the statement of the basic assumptions generated by prior work and general…

  19. Dewey and Schon: An Analysis of Reflective Thinking.

    ERIC Educational Resources Information Center

    Bauer, Norman J.

    The challenge to the dominance of rationality in educational philosophy presented by John Dewey and Donald Schon is examined in this paper. The paper identifies basic assumptions of their perspective and explains concepts of reflective thinking, which include biography, context of uncertainty, and "not-yet." A model of reflective thought…

  20. Tiedeman's Approach to Career Development.

    ERIC Educational Resources Information Center

    Harren, Vincent A.

    Basic to Tiedeman's approach to career development and decision making is the assumption that one is responsible for one's own behavior because one has the capacity for choice and lives in a world which is not deterministic. Tiedeman, a cognitive-developmental theorist, views continuity of development as internal or psychological while…

  1. Linking Educational Philosophy with Micro-Level Technology: The Search for a Complete Method.

    ERIC Educational Resources Information Center

    Januszewski, Alan

    Traditionally, educational technologists have not been concerned with social or philosophical questions, and the field does not have a basic educational philosophy. Instead, it is dominated by a viewpoint characterized as "technical rationality" or "technicism"; the most important assumption of this viewpoint is that science…

  2. Network Analysis in Comparative Social Sciences

    ERIC Educational Resources Information Center

    Vera, Eugenia Roldan; Schupp, Thomas

    2006-01-01

    This essay describes the pertinence of Social Network Analysis (SNA) for the social sciences in general, and discusses its methodological and conceptual implications for comparative research in particular. The authors first present a basic summary of the theoretical and methodological assumptions of SNA, followed by a succinct overview of its…

  3. Conservatism in America--What Does it Mean for Teacher Education?

    ERIC Educational Resources Information Center

    Dolce, Carl J.

    The current conflict among opposing sets of cultural ideals is illustrated by several interrelated conditions. The conservative phenomenon is more complex than the traditional liberal-conservative dichotomy would suggest. Changes in societal conditions invite a reexamination of basic assumptions across the broad spectrum of political ideology.…

  4. Variable thickness transient ground-water flow model. Volume 1. Formulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reisenauer, A.E.

    1979-12-01

    Mathematical formulation for the variable thickness transient (VTT) model of an aquifer system is presented. The basic assumptions are described. Specific data requirements for the physical parameters are discussed. The boundary definitions and solution techniques of the numerical formulation of the system of equations are presented.

  5. A SYSTEMS ANALYSIS OF SCHOOL BOARD ACTION.

    ERIC Educational Resources Information Center

    SCRIBNER, JAY D.

The basic assumption of the functional-systems theory is that structures fulfill functions in systems and that subsystems operate separately within any type of structure. Relying mainly on Gabriel Almond's paradigm, the author attempts to determine the usefulness of the functional-systems theory in conducting empirical research of school boards.…

  6. Distance-Based and Distributed Learning: A Decision Tool for Education Leaders.

    ERIC Educational Resources Information Center

    McGraw, Tammy M.; Ross, John D.

    This decision tool presents a progression of data collection and decision-making strategies that can increase the effectiveness of distance-based or distributed learning instruction. A narrative and flow chart cover the following steps: (1) basic assumptions, including purpose of instruction, market scan, and financial resources; (2) needs…

  7. Applying the Principles of Specific Objectivity and of Generalizability to the Measurement of Change.

    ERIC Educational Resources Information Center

    Fischer, Gerhard H.

    1987-01-01

    A natural parameterization and formalization of the problem of measuring change in dichotomous data is developed. Mathematically-exact definitions of specific objectivity are presented, and the basic structures of the linear logistic test model and the linear logistic model with relaxed assumptions are clarified. (SLD)

  8. A Guide to Curriculum Planning in Mathematics. Bulletin No. 6284.

    ERIC Educational Resources Information Center

    Chambers, Donald L.; And Others

    This guide was written under the basic assumptions that the mathematics curriculum must continuously change and that mathematics is most effectively learned through a spiral approach. Further, it is assumed that the audience will be members of district mathematics curriculum committees. Instructional objectives have been organized to reveal the…

  9. Validated Test Method 1315: Mass Transfer Rates of Constituents in Monolithic or Compacted Granular Materials Using a Semi-Dynamic Tank Leaching Procedure

    EPA Pesticide Factsheets

    Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  10. Document-Oriented E-Learning Components

    ERIC Educational Resources Information Center

    Piotrowski, Michael

    2009-01-01

    This dissertation questions the common assumption that e-learning requires a "learning management system" (LMS) such as Moodle or Blackboard. Based on an analysis of the current state of the art in LMSs, we come to the conclusion that the functionality of conventional e-learning platforms consists of basic content management and…

  11. Moral Development in Higher Education

    ERIC Educational Resources Information Center

    Liddell, Debora L.; Cooper, Diane L.

    2012-01-01

    In this article, the authors lay out the basic foundational concepts and assumptions that will guide the reader through the chapters to come as the chapter authors explore "how" moral growth can be facilitated through various initiatives on the college campus. This article presents a brief review of the theoretical frameworks that provide the…

  12. Measuring Protein Interactions by Optical Biosensors

    PubMed Central

    Zhao, Huaying; Boyd, Lisa F.; Schuck, Peter

    2017-01-01

    This unit gives an introduction to the basic techniques of optical biosensing for measuring equilibrium and kinetics of reversible protein interactions. Emphasis is given to the description of robust approaches that will provide reliable results with few assumptions. How to avoid the most commonly encountered problems and artifacts is also discussed. PMID:28369667
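The workhorse model behind most optical-biosensor analyses of reversible protein interactions is the 1:1 Langmuir binding scheme, which can be sketched as follows. The parameter values (Kd, Rmax, kon, koff) are hypothetical, and the unit itself may cover more general approaches; this is only a minimal illustration of the equilibrium and kinetic relations.

```python
def equilibrium_response(conc, rmax, kd):
    # 1:1 binding isotherm: Req = Rmax * C / (Kd + C).
    # At C = Kd the response is half of Rmax.
    return rmax * conc / (kd + conc)

def observed_rate(conc, kon, koff):
    # Pseudo-first-order approach to equilibrium: kobs = kon * C + koff.
    return kon * conc + koff

kd = 1e-8     # M, hypothetical affinity
rmax = 100.0  # response units at saturation, hypothetical
print(equilibrium_response(kd, rmax, kd))   # half-saturation at C = Kd -> 50.0
print(observed_rate(1e-8, 1e6, 1e-3))       # kobs for one hypothetical analyte step
```

Fitting `equilibrium_response` across an analyte titration yields Kd with few assumptions, which is the kind of robust approach the unit emphasizes; fitting `observed_rate` versus concentration separates kon and koff.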

  13. A "View from Nowhen" on Time Perception Experiments

    ERIC Educational Resources Information Center

    Riemer, Martin; Trojan, Jorg; Kleinbohl, Dieter; Holzl, Rupert

    2012-01-01

    Systematic errors in time reproduction tasks have been interpreted as a misperception of time and therefore seem to contradict basic assumptions of pacemaker-accumulator models. Here we propose an alternative explanation of this phenomenon based on methodological constraints regarding the direction of time, which cannot be manipulated in…

  14. Teaching Literature: Some Honest Doubts.

    ERIC Educational Resources Information Center

    Rutledge, Donald G.

    1968-01-01

    The possibility that many English teachers take their subject too seriously should be considered. The assumption that literature can to any degree either improve or adversely affect students is doubtful, but the exclusive study of "great literature" in our secondary schools may invite basic reflections too early: a year's steady diet of "King…

  15. East Europe Report, Political, Sociological and Military Affairs, No. 2219

    DTIC Science & Technology

    1983-10-24

    takes place in training booths and classrooms. On the way to warrant officer one must take sociology, Russian, basic construction, materials...polemics. I admit that I like this much more than the obligatory hearty kiss on both cheeks along with, of course, the assumption that polemicists have

  16. Exceptional Children Conference Papers: Behavioral and Emotional Problems.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Arlington, VA.

    Four of the seven conference papers treating behavioral and emotional problems concern the Conceptual Project, an attempt to provide definition and evaluation of conceptual models of the various theories of emotional disturbance and their basic assumptions, and to provide training packages based on these materials. The project is described in…

  17. The Binding Properties of Quechua Suffixes.

    ERIC Educational Resources Information Center

    Weber, David

    This paper sketches an explicitly non-lexicalist application of grammatical theory to Huallaga (Huanuco) Quechua (HgQ). The advantages of applying binding theory to many suffixes that have previously been treated only as objects of the morphology are demonstrated. After an introduction, section 2 outlines basic assumptions about the nature of HgQ…

  18. Validated Test Method 1316: Liquid-Solid Partitioning as a Function of Liquid-to-Solid Ratio in Solid Materials Using a Parallel Batch Procedure

    EPA Pesticide Factsheets

    Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  19. Creating a Healthy Camp Community: A Nurse's Role.

    ERIC Educational Resources Information Center

    Lishner, Kris Miller; Bruya, Margaret Auld

    This book provides an organized, systematic overview of the basic aspects of health program management, nursing practice, and human relations issues in camp nursing. A foremost assumption is that health care in most camps needs improvement. Good health is dependent upon interventions involving social, environmental, and lifestyle factors that…

  20. Fatherless America: Confronting Our Most Urgent Social Problem.

    ERIC Educational Resources Information Center

    Blankenhorn, David

    The United States is rapidly becoming a fatherless society. Fatherlessness is the leading cause of declining child well-being, providing the impetus behind social problems such as crime, domestic violence, and adolescent pregnancy. Challenging the basic assumptions of opinion leaders in academia and in the media, this book debunks the prevailing…

  1. Teaching Strategy: A New Planet.

    ERIC Educational Resources Information Center

    O'Brien, Edward L.

    1998-01-01

    Presents a lesson for middle and secondary school students in which they respond to a hypothetical scenario that enables them to develop a list of basic rights. Expounds that students compare their list of rights to the Universal Declaration of Human Rights in order to explore the assumptions about human rights. (CMK)

  2. Session overview: forest ecosystems

    Treesearch

    John J. Battles; Robert C. Heald

    2004-01-01

    The core assumption of this symposium is that science can provide insight to management. Nowhere is this link more formally established than in regard to the science and management of forest ecosystems. The basic questions addressed are integral to our understanding of nature; the applications of this understanding are crucial to effective stewardship of natural...

  3. Human Praxis: A New Basic Assumption for Art Educators of the Future.

    ERIC Educational Resources Information Center

    Hodder, Geoffrey S.

    1980-01-01

After analyzing Vincent Lanier's five characteristic roles of art education, the article briefly explains the pedagogy of Paulo Freire, based on human praxis, and applies it to the existing "oppressive" art education system. The article reduces Lanier's roles to resemble a single Freirean model. (SB)

  4. Model-Based Reasoning

    ERIC Educational Resources Information Center

    Ifenthaler, Dirk; Seel, Norbert M.

    2013-01-01

    In this paper, there will be a particular focus on mental models and their application to inductive reasoning within the realm of instruction. A basic assumption of this study is the observation that the construction of mental models and related reasoning is a slowly developing capability of cognitive systems that emerges effectively with proper…

  5. Alternate hosts of Blepharipa pratensis (Meigen)

    Treesearch

    Paul A. Godwin; Thomas M. Odell

    1977-01-01

    A current tactic for biological control of the gypsy moth, Lymantria dispar Linnaeus, is to release its parasites in forests susceptible to gypsy moth damage before the gypsy moth arrives. The basic assumption in these anticipatory releases is that the parasites can find and utilize native insects as hosts in the interim. Blepharipa...

  6. Children and Adolescents: Should We Teach Them or Let Them Learn?

    ERIC Educational Resources Information Center

    Rohwer, William D., Jr.

    Research to date has provided too few answers for vital educational questions concerning teaching children or letting them learn. A basic problem is that experimentation usually begins by accepting conventional assumptions about schooling, ignoring experiments that would entail disturbing the ordering of current educational priorities.…

  7. R0 for vector-borne diseases: impact of the assumption for the duration of the extrinsic incubation period.

    PubMed

    Hartemink, Nienke; Cianci, Daniela; Reiter, Paul

    2015-03-01

Mathematical modeling and notably the basic reproduction number R0 have become popular tools for the description of vector-borne disease dynamics. We compare two widely used methods to calculate the probability that a vector survives the extrinsic incubation period. The two methods are based on different assumptions for the duration of the extrinsic incubation period; one method assumes a fixed period and the other assumes a fixed daily rate of becoming infectious. We conclude that the outcomes differ substantially between the methods when the average life span of the vector is short compared to the extrinsic incubation period.
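The two survival calculations being contrasted can be written down directly. This is a sketch under standard assumptions (constant daily mortality rate mu, incubation period T); the parameter values are hypothetical illustrations, not figures from the study.

```python
import math

def p_survive_fixed_period(mu, T):
    # Fixed-period assumption: the vector must live through exactly T days
    # of incubation; with constant mortality mu, survival is exp(-mu * T).
    return math.exp(-mu * T)

def p_survive_fixed_rate(mu, T):
    # Fixed-rate assumption: becoming infectious is a constant daily rate
    # gamma = 1/T competing with mortality; survival is gamma / (gamma + mu).
    gamma = 1.0 / T
    return gamma / (gamma + mu)

T = 10.0                 # extrinsic incubation period in days (hypothetical)
for mu in (0.05, 0.2):   # long-lived vs short-lived vector
    print(mu, p_survive_fixed_period(mu, T), p_survive_fixed_rate(mu, T))
```

With mu = 0.2 (mean life span 5 days, shorter than T = 10), the two assumptions give markedly different probabilities, illustrating the paper's conclusion; with mu = 0.05 they are much closer.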

  8. Likelihood ratio decisions in memory: three implied regularities.

    PubMed

    Glanzer, Murray; Hilford, Andrew; Maloney, Laurence T

    2009-06-01

    We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so. We then show that the regularities appear in data from a range of recognition studies. The analyses and data in our study support the following generalization: Individuals make efficient recognition decisions on the basis of likelihood ratios.
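The mirror effect can be illustrated with the simplest of the signal detection models: an equal-variance Gaussian model with a fixed likelihood-ratio criterion. This is a generic sketch, not the paper's own analysis, and the d' values are hypothetical.

```python
import math

def phi(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def rates(d):
    # New items ~ N(0, 1), old items ~ N(d, 1).  With equal variances, the
    # likelihood-ratio criterion LR = 1 is the point x = d/2, so the decision
    # rule "respond old when LR > 1" becomes "respond old when x > d/2".
    far = 1.0 - phi(d / 2.0)      # false-alarm rate
    hr = 1.0 - phi(d / 2.0 - d)   # hit rate
    return far, hr

far_weak, hr_weak = rates(1.0)      # weaker memory condition, d' = 1
far_strong, hr_strong = rates(2.0)  # stronger memory condition, d' = 2

# Mirror effect: the strong condition has BOTH a higher hit rate and a
# lower false-alarm rate, ordering FAR(strong) < FAR(weak) < HR(weak) < HR(strong).
print(far_strong, far_weak, hr_weak, hr_strong)
```

Because the likelihood-ratio criterion moves with d', strengthening memory improves performance on new items as well as old ones, which a fixed raw-evidence criterion would not produce.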

  9. Design and Analysis of an Electromagnetic Thrust Bearing

    NASA Technical Reports Server (NTRS)

    Banerjee, Bibhuti; Rao, Dantam K.

    1996-01-01

    A double-acting electromagnetic thrust bearing is normally used to counter the axial loads in many rotating machines that employ magnetic bearings. It essentially consists of an actuator and drive electronics. Existing thrust bearing design programs are based on several assumptions. These assumptions, however, are often violated in practice. For example, no distinction is made between maximum external loads and maximum bearing forces, which are assumed to be identical. Furthermore, it is assumed that the maximum flux density in the air gap occurs at the nominal gap position of the thrust runner. The purpose of this paper is to present a clear theoretical basis for the design of the electromagnetic thrust bearing which obviates such assumptions.

  10. Where Are We Going? Planning Assumptions for Community Colleges.

    ERIC Educational Resources Information Center

    Maas, Rao, Taylor and Associates, Riverside, CA.

    Designed to provide community college planners with a series of reference assumptions to consider in the planning process, this document sets forth assumptions related to finance (i.e., operational funds, capital funds, alternate funding sources, and campus financial operations); California state priorities; occupational trends; population (i.e.,…

  11. Mathematical Modeling: Are Prior Experiences Important?

    ERIC Educational Resources Information Center

    Czocher, Jennifer A.; Moss, Diana L.

    2017-01-01

    Why are math modeling problems the source of such frustration for students and teachers? The conceptual understanding that students have when engaging with a math modeling problem varies greatly. They need opportunities to make their own assumptions and design the mathematics to fit these assumptions (CCSSI 2010). Making these assumptions is part…

  12. Capillary Flow in Containers of Polygonal Section: Theory and Experiment

    NASA Technical Reports Server (NTRS)

    Weislogel, Mark M.; Rame, Enrique (Technical Monitor)

    2001-01-01

    An improved understanding of the large-length-scale capillary flows arising in a low-gravity environment is critical to that engineering community concerned with the design and analysis of spacecraft fluids management systems. Because a significant portion of liquid behavior in spacecraft is capillary dominated it is natural to consider designs that best exploit the spontaneous character of such flows. In the present work, a recently verified asymptotic analysis is extended to approximate spontaneous capillary flows in a large class of cylindrical containers of irregular polygonal section experiencing a step reduction in gravitational acceleration. Drop tower tests are conducted using partially-filled irregular triangular containers for comparison with the theoretical predictions. The degree to which the experimental data agree with the theory is a testament to the robustness of the basic analytical assumption of predominantly parallel flow. As a result, the closed form analytical expressions presented serve as simple, accurate tools for predicting bulk flow characteristics essential to practical low-g system design and analysis. Equations for predicting corner wetting rates, total container flow rates, and transient surfaces shapes are provided that are relevant also to terrestrial applications such as capillary flow in porous media.

  13. A Study of Crowd Ability and its Influence on Crowdsourced Evaluation of Design Concepts

    DTIC Science & Technology

    2014-05-01

identifies the experts from the crowd, under the assumptions that (1) experts do exist and (2) only experts have consistent evaluations. These assumptions...for design evaluation tasks. Keywords: crowdsourcing, design evaluation, sparse evaluation ability, machine learning...intelligence" of a much larger crowd of people with diverse backgrounds [1]. Crowdsourced evaluation, or the delegation of an evaluation task to a

  14. N-terminal segments modulate the α-helical propensities of the intrinsically disordered basic regions of bZIP proteins.

    PubMed

    Das, Rahul K; Crick, Scott L; Pappu, Rohit V

    2012-02-17

    Basic region leucine zippers (bZIPs) are modular transcription factors that play key roles in eukaryotic gene regulation. The basic regions of bZIPs (bZIP-bRs) are necessary and sufficient for DNA binding and specificity. Bioinformatic predictions and spectroscopic studies suggest that unbound monomeric bZIP-bRs are uniformly disordered as isolated domains. Here, we test this assumption through a comparative characterization of conformational ensembles for 15 different bZIP-bRs using a combination of atomistic simulations and circular dichroism measurements. We find that bZIP-bRs have quantifiable preferences for α-helical conformations in their unbound monomeric forms. This helicity varies from one bZIP-bR to another despite a significant sequence similarity of the DNA binding motifs (DBMs). Our analysis reveals that intramolecular interactions between DBMs and eight-residue segments directly N-terminal to DBMs are the primary modulators of bZIP-bR helicities. We test the accuracy of this inference by designing chimeras of bZIP-bRs to have either increased or decreased overall helicities. Our results yield quantitative insights regarding the relationship between sequence and the degree of intrinsic disorder within bZIP-bRs, and might have general implications for other intrinsically disordered proteins. Understanding how natural sequence variations lead to modulation of disorder is likely to be important for understanding the evolution of specificity in molecular recognition through intrinsically disordered regions (IDRs). Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Physical constraints on biological integral control design for homeostasis and sensory adaptation.

    PubMed

    Ang, Jordan; McMillen, David R

    2013-01-22

    Synthetic biology includes an effort to use design-based approaches to create novel controllers, biological systems aimed at regulating the output of other biological processes. The design of such controllers can be guided by results from control theory, including the strategy of integral feedback control, which is central to regulation, sensory adaptation, and long-term robustness. Realization of integral control in a synthetic network is an attractive prospect, but the nature of biochemical networks can make the implementation of even basic control structures challenging. Here we present a study of the general challenges and important constraints that will arise in efforts to engineer biological integral feedback controllers or to analyze existing natural systems. Constraints arise from the need to identify target output values that the combined process-plus-controller system can reach, and to ensure that the controller implements a good approximation of integral feedback control. These constraints depend on mild assumptions about the shape of input-output relationships in the biological components, and thus will apply to a variety of biochemical systems. We summarize our results as a set of variable constraints intended to provide guidance for the design or analysis of a working biological integral feedback controller. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.
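The defining property of integral feedback that the paper builds on, namely that integrating the error drives the steady-state error to zero even under a persistent disturbance, can be shown with a minimal discrete-time sketch. The plant model, gains, and disturbance values here are hypothetical; real biochemical implementations face exactly the constraints the paper analyzes.

```python
def simulate(setpoint, ki, gain, disturbance, steps=2000, dt=0.01):
    # Integral feedback control of a simple static plant y = gain * u + d.
    # The controller accumulates the error, so any constant disturbance d
    # is eventually rejected and y settles at the setpoint (perfect adaptation).
    integral, y = 0.0, 0.0
    for _ in range(steps):
        error = setpoint - y
        integral += error * dt
        u = ki * integral           # control input
        y = gain * u + disturbance  # plant response plus constant disturbance
    return y

print(simulate(setpoint=2.0, ki=1.0, gain=0.5, disturbance=1.0))
```

Rerunning with a different `disturbance` still converges to the same setpoint, which is the robustness property that makes integral control attractive for homeostasis and sensory adaptation.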

  16. Academic Public Relations Curricula: How They Compare with the Bateman-Cutlip Commission Standards.

    ERIC Educational Resources Information Center

    McCartney, Hunter P.

    To see what effect the 1975 Bateman-Cutlip Commission's recommendations have had on improving public relations education in the United States, 173 questionnaires were sent to colleges or universities with accredited or comprehensive programs in public relations. Responding to five basic assumptions underlying the commission's recommendations,…

  17. Faculty and Student Attitudes about Transfer of Learning

    ERIC Educational Resources Information Center

    Lightner, Robin; Benander, Ruth; Kramer, Eugene F.

    2008-01-01

    Transfer of learning is using previous knowledge in novel contexts. While this is a basic assumption of the educational process, students may not always perceive all the options for using what they have learned in different, novel situations. Within the framework of transfer of learning, this study outlines an attitudinal survey concerning faculty…

  18. New Directions in Teacher Education: Foundations, Curriculum, Policy.

    ERIC Educational Resources Information Center

    Denton, Jon, Ed.; And Others

    This publication includes presentations made at the Aikin-Stinnett Lecture Series and follow-up papers sponsored by the Instructional Research Laboratory at Texas A&M University. The papers in this collection focus upon the basic assumptions and conceptual bases of teacher education and the use of research in providing a foundation for…

  19. Perspective Making: Constructivism as a Meaning-Making Structure for Simulation Gaming

    ERIC Educational Resources Information Center

    Lainema, Timo

    2009-01-01

    Constructivism has recently gained popularity, although it is not a completely new learning paradigm. Much of the work within e-learning, for example, uses constructivism as a reference "discipline" (explicitly or implicitly). However, some of the work done within the simulation gaming (SG) community discusses what the basic assumptions and…

  20. Spiral Growth in Plants: Models and Simulations

    ERIC Educational Resources Information Center

    Allen, Bradford D.

    2004-01-01

    The analysis and simulation of spiral growth in plants integrates algebra and trigonometry in a botanical setting. When the ideas presented here are used in a mathematics classroom/computer lab, students can better understand how basic assumptions about plant growth lead to the golden ratio and how the use of circular functions leads to accurate…

  1. Dynamic Assessment and Its Implications for RTI Models

    ERIC Educational Resources Information Center

    Wagner, Richard K.; Compton, Donald L.

    2011-01-01

    Dynamic assessment refers to assessment that combines elements of instruction for the purpose of learning something about an individual that cannot be learned as easily or at all from conventional assessment. The origins of dynamic assessment can be traced to Thorndike (1924), Rey (1934), and Vygotsky (1962), who shared three basic assumptions.…

  2. Looking for Skinner and Finding Freud

    ERIC Educational Resources Information Center

    Overskeid, Geir

    2007-01-01

    Sigmund Freud and B. F. Skinner are often seen as psychology's polar opposites. It seems this view is fallacious. Indeed, Freud and Skinner had many things in common, including basic assumptions shaped by positivism and determinism. More important, Skinner took a clear interest in psychoanalysis and wanted to be analyzed but was turned down. His…

  3. Student Teachers' Beliefs about the Teacher's Role in Inclusive Education

    ERIC Educational Resources Information Center

    Domovic, Vlatka; Vizek Vidovic, Vlasta; Bouillet, Dejana

    2017-01-01

    The main aim of this research is to examine the basic features of student teachers' professional beliefs about the teacher's role in relation to teaching mainstream pupils and pupils with developmental disabilities. The starting assumption of this analysis is that teacher professional development is largely dependent upon teachers' beliefs about…

  4. United States Air Force Training Line Simulator. Final Report.

    ERIC Educational Resources Information Center

    Nauta, Franz; Pierce, Michael B.

    This report describes the technical aspects and potential applications of a computer-based model simulating the flow of airmen through basic training and entry-level technical training. The objective of the simulation is to assess the impacts of alternative recruit classification and training policies under a wide variety of assumptions regarding…

  5. Cable in Boston; A Basic Viability Report.

    ERIC Educational Resources Information Center

    Hauben, Jan Ward; And Others

    The viability of urban cable television (CATV) as an economic phenomenon is examined via a case study of its feasibility in Boston, a microcosm of the general urban environment. To clarify cable's economics, a unitary concept of viability is used in which all local characteristics, cost assumptions, and growth estimates are structured dynamically as a…

  6. "I Fell off [the Mothering] Track": Barriers to "Effective Mothering" among Prostituted Women

    ERIC Educational Resources Information Center

    Dalla, Rochelle

    2004-01-01

    Ecological theory and basic assumptions for the promotion of effective mothering among low-income and working-poor women are applied in relation to a particularly vulnerable population: street-level prostitution-involved women. Qualitative data from 38 street-level prostituted women shows barriers to effective mothering at the individual,…

  7. Between "Homo Sociologicus" and "Homo Biologicus": The Reflexive Self in the Age of Social Neuroscience

    ERIC Educational Resources Information Center

    Pickel, Andreas

    2012-01-01

    The social sciences rely on assumptions of a unified self for their explanatory logics. Recent work in the new multidisciplinary field of social neuroscience challenges precisely this unproblematic character of the subjective self as a basic, well-defined entity. If disciplinary self-insulation is deemed unacceptable, the philosophical challenge…

  8. Fueling a Third Paradigm of Education: The Pedagogical Implications of Digital, Social and Mobile Media

    ERIC Educational Resources Information Center

    Pavlik, John V.

    2015-01-01

    Emerging technologies are fueling a third paradigm of education. Digital, networked and mobile media are enabling a disruptive transformation of the teaching and learning process. This paradigm challenges traditional assumptions that have long characterized educational institutions and processes, including basic notions of space, time, content,…

  9. Using LISREL to Evaluate Measurement Models and Scale Reliability.

    ERIC Educational Resources Information Center

    Fleishman, John; Benson, Jeri

    1987-01-01

    LISREL program was used to examine measurement model assumptions and to assess reliability of Coopersmith Self-Esteem Inventory for Children, Form B. Data on 722 third-sixth graders from over 70 schools in large urban school district were used. LISREL program assessed (1) nature of basic measurement model for scale, (2) scale invariance across…

  10. What Are We Looking For?--Pro Critical Realism in Text Interpretation

    ERIC Educational Resources Information Center

    Siljander, Pauli

    2011-01-01

    A visible role in the theoretical discourses on education has been played in the last couple of decades by the constructivist epistemologies, which have questioned the basic assumptions of realist epistemologies. The increased popularity of interpretative approaches especially has put the realist epistemologies on the defensive. Basing itself on…

  11. The Hidden Reason Behind Children's Misbehavior.

    ERIC Educational Resources Information Center

    Nystul, Michael S.

    1986-01-01

    Discusses hidden reason theory, based on the assumptions that: (1) the nature of people is positive; (2) a child's most basic psychological need is involvement; and (3) a child has four possible choices in life (good somebody, good nobody, bad somebody, or severely mentally ill). A three-step approach for implementing hidden reason theory is…

  12. 78 FR 26269 - Connect America Fund; High-Cost Universal Service Support

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-06

    ... the model platform, which is the basic framework for the model consisting of key assumptions about the... combination of competitive bidding and a new forward-looking model of the cost of constructing modern multi-purpose networks.'' Using the cost model to ``estimate the support necessary to serve areas where costs...

  13. The Effective Elementary School Principal: Theoretical Bases, Research Findings and Practical Implications.

    ERIC Educational Resources Information Center

    Burnett, I. Emett, Jr.; Pankake, Anita M.

    Although much of the current school reform movement relies on the basic assumption of effective elementary school administration, insufficient effort has been made to synthesize key concepts found in organizational theory and management studies with relevant effective schools research findings. This paper attempts such a synthesis to help develop…

  14. Response: Training Doctoral Students to Be Scientists

    ERIC Educational Resources Information Center

    Pollio, David E.

    2012-01-01

    The purpose of this article is to begin framing doctoral training for a science of social work. This process starts by examining two seemingly simple questions: "What is a social work scientist?" and "How do we train social work scientists?" In answering the first question, some basic assumptions and concepts about what constitutes a "social work…

  15. Adults with Intellectual and Developmental Disabilities and Participation in Decision Making: Ethical Considerations for Professional-Client Practice

    ERIC Educational Resources Information Center

    Lotan, Gurit; Ells, Carolyn

    2010-01-01

    In this article, the authors challenge professionals to re-examine assumptions about basic concepts and their implications in supporting adults with intellectual and developmental disabilities. The authors focus on decisions with significant implications, such as planning transition from school to adult life, changing living environments, and…

  16. A Convergence of Two Cultures in the Implementation of P.L. 94-142.

    ERIC Educational Resources Information Center

    Haas, Toni J.

    The Education for All Handicapped Children Act (PL 94-142) demanded basic changes in the practices, purposes, and institutional structures of schools to accommodate handicapped students, but did not adequately address the differences between general and special educators in expectations, training, or assumptions about the functions of schooling…

  17. From Earth to Space--Advertising Films Created in a Computer-Based Primary School Task

    ERIC Educational Resources Information Center

    Öman, Anne

    2017-01-01

    Today, teachers orchestrate computer-based tasks in software applications in Swedish primary schools. Meaning is made through various modes, and multimodal perspectives on literacy have the basic assumption that meaning is made through many representational and communicational resources. The case study presented in this paper has analysed pupils'…

  18. Child Sexual Abuse: Intervention and Treatment Issues. The User Manual Series.

    ERIC Educational Resources Information Center

    Faller, Kathleen Coulborn

    This manual describes professional practices in intervention and treatment of sexual abuse and discusses how to address the problems of sexually abused children and their families. It makes an assumption that the reader has basic information about sexual abuse. The discussion focuses primarily on the child's guardian as the abuser. The manual…

  19. A Comparative Analysis of Selected Mechanical Aspects of the Ice Skating Stride.

    ERIC Educational Resources Information Center

    Marino, G. Wayne

    This study quantitatively analyzes selected aspects of the skating strides of above-average and below-average ability skaters. Subproblems were to determine how stride length and stride rate are affected by changes in skating velocity, to ascertain whether the basic assumption that stride length accurately approximates horizontal movement of the…

  20. Implementing a Redesign Strategy: Lessons from Educational Change.

    ERIC Educational Resources Information Center

    Basom, Richard E., Jr.; Crandall, David P.

    The effective implementation of school redesign, based on a social systems approach, is discussed in this paper. A basic assumption is that the interdependence of system elements has implications for a complex change process. Seven barriers to redesign and five critical issues for successful redesign strategy are presented. Seven linear steps for…

  1. Children Are Human Beings

    ERIC Educational Resources Information Center

    Bossard, James H. S.

    2017-01-01

    The basic assumption underlying this article is that the really significant changes in human history are those that occur, not in the mechanical gadgets which men use nor in the institutionalized arrangements by which they live, but in their attitudes and in the values which they accept. The revolutions of the past that have had the greatest…

  2. Civility in Politics and Education. Routledge Studies in Contemporary Philosophy

    ERIC Educational Resources Information Center

    Mower, Deborah, Ed.; Robison, Wade L., Ed.

    2011-01-01

    This book examines the concept of civility and the conditions of civil disagreement in politics and education. Although many assume that civility is merely polite behavior, it functions to aid rational discourse. Building on this basic assumption, the book offers multiple accounts of civility and its contribution to citizenship, deliberative…

  3. Improving Clinical Teaching: The ADN Experience. Pathways to Practice.

    ERIC Educational Resources Information Center

    Haase, Patricia T.; And Others

    Three Florida associate degree in nursing (ADN) demonstration projects of the Nursing Curriculum Project (NCP) are described, and the history of the ADN program and current controversies are reviewed. In 1976, the NCP of the Southern Regional Education Board issued basic assumptions about the role of the ADN graduate, relating them to client…

  4. Development and Validation of a Clarinet Performance Adjudication Scale

    ERIC Educational Resources Information Center

    Abeles, Harold F.

    1973-01-01

    A basic assumption of this study is that there are generally agreed upon performance standards as evidenced by the use of adjudicators for evaluations at contests and festivals. An evaluation instrument was developed to enable raters to measure effectively those aspects of performance that have common standards of proficiency. (Author/RK)

  5. Organize Your School for Improvement

    ERIC Educational Resources Information Center

    Truby, William F.

    2017-01-01

    W. Edwards Deming has suggested that 96% of an organization's performance is a function of the organization's structure. He contends that only about 4% of an organization's performance is attributable to the people. This is a fundamental difference, as most school leaders work with the basic assumption that 80% of a school's performance is related to staff and…

  6. Training for Basic Skills or Educating Workers?: Changing Conceptions of Workplace Education Programs.

    ERIC Educational Resources Information Center

    Schultz, Katherine

    Although the National Workplace Literacy Program is relatively new, a new orthodoxy of program development based on particular understandings of literacy and learning has emerged. Descriptions of two model workplace education programs are the beginning points for an examination of the assumptions contained in most reports of workplace education…

  7. Appreciative Inquiry: A Model for Organizational Development and Performance Improvement in Student Affairs

    ERIC Educational Resources Information Center

    Elleven, Russell K.

    2007-01-01

    The article examines a relatively new tool to increase the effectiveness of organizations and people. The recent development and background of Appreciative Inquiry (AI) is reviewed. Basic assumptions of the model are discussed. Implications for departments and divisions of student affairs are analyzed. Finally, suggested readings and workshop…

  8. An Economic Theory of School Governance.

    ERIC Educational Resources Information Center

    Rada, Roger D.

    Working from the basic assumption that the primary motivation for those involved in school governance is self-interest, this paper develops and discusses 15 hypotheses that form the essential elements of an economic theory of school governance. The paper opens with a review of previous theories of governance and their origins in social science…

  9. The Effectiveness of Ineffectiveness: A New Approach to Assessing Patterns of Organizational Effectiveness.

    ERIC Educational Resources Information Center

    Cameron, Kim S.

    A way to assess and improve organizational effectiveness is discussed, with a focus on factors that inhibit successful organizational performance. The basic assumption is that it is easier, more accurate, and more beneficial for individuals and organizations to identify criteria of ineffectiveness (faults and weaknesses) than to identify criteria…

  10. Validated Test Method 1314: Liquid-Solid Partitioning as a Function of Liquid-Solid Ratio for Constituents in Solid Materials Using An Up-Flow Percolation Column Procedure

    EPA Pesticide Factsheets

    Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  11. Testing Intercultural Competence in (International) English: Some Basic Questions and Suggested Answers

    ERIC Educational Resources Information Center

    Camerer, Rudi

    2014-01-01

    The testing of intercultural competence has long been regarded as the field of psychometric test procedures, which claim to analyse an individual's personality by specifying and quantifying personality traits with the help of self-answer questionnaires and the statistical evaluation of these. The underlying assumption is that what is analysed and…

  12. Lifeboat Counseling: The Issue of Survival Decisions

    ERIC Educational Resources Information Center

    Dowd, E. Thomas; Emener, William G.

    1978-01-01

    Rehabilitation counseling, as a profession, needs to look at future world possibilities, especially in light of overpopulation, and be aware that the need may arise for adjusting basic assumptions about human life--from the belief that every individual has a right to a meaningful life to the notion of selecting who shall live. (DTT)

  13. Challenges of Adopting Constructive Alignment in Action Learning Education

    ERIC Educational Resources Information Center

    Remneland Wikhamn, Björn

    2017-01-01

    This paper will critically examine how the two influential pedagogical approaches of action-based learning and constructive alignment relate to each other, and how they may differ in focus and basic assumptions. From the outset, they are based on similar underpinnings, with the student and the learning outcomes in the center. Drawing from…

  14. Curricular Learning Communities and Unprepared Students: How Faculty Can Provide a Foundation for Success

    ERIC Educational Resources Information Center

    Engstrom, Cathy McHugh

    2008-01-01

    The pedagogical assumptions and teaching practices of learning community models reflect exemplary conditions for learning, so using these models with unprepared students seems desirable and worthy of investigation. This chapter describes the key role of faculty in creating active, integrative learning experiences for students in basic skills…

  15. Education in Conflict and Crisis for National Security.

    ERIC Educational Resources Information Center

    McClelland, Charles A.

    A basic assumption is that the level of conflict within and between nations will escalate over the next 50 years. Trying to "muddle through" using the tools and techniques of organized violence may yield national suicide. Therefore, complex conflict resolution skills need to be developed and used by some part of society to quell disorder…

  16. Textbooks as a Possible Influence on Unscientific Ideas about Evolution

    ERIC Educational Resources Information Center

    Tshuma, Tholani; Sanders, Martie

    2015-01-01

    While school textbooks are assumed to be written for and used by students, it is widely acknowledged that they also serve a vital support function for teachers, particularly in times of curriculum change. A basic assumption is that biology textbooks are scientifically accurate. Furthermore, because of the negative impact of…

  17. A basic review on the inferior alveolar nerve block techniques.

    PubMed

    Khalil, Hesham

    2014-01-01

    The inferior alveolar nerve block is the most common injection technique used in dentistry, and many modifications of the conventional nerve block have recently been described in the literature. Selecting the best technique by the dentist or surgeon depends on many factors, including the success rate and the complications related to the selected technique. Dentists should be aware of the available current modifications of the inferior alveolar nerve block techniques in order to choose effectively among them. Some operators may encounter difficulty in identifying the anatomical landmarks that are useful in applying the inferior alveolar nerve block and rely instead on assumptions as to where the needle should be positioned. Such assumptions can lead to failure, and the failure rate of the inferior alveolar nerve block has been reported to be 20-25%, which is considered very high. This basic review presents the anatomical details of the inferior alveolar nerve together with a description of both the conventional and the modified blocking techniques, and provides an overview of the complications that may result from the application of this important technique.

  18. A basic review on the inferior alveolar nerve block techniques

    PubMed Central

    Khalil, Hesham

    2014-01-01

    The inferior alveolar nerve block is the most common injection technique used in dentistry, and many modifications of the conventional nerve block have recently been described in the literature. Selecting the best technique by the dentist or surgeon depends on many factors, including the success rate and the complications related to the selected technique. Dentists should be aware of the available current modifications of the inferior alveolar nerve block techniques in order to choose effectively among them. Some operators may encounter difficulty in identifying the anatomical landmarks that are useful in applying the inferior alveolar nerve block and rely instead on assumptions as to where the needle should be positioned. Such assumptions can lead to failure, and the failure rate of the inferior alveolar nerve block has been reported to be 20-25%, which is considered very high. This basic review presents the anatomical details of the inferior alveolar nerve together with a description of both the conventional and the modified blocking techniques, and provides an overview of the complications that may result from the application of this important technique. PMID:25886095

  19. A Markov chain model for reliability growth and decay

    NASA Technical Reports Server (NTRS)

    Siegrist, K.

    1982-01-01

    A mathematical model is developed to describe a complex system undergoing a sequence of trials in which there is interaction between the internal states of the system and the outcomes of the trials. For example, the model might describe a system undergoing testing that is redesigned after each failure. The basic assumptions for the model are that the state of the system after a trial depends probabilistically only on the state before the trial and on the outcome of the trial, and that the outcome of a trial depends probabilistically only on the state of the system before the trial. It is shown that under these basic assumptions, the successive states form a Markov chain and the successive states and outcomes jointly form a Markov chain. General results are obtained for the transition probabilities, steady-state distributions, etc. A special case studied in detail describes a system that has two possible states ('repaired' and 'unrepaired') undergoing trials that have three possible outcomes ('inherent failure', 'assignable-cause failure', and 'success'). For this model, the reliability function is computed explicitly and an optimal repair policy is obtained.
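    The two-state special case described above can be sketched numerically. The abstract gives no actual transition probabilities, so the matrix below is a hypothetical placeholder; the sketch only illustrates how a stationary distribution over the 'unrepaired'/'repaired' states of such a chain would be computed.

```python
# Illustrative sketch of the abstract's two-state Markov model: the system is
# 'unrepaired' (0) or 'repaired' (1), and the next state depends only on the
# current state (marginalizing over trial outcomes). All numbers are
# hypothetical, not taken from the paper.

def stationary(P, iters=10_000):
    """Stationary distribution of a 2x2 row-stochastic matrix by power iteration."""
    pi = [0.5, 0.5]
    for _ in range(iters):
        pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
              pi[0] * P[0][1] + pi[1] * P[1][1]]
    return pi

# Hypothetical state-to-state transition matrix: an unrepaired system is
# redesigned (moves to 'repaired') with probability 0.3 per trial; a repaired
# system regresses with probability 0.05.
P = [[0.70, 0.30],
     [0.05, 0.95]]

pi = stationary(P)  # long-run fraction of time in each state
```

    For these placeholder numbers the chain spends 6/7 of the time in the 'repaired' state, which is the kind of steady-state quantity the paper derives in general form.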

  20. Investigating the Assumptions of Uses and Gratifications Research

    ERIC Educational Resources Information Center

    Lometti, Guy E.; And Others

    1977-01-01

    Discusses a study designed to determine empirically the gratifications sought from communication channels and to test the assumption that individuals differentiate channels based on gratifications. (MH)

  1. Recognising the Effects of Costing Assumptions in Educational Business Simulation Games

    ERIC Educational Resources Information Center

    Eckardt, Gordon; Selen, Willem; Wynder, Monte

    2015-01-01

    Business simulations are a powerful way to provide experiential learning that is focussed, controlled, and concentrated. Inherent in any simulation, however, are numerous assumptions that determine feedback, and hence the lessons learnt. In this conceptual paper we describe some common cost assumptions that are implicit in simulation design and…

  2. Designing lymphocyte functional structure for optimal signal detection: voilà, T cells.

    PubMed

    Noest, A J

    2000-11-21

    One basic task of immune systems is to detect signals from unknown "intruders" amidst a noisy background of harmless signals. To clarify the functional importance of many observed lymphocyte properties, I ask: What properties would a cell have if one designed it according to the theory of optimal detection, with minimal regard for biological constraints? Sparse and reasonable assumptions about the statistics of available signals prove sufficient for deriving many features of the optimal functional structure, in an incremental and modular design. The use of one common formalism guarantees that all parts of the design collaborate to solve the detection task. Detection performance is computed at several stages of the design. Comparison between design variants reveals e.g. the importance of controlling the signal integration time. This predicts that an appropriate control mechanism should exist. Comparing the design to reality, I find a striking similarity with many features of T cells. For example, the formalism dictates clonal specificity, serial receptor triggering, (grades of) anergy, negative and positive selection, co-stimulation, high-zone tolerance, and clonal production of cytokines. Serious mismatches should be found if T cells were hindered by mechanistic constraints or vestiges of their (co-)evolutionary history, but I have not found clear examples. By contrast, fundamental mismatches abound when comparing the design to immune systems of e.g. invertebrates. The wide-ranging differences seem to hinge on the (in)ability to generate a large diversity of receptors. Copyright 2000 Academic Press.
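    The optimal-detection theory invoked above is, at its core, likelihood-ratio testing: decide "intruder present" when the accumulated evidence exceeds a threshold. A minimal sketch under assumed Gaussian signal statistics follows; the models, parameters, and thresholds are illustrative assumptions, not details from the paper.

```python
# Hedged sketch of a Neyman-Pearson style detector: compare the log-likelihood
# ratio of "signal present" vs "background only" to a threshold. Gaussian
# models with the means/variance below are illustrative assumptions.

import math

def log_likelihood_ratio(x, mu0=0.0, mu1=1.0, sigma=1.0):
    """log of p(x | signal) / p(x | background) for equal-variance Gaussians."""
    return ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)

def detect(samples, threshold=0.0):
    """Sum evidence over samples: longer integration improves discrimination,
    mirroring the abstract's point about controlling integration time."""
    return sum(log_likelihood_ratio(x) for x in samples) > threshold
```

    A usage example: `detect([1.2, 0.8, 1.1])` accumulates positive evidence and fires, while `detect([-0.2, 0.1, -0.4])` does not.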

  3. The application of the detection filter to aircraft control surface and actuator failure detection and isolation

    NASA Technical Reports Server (NTRS)

    Bonnice, W. F.; Wagner, E.; Motyka, P.; Hall, S. R.

    1985-01-01

    The performance of the detection filter in detecting and isolating aircraft control surface and actuator failures is evaluated. The basic detection filter theory assumption of no direct input-output coupling is violated in this application due to the use of acceleration measurements for detecting and isolating failures. With this coupling, residuals produced by control surface failures may only be constrained to a known plane rather than to a single direction. A detection filter design with such planar failure signatures is presented, with the design issues briefly addressed. In addition, a modification to constrain the residual to a single known direction even with direct input-output coupling is also presented. Both the detection filter and the modification are tested using a nonlinear aircraft simulation. While no thresholds were selected, both filters demonstrated an ability to detect control surface and actuator failures. Failure isolation may be a problem if there are several control surfaces which produce similar effects on the aircraft. In addition, the detection filter was sensitive to wind turbulence and modeling errors.

  4. Micrometeoroid and Orbital Debris (MMOD) Shield Ballistic Limit Analysis Program

    NASA Technical Reports Server (NTRS)

    Ryan, Shannon

    2013-01-01

    This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of penetration (PNP) of the spacecraft pressure hull. For calculating the risk, spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for installed shielding configurations are required. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, such tools are unsuitable for use in shield design and preliminary analysis studies. The software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of their validity range and guidelines for their application. Recommendations are based on preliminary reviews of fundamental assumptions, and accuracy in predicting experimental impact test results. The software is programmed in Visual Basic for Applications for installation as a simple add-in for Microsoft Excel. The user is directed to a graphical user interface (GUI) that requires user inputs and provides solutions directly in Microsoft Excel workbooks.

  5. Use of Climate Information for Decision-Making and Impacts Research: State of Our Understanding

    DTIC Science & Technology

    2016-03-01

    SUMMARY Much of human society and its infrastructure has been designed and built on a key assumption: that future climate conditions at any given...experienced in the past. This assumption affects infrastructure design and maintenance, emergency response management, and long-term investment and planning...our scientific understanding of the climate system in a manner that incorporates user needs into the design of scientific experiments, and that

  6. Consequences of Violated Equating Assumptions under the Equivalent Groups Design

    ERIC Educational Resources Information Center

    Lyren, Per-Erik; Hambleton, Ronald K.

    2011-01-01

    The equal ability distribution assumption associated with the equivalent groups equating design was investigated in the context of a selection test for admission to higher education. The purpose was to assess the consequences for the test-takers in terms of receiving improperly high or low scores compared to their peers, and to find strong…

  7. Flat Engineered Multichannel Reflectors

    NASA Astrophysics Data System (ADS)

    Asadchy, V. S.; Díaz-Rubio, A.; Tcvetkova, S. N.; Kwon, D.-H.; Elsakka, A.; Albooyeh, M.; Tretyakov, S. A.

    2017-07-01

    Recent advances in engineered gradient metasurfaces have enabled unprecedented opportunities for light manipulation using optically thin sheets, such as anomalous refraction, reflection, or focusing of an incident beam. Here, we introduce a concept of multichannel functional metasurfaces, which are able to control incoming and outgoing waves in a number of propagation directions simultaneously. In particular, we reveal a possibility to engineer multichannel reflectors. Under the assumption of reciprocity and energy conservation, we find that there exist three basic functionalities of such reflectors: specular, anomalous, and retroreflections. Multichannel response of a general flat reflector can be described by a combination of these functionalities. To demonstrate the potential of the introduced concept, we design and experimentally test three different multichannel reflectors: three- and five-channel retroreflectors and a three-channel power splitter. Furthermore, by extending the concept to reflectors supporting higher-order Floquet harmonics, we forecast the emergence of other multichannel flat devices, such as isolating mirrors, complex splitters, and multi-functional gratings.

  8. Probabilistic Design Storm Method for Improved Flood Estimation in Ungauged Catchments

    NASA Astrophysics Data System (ADS)

    Berk, Mario; Å pačková, Olga; Straub, Daniel

    2017-12-01

    The design storm approach with event-based rainfall-runoff models is a standard method for design flood estimation in ungauged catchments. The approach is conceptually simple and computationally inexpensive, but the underlying assumptions can lead to flawed design flood estimations. In particular, the implied average recurrence interval (ARI) neutrality between rainfall and runoff neglects uncertainty in other important parameters, leading to an underestimation of design floods. The selection of a single representative critical rainfall duration in the analysis leads to an additional underestimation of design floods. One way to overcome these nonconservative approximations is the use of a continuous rainfall-runoff model, which is associated with significant computational cost and requires rainfall input data that are often not readily available. As an alternative, we propose a novel Probabilistic Design Storm method that combines event-based flood modeling with basic probabilistic models and concepts from reliability analysis, in particular the First-Order Reliability Method (FORM). The proposed methodology overcomes the limitations of the standard design storm approach, while utilizing the same input information and models without excessive computational effort. Additionally, the Probabilistic Design Storm method allows deriving so-called design charts, which summarize representative design storm events (combinations of rainfall intensity and other relevant parameters) for floods with different return periods. These can be used to study the relationship between rainfall and runoff return periods. We demonstrate, investigate, and validate the method by means of an example catchment located in the Bavarian Pre-Alps, in combination with a simple hydrological model commonly used in practice.
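
    The First-Order Reliability Method (FORM) referenced above can be sketched in a few lines. The limit state, its coefficients, and the starting point below are hypothetical illustrations, not taken from the paper; this is the standard Hasofer-Lind / Rackwitz-Fiessler iteration, shown on a linear limit state where the reliability index has a closed form that the iteration should recover.

```python
import math

# Minimal FORM sketch via the Hasofer-Lind / Rackwitz-Fiessler (HL-RF)
# iteration. The limit state and coefficients are hypothetical, chosen linear
# so the reliability index has the closed form beta = b0 / sqrt(a1^2 + a2^2).
b0, a1, a2 = 3.0, 1.0, 2.0

def g(u):
    """Limit-state function in standard normal space; failure when g(u) < 0."""
    return b0 + a1 * u[0] + a2 * u[1]

def grad_g(u):
    return (a1, a2)

def form_beta(g, grad_g, u0=(0.0, 0.0), tol=1e-10, max_iter=100):
    """HL-RF iteration: converges to the design point (closest failure point)."""
    u = list(u0)
    for _ in range(max_iter):
        dg = grad_g(u)
        norm2 = sum(d * d for d in dg)
        # Project onto the linearized limit-state surface g(u) = 0.
        scale = (sum(d * ui for d, ui in zip(dg, u)) - g(u)) / norm2
        u_new = [scale * d for d in dg]
        if max(abs(x - y) for x, y in zip(u_new, u)) < tol:
            u = u_new
            break
        u = u_new
    return math.sqrt(sum(ui * ui for ui in u))

beta = form_beta(g, grad_g)
pf = 0.5 * math.erfc(beta / math.sqrt(2.0))  # failure probability Phi(-beta)
```

    For a nonlinear limit state the same iteration applies with a state-dependent gradient; the probabilistic design storm method pairs such a reliability computation with an event-based rainfall-runoff model.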

  9. The Basilar Artery International Cooperation Study (BASICS): study protocol for a randomised controlled trial

    PubMed Central

    2013-01-01

    Background Despite recent advances in acute stroke treatment, basilar artery occlusion (BAO) is associated with a death or disability rate of close to 70%. Randomised trials have shown the safety and efficacy of intravenous thrombolysis (IVT) given within 4.5 h and have shown promising results of intra-arterial thrombolysis given within 6 h of symptom onset of acute ischaemic stroke, but these results do not directly apply to patients with an acute BAO because few, if any, of these patients were included in randomised acute stroke trials. Recently, the results of the Basilar Artery International Cooperation Study (BASICS), a prospective registry of patients with acute symptomatic BAO, challenged the often-held assumption that intra-arterial treatment (IAT) is superior to IVT. Our observations in the BASICS registry underscore that we continue to lack a proven treatment modality for patients with an acute BAO and that current clinical practice varies widely. Design BASICS is a randomised controlled, multicentre, open label, phase III intervention trial with blinded outcome assessment, investigating the efficacy and safety of additional IAT after IVT in patients with BAO. The trial aims to include 750 patients, aged 18 to 85 years, with CT angiography or MR angiography confirmed BAO treated with IVT. Patients will be randomised between additional IAT followed by optimal medical care versus optimal medical care alone. IVT has to be initiated within 4.5 h from estimated time of BAO and IAT within 6 h. The primary outcome parameter will be favourable outcome at day 90 defined as a modified Rankin Scale score of 0–3. Discussion The BASICS registry was observational and has all the limitations of a non-randomised study. As the IAT approach becomes increasingly available and frequently utilised, an adequately powered randomised controlled phase III trial investigating the added value of this therapy in patients with an acute symptomatic BAO is needed (clinicaltrials.gov: NCT01717755). PMID:23835026

  10. An Investigation of the Equipercentile Assumption and the One-Group Pre/Post Design.

    ERIC Educational Resources Information Center

    Powell, George D.; Raffeld, Paul C.

    The equipercentile assumption states that students in traditional classrooms who receive no other instructional assistance, will maintain their relative rank order over time. To test this assumption, fall to fall test results on the SRA Achievement Tests were obtained for grades 2-3, and 6-7. Total reading and total mathematics growth scale values…

  11. Statistical foundations of liquid-crystal theory

    PubMed Central

    Seguin, Brian; Fried, Eliot

    2013-01-01

    We develop a mechanical theory for systems of rod-like particles. Central to our approach is the assumption that the external power expenditure for any subsystem of rods is independent of the underlying frame of reference. This assumption is used to derive the basic balance laws for forces and torques. By considering inertial forces on par with other forces, these laws hold relative to any frame of reference, inertial or noninertial. Finally, we introduce a simple set of constitutive relations to govern the interactions between rods and find restrictions necessary and sufficient for these laws to be consistent with thermodynamics. Our framework provides a foundation for a statistical mechanical derivation of the macroscopic balance laws governing liquid crystals. PMID:23772091

  12. Advanced space power requirements and techniques. Task 1: Mission projections and requirements. Volume 3: Appendices. [cost estimates and computer programs

    NASA Technical Reports Server (NTRS)

    Wolfe, M. G.

    1978-01-01

    Contents: (1) general study guidelines and assumptions; (2) launch vehicle performance and cost assumptions; (3) satellite programs 1959 to 1979; (4) initiative mission and design characteristics; (5) satellite listing; (6) spacecraft design model; (7) spacecraft cost model; (8) mission cost model; and (9) nominal and optimistic budget program cost summaries.

  13. Power and Method: Political Activism and Educational Research. Critical Social Thought Series.

    ERIC Educational Resources Information Center

    Gitlin, Andrew, Ed.

    This book scrutinizes some basic assumptions about educational research with the aim that such research may act more powerfully on those persistent and important problems of our schools surrounding issues of race, class, and gender. In particular, the 13 essays in this book examine how power is infused in research by addressing such questions as…

  14. Philosophical Ethnography: Or, How Philosophy and Ethnography Can Live Together in the World of Educational Research

    ERIC Educational Resources Information Center

    Feinberg, Walter

    2006-01-01

    This essay explores a disciplinary hybrid, called here, philosophical ethnography. Philosophical ethnography is a philosophy of the everyday and ethnography in the context of intercultural discourse about coordinating meaning, evaluation, norms and action. Its basic assumption is that in the affairs of human beings truth, justice and beauty are…

  15. The Future of Family Business Education in UK Business Schools

    ERIC Educational Resources Information Center

    Collins, Lorna; Seaman, Claire; Graham, Stuart; Stepek, Martin

    2013-01-01

    Purpose: This practitioner paper aims to question basic assumptions about management education and to argue that a new paradigm is needed for UK business schools which embraces an oft neglected, yet economically vital, stakeholder group, namely family businesses. It seeks to pose the question of why we have forgotten to teach about family business…

  16. Social Maladjustment and Students with Behavioral and Emotional Disorders: Revisiting Basic Assumptions and Assessment Issues

    ERIC Educational Resources Information Center

    Olympia, Daniel; Farley, Megan; Christiansen, Elizabeth; Pettersson, Hollie; Jenson, William; Clark, Elaine

    2004-01-01

    While much of the current focus in special education remains on reauthorization of the Individuals with Disabilities Act of 1997, disparities in the identification of children with serious emotional disorders continue to plague special educators and school psychologists. Several years after the issue of social maladjustment and its relationship to…

  17. Locations of Racism in Education: A Speech Act Analysis of a Policy Chain

    ERIC Educational Resources Information Center

    Arneback, Emma; Quennerstedt, Ann

    2016-01-01

    This article explores how racism is located in an educational policy chain and identifies how its interpretation changes throughout the chain. A basic assumption is that the policy formation process can be seen as a chain in which international, national and local policies are "links"--separate entities yet joined. With Sweden as the…

  18. Pedagogical and Social Climate in School Questionnaire: Factorial Validity and Reliability of the Teacher Version

    ERIC Educational Resources Information Center

    Dimitrova, Radosveta; Ferrer-Wreder, Laura; Galanti, Maria Rosaria

    2016-01-01

    This study evaluated the factorial structure of the Pedagogical and Social Climate in School (PESOC) questionnaire among 307 teachers in Bulgaria. The teacher edition of PESOC consists of 11 scales (i.e., Expectations for Students, Unity Among Teachers, Approach to Students, Basic Assumptions About Students' Ability to Learn, School-Home…

  19. The Education System in Greece. [Revised].

    ERIC Educational Resources Information Center

    EURYDICE Central Unit, Brussels (Belgium).

    The education policy of the Greek government rests on the basic assumption that effective education is a social goal and that every citizen has a right to an education. A brief description of the Greek education system and of the adjustments made to give practical meaning to the provisions on education in the Constitution is presented in the…

  20. Experiences in Rural Mental Health II: Organizing a Low Budget Program.

    ERIC Educational Resources Information Center

    Hollister, William G.; And Others

    Based on a North Carolina feasibility study (1967-73) which focused on development of a pattern for providing comprehensive mental health services to rural people, this second program guide deals with organization of a low-income program budget. Presenting the basic assumptions utilized in the development of a low-budget program in Franklin and…

  1. Student Achievement in Basic College Mathematics: Its Relationship to Learning Style and Learning Method

    ERIC Educational Resources Information Center

    Gunthorpe, Sydney

    2006-01-01

    From the assumption that students are most successful when the learning method is matched to their learning style, it follows that developing courses that correlate learning method with learning style would be more successful for students. Albuquerque Technical Vocational Institute (TVI) in New Mexico has attempted to provide students with more…

  2. The Politics and Coverage of Terror: From Media Images to Public Consciousness.

    ERIC Educational Resources Information Center

    Wittebols, James H.

    This paper presents a typology of terrorism which is grounded in how media differentially cover each type. The typology challenges some of the basic assumptions, such as that the media "allow" themselves to be exploited by terrorists and "encourage" terrorism, and the conventional wisdom about the net effects of the media's…

  3. The Past as Prologue: Examining the Consequences of Business as Usual. Center Paper 01-93.

    ERIC Educational Resources Information Center

    Jones, Dennis P.; And Others

    This study examined the ability of California to meet increased demand for postsecondary education without significantly altering the basic historical assumptions and policies that have governed relations between the state and its institutions of higher learning. Results of a series of analyses that estimated projected enrollments and costs under…

  4. The Spouse and Familial Incest: An Adlerian Perspective.

    ERIC Educational Resources Information Center

    Quinn, Kathleen L.

    A major component of Adlerian psychology concerns the belief in responsibility to self and others. In both incest perpetrator and spouse the basic underlying assumption of responsibility to self and others is often not present. Activities and behaviors occur in a social context and as such need to be regarded within a social context that may serve…

  5. Initial Comparison of Single Cylinder Stirling Engine Computer Model Predictions with Test Results

    NASA Technical Reports Server (NTRS)

    Tew, R. C., Jr.; Thieme, L. G.; Miao, D.

    1979-01-01

    A Stirling engine digital computer model developed at NASA Lewis Research Center was configured to predict the performance of the GPU-3 single-cylinder rhombic drive engine. Revisions to the basic equations and assumptions are discussed. Model predictions with the early results of the Lewis Research Center GPU-3 tests are compared.

  6. Effects of Problem Scope and Creativity Instructions on Idea Generation and Selection

    ERIC Educational Resources Information Center

    Rietzschel, Eric F.; Nijstad, Bernard A.; Stroebe, Wolfgang

    2014-01-01

    The basic assumption of brainstorming is that increased quantity of ideas results in increased generation as well as selection of creative ideas. Although previous research suggests that idea quantity correlates strongly with the number of good ideas generated, quantity has been found to be unrelated to the quality of selected ideas. This article…

  7. Methods of Evaluation To Determine the Preservation Needs in Libraries and Archives: A RAMP Study with Guidelines.

    ERIC Educational Resources Information Center

    Cunha, George M.

    This Records and Archives Management Programme (RAMP) study is intended to assist in the development of basic training programs and courses in document preservation and restoration, and to promote harmonization of such training both within the archival profession and within the broader information field. Based on the assumption that conservation…

  8. The Role of the Social Studies in Public Education.

    ERIC Educational Resources Information Center

    Byrne, T. C.

    This paper was prepared for a social studies curriculum conference in Alberta in June, 1967. It provides a point of view on curriculum building which could be useful in establishing a national service in this field. The basic assumption is that the social studies should in some measure change the behavior of the students (a sharp departure from…

  9. Twisting of thin walled columns perfectly restrained at one end

    NASA Technical Reports Server (NTRS)

    Lazzarino, Lucio

    1938-01-01

    Proceeding from the basic assumptions of the Batho-Bredt theory on twisting failure of thin-walled columns, the discrepancies most frequently encountered are analyzed. A generalized approximate method is suggested for the determination of the disturbances in the stress condition of the column, induced by the constrained warping in one of the end sections.

  10. Adolescent Literacy in Europe--An Urgent Call for Action

    ERIC Educational Resources Information Center

    Sulkunen, Sari

    2013-01-01

    This article focuses on the literacy of the adolescents who, in most European countries, are about to leave or have recently left basic education with the assumption that they have the command of functional literacy as required in and for further studies, citizenship, work life and a fulfilling life as individuals. First, the overall performance…

  11. Is the European (Active) Citizenship Ideal Fostering Inclusion within the Union? A Critical Review

    ERIC Educational Resources Information Center

    Milana, Marcella

    2008-01-01

    This article reviews: (1) the establishment and functioning of EU citizenship; (2) the resulting perception of education for European active citizenship; and (3) the question of its adequacy for enhancing democratic values and practices within the Union. Key policy documents produced by the EU help to unfold the basic assumptions on which…

  12. The Importance of Woody Twig Ends to Deer in the Southeast

    Treesearch

    Charles T. Cushwa; Robert L. Downing; Richard F. Harlow; David F. Urbston

    1970-01-01

    One of the basic assumptions underlying research on wildlife habitat in the five Atlantic states of the Southeast is that white-tailed deer (Odocoileus virginianus) rely heavily on the ends of woody twigs during the winter. Considerable research has been undertaken to determine methods for increasing and measuring the availability of woody twigs to...

  13. Going off the Grid: Re-Examining Technology in the Basic Writing Classroom

    ERIC Educational Resources Information Center

    Clay-Buck, Holly; Tuberville, Brenda

    2015-01-01

    The notion that today's students are constantly exposed to information technology has become so pervasive that it seems the academic conversation assumes students are "tech savvy." The proliferation of apps and smart phones aimed at the traditional college-aged population feeds into this assumption, aided in no small part by a growing…

  14. Network model and short circuit program for the Kennedy Space Center electric power distribution system

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Assumptions made and techniques used in modeling the power network to the 480 volt level are discussed. Basic computational techniques used in the short circuit program are described along with a flow diagram of the program and operational procedures. Procedures for incorporating network changes are included in this user's manual.

  15. Challenging Freedom: Neoliberalism and the Erosion of Democratic Education

    ERIC Educational Resources Information Center

    Karaba, Robert

    2016-01-01

    Goodlad, et al. (2002) rightly point out that a culture can either resist or support change. Schein's (2010) model of culture indicates observable behaviors of a culture can be explained by exposing underlying shared values and basic assumptions that give meaning to the performance. Yet culture is many-faceted and complex. So Schein advised a…

  16. Patterns and Policies: The Changing Demographics of Foreign Language Instruction. Issues in Language Program Direction: A Series of Annual Volumes.

    ERIC Educational Resources Information Center

    Liskin-Gasparro, Judith E., Ed.

    This collection of papers is divided into three parts. Part 1, "Changing Patterns: Curricular Implications," includes "Basic Assumptions Revisited: Today's French and Spanish Students at a Large Metropolitan University" (Gail Guntermann, Suzanne Hendrickson, and Carmen de Urioste) and "Le Francais et Mort, Vive le…

  17. The Space-Time Conservation Element and Solution Element Method: A New High-Resolution and Genuinely Multidimensional Paradigm for Solving Conservation Laws. 1; The Two Dimensional Time Marching Schemes

    NASA Technical Reports Server (NTRS)

    Chang, Sin-Chung; Wang, Xiao-Yen; Chow, Chuen-Yen

    1998-01-01

    A new high-resolution and genuinely multidimensional numerical method for solving conservation laws is being developed. It was designed to avoid the limitations of the traditional methods, and was built from the ground up with extensive physics considerations. Nevertheless, its foundation is mathematically simple enough that one can build from it a coherent, robust, efficient and accurate numerical framework. Two basic beliefs that set the new method apart from the established methods are at the core of its development. The first belief is that, in order to capture physics more efficiently and realistically, the modeling focus should be placed on the original integral form of the physical conservation laws, rather than the differential form. The latter form follows from the integral form under the additional assumption that the physical solution is smooth, an assumption that is difficult to realize numerically in a region of rapid change, such as a boundary layer or a shock. The second belief is that, with proper modeling of the integral and differential forms themselves, the resulting numerical solution should automatically be consistent with the properties derived from the integral and differential forms, e.g., the jump conditions across a shock and the properties of characteristics. Therefore a much simpler and more robust method can be developed by not using the above derived properties explicitly.
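
    The point about working from the integral form can be illustrated with a generic finite-volume scheme. The sketch below is emphatically not the CE/SE scheme itself; it is a minimal first-order upwind discretization of linear advection, showing the property the abstract emphasizes: because each cell update is a flux balance, the flux leaving one cell enters the next, so the total conserved quantity is preserved to round-off.

```python
import math

# Illustrative first-order finite-volume upwind scheme for u_t + a*u_x = 0 on a
# periodic domain. NOT the CE/SE scheme; a minimal demonstration that
# discretizing the integral (flux) form conserves the total quantity exactly.
n = 200
a = 1.0
dx = 1.0 / n
dt = 0.4 * dx / a          # CFL number 0.4 < 1 for stability

u = [math.sin(2.0 * math.pi * i * dx) for i in range(n)]
total0 = sum(u) * dx       # total conserved quantity at t = 0

for _ in range(500):
    # Flux balance per cell: the face fluxes telescope when summed over all
    # cells, so the discrete total is conserved (Python's u[-1] gives the
    # periodic left neighbour of cell 0).
    u = [u[i] - (dt / dx) * (a * u[i] - a * u[i - 1]) for i in range(n)]

total = sum(u) * dx
```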

  18. Dental Procedures and the Risk of Infective Endocarditis

    PubMed Central

    Chen, Pei-Chun; Tung, Ying-Chang; Wu, Patricia W.; Wu, Lung-Sheng; Lin, Yu-Sheng; Chang, Chee-Jen; Kung, Suefang; Chu, Pao-Hsien

    2015-01-01

    Abstract Infective endocarditis (IE) is an uncommon but potentially devastating disease. Recently published data have revealed a significant increase in the incidence of IE following the restriction on indications for antibiotic prophylaxis as recommended by the revised guidelines. This study aims to reexamine the basic assumption behind the rationale of prophylaxis that dental procedures increase the risk of IE. Using the Longitudinal Health Insurance Database of Taiwan, we retrospectively analyzed a total of 739 patients hospitalized for IE between 1999 and 2012. A case-crossover design was conducted to compare the odds of exposure to dental procedures within 3 months preceding hospitalization with that during matched control periods when no IE developed. In the unadjusted model, the odds ratio (OR) was 0.93 for tooth extraction (95% confidence interval [CI] 0.54–1.59), 1.64 for surgery (95% CI 0.61–4.42), 0.92 for dental scaling (95% CI 0.59–1.42), 1.69 for periodontal treatment (95% CI 0.88–3.21), and 1.29 for endodontic treatment (95% CI 0.72–2.31). The association between dental procedures and the risk of IE remained insignificant after adjustment for antibiotic use, indicating that dental procedures did not increase the risk of IE. Therefore, this result may argue against the conventional assumption on which the recommended prophylaxis for IE is based. PMID:26512586
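
    For readers unfamiliar with the reported statistics, the odds ratios above come from 2×2 exposure tables. A minimal sketch follows, with hypothetical counts (the study's actual tables are not reproduced in the abstract) and a Woolf log-normal confidence interval:

```python
import math

# Case-crossover odds ratio sketch with a Woolf (log-normal) 95% CI. The 2x2
# counts below are hypothetical; the abstract reports only the resulting ORs.
exposed_hazard, unexposed_hazard = 30, 709    # exposure in the hazard period
exposed_control, unexposed_control = 32, 707  # exposure in the control period

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio (a*d)/(b*c) with Woolf confidence limits on the log scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(exposed_hazard, unexposed_hazard,
                            exposed_control, unexposed_control)
```

    A confidence interval that straddles 1 (as in this hypothetical example and in the study's reported estimates) is what "the association remained insignificant" means operationally.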

  19. Non-stationary hydrologic frequency analysis using B-spline quantile regression

    NASA Astrophysics Data System (ADS)

    Nasri, B.; Bouezmarni, T.; St-Hilaire, A.; Ouarda, T. B. M. J.

    2017-11-01

    Hydrologic frequency analysis is commonly used by engineers and hydrologists to provide basic information for the planning, design and management of hydraulic and water resources systems under the assumption of stationarity. However, with increasing evidence of climate change, the assumption of stationarity, which is a prerequisite for traditional frequency analysis, may no longer hold, and hence the results of conventional analysis become questionable. In this study, we consider a framework for frequency analysis of extremes based on B-spline quantile regression, which allows data to be modeled in the presence of non-stationarity and/or dependence on covariates with linear and non-linear dependence. A Markov Chain Monte Carlo (MCMC) algorithm was used to estimate quantiles and their posterior distributions. A coefficient of determination and the Bayesian information criterion (BIC) for quantile regression are used to select the best model, i.e., for each quantile we choose the degree and number of knots of the adequate B-spline quantile regression model. The method is applied to annual maximum and minimum streamflow records in Ontario, Canada. Climate indices are considered to describe the non-stationarity in the variable of interest and to estimate the quantiles in this case. The results show large differences between the non-stationary quantiles and their stationary equivalents for annual maximum and minimum discharges with high annual non-exceedance probabilities.
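
    The quantile-regression machinery referenced above rests on the check ("pinball") loss. A minimal sketch with assumed synthetic data (not the Ontario records): the constant that minimizes the average pinball loss at level τ is the empirical τ-quantile, and B-spline quantile regression generalizes this to quantiles that depend smoothly on covariates.

```python
import numpy as np

# Sketch of the check ("pinball") loss underlying quantile regression.
# The data here are synthetic (assumed), not the Ontario streamflow records.
def pinball_loss(y, q, tau):
    """Mean check loss: tau*(y-q) where y >= q, (tau-1)*(y-q) otherwise."""
    r = y - q
    return float(np.mean(np.where(r >= 0, tau * r, (tau - 1) * r)))

rng = np.random.default_rng(0)
y = rng.normal(size=1000)
tau = 0.9

# The constant minimizing the average pinball loss is the empirical
# tau-quantile; here we locate it by a simple grid search.
grid = np.linspace(y.min(), y.max(), 2001)
losses = [pinball_loss(y, q, tau) for q in grid]
q_star = float(grid[int(np.argmin(losses))])
```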

  20. Understanding The Individual Impacts Of Human Interventions And Climate Change On Hydrologic Variables In India

    NASA Astrophysics Data System (ADS)

    Sharma, T.; Chhabra, S., Jr.; Karmakar, S.; Ghosh, S.

    2015-12-01

    We have quantified the historical climate change and Land Use Land Cover (LULC) change impacts on the hydrologic variables of the Indian subcontinent by using the Variable Infiltration Capacity (VIC) mesoscale model at 0.5° spatial resolution and daily temporal resolution. The results indicate that climate change in India has a predominant effect on the basic water balance components such as water yield, evapotranspiration and soil moisture. This analysis assumes a naturalised hydrologic cycle, i.e., the impacts of human interventions such as the construction of control structures (primarily dams, diversions and reservoirs) and water withdrawals are not taken into account. The assumption is unrealistic since there are numerous anthropogenic disturbances which result in large changes in vegetation composition and distribution patterns. These activities can directly or indirectly influence the dynamics of the water cycle, subsequently affecting hydrologic processes such as plant transpiration, infiltration, evaporation, runoff and sublimation. Here, we have quantified the human interventions by using the reservoir and irrigation module of the VIC model, which incorporates the irrigation schemes, reservoir characteristics and water withdrawals. The impact of human interventions on hydrologic variables in many grid cells is found to be more pronounced than that of climate change and might be detrimental to water resources at the regional level. This spatial pattern of impacts will help water managers and planners to design and site hydrologic structures for sustainable water resources management.

  1. An entropic framework for modeling economies

    NASA Astrophysics Data System (ADS)

    Caticha, Ariel; Golan, Amos

    2014-08-01

    We develop an information-theoretic framework for economic modeling. This framework is based on principles of entropic inference that are designed for reasoning on the basis of incomplete information. We take the point of view of an external observer who has access to limited information about broad macroscopic economic features. We view this framework as complementary to more traditional methods. The economy is modeled as a collection of agents about whom we make no assumptions of rationality (in the sense of maximizing utility or profit). States of statistical equilibrium are introduced as those macrostates that maximize entropy subject to the relevant information codified into constraints. The basic assumption is that this information refers to supply and demand and is expressed in the form of the expected values of certain quantities (such as inputs, resources, goods, production functions, utility functions and budgets). The notion of economic entropy is introduced. It provides a measure of the uniformity of the distribution of goods and resources. It captures both the welfare state of the economy and the characteristics of the market (say, monopolistic, concentrated or competitive). Prices, which turn out to be the Lagrange multipliers, are endogenously generated by the economy. Further studies include the equilibrium between two economies and the conditions for stability. As an example, the case of the nonlinear economy that arises from linear production and utility functions is treated in some detail.
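
    The entropy-maximization step described above can be sketched concretely. The "goods" levels and the target expectation below are assumed for illustration only: the distribution that maximizes entropy subject to a fixed expected value is the Gibbs form p_i ∝ exp(-λ x_i), and the Lagrange multiplier λ, playing the role of the endogenously generated price, can be found by bisection.

```python
import math

# Sketch of entropy maximization subject to an expected-value constraint.
# The levels xs and the target mean are assumed for illustration; the solution
# is the Gibbs distribution p_i ∝ exp(-lam * x_i), with the Lagrange
# multiplier lam acting as the endogenously generated "price".
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
target_mean = 1.2

def gibbs(lam):
    w = [math.exp(-lam * x) for x in xs]
    z = sum(w)                      # partition function
    return [wi / z for wi in w]

def mean(p):
    return sum(pi * xi for pi, xi in zip(p, xs))

# mean(gibbs(lam)) decreases monotonically in lam, so bisection finds lam.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean(gibbs(mid)) > target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)
p = gibbs(lam)
entropy = -sum(pi * math.log(pi) for pi in p)
```

    The entropy of the constrained distribution is strictly below log(5), the uniform maximum, reflecting the information carried by the supply-demand constraint.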

  2. An Astrometric Facility For Planetary Detection On The Space Station

    NASA Astrophysics Data System (ADS)

    Nishioka, Kenji; Scargle, Jeffrey D.; Givens, John J.

    1987-09-01

    An Astrometric Telescope Facility (ATF) for planetary detection is being studied as a potential Space Station initial operating capability payload. The primary science objective of this mission is the detection and study of planetary systems around other stars. In addition, the facility will be capable of other astrometric measurements, such as stellar motions of other galaxies and highly precise direct measurement of stellar distances within the Milky Way Galaxy. This paper summarizes the results of a recently completed ATF preliminary systems definition study. Results of this study indicate that the preliminary concept for the facility is fully capable of meeting the science objectives without the development of any new technologies. This preliminary systems study started with the following basic assumptions: 1) the facility will be placed in orbit by a single Shuttle launch; 2) the Space Station will provide a coarse pointing system, electrical power, communications, assembly and checkout, maintenance and refurbishment services; and 3) the facility will be operated from a ground facility. With these assumptions and the science performance requirements, a preliminary "strawman" facility was designed. The strawman facility design, with a prime-focus telescope of 1.25-m aperture, f-ratio of 13 and a single prime-focus instrument, was chosen to minimize random and systematic errors. Total facility mass is 5100 kg and overall dimensions are 1.85-m diam by 21.5-m long. A simple, straightforward operations approach has been developed for ATF. Real-time facility control is not normally required, but a near real-time ground monitoring capability for the facility and science data stream is maintained on a full-time basis. Facility observational sequences are normally loaded once a week. In addition, the preliminary system is designed to be fail-safe and single-fault tolerant. 
Routine interactions by the Space Station crew with ATF will not be necessary, but onboard controls are provided for crew override as required for emergencies and maintenance.

  3. Comparisons of luminaires: Efficacies and system design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Albright, L.D.; Both, A.J.

    1994-12-31

    After reviewing basic information, three design examples have been presented to demonstrate a process of supplemental lighting design. The sequences of each example suggest that careful thought and analysis are required to obtain supplemental lighting designs that provide both high levels of PAR and suitable uniformity. The end results should suggest how an analysis can evolve to achieve desired results, and the types of tools and adjustments required. It appears possible to design research greenhouses and plant growth chambers to achieve ±10% PAR uniformity using HPS luminaires. Further, HPS luminaires (and, by extension, NEHD, etc.) are required to achieve high PAR levels and have the decided advantage of providing the possibility of aiming, which reduces the region of the "edge effect". Further, for designing plant lighting systems, a modification of the standard IES luminaire data file structure is potentially useful. Luminaire installation is an important factor in obtaining PAR uniformity. Spacing and mounting height are critically important. Additionally, the mounting angle of each luminaire must be carefully adjusted to conform with design assumptions. This is true for both plant growth chambers and greenhouses. Surface reflectances are particularly important when designing for small lighted regions such as plant growth chambers and research greenhouses. It is not obvious, just from looking at a surface, what its reflectance is. It is suggested that an effort be mounted to develop valid surface reflectance data to be used by designers. The importance of the surfaces (particularly the walls) in achieving PAR uniformity suggests the importance of periodic cleaning/maintenance to retain initial reflectance values.

  4. Designing an Academic Library as a Place and a Space: How a Robotic Storage System Will Create a 21st Century Library Design

    ERIC Educational Resources Information Center

    Bostick, Sharon L.; Irwin, Bryan

    2012-01-01

    Renovating, expanding or building new libraries today is a challenge on several levels. Libraries in general are faced with image issues, such as the assumption that a library building exists only to house print material followed by the equally erroneous assumption that everything is now available online, thus rendering a physical building…

  5. Supply-demand balance in outward-directed networks and Kleiber's law

    PubMed Central

    Painter, Page R

    2005-01-01

    Background: Recent theories have attempted to derive the value of the exponent α in the allometric formula for scaling of basal metabolic rate from the properties of distribution network models for arteries and capillaries. It has recently been stated that a basic theorem relating the sum of nutrient currents to the specific nutrient uptake rate, together with a relationship claimed to be required in order to match nutrient supply to nutrient demand in 3-dimensional outward-directed networks, leads to Kleiber's law (b = 3/4). Methods: The validity of the supply-demand matching principle and the assumptions required to prove the basic theorem are assessed. The supply-demand principle is evaluated by examining the supply term and the demand term in outward-directed lattice models of nutrient and water distribution systems and by applying the principle to fractal-like models of mammalian arterial systems. Results: Application of the supply-demand principle to bifurcating fractal-like networks that are outward-directed does not predict 3/4-power scaling, and evaluation of water distribution system models shows that the matching principle does not match supply to demand in such systems. Furthermore, proof of the basic theorem is shown to require that the covariance of nutrient uptake and current path length is 0, an assumption unlikely to be true in mammalian arterial systems. Conclusion: The supply-demand matching principle does not lead to a satisfactory explanation for the approximately 3/4-power scaling of mammalian basal metabolic rate. PMID:16283939

  6. Supply-demand balance in outward-directed networks and Kleiber's law.

    PubMed

    Painter, Page R

    2005-11-10

    Recent theories have attempted to derive the value of the exponent alpha in the allometric formula for scaling of basal metabolic rate from the properties of distribution network models for arteries and capillaries. It has recently been stated that a basic theorem relating the sum of nutrient currents to the specific nutrient uptake rate, together with a relationship claimed to be required in order to match nutrient supply to nutrient demand in 3-dimensional outward-directed networks, leads to Kleiber's law (b = 3/4). The validity of the supply-demand matching principle and the assumptions required to prove the basic theorem are assessed. The supply-demand principle is evaluated by examining the supply term and the demand term in outward-directed lattice models of nutrient and water distribution systems and by applying the principle to fractal-like models of mammalian arterial systems. Application of the supply-demand principle to bifurcating fractal-like networks that are outward-directed does not predict 3/4-power scaling, and evaluation of water distribution system models shows that the matching principle does not match supply to demand in such systems. Furthermore, proof of the basic theorem is shown to require that the covariance of nutrient uptake and current path length is 0, an assumption unlikely to be true in mammalian arterial systems. The supply-demand matching principle does not lead to a satisfactory explanation for the approximately 3/4-power scaling of mammalian basal metabolic rate.
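
    The allometric formula at issue in the two records above, B = B0 * M**b with Kleiber's exponent b = 3/4, can be illustrated with a minimal sketch: recovering b by least squares on log-transformed (mass, metabolic-rate) pairs. The masses, the prefactor 3.0, and the noiseless data are assumptions of this illustration only; nothing here comes from the paper.

```python
import math

def fit_allometric_exponent(masses, rates):
    """Slope of log(rate) vs log(mass), i.e. the exponent b in B = B0*M**b."""
    xs = [math.log(m) for m in masses]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

masses = [0.02, 0.2, 2.0, 20.0, 200.0]        # body masses (kg), synthetic
rates = [3.0 * m ** 0.75 for m in masses]     # exact 3/4-power scaling
b = fit_allometric_exponent(masses, rates)
print(f"estimated exponent b = {b:.3f}")      # -> 0.750 on noiseless data
```

    The debate summarized in the abstract is over whether network models force b toward 3/4; the fitting step itself, as above, is routine.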

  7. Optimal post-experiment estimation of poorly modeled dynamic systems

    NASA Technical Reports Server (NTRS)

    Mook, D. Joseph

    1988-01-01

    Recently, a novel strategy for post-experiment state estimation of discretely measured dynamic systems has been developed. The method accounts for errors in the system dynamic model equations in a more general and rigorous manner than do filter-smoother algorithms. The dynamic model error terms do not require the usual process noise assumptions of zero-mean, symmetrically distributed random disturbances. Instead, the model error terms require no prior assumptions other than piecewise continuity. The resulting state estimates are more accurate than those of filters for applications in which the dynamic model error clearly violates the typical process noise assumptions and the available measurements are sparse and/or noisy. Estimates of the dynamic model error, in addition to the states, are obtained as part of the solution of a two-point boundary value problem and may be exploited for numerous purposes. In this paper, the basic technique is explained, and several example applications are given. Included among the examples are both state estimation and exploitation of the model error estimates.

  8. Comparisons of Multilevel Modeling and Structural Equation Modeling Approaches to Actor-Partner Interdependence Model.

    PubMed

    Hong, Sehee; Kim, Soyoung

    2018-01-01

    There are basically two modeling approaches applicable to analyzing an actor-partner interdependence model: multilevel modeling (the hierarchical linear model) and structural equation modeling. This article explains how to use these two models in analyzing an actor-partner interdependence model and how the two approaches work differently. As an empirical example, marital conflict data were used to analyze an actor-partner interdependence model. Multilevel modeling and structural equation modeling produced virtually identical estimates for a basic model. However, the structural equation modeling approach allowed more realistic assumptions on measurement errors and factor loadings, rendering better model fit indices.

  9. Strategies for evaluating the assumptions of the regression discontinuity design: a case study using a human papillomavirus vaccination programme.

    PubMed

    Smith, Leah M; Lévesque, Linda E; Kaufman, Jay S; Strumpf, Erin C

    2017-06-01

    The regression discontinuity design (RDD) is a quasi-experimental approach used to avoid confounding bias in the assessment of new policies and interventions. It is applied specifically in situations where individuals are assigned to a policy/intervention based on whether they are above or below a pre-specified cut-off on a continuously measured variable, such as birth date, income or weight. The strength of the design is that, provided individuals do not manipulate the value of this variable, assignment to the policy/intervention is considered as good as random for individuals close to the cut-off. Despite its popularity in fields like economics, the RDD remains relatively unknown in epidemiology where its application could be tremendously useful. In this paper, we provide a practical introduction to the RDD for health researchers, describe four empirically testable assumptions of the design and offer strategies that can be used to assess whether these assumptions are met in a given study. For illustrative purposes, we implement these strategies to assess whether the RDD is appropriate for a study of the impact of human papillomavirus vaccination on cervical dysplasia. We found that, whereas the assumptions of the RDD were generally satisfied in our study context, birth timing had the potential to confound our effect estimate in an unexpected way and therefore needed to be taken into account in the analysis. Our findings underscore the importance of assessing the validity of the assumptions of this design, testing them when possible and making adjustments as necessary to support valid causal inference. © The Author 2016. Published by Oxford University Press on behalf of the International Epidemiological Association
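
    The core RDD idea described above (assignment at a cut-off, effect read off as the jump in the outcome at that cut-off) can be sketched with simulated data. The cutoff at 0, the true jump of 2.0, the linear trend, and the noise level are all assumptions of this illustration, not quantities from the HPV vaccination study; local linear fits on each side of the cut-off stand in for the study's actual estimation strategy.

```python
import random

random.seed(1)
CUTOFF = 0.0
TRUE_JUMP = 2.0  # illustrative effect size, not from the study

def linfit(xs, ys):
    """Ordinary least-squares line; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

# Simulate a smooth trend in the running variable plus a jump at the cutoff.
x = [random.uniform(-1, 1) for _ in range(2000)]
y = [1.0 + 0.5 * xi + (TRUE_JUMP if xi >= CUTOFF else 0.0)
     + random.gauss(0, 0.3) for xi in x]

left = [(xi, yi) for xi, yi in zip(x, y) if xi < CUTOFF]
right = [(xi, yi) for xi, yi in zip(x, y) if xi >= CUTOFF]
a0, b0 = linfit([p[0] for p in left], [p[1] for p in left])
a1, b1 = linfit([p[0] for p in right], [p[1] for p in right])

# The RDD estimate is the gap between the two fitted lines at the cutoff.
effect = (a1 + b1 * CUTOFF) - (a0 + b0 * CUTOFF)
print(f"estimated jump at cutoff: {effect:.2f}")  # should land near 2.0
```

    The design's validity checks discussed in the paper (no manipulation of the running variable, continuity of other covariates at the cut-off) are what license reading this jump causally.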

  10. Fitting N-mixture models to count data with unmodeled heterogeneity: Bias, diagnostics, and alternative approaches

    USGS Publications Warehouse

    Duarte, Adam; Adams, Michael J.; Peterson, James T.

    2018-01-01

    Monitoring animal populations is central to wildlife and fisheries management, and the use of N-mixture models toward these efforts has markedly increased in recent years. Nevertheless, relatively little work has evaluated estimator performance when basic assumptions are violated. Moreover, diagnostics to identify when bias in parameter estimates from N-mixture models is likely is largely unexplored. We simulated count data sets using 837 combinations of detection probability, number of sample units, number of survey occasions, and type and extent of heterogeneity in abundance or detectability. We fit Poisson N-mixture models to these data, quantified the bias associated with each combination, and evaluated if the parametric bootstrap goodness-of-fit (GOF) test can be used to indicate bias in parameter estimates. We also explored if assumption violations can be diagnosed prior to fitting N-mixture models. In doing so, we propose a new model diagnostic, which we term the quasi-coefficient of variation (QCV). N-mixture models performed well when assumptions were met and detection probabilities were moderate (i.e., ≥0.3), and the performance of the estimator improved with increasing survey occasions and sample units. However, the magnitude of bias in estimated mean abundance with even slight amounts of unmodeled heterogeneity was substantial. The parametric bootstrap GOF test did not perform well as a diagnostic for bias in parameter estimates when detectability and sample sizes were low. The results indicate the QCV is useful to diagnose potential bias and that potential bias associated with unidirectional trends in abundance or detectability can be diagnosed using Poisson regression. This study represents the most thorough assessment to date of assumption violations and diagnostics when fitting N-mixture models using the most commonly implemented error distribution. 
Unbiased estimates of population state variables are needed to properly inform management decision making. Therefore, we also discuss alternative approaches to yield unbiased estimates of population state variables using similar data types, and we stress that there is no substitute for an effective sample design that is grounded upon well-defined management objectives.
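
    The data-generating structure the abstract describes (site abundance Poisson-distributed, repeated counts binomial given abundance, with optional unmodeled site-level heterogeneity in detection) can be sketched as below. All parameter values are illustrative assumptions, and the naive max-count index shown is only a simple baseline, not the authors' QCV diagnostic or their fitted N-mixture estimator.

```python
import math
import random

random.seed(42)

def poisson(lam):
    """Knuth's Poisson sampler; adequate for small lam."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def simulate_counts(n_sites, n_occasions, lam, p_mean, p_sd=0.0):
    """Abundance N_i ~ Poisson(lam); counts y_it ~ Binomial(N_i, p_site)."""
    abundances, data = [], []
    for _ in range(n_sites):
        n_true = poisson(lam)
        # p_sd > 0 injects the site-level detection heterogeneity that a
        # plain Poisson N-mixture model does not account for.
        p_site = min(max(random.gauss(p_mean, p_sd), 0.01), 0.99)
        data.append([sum(random.random() < p_site for _ in range(n_true))
                     for _ in range(n_occasions)])
        abundances.append(n_true)
    return abundances, data

abund, counts = simulate_counts(200, 3, lam=5.0, p_mean=0.5, p_sd=0.15)
# The max count per site is a common naive abundance index; with p < 1 it
# underestimates true abundance, which an N-mixture model attempts to correct.
naive = sum(max(c) for c in counts) / len(counts)
print(f"mean true N: {sum(abund) / len(abund):.2f}, "
      f"naive max-count index: {naive:.2f}")
```

    The study's point is that when p_sd-style heterogeneity is present but unmodeled, the N-mixture correction itself becomes substantially biased, even though it outperforms naive indices when its assumptions hold.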

  11. Combustion Technology for Incinerating Wastes from Air Force Industrial Processes.

    DTIC Science & Technology

    1984-02-01

    The assumption of equilibrium between environmental compartments. * The statistical extrapolations yielding "safe" doses of various constituents...would be contacted to identify the assumptions and data requirements needed to design, construct and implement the model. The model’s primary objective...Recovery Planning Model (RRPLAN) is described. This section of the paper summarizes the model’s assumptions, major components and modes of operation

  12. Issues in the economic evaluation of influenza vaccination by injection of healthy working adults in the US: a review and decision analysis of ten published studies.

    PubMed

    Hogan, Thomas J

    2012-05-01

    The objective was to review recent economic evaluations of influenza vaccination by injection in the US, assess their evidence, and draw conclusions from their collective findings. The literature was searched for economic evaluations of influenza vaccination by injection in healthy working adults in the US published since 1995. Ten evaluations described in nine papers were identified. These were synopsized and their results evaluated, the basic structure of all evaluations was ascertained, and the sensitivity of outcomes to changes in parameter values was explored using a decision model. Areas in which to improve economic evaluations were noted. Eight of nine evaluations with credible economic outcomes were favourable to vaccination, a statistically significant result compared with the proportion of 50% that would be expected if vaccination and no vaccination were economically equivalent. Evaluations shared a basic structure but differed considerably with respect to cost components, assumptions, methods, and parameter estimates. Sensitivity analysis indicated that changes in parameter values within the feasible range, individually or simultaneously, could reverse economic outcomes. Given the stated misgivings, the methods of estimating the influenza reduction ascribed to vaccination must be researched to confirm that they produce accurate and reliable estimates. Research is also needed to improve estimates of the costs per case of influenza illness and the costs of vaccination. Based on their assumptions, the reviewed papers collectively appear to support the economic benefits of influenza vaccination of healthy adults. Yet the underlying assumptions, methods and parameter estimates themselves warrant further research to confirm they are accurate, reliable and appropriate for economic evaluation purposes.

  13. Intospace a European industrial initiative to commercialise space

    NASA Astrophysics Data System (ADS)

    von der Lippe, Juergen K.; Sprenger, Heinz J.

    2005-07-01

    Intospace, founded in 1985, was the response to the government's request to provide evidence for the industrial promise of commercial utilisation of space systems such as Spacelab and the already planned space station. The company was set up with an exceptional structure comprising 95 shareholders from all over western Europe, drawn from space and non-space industry and financial institutes. The companies joined as shareholders and committed, beyond the basic capital, to cover financial losses up to a given limit, allowing the company to invest in market development. Compared to other commercial initiatives in the European space scenario, the product that Intospace was supposed to offer was without doubt the most demanding one regarding its market prospects. The primary product of Intospace was to provide services to commercial customers for using microgravity for research and production in space. This was based on the assumption that an effective operational infrastructure with frequent flights of Spacelab and Eureca would be available, leading finally to the space station with Columbus. A further assumption had been that basic research projects of the agencies would provide sufficient data as a basis for commercial project planning. The conflict with these assumptions is best illustrated by the fact that the lifetime of Intospace is framed by the two shuttle disasters: the Challenger accident, a couple of months after the foundation of Intospace, and the Columbia accident, with Spacehab on board, which led to the liquidation of the company. The paper will present the background behind the foundation of the Intospace initiative and describe the objectives and major strategic steps to develop the market.

  14. The unique world of the Everett version of quantum theory

    NASA Astrophysics Data System (ADS)

    Squires, Euan J.

    1988-03-01

    We ask whether the basic Everett assumption, that there are no changes of the wavefunction other than those given by the Schrödinger equation, is compatible with experience. We conclude that it is, provided we allow the world of observation to be partially a creation of consciousness. The model suggests the possible existence of quantum paranormal effects.

  15. ENRICHMENT PROGRAM FOR ACADEMICALLY TALENTED JUNIOR HIGH SCHOOL STUDENTS FROM LOW INCOME FAMILIES.

    ERIC Educational Resources Information Center

    PRESSMAN, HARVEY

    A PROPOSAL FOR AN ENRICHMENT PROGRAM FOR ACADEMICALLY TALENTED JUNIOR HIGH SCHOOL STUDENTS FROM LOW-INCOME FAMILIES IN CERTAIN AREAS OF BOSTON IS PRESENTED. BASIC ASSUMPTIONS ARE THAT THERE IS AN OBVIOUS AND PRESSING NEED TO GIVE EXTRA HELP TO THE ABLE STUDENT FROM A DISADVANTAGED BACKGROUND, AND THAT A RELATIVELY BRIEF ENRICHMENT EXPERIENCE FOR…

  16. Redwoods—responsibilities for a long-lived species/resource

    Treesearch

    Robert Ewing

    2017-01-01

    What responsibilities do humans have to ensure that redwoods survive? And what values and strategies are required to accomplish such a purpose? A basic assumption is that the saving of a species, or more broadly of an ecosystem, is ultimately about human survival and that there is a responsibility to use all tools available to this end. To date, our actions to sustain...

  17. Comments on ""Contact Diffusion Interaction of Materials with Cladding''

    NASA Technical Reports Server (NTRS)

    Morris, J. F.

    1972-01-01

    A Russian paper by A. A. Babad-Zakhryapina contributes much to the understanding of fuel-clad interactions, and thus to nuclear thermionic technology. In that publication the basic diffusion expression is a simple one. A more general but complicated equation for this mass transport results from the present work. With appropriate assumptions, however, the new relation reduces to Babad-Zakhryapina's version.

  18. First order ball bearing kinematics

    NASA Technical Reports Server (NTRS)

    Kingbury, E.

    1984-01-01

    Two first order equations are given connecting geometry and internal motions in an angular contact ball bearing. Total speed, kinematic equivalence, basic speed ratio, and modal speed ratio are defined and discussed; charts are given for the speed ratios covering all bearings and all rotational modes. Instances where specific first order assumptions might fail are discussed, and the resulting effects on bearing performance reviewed.

  19. Forest inventories generate scientifically sound information on the forest resource, but do our data and information really matter?

    Treesearch

    Christoph Keinn; Goran Stahl

    2009-01-01

    Current research in forest inventory focuses very much on technical-statistical problems geared mainly to the optimization of data collection and information generation. The basic assumption is that better information leads to better decisions and, therefore, to better forest management and forest policy. Not many studies, however, strive to explicitly establish the...

  20. Four Scenarios for Determining the Size and Reusability of Learning Objects

    ERIC Educational Resources Information Center

    Schoonenboom, Judith

    2012-01-01

    The best method for determining the size of learning objects (LOs) so as to optimise their reusability has been a topic of debate for years now. Although there appears to be agreement on basic assumptions, developed guidelines and principles are often in conflict. This study shows that this confusion stems from the fact that in the literature,…

  1. A Survey of Report of Risk Management for Clay County, Florida.

    ERIC Educational Resources Information Center

    Florida State Dept. of Education, Tallahassee.

    Risk management encompasses far more than an insurance program alone. The basic elements consist of--(1) elimination or reduction of exposure to loss, (2) protection from exposure to loss, (3) assumption of risk loss, and (4) transfer of risk to a professional carrier. This survey serves as a means of evaluating the methods of application of these…

  2. Cable Television and Education: Proceedings of the CATV and Education Conference, May 11-12, 1973.

    ERIC Educational Resources Information Center

    Cardellino, Earl L., Comp.; Forsythe, Charles G., Comp.

    Edited versions of the conference presentations are compiled. The purpose of the meeting was to bring together media specialists and other educators from throughout Pennsylvania to evaluate the basic assumptions underlying the educational use of cable television (CATV) and to share ideas about the ways in which cable could be used to change the…

  3. Data-Driven Leadership: Determining Your Indicators and Building Your Dashboards

    ERIC Educational Resources Information Center

    Copeland, Mo

    2016-01-01

    For years, schools have tended to approach budgets with some basic assumptions and aspirations and general wish lists but with scant data to drive the budget conversation. Suppose there were a better way? What if the conversation started with a review of the last five to ten years of data on three key mission- and strategy-driven indicators:…

  4. The "Cause" of Low Self-Control: The Influence of Maternal Self-Control

    ERIC Educational Resources Information Center

    Nofziger, Stacey

    2008-01-01

    Self-control theory is one of the most tested theories within the field of criminology. However, one of the basic assumptions of the theory has remained largely ignored. Gottfredson and Hirschi stated that the focus of their general theory of crime is the "connection between the self-control of the parent and the subsequent self-control of the…

  5. Consumption of Mass Communication--Construction of a Model on Information Consumption Behaviour.

    ERIC Educational Resources Information Center

    Sepstrup, Preben

    A general conceptual model on the consumption of information is introduced. Information as the output of the mass media is treated as a product, and a model on the consumption of this product is developed by merging elements from consumer behavior theory and mass communication theory. Chapter I gives basic assumptions about the individual and the…

  6. Treating the Tough Adolescent: A Family-Based, Step-by-Step Guide. The Guilford Family Therapy Series.

    ERIC Educational Resources Information Center

    Sells, Scott P.

    A model for treating difficult adolescents and their families is presented. Part 1 offers six basic assumptions about the causes of severe behavioral problems and presents the treatment model with guidelines necessary to address each of these six causes. Case examples highlight and clarify major points within each of the 15 procedural steps of the…

  7. The Nature of Living Systems: An Exposition of the Basic Concepts in General Systems Theory.

    ERIC Educational Resources Information Center

    Miller, James G.

    General systems theory is a set of related definitions, assumptions, and propositions which deal with reality as an integrated hierarchy of organizations of matter and energy. In this paper, the author defines the concepts of space, time, matter, energy, and information in terms of their meaning in general systems theory. He defines a system as a…

  8. Cognitive access to numbers: the philosophical significance of empirical findings about basic number abilities.

    PubMed

    Giaquinto, Marcus

    2017-02-19

    How can we acquire a grasp of cardinal numbers, even the first very small positive cardinal numbers, given that they are abstract mathematical entities? That problem of cognitive access is the main focus of this paper. All the major rival views about the nature and existence of cardinal numbers face difficulties; and the view most consonant with our normal thought and talk about numbers, the view that cardinal numbers are sizes of sets, runs into the cognitive access problem. The source of the problem is the plausible assumption that cognitive access to something requires causal contact with it. It is argued that this assumption is in fact wrong, and that in this and similar cases, we should accept that a certain recognize-and-distinguish capacity is sufficient for cognitive access. We can then go on to solve the cognitive access problem, and thereby support the set-size view of cardinal numbers, by paying attention to empirical findings about basic number abilities. To this end, some selected studies of infants, pre-school children and a trained chimpanzee are briefly discussed. This article is part of a discussion meeting issue 'The origins of numerical abilities'. © 2017 The Author(s).

  9. Thinking science with thinking machines: The multiple realities of basic and applied knowledge in a research border zone.

    PubMed

    Hoffman, Steve G

    2015-04-01

    Some scholars dismiss the distinction between basic and applied science as passé, yet substantive assumptions about this boundary remain obdurate in research policy, popular rhetoric, the sociology and philosophy of science, and, indeed, at the level of bench practice. In this article, I draw on a multiple ontology framework to provide a more stable affirmation of a constructivist position in science and technology studies that cannot be reduced to a matter of competing perspectives on a single reality. The analysis is grounded in ethnographic research in the border zone of Artificial Intelligence science. I translate in-situ moments in which members of neighboring but differently situated labs engage in three distinct repertoires that render the reality of basic and applied science: partitioning, flipping, and collapsing. While the essences of scientific objects are nowhere to be found, the boundary between basic and applied is neither illusion nor mere propaganda. Instead, distinctions among scientific knowledge are made real as a matter of course.

  10. Experience with the EURECA Packet Telemetry and Packet Telecommand system

    NASA Technical Reports Server (NTRS)

    Sorensen, Erik Mose; Ferri, Paolo

    1994-01-01

    The European Retrievable Carrier (EURECA) was launched on its first flight on the 31st of July 1992 and retrieved on the 29th of June 1993. EURECA is characterized by several new on-board features, most notably packet telemetry and a partial implementation of packet telecommanding, making it the first ESA packetised spacecraft. Today, more than one year after the retrieval, the data from the EURECA mission have to a large extent been analysed, and we can present some of the interesting results. This paper concentrates on the implementation and operational experience with the EURECA Packet Telemetry and Packet Telecommanding. We discovered during the design of the ground system that the use of packet telemetry has a major impact on the overall design and that processing of packet telemetry may have a significant effect on computer loading and sizing. During the mission a number of problems were identified with the on-board implementation, resulting in very strange anomalous behaviours. Many of these problems directly violated basic assumptions made in the design of the ground segment, adding to the strange behaviour. The paper shows that the design of a packet telemetry system should be flexible enough to allow rapid reconfiguration of the telemetry processing in order to adapt to a new situation in case of an on-board failure. The experience gained with EURECA mission control should be used to improve ground systems for future missions.

  11. A Multi-state Model for Designing Clinical Trials for Testing Overall Survival Allowing for Crossover after Progression

    PubMed Central

    Xia, Fang; George, Stephen L.; Wang, Xiaofei

    2015-01-01

    In designing a clinical trial for comparing two or more treatments with respect to overall survival (OS), a proportional hazards assumption is commonly made. However, in many cancer clinical trials, patients pass through various disease states prior to death and because of this may receive treatments other than originally assigned. For example, patients may crossover from the control treatment to the experimental treatment at progression. Even without crossover, the survival pattern after progression may be very different than the pattern prior to progression. The proportional hazards assumption will not hold in these situations and the design power calculated on this assumption will not be correct. In this paper we describe a simple and intuitive multi-state model allowing for progression, death before progression, post-progression survival and crossover after progression and apply this model to the design of clinical trials for comparing the OS of two treatments. For given values of the parameters of the multi-state model, we simulate the required number of deaths to achieve a specified power and the distribution of time required to achieve the requisite number of deaths. The results may be quite different from those derived using the usual PH assumption. PMID:27239255
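
    The multi-state structure described above (death before progression, or progression followed by death, with crossover allowed to change the post-progression hazard) can be sketched with a small simulation. Exponential transition rates are used for simplicity, and every rate, sample size, and crossover fraction below is an illustrative assumption, not a parameter from the paper's design method.

```python
import random

random.seed(7)

def simulate_os(n, h_prog, h_death, h_post, h_post_cross=None, p_cross=0.0):
    """Mean overall survival under a simple illness-death model."""
    total = 0.0
    for _ in range(n):
        t_prog = random.expovariate(h_prog)    # time to progression
        t_death = random.expovariate(h_death)  # time to death w/o progression
        if t_death < t_prog:
            total += t_death                   # death before progression
        else:
            # Post-progression phase; crossover may lower the death hazard.
            h = h_post
            if h_post_cross is not None and random.random() < p_cross:
                h = h_post_cross
            total += t_prog + random.expovariate(h)
    return total / n

# Control arm without crossover vs. 60% crossover to a halved
# post-progression death hazard (mimicking a switch to the experimental arm).
os_plain = simulate_os(20000, h_prog=0.5, h_death=0.1, h_post=1.0)
os_cross = simulate_os(20000, h_prog=0.5, h_death=0.1, h_post=1.0,
                       h_post_cross=0.5, p_cross=0.6)
print(f"mean OS without crossover: {os_plain:.2f}")
print(f"mean OS with crossover:    {os_cross:.2f}")
```

    Crossover lengthens control-arm survival and thereby shrinks the observed between-arm OS difference, which is why a design power calculation based on a simple proportional hazards assumption can be badly off.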

  12. Statistical foundations of liquid-crystal theory: I. Discrete systems of rod-like molecules.

    PubMed

    Seguin, Brian; Fried, Eliot

    2012-12-01

    We develop a mechanical theory for systems of rod-like particles. Central to our approach is the assumption that the external power expenditure for any subsystem of rods is independent of the underlying frame of reference. This assumption is used to derive the basic balance laws for forces and torques. By considering inertial forces on par with other forces, these laws hold relative to any frame of reference, inertial or noninertial. Finally, we introduce a simple set of constitutive relations to govern the interactions between rods and find restrictions necessary and sufficient for these laws to be consistent with thermodynamics. Our framework provides a foundation for a statistical mechanical derivation of the macroscopic balance laws governing liquid crystals.

  13. On the physical parameters for Centaurus X-3 and Hercules X-1.

    NASA Technical Reports Server (NTRS)

    Mccluskey, G. E., Jr.; Kondo, Y.

    1972-01-01

    It is shown how upper and lower limits on the physical parameters of the X-ray sources in Centaurus X-3 and Hercules X-1 may be determined from a reasonably simple and straightforward consideration. The basic assumption is that component A (the non-X-ray-emitting component) is not a star collapsing toward its Schwarzschild radius (i.e., a black hole). This assumption appears reasonable since component A, the central occulting star, appears to physically occult component X. If component A is a 'normal' star, both observation and theory indicate that its mass is not greater than about 60 solar masses. The possibility that component X is either a neutron star or a white dwarf is considered.

  14. Discussion of examination of a cored hydraulic fracture in a deep gas well

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nolte, K.G.

    Warpinski et al. document information found from a core through a formation after a hydraulic fracture treatment. As they indicate, the core provides the first detailed evaluation of an actual propped hydraulic fracture away from the well and at a significant depth, and this evaluation leads to findings that deviate substantially from the assumptions incorporated into current fracturing models. In this discussion, a defense of current fracture design assumptions is developed. The affirmation of current assumptions, for general industry applications, is based on an assessment of the global impact of the local complexity found in the core. The assessment leads to recommendations for the evolution of fracture design practice.

  15. Is the hypothesis of preimplantation genetic screening (PGS) still supportable? A review.

    PubMed

    Gleicher, Norbert; Orvieto, Raoul

    2017-03-27

    The hypothesis of preimplantation genetic screening (PGS) was first proposed 20 years ago, suggesting that elimination of aneuploid embryos prior to transfer will improve implantation rates of remaining embryos during in vitro fertilization (IVF), increase pregnancy and live birth rates and reduce miscarriages. The aforementioned improved outcome was based on 5 essential assumptions: (i) Most IVF cycles fail because of aneuploid embryos. (ii) Their elimination prior to embryo transfer will improve IVF outcomes. (iii) A single trophectoderm biopsy (TEB) at blastocyst stage is representative of the whole TE. (iv) TE ploidy reliably represents the inner cell mass (ICM). (v) Ploidy does not change (i.e., self-correct) downstream from blastocyst stage. We aim to offer a review of the aforementioned assumptions and challenge the general hypothesis of PGS. We reviewed 455 publications, which as of January 20, 2017 were listed in PubMed under the search phrase <preimplantation genetic screening (PGS) for aneuploidy>. The literature review was performed by both authors, who agreed on the final 55 references. Various reports over the last 18 months have raised significant questions not only about the basic clinical utility of PGS but also about the biological underpinnings of the hypothesis and the technical ability of a single trophectoderm (TE) biopsy to accurately assess an embryo's ploidy, and have suggested that PGS actually negatively affects IVF outcomes while not affecting miscarriage rates. Moreover, due to high rates of false positive diagnoses as a consequence of high mosaicism rates in TE, PGS leads to the discarding of large numbers of normal embryos with potential for normal euploid pregnancies if transferred rather than disposed of. 
We found all 5 basic assumptions underlying the hypothesis of PGS to be unsupported: (i) The association of embryo aneuploidy with IVF failure has to be reevaluated in view of how much more common TE mosaicism is than was until recently appreciated. (ii) Reliable elimination of presumed aneuploid embryos prior to embryo transfer appears unrealistic. (iii) Mathematical models demonstrate that a single TEB cannot provide reliable information about the whole TE. (iv) TE does not reliably reflect the ICM. (v) Embryos likely still have a strong innate ability to self-correct downstream from the blastocyst stage, with the ICM doing so better than the TE. The hypothesis of PGS, therefore, no longer appears supportable. With all 5 basic assumptions underlying the hypothesis of PGS demonstrated to have been mistaken, the hypothesis of PGS itself appears to be discredited. Clinical use of PGS for the purpose of IVF outcome improvement should therefore, going forward, be restricted to research studies.

  16. Integrated Aeroservoelastic Optimization: Status and Direction

    NASA Technical Reports Server (NTRS)

    Livne, Eli

    1999-01-01

    The interactions of lightweight flexible airframe structures, steady and unsteady aerodynamics, and wide-bandwidth active controls on modern airplanes lead to considerable multidisciplinary design challenges. More than 25 years of development of mathematical and numerical methods, numerous basic research studies, simulations and wind-tunnel tests of simple models, wind-tunnel tests of complex models of real airplanes, as well as flight tests of actively controlled airplanes, have all contributed to the accumulation of a substantial body of knowledge in the area of aeroservoelasticity. A number of analysis codes, with the capability to model real airplane systems under the assumption of linearity, have been developed. Many tests have been conducted, and results were correlated with analytical predictions. A selective sample of references covering aeroservoelastic testing programs from the 1960s to the early 1980s, as well as more recent wind-tunnel test programs of real or realistic configurations, is included in the References section of this paper. An examination of references 20-29 will reveal that in the course of development (or later modification) of almost every modern airplane with a high-authority active control system, there arose a need to face aeroservoelastic problems and aeroservoelastic design challenges.

  17. Directed acyclic graphs (DAGs): an aid to assess confounding in dental research.

    PubMed

    Merchant, Anwar T; Pitiphat, Waranuch

    2002-12-01

    Confounding, a special type of bias, occurs when an extraneous factor is associated with the exposure and independently affects the outcome. In order to get an unbiased estimate of the exposure-outcome relationship, we need to identify potential confounders, collect information on them, design appropriate studies, and adjust for confounding in data analysis. However, it is not always clear which variables to collect information on and adjust for in the analyses. Inappropriate adjustment for confounding can even introduce bias where none existed. Directed acyclic graphs (DAGs) provide a method to select potential confounders and minimize bias in the design and analysis of epidemiological studies. DAGs have been used extensively in expert systems and robotics. Robins (1987) introduced the application of DAGs in epidemiology to overcome shortcomings of traditional methods to control for confounding, especially as they relate to unmeasured confounding. DAGs provide a quick and visual way to assess confounding without making parametric assumptions. We introduce DAGs, starting with definitions and rules for basic manipulation, stressing applications more than theory. We then demonstrate their application in the control of confounding through examples of observational and cross-sectional epidemiological studies.
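    The confounding structure that DAGs make explicit can be illustrated with a small simulation (a sketch only; the scenario, rates, and variable names are assumptions, not taken from the paper). A confounder C causes both exposure E and outcome Y, while E has no effect on Y, so the crude exposure-outcome association is entirely spurious and vanishes after adjusting for C, the back-door adjustment a DAG identifies:

```python
import random

random.seed(1)

# Simulated cohort: confounder C drives both exposure E and outcome Y.
# E has no direct effect on Y, so any crude E-Y association is confounding.
n = 50_000
rows = []
for _ in range(n):
    c = random.random() < 0.5                  # confounder present?
    e = random.random() < (0.7 if c else 0.2)  # exposure depends on C
    y = random.random() < (0.6 if c else 0.1)  # outcome depends on C only
    rows.append((c, e, y))

def risk(subset):
    return sum(y for _, _, y in subset) / len(subset)

crude_rr = (risk([r for r in rows if r[1]]) /
            risk([r for r in rows if not r[1]]))   # biased, well above 1

# Stratifying on C removes the bias: within each stratum of C the
# exposure-outcome risk ratio is close to 1.
rr_by_stratum = []
for level in (True, False):
    stratum = [r for r in rows if r[0] == level]
    rr_by_stratum.append(risk([r for r in stratum if r[1]]) /
                         risk([r for r in stratum if not r[1]]))

print(round(crude_rr, 2), [round(r, 2) for r in rr_by_stratum])
```

    The crude risk ratio here is roughly 2, while the stratum-specific ratios are near 1, the signature of confounding rather than causation.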

  18. The qualitative orientation in medical education research.

    PubMed

    Cleland, Jennifer Anne

    2017-06-01

    Qualitative research is very important in educational research as it addresses the "how" and "why" research questions and enables deeper understanding of experiences, phenomena and context. Qualitative research allows you to ask questions that cannot be easily put into numbers to understand human experience. Getting at the everyday realities of some social phenomenon and studying important questions as they are really practiced helps extend knowledge and understanding. To do so, you need to understand the philosophical stance of qualitative research and work from this to develop the research question, study design, data collection methods and data analysis. In this article, I provide an overview of the assumptions underlying qualitative research and the role of the researcher in the qualitative process. I then go on to discuss the types of research objectives that are common in qualitative research, then introduce the main qualitative designs, data collection tools, and finally the basics of qualitative analysis. I introduce the criteria by which you can judge the quality of qualitative research. Many classic references are cited in this article, and I urge you to seek out some of this further reading to inform your qualitative research program.

  19. Theory, modelling and calibration of passive samplers used in water monitoring: Chapter 7

    USGS Publications Warehouse

    Booij, K.; Vrana, B.; Huckins, James N.; Greenwood, Richard B.; Mills, Graham; Vrana, B.

    2007-01-01

    This chapter discusses contaminant uptake by a passive sampling device (PSD) that consists of a central sorption phase surrounded by a membrane. A variety of models has been used over the past few years to better understand the kinetics of contaminant transfer to passive samplers. These models are essential for understanding how the amounts of absorbed contaminants relate to ambient concentrations, as well as for the design and evaluation of calibration experiments. Models differ in the number of phases and simplifying assumptions that are taken into consideration, such as the existence of (pseudo-)steady-state conditions, the presence or absence of linear concentration gradients within the membrane phase, the way in which transport within the water boundary layer (WBL) is modeled, and whether or not the aqueous concentration is constant during the sampler exposure. The chapter introduces the basic concepts and models used in the literature on passive samplers for the special case of triolein-containing semipermeable membrane devices (SPMDs). These can easily be extended to samplers with more or fewer sorption phases. It also discusses the transport of chemicals through the various phases constituting PSDs. The implications of these models for designing and evaluating calibration studies are also discussed.
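    The simplest of the model family described above is a one-compartment, first-order uptake model, N(t) = C_w * K_sw * V_s * (1 - exp(-k_e * t)), commonly used for SPMD-type samplers. The sketch below (parameter values are hypothetical, chosen only for illustration) shows its two operating regimes: a kinetic regime where uptake is nearly linear in time, and an equilibrium regime where the absorbed amount plateaus at C_w * K_sw * V_s:

```python
import math

def absorbed_amount(c_w, k_sw, v_s, k_e, t):
    """First-order uptake model for a passive sampler:
    N(t) = C_w * K_sw * V_s * (1 - exp(-k_e * t)).
    c_w  : ambient water concentration (ng/L)
    k_sw : sampler-water partition coefficient (L/L)
    v_s  : sorption-phase volume (L)
    k_e  : elimination (exchange) rate constant (1/day)
    t    : exposure time (days)
    """
    return c_w * k_sw * v_s * (1.0 - math.exp(-k_e * t))

# Hypothetical SPMD-like sampler parameters (illustrative only).
c_w, k_sw, v_s, k_e = 10.0, 1.0e5, 0.005, 0.02

# Kinetic (linear) regime: for t << 1/k_e, N ~ c_w*k_sw*v_s*k_e*t,
# so the sampler effectively time-integrates the ambient concentration.
early = absorbed_amount(c_w, k_sw, v_s, k_e, 1.0)
linear_approx = c_w * k_sw * v_s * k_e * 1.0

# Equilibrium regime: for t >> 1/k_e, N approaches c_w * k_sw * v_s.
late = absorbed_amount(c_w, k_sw, v_s, k_e, 500.0)
equilibrium = c_w * k_sw * v_s

print(early, linear_approx, late, equilibrium)
```

    Which regime a deployment falls in governs how a calibration experiment must be designed: in the kinetic regime the sampling rate must be known, while near equilibrium the partition coefficient dominates.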

  20. The qualitative orientation in medical education research

    PubMed Central

    2017-01-01

    Qualitative research is very important in educational research as it addresses the “how” and “why” research questions and enables deeper understanding of experiences, phenomena and context. Qualitative research allows you to ask questions that cannot be easily put into numbers to understand human experience. Getting at the everyday realities of some social phenomenon and studying important questions as they are really practiced helps extend knowledge and understanding. To do so, you need to understand the philosophical stance of qualitative research and work from this to develop the research question, study design, data collection methods and data analysis. In this article, I provide an overview of the assumptions underlying qualitative research and the role of the researcher in the qualitative process. I then go on to discuss the type of research objectives which are common in qualitative research, then introduce the main qualitative designs, data collection tools, and finally the basics of qualitative analysis. I introduce the criteria by which you can judge the quality of qualitative research. Many classic references are cited in this article, and I urge you to seek out some of these further reading to inform your qualitative research program. PMID:28597869

  1. Research on Basic Design Education: An International Survey

    ERIC Educational Resources Information Center

    Boucharenc, C. G.

    2006-01-01

    This paper reports on the results of a survey and qualitative analysis on the teaching of "Basic Design" in schools of design and architecture located in 22 countries. In the context of this research work, Basic Design means the teaching and learning of design fundamentals that may also be commonly referred to as the Principles of Two- and…

  2. 23 CFR 1340.3 - Basic design requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE § 1340.3 Basic design requirements. Surveys conducted in... requirement. The sample identified for the survey shall have a probability-based design such that estimates... 23 Highways 1 2010-04-01 2010-04-01 false Basic design requirements. 1340.3 Section 1340.3...

  3. 23 CFR 1340.3 - Basic design requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... STATE OBSERVATIONAL SURVEYS OF SEAT BELT USE § 1340.3 Basic design requirements. Surveys conducted in... requirement. The sample identified for the survey shall have a probability-based design such that estimates... 23 Highways 1 2011-04-01 2011-04-01 false Basic design requirements. 1340.3 Section 1340.3...

  4. Evaluation of agile designs in first-in-human (FIH) trials--a simulation study.

    PubMed

    Perlstein, Itay; Bolognese, James A; Krishna, Rajesh; Wagner, John A

    2009-12-01

    The aim of the investigation was to evaluate alternatives to standard first-in-human (FIH) designs in order to optimize the information gained from such studies by employing novel agile trial designs. Agile designs combine adaptive and flexible elements to enable optimized use of prior information either before and/or during conduct of the study to seamlessly update the study design. A comparison of the traditional 6 + 2 (active + placebo) subjects per cohort design with alternative, reduced sample size, agile designs was performed by using discrete event simulation. Agile designs were evaluated for specific adverse event models and rates as well as dose-proportional, saturated, and steep-accumulation pharmacokinetic profiles. Alternative, reduced sample size (hereafter referred to as agile) designs are proposed for cases where prior knowledge about pharmacokinetics and/or adverse event relationships are available or appropriately assumed. Additionally, preferred alternatives are proposed for a general case when prior knowledge is limited or unavailable. Within the tested conditions and stated assumptions, some agile designs were found to be as efficient as traditional designs. Thus, simulations demonstrated that the agile design is a robust and feasible approach to FIH clinical trials, with no meaningful loss of relevant information, as it relates to PK and AE assumptions. In some circumstances, applying agile designs may decrease the duration and resources required for Phase I studies, increasing the efficiency of early clinical development. We highlight the value and importance of useful prior information when specifying key assumptions related to safety, tolerability, and PK.

  5. International Organisations and the Construction of the Learning Active Citizen: An Analysis of Adult Learning Policy Documents from a Durkheimian Perspective

    ERIC Educational Resources Information Center

    Field, John; Schemmann, Michael

    2017-01-01

    The article analyses how citizenship is conceptualised in policy documents of four key international organisations. The basic assumption is that public policy has not turned away from adult learning for active citizenship, but that there are rather new ways in which international governmental organisations conceptualise and in some cases seek to…

  6. 26 CFR 1.404(a)-3 - Contributions of an employer to or under an employees' pension trust or annuity plan that meets...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... representative experience may be used as an assumed retirement age. Different basic assumptions or rates may be used for different classes of risks or different groups where justified by conditions or required by... proper, or except when a change is necessitated by reason of the use of different methods, factors...

  7. Learners in the English Learning and Skills Sector: The Implications of Half-Right Policy Assumptions

    ERIC Educational Resources Information Center

    Hodgson, Ann; Steer, Richard; Spours, Ken; Edward, Sheila; Coffield, Frank; Finlay, Ian; Gregson, Maggie

    2007-01-01

    The English Learning and Skills Sector (LSS) contains a highly diverse range of learners and covers all aspects of post-16 learning with the exception of higher education. In the research on which this paper is based we are concerned with the effects of policy on three types of learners--unemployed adults attempting to improve their basic skills…

  8. A Test of Three Basic Assumptions of Situational Leadership® II Model and Their Implications for HRD Practitioners

    ERIC Educational Resources Information Center

    Zigarmi, Drea; Roberts, Taylor Peyton

    2017-01-01

    Purpose: This study aims to test the following three assertions underlying the Situational Leadership® II (SLII) Model: all four leadership styles are received by followers; all four leadership styles are needed by followers; and if there is a fit between the leadership style a follower receives and needs, that follower will demonstrate favorable…

  9. The Trouble with Levels: A Reexamination of Craik and Lockhart's Framework for Memory Research

    ERIC Educational Resources Information Center

    Baddeley, Alan D.

    1978-01-01

    Begins by discussing a number of problems in applying a levels-of-processing approach to memory as proposed in the late 1960s and then revised in 1972 by Craik and Lockhart, suggests that some of the basic assumptions are false, and argues for information-processing models devised to study working memory and reading, which aim to explore the…

  10. Modernism, Postmodernism, or Neither? A Fresh Look at "Fine Art"

    ERIC Educational Resources Information Center

    Kamhi, Michelle Marder

    2006-01-01

    Numerous incidents have been reported in recent years wherein a work of art is mistaken as trash. The question is, how have people reached the point in the civilized world where a purported work of art cannot be distinguished from a pile of rubbish or a grid of condensation pipes? The answer to that question lies in the basic assumption of nearly…

  11. New Strategies for Delivering Library Resources to Users: Rethinking the Mechanisms in which Libraries Are Processing and Delivering Bibliographic Records

    ERIC Educational Resources Information Center

    El-Sherbini, Magda; Wilson, Amanda J

    2007-01-01

    The focus of this paper is to examine the current library practice of processing and delivering information and to introduce alternative scenarios that may keep librarians relevant in the technological era. In the scenarios presented here, the authors will attempt to challenge basic assumptions about the usefulness of and need for OPAC systems,…

  12. A Critical Reading of Ecocentrism and Its Meta-Scientific Use of Ecology: Instrumental versus Emancipatory Approaches in Environmental Education and Ecology Education

    ERIC Educational Resources Information Center

    Hovardas, Tasos

    2013-01-01

    The aim of the paper is to make a critical reading of ecocentrism and its meta-scientific use of ecology. First, basic assumptions of ecocentrism will be examined, which involve nature's intrinsic value, postmodern and modern positions in ecocentrism, and the subject-object dichotomy under the lenses of ecocentrism. Then, we will discuss…

  13. How Content and Symbolism in Mother Goose May Contribute to the Development of a Child's Integrated Psyche.

    ERIC Educational Resources Information Center

    Abrams, Joan

    Based on the assumption that the content and symbolism of nursery rhymes reflect the particular needs of those who respond to them, this paper analyzes Mother Goose rhymes in relation to the psychological stages of child development. Each basic need of the child, as defined in Bruno Bettelheim's "The Uses of Enchantment," is applied to…

  14. United States Air Force Agency Financial Report 2014

    DTIC Science & Technology

    2014-01-01

    basic sciences and 45 semester hours in humanities and social sciences. This 90 semester hour total comprises 60 percent of the total academic...Test and Evaluation Support $723 F-35 $628 Defense Research Sciences $373 GPS III-Operational Control Segment $373 Long Range Strike Bomber $359...Development, Test & Evaluation Family Housing & Military Construction (Less: Earned Revenue) Net Cost before Losses/ (Gains) from Actuarial Assumption

  15. Changing the Culture of Fuel Efficiency: A Change in Attitude

    DTIC Science & Technology

    2014-05-09

    2011 September). Organizational Culture: Assessment and Transformation. Journal of Change Management, 11(3), 305-328. Bandura, A. (1986). Social ...describes that, “organizational culture is a set of basic assumptions that a group has invented, discovered or developed in learning to cope with its...change. In the first category they found the most influential factors are leadership, attraction-selection-attrition, socialization, reward systems

  16. Rationality, Authority and Spindles: An Enquiry into Some Neglected Aspects of Organizational Effectiveness and a Partial Application to Public Schools.

    ERIC Educational Resources Information Center

    Allison, Derek J.

    Focusing on the problem of authority, an analysis of the theories of Max Weber, James D. Thompson, and Elliott Jaques forms the basis for this proposal for improved organizational effectiveness in public schools. Basic assumptions are that modern organizations are established and operated under rational principles and subject to rational analysis,…

  17. Patterns in δ15N in roots, stems, and leaves of sugar maple and American beech seedlings, saplings, and mature trees

    Treesearch

    L.H. Pardo; P. Semaoune; P.G. Schaberg; C. Eagar; M. Sebilo

    2013-01-01

    Stable isotopes of nitrogen (N) in plants are increasingly used to evaluate ecosystem N cycling patterns. A basic assumption in this research is that plant δ15N reflects the δ15N of the N source. Recent evidence suggests that plants may fractionate on uptake, transport, or transformation of N. If the...

  18. High Voltage Testing. Volume 2. Specifications and Test Procedures

    DTIC Science & Technology

    1982-08-01

    the greatest impact on the initial assumption and criteria developed in the published criteria documents include: dielectric withstanding voltage...3382-75 Measurement of Energy and Integrated Charge Transfer Due to Partial Discharges (Corona) Using Bridge Techniques. ASTM-D 3426 - Dielectric... Energy (NEMA Publication No. WC 7-1971). NEMA Publication No. 109 - AIEE-EEI-NEMA Standard Basic Insulation Level. 092-57 - Method of Test for Flash and

  19. Design Considerations for Large Computer Communication Networks,

    DTIC Science & Technology

    1976-04-01

    particular, we will discuss the last three assumptions in order to motivate some of the models to be considered in this chapter. Independence Assumption...channels. Part (a), again motivated by an earlier remark on deterministic routing, will become more accurate when we include in the model, based on fixed...hierarchical routing, then this assumption appears to be quite acceptable. Part (b) is motivated by the quite symmetrical structure of the networks considered

  20. Some important considerations in the development of stress corrosion cracking test methods.

    NASA Technical Reports Server (NTRS)

    Wei, R. P.; Novak, S. R.; Williams, D. P.

    1972-01-01

    Some of the precautions involved in the development of fracture-mechanics-based test methods for studying stress corrosion cracking are discussed. Following a review of pertinent analytical fracture mechanics considerations and of basic test methods, the implications for stress corrosion cracking studies of the kinetics of crack growth, which determine time-to-failure and life, are examined. It is shown that the basic assumption of the linear-elastic fracture mechanics analyses must be clearly recognized and satisfied in experimentation, and that the effects of incubation and nonsteady-state crack growth must also be properly taken into account in determining the crack growth kinetics, if valid data are to be obtained from fracture-mechanics-based test methods.

  1. Acoustic Absorption in Porous Materials

    NASA Technical Reports Server (NTRS)

    Kuczmarski, Maria A.; Johnston, James C.

    2011-01-01

    An understanding of both the areas of materials science and acoustics is necessary to successfully develop materials for acoustic absorption applications. This paper presents the basic knowledge and approaches for determining the acoustic performance of porous materials in a manner that will help materials researchers new to this area gain the understanding and skills necessary to make meaningful contributions to this field of study. Beginning with the basics and making as few assumptions as possible, this paper reviews relevant topics in the acoustic performance of porous materials, which are often used to make acoustic bulk absorbers, moving from the physics of sound wave interactions with porous materials to measurement techniques for flow resistivity, characteristic impedance, and wavenumber.
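    One widely used way to connect the measured flow resistivity of a porous material to its characteristic impedance and wavenumber is the empirical Delany-Bazley model. The sketch below is illustrative only (the coefficients are the commonly published ones, and the material parameters are hypothetical; this is not necessarily the formulation used in the paper):

```python
import math

RHO0, C0 = 1.21, 343.0  # air density (kg/m^3) and sound speed (m/s)

def delany_bazley(f, sigma):
    """Empirical Delany-Bazley model for a fibrous porous absorber.
    f     : frequency (Hz)
    sigma : static flow resistivity (Pa*s/m^2)
    Returns (characteristic impedance Zc, complex wavenumber k).
    Commonly quoted validity range: 0.01 < rho0*f/sigma < 1."""
    x = RHO0 * f / sigma  # dimensionless frequency parameter
    zc = RHO0 * C0 * (1 + 0.0571 * x**-0.754 - 1j * 0.087 * x**-0.732)
    k = (2 * math.pi * f / C0) * (1 + 0.0978 * x**-0.700
                                  - 1j * 0.189 * x**-0.595)
    return zc, k

# Example: a hypothetical fibrous absorber with sigma = 20,000 Pa*s/m^2,
# evaluated at 1 kHz (x ~ 0.06, inside the validity range).
zc, k = delany_bazley(1000.0, 20000.0)
print(zc, k)
```

    The negative imaginary parts of both quantities encode the dissipation that makes such a material an absorber; the real part of Zc exceeds the impedance of air, which is why an impedance-matching treatment is often needed at the surface.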

  2. Hydrogen donors and acceptors and basic amino acids jointly contribute to carcinogenesis.

    PubMed

    Tang, Man; Zhou, Yanchao; Li, Yiqi; Zou, Juntong; Yang, Beicheng; Cai, Li; Zhang, Xuelan; Liu, Qiuyun

    2017-01-01

    A hypothesis is postulated that high content of hydrogen donors and acceptors, and basic amino acids, causes the intracellular trapping of the H+ and Cl- ions, which increases cancer risks as local formation of HCl is mutagenic to DNA. Other cations such as Ca2+, and weak acids such as short-chain organic acids, may attenuate the intracellular gathering of the H+ and Cl-, two of the most abundant ions in the cells. Current data on increased cancer risks in diabetic and obese patients are consistent with the assumption that hydrogen bonding propensity on glucose, triglycerides and other molecules is among the causative factors. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. MHD processes in the outer heliosphere

    NASA Technical Reports Server (NTRS)

    Burlaga, L. F.

    1984-01-01

    The magnetic field measurements from Voyager and the magnetohydrodynamic (MHD) processes in the outer heliosphere are reviewed. A bibliography of the experimental and theoretical work concerning magnetic fields and plasmas observed in the outer heliosphere is given. Emphasis in this review is on basic concepts and dynamical processes involving the magnetic field. The theory that serves to explain and unify the interplanetary magnetic field and plasma observations is magnetohydrodynamics. Basic physical processes and observations that relate directly to solutions of the MHD equations are emphasized, but obtaining solutions of this complex system of equations involves various assumptions and approximations. The spatial and temporal complexity of the outer heliosphere and some approaches for dealing with this complexity are discussed.

  4. Basic statistics (the fundamental concepts).

    PubMed

    Lim, Eric

    2014-12-01

    An appreciation and understanding of statistics is important to all practising clinicians, not simply researchers. This is because mathematics is the fundamental basis on which we base clinical decisions, usually with reference to benefit in relation to risk. Unless a clinician has a basic understanding of statistics, he or she will never be in a position to question healthcare management decisions that have been handed down from generation to generation, will not be able to conduct research effectively, nor evaluate the validity of published evidence (usually making an assumption that most published work is either all good or all bad). This article provides a brief introduction to basic statistical methods and illustrates their use in common clinical scenarios. In addition, pitfalls of incorrect usage have been highlighted. However, it is not meant to be a substitute for formal training or consultation with a qualified and experienced medical statistician prior to starting any research project.
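    A typical clinical scenario of the kind the article refers to is weighing benefit against risk from a two-arm trial. The sketch below (trial numbers are invented for illustration) computes three standard quantities: relative risk, absolute risk reduction, and number needed to treat:

```python
# Hypothetical two-arm trial: event counts (e.g., deaths) per arm.
control_events, control_n = 30, 200
treat_events, treat_n = 18, 200

risk_control = control_events / control_n  # 0.15
risk_treat = treat_events / treat_n        # 0.09

relative_risk = risk_treat / risk_control             # 0.6
absolute_risk_reduction = risk_control - risk_treat   # 0.06
number_needed_to_treat = 1 / absolute_risk_reduction  # ~16.7

print(relative_risk, absolute_risk_reduction, round(number_needed_to_treat))
```

    The same 40% relative risk reduction can correspond to very different absolute benefits depending on the baseline risk, which is exactly the kind of distinction a clinician needs statistics to see.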

  5. The Equations of Oceanic Motions

    NASA Astrophysics Data System (ADS)

    Müller, Peter

    2006-10-01

    Modeling and prediction of oceanographic phenomena and climate is based on the integration of dynamic equations. The Equations of Oceanic Motions derives and systematically classifies the most common dynamic equations used in physical oceanography, from large scale thermohaline circulations to those governing small scale motions and turbulence. After establishing the basic dynamical equations that describe all oceanic motions, Müller then derives approximate equations, emphasizing the assumptions made and physical processes eliminated. He distinguishes between geometric, thermodynamic and dynamic approximations and between the acoustic, gravity, vortical and temperature-salinity modes of motion. Basic concepts and formulae of equilibrium thermodynamics, vector and tensor calculus, curvilinear coordinate systems, and the kinematics of fluid motion and wave propagation are covered in appendices. Providing the basic theoretical background for graduate students and researchers of physical oceanography and climate science, this book will serve as both a comprehensive text and an essential reference.

  6. A guide to understanding social science research for natural scientists.

    PubMed

    Moon, Katie; Blackman, Deborah

    2014-10-01

    Natural scientists are increasingly interested in social research because they recognize that conservation problems are commonly social problems. Interpreting social research, however, requires at least a basic understanding of the philosophical principles and theoretical assumptions of the discipline, which are embedded in the design of social research. Natural scientists who engage in social science but are unfamiliar with these principles and assumptions can misinterpret their results. We developed a guide to assist natural scientists in understanding the philosophical basis of social science to support the meaningful interpretation of social research outcomes. The 3 fundamental elements of research are ontology, what exists in the human world that researchers can acquire knowledge about; epistemology, how knowledge is created; and philosophical perspective, the philosophical orientation of the researcher that guides her or his action. Many elements of the guide also apply to the natural sciences. Natural scientists can use the guide to assist them in interpreting social science research to determine how the ontological position of the researcher can influence the nature of the research; how the epistemological position can be used to support the legitimacy of different types of knowledge; and how philosophical perspective can shape the researcher's choice of methods and affect interpretation, communication, and application of results. The use of this guide can also support and promote the effective integration of the natural and social sciences to generate more insightful and relevant conservation research outcomes. © 2014 Society for Conservation Biology.

  7. Building patient-centeredness: hospital design as an interpretive act.

    PubMed

    Bromley, Elizabeth

    2012-09-01

    Hospital designs reflect the sociocultural, economic, professional, and aesthetic priorities prevalent at a given time. As such, hospital buildings concretize assumptions about illness, care and healing, patienthood, and medical providers' roles. Trends in hospital design have been attributed to the increasing influence of consumerism on healthcare, the influx of business-oriented managers, and technological changes. This paper describes the impact of the concept of patient-centeredness on the design of a new hospital in the USA. Data come from 35 interviews with planners, administrators, and designers of the new hospital, as well as from public documents about the hospital design. Thematic content analysis was used to identify salient design principles and intents. For these designers, administrators, and planners, an interpretation of patient-centeredness served as a heuristic, guiding the most basic decisions about space, people, and processes in the hospital. I detail the particular interpretation of patient-centeredness used to build and manage the new hospital space and the roles and responsibilities of providers working within it. Three strategies were central to the implementation of patient-centeredness: an onstage/offstage layout; a concierge approach to patients; and the scripting of communication. I discuss that this interpretation of patient-centeredness may challenge medical professionals' roles, may construct medical care as a product that should sate the patient's desire, and may distance patients from the realities of medical care. By describing the ways in which hospital designs reflect and reinforce contemporary concepts of patienthood and caring, this paper raises questions about the implementation of patient-centeredness that deserve further empirical study by medical social scientists. Copyright © 2012 Elsevier Ltd. All rights reserved.

  8. Multi-Objective Hybrid Optimal Control for Multiple-Flyby Interplanetary Mission Design using Chemical Propulsion

    NASA Technical Reports Server (NTRS)

    Englander, Jacob A.; Vavrina, Matthew A.

    2015-01-01

    Preliminary design of high-thrust interplanetary missions is a highly complex process. The mission designer must choose discrete parameters such as the number of flybys and the bodies at which those flybys are performed. For some missions, such as surveys of small bodies, the mission designer also contributes to target selection. In addition, real-valued decision variables, such as launch epoch, flight times, maneuver and flyby epochs, and flyby altitudes must be chosen. There are often many thousands of possible trajectories to be evaluated. The customer who commissions a trajectory design is not usually interested in a point solution, but rather the exploration of the trade space of trajectories between several different objective functions. This can be a very expensive process in terms of the number of human analyst hours required. An automated approach is therefore very desirable. This work presents such an approach by posing the impulsive mission design problem as a multi-objective hybrid optimal control problem. The method is demonstrated on several real-world problems. Two assumptions are frequently made to simplify the modeling of an interplanetary high-thrust trajectory during the preliminary design phase. The first assumption is that because the available thrust is high, any maneuvers performed by the spacecraft can be modeled as discrete changes in velocity. This assumption removes the need to integrate the equations of motion governing the motion of a spacecraft under thrust and allows the change in velocity to be modeled as an impulse and the expenditure of propellant to be modeled using the time-independent solution to Tsiolkovsky's rocket equation [1]. The second assumption is that the spacecraft moves primarily under the influence of the central body, i.e. the sun, and all other perturbing forces may be neglected in preliminary design. The path of the spacecraft may then be modeled as a series of conic sections. 
When a spacecraft performs a close approach to a planet, the central body switches from the sun to that planet and the trajectory is modeled as a hyperbola with respect to the planet. This is known as the method of patched conics. The impulsive and patched-conic assumptions significantly simplify the preliminary design problem.
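Under the impulsive assumption, propellant expenditure follows directly from the time-independent solution of Tsiolkovsky's rocket equation mentioned above. The sketch below illustrates the bookkeeping; the spacecraft mass, delta-v, and specific impulse are illustrative values, not figures from the paper.

```python
import math

def propellant_for_impulse(m0_kg, delta_v_ms, isp_s, g0=9.80665):
    """Propellant consumed by a single impulsive maneuver, from the
    time-independent solution of Tsiolkovsky's rocket equation:
    delta_v = isp * g0 * ln(m0 / mf)."""
    mf = m0_kg * math.exp(-delta_v_ms / (isp_s * g0))
    return m0_kg - mf

# Illustrative numbers only (not from the paper): a 2000 kg spacecraft
# performing a 1.2 km/s impulse with a 320 s specific-impulse engine.
burned = propellant_for_impulse(2000.0, 1200.0, 320.0)
```

Because the maneuver is treated as instantaneous, no integration of the thrust arc is needed; only the initial mass, the impulse magnitude, and the engine's specific impulse enter the calculation.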

  9. Considerations in the design of a communication network for an autonomously managed power system

    NASA Technical Reports Server (NTRS)

    Mckee, J. W.; Whitehead, Norma; Lollar, Louis

    1989-01-01

    The considerations involved in designing a communication network for an autonomously managed power system intended for use in space vehicles are examined. An overview of the design and implementation of a communication network implemented in a breadboard power system is presented. An assumption that the monitoring and control devices are distributed but physically close leads to the selection of a multidrop cable communication system. The assumption of a high-quality communication cable in which few messages are lost results in a simple recovery procedure consisting of a time-out-and-retransmit process.
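A time-out-and-retransmit recovery of the kind described can be sketched as follows; the function names and the polling structure are assumptions for illustration, not the breadboard system's actual implementation.

```python
import time

def send_with_retry(send, recv_ack, timeout_s=0.05, max_retries=3):
    """Minimal time-out-and-retransmit recovery: transmit, wait up to
    timeout_s for an acknowledgement, and retransmit on silence until the
    retry budget is exhausted. Returns the number of retransmissions that
    were needed."""
    for attempt in range(max_retries + 1):
        send()
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            if recv_ack():
                return attempt
        # no acknowledgement before the deadline: fall through and retransmit
    raise TimeoutError("message lost more often than the retry budget allows")
```

The simplicity is exactly what the abstract argues for: because the cable rarely loses messages, no sequence numbers or negative acknowledgements are needed, only a timer and a resend.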

  10. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    USGS Publications Warehouse

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon the purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study area or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.

  11. Dynamics of a Tularemia Outbreak in a Closely Monitored Free-Roaming Population of Wild House Mice.

    PubMed

    Dobay, Akos; Pilo, Paola; Lindholm, Anna K; Origgi, Francesco; Bagheri, Homayoun C; König, Barbara

    2015-01-01

    Infectious disease outbreaks can be devastating because of their sudden occurrence, as well as the complexity of monitoring and controlling them. Outbreaks in wildlife are even more challenging to observe and describe, especially when small animals or secretive species are involved. Modeling such infectious disease events is relevant to investigating their dynamics and is critical for decision makers to accomplish outbreak management. Tularemia, caused by the bacterium Francisella tularensis, is a potentially lethal zoonosis. Of the few animal outbreaks that have been reported in the literature, only those affecting zoo animals have been closely monitored. Here, we report the first estimation of the basic reproduction number R0 of an outbreak in wildlife caused by F. tularensis using quantitative modeling based on a susceptible-infected-recovered framework. We applied that model to data collected during an extensive investigation of an outbreak of tularemia caused by F. tularensis subsp. holarctica (also designated as type B) in a closely monitored, free-roaming house mouse (Mus musculus domesticus) population in Switzerland. Based on our model and assumptions, the best estimated basic reproduction number R0 of the current outbreak is 1.33. Our results suggest that tularemia can cause severe outbreaks in small rodents. We also concluded that the outbreak self-exhausted in approximately three months without administering antibiotics.
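The susceptible-infected-recovered framework behind the estimate can be sketched with a minimal Euler integration. The transmission and recovery rates below are hypothetical, chosen only so that their ratio matches the estimated R0 of about 1.33; the population size is illustrative, not the study's.

```python
def sir_step(s, i, r, beta, gamma, dt=0.1):
    """One forward-Euler step of the susceptible-infected-recovered (SIR)
    model with frequency-dependent transmission; at the start of an
    outbreak the basic reproduction number is R0 = beta / gamma."""
    n = s + i + r
    new_infections = beta * s * i / n * dt
    new_recoveries = gamma * i * dt
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

# Hypothetical rates with beta/gamma = 1.33, matching the estimated R0.
beta, gamma = 0.4, 0.3
s, i, r = 999.0, 1.0, 0.0
for _ in range(2000):   # integrate to t = 200 (arbitrary time units)
    s, i, r = sir_step(s, i, r, beta, gamma)
# With R0 only modestly above the threshold of 1, the epidemic burns out
# on its own well before the whole population is infected.
```

This self-limiting behavior is consistent with the abstract's conclusion that the outbreak exhausted itself without intervention.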

  12. Dynamics of a Tularemia Outbreak in a Closely Monitored Free-Roaming Population of Wild House Mice

    PubMed Central

    Dobay, Akos; Pilo, Paola; Lindholm, Anna K.; Origgi, Francesco; Bagheri, Homayoun C.; König, Barbara

    2015-01-01

    Infectious disease outbreaks can be devastating because of their sudden occurrence, as well as the complexity of monitoring and controlling them. Outbreaks in wildlife are even more challenging to observe and describe, especially when small animals or secretive species are involved. Modeling such infectious disease events is relevant to investigating their dynamics and is critical for decision makers to accomplish outbreak management. Tularemia, caused by the bacterium Francisella tularensis, is a potentially lethal zoonosis. Of the few animal outbreaks that have been reported in the literature, only those affecting zoo animals have been closely monitored. Here, we report the first estimation of the basic reproduction number R0 of an outbreak in wildlife caused by F. tularensis using quantitative modeling based on a susceptible-infected-recovered framework. We applied that model to data collected during an extensive investigation of an outbreak of tularemia caused by F. tularensis subsp. holarctica (also designated as type B) in a closely monitored, free-roaming house mouse (Mus musculus domesticus) population in Switzerland. Based on our model and assumptions, the best estimated basic reproduction number R0 of the current outbreak is 1.33. Our results suggest that tularemia can cause severe outbreaks in small rodents. We also concluded that the outbreak self-exhausted in approximately three months without administering antibiotics. PMID:26536232

  13. Experimental measurement of binding energy, selectivity, and allostery using fluctuation theorems.

    PubMed

    Camunas-Soler, Joan; Alemany, Anna; Ritort, Felix

    2017-01-27

    Thermodynamic bulk measurements of binding reactions rely on the validity of the law of mass action and the assumption of a dilute solution. Yet, important biological systems such as allosteric ligand-receptor binding, macromolecular crowding, or misfolded molecules may not follow these assumptions and may require a particular reaction model. Here we introduce a fluctuation theorem for ligand binding and an experimental approach using single-molecule force spectroscopy to determine binding energies, selectivity, and allostery of nucleic acids and peptides in a model-independent fashion. A similar approach could be used for proteins. This work extends the use of fluctuation theorems beyond unimolecular folding reactions, bridging the thermodynamics of small systems and the basic laws of chemical equilibrium. Copyright © 2017, American Association for the Advancement of Science.

  14. On the accuracy of personality judgment: a realistic approach.

    PubMed

    Funder, D C

    1995-10-01

    The "accuracy paradigm" for the study of personality judgment provides an important, new complement to the "error paradigm" that dominated this area of research for almost 2 decades. The present article introduces a specific approach within the accuracy paradigm called the Realistic Accuracy Model (RAM). RAM begins with the assumption that personality traits are real attributes of individuals. This assumption entails the use of a broad array of criteria for the evaluation of personality judgment and leads to a model that describes accuracy as a function of the availability, detection, and utilization of relevant behavioral cues. RAM provides a common explanation for basic moderators of accuracy, sheds light on how these moderators interact, and outlines a research agenda that includes the reintegration of the study of error with the study of accuracy.

  15. The limits of discipline: towards interdisciplinary food studies.

    PubMed

    Wilk, Richard

    2012-11-05

    While the number of scholars working on the broad topic of food has never been greater, the topic is still divided among numerous disciplines and specialists who do not often communicate with each other. This paper discusses some of the deep differences between disciplinary approaches, and concludes that food scientists differ in some of their basic assumptions about human nature. After outlining some of the institutional issues standing in the way of interdisciplinary work, the paper argues for a more synthetic and empirical approach, grounded in the study of everyday life. True interdisciplinary collaboration will have to go beyond assembling multidisciplinary teams. Instead we must accept the limitations of the classic disciplinary paradigms, and be willing to question and test our methods and assumptions. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Saturation behavior: a general relationship described by a simple second-order differential equation.

    PubMed

    Kepner, Gordon R

    2010-04-13

    The numerous natural phenomena that exhibit saturation behavior, e.g., ligand binding and enzyme kinetics, have been approached, to date, via empirical and particular analyses. This paper presents a mechanism-free and assumption-free second-order differential equation, designed only to describe a typical relationship between the variables governing these phenomena. It develops a mathematical model for this relation, based solely on the analysis of the typical experimental data plot and its saturation characteristics. Its utility complements the traditional empirical approaches. For the general saturation curve, described in terms of its independent (x) and dependent (y) variables, a second-order differential equation is obtained that applies to any saturation phenomena. It shows that the driving factor for the basic saturation behavior is the probability of the interactive site being free, which is described quantitatively. Solving the equation relates the variables in terms of the two empirical constants common to all these phenomena, the initial slope of the data plot and the limiting value at saturation. A first-order differential equation for the slope emerged that led to the concept of the effective binding rate at the active site and its dependence on the calculable probability that the interactive site is free. These results are illustrated using specific cases, including ligand binding and enzyme kinetics. This leads to a revised understanding of how to interpret the empirical constants, in terms of the variables pertinent to the phenomenon under study. The second-order differential equation revealed the basic underlying relations that describe these saturation phenomena, and the basic mathematical properties of the standard experimental data plot. It was shown how to integrate this differential equation, and define the common basic properties of these phenomena.
The results regarding the importance of the slope and the new perspectives on the empirical constants governing the behavior of these phenomena led to an alternative perspective on saturation behavior kinetics. Their essential commonality was revealed by this analysis, based on the second-order differential equation.
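In the abstract's notation, the saturation curve can be written directly in terms of its two empirical constants, the initial slope and the limiting value at saturation. The rectangular-hyperbola form below is the standard one (e.g. Michaelis-Menten, with half-saturation constant K = y_max / initial_slope) and is assumed here for illustration, not reproduced from the paper's derivation.

```python
def saturation(x, initial_slope, y_max):
    """General saturation curve expressed through the two empirical
    constants named in the abstract: the initial slope of the data plot
    and the limiting value at saturation."""
    return y_max * x / (y_max / initial_slope + x)

def probability_site_free(x, initial_slope, y_max):
    """The driving factor identified in the abstract: the probability
    that the interactive site is free, i.e. the unsaturated fraction
    1 - y / y_max."""
    return 1.0 - saturation(x, initial_slope, y_max) / y_max
```

Note that the curve's slope at the origin equals `initial_slope`, its value tends to `y_max` for large x, and the effective binding rate falls off in proportion to the probability that the site is free.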

  17. Challenging Teachers' Pedagogic Practice and Assumptions about Social Media

    ERIC Educational Resources Information Center

    Cartner, Helen C.; Hallas, Julia L.

    2017-01-01

    This article describes an innovative approach to professional development designed to challenge teachers' pedagogic practice and assumptions about educational technologies such as social media. Developing effective technology-related professional development for teachers can be a challenge for institutions and facilitators who provide this…

  18. Experimental evolution in silico: a custom-designed mathematical model for virulence evolution of Bacillus thuringiensis.

    PubMed

    Strauß, Jakob Friedrich; Crain, Philip; Schulenburg, Hinrich; Telschow, Arndt

    2016-08-01

    Most mathematical models on the evolution of virulence are based on epidemiological models that assume parasite transmission follows the mass action principle. In experimental evolution, however, mass action is often violated due to controlled infection protocols. This "theory-experiment mismatch" raises the question whether there is a need for new mathematical models to accommodate the particular characteristics of experimental evolution. Here, we explore the experimental evolution model system of Bacillus thuringiensis as a parasite and Caenorhabditis elegans as a host. Recent experimental studies with strict control of parasite transmission revealed that one-sided adaptation of B. thuringiensis with non-evolving hosts selects for intermediate or no virulence, sometimes coupled with parasite extinction. In contrast, host-parasite coevolution selects for high virulence and for hosts with strong resistance against B. thuringiensis. In order to explain the empirical results, we propose a new mathematical model that mimics the basic experimental set-up. The key assumptions are: (i) controlled parasite transmission (no mass action), (ii) discrete host generations, and (iii) context-dependent cost of toxin production. Our model analysis revealed the same basic trends as found in the experiments. In particular, we could show that resistant hosts select for highly virulent bacterial strains. Moreover, we found (i) that the evolved level of virulence is independent of the initial level of virulence, and (ii) that the average amount of bacteria ingested significantly affects the evolution of virulence, with fewer bacteria ingested selecting for highly virulent strains. These predictions can be tested in future experiments. This study highlights the usefulness of custom-designed mathematical models in the analysis and interpretation of empirical results from experimental evolution. Copyright © 2016 The Authors. Published by Elsevier GmbH. All rights reserved.

  19. Intonation and compensation of fretted string instruments

    NASA Astrophysics Data System (ADS)

    Varieschi, Gabriele; Gower, Christina

    2011-04-01

    We discuss theoretical and physical models that are useful for analyzing the intonation of musical instruments such as guitars and mandolins and can be used to improve the tuning on these instruments. The placement of frets on the fingerboard is designed according to mathematical rules and the assumption of an ideal string. The analysis becomes more complicated when we include the effects of deformation of the string and inharmonicity due to other string characteristics. As a consequence, perfect intonation of all the notes on the instrument cannot be achieved, but complex compensation procedures can be introduced to minimize the problem. To test the validity of these procedures, we performed extensive measurements using standard monochord sonometers and other acoustical devices, confirming the correctness of our theoretical models. These experimental activities can be integrated into acoustics courses and laboratories and can become a more advanced version of basic experiments with monochords and sonometers. This work was supported by a grant from the Frank R. Seaver College of Science and Engineering, Loyola Marymount University.

  20. A new scenario framework for climate change research: The concept of Shared Climate Policy Assumptions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kriegler, Elmar; Edmonds, James A.; Hallegatte, Stephane

    2014-04-01

    The paper presents the concept of shared climate policy assumptions as an important element of the new scenario framework. Shared climate policy assumptions capture key climate policy dimensions such as the type and scale of mitigation and adaptation measures. They are not specified in the socio-economic reference pathways, and therefore introduce an important third dimension to the scenario matrix architecture. Climate policy assumptions will have to be made in any climate policy scenario, and can have a significant impact on the scenario description. We conclude that a meaningful set of shared climate policy assumptions is useful for grouping individual climate policy analyses and facilitating their comparison. Shared climate policy assumptions should be designed to be policy relevant, and as a set to be broad enough to allow a comprehensive exploration of the climate change scenario space.

  1. Conclusion: Agency in the face of complexity and the future of assumption-aware evaluation practice.

    PubMed

    Morrow, Nathan; Nkwake, Apollo M

    2016-12-01

    This final chapter in the volume pulls together common themes from the diverse set of articles by a group of eight authors in this issue, and presents some reflections on the next steps for improving the ways in which evaluators work with assumptions. Collectively, the authors provide a broad overview of existing and emerging approaches to the articulation and use of assumptions in evaluation theory and practice. The authors reiterate the rationale and key terminology as a common basis for working with assumptions in program design and evaluation. They highlight some useful concepts and categorizations to promote more rigorous treatment of assumptions in evaluation. A three-tier framework for fostering agency for assumption-aware evaluation practice is proposed: agency for themselves (evaluators), agency for others (stakeholders), and agency for standards and principles. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. PTSD as Meaning Violation: Testing a Cognitive Worldview Perspective.

    PubMed

    Park, Crystal L; Mills, Mary Alice; Edmondson, Donald

    2012-01-01

    The cognitive perspective on post-traumatic stress disorder (PTSD) has been successful in explaining many PTSD-related phenomena and in developing effective treatments, yet some of its basic assumptions remain surprisingly under-examined. The present study tested two of these assumptions: (1) situational appraisals of the event as violating global meaning (i.e., beliefs and goals) are related to PTSD symptomatology, and (2) the effect of situational appraisals of violation on PTSD symptomatology is mediated by global meaning (i.e., views of self and world). We tested these assumptions in a cross-sectional study of 130 college students who had experienced a DSM-IV level trauma. Structural equation modeling showed that appraisals of the extent to which the trauma violated one's beliefs and goals related fairly strongly to PTSD. In addition, the effects of appraisals of belief and goal violations on PTSD symptoms were fully mediated through negative global beliefs about both the self and the world. These findings support the cognitive worldview perspective, highlighting the importance of the meaning individuals assign to traumatic events, particularly the role of meaning violation.

  3. PTSD as Meaning Violation: Testing a Cognitive Worldview Perspective

    PubMed Central

    Park, Crystal L.; Mills, Mary Alice; Edmondson, Donald

    2014-01-01

    The cognitive perspective on post-traumatic stress disorder (PTSD) has been successful in explaining many PTSD-related phenomena and in developing effective treatments, yet some of its basic assumptions remain surprisingly under-examined. The present study tested two of these assumptions: (1) situational appraisals of the event as violating global meaning (i.e., beliefs and goals) are related to PTSD symptomatology, and (2) the effect of situational appraisals of violation on PTSD symptomatology is mediated by global meaning (i.e., views of self and world). We tested these assumptions in a cross-sectional study of 130 college students who had experienced a DSM-IV level trauma. Structural equation modeling showed that appraisals of the extent to which the trauma violated one's beliefs and goals related fairly strongly to PTSD. In addition, the effects of appraisals of belief and goal violations on PTSD symptoms were fully mediated through negative global beliefs about both the self and the world. These findings support the cognitive worldview perspective, highlighting the importance of the meaning individuals assign to traumatic events, particularly the role of meaning violation. PMID:24860641

  4. Defense and the Economy

    DTIC Science & Technology

    1993-01-01

    Assumptions ... b. Modeling Productivity ... and a macroeconomic model of the U.S. economy, designed to provide long-range projections consistent with trends in production technology, shifts in ... investments in roads, bridges, sewer systems, etc. In addition to these modeling assumptions, we also have introduced productivity increases to reflect the

  5. Lesbian health and the assumption of heterosexuality: an organizational perspective.

    PubMed

    Daley, Andrea

    2003-01-01

    This study used a qualitative research design to explore hospital policies and practices and the assumption of female heterosexuality. The assumption of heterosexuality is a product of discursive practices that normalize heterosexuality and individualize lesbian sexual identities. Literature indicates that the assumption of female heterosexuality is implicated in both the invisibility and marked visibility of lesbians as service users. This research adds to existing literature by shifting the focus of study from individual to organizational practices and, in so doing, seeks to uncover hidden truths, explore the functional power of language, and allow for the discovery of what we know and, equally important, how we know.

  6. Historical Thinking and Other Unnatural Acts: Charting the Future of Teaching the Past. Critical Perspectives on the Past.

    ERIC Educational Resources Information Center

    Wineburg, Sam

    What ways of thinking, writing, and questioning would be lost if we eliminated history from the curriculum? The essays in this book begin with the basic assumption that history teaches people a way to make choices, to balance opinions, to tell stories, and to become uneasy--when necessary--about the stories that are told. The book is concerned…

  7. Recombination-generation currents in degenerate semiconductors

    NASA Technical Reports Server (NTRS)

    Von Roos, O.

    1978-01-01

    The classical Shockley-Read-Hall theory of free carrier recombination and generation via traps is extended to degenerate semiconductors. A concise and simple expression is found which avoids completely the concept of a Fermi level, a concept which is alien to nonequilibrium situations. Assumptions made in deriving the recombination generation current are carefully delineated and are found to be basically identical to those made in the original theory applicable to nondegenerate semiconductors.

  8. French NATO Policy: The Next Five Years

    DTIC Science & Technology

    1990-06-01

    tradeoffs on the ambitious French modernization programs. Most dramatic have been the projected strategic consequences of perestroika: France, like ... project power into areas of French influence in the Third World. In the mid-1980s, France was spending roughly 3.9 percent of gross domestic product on ... policy environment and its effects on the basic assumptions underpinning French policy. He concludes that in the future, France will be easier to work

  9. [Medical service marketing at the time of medical insurance].

    PubMed

    Polyakov, I V; Uvarov, S A; Mikhaylova, L S; Lankin, K A

    1997-01-01

    Presents the approaches to applying the fundamentals of marketing to public health. Medical insurance organizations may effectively work as arbitrators and marketing agents; the basic assumptions of marketing theory underlie their activity. The concept of marketing implies investigation of the requirements of the users of medical services and the development of measures aimed at meeting the requirements of man in terms of health service and health maintenance.

  10. Techniques for the computation in demographic projections of health manpower.

    PubMed

    Horbach, L

    1979-01-01

    Some basic principles and algorithms are presented which can be used for projective calculations of medical staff on the basis of demographic data. The effects of modifications of the input data, such as by health policy measures concerning training capacity, can be demonstrated by repeated calculations under alternative assumptions. Such models give a variety of results and may highlight the probable future balance between health manpower supply and requirements.
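A projective calculation of the kind described reduces to a simple recurrence: each year the existing staff stock is depleted by attrition and replenished by the training-capacity intake. The figures and rates below are purely illustrative, not taken from the paper.

```python
def project_staff(initial, years, intake_per_year, attrition_rate):
    """Projective calculation of health manpower: each year the stock is
    reduced by an attrition rate (retirement, emigration, death) and
    increased by the annual training intake. Re-running with modified
    inputs shows the effect of a health-policy measure, as the paper
    suggests."""
    stock = [float(initial)]
    for _ in range(years):
        stock.append(stock[-1] * (1.0 - attrition_rate) + intake_per_year)
    return stock

# Purely illustrative inputs: 10,000 physicians, 400 graduates per year,
# 3% annual attrition; a second run models a policy raising the intake.
baseline = project_staff(10000, 10, 400, 0.03)
expanded = project_staff(10000, 10, 500, 0.03)
```

Comparing the two runs demonstrates the point made in the abstract: repeated calculations under alternative assumptions make the consequences of a policy change visible directly.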

  11. Forging a Combat Mobility Culture

    DTIC Science & Technology

    2006-04-01

    values and beliefs, and basic assumptions. Artifacts are the most visible aspects of an organization. They include physical environment ... Leadership, Command, and Communication Studies Academic Year 2006 Coursebook (Edited by Sharon McBride, Maxwell AFB, AL: Air Command and Staff
    Air Force Doing it Right?" In Leadership, Command, and Communication Studies Academic Year 2006 Coursebook. Edited by Sharon McBride, Maxwell AFB, AL: Air Command and Staff College, October 2005.
  12. Modeling precipitation δ 18O variability in East Asia since the Last Glacial Maximum: temperature and amount effects across different timescales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wen, Xinyu; Liu, Zhengyu; Chen, Zhongxiao

    Water isotopes in precipitation have played a key role in the reconstruction of past climate on millennial timescales and longer. But, for midlatitude regions like East Asia with complex terrain, the reliability behind the basic assumptions of the temperature effect and amount effect is based on modern observational data and still remains unclear for past climate. In the present work, we reexamine the two basic effects on seasonal, interannual, and millennial timescales in a set of time slice experiments for the period 22–0 ka using an isotope-enabled atmospheric general circulation model (AGCM). Our study confirms the robustness of the temperature and amount effects on the seasonal cycle over China in the present climatic conditions, with the temperature effect dominating in northern China and the amount effect dominating in the far south of China but no distinct effect in the transition region of central China. However, our analysis shows that neither temperature nor amount effect is significantly dominant over China on millennial and interannual timescales, which is a challenge to those classic assumptions in past climate reconstruction. This work helps shed light on the interpretation of the proxy record of δ 18O from a modeling point of view.

  13. Modeling precipitation δ 18O variability in East Asia since the Last Glacial Maximum: temperature and amount effects across different timescales

    DOE PAGES

    Wen, Xinyu; Liu, Zhengyu; Chen, Zhongxiao; ...

    2016-11-06

    Water isotopes in precipitation have played a key role in the reconstruction of past climate on millennial timescales and longer. But, for midlatitude regions like East Asia with complex terrain, the reliability behind the basic assumptions of the temperature effect and amount effect is based on modern observational data and still remains unclear for past climate. In the present work, we reexamine the two basic effects on seasonal, interannual, and millennial timescales in a set of time slice experiments for the period 22–0 ka using an isotope-enabled atmospheric general circulation model (AGCM). Our study confirms the robustness of the temperature and amount effects on the seasonal cycle over China in the present climatic conditions, with the temperature effect dominating in northern China and the amount effect dominating in the far south of China but no distinct effect in the transition region of central China. However, our analysis shows that neither temperature nor amount effect is significantly dominant over China on millennial and interannual timescales, which is a challenge to those classic assumptions in past climate reconstruction. This work helps shed light on the interpretation of the proxy record of δ 18O from a modeling point of view.

  14. Dynamics of an HIV-1 infection model with cell mediated immunity

    NASA Astrophysics Data System (ADS)

    Yu, Pei; Huang, Jianing; Jiang, Jiao

    2014-10-01

    In this paper, we study the dynamics of an improved mathematical model on HIV-1 virus with cell mediated immunity. This new 5-dimensional model is based on the combination of a basic 3-dimensional HIV-1 model and a 4-dimensional immunity response model, which more realistically describes the dynamics among uninfected cells, infected cells, virus, CTL response cells, and CTL effector cells. Our 5-dimensional model may be reduced to the 4-dimensional model by applying a quasi-steady state assumption on the virus variable. However, it is shown in this paper that the virus must be retained as a model variable, and that a quasi-steady state assumption should be applied carefully, since it may miss some important dynamical behavior of the system. Detailed bifurcation analysis is given to show that the system has three equilibrium solutions, namely the infection-free equilibrium, the infectious equilibrium without CTL, and the infectious equilibrium with CTL, and a series of bifurcations including two transcritical bifurcations and one or two possible Hopf bifurcations occur from these three equilibria as the basic reproduction number is varied. The mathematical methods applied in this paper include characteristic equations, Routh-Hurwitz condition, fluctuation lemma, Lyapunov function and computation of normal forms. Numerical simulation is also presented to demonstrate the applicability of the theoretical predictions.
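The quasi-steady state assumption discussed here can be made concrete on the basic 3-dimensional HIV-1 model (target cells, infected cells, free virus). The parameter values in this sketch are illustrative, not those of the paper.

```python
def hiv_step(t_cells, i_cells, virus, dt=0.01,
             s=10.0, d=0.01, beta=1e-4, delta=0.5, p=100.0, c=5.0):
    """One Euler step of the basic 3-dimensional HIV-1 model: uninfected
    target cells (produced at rate s, dying at rate d, infected at rate
    beta*T*V), infected cells (dying at rate delta), and free virus
    (produced at rate p per infected cell, cleared at rate c). A
    quasi-steady state assumption would replace the virus equation by
    virus = p * i_cells / c, reducing the system by one dimension; the
    paper cautions that this shortcut can discard important dynamics.
    All parameter values are illustrative."""
    d_t = s - d * t_cells - beta * t_cells * virus
    d_i = beta * t_cells * virus - delta * i_cells
    d_v = p * i_cells - c * virus
    return t_cells + d_t * dt, i_cells + d_i * dt, virus + d_v * dt

# For these illustrative parameters, (T, I, V) = (250, 15, 300) is the
# infectious equilibrium, where the QSSA relation V = p*I/c holds exactly;
# away from equilibrium the full and reduced systems can differ.
```

The point of the abstract is precisely that equality of the full and reduced models at equilibrium does not guarantee the same transient or bifurcation behavior.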

  15. A novel methodology for estimating upper limits of major cost drivers for profitable conceptual launch system architectures

    NASA Astrophysics Data System (ADS)

    Rhodes, Russel E.; Byrd, Raymond J.

    1998-01-01

    This paper presents a "back of the envelope" technique for fast, timely, on-the-spot assessment of affordability (profitability) of commercial space transportation architectural concepts. The tool presented here is not intended to replace conventional, detailed costing methodology. The process described enables "quick look" estimations and assumptions to effectively determine whether an initial concept (with its attendant cost estimating line items) provides focus for major leapfrog improvement. The Cost Charts Users Guide provides a generic sample tutorial, building an approximate understanding of the basic launch system cost factors and their representative magnitudes. This process will enable the user to develop a net "cost (and price) per payload-mass unit to orbit" incorporating a variety of significant cost drivers, supplemental to basic vehicle cost estimates. If acquisition cost and recurring cost factors (as a function of cost per payload-mass unit to orbit) do not meet the predetermined system-profitability goal, the concept in question will be clearly seen as non-competitive. Multiple analytical approaches, and applications of a variety of interrelated assumptions, can be examined in a quick, on-the-spot cost approximation analysis, as this tool has inherent flexibility. The technique will allow determination of concept conformance to system objectives.
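A "cost per payload-mass unit to orbit" figure of merit of the kind described can be approximated as in the sketch below. The split into acquisition and recurring cost and every number used are hypothetical; the paper's actual cost-chart line items are not reproduced here.

```python
def price_per_kg_to_orbit(acquisition_cost, recurring_cost_per_flight,
                          flights, payload_kg_per_flight, profit_margin=0.15):
    """Back-of-the-envelope figure of merit: amortize the non-recurring
    (acquisition) cost over the flight program, add the per-flight
    recurring cost, and mark the total up by the required profit margin.
    The breakdown and all defaults are illustrative assumptions."""
    cost_per_flight = acquisition_cost / flights + recurring_cost_per_flight
    return cost_per_flight * (1.0 + profit_margin) / payload_kg_per_flight

# e.g. a $1B acquisition spread over 100 flights, $10M recurring cost per
# flight, and 20 t of payload per flight
price = price_per_kg_to_orbit(1.0e9, 10.0e6, 100, 20000.0)
```

If the resulting price exceeds a predetermined profitability goal, the concept can be screened out immediately, which is exactly the "quick look" use the abstract describes.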

  16. Querying phenotype-genotype relationships on patient datasets using semantic web technology: the example of Cerebrotendinous xanthomatosis.

    PubMed

    Taboada, María; Martínez, Diego; Pilo, Belén; Jiménez-Escrig, Adriano; Robinson, Peter N; Sobrido, María J

    2012-07-31

    Semantic Web technology can considerably catalyze translational genetics and genomics research in medicine, where the interchange of information between basic research and clinical levels becomes crucial. This exchange involves mapping abstract phenotype descriptions from research resources, such as knowledge databases and catalogs, to unstructured datasets produced through experimental methods and clinical practice. This is especially true for the construction of mutation databases. This paper presents a way of harmonizing abstract phenotype descriptions with patient data from clinical practice, and querying this dataset about relationships between phenotypes and genetic variants, at different levels of abstraction. Due to the current availability of ontological and terminological resources that have already reached some consensus in biomedicine, a reuse-based ontology engineering approach was followed. The proposed approach uses the Web Ontology Language (OWL) to represent the phenotype ontology and the patient model, the Semantic Web Rule Language (SWRL) to bridge the gap between phenotype descriptions and clinical data, and the Semantic Query-Enhanced Web Rule Language (SQWRL) to query relevant phenotype-genotype bidirectional relationships. The work tests the use of semantic web technology in the biomedical research domain of cerebrotendinous xanthomatosis (CTX), using a real dataset and ontologies. A framework to query relevant phenotype-genotype bidirectional relationships is provided. Phenotype descriptions and patient data were harmonized by defining 28 Horn-like rules in terms of the OWL concepts. In total, 24 patterns of SQWRL queries were designed following the initial list of competency questions. As the approach is based on OWL, the semantics of the framework adopt the standard logical model of the open-world assumption.
This work demonstrates how semantic web technologies can be used to support the flexible representation and computational inference mechanisms required to query patient datasets at different levels of abstraction. The open-world assumption is especially well suited to describing only partially known phenotype-genotype relationships, in a way that is easily extensible. In the future, this type of approach could offer researchers a valuable resource to infer new data from patient data for statistical analysis in translational research. In conclusion, phenotype description formalization and mapping to clinical data are two key elements for interchanging knowledge between basic and clinical research.

  17. Vocational Didactics: Core Assumptions and Approaches from Denmark, Germany, Norway, Spain and Sweden

    ERIC Educational Resources Information Center

    Gessler, Michael; Moreno Herrera, Lázaro

    2015-01-01

    The design of vocational didactics has to meet special requirements. Six core assumptions are identified: outcome orientation, cultural-historical embedding, horizontal structure, vertical structure, temporal structure, and the changing nature of work. Different approaches and discussions from school-based systems (Spain and Sweden) and dual…

  18. Stirling Engine External Heat System Design with Heat Pipe Heater.

    DTIC Science & Technology

    1986-07-01

    Figure 10. However, the evaporator analysis is greatly simplified by making the conservative assumption of constant heat flux. This assumption results in...number Cold Start Data: ROM, density of the metal, g/cm³; CAPM, specific heat of the metal, cal/(g·K); ETHG, effective gauze thickness: the

  19. Post Stereotypes: Deconstructing Racial Assumptions and Biases through Visual Culture and Confrontational Pedagogy

    ERIC Educational Resources Information Center

    Jung, Yuha

    2015-01-01

    The Post Stereotypes project embodies confrontational pedagogy and involves postcard artmaking designed to both solicit expression of and deconstruct students' racial, ethnic, and cultural stereotypes and assumptions. As part of the Cultural Diversity in American Art course, students created postcard art that visually represented their personal…

  20. Lunar Navigation Architecture Design Considerations

    NASA Technical Reports Server (NTRS)

    D'Souza, Christopher; Getchius, Joel; Holt, Greg; Moreau, Michael

    2009-01-01

    The NASA Constellation Program is aiming to establish a long-term presence on the lunar surface. The Constellation elements (Orion, Altair, Earth Departure Stage, and Ares launch vehicles) will require a lunar navigation architecture for navigation state updates during lunar-class missions. Orion in particular has baselined earth-based ground direct tracking as the primary source for much of its absolute navigation needs. However, due to the uncertainty in the lunar navigation architecture, the Orion program has had to make certain assumptions on the capabilities of such architectures in order to adequately scale the vehicle design trade space. The following paper outlines lunar navigation requirements, the Orion program assumptions, and the impacts of these assumptions on the lunar navigation architecture design. The selection of potential sites was based upon geometric baselines, logistical feasibility, redundancy, and abort support capability. Simulated navigation covariances mapped to entry-interface flight-path-angle uncertainties were used to evaluate knowledge errors. A minimum ground station architecture was identified consisting of Goldstone, Madrid, Canberra, Santiago, Hartebeesthoek, Dongara, Hawaii, Guam, and Ascension Island (or the geometric equivalent).

  1. Economic evaluation in chronic pain: a systematic review and de novo flexible economic model.

    PubMed

    Sullivan, W; Hirst, M; Beard, S; Gladwell, D; Fagnani, F; López Bastida, J; Phillips, C; Dunlop, W C N

    2016-07-01

    There is unmet need in patients suffering from chronic pain, yet innovation may be impeded by the difficulty of justifying economic value in a field beset by data limitations and methodological variability. A systematic review was conducted to identify and summarise the key areas of variability and limitations in modelling approaches in the economic evaluation of treatments for chronic pain. The results of the literature review were then used to support the development of a fully flexible open-source economic model structure, designed to test structural and data assumptions and act as a reference for future modelling practice. The key model design themes identified from the systematic review included: time horizon; titration and stabilisation; number of treatment lines; choice/ordering of treatment; and the impact of parameter uncertainty (given reliance on expert opinion). Exploratory analyses using the model to compare a hypothetical novel therapy versus morphine as first-line treatments showed cost-effectiveness results to be sensitive to structural and data assumptions. Assumptions about the treatment pathway and choice of time horizon were key model drivers. Our results suggest structural model design and data assumptions may have driven previous cost-effectiveness results and ultimately decisions based on economic value. We therefore conclude that it is vital that future economic models in chronic pain be designed to be fully transparent, and we hope our open-source code proves useful in moving toward a common approach to modelling pain that includes robust sensitivity analyses to test structural and parameter uncertainty.

  2. Examining the Stationarity Assumption for Statistically Downscaled Climate Projections of Precipitation

    NASA Astrophysics Data System (ADS)

    Wootten, A.; Dixon, K. W.; Lanzante, J. R.; Mcpherson, R. A.

    2017-12-01

    Empirical statistical downscaling (ESD) approaches attempt to refine global climate model (GCM) information via statistical relationships between observations and GCM simulations. The aim of such downscaling efforts is to create added-value climate projections by adding finer spatial detail and reducing biases. The results of statistical downscaling exercises are often used in impact assessments under the assumption that past performance provides an indicator of future results. Given prior research describing the danger of this assumption with regards to temperature, this study expands the perfect model experimental design from previous case studies to test the stationarity assumption with respect to precipitation. Assuming stationarity implies that the performance of ESD methods is similar between the future projections and the historical training. Case study results from four quantile-mapping based ESD methods demonstrate violations of the stationarity assumption for both central tendency and extremes of precipitation. These violations vary geographically and seasonally. For the four ESD methods tested, the greatest challenges for downscaling of daily total precipitation projections occur in regions with limited precipitation and for extremes of precipitation along Southeast coastal regions. We conclude with a discussion of future expansion of the perfect model experimental design and the implications for improving ESD methods and providing guidance on the use of ESD techniques for impact assessments and decision-support.
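
Empirical quantile mapping, one of the ESD method families the study tests, can be sketched in a few lines: map each model value through the model's historical distribution, then read off the observation at the same percentile. The data below are synthetic with a deliberate dry bias, purely for illustration.

```python
import numpy as np

# Minimal empirical quantile-mapping sketch (not the study's exact methods).
def quantile_map(model_future, model_hist, obs_hist):
    """Map each future model value to its percentile in the historical model
    distribution, then to the observed value at that same percentile."""
    ranks = np.searchsorted(np.sort(model_hist), model_future) / len(model_hist)
    ranks = np.clip(ranks, 0.0, 1.0)
    return np.quantile(obs_hist, ranks)

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 3.0, 5000)   # synthetic "observed" precipitation
model = obs * 0.5                  # synthetic model output with a dry bias
corrected = quantile_map(model, model, obs)
```

Applied to the training period itself, the mapping recovers the observed distribution; the stationarity assumption the study tests is precisely that this same mapping remains valid for future model output.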

  3. Dynamic Self-Consistent Field Theories for Polymer Blends and Block Copolymers

    NASA Astrophysics Data System (ADS)

    Kawakatsu, Toshihiro

    Understanding the behavior of the phase-separated domain structures and rheological properties of multi-component polymeric systems requires detailed information on the dynamics of domains and on the conformations of the constituent polymer chains. Self-consistent field (SCF) theory is a useful tool to treat such a problem because the conformation entropy of polymer chains in inhomogeneous systems can be evaluated quantitatively using this theory. However, when we turn our attention to the dynamic properties in a non-equilibrium state, the basic assumption of the SCF theory, i.e. the assumption of equilibrium chain conformation, breaks down. In order to avoid such a difficulty, dynamic SCF theories were developed. In this chapter, we give a brief review of the recent developments of dynamic SCF theories, and discuss where the cutting edge of this theory lies.

  4. Values and assumptions in the development of DSM-III and DSM-III-R: an insider's perspective and a belated response to Sadler, Hulgus, and Agich's "On values in recent American psychiatric classification".

    PubMed

    Spitzer, R L

    2001-06-01

    It is widely acknowledged that the approach taken in the development of a classification of mental disorders is guided by various values and assumptions. The author, who played a central role in the development of DSM-III (American Psychiatric Association [1980] Diagnostic and statistical manual of mental disorders, 3rd ed. Washington, DC: Author) and DSM-III-R (American Psychiatric Association [1987] Diagnostic and statistical manual of mental disorders, 3rd ed, rev. Washington, DC: Author), will explicate the basic values and assumptions that guided the development of these two diagnostic manuals. In so doing, the author will respond to the critique of DSM-III and DSM-III-R made by Sadler et al. in their 1994 paper (Sadler JZ, Hulgus YF, Agich GJ [1994] On values in recent American psychiatric classification. J Med Phil 19:261-277). The author will attempt to demonstrate that the stated goals of DSM-III and DSM-III-R are not inherently in conflict and are easily explicated by appealing to widely held values and assumptions, most of which appeared in the literature during the development of the manuals. Furthermore, the author will demonstrate that it is not true that DSM-III places greater emphasis on reliability over validity and is covertly committed to a biological approach to explaining psychiatric disturbance.

  5. How Mean is the Mean?

    PubMed Central

    Speelman, Craig P.; McGann, Marek

    2013-01-01

    In this paper we voice concerns about the uncritical manner in which the mean is often used as a summary statistic in psychological research. We identify a number of implicit assumptions underlying the use of the mean and argue that the fragility of these assumptions should be more carefully considered. We examine some of the ways in which the potential violation of these assumptions can lead us into significant theoretical and methodological error. Illustrations of alternative models of research already extant within Psychology are used to explore less mean-dependent methods of research and to suggest that a critical assessment of the assumptions underlying the mean's use should play a more explicit role in the process of study design and review. PMID:23888147
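
The fragility the authors discuss is easy to demonstrate: with a skewed sample, a single extreme observation pulls the mean far from the typical value while the median is unaffected. The data below are hypothetical reaction times, invented for illustration.

```python
import numpy as np

# Hypothetical reaction times in seconds; one slow outlier trial.
rt = np.array([0.4, 0.5, 0.5, 0.6, 0.7, 3.5])

print(np.mean(rt))    # ≈ 1.03, dominated by the single slow trial
print(np.median(rt))  # → 0.55, close to the typical response
```

Here the mean sits above every observation but one, illustrating why a summary statistic's implicit distributional assumptions deserve explicit scrutiny at the design stage.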

  6. Statistical power for detecting trends with applications to seabird monitoring

    USGS Publications Warehouse

    Hatch, Shyla A.

    2003-01-01

    Power analysis is helpful in defining goals for ecological monitoring and evaluating the performance of ongoing efforts. I examined detection standards proposed for population monitoring of seabirds using two programs (MONITOR and TRENDS) specially designed for power analysis of trend data. Neither program models within- and among-years components of variance explicitly and independently, thus an error term that incorporates both components is an essential input. Residual variation in seabird counts consisted of day-to-day variation within years and unexplained variation among years in approximately equal parts. The appropriate measure of error for power analysis is the standard error of estimation (S.E.est) from a regression of annual means against year. Replicate counts within years are helpful in minimizing S.E.est but should not be treated as independent samples for estimating power to detect trends. Other issues include a choice of assumptions about variance structure and selection of an exponential or linear model of population change. Seabird count data are characterized by strong correlations between S.D. and mean, thus a constant CV model is appropriate for power calculations. Time series were fit about equally well with exponential or linear models, but log transformation ensures equal variances over time, a basic assumption of regression analysis. Using sample data from seabird monitoring in Alaska, I computed the number of years required (with annual censusing) to detect trends of -1.4% per year (50% decline in 50 years) and -2.7% per year (50% decline in 25 years). At α = 0.05 and a desired power of 0.9, estimated study intervals ranged from 11 to 69 years depending on species, trend, software, and study design. Power to detect a negative trend of 6.7% per year (50% decline in 10 years) is suggested as an alternative standard for seabird monitoring that achieves a reasonable match between statistical and biological significance.
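
The power calculation described above can be approximated by simulation: generate annual censuses of an exponentially declining population with lognormal count error (so the CV is constant, as the abstract recommends), regress log counts on year, and count how often the trend is detected. The parameters below are illustrative, not the paper's Alaska values, and the fixed t threshold is an approximation to a two-sided test at alpha = 0.05.

```python
import numpy as np

# Simulation sketch of trend-detection power (illustrative parameters only).
def power_to_detect(trend=-0.027, years=25, cv=0.2, n_sims=1000, seed=1):
    rng = np.random.default_rng(seed)
    t = np.arange(years)
    true_log = np.log(1000.0) + np.log(1.0 + trend) * t  # exponential decline
    sigma = np.sqrt(np.log(1.0 + cv**2))  # lognormal sd giving a constant CV
    detections = 0
    for _ in range(n_sims):
        y = true_log + rng.normal(0.0, sigma, years)  # log-transformed counts
        coef = np.polyfit(t, y, 1)
        resid = y - np.polyval(coef, t)
        se = np.sqrt(resid.var(ddof=2) / ((t - t.mean())**2).sum())
        if abs(coef[0] / se) > 2.06:  # approx. two-sided t threshold, alpha = 0.05
            detections += 1
    return detections / n_sims
```

Sweeping `years` until the returned power exceeds 0.9 reproduces the kind of "years required" estimate the abstract reports, under these simplified assumptions.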

  7. Approximations of Two-Attribute Utility Functions

    DTIC Science & Technology

    1976-09-01

    preferred to") be a binary relation on the set of simple probability measures or 'gambles' defined on a set T of consequences. Throughout this study it...simplifying independence assumptions. Although there are several approaches to this problem, the present study will focus on approximations of u... study will elicit additional interest in the topic. 2. REMARKS ON APPROXIMATION THEORY This section outlines a few basic ideas of approximation theory

  8. WalkThrough Example Procedures for MAMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruggiero, Christy E.; Gaschen, Brian Keith; Bloch, Jeffrey Joseph

    This documentation is a growing set of walk-through examples of analyses using the MAMA V2.0 software. It does not cover all the features or possibilities of the MAMA software, but will address using many of the basic analysis tools to quantify particle size and shape in an image. This document will continue to evolve as additional procedures and examples are added. The starting assumption is that the MAMA software has been successfully installed.

  9. Deepening and Extending Channels for Navigation, Georgetown Harbor, South Carolina. Review of Reports.

    DTIC Science & Technology

    1978-01-01

    South Carolina ... JANUARY 1978 ... FEASIBILITY REPORT, REVIEW OF REPORTS ... ADOPTED JANUARY 28, 1958 ... ENVIRONMENTAL ASSESSMENT ... review. As a result of this review, it was judged that some of the basic assumptions presented in the draft report were no longer applicable and that

  10. Predictability of currency market exchange

    NASA Astrophysics Data System (ADS)

    Ohira, Toru; Sazuka, Naoya; Marumo, Kouhei; Shimizu, Tokiko; Takayasu, Misako; Takayasu, Hideki

    2002-05-01

    We analyze tick data of yen-dollar exchange with a focus on its up and down movement. We show that there exists a rather particular conditional probability structure with such high frequency data. This result provides us with evidence to question one of the basic assumptions of the traditional market theory, where such bias in high frequency price movements is regarded as not present. We also construct systematically a random walk model reflecting this probability structure.
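
The kind of conditional-probability check the abstract describes can be illustrated with a few lines of code: estimate P(up | previous move was up) from a sequence of up/down ticks. The data below are synthetic with a built-in reversal bias, purely illustrative; under the traditional i.i.d. market assumption this probability would be 0.5.

```python
import numpy as np

# Synthetic tick directions with a 70% chance of reversing the previous move.
rng = np.random.default_rng(42)
moves = [1]
for _ in range(9999):
    moves.append(-moves[-1] if rng.random() < 0.7 else moves[-1])
moves = np.array(moves)

# Estimate P(up | previous move was up); an unbiased walk would give ~0.5.
prev_up = moves[:-1] == 1
p_up_given_up = (moves[1:][prev_up] == 1).mean()
```

A conditional probability well away from 0.5, as found in the yen-dollar tick data, is exactly the kind of structure that contradicts the no-bias assumption of traditional market theory.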

  11. Conditioned Limit Theorems for Some Null Recurrent Markov Processes

    DTIC Science & Technology

    1976-08-01

    Chapter 1 INTRODUCTION 1.1 Summary of Results. Let (V_k, k ≥ 0) be a discrete-time Markov process with state space E ⊂ (-∞, ∞) and let S be...explain our results in some detail. We begin by stating our three basic assumptions: (1) (V_k, k ≥ 0) is a Markov process with state space E ⊂ (-∞, ∞); (ii)...3. CONDITIONING ON {T > n}...3.1 Preliminary Results

  12. Factors Affecting Post-Service Wage Growth for Veterans

    DTIC Science & Technology

    1991-12-01

    Labor economics is primarily concerned with how employers and employees respond to changes in wages, prices, profits, and the non-pecuniary aspects...of the employment relationship [Ref. 4, pg. 3]. Two of the basic assumptions underlying labor economics are that resources are scarce, and that people...Retirees' Post-Service Earnings and Employment, February 1981, Rand Corporation. 4. Ehrenberg, R. G. and Smith, R. S., Modern Labor Economics, 3rd Edition

  13. The Assumption of Adequacy: Operation Safe Haven, A Chaplain’s View.

    DTIC Science & Technology

    1999-06-04

    poverty, their ignorance regarding everything from literacy to the most basic hygiene was overwhelming. One chaplain assistant from Fort Carson...perspective, the Panamanians, ninety percent of whom lived in absolute poverty, were less than enamored with this state of affairs. The Canal Zone...was soon discovered that the entire adult population on the island of Cuba is addicted to nicotine), and a brand new pair of running shoes. While going

  14. An Assessment of Propensity Score Matching as a Nonexperimental Impact Estimator: Evidence from Mexico's PROGRESA Program

    ERIC Educational Resources Information Center

    Diaz, Juan Jose; Handa, Sudhanshu

    2006-01-01

    Not all policy questions can be addressed by social experiments. Nonexperimental evaluation methods provide an alternative to experimental designs but their results depend on untestable assumptions. This paper presents evidence on the reliability of propensity score matching (PSM), which estimates treatment effects under the assumption of…

  15. Is a "Complex" Task Really Complex? Validating the Assumption of Cognitive Task Complexity

    ERIC Educational Resources Information Center

    Sasayama, Shoko

    2016-01-01

    In research on task-based learning and teaching, it has traditionally been assumed that differing degrees of cognitive task complexity can be inferred through task design and/or observations of differing qualities in linguistic production elicited by second language (L2) communication tasks. Without validating this assumption, however, it is…

  16. Estimation of the Prevalence of Autism Spectrum Disorder in South Korea, Revisited

    ERIC Educational Resources Information Center

    Pantelis, Peter C.; Kennedy, Daniel P.

    2016-01-01

    Two-phase designs in epidemiological studies of autism prevalence introduce methodological complications that can severely limit the precision of resulting estimates. If the assumptions used to derive the prevalence estimate are invalid or if the uncertainty surrounding these assumptions is not properly accounted for in the statistical inference…

  17. Is Having More Prerequisite Knowledge Better for Learning from Productive Failure?

    ERIC Educational Resources Information Center

    Toh, Pee Li Leslie; Kapur, Manu

    2017-01-01

    A critical assumption made in Kapur's ("Instr Sci" 40:651-672, 2012) productive failure design is that students have the necessary prerequisite knowledge resources to generate and explore solutions to problems before learning the targeted concept. Through two quasi-experimental studies, we interrogated this assumption in the context of…

  18. Designing occupancy studies when false-positive detections occur

    USGS Publications Warehouse

    Clement, Matthew

    2016-01-01

    1. Recently, estimators have been developed to estimate occupancy probabilities when false-positive detections occur during presence-absence surveys. Some of these estimators combine different types of survey data to improve estimates of occupancy. With these estimators, there is a tradeoff between the number of sample units surveyed, and the number and type of surveys at each sample unit. Guidance on efficient design of studies when false positives occur is unavailable. 2. For a range of scenarios, I identified survey designs that minimized the mean square error of the estimate of occupancy. I considered an approach that uses one survey method and two observation states and an approach that uses two survey methods. For each approach, I used numerical methods to identify optimal survey designs when model assumptions were met and parameter values were correctly anticipated, when parameter values were not correctly anticipated, and when the assumption of no unmodelled detection heterogeneity was violated. 3. Under the approach with two observation states, false-positive detections increased the number of recommended surveys, relative to standard occupancy models. If parameter values could not be anticipated, pessimism about detection probabilities avoided poor designs. Detection heterogeneity could require more or fewer repeat surveys, depending on parameter values. If model assumptions were met, the approach with two survey methods was inefficient. However, with poor anticipation of parameter values, with detection heterogeneity, or with removal sampling schemes, combining two survey methods could improve estimates of occupancy. 4. Ignoring false positives can yield biased parameter estimates, yet false positives greatly complicate the design of occupancy studies. Specific guidance for major types of false-positive occupancy models, and for two assumption violations common in field data, can conserve survey resources. This guidance can be used to design efficient monitoring programs and studies of species occurrence, species distribution, or habitat selection when false positives occur during surveys.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Lijuan; Gonder, Jeff; Burton, Evan

    This study evaluates the costs and benefits associated with the use of a plug-in hybrid electric bus and determines its cost effectiveness relative to a conventional bus and a hybrid electric bus. A sensitivity sweep analysis was performed over a number of different battery sizes, charging powers, and charging stations. The net present value was calculated for each vehicle design and provided the basis for the design evaluation. In all cases, given present-day economic assumptions, the conventional bus achieved the lowest net present value, while the optimal plug-in hybrid electric bus scenario reached lower lifetime costs than the hybrid electric bus. The study also performed parameter sensitivity analysis under low market potential assumptions and high market potential assumptions. The net present value of the plug-in hybrid electric bus is close to that of the conventional bus.
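
The net-present-value comparison the study relies on reduces to discounting each design's annual costs to the present and adding the upfront cost. The cost figures and discount rate below are invented placeholders, not study data; they are merely chosen so the ordering matches the qualitative finding above.

```python
# Minimal NPV-of-lifetime-costs sketch (hypothetical figures, not study data).
def npv(upfront_cost, annual_cost, years=12, rate=0.03):
    """Lifetime cost in present-value terms; costs are positive numbers."""
    return upfront_cost + sum(annual_cost / (1 + rate)**t
                              for t in range(1, years + 1))

conventional = npv(450_000, 70_000)  # cheap vehicle, higher fuel cost
phev         = npv(750_000, 50_000)  # plug-in hybrid: pricier, lower energy cost
hybrid       = npv(800_000, 55_000)
```

With these placeholder numbers the conventional bus has the lowest lifetime cost and the plug-in hybrid undercuts the hybrid, mirroring the ordering reported in the abstract; a sensitivity sweep would vary the battery size, charging power, and rate inputs.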

  20. Exploring Life Support Architectures for Evolution of Deep Space Human Exploration

    NASA Technical Reports Server (NTRS)

    Anderson, Molly S.; Stambaugh, Imelda C.

    2015-01-01

    Life support system architectures for long duration space missions are often explored analytically in the human spaceflight community to find optimum solutions for mass, performance, and reliability. But in reality, many other constraints can guide the design when the life support system is examined within the context of an overall vehicle, as well as specific programmatic goals and needs. Between the end of the Constellation program and the development of the "Evolvable Mars Campaign", NASA explored a broad range of mission possibilities. Most of these missions will never be implemented but the lessons learned during these concept development phases may color and guide future analytical studies and eventual life support system architectures. This paper discusses several iterations of design studies from the life support system perspective to examine which requirements and assumptions, programmatic needs, or interfaces drive design. When doing early concept studies, many assumptions have to be made about technology and operations. Data can be pulled from a variety of sources depending on the study needs, including parametric models, historical data, new technologies, and even predictive analysis. In the end, assumptions must be made in the face of uncertainty. Some of these may introduce more risk as to whether the solution for the conceptual design study will still work when designs mature and data becomes available.

  1. Supplementation of an Artificial Medium for the Parasitoid Exorista larvarum (Diptera: Tachnidae) With Hemolymph of Hermetia illucens (Diptera: Stratiomyidae) or Antheraea pernyi (Lepidoptera: Saturniidae).

    PubMed

    Dindo, Maria Luisa; Vandicke, Jonas; Marchetti, Elisa; Spranghers, Thomas; Bonte, Jochem; De Clercq, Patrick

    2016-04-01

    The effect of supplementing hemolymph of the black soldier fly, Hermetia illucens (L.), or the Chinese oak silkworm, Antheraea pernyi (Guérin-Méneville), to a basic insect-free artificial medium for the tachinid Exorista larvarum (L.) was investigated. The supplementation (20% w/w) was based on the assumption that insect additives may optimize the media for this parasitoid. Egg hatch, pupal and adult yields, and sex ratio did not differ among the enriched and basic media. Preimaginal development was faster on both hemolymph-enriched media than on the basic medium. Although development was shorter on the medium supplemented with H. illucens hemolymph than on the basic medium, puparium weights on the two media were comparable. The female flies reared on the medium enriched with H. illucens hemolymph did not lay more eggs than control females, but their eggs yielded significantly more puparia. Conversely, the medium enriched with A. pernyi hemolymph yielded lower female puparium weights than the basic medium and produced only one ovipositing female out of the five obtained female adults. These results indicate that the in vitro development of E. larvarum improved when the basic artificial medium was enriched with H. illucens hemolymph, whereas the supplementation with A. pernyi hemolymph negatively affected the quality of the in vitro-reared females.

  2. A robust two-way semi-linear model for normalization of cDNA microarray data

    PubMed Central

    Wang, Deli; Huang, Jian; Xie, Hehuang; Manzella, Liliana; Soares, Marcelo Bento

    2005-01-01

    Background: Normalization is a basic step in microarray data analysis. A proper normalization procedure ensures that the intensity ratios provide meaningful measures of relative expression values. Methods: We propose a robust semiparametric method in a two-way semi-linear model (TW-SLM) for normalization of cDNA microarray data. This method does not make the usual assumptions underlying some of the existing methods. For example, it does not assume that: (i) the percentage of differentially expressed genes is small; or (ii) the numbers of up- and down-regulated genes are about the same, as required in the LOWESS normalization method. We conduct simulation studies to evaluate the proposed method and use a real data set from a specially designed microarray experiment to compare the performance of the proposed method with that of the LOWESS normalization approach. Results: The simulation results show that the proposed method performs better than the LOWESS normalization method in terms of mean square errors for estimated gene effects. The results of analysis of the real data set also show that the proposed method yields more consistent results between the direct and the indirect comparisons and also can detect more differentially expressed genes than the LOWESS method. Conclusions: Our simulation studies and the real data example indicate that the proposed robust TW-SLM method works at least as well as the LOWESS method and works better when the underlying assumptions for the LOWESS method are not satisfied. Therefore, it is a powerful alternative to the existing normalization methods. PMID:15663789

  3. Analysis of molecular variance inferred from metric distances among DNA haplotypes: application to human mitochondrial DNA restriction data.

    PubMed

    Excoffier, L; Smouse, P E; Quattro, J M

    1992-06-01

    We present here a framework for the study of molecular variation within a single species. Information on DNA haplotype divergence is incorporated into an analysis of variance format, derived from a matrix of squared-distances among all pairs of haplotypes. This analysis of molecular variance (AMOVA) produces estimates of variance components and F-statistic analogs, designated here as phi-statistics, reflecting the correlation of haplotypic diversity at different levels of hierarchical subdivision. The method is flexible enough to accommodate several alternative input matrices, corresponding to different types of molecular data, as well as different types of evolutionary assumptions, without modifying the basic structure of the analysis. The significance of the variance components and phi-statistics is tested using a permutational approach, eliminating the normality assumption that is conventional for analysis of variance but inappropriate for molecular data. Application of AMOVA to human mitochondrial DNA haplotype data shows that population subdivisions are better resolved when some measure of molecular differences among haplotypes is introduced into the analysis. At the intraspecific level, however, the additional information provided by knowing the exact phylogenetic relations among haplotypes or by a nonlinear translation of restriction-site change into nucleotide diversity does not significantly modify the inferred population genetic structure. Monte Carlo studies show that site sampling does not fundamentally affect the significance of the molecular variance components. The AMOVA treatment is easily extended in several different directions and it constitutes a coherent and flexible framework for the statistical analysis of molecular data.

  4. Phylogenetic Analysis Supports the Aerobic-Capacity Model for the Evolution of Endothermy.

    PubMed

    Nespolo, Roberto F; Solano-Iguaran, Jaiber J; Bozinovic, Francisco

    2017-01-01

    The evolution of endothermy is a controversial topic in evolutionary biology, although several hypotheses have been proposed to explain it. To a great extent, the debate has centered on the aerobic-capacity model (AC model), an adaptive hypothesis involving maximum and resting rates of metabolism (MMR and RMR, respectively; hereafter "metabolic traits"). The AC model posits that MMR, a proxy of aerobic capacity and sustained activity, is the target of directional selection and that RMR is also influenced as a correlated response. Associated with this reasoning are the assumptions that (1) factorial aerobic scope (FAS; MMR/RMR) and net aerobic scope (NAS; MMR - RMR), two commonly used indexes of aerobic capacity, show different evolutionary optima and (2) the functional link between MMR and RMR is a basic design feature of vertebrates. To test these assumptions, we performed a comparative phylogenetic analysis in 176 vertebrate species, ranging from fish and amphibians to birds and mammals. Using disparity-through-time analysis, we also explored trait diversification and fitted different evolutionary models to study the evolution of metabolic traits. As predicted, we found (1) a positive phylogenetic correlation between RMR and MMR, (2) diversification of metabolic traits exceeding that of random-walk expectations, (3) that a model assuming selection fits the data better than alternative models, and (4) that a single evolutionary optimum best fits FAS data, whereas a model involving two optima (one for ectotherms and another for endotherms) is the best explanatory model for NAS. These results support the AC model and give novel information concerning the mode and tempo of physiological evolution of vertebrates.

  5. Calibration-free assays on standard real-time PCR devices

    PubMed Central

    Debski, Pawel R.; Gewartowski, Kamil; Bajer, Seweryn; Garstecki, Piotr

    2017-01-01

    Quantitative Polymerase Chain Reaction (qPCR) is one of the central techniques in molecular biology and an important tool in medical diagnostics. Although it is the gold standard, qPCR techniques depend on reference measurements and are susceptible to large errors caused by even small changes in reaction efficiency or conditions, errors that are typically not flagged by decreased precision. Digital PCR (dPCR) technologies should alleviate the need for calibration by providing absolute quantitation using binary (yes/no) signals from partitions, provided that the basic assumption that a single target molecule is amplified into a positive signal is met. Still, access to digital techniques is limited because they require new instruments. We show an analog-digital method that can be executed on standard (real-time) qPCR devices. It benefits from real-time readout, providing calibration-free assessment. The method combines the advantages of qPCR and dPCR and bypasses their drawbacks. The protocols provide for small, simplified partitioning that can be fitted within the standard well-plate format. We demonstrate that, with the use of synergistic assay design, standard qPCR devices are capable of absolute quantitation where normal qPCR protocols fail to provide accurate estimates. We list practical recipes for how to design assays for required parameters and how to analyze signals to estimate concentration. PMID:28327545
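    The binary yes/no signal of digital PCR becomes an absolute concentration through Poisson statistics: if a fraction p of partitions is positive, the mean number of targets per partition is lambda = -ln(1 - p), with no calibration curve. A minimal sketch of that standard correction (the partition counts and volume are illustrative assumptions):

```python
import math

def targets_per_partition(n_positive, n_total):
    """Poisson-corrected mean copies per partition from the fraction of positive
    partitions (the basic dPCR assumption: any partition holding one or more
    target molecules yields a positive signal)."""
    if n_positive >= n_total:
        raise ValueError("all partitions positive: concentration too high to resolve")
    p = n_positive / n_total
    return -math.log(1.0 - p)

def concentration(n_positive, n_total, partition_volume_ul):
    """Absolute concentration in copies per microlitre."""
    return targets_per_partition(n_positive, n_total) / partition_volume_ul

lam = targets_per_partition(50, 100)   # half the partitions positive
conc = concentration(50, 100, 0.001)   # hypothetical 1 nL partitions
```

    With half the partitions positive, lambda is ln 2 (about 0.69 copies per partition), illustrating why a simple positive-partition count without the Poisson correction would underestimate the concentration.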

  6. Researchers' choice of the number and range of levels in experiments affects the resultant variance-accounted-for effect size.

    PubMed

    Okada, Kensuke; Hoshino, Takahiro

    2017-04-01

    In psychology, the reporting of variance-accounted-for effect size indices has been recommended and widely accepted through the movement away from null hypothesis significance testing. However, most researchers have paid insufficient attention to the fact that effect sizes depend on the choice of the number of levels and their ranges in experiments. Moreover, the functional form of how and how much this choice affects the resultant effect size has not thus far been studied. We show that the relationship between the population effect size and number and range of levels is given as an explicit function under reasonable assumptions. Counterintuitively, it is found that researchers may affect the resultant effect size to be either double or half simply by suitably choosing the number of levels and their ranges. Through a simulation study, we confirm that this relation also applies to sample effect size indices in much the same way. Therefore, the variance-accounted-for effect size would be substantially affected by the basic research design such as the number of levels. Simple cross-study comparisons and a meta-analysis of variance-accounted-for effect sizes would generally be irrational unless differences in research designs are explicitly considered.
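    The dependence can be seen directly in a population model: with a linear true effect, eta-squared equals Var(level means) / (Var(level means) + error variance), and the variance of the level means is fixed by the experimenter's choice of levels. A minimal sketch (the slope and error variance are illustrative assumptions):

```python
def population_eta_squared(levels, slope=1.0, error_var=1.0):
    """Variance-accounted-for effect size for a linear true effect
    y = slope * x + noise, with equal n at each chosen level."""
    means = [slope * x for x in levels]
    grand = sum(means) / len(means)
    var_means = sum((m - grand) ** 2 for m in means) / len(means)
    return var_means / (var_means + error_var)

# the same phenomenon and the same error variance, three different designs:
two_extreme = population_eta_squared([-1.0, 1.0])                   # 2 levels at the range ends
five_spread = population_eta_squared([-1.0, -0.5, 0.0, 0.5, 1.0])   # 5 evenly spaced levels
narrow      = population_eta_squared([-0.5, 0.5])                   # 2 levels, half the range
```

    One underlying effect yields eta-squared of 0.50, 0.33 or 0.20 depending solely on the number and range of levels, which is precisely the article's warning about naive cross-study comparison of variance-accounted-for effect sizes.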

  7. Calibration-free assays on standard real-time PCR devices

    NASA Astrophysics Data System (ADS)

    Debski, Pawel R.; Gewartowski, Kamil; Bajer, Seweryn; Garstecki, Piotr

    2017-03-01

    Quantitative Polymerase Chain Reaction (qPCR) is one of central techniques in molecular biology and important tool in medical diagnostics. While being a golden standard qPCR techniques depend on reference measurements and are susceptible to large errors caused by even small changes of reaction efficiency or conditions that are typically not marked by decreased precision. Digital PCR (dPCR) technologies should alleviate the need for calibration by providing absolute quantitation using binary (yes/no) signals from partitions provided that the basic assumption of amplification a single target molecule into a positive signal is met. Still, the access to digital techniques is limited because they require new instruments. We show an analog-digital method that can be executed on standard (real-time) qPCR devices. It benefits from real-time readout, providing calibration-free assessment. The method combines advantages of qPCR and dPCR and bypasses their drawbacks. The protocols provide for small simplified partitioning that can be fitted within standard well plate format. We demonstrate that with the use of synergistic assay design standard qPCR devices are capable of absolute quantitation when normal qPCR protocols fail to provide accurate estimates. We list practical recipes how to design assays for required parameters, and how to analyze signals to estimate concentration.

  8. A theoretical basis for the analysis of redundant software subject to coincident errors

    NASA Technical Reports Server (NTRS)

    Eckhardt, D. E., Jr.; Lee, L. D.

    1985-01-01

    Fundamental to the development of redundant software techniques, i.e., fault-tolerant software, is an understanding of the impact of multiple joint occurrences of coincident errors. A theoretical basis for the study of redundant software is developed which provides a probabilistic framework for empirically evaluating the effectiveness of the general (N-Version) strategy when component versions are subject to coincident errors, and permits an analytical study of the effects of these errors. The basic assumptions of the model are: (1) independently designed software components are chosen in a random sample; and (2) in the user environment, the system is required to execute on a stationary input series. The intensity of coincident errors has a central role in the model. This function describes the propensity to introduce design faults in such a way that software components fail together when executing in the user environment. The model is used to give conditions under which an N-Version system is a better strategy for reducing system failure probability than relying on a single version of software. A condition which limits the effectiveness of a fault-tolerant strategy is studied, and it is asked whether system failure probability varies monotonically with increasing N or whether an optimal choice of N exists.
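    The role of the coincident-error intensity can be illustrated with a two-class input model. The numerical failure probabilities below are illustrative assumptions, not the paper's: under independence, majority voting over N versions drives failure probability down, but a small fraction of "hard" inputs on which versions tend to fail together can largely erase the benefit.

```python
from math import comb

def majority_fails(n, p):
    """P(a majority of n independent versions fail) when each fails with probability p."""
    k_min = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_min, n + 1))

def system_failure(n, input_classes):
    """Failure probability of an n-version majority-voting system when the
    per-version failure probability varies over input classes (a crude stand-in
    for the coincident-error intensity): [(class probability, failure prob), ...]"""
    return sum(w * majority_fails(n, p) for w, p in input_classes)

# homogeneous inputs: every input equally hard, average failure probability 0.059
independent = [(1.0, 0.059)]
# same average, but 10% of inputs are 'hard' and make versions fail together
coincident = [(0.9, 0.01), (0.1, 0.5)]

p1_ind, p3_ind = system_failure(1, independent), system_failure(3, independent)
p1_co,  p3_co  = system_failure(1, coincident),  system_failure(3, coincident)
```

    With independent failures, three versions fail about six times less often than one (roughly 0.010 vs 0.059); with the same average failure rate concentrated on coincident-error inputs, three versions barely improve on one (roughly 0.050 vs 0.059), which is the kind of limiting condition the paper studies.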

  9. Genetics in Diabetic Retinopathy: Current Concepts and New Insights

    PubMed Central

    Simó-Servat, Olga; Hernández, Cristina; Simó, Rafael

    2013-01-01

    There is emerging evidence indicating the essential role of genetic factors in the development of diabetic retinopathy (DR): genetic factors account for 25-50% of the risk of developing DR. Therefore, the use of genetic analysis to identify those diabetic patients most prone to developing DR might be useful in designing a more individualized treatment. There are three main research strategies: candidate gene studies, linkage studies and Genome-Wide Association Studies (GWAS). In the candidate gene approach, several genes encoding proteins closely related to DR development have been analyzed. Linkage studies analyze shared alleles among family members with DR, under the assumption that these predispose to a more aggressive development of DR. Finally, GWAS are a newer tool involving a massive evaluation of single nucleotide polymorphisms (SNPs) in large samples. In this review the available information from these three methodologies is critically analyzed. A genetic approach to identifying new candidates in the pathogenesis of DR would permit the design of more targeted therapeutic strategies to decrease this devastating complication of diabetes. Basic researchers, ophthalmologists, diabetologists and geneticists should work together to gain new insights into this issue. PMID:24403848

  10. Carrier rockets

    NASA Astrophysics Data System (ADS)

    Aleksandrov, V. A.; Vladimirov, V. V.; Dmitriev, R. D.; Osipov, S. O.

    This book takes into consideration domestic and foreign developments related to launch vehicles. General information concerning launch vehicle systems is presented, taking into account details of rocket structure, basic design considerations, and a number of specific Soviet and American launch vehicles. The basic theory of reaction propulsion is discussed, giving attention to physical foundations, the various types of forces acting on a rocket in flight, basic parameters characterizing rocket motion, the effectiveness of various approaches to obtain the desired velocity, and rocket propellants. Basic questions concerning the classification of launch vehicles are considered along with construction and design considerations, aspects of vehicle control, reliability, construction technology, and details of structural design. Attention is also given to details of rocket motor design, the basic systems of the carrier rocket, and questions of carrier rocket development.

  11. Philosophical Assumptions of Research on Gender Difference or: Two-By-Two and We'll Never Break Through.

    ERIC Educational Resources Information Center

    Johnson, Bonnie McD.; Leck, Glorianne M.

    The philosophical proposition axiomatic in all gender difference research is examined in this paper. Research on gender differences is that which attempts to describe categorical differences between males and females, based on a designated potential for sexual reproduction. The methodological problems raised by this assumption include the…

  12. On knowing the unconscious: lessons from the epistemology of geometry and space.

    PubMed

    Brakel, L A

    1994-02-01

    Concepts involving unconscious processes and contents are central to any understanding of psychoanalysis. Indeed, the dynamic unconscious is familiar as a necessary assumption of the psychoanalytic method. Using the manner of knowing the geometry of space, including non-ordinary sized space, this paper attempts to demonstrate by analogy the possibility of knowing (and knowing the nature of) unconscious mentation: that of which by definition we cannot be aware, and yet that which constitutes a basic assumption of psychoanalysis. As an assumption of the psychoanalytic method, no amount of data from within the psychoanalytic method can ever provide evidence for the existence of the unconscious, nor for knowing its nature; hence the need for this sort of illustration by analogy. Along the way, three claims are made: (1) Freudian 'secondary process' operating during everyday adult, normal, logical thought can be considered a modernised version of the Kantian categories. (2) Use of models facilitates a generation of outside-the-Kantian-categories possibilities, and also provides a conserving function, as outside-the-categories possibilities can be assimilated. (3) Transformations are different from translations; knowledge of transformations can provide non-trivial knowledge about various substrates, otherwise difficult to know.

  13. Metal Accretion onto White Dwarfs. II. A Better Approach Based on Time-Dependent Calculations in Static Models

    NASA Astrophysics Data System (ADS)

    Fontaine, G.; Dufour, P.; Chayer, P.; Dupuis, J.; Brassard, P.

    2015-06-01

    The accretion-diffusion picture is the model par excellence for describing the presence of planetary debris polluting the atmospheres of relatively cool white dwarfs. Inferences on the process based on diffusion timescale arguments make the implicit assumption that the concentration gradient of a given metal at the base of the convection zone is negligible. This assumption is, in fact, not rigorously valid, but it allows the decoupling of the surface abundance from the evolving distribution of a given metal in deeper layers. A better approach is a full time-dependent calculation of the evolution of the abundance profile of an accreting-diffusing element. We used the same approach as that developed by Dupuis et al. to model accretion episodes involving many more elements than those considered by these authors. Our calculations incorporate the improvements to diffusion physics mentioned in Paper I. The basic assumption in the Dupuis et al. approach is that the accreted metals are trace elements, i.e., that they have no effects on the background (DA or non-DA) stellar structure. This allows us to consider an arbitrary number of accreting elements.
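    Under the trace-element assumption, the simplest time-dependent version of this picture reduces to a linear balance for the mass of a metal in the convection zone: accretion adds mass at a rate mdot while diffusion drains it on a timescale tau, giving an exponential approach to a steady state. A minimal sketch of that balance only (all numerical values are illustrative assumptions, not the paper's models):

```python
import math

def surface_abundance(t, mdot, tau, m_cvz, t_off=None):
    """Mass fraction of an accreted metal in the convection zone, treating the
    metal as a trace element: dM_Z/dt = mdot - M_Z/tau while accretion lasts.
    t_off is the time at which the accretion episode ends (None = still accreting)."""
    steady = mdot * tau / m_cvz  # steady-state abundance
    if t_off is None or t <= t_off:
        return steady * (1.0 - math.exp(-t / tau))
    # after shut-off the abundance decays freely on the diffusion timescale
    x_off = steady * (1.0 - math.exp(-t_off / tau))
    return x_off * math.exp(-(t - t_off) / tau)

# hypothetical numbers: accretion 1e8 g/s, diffusion timescale 1e4 yr,
# convection-zone mass 1e27 g
YR = 3.156e7  # seconds per year
x_eq    = surface_abundance(1e6 * YR, 1e8, 1e4 * YR, 1e27)                  # long after onset
x_rise  = surface_abundance(1e3 * YR, 1e8, 1e4 * YR, 1e27)                  # early rise
x_decay = surface_abundance(3e4 * YR, 1e8, 1e4 * YR, 1e27, t_off=1e4 * YR)  # after shut-off
```

    The closed form makes the abstract's point concrete: a diffusion-timescale argument only reads off the steady state, while the time-dependent solution also covers the rising and decaying phases of an episode.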

  14. Area, length and thickness conservation: Dogma or reality?

    NASA Astrophysics Data System (ADS)

    Moretti, Isabelle; Callot, Jean Paul

    2012-08-01

    The basic assumption of quantitative structural geology is the preservation of material during deformation. However the hypothesis of volume conservation alone does not help to predict past or future geometries and so this assumption is usually translated into bed length in 2D (or area in 3D) and thickness conservation. When subsurface data are missing, geologists may extrapolate surface data to depth using the kink-band approach. These extrapolations, preserving both thicknesses and dips, lead to geometries which are restorable but often erroneous, due to both disharmonic deformation and internal deformation of layers. First, the Bolivian Sub-Andean Zone case is presented to highlight the evolution of the concepts on which balancing is based, and the important role played by a decoupling level in enhancing disharmony. Second, analogue models are analyzed to test the validity of the balancing techniques. Chamberlin's excess area approach is shown to be on average valid. However, neither the length nor the thicknesses are preserved. We propose that in real cases, the length preservation hypothesis during shortening could also be a wrong assumption. If the data are good enough to image the decollement level, the Chamberlin excess area method could be used to compute the bed length changes.
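    Chamberlin's excess-area construction follows from area conservation alone: the area uplifted above the regional level must equal the area swept in by shortening above the decollement, so shortening = excess area / depth to decollement, with no bed-length assumption. A minimal sketch (the numbers are illustrative assumptions):

```python
def shortening_from_excess_area(excess_area_km2, depth_to_decollement_km):
    """Chamberlin's excess-area method: area conservation above the decollement
    gives shortening s = A_excess / h."""
    return excess_area_km2 / depth_to_decollement_km

def bed_length_change(initial_length_km, shortening_km, final_length_km):
    """Difference between the length predicted by pure bed-length conservation
    and the measured final length; nonzero when layers deform internally."""
    return (initial_length_km - shortening_km) - final_length_km

s = shortening_from_excess_area(12.0, 4.0)   # 12 km^2 uplifted above a 4 km deep decollement
dl = bed_length_change(50.0, s, 46.5)        # measured final section length 46.5 km
```

    A nonzero residual like this is what the analogue models show: the excess-area balance can hold on average even though bed length is not preserved.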

  15. General solutions for the oxidation kinetics of polymers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, K.T.; Clough, R.L.; Wise, J.

    1996-08-01

    The simplest general kinetic schemes applicable to the oxidation of polymers are presented, discussed and analyzed in terms of the underlying kinetic assumptions. For the classic basic autoxidation scheme (BAS), which involves three bimolecular termination steps and is applicable mainly to unstabilized polymers, typical assumptions used singly or in groups include (1) long kinetic chain length, (2) a specific ratio of the termination rate constants and (3) insensitivity to the oxygen concentration (e.g., domination by a single termination step). Steady-state solutions for the rate of oxidation are given in terms of one, two, three, or four parameters, corresponding respectively to three, two, one, or zero kinetic assumptions. The recently derived four-parameter solution predicts conditions yielding unusual dependencies of the oxidation rate on oxygen concentration and on initiation rate, as well as conditions leading to some unusual diffusion-limited oxidation profile shapes. For stabilized polymers, unimolecular termination schemes are typically more appropriate than bimolecular. Kinetics incorporating unimolecular termination reactions are shown to result in very simple oxidation expressions which have been experimentally verified for both radiation-initiated oxidation of an EPDM and thermoxidative degradation of nitrile and chloroprene elastomers.

  16. Mathematical and Computational Foundations of Recurrence Quantifications

    NASA Astrophysics Data System (ADS)

    Marwan, Norbert; Webber, Charles L.

    Real-world systems possess deterministic trajectories, phase singularities and noise. Dynamic trajectories have been studied in temporal and frequency domains, but these are linear approaches. Basic to the field of nonlinear dynamics is the representation of trajectories in phase space. A variety of nonlinear tools such as the Lyapunov exponent, Kolmogorov-Sinai entropy, correlation dimension, etc. have successfully characterized trajectories in phase space, provided the systems studied were stationary in time. Ubiquitous in nature, however, are systems that are nonlinear and nonstationary, existing in noisy environments all of which are assumption breaking to otherwise powerful linear tools. What has been unfolding over the last quarter of a century, however, is the timely discovery and practical demonstration that the recurrences of system trajectories in phase space can provide important clues to the system designs from which they derive. In this chapter we will introduce the basics of recurrence plots (RP) and their quantification analysis (RQA). We will begin by summarizing the concept of phase space reconstructions. Then we will provide the mathematical underpinnings of recurrence plots followed by the details of recurrence quantifications. Finally, we will discuss computational approaches that have been implemented to make recurrence strategies feasible and useful. As computers become faster and computer languages advance, younger generations of researchers will be stimulated and encouraged to capture nonlinear recurrence patterns and quantification in even better formats. This particular branch of nonlinear dynamics remains wide open for the definition of new recurrence variables and new applications untouched to date.
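    The basic construction the chapter introduces, delay embedding followed by a thresholded distance matrix and its simplest quantification (the recurrence rate), can be sketched as follows. The test signals, embedding parameters and threshold are illustrative assumptions:

```python
import math

def embed(series, dim, delay):
    """Delay embedding: reconstruct phase-space vectors from a scalar series."""
    n = len(series) - (dim - 1) * delay
    return [[series[i + k * delay] for k in range(dim)] for i in range(n)]

def recurrence_matrix(series, dim=2, delay=1, eps=0.2):
    """Recurrence plot: R[i][j] = 1 when embedded states i and j lie within eps."""
    pts = embed(series, dim, delay)
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return [[1 if dist(a, b) <= eps else 0 for b in pts] for a in pts]

def recurrence_rate(rp):
    """The simplest RQA measure: density of recurrence points in the plot."""
    n = len(rp)
    return sum(sum(row) for row in rp) / (n * n)

# a periodic signal revisits its phase-space neighbourhoods; a constant signal
# never leaves one, so its recurrence rate is exactly 1
periodic = [math.sin(0.5 * i) for i in range(200)]
rr_periodic = recurrence_rate(recurrence_matrix(periodic))
rr_constant = recurrence_rate(recurrence_matrix([1.0] * 50))
```

    Richer RQA variables (determinism, laminarity, entropy of diagonal-line lengths) are built from the same binary matrix by counting its line structures.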

  17. Classical geometric resolution of the Einstein—Podolsky—Rosen paradox

    PubMed Central

    Ne'eman, Yuval

    1983-01-01

    I show that, in the geometry of a fiber bundle describing a gauge theory, curvature and parallel transport ensure and impose nonseparability. The “Einstein—Podolsky—Rosen paradox” is thus resolved “classically.” I conjecture that the ostentatiously “implausible” features of the quantum treatment are due to the fact that space—time separability, a basic assumption of single-particle nonrelativistic quantum mechanics, does not fit the bundle geometry of the complete physics. PMID:16593392

  18. Boundary layer transition: A review of theory, experiment and related phenomena

    NASA Technical Reports Server (NTRS)

    Kistler, E. L.

    1971-01-01

    The overall problem of boundary layer flow transition is reviewed. Evidence indicates a need for new, basic physical hypotheses in classical fluid mechanics math models based on the Navier-Stokes equations. The Navier-Stokes equations are challenged as inadequate for the investigation of fluid transition, since they are based on several assumptions which should be expected to alter significantly the stability characteristics of the resulting math model. Strong prima facie evidence is presented to this effect.

  19. 41 CFR 102-76.10 - What basic design and construction policy governs Federal agencies?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... basic design and construction policies: (a) Provide the highest quality services for designing and... requirements. (See 40 U.S.C. 3310 and 3312.) (d) Design Federal buildings to have a long life expectancy and...

  20. 41 CFR 102-76.10 - What basic design and construction policy governs Federal agencies?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... basic design and construction policies: (a) Provide the highest quality services for designing and... requirements. (See 40 U.S.C. 3310 and 3312.) (d) Design Federal buildings to have a long life expectancy and...

  1. Study of photon emission by electron capture during solar nuclei acceleration, 1: Temperature-dependent cross section for charge changing processes

    NASA Technical Reports Server (NTRS)

    Perez-Peraza, J.; Alvarez, M.; Laville, A.; Gallegos, A.

    1985-01-01

    The study of charge-changing cross sections of fast ions colliding with matter provides the fundamental basis for the analysis of the charge states produced in such interactions. Given the high degree of complexity of the phenomena, there is no theoretical treatment able to give a comprehensive description. In fact, the processes involved depend strongly on the basic parameters of the projectile, such as velocity, charge state, and atomic number, and on the target parameters, namely the physical state (molecular, atomic or ionized matter) and density. The target velocity may also influence the process, through the temperature of the traversed medium. In addition, multiple electron transfer in single collisions further complicates the phenomena. In simplified cases, such as protons moving through atomic hydrogen, considerable agreement has been obtained between theory and experiment. In general, however, the available theoretical approaches have only limited validity in restricted regions of the basic parameters. Since most measurements of charge-changing cross sections are performed in atomic matter at ambient temperature, models are commonly based on the assumption of targets at rest; at astrophysical scales, however, temperature spans a wide range in atomic and ionized matter. Therefore, due to the lack of experimental data, an attempt is made here to quantify temperature-dependent cross sections on the basis of somewhat arbitrary, but physically reasonable, assumptions.

  2. When life imitates art: surrogate decision making at the end of life.

    PubMed

    Shapiro, Susan P

    2007-01-01

    The privileging of the substituted judgment standard as the gold standard for surrogate decision making in law and bioethics has constrained the research agenda in end-of-life decision making. The empirical literature is inundated with a plethora of "Newlywed Game" designs, in which potential patients and potential surrogates respond to hypothetical scenarios to see how often they "get it right." The preoccupation with determining the capacity of surrogates to accurately reproduce the judgments of another makes a number of assumptions that blind scholars to the variables central to understanding how surrogates actually make medical decisions on behalf of another. These assumptions include that patient preferences are knowable, surrogates have adequate and accurate information, time stands still, patients get the surrogates they want, patients want and surrogates utilize substituted judgment criteria, and surrogates are disinterested. This article examines these assumptions and considers the challenges of designing research that makes them problematic.

  3. Dendrite and Axon Specific Geometrical Transformation in Neurite Development

    PubMed Central

    Mironov, Vasily I.; Semyanov, Alexey V.; Kazantsev, Victor B.

    2016-01-01

    We propose a model of neurite growth to explain the differences in dendrite and axon specific neurite development. The model implements basic molecular kinetics, e.g., building protein synthesis and transport to the growth cone, and includes explicit dependence of the building kinetics on the geometry of the neurite. The basic assumption was that the radius of the neurite decreases with length. We found that the neurite dynamics crucially depended on the relationship between the rate of active transport and the rate of morphological changes. If these rates were in the balance, then the neurite displayed axon specific development with a constant elongation speed. For dendrite specific growth, the maximal length was rapidly saturated by degradation of building protein structures or limited by proximal part expansion reaching the characteristic cell size. PMID:26858635

  4. Spreading dynamics on complex networks: a general stochastic approach.

    PubMed

    Noël, Pierre-André; Allard, Antoine; Hébert-Dufresne, Laurent; Marceau, Vincent; Dubé, Louis J

    2014-12-01

    Dynamics on networks is considered from the perspective of Markov stochastic processes. We partially describe the state of the system through network motifs and infer any missing data using the available information. This versatile approach is especially well adapted for modelling spreading processes and/or population dynamics. In particular, the generality of our framework and the fact that its assumptions are explicitly stated suggest that it could be used as a common ground for comparing existing epidemic models too complex for direct comparison, such as agent-based computer simulations. We provide many examples for the special cases of susceptible-infectious-susceptible and susceptible-infectious-removed dynamics (e.g., epidemic propagation) and we observe multiple situations where accurate results may be obtained at low computational cost. Our perspective reveals a subtle balance between the complex requirements of a realistic model and its basic assumptions.
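    For the susceptible-infectious-susceptible case mentioned above, the crudest limit of such stochastic network models (well mixed, every node of the same degree) is a one-variable mean-field ODE, which already illustrates the accuracy-versus-cost trade-off. A minimal sketch, with rates and degree as illustrative assumptions rather than the paper's examples:

```python
def sis_mean_field(beta, gamma, k, i0=0.01, dt=0.01, steps=100_000):
    """Euler integration of di/dt = beta*k*i*(1-i) - gamma*i, the mean-field
    SIS equation for a network where every node has degree k."""
    i = i0
    for _ in range(steps):
        i += dt * (beta * k * i * (1.0 - i) - gamma * i)
    return i

# transmission 0.2 per edge, recovery rate 1.0, degree 10:
# predicted endemic prevalence is 1 - gamma/(beta*k) = 0.5
i_endemic = sis_mean_field(beta=0.2, gamma=1.0, k=10)
predicted = 1.0 - 1.0 / (0.2 * 10)
```

    Motif-based descriptions such as the paper's sit between this one-equation limit and full agent-based simulation, tracking more network structure at modest extra cost.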

  5. Rationality.

    PubMed

    Shafir, Eldar; LeBoeuf, Robyn A

    2002-01-01

    This chapter reviews selected findings in research on reasoning, judgment, and choice and considers the systematic ways in which people violate basic requirements of the corresponding normative analyses. Recent objections to the empirical findings are then considered; these objections question the findings' relevance to assumptions about rationality. These objections address the adequacy of the tasks used in the aforementioned research and the appropriateness of the critical interpretation of participants' responses, as well as the justifiability of some of the theoretical assumptions made by experimenters. The objections are each found not to seriously impinge on the general conclusion that people often violate tenets of rationality in inadvisable ways. In the process, relevant psychological constructs, ranging from cognitive ability and need for cognition, to dual process theories and the role of incentives, are discussed. It is proposed that the rationality critique is compelling and rightfully gaining influence in the social sciences in general.

  6. The EPQ model under conditions of two levels of trade credit and limited storage capacity in supply chain management

    NASA Astrophysics Data System (ADS)

    Chung, Kun-Jen

    2013-09-01

    An inventory problem involves many factors influencing inventory decisions. To understand it, the traditional economic production quantity (EPQ) model plays a rather important role in inventory analysis. Although traditional EPQ models are still widely used in industry, practitioners frequently question the validity of their assumptions, so that their use encounters challenges and difficulties. This article therefore presents a new inventory model that considers two levels of trade credit, a finite replenishment rate and limited storage capacity together, relaxing the basic assumptions of the traditional EPQ model and broadening its applicability. Keeping in mind a cost-minimisation strategy, four easy-to-use theorems are developed to characterise the optimal solution. Finally, sensitivity analyses are executed to investigate the effects of the various parameters on ordering policies and the annual total relevant costs of the inventory system.
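    The baseline the article relaxes is the classical EPQ with a finite replenishment rate; the trade-credit and storage-capacity extensions change the cost terms but not this basic structure. A minimal sketch of the classical model only (the parameter values are illustrative assumptions):

```python
import math

def epq(demand, production_rate, setup_cost, holding_cost):
    """Classical EPQ with finite replenishment: minimises setup plus holding cost.
    Q* = sqrt(2*D*K / (h*(1 - D/P))); requires production rate P > demand D."""
    if production_rate <= demand:
        raise ValueError("production rate must exceed demand")
    rho = 1.0 - demand / production_rate
    q = math.sqrt(2.0 * demand * setup_cost / (holding_cost * rho))
    annual_cost = demand / q * setup_cost + holding_cost * rho * q / 2.0
    return q, annual_cost

q_star, cost = epq(demand=1000, production_rate=4000, setup_cost=100, holding_cost=2)
```

    At the optimum the setup and holding components are equal, a standard check on the formula; the article's theorems characterise where the optimum moves once trade-credit interest and a storage cap enter the cost function.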

  7. Expectations and beliefs in science communication: Learning from three European gene therapy discussions of the early 1990s.

    PubMed

    Meyer, Gitte

    2016-04-01

    There is widespread agreement that the potential of gene therapy was oversold in the early 1990s. This study, however, comparing written material from the British, Danish and German gene therapy discourses of the period finds significant differences: Over-optimism was not equally strong everywhere; gene therapy was not universally hyped. Against that background, attention is directed towards another area of variation in the material: different basic assumptions about science and scientists. Exploring such culturally rooted assumptions and beliefs and their possible significance to science communication practices, it is argued that deep beliefs may constitute drivers of hype that are particularly difficult to deal with. To participants in science communication, the discouragement of hype, viewed as a practical-ethical challenge, can be seen as a learning exercise that includes critical attention to internalised beliefs. © The Author(s) 2014.

  8. Expanding Advanced Civilizations in the Universe

    NASA Astrophysics Data System (ADS)

    Gros, C.

    The 1950 lunch-table remark by Enrico Fermi, `Where is everybody?', started intensive scientific and philosophical discussions about what we nowadays call the `Fermi paradox': if there had ever been a single advanced civilization in the cosmological history of our galaxy dedicated to expansion, it would have had plenty of time to colonize the entire galaxy via exponential growth. No evidence of present or past alien visits to Earth is known to us, leading to the standard conclusion that no advanced expanding civilization has ever existed in the Milky Way. This conclusion rests fundamentally on the ad hoc assumption that any alien civilization dedicated to expansion at one time would remain dedicated to expansion forever. Considering our limited knowledge about alien civilizations, we need, however, to relax this basic assumption. Here we show that a substantial and stable population of expanding advanced civilizations might consequently exist in our galaxy.
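    The exponential-growth premise of the paradox is easy to make quantitative: colonizing N systems by repeated doubling takes only log2(N) doubling times. A minimal sketch, where the doubling time is an illustrative assumption:

```python
import math

def colonization_time(n_systems, doubling_time_yr):
    """Years to reach n_systems by exponential expansion from a single system."""
    return math.log2(n_systems) * doubling_time_yr

# roughly 4e11 stars in the galaxy, hypothetical 1000-year doubling time
t = colonization_time(4e11, 1000.0)
```

    Even with a leisurely thousand-year doubling time the whole galaxy is reached in under 40,000 years, negligible against its multi-billion-year age; this is why the persistence-of-expansion assumption carries the entire standard argument.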

  9. Modeling intelligent adversaries for terrorism risk assessment: some necessary conditions for adversary models.

    PubMed

    Guikema, Seth

    2012-07-01

    Intelligent adversary modeling has become increasingly important for risk analysis, and a number of different approaches have been proposed for incorporating intelligent adversaries in risk analysis models. However, these approaches are based on a range of often-implicit assumptions about the desirable properties of intelligent adversary models. This "Perspective" paper aims to further risk analysis for situations involving intelligent adversaries by fostering a discussion of the desirable properties for these models. A set of four basic necessary conditions for intelligent adversary models is proposed and discussed. These are: (1) behavioral accuracy to the degree possible, (2) computational tractability to support decision making, (3) explicit consideration of uncertainty, and (4) ability to gain confidence in the model. It is hoped that these suggested necessary conditions foster discussion about the goals and assumptions underlying intelligent adversary modeling in risk analysis. © 2011 Society for Risk Analysis.

  10. Zipf's word frequency law in natural language: a critical review and future directions.

    PubMed

    Piantadosi, Steven T

    2014-10-01

    The frequency distribution of words has been a key object of study in statistical linguistics for the past 70 years. This distribution approximately follows a simple mathematical form known as Zipf's law. This article first shows that human language has a highly complex, reliable structure in the frequency distribution over and above this classic law, although prior data visualization methods have obscured this fact. A number of empirical phenomena related to word frequencies are then reviewed. These facts are chosen to be informative about the mechanisms giving rise to Zipf's law and are then used to evaluate many of the theoretical explanations of Zipf's law in language. No prior account straightforwardly explains all the basic facts or is supported with independent evaluation of its underlying assumptions. To make progress at understanding why language obeys Zipf's law, studies must seek evidence beyond the law itself, testing assumptions and evaluating novel predictions with new, independent data.
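
As a minimal illustration of the classic law the abstract refers to (not of the paper's analysis), the predicted relative frequency of the word of rank r is proportional to 1/r^alpha; the vocabulary size below is an arbitrary choice:

```python
def zipf_frequency(rank: int, alpha: float = 1.0, n_types: int = 50_000) -> float:
    """Relative frequency predicted by Zipf's law, f(r) ∝ 1/r^alpha,
    normalized over a vocabulary of n_types word types."""
    norm = sum(1 / r**alpha for r in range(1, n_types + 1))
    return (1 / rank**alpha) / norm

# Under the classic law (alpha ≈ 1), the most frequent word is about
# twice as frequent as the second and three times the third.
f1, f2, f3 = (zipf_frequency(r) for r in (1, 2, 3))
print(f1 / f2, f1 / f3)
```

The normalization constant cancels in the ratios, which is why the rank-1 to rank-2 ratio depends only on alpha.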

  11. Applications of non-parametric statistics and analysis of variance on sample variances

    NASA Technical Reports Server (NTRS)

    Myers, R. H.

    1981-01-01

    Nonparametric methods that are available for NASA-type applications are discussed. An attempt is made to survey what can be used, to offer recommendations as to when each method is applicable, and to compare the methods, when possible, with the usual normal-theory procedures available for the Gaussian analog. It is important to point out the hypotheses being tested, the assumptions being made, and the limitations of the nonparametric procedures. The appropriateness of performing analysis of variance on sample variances is also discussed and studied. This procedure is followed in several NASA simulation projects, and on the surface it would appear to be a reasonably sound procedure. However, difficulties center around the normality problem and the basic homogeneous-variance assumption that is made in usual analysis of variance problems. These difficulties are discussed and guidelines are given for using the methods.
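
As a concrete example of a nonparametric alternative to one-way ANOVA (an illustration, not code from the report), the Kruskal-Wallis H statistic replaces the raw observations with their pooled ranks; this stdlib-only sketch assumes no tied values:

```python
# Minimal Kruskal-Wallis H statistic: the nonparametric analogue of
# one-way ANOVA. Assumes all observations are distinct (no tie handling).
def kruskal_wallis_h(*groups):
    pooled = sorted(x for g in groups for x in g)
    rank = {x: i + 1 for i, x in enumerate(pooled)}   # rank of each value
    n = len(pooled)
    h = 12 / (n * (n + 1)) * sum(
        sum(rank[x] for x in g) ** 2 / len(g) for g in groups
    ) - 3 * (n + 1)
    return h

# Three small samples; H is referred to a chi-square with k-1 df.
h = kruskal_wallis_h([1.1, 2.3, 3.1], [4.2, 5.0, 6.7], [7.9, 8.8, 9.4])
print(f"H = {h:.3f}")
```

Because the statistic uses only ranks, it sidesteps both the normality and the homogeneous-variance assumptions discussed in the abstract.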

  12. The Concept of Curriculum Design.

    ERIC Educational Resources Information Center

    Barrow, Robin

    This paper approaches the concept of curriculum design from a philosophical perspective, arguing that the concept of "design" in curriculum is fundamentally misleading. The paper begins with a series of comments questioning the assumption that curriculum design involves a set of discrete skills or procedures in which one may attain expertise, like…

  13. Inertial Fusion Energy reactor design studies: Prometheus-L, Prometheus-H. Volume 2, Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waganer, L.M.; Driemeyer, D.E.; Lee, V.D.

    1992-03-01

    This report contains a review of design studies for an Inertial Confinement Fusion reactor. This second of three volumes discusses in some detail the following: objectives, requirements, and assumptions; rationale for design option selection; key technical issues and R&D requirements; and conceptual design selection and description.

  14. Identifying gaps in conservation networks: of indicators and uncertainty in geographic-based analyses

    Treesearch

    Curtis H. Flather; Kenneth R. Wilson; Denis J. Dean; William C. McComb

    1997-01-01

    Mapping of biodiversity elements to expose gaps in conservation networks has become a common strategy in nature-reserve design. We review a set of critical assumptions and issues that influence the interpretation and implementation of gap analysis, including: (1) the assumption that a subset of taxa can be used to indicate overall diversity patterns, and (2) the...

  15. A Metric to Evaluate Mobile Satellite Systems

    NASA Technical Reports Server (NTRS)

    Young, Elizabeth L.

    1997-01-01

    The concept of a "cost per billable minute" methodology to analyze mobile satellite systems is reviewed. Certain assumptions, notably those about the marketplace and regulatory policies, may need to be revisited. Fading and power control assumptions need to be tested. Overall, the metric would seem to have value in the design phase of a system and for comparisons between and among alternative systems.

  16. What's Eating into School Recess? Implications of Extended Eating for Free Play and Physical Activity

    ERIC Educational Resources Information Center

    Wyver, Shirley; Engelen, Lina; Bundy, Anita; Naughton, Geraldine

    2012-01-01

    An assumption made when designing recess interventions in schools is that there is a clear demarcation between eating time and play time. We used observational data collected as part of the Sydney Playground Project to test if this assumption was correct. The Sydney Playground Project is a cluster randomised controlled trial of a recess…

  17. Advancing Work Practices: Rethinking Online Professional Development in the Context of the Intervention-Based Sustainable Change

    ERIC Educational Resources Information Center

    Noesgaard, Signe Schack

    2016-01-01

    Purpose: The paper aims to discuss the effectiveness of e-Learning in advancing work practices. The paper investigates the assumption that e-Learning is as effective as face-to-face interventions when stimulating change. It also examines the assumption that well-designed and well-executed instructional interventions will advance work practices.…

  18. Communication Design Education: Could Nine Reflections Be Sufficient?

    ERIC Educational Resources Information Center

    van der Waarde, Karel; Vroombout, Maurits

    2012-01-01

    Situation: Graphic design education is subject to substantial changes. Changes in professional practice and higher education aggravate insecurities about the contents and structure of courses, assessment criteria, relations between practice, research and theory and teaching methods. Assumption: Graphic design education (visual communication design…

  19. Implementation and Evaluation of Microcomputer Systems for the Republic of Turkey’s Naval Ships.

    DTIC Science & Technology

    1986-03-01

    important database design tool for both logical and physical database design, such as flowcharts or pseudocode are used for program design. Logical...string manipulation in FORTRAN is difficult but not impossible. BASIC (Beginners All-Purpose Symbolic Instruction Code): BASIC is currently the most...63 APPENDIX B GLOSSARY/ACRONYM LIST AC Alternating Current AP Application Program BASIC Beginners All-purpose Symbolic Instruction Code CCP

  20. A Concept Transformation Learning Model for Architectural Design Learning Process

    ERIC Educational Resources Information Center

    Wu, Yun-Wu; Weng, Kuo-Hua; Young, Li-Ming

    2016-01-01

    Generally, in the foundation course of architectural design, much emphasis is placed on teaching of the basic design skills without focusing on teaching students to apply the basic design concepts in their architectural designs or promoting students' own creativity. Therefore, this study aims to propose a concept transformation learning model to…

  1. Basic Emotions in Human Neuroscience: Neuroimaging and Beyond.

    PubMed

    Celeghin, Alessia; Diano, Matteo; Bagnis, Arianna; Viola, Marco; Tamietto, Marco

    2017-01-01

    The existence of so-called 'basic emotions' and their defining attributes represents a long-lasting and yet unsettled issue in psychology. Recently, neuroimaging evidence, especially related to the advent of neuroimaging meta-analytic methods, has revitalized this debate in the endeavor of systems and human neuroscience. The core theme focuses on the existence of unique neural bases that are specific and characteristic for each instance of basic emotion. Here we review this evidence, outlining contradictory findings, strengths, and limits of different approaches. Constructionism dismisses the existence of dedicated neural structures for basic emotions, on the grounds that the assumption of a one-to-one relationship between neural structures and their functions is central to basic emotion theories. While these critiques are useful to pinpoint current limitations of basic emotion theories, we argue that they do not always appear equally generative in fostering new testable accounts of how the brain relates to affective functions. We then consider evidence beyond PET and fMRI, including results concerning the relation between basic emotions and awareness and data from neuropsychology on patients with focal brain damage. Evidence from lesion studies is indeed particularly informative, as it can bring the correlational evidence typical of neuroimaging studies to causation, thereby characterizing which brain structures are necessary for, rather than simply related to, basic emotion processing. These studies shed light on attributes often ascribed to basic emotions, such as automaticity of perception, quick onset, and brief duration. Overall, we consider that evidence in favor of the neurobiological underpinnings of basic emotions outweighs dismissive approaches. In fact, the concept of basic emotions can still be fruitful if updated to current neurobiological knowledge that overcomes traditional one-to-one localization of functions in the brain. In particular, we propose that the structure-function relationship between brain and emotions is better described in terms of pluripotentiality, which refers to the fact that one neural structure can fulfill multiple functions, depending on the functional network and pattern of co-activations displayed at any given moment.

  2. Robust multi-atlas label propagation by deep sparse representation

    PubMed Central

    Zu, Chen; Wang, Zhengxia; Zhang, Daoqiang; Liang, Peipeng; Shi, Yonghong; Shen, Dinggang; Wu, Guorong

    2016-01-01

    Recently, multi-atlas patch-based label fusion has achieved many successes in the medical imaging area. The basic assumption in the current state-of-the-art approaches is that the image patch at the target image point can be represented by a patch dictionary consisting of atlas patches from registered atlas images. Therefore, the label at the target image point can be determined by fusing the labels of atlas image patches with similar anatomical structures. However, this assumption on image patch representation does not always hold in label fusion, since (1) the image content within the patch may be corrupted by noise and artifact, and (2) the distribution of morphometric patterns among atlas patches might be unbalanced, such that the majority patterns can dominate the label fusion result over minority patterns. Violation of these basic assumptions can significantly undermine label fusion accuracy. To overcome these issues, we first form a label-specific group for the atlas patches with the same label. We then alter the conventional flat and shallow dictionary to a deep multi-layer structure, where the top layer (label-specific dictionaries) consists of groups of representative atlas patches and the subsequent layers (residual dictionaries) hierarchically encode the patch-wise residual information at different scales. Thus, label fusion follows the representation consensus across representative dictionaries. The representation of the target patch in each group is iteratively optimized by using the representative atlas patches in each label-specific dictionary exclusively to match the principal patterns, and also using all residual patterns across groups collaboratively to overcome the issue that some groups might lack certain variation patterns present in the target image patch. 
Promising segmentation results have been achieved in labeling the hippocampus on the ADNI dataset, as well as basal ganglia and brainstem structures, compared to other counterpart label fusion methods. PMID:27942077
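
For readers unfamiliar with the baseline being improved on, here is a toy sketch of conventional patch-based label fusion (similarity-weighted voting), not the deep sparse-representation method the paper proposes; all patches, labels, and the Gaussian weight are invented for illustration:

```python
import math

# Toy patch-based label fusion: each atlas patch votes for its label,
# weighted by a Gaussian similarity to the target patch. Patches are
# flattened intensity vectors; everything here is made-up example data.
def fuse_labels(target_patch, atlas_patches, atlas_labels, sigma=1.0):
    votes = {}
    for patch, label in zip(atlas_patches, atlas_labels):
        dist2 = sum((t - a) ** 2 for t, a in zip(target_patch, patch))
        w = math.exp(-dist2 / (2 * sigma**2))   # similar patches vote harder
        votes[label] = votes.get(label, 0.0) + w
    return max(votes, key=votes.get)

target = [0.9, 1.0, 1.1]
atlases = [[1.0, 1.0, 1.0], [0.1, 0.0, 0.2], [0.9, 1.1, 1.0]]
labels = ["hippocampus", "background", "hippocampus"]
print(fuse_labels(target, atlases, labels))
```

The failure modes the abstract lists (corrupted patch content, unbalanced pattern distributions) are exactly the cases where this similarity-weighted vote breaks down.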

  3. Robust multi-atlas label propagation by deep sparse representation.

    PubMed

    Zu, Chen; Wang, Zhengxia; Zhang, Daoqiang; Liang, Peipeng; Shi, Yonghong; Shen, Dinggang; Wu, Guorong

    2017-03-01

    Recently, multi-atlas patch-based label fusion has achieved many successes in the medical imaging area. The basic assumption in the current state-of-the-art approaches is that the image patch at the target image point can be represented by a patch dictionary consisting of atlas patches from registered atlas images. Therefore, the label at the target image point can be determined by fusing the labels of atlas image patches with similar anatomical structures. However, this assumption on image patch representation does not always hold in label fusion, since (1) the image content within the patch may be corrupted by noise and artifact, and (2) the distribution of morphometric patterns among atlas patches might be unbalanced, such that the majority patterns can dominate the label fusion result over minority patterns. Violation of these basic assumptions can significantly undermine label fusion accuracy. To overcome these issues, we first form a label-specific group for the atlas patches with the same label. We then alter the conventional flat and shallow dictionary to a deep multi-layer structure, where the top layer (label-specific dictionaries) consists of groups of representative atlas patches and the subsequent layers (residual dictionaries) hierarchically encode the patch-wise residual information at different scales. Thus, label fusion follows the representation consensus across representative dictionaries. The representation of the target patch in each group is iteratively optimized by using the representative atlas patches in each label-specific dictionary exclusively to match the principal patterns, and also using all residual patterns across groups collaboratively to overcome the issue that some groups might lack certain variation patterns present in the target image patch. 
Promising segmentation results have been achieved in labeling the hippocampus on the ADNI dataset, as well as basal ganglia and brainstem structures, compared to other counterpart label fusion methods.

  4. Forks in the road: choices in procedures for designing wildland linkages.

    PubMed

    Beier, Paul; Majka, Daniel R; Spencer, Wayne D

    2008-08-01

    Models are commonly used to identify lands that will best maintain the ability of wildlife to move between wildland blocks through matrix lands after the remaining matrix has become incompatible with wildlife movement. We offer a roadmap of 16 choices and assumptions that arise in designing linkages to facilitate movement or gene flow of focal species between 2 or more predefined wildland blocks. We recommend designing linkages to serve multiple (rather than one) focal species likely to serve as a collective umbrella for all native species and ecological processes, explicitly acknowledging untested assumptions, and using uncertainty analysis to illustrate potential effects of model uncertainty. Such uncertainty is best displayed to stakeholders as maps of modeled linkages under different assumptions. We also recommend modeling corridor dwellers (species that require more than one generation to move their genes between wildland blocks) differently from passage species (for which an individual can move between wildland blocks within a few weeks). We identify a problem, which we call the subjective translation problem, that arises because the analyst must subjectively decide how to translate measurements of resource selection into resistance. This problem can be overcome by estimating resistance from observations of animal movement, genetic distances, or interpatch movements. There is room for substantial improvement in the procedures used to design linkages robust to climate change and in tools that allow stakeholders to compare an optimal linkage design to alternative designs that minimize costs or achieve other conservation goals.

  5. Characterization of plasma current quench during disruptions at HL-2A

    NASA Astrophysics Data System (ADS)

    Zhu, Jinxia; Zhang, Yipo; Dong, Yunbo; HL-2A Team

    2017-05-01

    The most essential physics assumptions for evaluating the electromagnetic forces on the plasma-facing components due to a disruption-induced eddy current are the characteristics of the plasma current quench, including the quench rate and its waveform. The characteristics of plasma current quenches at HL-2A have been analyzed for spontaneous disruptions. Both linear decay and exponential decay are found in the disruptions with the fastest current quenches. However, there are two stages of current quench in the slow-quench case: a first stage with an exponential decay, followed by a second stage with a rapid linear decay. Faster current quench rates correspond to faster plasma displacement. The parameter regimes for the current quench time and the current quench rates have been obtained from disruption statistics at HL-2A. No remarkable difference is found between the distributions obtained for the limiter and divertor configurations. These data from HL-2A provide a basis for deriving design criteria for a large-sized machine during the current decay phase of disruptions.

  6. A Probabilistic Approach to Model Update

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.; Voracek, David F.

    2001-01-01

    Finite element models are often developed for load validation, structural certification, response predictions, and to study alternate design concepts. On rare occasions, models developed with a nominal set of parameters agree with experimental data without the need to update parameter values. Today, model updating is generally heuristic and often performed by a skilled analyst with in-depth understanding of the model assumptions. Parameter uncertainties play a key role in understanding the model update problem and therefore probabilistic analysis tools, developed for reliability and risk analysis, may be used to incorporate uncertainty in the analysis. In this work, probability analysis (PA) tools are used to aid the parameter update task using experimental data and some basic knowledge of potential error sources. Discussed here is the first application of PA tools to update parameters of a finite element model for a composite wing structure. Static deflection data at six locations are used to update five parameters. It is shown that while prediction of individual response values may not be matched identically, the system response is significantly improved with moderate changes in parameter values.

  7. Linear models for assessing mechanisms of sperm competition: the trouble with transformations.

    PubMed

    Eggert, Anne-Katrin; Reinhardt, Klaus; Sakaluk, Scott K

    2003-01-01

    Although sperm competition is a pervasive selective force shaping the reproductive tactics of males, the mechanisms underlying different patterns of sperm precedence remain obscure. Parker et al. (1990) developed a series of linear models designed to identify two of the more basic mechanisms: sperm lotteries and sperm displacement; the models can be tested experimentally by manipulating the relative numbers of sperm transferred by rival males and determining the paternity of offspring. Here we show that tests of the model derived for sperm lotteries can result in misleading inferences about the underlying mechanism of sperm precedence because the required inverse transformations may lead to a violation of fundamental assumptions of linear regression. We show that this problem can be remedied by reformulating the model using the actual numbers of offspring sired by each male, and log-transforming both sides of the resultant equation. Reassessment of data from a previous study (Sakaluk and Eggert 1996) using the corrected version of the model revealed that we should not have excluded a simple sperm lottery as a possible mechanism of sperm competition in decorated crickets, Gryllodes sigillatus.
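
The remedy the abstract describes (log-transforming both sides and fitting a straight line) can be sketched generically; the data below are synthetic, and the slope interpretation is illustrative rather than drawn from the paper:

```python
import math

# Fit log(y) on log(x) by ordinary least squares, the transformation
# recommended above in place of problematic inverse transformations.
def fit_loglog(xs, ys):
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(lx, ly)) / \
            sum((a - mx) ** 2 for a in lx)
    return slope, my - slope * mx

# In a pure sperm lottery, offspring sired should scale roughly linearly
# with sperm transferred, so the log-log slope should sit near 1.
sperm = [100, 200, 400, 800]      # synthetic sperm numbers
sired = [12, 25, 48, 101]         # synthetic offspring counts
slope, _ = fit_loglog(sperm, sired)
print(f"log-log slope = {slope:.2f}")
```

Fitting in log space keeps the residual assumptions of linear regression plausible, which is the core point of the correction described above.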

  8. Adaptive hidden Markov model with anomaly States for price manipulation detection.

    PubMed

    Cao, Yi; Li, Yuhua; Coleman, Sonya; Belatreche, Ammar; McGinnity, Thomas Martin

    2015-02-01

    Price manipulation refers to the activities of traders who use carefully designed trading behaviors to manually push up or down the underlying equity prices for making profits. With increasing volumes and frequency of trading, price manipulation can be extremely damaging to the proper functioning and integrity of capital markets. The existing literature focuses either on empirical studies of market abuse cases or on analysis of particular manipulation types based on certain assumptions. Effective approaches for analyzing and detecting price manipulation in real time are yet to be developed. This paper proposes a novel approach, called adaptive hidden Markov model with anomaly states (AHMMAS), for modeling and detecting price manipulation activities. Together with wavelet transformations and gradients as the feature extraction methods, the AHMMAS model caters to price manipulation detection and basic manipulation type recognition. The evaluation experiments, conducted on tick data for seven stocks from NASDAQ and the London Stock Exchange and on 10 stock price series simulated by stochastic differential equations, show that the proposed AHMMAS model can effectively detect price manipulation patterns and outperforms the selected benchmark models.

  9. The Sensitivity of Coded Mask Telescopes

    NASA Technical Reports Server (NTRS)

    Skinner, Gerald K.

    2008-01-01

    Simple formulae are often used to estimate the sensitivity of coded mask X-ray or gamma-ray telescopes, but these are strictly applicable only if a number of basic assumptions are met. Complications arise, for example, if a grid structure is used to support the mask elements, if the detector spatial resolution is not good enough to completely resolve all the detail in the shadow of the mask, or if any of a number of other simplifying conditions are not fulfilled. We derive more general expressions for the Poisson-noise-limited sensitivity of astronomical telescopes using the coded mask technique, noting explicitly in what circumstances they are applicable. The emphasis is on using nomenclature and techniques that result in simple and revealing results. Where no convenient expression is available, a procedure is given which allows the calculation of the sensitivity. We consider certain aspects of the optimisation of the design of a coded mask telescope and show that when the detector spatial resolution and the mask to detector separation are fixed, the best source location accuracy is obtained when the mask elements are equal in size to the detector pixels.
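
One common form of the "simple formula" alluded to here treats the detection significance, for a mask with open fraction 0.5, as source counts over the square root of total counts; the rates, area, and exposure below are assumptions chosen purely for illustration:

```python
import math

# Back-of-envelope coded-mask significance estimate: half the source
# flux reaches the detector through the open mask elements, and the
# Poisson noise comes from all detected counts. Illustrative numbers only.
def coded_mask_snr(source_rate, bkg_rate, area_cm2, time_s, open_frac=0.5):
    n_src = open_frac * source_rate * area_cm2 * time_s   # source counts
    n_tot = n_src + bkg_rate * area_cm2 * time_s          # total counts
    return n_src / math.sqrt(n_tot)

snr = coded_mask_snr(source_rate=0.01, bkg_rate=1.0,
                     area_cm2=1000, time_s=10_000)
print(f"SNR ≈ {snr:.1f}")
```

It is precisely this kind of estimate that breaks down when mask support grids or finite detector resolution enter, which is what motivates the more general expressions the paper derives.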

  10. Comparative economic and environmental assessment of four beech wood based biorefinery concepts.

    PubMed

    Budzinski, Maik; Nitzsche, Roy

    2016-09-01

    The aim of this study was to analyze four conceptual beech wood based biorefineries generated during process design in terms of environmental and economic criteria. Biorefinery 1 annually converts 400,000 dry metric tons of beech wood into the primary products 41,600 t/yr polymer-grade ethylene and 58,520 tDM/yr organosolv lignin and the fuels 90,800 tDM/yr hydrolysis lignin and 38,400 t/yr biomethane. Biorefinery 2 is extended by the product of 58,400 t/yr liquid "food-grade" carbon dioxide. Biorefinery 3 produces 69,600 t/yr anhydrous ethanol instead of ethylene. Compared to biorefinery 3, biorefinery 4 additionally provides carbon dioxide as a product. Biorefineries 3 and 4 seem most promising, since under basic assumptions both criteria, (i) economic effectiveness and (ii) reduction of potential environmental impacts, can be fulfilled. All four alternatives may reduce potential environmental impacts compared to reference systems using the ReCiPe methodology. The economic feasibility of the analyzed biorefineries is highly sensitive to the underlying assumptions. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Parametric Cost Analysis: A Design Function

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.

    1989-01-01

    Parametric cost analysis uses equations to map measurable system attributes into cost. The measures of the system attributes are called metrics. The equations are called cost estimating relationships (CER's), and are obtained by the analysis of cost and technical metric data of products analogous to those to be estimated. Examples of system metrics include mass, power, failure_rate, mean_time_to_repair, energy_consumed, payload_to_orbit, pointing_accuracy, manufacturing_complexity, number_of_fasteners, and percent_of_electronics_weight. The basic assumption is that a measurable relationship exists between system attributes and the cost of the system. If a function exists, the attributes are cost drivers. Candidates for metrics include system requirement metrics and engineering process metrics. Requirements are constraints on the engineering process. From optimization theory we know that any active constraint generates cost by not permitting full optimization of the objective. Thus, requirements are cost drivers. Engineering processes reflect a projection of the requirements onto the corporate culture, engineering technology, and system technology. Engineering processes are an indirect measure of the requirements and, hence, are cost drivers.
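
The basic assumption stated here, a measurable relationship between system attributes and cost, is commonly operationalized as a power-law CER fitted in log space; the mass/cost pairs below are invented for illustration, not data from the paper:

```python
import math

# Fit a hypothetical CER of the form cost = a * mass^b by least squares
# in log space, from (made-up) data on analogous systems.
def fit_cer(masses, costs):
    lx = [math.log(m) for m in masses]
    ly = [math.log(c) for c in costs]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) / \
        sum((x - mx) ** 2 for x in lx)
    a = math.exp(my - b * mx)
    return a, b

masses = [100, 200, 500, 1000]       # kg, analogous-system data (invented)
costs = [12.0, 19.5, 38.0, 61.0]     # $M, analogous-system data (invented)
a, b = fit_cer(masses, costs)
print(f"CER: cost ≈ {a:.2f} * mass^{b:.2f}")

estimate = a * 700 ** b              # apply the CER to a new 700 kg system
print(f"estimated cost of a 700 kg system: {estimate:.1f} $M")
```

Once fitted, the CER turns a design metric (here, mass) into a cost estimate, which is exactly the sense in which the abstract calls the attributes cost drivers.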

  12. E-Basics: Online Basic Training in Program Evaluation

    ERIC Educational Resources Information Center

    Silliman, Ben

    2016-01-01

    E-Basics is an online training in program evaluation concepts and skills designed for youth development professionals, especially those working in nonformal science education. Ten hours of online training in seven modules is designed to prepare participants for mentoring and applied practice, mastery, and/or team leadership in program evaluation.…

  13. SARCOPENIA: DESIGNING PHASE IIB TRIALS

    PubMed Central

    CHUMLEA, WM.C.; CESARI, M.; EVANS, W.J.; FERRUCCI, L.; FIELDING, R.A.; PAHOR, M.; STUDENSKI, S.; VELLAS, B.

    2012-01-01

    Sarcopenia is the age-related involuntary loss of skeletal muscle mass and functionality that can lead to the development of disability, frailty, and increased health care costs. The development of interventions aimed at preventing and/or treating sarcopenia is complex, requiring the adoption of assumptions and standards that are not well established scientifically or clinically. A number of investigators and clinicians (both from academia and industry) met in Rome (Italy) in 2009 to develop a consensus definition of sarcopenia. Subsequently, in Albuquerque (New Mexico, USA) in 2010, the same group met again to consider the complex issues involved in designing Phase II clinical trials for sarcopenia. Current clinical trial data indicate that fat-free mass (FFM) parameters are responsive to physical activity/nutritional treatment modalities over short time periods, but pharmacological trials of sarcopenia have yet to show significant efficacy. In order to conduct a clinical trial within a reasonable time frame, groups that model or display accelerated aging and loss of FFM are necessary. Few studies have used acceptable designs for testing treatment effects, sample sizes, or primary outcomes that could provide interpretable findings or effects across studies. Dual-energy X-ray absorptiometry (DXA) is the measure of choice for assessing FFM, but sufficient time is needed for changes to be detected accurately and reliably. The tools that would allow clinical, basic, and epidemiological research on sarcopenia to advance rapidly toward diagnosis and treatment should be those reflecting function and strength. PMID:21623466

  14. Shielding of medical imaging X-ray facilities: a simple and practical method.

    PubMed

    Bibbo, Giovanni

    2017-12-01

    The most widely accepted method for shielding design of X-ray facilities is that contained in the National Council on Radiation Protection and Measurements Report 147 whereby the computation of the barrier thickness for primary, secondary and leakage radiations is based on the knowledge of the distances from the radiation sources, the assumptions of the clinical workload, and usage and occupancy of adjacent areas. The shielding methodology used in this report is complex. With this methodology, the shielding designers need to make assumptions regarding the use of the X-ray room and the adjoining areas. Different shielding designers may make different assumptions resulting in different shielding requirements for a particular X-ray room. A more simple and practical method is to base the shielding design on the shielding principle used to shield X-ray tube housing to limit the leakage radiation from the X-ray tube. In this case, the shielding requirements of the X-ray room would depend only on the maximum radiation output of the X-ray equipment regardless of workload, usage or occupancy of the adjacent areas of the room. This shielding methodology, which has been used in South Australia since 1985, has proven to be practical and, to my knowledge, has not led to excess shielding of X-ray installations.

  15. Photometry and spectroscopy of a newly discovered polar - Nova Cygni 1975 (V1500 CYG)

    NASA Technical Reports Server (NTRS)

    Kaluzny, Janusz; Chlebowski, Tomasz

    1988-01-01

    The paper reports photometric and spectroscopic observations which led to the conclusion that Nova Cygni 1975 (V1500 Cyg) is a polar (of AM Her type). The CCD photometry confirms the constancy of the photometric period, which is again interpreted as an orbital cycle. The time-resolved MMT spectra make it possible to reconstruct, under several assumptions, the basic system parameters: M1 = 0.9 and M2 = 0.31 solar masses.

  16. A beginner's guide to belief revision and truth maintenance systems

    NASA Technical Reports Server (NTRS)

    Mason, Cindy L.

    1992-01-01

    This brief note is intended to familiarize the non-TMS audience with some of the basic ideas surrounding classic TMS's (truth maintenance systems), namely the justification-based TMS and the assumption-based TMS. Topics of further interest include the relation between non-monotonic logics and TMS's, efficiency and search issues, complexity concerns, as well as the variety of TMS systems that have surfaced in the past decade or so. These include probabilistic-based TMS systems, fuzzy TMS systems, tri-valued belief systems, and so on.

  17. The Nonlinear Dynamic Response of an Elastic-Plastic Thin Plate under Impulsive Loading,

    DTIC Science & Technology

    1987-06-11

    Among those numerical methods, the finite element method is the most effective one. The method presented in this paper is an "influence function" numerical...computational time is much less than the finite element method. Its precision is higher also. II. Basic Assumption and the Influence Function of a Simply...calculation. Fig. 1 3 2. The Influence Function of a Simply Supported Plate The motion differential equation of a thin plate can be written as D∇⁴w + ρh ∂²w/∂t² = q(x, y, t)  (1

  18. Dark energy cosmology with tachyon field in teleparallel gravity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Motavalli, H., E-mail: Motavalli@Tabrizu.ac.ir; Akbarieh, A. Rezaei; Nasiry, M.

    2016-07-15

    We construct a tachyon teleparallel dark energy model for a homogeneous and isotropic flat universe in which a tachyon, as a non-canonical scalar field, is non-minimally coupled to gravity in the framework of teleparallel gravity. The explicit forms of the potential and coupling functions are obtained under the assumption that the Lagrangian admits a Noether symmetry. The dynamical behavior of the basic cosmological observables is compared to recent observational data, which implies that the tachyon field may serve as a candidate for dark energy.

  19. An interactive quality of work life model applied to organizational transition.

    PubMed

    Knox, S; Irving, J A

    1997-01-01

    Most healthcare organizations in the United States are in the process of some type of organizational change or transition. Professional nurses and other healthcare providers practicing in U.S. healthcare delivery organizations are very aware of the dramatic effects of restructuring processes. A phenomenal amount of change and concern is occurring with organizational redesign, generating many questions and uncertainties. These transitions challenge the basic assumptions and principles guiding the practice of clinical and management roles in healthcare.

  20. Principles of cost-benefit analysis for ERTS experiments, volumes 1 and 2

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The basic elements of a cost-benefit study are discussed along with special considerations for ERTS experiments. Elements required for a complete economic analysis of ERTS are considered to be: statement of objectives, specification of assumptions, enumeration of system alternatives, benefit analysis, cost analysis, non-efficiency considerations, and final system selection. A hypothetical cost-benefit example is presented with the assumed objective of an increase in remote sensing surveys of grazing lands to better utilize available forage to lower meat prices.
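
The benefit- and cost-analysis elements listed above reduce, in the simplest efficiency-only case, to a discounted net-present-value comparison; the cash flows and discount rate below are hypothetical, in the spirit of the grazing-lands example:

```python
# Minimal benefit/cost comparison: discount each year's net cash flow
# to present value and sum. All figures below are hypothetical.
def npv(rate, flows):
    """Net present value of year-indexed cash flows (year 0 first)."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

benefits = [0, 40, 60, 80]      # $M per year (e.g. forage-utilization gains)
costs = [120, 10, 10, 10]       # $M per year (system build plus operations)
rate = 0.07                     # assumed discount rate
net = npv(rate, [b - c for b, c in zip(benefits, costs)])
print(f"NPV = {net:.1f} $M")    # positive NPV favors the system alternative
```

A full analysis would then weigh this efficiency result against the non-efficiency considerations the abstract lists before final system selection.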
