Sample records for yoru inemuri unten

  1. SPAWAR Single Training Integrator. SPAWAR Discussions

    DTIC Science & Technology

    2012-02-01

    Information Dominance training must improve. Systems are more interdependent and integrated than ever and require a holistic approach. Single-system-based training solutions are untenable to the Fleet.

  2. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with a time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires is designed based on different fire growth rates, after which the uncertainty of onset time to untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant pre-movement time is treated as a probability distribution. The consequences of a fire scenario can be evaluated according to the probability distributions of evacuation time and onset time of untenable conditions. Fire risk to life safety can then be evaluated based on the occurrence probability and consequences of every fire scenario. To illustrate the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
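
    The framework in this record ultimately compares two random variables: the time occupants need to evacuate (pre-movement plus movement) and the onset time of untenable conditions for each design fire. As a rough illustration of that comparison step only, the following Monte Carlo sketch estimates the probability that evacuation time exceeds the onset time; every distribution and parameter here is hypothetical, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Onset time of untenable conditions (s), uncertain because the design
    # fires span a range of fire growth rates (parameters hypothetical).
    onset = rng.lognormal(mean=np.log(300), sigma=0.4, size=N)

    # Occupant evacuation time (s): pre-movement time treated as a
    # probability distribution, as in the abstract, plus movement time.
    pre_movement = rng.lognormal(mean=np.log(60), sigma=0.5, size=N)
    movement = rng.normal(loc=90, scale=15, size=N)
    evacuation = pre_movement + movement

    # Consequence indicator for one scenario: probability that occupants
    # are still inside when conditions become untenable.
    p_fail = np.mean(evacuation > onset)
    print(f"P(evacuation time > onset of untenable conditions) = {p_fail:.3f}")
    ```

    In the paper's terms, this failure probability would then be weighted by the occurrence probability of the fire scenario obtained from the Markov-chain/event-tree analysis.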

  3. Armageddon Is Not around the Corner.

    ERIC Educational Resources Information Center

    Walsh, Edward A.

    1984-01-01

    Nuclear war is unthinkable, psychologically unfeasible, and untenable technically; conventional war against the Russians is impossible. Therefore, time and money should not be spent preparing for confrontation. Proposals are made for redirecting funds. (RM)

  4. 77 FR 74161 - Fisheries of the Exclusive Economic Zone Off Alaska; Allocating Bering Sea and Aleutian Islands...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-13

    ..., processors, and affected communities to request an exemption from regional delivery requirements. Federal... within the designated region; likewise, crab purchased with regionally designated individual processing... fishing and processing activity making regional delivery requirements untenable in some seasons. Amendment...

  5. Using transcriptomic tools to evaluate biological effects across effluent gradients at a diverse set of study sites in Minnesota, USA

    EPA Science Inventory

    The aim of this overall project was to explore the utility of ‘omics’ approaches in monitoring aquatic environments where complex, often unknown, stressors make chemical-specific risk assessment untenable. This specific component of the effort examined changes in the fathead min...

  6. A Test of the Dimensionality Assumptions of Rotter's Internal-External Scale

    ERIC Educational Resources Information Center

    Klockars, Alan J.; Varnum, Susan W.

    1975-01-01

    Examined two assumptions about the dimensionality of Rotter's Internal-External (I-E) scale: first, the bipolarity of the two statements within each item pair; second, the unidimensionality of the overall construct. Both assumptions regarding Rotter's I-E Scale were found untenable. (Author/BJG)

  7. China Emerging

    DTIC Science & Technology

    2012-03-14

    historical components to the disputes in the South China Sea that have bearing on the issue. China, Taiwan, Vietnam, Malaysia, the Philippines...government for needed efficiency. It becomes more and more untenable for an authoritative government to enforce censorship, political repression, state

  8. End of living: maintaining a lifeworld during terminal illness.

    PubMed

    Wrubel, Judith; Acree, Michael; Goodman, Steffanie; Folkman, Susan

    2009-12-01

    The narrative responses of 32 people with AIDS or cancer with survival prognoses of 6 months to a year to monthly interview questions about their daily lives were analysed with a team-based qualitative methodology. Two groups emerged: (a) a Maintained Lifeworld Group characterised by one or more of the following: continued engagement with family, friends, and community; the ability to relinquish untenable goals and substitute new, realistic ones; and engagement in spirituality and a spiritual practice; and (b) a Lifeworld Interrupted Group characterised by one or more of the following: relocation just before or during the study, cognitive impairment, commitment to untenable goals, and ongoing substance abuse. Understanding how people with a terminal illness can maintain a lifeworld and experience well-being while also managing the physical challenges of their illness could help inform the support offered by professional and family caregivers to improve care recipients' quality of life.

  9. Beyond the Hyperbole: Information Literacy Reconsidered

    ERIC Educational Resources Information Center

    Julien, Heidi

    2016-01-01

    Information literacy, as a concept, has suffered from terminological confusion and has been burdened with untenable expectations. In addition, insufficient attention has been given to the place of information within the context of information behavior or information practices generally. Significant challenges remain to developing information…

  10. Emotional Intelligence as a Salient Predictor for Collegians' Career Decision Making

    ERIC Educational Resources Information Center

    Puffer, Keith A.

    2011-01-01

    Among the plethora of career theories and counseling practices, human emotion continues to be underrepresented. The paucity is evoking discontentment. For many career specialists, a distal role for emotionality has become untenable. This study demonstrated that emotional intelligence (EI) associates with familiar constructs within the career…

  11. The Rise and Fall of Democracies in Third World Societies. Studies in Third World Societies. Publication Number Twenty-Seven.

    ERIC Educational Resources Information Center

    Sutlive, Vinson H. Ed.; And Others

    Topics concerned with the experiments and problems of self-rule in Third World societies are presented in nine essays. The essays are: (1) "Democracy in Iran: The Untenable Dream" (John D. Stempel); (2) "Afghanistan's Struggle for National Liberation" (Hafizullah Emadi); (3) "Turkey's Experience with Political…

  12. Bringing Intergenerational Social Mobility Research into the Twenty-First Century: Why Mothers Matter

    ERIC Educational Resources Information Center

    Beller, Emily

    2009-01-01

    Conventional social mobility research, which measures family social class background relative to only fathers' characteristics, presents an outmoded picture of families--a picture wherein mothers' economic participation is neither common nor important. This article demonstrates that such measurement is theoretically and empirically untenable.…

  13. Race and Politics Rip into the Urban Superintendency.

    ERIC Educational Resources Information Center

    Rist, Marilee C.

    1990-01-01

    Heightened racial and ethnic-group politics and increasingly rocky board-superintendent relations are making the urban superintendency increasingly untenable. The politics of urban school governance can stymie even the best candidates. To survive, big-city superintendents need a thick hide, sensitivity to diversity, charisma, self-confidence,…

  14. Medicare Prospective Reimbursement for Mental Health Services: A Literature Review.

    ERIC Educational Resources Information Center

    Holcomb, William R.; Thompson, Warren A.

    1988-01-01

    Reviews literature evaluating appropriateness of Medicare's prospective payment system (PPS) through Diagnostic-Related Groups (DRGs), since its implementation in 1983, for psychiatric care. Cites shortcomings that make the system untenable for mental health care, including lack of homogeneity of DRGs, inability to predict length of stay, and…

  15. Fungal trunk diseases: A problem beyond grapevines?

    USDA-ARS's Scientific Manuscript database

    Grapevine trunk diseases (GTDs) are caused by a range of taxonomically unrelated fungi, which occur wherever grapes are grown and are the main biotic factor limiting vineyard productivity and longevity. GTDs cause untenable economic losses. For example, they are considered a “national crisis” in Fra...

  16. Comparing Fit and Reliability Estimates of a Psychological Instrument Using Second-Order CFA, Bifactor, and Essentially Tau-Equivalent (Coefficient Alpha) Models via AMOS 22

    ERIC Educational Resources Information Center

    Black, Ryan A.; Yang, Yanyun; Beitra, Danette; McCaffrey, Stacey

    2015-01-01

    Estimation of composite reliability within a hierarchical modeling framework has recently become of particular interest given the growing recognition that the underlying assumptions of coefficient alpha are often untenable. Unfortunately, coefficient alpha remains the prominent estimate of reliability when estimating total scores from a scale with…
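
    Coefficient alpha is an accurate reliability estimate only under (essential) tau-equivalence, i.e. equal true-score loadings across items, which is the assumption this record describes as often untenable and which bifactor and second-order CFA models relax. A minimal sketch of the alpha computation, with illustrative random data rather than anything from the article:

    ```python
    import numpy as np

    def coefficient_alpha(scores):
        """Cronbach's coefficient alpha for an (n_respondents, k_items) matrix.

        alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
        Only an unbiased reliability estimate under essential tau-equivalence.
        """
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_var = scores.var(axis=0, ddof=1).sum()
        total_var = scores.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    # Illustrative data only: 200 respondents, 6 items sharing one true score.
    rng = np.random.default_rng(0)
    true_score = rng.normal(size=(200, 1))
    items = true_score + rng.normal(scale=1.0, size=(200, 6))
    print(f"alpha = {coefficient_alpha(items):.2f}")
    ```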

  17. Interrupting History: Rethinking History Curriculum after "The End of History". Counterpoints: Studies in the Postmodern Theory of Education. Volume 404

    ERIC Educational Resources Information Center

    Parkes, Robert John

    2011-01-01

    Since the emergence of postmodern social theory, history has been haunted by predictions of its imminent end. Postmodernism has been accused of making historical research and writing untenable, encouraging the proliferation of revisionist histories, providing fertile ground for historical denial, and promoting the adoption of a mournful view of…

  18. Criterion Referenced Measurement in Speech-Communication Classrooms: Panacea for Mediocrity. Research Report.

    ERIC Educational Resources Information Center

    Buley, Jerry L.

    The philosophical underpinnings of the typical testing practices of speech communication teachers in regard to norm-referenced measurement contain several assumptions which teachers may find untenable on closer inspection. Some of the consequences of these assumptions are a waste of human potential, inefficient use of instructional expertise,…

  19. Patriotism, History and the Legitimate Aims of American Education

    ERIC Educational Resources Information Center

    Merry, Michael S.

    2009-01-01

    This article argues that while an attachment to one's country is both natural and even partially justifiable, cultivating loyal patriotism in schools is untenable insofar as it conflicts with the legitimate aims of education. These aims include the epistemological competence necessary for ascertaining important truths germane to the various…

  20. Palatalization in Japanese Mimetics: Response to Mester and Ito.

    ERIC Educational Resources Information Center

    Schourup, Lawrence; Tamori, Ikuhiro

    1992-01-01

    Mester and Ito's evidence for the phonological theory of Restricted Underspecification (RU) is refuted. Attention is focused on reduplicated forms, and it is concluded that, if there is only a rough and sporadic sound-syllable meaning association with palatalization, the argument for RU is untenable. (12 references) (LB)

  1. Aggregating Political Dimensions: Of the Feasibility of Political Indicators

    ERIC Educational Resources Information Center

    Sanin, Francisco Gutierrez; Buitrago, Diana; Gonzalez, Andrea

    2013-01-01

    Political indicators are widely used in academic writing and decision making, but remain controversial. This paper discusses the problems related to the aggregation functions they use. Almost always, political indicators are aggregated by weighted averages or summations. The use of such functions is based on untenable assumptions (existence of…

  2. Estimating economic gains for landowners due to time-dependent changes in biotechnology

    Treesearch

    John E. Wagner; Thomas P. Holmes

    1998-01-01

    This paper presents a model for examining the economic value of biotechnological research given time-dependent changes in biotechnology. Previous papers examined this issue assuming a time-neutral change in biotechnology. However, when analyzing the genetic improvements of increasing a tree's resistance to a pathogen, this assumption is untenable. The authors...

  3. Setting and measuring team goals and objectives for improved management of forestry research

    Treesearch

    Scott J. Josiah

    1999-01-01

    As our world becomes more complex and diverse, many forestry research organizations are responding by adopting more interdisciplinary and collaborative research programs. Our rapidly increasing knowledge of the ecological, social, and economic factors affecting forestry and natural resource management makes it simply untenable to expect that complex problems can be...

  4. Education, Religion, and a Sustainable Planet

    ERIC Educational Resources Information Center

    Vandenberg, Donald

    2008-01-01

    Religious pluralism led to the colonies' separation of church and state by 1776, to Mann's campaign for common schooling, and to the complete secularization of public schools by 1900. The dependence of Western theology upon untenable Greek metaphysics justifies an explanation that the evolutionary purpose of religion was to promote personal…

  5. On the Road to Success: How States Collaborate and Use Data to Improve Student Outcomes. Executive Summary

    ERIC Educational Resources Information Center

    Jobs for the Future, 2012

    2012-01-01

    Enrollment is rising across the nation's community colleges, but completion rates remain untenably low. Reformers are focusing on the importance of using comprehensive, high-quality data on student progress and completion to bring about change. A core tenet of Achieving the Dream: Community Colleges Count has been to embed a culture of…

  6. On the Road to Success: How States Collaborate and Use Data to Improve Student Outcomes. A Working Paper by the Achieving the Dream Cross-State Data Work Group

    ERIC Educational Resources Information Center

    Baldwin, Chris; Borcoman, Gabriela; Chappell-Long, Cheryl; Coperthwaite, Corby A.; Glenn, Darrell; Hutchinson, Tony; Hughes, John; Jenkins, Rick; Jovanovich, Donna; Keller, Jonathan; Klimczak, Benjamin; Schneider, Bill; Stewart, Carmen; Stuart, Debra; Yeager, Michael

    2012-01-01

    Enrollment is rising across the nation's community colleges, but completion rates remain untenably low. Reformers are focusing on the importance of using comprehensive, high-quality data on student progress and completion to bring about change. A core tenet of Achieving the Dream: Community Colleges Count has been to embed a culture of…

  7. Sociopolitical Development in Educational Systems: From Margins to Center

    ERIC Educational Resources Information Center

    Kirshner, Ben; Hipolito-Delgado, Carlos; Zion, Shelley

    2015-01-01

    This is a challenging moment for supporters of public education: the status quo is untenable but the options offered by "reformers" appear equally dangerous. In this context we need arguments for the democratic purposes of education that offer an alternative to existing inequities on one hand and technocratic or privatized solutions on…

  8. 26 CFR 1.50A-4 - Exceptions to the application of § 1.50A-3.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... untenable that the employee is, in effect, compelled by the taxpayer to quit, or if the employee is coerced into quitting, the employee will not be deemed to have voluntarily left the employment of the taxpayer... the employee feels necessitates his quitting work with the taxpayer to remain at home. Any employee...

  9. Common Schools: Classical Schools Citizenship Education in a Pluralistic State

    ERIC Educational Resources Information Center

    Pitts, Timothy Wade

    2011-01-01

    In the current political climate, where many politicians in both Europe and the United States have proclaimed that multicultural education has failed as an educational paradigm, there is a growing fear that the very idea of a democratic, multicultural society is untenable over time. In this dissertation, I explore three responses to the question of…

  10. Dynamic Cross Domain Information Sharing - A Concept Paper on Flexible Adaptive Policy Management

    DTIC Science & Technology

    2010-10-01

    "no read-up, no write-down" rule of the classical Bell-La Padula [1] model is becoming untenable because of the increasing need to seamlessly handle...Elliott Bell, "Looking Back at the Bell-La Padula Model," Washington, DC, USA, 2005. [2] (2009, Jan.) DISA NCES Website. [Online]. http://www.disa.mil

  11. Content Validity of Standardized Achievement Tests and Test Curriculum Overlap.

    ERIC Educational Resources Information Center

    Green, Donald Ross

    Discussions of "test curriculum overlap" that focus on the term "mismatch" tend to be based on several untenable premises. This paper addresses the issue of the degree to which standardized tests should reflect the specific content of a given school curriculum with regard to three points: (1) The idea of matching the content of…

  12. Verknüpfung von DQ-Indikatoren mit KPIs und Auswirkungen auf das Return on Investment

    NASA Astrophysics Data System (ADS)

    Block, Frank

    It is often unclear what relationships exist between the data quality indicators (DQI, defined further below) and the key performance indicators (KPI, see Section 1.3 for further details) of a company or an organization. This is particularly important because knowledge of these relationships substantially influences how a data quality project takes shape.

  13. Rethinking Dabrowski's Theory: I. the Case against Primary Integration

    ERIC Educational Resources Information Center

    Piechowski, Michael M.

    2014-01-01

    Some terms of Dabrowski's theory are misleading. The construct of level and the concepts of integration and disintegration mean different things. The concept of primary integration as a starting point for personality development is untenable in light of research on child development. In its place, Level I as a type of development that is…

  14. Awakening Albia: Feminist Mythologies Beyond Androgyny.

    ERIC Educational Resources Information Center

    Spraggins, Mary Pringle

    The term androgyny, with its sex-related etymology, is based on untenable social stereotypes and for feminist critics is a dead end. The androgyny myth, like matriarchal myths and myths which deify women, should be replaced. However, a replacement would have to fill a wide niche in order to allow critics to focus from a propitious vantage point on…

  15. Pandora's Box: Academic Perceptions of Student Plagiarism in Writing

    ERIC Educational Resources Information Center

    Sutherland-Smith, Wendy

    2005-01-01

    Plagiarism is viewed by many academics as a kind of Pandora's box--the elements contained inside are too frightening to allow escape for fear of the havoc that may result. Reluctance by academic members of staff to discuss student plagiarism openly may contribute to the often untenable situations we, as teachers, face when dealing with student…

  16. An Analysis of the Academic Achievement of Urban and Rural Low-Socioeconomic Status Tennessee Public Schools

    ERIC Educational Resources Information Center

    Crow, Johnny

    2010-01-01

    Comparing a small, rural school with sometimes less than 100 students to a massive inner-city school with greater than 2,500 students is crude and untenable. There are simply too many variables. Nonetheless, the No Child Left Behind Act treats these two very different schools the same. When urban and rural schools cannot meet AYP or highly…

  17. Food Science Education and the Cognitive Science of Learning

    ERIC Educational Resources Information Center

    Chew, Stephen L.

    2014-01-01

    In this essay, I argue that the traditional view of teaching, that the teacher's responsibility is to present information that students are solely responsible for learning, has been rendered untenable by cognitive science research in learning. The teacher can have a powerful effect on student learning by teaching not only content, but how to…

  18. Twenty-five years of managing vegetation in conifer plantations in northern and central California: results, application, principles, and challenges

    Treesearch

    Philip M. McDonald; Gary O. Fiddler

    2010-01-01

    In the late 1970s, the outlook for conifer seedlings in new plantations in the Western United States was dismal: too many were dying or growing below the potential of the site. This situation was untenable, and a large study aimed at increasing the survival and growth of planted conifer seedlings was implemented. This was the National Administrative Study on...

  19. Shared Work, Valued Care: New Norms for Organizing Market Work and Unpaid Care Work.

    ERIC Educational Resources Information Center

    Appelbaum, Eileen; Bailey, Thomas; Berg, Peter; Kalleberg, Arne L.

    Until the 1970s, social norms dictated that women provided care for their families and men were employed for pay. The rapid increase in paid work for women has resulted in an untenable model of work and care in which all employees are assumed to be unencumbered with family responsibilities and women who care for their families are dismissed as…

  20. Reproductive semi-cloning respecting biparental origin. A biologically unsound principle.

    PubMed

    Tateno, H; Latham, K E; Yanagimachi, R

    2003-03-01

    The original debate article proposed the use of "semi-cloning" as a viable method for assisted reproduction. This debate counters the proposal as being biologically unsound. Given the fundamental limitations of chromosomal segregation and genomic imprinting, the notion of using the MII oocyte to drive haploidization of a somatic cell genome and thereby obtain a substitute for authentic gametes is ill-conceived and untenable.

  1. Maritime Pre-Positioning Force-Future: Bill Payer or Sea Basing Enabler?

    DTIC Science & Technology

    2008-03-25

    Ship Building Plan, UAV CLASSIFICATION: Unclassified Actions at sea no longer suffice to influence world events; actions from the sea must...in amphibious ships or fall victim to an untenable Navy ship building plan. Premature consideration of cost issues hindered MPF-F program...fiscal environment and an illusory Navy ship building plan. Given the demonstrated capability and success of the current Maritime Pre-positioning

  2. Professionalization of the Senior Chinese Officer Corps Trends and Implications

    DTIC Science & Technology

    1997-01-01

    The officers who retired were Ye Jianying, Nie Rongzhen, Xu Xiangqian, Wang Zhen, Song Renqiong, and Li Desheng. Of course, the political impact of...increased education level, functional specialization, and adherence to retirement norms. Li Cheng and Lynn White, in their 1993 Asian Survey article...making rigorous comparative analysis untenable. Second, Li and White do not place their results or analysis in any theoretical context. In

  3. JPRS Report West Europe

    DTIC Science & Technology

    1988-08-22

    crisis? [Answer] If one can generalize from the much more difficult economic scandals of other parties, one might conclude that the political effect...just revealed placement scandal made his position untenable. Aalto was politically primarily responsible for the venture which had dragged SKP...into the mud and caused it much greater losses than had been reported publicly. Without the scandal Aalto would probably be sitting peacefully in

  4. Integrating holism and reductionism in the science of art perception.

    PubMed

    Graham, Daniel J

    2013-04-01

    The contextualist claim that universalism is irrelevant to the proper study of art can be evaluated by examining an analogous question in neuroscience. Taking the reductionist-holist debate in visual neuroscience as a model, we see that the analog of orthodox contextualism is untenable, whereas integrated approaches have proven highly effective. Given the connection between art and vision, unified approaches are likewise more germane to the scientific study of art.

  5. U.S. and Korea in Vietnam and the Japan-Korea Treaty: Search for Security, Prosperity and Influence

    DTIC Science & Technology

    1991-05-01

    never became the divisive social issue it became in the United States. There is even today an unspoken understanding that it helped the security and...effort made an immediate return untenable, but the U.S. was aware that it was a critical bilateral issue, because Japan had become important to the war...National Reconstruction (SCNR), and recommended that the emphasis of a new Korea policy should be on long-term economic, political, and social

  6. Should the patient be allowed to die? 1

    PubMed Central

    Nicholson, Richard

    1975-01-01

    In considering the patient's right to a certain quality of dying, this essay outlines how the legal and ethical justifications for passive euthanasia depend on the doctrine of acts and omissions. It is suggested that this doctrine is untenable and that alternative justifications are needed. The development of the modern mechanistic approach to death is traced, showing that a possible basis for an humane way of death lies in a reacceptance of a metaphysical concept of life. PMID:1100831

  7. Dissipative neutrino oscillations in randomly fluctuating matter

    NASA Astrophysics Data System (ADS)

    Benatti, F.; Floreanini, R.

    2005-01-01

    The generalized dynamics describing the propagation of neutrinos in randomly fluctuating media is analyzed: it takes into account matter-induced decoherence phenomena that go beyond the standard Mikheyev-Smirnov-Wolfenstein (MSW) effect. A widely adopted density fluctuation pattern is found to be physically untenable: a more general model needs instead to be considered, leading to flavor-changing effective neutrino-matter interactions. They induce new, dissipative effects that modify the neutrino oscillation pattern in a way amenable to a direct experimental analysis.

  8. Comments on the photospheric dynamo model of Henoux and Somov

    NASA Technical Reports Server (NTRS)

    Melrose, D. B.; Khan, J. I.

    1989-01-01

    A detailed model for a photospheric dynamo has been presented by Henoux and Somov (1987), who used the three-fluid model to treat the properties of the weakly ionized plasma. Only the equations for the two ionized components were solved. The equation for the neutral component is considered here, and it is argued that the model is unacceptable because of an implied, impossibly large, unbalanced stress on the neutral gas. It is argued more generally that all existing photospheric dynamo models are untenable.

  9. More bang for your buck: super-adiabatic quantum engines.

    PubMed

    del Campo, A; Goold, J; Paternostro, M

    2014-08-28

    The practical untenability of the quasi-static assumption makes any realistic engine intrinsically irreversible and its operating time finite, thus implying friction effects at short cycle times. An important technological goal is thus the design of maximally efficient engines working at the maximum possible power. We show that, by utilising shortcuts to adiabaticity in a quantum engine cycle, one can engineer a thermodynamic cycle working at finite power and zero friction. Our findings are illustrated using a harmonic oscillator undergoing a quantum Otto cycle.

  10. More bang for your buck: Super-adiabatic quantum engines

    PubMed Central

    Campo, A. del; Goold, J.; Paternostro, M.

    2014-01-01

    The practical untenability of the quasi-static assumption makes any realistic engine intrinsically irreversible and its operating time finite, thus implying friction effects at short cycle times. An important technological goal is thus the design of maximally efficient engines working at the maximum possible power. We show that, by utilising shortcuts to adiabaticity in a quantum engine cycle, one can engineer a thermodynamic cycle working at finite power and zero friction. Our findings are illustrated using a harmonic oscillator undergoing a quantum Otto cycle. PMID:25163421
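
    For context on the cycle named in these two records: in the ideal, perfectly adiabatic quantum Otto cycle of a harmonic oscillator whose trap frequency is switched between ω1 and ω2 > ω1, the efficiency is the standard textbook result

    $$\eta_{\mathrm{Otto}} = 1 - \frac{\omega_1}{\omega_2},$$

    which non-adiabatic (finite-time) driving degrades through quantum friction; the shortcuts to adiabaticity discussed in the abstract aim to reach this efficiency at finite power. This formula is the generic result for the cycle, not a value quoted from the paper.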

  11. Verknüpfung von DQ-Indikatoren mit KPIs und Auswirkungen auf das Return on Investment

    NASA Astrophysics Data System (ADS)

    Block, Frank

    It is often unclear what relationships exist between the data quality indicators (DQI, defined further below) and the key performance indicators (KPI, see Section 1.3 for further details) of a company or an organization. This is particularly important because knowledge of these relationships substantially influences how a data quality project takes shape. It is indispensable as a basis for decision making and answers the following questions: What does poor data quality cost our company or organization? Can we afford that?

  12. Relativity and indeterminism

    NASA Astrophysics Data System (ADS)

    Byrne, Patrick H.

    1981-12-01

    It is well known that Albert Einstein adhered to a deterministic world view throughout his career. Nevertheless, his developments of the special and general theories of relativity prove to be incompatible with that world view. Two different forms of determinism—classical Laplacian determinism and the determinism of isolated systems—are considered. Through careful considerations of what concretely is involved in predicting future states of the entire universe, or of isolated systems, it is shown that the demands of the theories of relativity make these deterministic positions untenable.

  13. The problem of moral motivation and the happy victimizer phenomenon: killing two birds with one stone.

    PubMed

    Minnameier, Gerhard

    2010-01-01

    One surprising feature of cognitive and emotional development in the moral domain is the so-called happy victimizer phenomenon, which is commonly explained by a lack of moral motivation. Concerning this general approach, there are two pieces of news in this chapter. The bad news is that moral motivation is a highly problematic concept and its purported theoretical role in moral functioning untenable. The good news is that the happy victimizer phenomenon can be explained without reference to something like "moral motivation." © Wiley Periodicals, Inc.

  14. Forecasting continuously increasing life expectancy: what implications?

    PubMed

    Le Bourg, Eric

    2012-04-01

    It has been proposed that life expectancy could linearly increase in the next decades and that median longevity of the youngest birth cohorts could reach 105 years or more. These forecasts have been criticized but it seems that their implications for future maximal lifespan (i.e. the lifespan of the last survivors) have not been considered. These implications make these forecasts untenable and it is less risky to hypothesize that life expectancy and maximal lifespan will reach an asymptotic limit in some decades from now. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. A new quasi-thermal trap model for solar flare hard X-ray bursts - An electrostatic trap model

    NASA Technical Reports Server (NTRS)

    Spicer, D. S.; Emslie, A. G.

    1988-01-01

    A new quasi-thermal trap model of solar flare hard X-ray bursts is presented. The new model utilizes the trapping ability of a magnetic mirror and a magnetic field-aligned electrostatic potential produced by differences in anisotropies of the electron and ion distribution function. It is demonstrated that this potential can, together with the magnetic mirror itself, effectively confine electrons in a trap, thereby enhancing their bremsstrahlung yield per electron. This analysis makes even more untenable models involving precipitation of the bremsstrahlung-producing electrons onto a cold target.

  16. Untenable nonstationarity: An assessment of the fitness for purpose of trend tests in hydrology

    NASA Astrophysics Data System (ADS)

    Serinaldi, Francesco; Kilsby, Chris G.; Lombardo, Federico

    2018-01-01

    The detection and attribution of long-term patterns in hydrological time series have been important research topics for decades. A significant portion of the literature regards such patterns as 'deterministic components' or 'trends' even though the complexity of hydrological systems does not allow easy deterministic explanations and attributions. Consequently, trend estimation techniques have been developed to make and justify statements about tendencies in the historical data, which are often used to predict future events. Testing trend hypotheses on observed time series is widespread in the hydro-meteorological literature mainly due to the interest in detecting consequences of human activities on the hydrological cycle. This analysis usually relies on the application of some null hypothesis significance tests (NHSTs) for slowly-varying and/or abrupt changes, such as Mann-Kendall, Pettitt, or similar, to summary statistics of hydrological time series (e.g., annual averages, maxima, minima, etc.). However, the reliability of this application has seldom been explored in detail. This paper discusses misuse, misinterpretation, and logical flaws of NHST for trends in the analysis of hydrological data from three different points of view: historic-logical, semantic-epistemological, and practical. Based on a review of NHST rationale, and basic statistical definitions of stationarity, nonstationarity, and ergodicity, we show that even if the empirical estimation of trends in hydrological time series is always feasible from a numerical point of view, it is uninformative and does not allow the inference of nonstationarity without assuming a priori additional information on the underlying stochastic process, according to deductive reasoning. This prevents the use of trend NHST outcomes to support nonstationary frequency analysis and modeling. We also show that the correlation structures characterizing hydrological time series might easily be underestimated, further compromising the attempt to draw conclusions about trends spanning the period of records. Moreover, even though adjusting procedures accounting for correlation have been developed, some of them are insufficient or are applied only to some tests, while some others are theoretically flawed but still widely applied. In particular, using 250 unimpacted stream flow time series across the conterminous United States (CONUS), we show that the test results can dramatically change if the sequences of annual values are reproduced starting from daily stream flow records, whose larger sizes enable a more reliable assessment of the correlation structures.
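
    To make the abstract's central point concrete, that neglected correlation inflates apparent trends, here is a minimal sketch (not the authors' code) of the basic Mann-Kendall test without tie or autocorrelation corrections, applied to two trend-free series; the AR(1) parameters are illustrative.

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        """Two-sided Mann-Kendall trend test (no tie/autocorrelation correction)."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        # S statistic: sum of signs over all forward-ordered pairs
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0   # null variance, no ties
        z = 0.0 if s == 0 else (s - np.sign(s)) / np.sqrt(var_s)
        p = 2 * (1 - norm.cdf(abs(z)))
        return z, p

    rng = np.random.default_rng(0)
    iid = rng.normal(size=100)          # trend-free, independent
    ar1 = np.zeros(100)                 # trend-free, but persistent
    for t in range(1, 100):
        ar1[t] = 0.6 * ar1[t - 1] + rng.normal()

    print("iid:", mann_kendall(iid))    # p usually well above 0.05
    print("AR1:", mann_kendall(ar1))    # p often below 0.05: persistence mimics a trend
    ```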

  17. DO PERIODICITIES IN EXTINCTION-WITH POSSIBLE ASTRONOMICAL CONNECTIONS-SURVIVE A REVISION OF THE GEOLOGICAL TIMESCALE?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melott, Adrian L.; Bambach, Richard K.

    A major revision of the geological timescale was published in 2012. We re-examine our past finding of a 27 Myr periodicity in marine extinction rates by re-assigning dates to the extinction data used previously. We find that the spectral power in this period is somewhat increased, and persists at a narrow bandwidth, which supports our previous contention that the Nemesis hypothesis is untenable as an explanation for the periodicity that was first noted by Raup and Sepkoski in the 1980s. We enumerate a number of problems in a recent study comparing extinction rates with time series models.

  18. Do Periodicities in Extinction—with Possible Astronomical Connections—Survive a Revision of the Geological Timescale?

    NASA Astrophysics Data System (ADS)

    Melott, Adrian L.; Bambach, Richard K.

    2013-08-01

    A major revision of the geological timescale was published in 2012. We re-examine our past finding of a 27 Myr periodicity in marine extinction rates by re-assigning dates to the extinction data used previously. We find that the spectral power in this period is somewhat increased, and persists at a narrow bandwidth, which supports our previous contention that the Nemesis hypothesis is untenable as an explanation for the periodicity that was first noted by Raup & Sepkoski in the 1980s. We enumerate a number of problems in a recent study comparing extinction rates with time series models.
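
    The periodicity claim in these two records rests on spectral power at a roughly 27 Myr period in unevenly dated extinction data. As a generic illustration of that kind of analysis (synthetic data, not the Melott and Bambach series), a Lomb-Scargle periodogram handles the uneven time sampling:

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(1)
    ages = np.sort(rng.uniform(0, 500, 60))          # uneven dates, Myr
    cycle = np.sin(2 * np.pi * ages / 27.0)          # planted 27 Myr cycle
    rate = cycle + rng.normal(scale=0.3, size=60)    # plus noise

    periods = np.linspace(10, 100, 2000)             # candidate periods, Myr
    power = lombscargle(ages, rate - rate.mean(), 2 * np.pi / periods,
                        normalize=True)
    print(f"peak period ~ {periods[np.argmax(power)]:.1f} Myr")  # expect ~27
    ```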

  19. Are There Levels of Consciousness?

    PubMed

    Bayne, Tim; Hohwy, Jakob; Owen, Adrian M

    2016-06-01

    The notion of a level of consciousness is a key construct in the science of consciousness. Not only is the term employed to describe the global states of consciousness that are associated with post-comatose disorders, epileptic absence seizures, anaesthesia, and sleep, it plays an increasingly influential role in theoretical and methodological contexts. However, it is far from clear what precisely a level of consciousness is supposed to be. This paper argues that the levels-based framework for conceptualizing global states of consciousness is untenable and develops in its place a multidimensional account of global states. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  20. Recurring errors among recent history of psychology textbooks.

    PubMed

    Thomas, Roger K

    2007-01-01

    Five recurring errors in history of psychology textbooks are discussed. One involves an identical misquotation. The remaining examples involve factual and interpretational errors that more than one and usually several textbook authors made. In at least 2 cases some facts were fabricated, namely, so-called facts associated with Pavlov's mugging and Descartes's reasons for choosing the pineal gland as the locus for mind-body interaction. A fourth example involves Broca's so-called discovery of the speech center, and the fifth example involves misinterpretations of Lloyd Morgan's intentions regarding his famous canon. When an error involves misinterpretation and thus misrepresentation, I will show why the misinterpretation is untenable.

  1. Letter to the editor: On plurality and authorship in science.

    PubMed

    Tang, Bor Luen

    2018-01-01

    Moffatt argues that the "plurality of distinct accounts of scientific authorship" necessitates caution in attempts to identify unethical authorship practices, and urges that consideration be given to establishing a "single consensus account of authorship." The revised International Committee of Medical Journal Editors (ICMJE) criteria do capture the essential features of authorship in terms of "intellectual contribution" and "responsibility and accountability," which would clearly demarcate academically legitimate authorship from the common misdemeanors of ghost writing and honorary authorship. However, plurality in the practice of science and the credit-sharing culture on the ground would likely render universal adoption of, or compliance with, a single consensus account of authorship untenable.

  2. Developing Practice Guidelines for Psychoanalysis

    PubMed Central

    GRAY, SHEILA HAFTER

    1996-01-01

    Consensus-based practice guidelines codify clinical intelligence and the rich oral tradition in medicine. Because they reflect actual practice, they are readily accepted by clinicians as a basis for external review. This article illustrates the development of guidelines for a psychoanalytic approach to the large pool of patients who present with a depression. It suggests an integrated biopsychosocial approach to these individuals that is useful in current practice, and it offers propositions that may be tested in future research undertakings. Eventually, practice guidelines such as these may form the basis of economical systems of health care that avoid arbitrary, clinically untenable limitations on services. PMID:22700290

  3. Berkeley's moral philosophy.

    PubMed Central

    Warnock, G

    1990-01-01

    Berkeley held that the moral duty of mankind was to obey God's laws; that--since God was a benevolent Creator--the object of His laws must be to promote the welfare and flourishing of mankind; and that, accordingly, humans could identify their moral duties by asking what system of laws for conduct would in fact tend to promote that object. This position--which is akin to that of 'rule' Utilitarianism--is neither unfamiliar nor manifestly untenable. He was surely mistaken, however, in his further supposition that, if this theory were accepted, the resolution of all (or most) particular moral dilemmas would be simple and straightforward. PMID:2181141

  4. Respect for cultural diversity in bioethics. Empirical, conceptual and normative constraints.

    PubMed

    Bracanovic, Tomislav

    2011-08-01

    In contemporary debates about the nature of bioethics there is a widespread view that bioethical decision making should involve certain knowledge of and respect for cultural diversity of persons to be affected. The aim of this article is to show that this view is untenable and misleading. It is argued that introducing the idea of respect for cultural diversity into bioethics encounters a series of conceptual and empirical constraints. While acknowledging that cultural diversity is something that decision makers in bioethical contexts should try to understand and, when possible, respect, it is argued that this cultural turn ignores the typically normative role of bioethics and thus threatens to undermine its very foundations.

  5. Crossing the line: the legal and ethical problems of foreign surrogacy.

    PubMed

    Gamble, Natalie

    2009-08-01

    UK law has for many years taken a careful approach to surrogacy, neither banning it nor allowing it to develop unrestrictedly. This careful middle approach seeks to balance permitting what may be a last hope for infertile couples against a wider public policy that bars commercialized reproduction: surrogacy is allowed in the UK, provided it is consensual and involves the payment of no more than reasonable expenses. But in an increasingly globalized world, patients are crossing borders for treatment, often to places where such restrictions on the commerciality or enforceability of surrogacy arrangements do not apply. The resulting conflicts of law can be a minefield, and this makes the maintenance of the UK's careful legal balance increasingly untenable.

  6. No-Fault Malpractice Insurance

    PubMed Central

    Bush, J. W.; Chen, M. M.; Bush, A. S.

    1975-01-01

    No-fault medical malpractice insurance has been proposed as an alternative to the present tort liability approach. Statistical examination of the concept of proximate cause reveals not only that the question of acceptable care, and therefore of fault, is unavoidable in identifying patients deserving compensation, but also that specifying fault in an individual case is scientifically untenable. A simple formula for a Coefficient of Causality clarifies the question of proximate cause in existing trial practices and suggests that many of the threats associated with malpractice suits arise from the structure of the tort-insurance system rather than from professional responsibility for medical injury. The concepts could provide the basis for a revised claims and compensation procedure. PMID:1146300

  7. Acid-base chemistry of frustrated water at protein interfaces.

    PubMed

    Fernández, Ariel

    2016-01-01

    Water molecules at a protein interface are often frustrated in hydrogen-bonding opportunities due to subnanoscale confinement. As shown, this condition makes them behave as a general base that may titrate side-chain ammonium and guanidinium cations. Frustration-based chemistry is captured by a quantum mechanical treatment of proton transference and shown to remove same-charge uncompensated anticontacts at the interface found in the crystallographic record and in other spectroscopic information on the aqueous interface. Such observations are untenable within classical arguments, as hydronium is a stronger acid than ammonium or guanidinium. Frustration enables a directed Grotthuss mechanism for proton transference stabilizing same-charge anticontacts. © 2015 Federation of European Biochemical Societies.

  8. Me and my body: the relevance of the distinction for the difference between withdrawing life support and euthanasia.

    PubMed

    McGee, Andrew

    2011-01-01

    In this paper, I discuss David Shaw's claim that the body of a terminally ill person can be conceived as a kind of life support, akin to an artificial ventilator. I claim that this position rests upon an untenable dualism between the mind and the body. Given that dualism continues to be attractive to some thinkers, I attempt to diagnose the reasons why it continues to be attractive, as well as to demonstrate its incoherence, drawing on some recent work in the philosophy of psychology. I conclude that, if my criticisms are sound, Shaw's attempt to deny the distinction between withdrawal and euthanasia fails. © 2011 American Society of Law, Medicine & Ethics, Inc.

  9. Group competition, reproductive leveling, and the evolution of human altruism.

    PubMed

    Bowles, Samuel

    2006-12-08

    Humans behave altruistically in natural settings and experiments. A possible explanation, that groups with more altruists survive when groups compete, has long been judged untenable on empirical grounds for most species. But there have been no empirical tests of this explanation for humans. My empirical estimates show that genetic differences between early human groups are likely to have been great enough so that lethal intergroup competition could account for the evolution of altruism. Crucial to this process were distinctive human practices such as sharing food beyond the immediate family, monogamy, and other forms of reproductive leveling. These culturally transmitted practices presuppose advanced cognitive and linguistic capacities, possibly accounting for the distinctive forms of altruism found in our species.

  10. Could life have evolved in cometary nuclei

    NASA Technical Reports Server (NTRS)

    Bar-Nun, A.; Lazcano-Araujo, A.; Oro, J.

    1981-01-01

    The suggestion by Hoyle and Wickramasinghe (1978) that life might have originated in cometary nuclei rather than directly on the earth is discussed. Factors in the cometary environment, including the conditions at perihelion passage leading to the ablation of cometary ices, ice temperatures, the absence of an atmosphere and of discrete liquid and solid surfaces, weak cometary structure incapable of supporting a liquid core, and radiation, are presented as arguments against biopoiesis in comets. It is concluded that although the contribution of cometary and meteoritic matter was significant in shaping the earth environment, the view that life on earth originally arose in comets is untenable, and the proposition that the process of interplanetary infection still occurs is unlikely in view of the high specificity of host-parasite relationships.

  11. Reproductive cloning and arguments from potential.

    PubMed

    Oakley, Justin

    2006-01-01

    The possibility of human reproductive cloning has led some bioethicists to suggest that potentiality-based arguments for fetal moral status become untenable, as such arguments would be committed to making the implausible claim that any adult somatic cell is itself a potential person. In this article I defend potentiality-based arguments for fetal moral status against such a reductio. Starting from the widely-held claim that the maintenance of numerical identity throughout successive changes places constraints on what a given entity can plausibly be said to have the potential to become, I argue that the cell reprogramming that takes place in reproductive cloning is such that it produces a new individual, and so adult somatic cells cannot be potential persons.

  12. Elevated blood pressure and personality: a meta-analytic review.

    PubMed

    Jorgensen, R S; Johnson, B T; Kolodziej, M E; Schreer, G E

    1996-09-01

    A meta-analysis of 295 relevant effect sizes obtained from 25,469 participants confirmed expectations that elevated blood pressure (BP) and essential hypertension (EH) would be associated with lower affect expression but with more negative affectivity and defensiveness. The strongest associations occurred for defensiveness and measures of anger and affect expression linked to an interpersonal context(s). However, a number of other factors also were found to moderate associations of BP with personality measures, including awareness of BP status, gender, occupation, and diastolic versus systolic BP assessment. Given these moderators, the authors conclude that a traditional view of personality causing EH is untenable and that not incorporating multifactorial, synergistic approaches is likely to obscure associations of personality-behavior with EH.
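
    For readers unfamiliar with how hundreds of effect sizes are combined into summary associations like these, the sketch below shows the generic fixed-effect, inverse-variance pooling step. This is standard meta-analytic machinery, not necessarily the authors' exact procedure, and the three study values are invented for illustration.

    ```python
    import numpy as np

    def inverse_variance_pool(effects, variances):
        """Fixed-effect pooled effect size and its standard error."""
        w = 1.0 / np.asarray(variances, dtype=float)
        pooled = np.sum(w * effects) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        return pooled, se

    # Illustrative correlations (r) and sampling variances from three studies.
    effects = np.array([0.12, 0.20, 0.05])
    variances = np.array([0.004, 0.010, 0.002])
    pooled, se = inverse_variance_pool(effects, variances)
    print(f"pooled r = {pooled:.3f} +/- {1.96 * se:.3f}")
    ```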

  13. The economics of new drugs: can we afford to make progress in a common disease?

    PubMed

    Hirsch, Bradford R; Schulman, Kevin A

    2013-01-01

    The concept of personalized medicine is beginning to come to fruition, but the cost of drug development is untenable today. To identify new initiatives that would support a more sustainable business model, the economics of drug development are analyzed, including the cost of drug development, cost of capital, target market size, returns to innovators at the product and firm levels, and, finally, product pricing. We argue that a quick fix is not available. Instead, a rethinking of the entire pharmaceutical development process is needed from the way that clinical trials are conducted, to the role of biomarkers in segmenting markets, to the use of grant support, and conditional approval to decrease the cost of capital. In aggregate, the opportunities abound.

  14. Influenza pandemic periodicity, virus recycling, and the art of risk assessment.

    PubMed

    Dowdle, Walter R

    2006-01-01

    Influenza pandemic risk assessment is an uncertain art. The theory that influenza A virus pandemics occur every 10 to 11 years and seroarcheologic evidence of virus recycling set the stage in early 1976 for risk assessment and risk management of the Fort Dix, New Jersey, swine influenza outbreak. Additional data and passage of time proved the theory untenable. Much has been learned about influenza A virus and its natural history since 1976, but the exact conditions that lead to the emergence of a pandemic strain are still unknown. Current avian influenza events parallel those of swine influenza in 1976 but on a larger and more complex scale. Pre- and post-pandemic risk assessment and risk management are continuous but separate public health functions.

  15. Measuring ambivalence to science

    NASA Astrophysics Data System (ADS)

    Gardner, P. L.

    Ambivalence is a psychological state in which a person holds mixed feelings (positive and negative) towards some psychological object. Standard methods of attitude measurement, such as Likert and semantic differential scales, ignore the possibility of ambivalence; ambivalent responses cannot be distinguished from neutral ones. This neglect arises out of an assumption that positive and negative affects towards a particular psychological object are bipolar, i.e., unidimensional in opposite directions. This assumption is frequently untenable. Conventional item statistics and measures of test internal consistency are ineffective as checks on this assumption; it is possible for a scale to be multidimensional and still display apparent internal consistency. Factor analysis is a more effective procedure. Methods of measuring ambivalence are suggested, and implications for research are discussed.
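
    The measurement problem raised here, that bipolar scales conflate ambivalence with neutrality, is commonly handled by rating positive and negative reactions on separate unipolar scales and combining them. One widely used combination from the attitudes literature (not defined in this abstract) is the Griffin similarity-intensity index:

    ```python
    def griffin_ambivalence(positive, negative):
        """Griffin similarity-intensity ambivalence index.

        Requires separate unipolar ratings of positive and negative affect;
        a single bipolar (Likert) score cannot separate ambivalence from
        neutrality, which is the problem the abstract raises.
        """
        return (positive + negative) / 2 - abs(positive - negative)

    print(griffin_ambivalence(5, 5))  #  5.0: strong mixed feelings (ambivalent)
    print(griffin_ambivalence(1, 1))  #  1.0: indifference (neutral, not ambivalent)
    print(griffin_ambivalence(5, 1))  # -1.0: clear one-sided attitude
    ```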

  16. In defence of pedagogy: a critique of the notion of andragogy.

    PubMed

    Darbyshire, P

    1993-10-01

    Malcolm Knowles' theory of andragogy has gained increasing acceptance among nurse educators. Andragogy is espoused as a progressive educational theory, adopted as a theoretical underpinning for curricula and is even considered to be synonymous with a variety of teaching techniques and strategies such as 'problem-based' and 'self-directed' learning. This paper offers a critique of the notion of andragogy which maintains that the distinction created between andragogy and pedagogy is spurious and based upon assumptions which are untenable. It is argued that andragogy has been uncritically accepted within nursing education in much the same way that the nursing process and models of nursing were in their day. Finally, it is claimed that true pedagogy has far more radical, powerful and transformative possibilities for nursing education.

  17. Why Von Neumann interstellar probes could not exist: nonoptical reflections on modern analytic philosophy, bad arguments, and unutilised data.

    NASA Astrophysics Data System (ADS)

    Goodall, Clive

    1993-08-01

    A decisive and lethal response to a naive radical skepticism concerning the prospects for the existence of Extraterrestrial Intelligence is derivable from core areas of Modern Analytic Philosophy. The naive skeptical view is fundamentally flawed in the way it oversimplifies certain complex issues, failing as it does to recognize a special class of conceptual problems for what they really are and mistakenly treating them instead as empirical issues. Specifically, this skepticism is based upon an untenable, oversimplifying model of the 'mind-brain' relation. Moreover, independent logical considerations concerning the mind-brain relation provide evidential grounds for why we should in fact expect a priori that an Alien Intelligence will face constraints upon, and immense difficulties in, making its existence known by non-electromagnetic means.

  18. Tradition and Technology: Sea Ice Science on Inuit Sleds

    NASA Astrophysics Data System (ADS)

    Wilkinson, Jeremy P.; Hanson, Susanne; Hughes, Nick E.; James, Alistair; Jones, Bryn; MacKinnon, Rory; Rysgaard, Søren; Toudal, Leif

    2011-01-01

    The Arctic is home to a circumpolar community of native people whose culture and traditions have enabled them to thrive in what most would perceive as a totally inhospitable and untenable environment. In many ways, sea ice can be viewed as the glue that binds these northern communities together; it is utilized in all aspects of their daily life. Sea ice acts as highways of the north; indeed, one can travel on these highways with dogsleds and snowmobiles. These travels over the frozen ocean occur at all periods of the sea ice cycle and over different ice types and ages. Excursions may be hunting trips to remote regions or social visits to nearby villages. Furthermore, hunting on the sea ice contributes to the health, culture, and commercial income of a community.

  19. Self-interest and the theory of action.

    PubMed

    Barbalet, Jack

    2012-09-01

    The concept of self-interest remains underdeveloped in sociology although central to economics. Recent methodological and social trends render sociological indifference to the concept untenable. The term has enjoyed historical predominance in the West since the sixteenth century. While it is seen in modern economics as a singular motivating force, Adam Smith regarded self-interest in economic action as necessarily moderated by sympathy. In addition to its problematic economic conceptualization self-interest has an experiential basis in unequal power relations. An alternative to the concept of self-interest is presented by Amartya Sen in his account of commitment; its inconsistencies, however, render Sen's statement unsatisfactory. Differences between present and future interests indicate that the distinction between self-interested and other-interested action is not sustainable. © London School of Economics and Political Science 2012.

  20. Lifelong antiretroviral therapy or HIV cure: The benefits for the individual patient.

    PubMed

    Buell, Kevin G; Chung, Christopher; Chaudhry, Zain; Puri, Aiysha; Nawab, Khizr; Ravindran, Rahul Prashanth

    2016-01-01

    There are an estimated 35 million people living with human immunodeficiency virus (HIV) globally, 19 million of whom are unaware of their HIV status and, in the absence of antiretroviral therapy (ART), will have a shortened life expectancy. Although ART remains the "gold standard" for treatment of HIV infection, the requirement for lifelong treatment poses multiple challenges for the patient. These include stigma, an untenable pill burden, side effects, and the threat of viral resistance in the case of non-compliance. This review evaluates the challenges of accessing, delivering, and sustaining ART for people living with HIV and will discuss the case for pursuing a goal of HIV cure, the potential benefits of such a cure for the individual patient, and the current potential candidates for such a cure.

  1. Iconoclast or creed? Objectivism, pragmatism, and the hierarchy of evidence.

    PubMed

    Goldenberg, Maya J

    2009-01-01

    Because "evidence" is at issue in evidence-based medicine (EBM), the critical responses to the movement have taken up themes from post-positivist philosophy of science to demonstrate the untenability of the objectivist account of evidence. While these post-positivist critiques seem largely correct, I propose that when they focus their analyses on what counts as evidence, the critics miss important and desirable pragmatic features of the evidence-based approach. This article redirects critical attention toward EBM's rigid hierarchy of evidence as the culprit of its objectionable epistemic practices. It reframes the EBM discourse in light of a distinction between objectivist and pragmatic epistemology, which allows for a more nuanced analysis of EBM than previously offered: one that is not either/or in its evaluation of the decision-making technology as either iconoclastic or creedal.

  2. Comment on ‘Poynting flux in the neighbourhood of a point charge in arbitrary motion and radiative power losses’

    NASA Astrophysics Data System (ADS)

    Rowland, David R.

    2018-01-01

    Based on a calculation of the Poynting vector flux in the neighbourhood of an accelerating point charge, Singal (2016 Eur. J. Phys. 37 045210) has claimed that the instantaneous rate of energy radiated by the charge differs from the Larmor formula. It is argued in this comment that Singal’s proposed formula for the radiated power is physically untenable because it predicts a negative rate of energy loss in physically realisable situations. The cause of Singal’s erroneous conclusion is identified as being a failure to realise that the bound electromagnetic field energy of an accelerating charge differs by the Schott energy from the bound field energy of a charge moving at a constant velocity equal to the current velocity of the accelerating charge. References to the salient literature are provided.
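
    For reference, the standard textbook quantities at issue can be stated compactly (SI units; sign conventions vary across the literature, and these expressions are a general summary rather than a quotation from either paper):

      \[
        P_{\text{Larmor}} = \frac{q^{2} a^{2}}{6\pi\varepsilon_{0} c^{3}},
        \qquad
        E_{\text{Schott}} = -\,\frac{q^{2}\,(\mathbf{v}\cdot\mathbf{a})}{6\pi\varepsilon_{0} c^{3}},
      \]

    so that the mechanical energy lost by the charge satisfies

      \[
        -\frac{dE_{\text{mech}}}{dt} = P_{\text{Larmor}} + \frac{dE_{\text{Schott}}}{dt},
      \]

    i.e. the bound-field (Schott) term must be separated out before identifying the radiated power with the Larmor formula.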

  3. Stinging ants.

    PubMed

    Rhoades, R

    2001-08-01

    Ants belong to the order Hymenoptera, along with bees, wasps, and yellow jackets; they are among the most successful animal genera in the world. It is their selfless social structure that accounts for their huge impact. Their effect on man ranges from the parasol ant, which makes plant cultivation untenable in certain parts of South America, to Solenopsis invicta in the southeastern United States of America, which kills ground-dwelling birds and small animals, harasses livestock, and renders farmland unusable. With the exception of the bulldog ant of Australia (which is the size of a medium cockroach), direct toxic effects are not a lethal threat to man. Human fatalities and morbidity are related to secondary infections of excoriated stings or allergic anaphylaxis. This article reviews the history and recent developments regarding stinging ants around the world.

  4. The Limits of Surrogates' Moral Authority and Physician Professionalism: Can the Paradigm of Palliative Sedation Be Instructive?

    PubMed

    Berger, Jeffrey T

    2017-01-01

    With narrow exception, physicians' treatment of incapacitated patients requires the consent of health surrogates. Although the decision-making authority of surrogates is appropriately broad, their moral authority is not without limits. Discerning these bounds is particularly germane to ethically complex treatments and has important implications for the welfare of patients, for the professional integrity of clinicians, and, in fact, for the welfare of surrogates. Palliative sedation is one such complex treatment; as such, it provides a valuable model for analyzing the scope of surrogates' moral authority. Guidelines for palliative sedation that present it as a "last-resort" treatment for severe and intractable suffering yet require surrogate consent in order to offer it are ethically untenable, precisely because the moral limits of surrogate authority have not been considered. © 2017 The Hastings Center.

  5. Claims and Identity: On-Premise and Cloud Solutions

    NASA Astrophysics Data System (ADS)

    Bertocci, Vittorio

    Today's identity-management practices are often a patchwork of partial solutions, which somehow accommodate but never really integrate applications and entities separated by technology and organizational boundaries. The rise of Software as a Service (SaaS) and cloud computing, however, will force organizations to cross such boundaries so often that ad hoc solutions will simply be untenable. A new approach that tears down identity silos and supports a de-perimeterized IT by design is in order. This article will walk you through the principles of claims-based identity management, a model which addresses both traditional and cloud scenarios with the same efficacy. We will explore the most common token exchange patterns, highlighting the advantages and opportunities they offer when applied to cloud computing solutions and generic distributed systems.
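
    As a minimal sketch of the claims-based model described above (all names, the token format, and the issuer URL below are invented for illustration; production systems use signed standards such as SAML or JWT rather than a bare data structure):

      import time
      from dataclasses import dataclass, field

      # Hypothetical, simplified claims token; a real token would be
      # cryptographically signed by the security token service (STS).
      @dataclass
      class ClaimsToken:
          issuer: str                      # STS that vouched for the user
          expires_at: float                # UNIX timestamp
          claims: dict = field(default_factory=dict)

      TRUSTED_ISSUERS = {"https://sts.contoso.example"}   # invented issuer URL

      def authorize(token: ClaimsToken, required_claim: str, value: str) -> bool:
          """The relying party never authenticates the user directly: it only
          checks who issued the claims and whether they satisfy local policy."""
          if token.issuer not in TRUSTED_ISSUERS:
              return False                 # unknown issuer, no trust chain
          if time.time() >= token.expires_at:
              return False                 # stale token
          return token.claims.get(required_claim) == value

      token = ClaimsToken(
          issuer="https://sts.contoso.example",
          expires_at=time.time() + 3600,
          claims={"role": "purchaser", "name": "alice"},
      )
      print(authorize(token, "role", "purchaser"))   # True

    The design point is that authorization decisions flow purely from issuer trust and claim contents, so the same relying-party code works whether the issuer sits on-premise or in the cloud.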

  6. Growth and patterning in the conodont skeleton

    PubMed Central

    Donoghue, P. C. J.

    1998-01-01

    Recent advances in our understanding of conodont palaeobiology and functional morphology have rendered established hypotheses of element growth untenable. In order to address this problem, hard tissue histology is reviewed, paying particular attention to the relationships during growth of the component hard tissues comprising conodont elements, and ignoring a priori assumptions of the homologies of these tissues. Conodont element growth is considered further in terms of the pattern of formation, of which four distinct types are described, all possibly derived from a primitive condition after heterochronic changes in the timing of various developmental stages. It is hoped that this may provide further means of unravelling conodont phylogeny. The manner in which the tissues grew is considered homologous with other vertebrate hard tissues, and the elements appear to have grown in a way similar to the growing scales and growing dentition of other vertebrates.

  7. Nutritional and other types of oedema, albumin, complex carbohydrates and the interstitium - a response to Malcolm Coulthard's hypothesis: Oedema in kwashiorkor is caused by hypo-albuminaemia.

    PubMed

    Golden, Michael Henry

    2015-05-01

    The various types of oedema in man are considered in relation to Starling's hypothesis of fluid movement from capillaries, with the main emphasis on nutritional oedema and the nephrotic syndrome in children. It is concluded that each condition has sufficient anomalous findings to render Starling's hypothesis untenable. The finding that the endothelial glycocalyx is key to control of fluid movement from and into the capillaries calls for complete revision of our understanding of oedema formation. The factors so far known to affect the function of the glycocalyx are reviewed. As these depend upon sulphated proteoglycans and other glycosaminoglycans, the argument is advanced that the same abnormalities will extend to the interstitial space and that kwashiorkor is fundamentally related to a defect in sulphur metabolism which can explain all the clinical features of the condition, including the formation of oedema.

  8. Emergency canine surgery in a deployed forward surgical team: a case report.

    PubMed

    Beitler, Alan L; Jeanette, Joseph P; McGraw, Andrew L; Butera, Jennifer R; Vanfosson, Christopher A; Seery, Jason M

    2011-04-01

    Forward surgical teams (FSTs) perform a variety of non-doctrinal functions. During their deployment to Afghanistan, the 541st FST (Airborne) performed emergency surgery on a German shepherd military working dog (MWD). We present a retrospective examination of a case of veterinary surgery in a deployed FST. A 5 1/2-year-old German shepherd MWD presented with extreme lethargy, tachycardia, excessive drooling, and a firm, distended abdomen. These conditions resulted from gastric dilatation with volvulus. Since evacuation to a veterinarian was untenable, emergency laparotomy was performed in the FST. The gastric dilatation with volvulus was treated by detorsion and gastropexy, and the canine patient fully recovered. Canine surgery can be safely performed in an FST. Based on the number of MWDs deployed throughout the theater, FSTs may be called upon to care for them in the absence of available veterinary care.

  9. On the alleged collisional origin of the Kirkwood Gaps. [in asteroid belt

    NASA Technical Reports Server (NTRS)

    Heppenheimer, T. A.

    1975-01-01

    This paper examines two proposed mechanisms whereby asteroidal collisions and close approaches may have given rise to the Kirkwood Gaps. The first hypothesis is that asteroids in near-resonant orbits have markedly increased collision probabilities and so are preferentially destroyed, or suffer decay in population density, within the resonance zones. A simple order-of-magnitude analysis shows that this hypothesis is untenable since it leads to conclusions which are either unrealistic or not in accord with present understanding of asteroidal physics. The second hypothesis is the Brouwer-Jefferys theory that collisions would smooth an asteroidal distribution function, as a function of Jacobi constant, thus forming resonance gaps. This hypothesis is examined by direct numerical integration of 50 asteroid orbits near the 2:1 resonance, with collisions simulated by random variables. No tendency to form a gap was observed.

  10. Clinical research with economically disadvantaged populations

    PubMed Central

    Denny, Colleen C; Grady, Christine

    2007-01-01

    Concerns about exploiting the poor or economically disadvantaged in clinical research are widespread in the bioethics community. For some, any research that involves economically disadvantaged individuals is de facto ethically problematic. The economically disadvantaged are thought of as "vulnerable" to exploitation, impaired decision making, or both, thus requiring either special protections or complete exclusion from research. A closer examination of the worries about vulnerabilities among the economically disadvantaged reveals that some of these worries are empirically or logically untenable, while others can be better resolved by improved study designs than by blanket exclusion of poorer individuals from research participation. The scientific objective to generate generalisable results and the ethical objective to fairly distribute both the risks and benefits of research oblige researchers not to unnecessarily bar economically disadvantaged subjects from clinical research participation. PMID:17601862

  11. Dissonance and importance: attitude change effects of personal relevance and race of the beneficiary of a counterattitudinal advocacy.

    PubMed

    Eisenstadt, Donna; Leippe, Michael R

    2005-08-01

    The authors asked or instructed White college students to write an essay advocating a large tuition hike to increase scholarships for either students in general or Black students (yielding low or high racial symbolism, respectively) that would take effect in the near or far future (yielding high or low personal relevance, respectively). Especially when high-choice participants wrote highly compliant (i.e., unqualified) essays, attitude change was greater when the advocacy had either high (vs. low) personal relevance or high (vs. low) racial symbolism. Yet there was no attitude change when both symbolism and relevance were high. Personal relevance and racial symbolism may each increase dissonance by making the dissonant elements more important and numerous. The coupling of relevance and symbolism, however, may link the attitude so strongly to personal values and self-concept that attitude change becomes untenable as a mode of dissonance reduction.

  12. Un-renormalized classical electromagnetism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibison, Michael

    2006-02-15

    This paper follows in the tradition of direct-action versions of electromagnetism having the aim of avoiding a balance of infinities wherein a mechanical mass offsets an infinite electromagnetic mass so as to arrive at a finite observed value. However, the direct-action approach ultimately failed in that respect because its initial exclusion of self-action was later found to be untenable in the relativistic domain. Pursuing the same end, this paper examines instead a version of electromagnetism wherein mechanical action is excluded and self-action is retained. It is shown that the resulting theory is effectively interacting due to the presence of infinite forces. A vehicle for the investigation is a pair of classical point charges in a positronium-like arrangement for which the orbits are found to be self-sustaining and naturally quantized.

  13. White dwarfs, the Galaxy and Dirac's cosmology

    NASA Technical Reports Server (NTRS)

    Stothers, R.

    1976-01-01

    The additive and multiplicative versions of Dirac's cosmological hypothesis, which relate the variation of the gravitational constant to elapsed time and to the number of particles populating the universe, are invoked to account for the deficiency or absence of white dwarfs fainter than about 0.0001 solar luminosity. An estimate is made of white dwarf luminosity in accordance with the two evolutionary models, and it is conjectured that some old white dwarfs with high space velocities may be on the verge of gravitational collapse. The lack of a special mechanism to produce the vast numbers of black holes or other dead stars accounting for the 'missing matter' in the vicinity of the Sun and in the galactic halo is noted in Dirac's multiplicative model. Results indicate either that Dirac's theory is untenable, or that radiation and heating are of some unknown nature, or that the process of creation of new matter requires a corresponding input of energy.
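
    For context, the two versions referred to are conventionally summarized as follows (a standard statement of Dirac's Large Numbers Hypothesis, not drawn from this paper): the gravitational constant decays with cosmic epoch while the particle number grows,

      \[
        G \propto t^{-1}, \qquad N \propto t^{2},
      \]

    with 'additive' creation adding new particles uniformly throughout space and 'multiplicative' creation adding them in proportion to the matter already present.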

  14. Reassessing the possibility of life on Venus: proposal for an astrobiology mission.

    PubMed

    Schulze-Makuch, Dirk; Irwin, Louis N

    2002-01-01

    With their similar size, chemical composition, and distance from the Sun, Venus and Earth may have shared a similar early history. Though surface conditions on Venus are now too extreme for life as we know it, it likely had abundant water and favorable conditions for life when the Sun was fainter early in the Solar System. Given the persistence of life under stabilizing selection in static environments, it is possible that life could exist in restricted environmental niches, where it may have retreated after conditions on the surface became untenable. High-pressure subsurface habitats with water in the supercritical liquid state could be a potential refugium, as could be the zone of dense cloud cover where thermoacidophilic life might have retreated. Technology based on the Stardust Mission to collect comet particles could readily be adapted for a pass through the appropriate cloud layer for sample collection and return to Earth.

  15. Reassessing the Possibility of Life on Venus: Proposal for an Astrobiology Mission

    NASA Astrophysics Data System (ADS)

    Schulze-Makuch, Dirk; Irwin, Louis N.

    2002-06-01

    With their similar size, chemical composition, and distance from the Sun, Venus and Earth may have shared a similar early history. Though surface conditions on Venus are now too extreme for life as we know it, it likely had abundant water and favorable conditions for life when the Sun was fainter early in the Solar System. Given the persistence of life under stabilizing selection in static environments, it is possible that life could exist in restricted environmental niches, where it may have retreated after conditions on the surface became untenable. High-pressure subsurface habitats with water in the supercritical liquid state could be a potential refugium, as could be the zone of dense cloud cover where thermoacidophilic life might have retreated. Technology based on the Stardust Mission to collect comet particles could readily be adapted for a pass through the appropriate cloud layer for sample collection and return to Earth.

  16. Limitless capacity: a dynamic object-oriented approach to short-term memory.

    PubMed

    Macken, Bill; Taylor, John; Jones, Dylan

    2015-01-01

    The notion of capacity-limited processing systems is a core element of cognitive accounts of limited and variable performance, enshrined within the short-term memory construct. We begin with a detailed critical analysis of the conceptual bases of this view and argue that there are fundamental problems - ones that go to the heart of cognitivism more generally - that render it untenable. In place of limited capacity systems, we propose a framework for explaining performance that focuses on the dynamic interplay of three aspects of any given setting: the particular task that must be accomplished, the nature and form of the material upon which the task must be performed, and the repertoire of skills and perceptual-motor functions possessed by the participant. We provide empirical examples of the applications of this framework in areas of performance typically accounted for by reference to capacity-limited short-term memory processes.

  17. An optical Fourier transform coprocessor with direct phase determination.

    PubMed

    Macfaden, Alexander J; Gordon, George S D; Wilkinson, Timothy D

    2017-10-20

    The Fourier transform is a ubiquitous mathematical operation which arises naturally in optics. We propose and demonstrate a practical method to optically evaluate a complex-to-complex discrete Fourier transform. By implementing the Fourier transform optically we can overcome the limiting O(n log n) complexity of fast Fourier transform algorithms. Efficiently extracting the phase from the well-known optical Fourier transform is challenging. By appropriately decomposing the input and exploiting symmetries of the Fourier transform we are able to determine the phase directly from straightforward intensity measurements, creating an optical Fourier transform with O(n) apparent complexity. Performing larger optical Fourier transforms requires higher resolution spatial light modulators, but the execution time remains unchanged. This method could unlock the potential of the optical Fourier transform to permit 2D complex-to-complex discrete Fourier transforms with a performance that is currently untenable, with applications across information processing and computational physics.
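
    The general principle, recovering a complex spectrum from intensity-only detection by interfering the input with a known reference, can be sketched numerically. This toy uses a digital FFT as a stand-in for the optical transform; the three-measurement scheme below is one standard reference-interference construction, not necessarily the authors' exact decomposition:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 8
      x = rng.normal(size=n) + 1j * rng.normal(size=n)  # unknown complex input
      r = np.zeros(n); r[0] = 1.0          # known reference (delta -> flat spectrum)

      F = np.fft.fft                       # stand-in for the optical Fourier transform
      Fx, Fr = F(x), F(r)

      # Three intensity-only "camera" measurements:
      I0 = np.abs(Fx) ** 2                 # input alone
      I1 = np.abs(F(x + r)) ** 2           # input + reference
      I2 = np.abs(F(x + 1j * r)) ** 2      # input + 90-degree-shifted reference

      # |F(x)+F(r)|^2 = |Fx|^2 + |Fr|^2 + 2*Re(Fx*conj(Fr)); the i*r
      # measurement isolates the imaginary part the same way.
      re = (I1 - I0 - np.abs(Fr) ** 2) / 2       # Re(Fx * conj(Fr))
      im = (I2 - I0 - np.abs(Fr) ** 2) / 2       # Im(Fx * conj(Fr))
      Fx_rec = (re + 1j * im) / np.conj(Fr)      # valid wherever Fr != 0

      assert np.allclose(Fx_rec, Fx)             # full complex spectrum recovered

    Each extra measurement is a constant factor, not a factor of n, which is how intensity detection can preserve the O(n) scaling claimed for the optical approach.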

  18. Electronic structure of the chiral helimagnet and 3d-intercalated transition metal dichalcogenide Cr1/3NbS2

    DOE PAGES

    Sirca, N.; Mo, S. -K.; Bondino, F.; ...

    2016-08-18

    The electronic structure of the chiral helimagnet Cr1/3NbS2 has been studied with core level and angle-resolved photoemission spectroscopy (ARPES). Intercalated Cr atoms are found to be effective in donating electrons to the NbS2 layers but also cause significant modifications of the electronic structure of the host NbS2 material. Specifically, the data provide evidence that a description of the electronic structure of Cr1/3NbS2 on the basis of a simple rigid band picture is untenable. The data also reveal substantial inconsistencies with the predictions of standard density functional theory. In conclusion, the relevance of these results to the attainment of a correct description of the electronic structure of chiral helimagnets, magnetic thin films/multilayers, and transition metal dichalcogenides intercalated with 3d magnetic elements is discussed.

  19. [Science and ethics].

    PubMed

    Harrer, Friedrich

    2002-01-01

    In current ethics debates, the protection of animals and the environment are central themes. In this context, reference is made to "ethics of responsibility". In the reality of scientific life and the conduct of science, "ethics of responsibility" have only had a modest impact. Reflection based on the history of thought may explain this finding. Upon abolishing God, man has redirected an initially religious love and devotion toward himself. The postulate of unfettered dominion over the Earth has become the supreme principle. "Theoria" in the sense of its original meaning, that is a celebratory admiration for sacred Nature, is incomprehensible to modern man. Nature has been stripped of its sanctity and transformed into something to be exploited for its human utility. Consequently, today the credo applies that to interpret the freedom of science according to ethical principles is utterly untenable. The author pleads for reconsideration.

  20. The diffusion of maize to the southwestern United States and its impact

    PubMed Central

    Merrill, William L.; Hard, Robert J.; Mabry, Jonathan B.; Fritz, Gayle J.; Adams, Karen R.; Roney, John R.; MacWilliams, A. C.

    2009-01-01

    Our understanding of the initial period of agriculture in the southwestern United States has been transformed by recent discoveries that establish the presence of maize there by 2100 cal. B.C. (calibrated calendrical years before the Christian era) and document the processes by which it was integrated into local foraging economies. Here we review archaeological, paleoecological, linguistic, and genetic data to evaluate the hypothesis that Proto-Uto-Aztecan (PUA) farmers migrating from a homeland in Mesoamerica introduced maize agriculture to the region. We conclude that this hypothesis is untenable and that the available data indicate instead a Great Basin homeland for the PUA, the breakup of this speech community into northern and southern divisions ≈6900 cal. B.C. and the dispersal of maize agriculture from Mesoamerica to the US Southwest via group-to-group diffusion across a Southern Uto-Aztecan linguistic continuum. PMID:19995985

  1. Bistatic LIDAR experiment proposed for the shuttle/tethered satellite system missions

    NASA Technical Reports Server (NTRS)

    Mccomas, D. J.; Spense, H. E.; Karl, R. R.; Horak, H. G.; Wilkerson, T. D.

    1986-01-01

    A new experiment concept has been proposed for the shuttle/tethered satellite system missions, which can provide high resolution, global density mappings of certain ionospheric species. The technique utilizes bistatic LIDAR to take advantage of the unique dual platform configuration offered by these missions. A tuned, shuttle-based laser is used to excite a column of the atmosphere adjacent to the tethered satellite, while triangulating photometric detectors on the satellite are employed to measure the fluorescence from sections of the column. The fluorescent intensity at the detectors is increased by about six orders of magnitude over both ground-based and monostatic shuttle-based LIDAR sounding of the same region. In addition, the orbital motion of the Shuttle provides for quasi-global mapping unattainable with ground-based observations. Since this technique provides such vastly improved resolution on a synoptic scale, many important middle atmospheric studies, heretofore untenable, may soon be addressed.

  2. Design specification of an acousto-optic spectrum analyzer that could be used as an auxiliary receiver for CANEWS

    NASA Astrophysics Data System (ADS)

    Studenny, John; Johnstone, Eric

    1991-01-01

    The acousto-optic spectrum analyzer has undergone a theoretical design review and a basic parameter tradeoff analysis has been performed. The main conclusion is that for the given scenario of a 55 dB dynamic range and a one-second temporal resolution, a 3.9 MHz resolution is a reasonable compromise with respect to current technology. Additional configurations are suggested. Noise testing of the signal detection processor algorithm was conducted: additive white Gaussian noise was introduced to pure data. As expected, the tradeoff was between algorithm sensitivity and false alarms. No additional algorithm improvements could be made. The algorithm was observed to be robust, provided that the noise floor was set at a proper level. The digitization scheme was mainly driven by hardware constraints. To implement an analog-to-digital conversion scheme that linearly covers a 55 dB dynamic range would require a minimum of 17 bits. The general consensus was that 17 bits would be untenable for very large scale integration.

  3. Coming to Grips with Radical Social Constructivisms

    NASA Astrophysics Data System (ADS)

    Phillips, D. C.

    This essay distinguishes two broad groups - psychological constructivists and social constructivists - but focuses upon the second of these, although it is stressed that there is great within-group variation. More than half of the paper is devoted to a general clearing of the ground, during which the reasons for the growing acrimony in the debates between social constructivists and their opponents are assessed, an important consequence of these debates for education is discussed, and an examination is carried out of the radical social constructivist tendency to make strong and exciting but untenable claims which are then backed away from (a tendency which is documented by a close reading of the early pages of Bloor's classic book). The last portion of the essay focuses upon social constructivist accounts of the causes of belief in science - the more radical of which denigrate the role of warranting reasons, and which give an exalted place to quasi-anthropological or sociological studies of scientific communities.

  4. We still fail to account for Mendel's observations.

    PubMed

    Porteous, John W

    2004-08-16

    The present article corrects common textbook accounts of Mendel's experiments by re-establishing what he wrote and how he accounted for his observations. It notes the long-established tests for the validity of any explanations that purport to explain observations obtained by experiment. Application of these tests to Mendel's paper shows that the arguments he used to explain his observations were internally consistent but were, on one crucial issue, implausible. The same tests are applied to the currently accepted explanation for Mendel's observations. The currently favoured explanation for Mendel's observations is untenable. It misrepresents Mendel, fails to distinguish between the parameters and the variables of any system of interacting components, its arguments are inconsistent, it repeats the implausibility in Mendel's paper, fails to give a rational explanation for his observed 3:1 trait ratio and cannot explain why this ratio is not always observed in experimental practice. A rational explanation for Mendel's observations is initiated. Readers are challenged to complete the process before a further article appears.

  5. Between-litter variation in developmental studies of hormones and behavior: Inflated false positives and diminished power.

    PubMed

    Williams, Donald R; Carlsson, Rickard; Bürkner, Paul-Christian

    2017-10-01

    Developmental studies of hormones and behavior often include littermates (rodent siblings that share early-life experiences and genes). Due to between-litter variation (i.e., litter effects), the statistical assumption of independent observations is untenable. In two literatures (natural variation in maternal care and prenatal stress), entire litters are categorized based on maternal behavior or experimental condition. Here, we (1) review both literatures; (2) simulate false positive rates for commonly used statistical methods in each literature; and (3) characterize the small-sample performance of multilevel models (MLM) and generalized estimating equations (GEE). We found that the assumption of independence was routinely violated (>85%), false positives (α=0.05) exceeded nominal levels (up to 0.70), and power (1-β) rarely surpassed 0.80 (even for optimistic sample and effect sizes). Additionally, we show that MLMs and GEEs have adequate performance for common research designs. We discuss implications for the extant literature and the field of behavioral neuroendocrinology, and provide recommendations. Copyright © 2017 Elsevier Inc. All rights reserved.
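
    To make the issue concrete, here is a minimal simulation sketch (all sample sizes and variance parameters are invented) in which entire litters are assigned to a condition and pups share a litter-level random effect; ignoring the clustering is anti-conservative, while a random intercept per litter restores valid inference:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(1)
      n_litters, pups = 10, 8
      litter = np.repeat(np.arange(n_litters), pups)
      # Whole litters are assigned to condition 0 or 1 (litter-level treatment).
      cond = np.repeat(rng.permutation([0, 1] * (n_litters // 2)), pups)
      # Shared litter effect violates independence of pup-level observations.
      litter_effect = rng.normal(0.0, 1.0, n_litters)[litter]
      y = 0.0 * cond + litter_effect + rng.normal(0.0, 1.0, litter.size)  # true effect = 0

      df = pd.DataFrame({"y": y, "cond": cond, "litter": litter})

      # Naive model: treats all pups as independent (inflated false positives).
      print("OLS p:", smf.ols("y ~ cond", df).fit().pvalues["cond"])

      # Multilevel model: random intercept per litter absorbs the clustering.
      print("MLM p:", smf.mixedlm("y ~ cond", df, groups=df["litter"]).fit().pvalues["cond"])

    Repeating the simulation many times and counting p < 0.05 reproduces the false-positive inflation the authors report for the naive analysis.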

  6. Modeling the geographic distribution of Bacillus anthracis, the causative agent of anthrax disease, for the contiguous United States using predictive ecological [corrected] niche modeling.

    PubMed

    Blackburn, Jason K; McNyset, Kristina M; Curtis, Andrew; Hugh-Jones, Martin E

    2007-12-01

    The ecology and distribution of Bacillus anthracis is poorly understood despite continued anthrax outbreaks in wildlife and livestock throughout the United States. Little work is available to define the potential environments that may lead to prolonged spore survival and subsequent outbreaks. This study used the Genetic Algorithm for Rule-set Prediction (GARP) modeling system to model the ecological niche for B. anthracis in the contiguous United States using wildlife and livestock outbreaks and several environmental variables. The modeled niche is defined by a narrow range of normalized difference vegetation index, precipitation, and elevation, with the geographic distribution heavily concentrated in a narrow corridor from southwest Texas northward into the Dakotas and Minnesota. Because disease control programs rely on vaccination and carcass disposal, and vaccination in wildlife remains untenable, understanding the distribution of B. anthracis plays an important role in efforts to prevent/eradicate the disease. Likewise, these results potentially aid in differentiating endemic/natural outbreaks from industrial-contamination-related outbreaks or bioterrorist attacks.
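
    GARP itself evolves rule sets with a genetic algorithm; as a deliberately simplified stand-in (synthetic data and invented thresholds throughout), a climate-envelope rule conveys the flavor of the prediction: presence wherever NDVI, precipitation, and elevation all fall inside ranges derived from known outbreak sites.

      import numpy as np

      rng = np.random.default_rng(42)

      # Synthetic outbreak sites: columns are NDVI, precipitation (mm), elevation (m).
      outbreaks = np.column_stack([
          rng.normal(0.45, 0.03, 60),      # narrow NDVI band
          rng.normal(500.0, 40.0, 60),     # narrow precipitation band
          rng.normal(800.0, 120.0, 60),    # narrow elevation band
      ])

      # Envelope rule: the 5th-95th percentile box of the presence points.
      lo, hi = np.percentile(outbreaks, [5, 95], axis=0)

      def predict_presence(cells: np.ndarray) -> np.ndarray:
          """1 where every environmental covariate lies inside the envelope."""
          return np.all((cells >= lo) & (cells <= hi), axis=1).astype(int)

      # Score a few candidate grid cells.
      cells = np.array([
          [0.46, 510.0, 790.0],   # inside the niche envelope
          [0.20, 900.0, 50.0],    # well outside it
      ])
      print(predict_presence(cells))   # [1 0]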

  7. What is a pathogen? Toward a process view of host-parasite interactions

    PubMed Central

    Méthot, Pierre-Olivier; Alizon, Samuel

    2014-01-01

    Until quite recently and since the late 19th century, medical microbiology has been based on the assumption that some micro-organisms are pathogens and others are not. This binary view is now strongly criticized and is even becoming untenable. We first provide a historical overview of the changing nature of host-parasite interactions, in which we argue that large-scale sequencing not only shows that identifying the roots of pathogenesis is much more complicated than previously thought, but also forces us to reconsider what a pathogen is. To address the challenge of defining a pathogen in post-genomic science, we present and discuss recent results that embrace the microbial genetic diversity (both within- and between-host) and underline the relevance of microbial ecology and evolution. By analyzing and extending earlier work on the concept of pathogen, we propose pathogenicity (or virulence) should be viewed as a dynamical feature of an interaction between a host and microbes. PMID:25483864

  8. Ordinary Dark Matter versus Mysterious Dark Matter in Galactic Rotation

    NASA Astrophysics Data System (ADS)

    Gallo, C. F.; Feng, James

    2008-04-01

    To theoretically describe the measured rotational velocity curves of spiral galaxies, there are two different approaches and conclusions. (1) Ordinary dark matter: assuming Newtonian gravity/dynamics, we successfully find (via computer) mass distributions in bulge/disk configurations that duplicate the measured rotational velocities. There is ordinary dark matter within the galactic disk towards the cooler periphery, which has lower emissivity/opacity. There are no mysteries in this scenario based on verified physics. (2) Mysterious dark matter: others inaccurately assume that the galactic mass distributions follow the measured light distributions, and the measured rotational velocity curves are then not duplicated. To alleviate this discrepancy, speculations are invoked regarding 'massive peripheral spherical halos of mysterious dark matter'. But no matter has been detected in this untenable halo configuration, and many unverified 'mysteries' are invoked as necessary and convenient. Conclusion: the first approach, utilizing Newtonian gravity/dynamics and searching for the ordinary mass distributions within the galactic disk, simulates reality and agrees with data.

  9. HIV-1 Latency: An Update of Molecular Mechanisms and Therapeutic Strategies

    PubMed Central

    Battistini, Angela; Sgarbanti, Marco

    2014-01-01

    The major obstacle towards HIV-1 eradication is the life-long persistence of the virus in reservoirs of latently infected cells. In these cells the proviral DNA is integrated in the host’s genome but does not actively replicate, becoming invisible to the host immune system and unaffected by existing antiviral drugs. The rebound of viremia and recovery of systemic infection that follow interruption of therapy necessitate life-long treatment, with problems of compliance, toxicity, and untenable costs, especially in developing countries where the infection hits worst. Extensive research efforts have led to the proposal and preliminary testing of several anti-latency compounds; however, eradication strategies have so far had limited clinical success while posing several risks for patients. This review will briefly summarize the more recent advances in the elucidation of mechanisms that regulate the establishment/maintenance of latency, and therapeutic strategies currently under evaluation in order to eradicate HIV persistence. PMID:24736215

  10. Finding an appropriate ethic in a world of moral acquaintances.

    PubMed

    Loewy, E H

    1997-01-01

    This paper discusses the possibility of finding an ethic of at least partial, and perhaps ever-growing, content in a world that is neither one of moral strangers (where nothing unites us except our desire to live freely) nor one of moral friends (in which values, goals, and ways of doing things are held in common). I argue that both the world of moral strangers that Engelhardt's world view would support and the world of moral friends that Pellegrino seeks are untenable, and furthermore that both can lead to a similar state of affairs. I suggest instead a dynamic world of moral acquaintances in which different belief systems and ways of doing things can come to broad agreement about some essential things. This is made possible because, although we do not share the intimate framework Pellegrino might suggest, we are united by a much broader framework than the one moral strangers share.

  11. A theory of international bioethics: multiculturalism, postmodernism, and the bankruptcy of fundamentalism.

    PubMed

    Baker, Robert

    1998-09-01

    The first of two articles analyzing the justifiability of international bioethical codes and of cross-cultural moral judgments reviews "moral fundamentalism," the theory that cross-cultural moral judgments and international bioethical codes are justified by certain "basic" or "fundamental" moral principles that are universally accepted in all cultures and eras. Initially propounded by the judges at the 1947 Nuremberg Tribunal, moral fundamentalism has become the received justification of international bioethics, and of cross-temporal and cross-cultural moral judgments. Yet today we are said to live in a multicultural and postmodern world. This article assesses the challenges that multiculturalism and postmodernism pose to fundamentalism and concludes that these challenges render the position philosophically untenable, thereby undermining the received conception of the foundations of international bioethics. The second article, which follows, offers an alternative model -- a model of negotiated moral order -- as a viable justification for international bioethics and for transcultural and transtemporal moral judgments.

  12. Study on improving the turbidity measurement of the absolute coagulation rate constant.

    PubMed

    Sun, Zhiwei; Liu, Jie; Xu, Shenghua

    2006-05-23

    The existing theories dealing with the evaluation of the absolute coagulation rate constant by turbidity measurement were experimentally tested for suspensions of different particle sizes (radius a) at incident wavelengths (λ) ranging from near-infrared to ultraviolet light. When the size parameter α = 2πa/λ > 3, the rate constant data from previous theories for fixed-sized particles show significant inconsistencies at different light wavelengths. We attribute this problem to the imperfection of these theories in describing the light scattering from doublets through their evaluation of the extinction cross section. The evaluations of the rate constants by all previous theories become untenable as the size parameter increases, which limits the applicable range of the turbidity measurement. By using the T-matrix method, we present a robust solution for evaluating the extinction cross section of doublets formed in the aggregation. Our experiments show that this new approach is effective in extending the applicability range of the turbidity methodology and increasing measurement accuracy.
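
    For orientation, a standard simplified relation behind the method (notation introduced here, assuming early-stage Smoluchowski aggregation; not quoted from the paper): with singlet and doublet extinction cross sections C_1 and C_2 and initial singlet number density N_0, the turbidity and its initial slope are

      \[
        \tau = \sum_i N_i C_i, \qquad
        \left.\frac{d\tau}{dt}\right|_{t \to 0} = k\,N_0^{2}\left(\tfrac{1}{2}\,C_2 - C_1\right),
      \]

    so the rate constant k follows from the measured initial slope once the cross sections are evaluated; it is the doublet cross section C_2, increasingly sensitive to the scattering model as α grows, that the T-matrix treatment improves.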

  13. How emotion shapes behavior: feedback, anticipation, and reflection, rather than direct causation.

    PubMed

    Baumeister, Roy F; Vohs, Kathleen D; DeWall, C Nathan; Zhang, Liqing

    2007-05-01

    Fear causes fleeing and thereby saves lives: this exemplifies a popular and common sense but increasingly untenable view that the direct causation of behavior is the primary function of emotion. Instead, the authors develop a theory of emotion as a feedback system whose influence on behavior is typically indirect. By providing feedback and stimulating retrospective appraisal of actions, conscious emotional states can promote learning and alter guidelines for future behavior. Behavior may also be chosen to pursue (or avoid) anticipated emotional outcomes. Rapid, automatic affective responses, in contrast to the full-blown conscious emotions, may inform cognition and behavioral choice and thereby help guide current behavior. The automatic affective responses may also remind the person of past emotional outcomes and provide useful guides as to what emotional outcomes may be anticipated in the present. To justify replacing the direct causation model with the feedback model, the authors review a large body of empirical findings.

  14. Soul man meets the blind watchmaker: C.G. Jung and neo-Darwinism.

    PubMed

    Pietikainen, Petteri

    2003-01-01

    C.G. Jung's name has recently been connected with neo-Darwinian theories. One major reason for this connection is that Jungian psychology is based on the suggestion that there exists a universal structure of the mind that has its own evolutionary history. On this crucial point, Jungians and neo-Darwinian evolutionary psychologists agree. However, it will be argued in this paper that, although Jungian psychology opposes the "tabula rasa" doctrine (mind as a blank slate), Jung cannot be regarded as the founding father of evolutionary psychology. From the scientific perspective, Jung's biological assumptions are simply untenable and have been for many decades. In his attempt to fuse biology, spirit, and the unconscious, Jung ended in speculative flights of imagination that bear no resemblance to modern neo-Darwinian theories. The premise of the paper is that, when Jungian psychology is presented to us as a scientific psychology that has implications for the development of neo-Darwinian psychology, we should be on guard and examine the evidence.

  15. Implications for late Grenvillian (Rigolet phase) construction of Rodinia using new U-Pb data from the Mars Hill terrane, Tennessee and North Carolina, United States

    USGS Publications Warehouse

    Aleinikoff, John N.; Southworth, Scott; Merschat, Arthur J.

    2013-01-01

    New data for zircon (external morphology, cathodoluminescence zoning, and sensitive high resolution ion microprobe [SHRIMP] U-Pb ages) from the Carvers Gap granulite gneiss of the Mars Hill terrane (Tennessee and North Carolina, United States) require reevaluation of interpretations of the age and origin of this rock. The new results indicate that the zircon is detrital and that the sedimentary protolith of this gneiss (and related Cloudland gneiss) was deposited no earlier than ca. 1.02 Ga and was metamorphosed at ca. 0.98 Ga. Tectonic models that included the gneiss as a piece of 1.8 Ga Amazonian crust (perhaps as part of the hypothetical Columbia supercontinent) are now untenable. The remarkably fast cycle of exhumation, erosion, deposition, and deep burial also is characteristic of other late Grenvillian (post-Ottawan) Mesoproterozoic paragneisses that occur throughout the Appalachians. These rocks provide new evidence for the duration of the formation of the Rodinia supercontinent lasting until at least 0.98 Ga.

  16. Occurrence of sub-synchronous vibration in a multistage turbine pump and its prevention

    NASA Technical Reports Server (NTRS)

    Kanai, Yanosuke; Saito, Shinobu

    1994-01-01

    The prevention of vibration in high-load rotary machinery is of critical importance to the reliability of a plant as a whole, which is why so many investigations and studies have been performed. A peculiar vibration encountered in a multistage turbine pump is presented and discussed. The pump was serving an active power plant in a part that was a veritable 'heart' of the entire plant, and the major vibration component was at about 80 percent of the rotational frequency. At first, propagating stall was thought to be responsible, but the absence of higher harmonics made this presumption untenable. Moreover, although previous reports dealt with seemingly similar mechanical vibration troubles, they offered no clear diagnosis and suggested no simple remedial measures. It is for these reasons that the problem was investigated. Through fundamental tests and experiments, several insights into the nature of this anomalous vibration were gained, the fluid force that caused such a vibration was determined, and effective countermeasures were devised.

  17. Grist to the Mill of Anti-evolutionism: The Failed Strategy of Ruling the Supernatural Out of Science by Philosophical Fiat

    NASA Astrophysics Data System (ADS)

    Boudry, Maarten; Blancke, Stefaan; Braeckman, Johan

    2012-08-01

    According to a widespread philosophical opinion, science is strictly limited to investigating natural causes and putting forth natural explanations. Lacking the tools to evaluate supernatural claims, science must remain studiously neutral on questions of metaphysics. This (self-imposed) stricture, which goes under the name of `methodological naturalism', allows science to be divorced from metaphysical naturalism or atheism, which many people tend to associate with it. However, ruling the supernatural out of science by fiat is not only philosophically untenable, it actually provides grist to the mill of anti-evolutionism. The philosophical flaws in this conception of methodological naturalism have been gratefully exploited by advocates of intelligent design creationism to bolster their false accusations of naturalistic bias and dogmatism on the part of modern science. We argue that it promotes a misleading view of the scientific endeavor and is at odds with the foremost arguments for evolution by natural selection. Reconciling science and religion on the basis of such methodological strictures is therefore misguided.

  18. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    PubMed

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck, because the massive number of entities makes the space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing a key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.
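
    The key-value pattern at the core of such a pipeline can be sketched compactly (an illustrative analogue only; DISCRN's actual ConceptSearch linking is more involved, and all post and entity names below are invented). Entities extracted from posts are emitted as keys so that co-mentions can be grouped in parallel and turned into weighted storyline edges:

      from collections import defaultdict
      from itertools import combinations

      # Toy corpus: (post_id, entities extracted from that post).
      posts = [
          ("p1", ["acme_corp", "river_city", "j_smith"]),
          ("p2", ["acme_corp", "j_smith"]),
          ("p3", ["river_city", "k_jones"]),
      ]

      # Map step: emit one key-value pair per entity pair per post.
      pairs = []
      for post_id, entities in posts:
          for a, b in combinations(sorted(set(entities)), 2):
              pairs.append(((a, b), post_id))

      # Shuffle/reduce step: group by key; co-mention counts become edge weights.
      edges = defaultdict(list)
      for key, post_id in pairs:
          edges[key].append(post_id)

      # Edges ranked by support; a real system would also filter on spatial and
      # temporal proximity before chaining edges into storylines.
      for (a, b), support in sorted(edges.items(), key=lambda kv: -len(kv[1])):
          print(f"{a} -- {b}: {len(support)} co-mention(s) {support}")

    Because the map step keys on entity pairs, the grouping can be partitioned across workers, which is where the sequential bottleneck described above disappears.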

  19. Putting theory of mind in its place: psychological explanations of the socio-emotional-communicative impairments in autistic spectrum disorder.

    PubMed

    Boucher, Jill

    2012-05-01

    In this review, the history of the theory of mind (ToM) theory of autistic spectrum disorder (ASD) is outlined (in which ToM is indexed by success on false belief tasks), and the explanatory power and psychological causes of impaired ToM in ASD are critically discussed. It is concluded that impaired ToM by itself has only limited explanatory power, but that explorations of the psychological precursors of impaired ToM have been fruitful in increasing understanding of mindreading impairments in ASD (where 'mindreading' refers to those abilities that underlie triadic interaction as well as ToM). It is argued that early explanations of impaired mindreading are untenable for various reasons, but that impairments of dyadic interaction in ASD that could lead to impaired ability to represent others' mental states may be the critical psychological cause, or causes, of impaired ToM. The complexity of causal routes to impaired ToM is emphasized.

  20. Reliability of Children’s Testimony in the Era of Developmental Reversals

    PubMed Central

    Brainerd, C. J.; Reyna, V. F.

    2012-01-01

    A hoary assumption of the law is that children are more prone to false-memory reports than adults, and hence, their testimony is less reliable than adults’. Since the 1980s, that assumption has been buttressed by numerous studies that detected declines in false memory between early childhood and young adulthood under controlled conditions. Fuzzy-trace theory predicted reversals of this standard developmental pattern in circumstances that are directly relevant to testimony because they involve using the gist of experience to remember events. That prediction has been investigated during the past decade, and a large number of experiments have been published in which false memories have indeed been found to increase between early childhood and young adulthood. Further, experimentation has tied age increases in false memory to improvements in children’s memory for semantic gist. According to current scientific evidence, the principle that children’s testimony is necessarily more infected with false memories than adults’ and that, other things being equal, juries should regard adult’s testimony as necessarily more faithful to actual events is untenable. PMID:23139439

  1. RADIATION INDUCED AGING IN MICE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, H.J.; Gebhard, K.L.

    1958-10-31

    Experiments were undertaken in an effort to determine the degree of similarity between natural and radiation-induced aging, and to determine the causes of the latter. Several severe non-specific stresses were applied to mice, either as single massive doses or as smaller doses administered over a large fraction of the life span of the animals. Stresses used included typhoid vaccine, tetanus toxin, tetanus toxoid, and turpentine. None of these produced any premature aging comparable to that produced by radiation. The somatic mutation theory of aging, and especially of radiation-induced aging, was tested by applying the chemical mutagen nitrogen mustard, either as a massive single dose or as smaller single doses repeated over long periods of time. No shortening of the life span was observed, and it is concluded that the somatic mutation theory is untenable. Experiments designed to determine the organ system responsible for radiation-induced aging have demonstrated that the hematopoietic system is not primarily involved in this phenomenon. (auth)

  2. Memory Interference as a Determinant of Language Comprehension

    PubMed Central

    Van Dyke, Julie A.; Johns, Clinton L.

    2012-01-01

    The parameters of the human memory system constrain the operation of language comprehension processes. In the memory literature, both decay and interference have been proposed as causes of forgetting; however, while there is a long history of research establishing the nature of interference effects in memory, the effects of decay are much more poorly supported. Nevertheless, research investigating the limitations of the human sentence processing mechanism typically focuses on decay-based explanations, emphasizing the role of capacity, while the role of interference has received comparatively little attention. This paper reviews both accounts of difficulty in language comprehension by drawing direct connections to research in the memory domain. Capacity-based accounts are found to be untenable, diverging substantially from what is known about the operation of the human memory system. In contrast, recent research investigating comprehension difficulty using a retrieval-interference paradigm is shown to be wholly consistent with both behavioral and neuropsychological memory phenomena. The implications of adopting a retrieval-interference approach to investigating individual variation in language comprehension are discussed. PMID:22773927

  3. The importance of imagination (or lack thereof) in artificial, human and quantum decision making.

    PubMed

    Gustafson, Karl

    2016-01-13

    Enlarging upon experiments and analysis that I did jointly some years ago, in which artificial (symbolic, neural-net and pattern) learning and generalization were compared with those of humans, I will emphasize the role of imagination (or lack thereof) in artificial, human and quantum cognition and decision-making processes. Then I will look in more detail at some of the 'engineering details' of its implementation (or lack thereof) in each of these settings. In other words, the question posed is: What is actually happening? For example, we previously found that humans overwhelmingly seek, create or imagine context in order to provide meaning when presented with abstract, apparently incomplete, contradictory or otherwise untenable decision-making situations. Humans are intolerant of contradiction and will greatly simplify to avoid it. They can partially correlate but do not average. Human learning is not Boolean. These and other human reasoning properties will then be taken to critique how well artificial intelligence methods and quantum mechanical modelling might compete with them in decision-making tasks within psychology and economics. © 2015 The Author(s).

  4. Set up to fail? Consumer participation in the mental health service system.

    PubMed

    Stewart, Sarah; Watson, Sandy; Montague, Roslyn; Stevenson, Caroline

    2008-10-01

    The aim of this paper is to present the findings of a survey of consumers of mental health services who are working (in either paid or unpaid positions) in NSW Health and in the Non Government Organisation sector in NSW. A survey was distributed through the NSW Consumer Advisory Group newsletter to elicit the roles and assess the training needs of consumer employees, as well as those who were working in voluntary capacities as consumer representatives, within the mental health system in NSW. Many mental health consumers have been placed in the untenable position of being engaged in representation and/or advocacy roles with unclear job descriptions and no training. The majority of consumers want a code of ethics and performance standards for consumer workers. The rhetoric of consumer participation is not matched by effective and timely strategies that ensure that consumer involvement is underpinned by relevant training and supportive infrastructure. The goal of meaningful consumer participation in mental health services, as outlined in policy, is yet to be achieved.

  5. Observations of the Dynamics and Thermodynamics of the Corona during the 21 August 2017 Total Solar Eclipse

    NASA Astrophysics Data System (ADS)

    Habbal, Shadia Rifai; Ding, Adalbert; Druckmuller, Miloslav; Solar Wind Sherpas

    2018-01-01

    The visible wavelength range, encompassing forbidden coronal emission lines, offers unique diagnostic tools for exploring the physics of the solar corona, such as its chemical composition and the dynamics of its major and minor constituents. These tools are best exploited during total solar eclipses, when the field of view spans several solar radii, starting from the solar surface. This spatial span is currently untenable from any observing platform. Imaging and spectroscopic eclipse observations, including the 2017 August 21 event, are shown to be the first to yield the temperature distribution in the corona as a function of solar cycle. They are also the first to lead to the discovery of cool prominence material, at less than 10,000 to 50,000 K, extending more than a solar radius above the solar surface and streaming away from the Sun while maintaining its compositional identity. These data underscore the importance of capturing emission from coronal forbidden lines with the next generation of space-based instrumentation to address the general problem of coronal heating.

  6. A class of covariate-dependent spatiotemporal covariance functions

    PubMed Central

    Reich, Brian J; Eidsvik, Jo; Guindani, Michele; Nail, Amy J; Schmidt, Alexandra M.

    2014-01-01

    In geostatistics, it is common to model spatially distributed phenomena through an underlying stationary and isotropic spatial process. However, these assumptions are often untenable in practice because of the influence of local effects in the correlation structure. Therefore, it has been of prolonged interest in the literature to provide flexible and effective ways to model non-stationarity in the spatial effects. Arguably, due to the local nature of the problem, we might envision that the correlation structure would be highly dependent on local characteristics of the domain of study, namely the latitude, longitude and altitude of the observation sites, as well as other locally defined covariate information. In this work, we provide a flexible and computationally feasible way for allowing the correlation structure of the underlying processes to depend on local covariate information. We discuss the properties of the induced covariance functions and discuss methods to assess its dependence on local covariate information by means of a simulation study and the analysis of data observed at ozone-monitoring stations in the Southeast United States. PMID:24772199
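
    One standard construction that realizes this idea (a Gibbs-type nonstationary kernel with a covariate-driven length-scale; this is a generic sketch, not the specific class proposed in the paper) makes the local length-scale a log-linear function of, say, altitude:

      import numpy as np

      def gibbs_cov(s, z, sigma2=1.0, a=0.0, b=1.0):
          """Nonstationary covariance on 1-D sites s with local covariate z
          (e.g. altitude). The local length-scale is ell(s) = exp(a + b*z(s));
          the Gibbs form below is positive definite for any positive ell."""
          ell = np.exp(a + b * z)                 # covariate-dependent length-scales
          l2sum = ell[:, None] ** 2 + ell[None, :] ** 2
          pref = np.sqrt(2.0 * np.outer(ell, ell) / l2sum)
          d2 = (s[:, None] - s[None, :]) ** 2
          return sigma2 * pref * np.exp(-d2 / l2sum)

      # Sites on a line with a smoothly varying "altitude" covariate (invented data).
      s = np.linspace(0.0, 10.0, 50)
      z = np.sin(s / 3.0)                         # standardized local covariate
      K = gibbs_cov(s, z, sigma2=1.0, a=-1.0, b=0.8)

      # Sanity checks: symmetric and (numerically) positive semi-definite.
      assert np.allclose(K, K.T)
      assert np.linalg.eigvalsh(K).min() > -1e-9

    Correlation then decays quickly where the covariate drives the length-scale down and slowly where it drives it up, which is exactly the kind of locally varying structure a stationary, isotropic kernel cannot express.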

  7. In one's own image: ethics and the reproduction of deafness.

    PubMed

    Johnston, Trevor

    2005-01-01

    The ethics of the use of genetic screening and reproductive technologies to select against and for deafness is presented. It is argued that insofar as deafness is a disability it is ethical to act in such a way as to avoid the conception or birth of children with genetic or congenital deafness. The discovery and recognition of signing deaf communities as cultural and linguistic communities (minorities) does not alter this basic ethical position, although the consequences of widespread application of this technology appears destined to lead to the eventual disappearance of these communities. The argument that acting to avoid deafness is unethical because it will lead to the elimination of a linguistic or cultural group (genocide or ethnocide) or conversely that acting to ensure deafness is ethical, if not praiseworthy, can only be sustained if deafness is not regarded as a disability at all. I argue that the premise that deafness is not a disability of some sort is false and thus the claim that genetic selection against deafness is unethical is untenable.

  8. A Novel Multi-scale Simulation Strategy for Turbulent Reacting Flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, Sutherland C.

    In this project, a new methodology was proposed to bridge the gap between Direct Numerical Simulation (DNS) and Large Eddy Simulation (LES). This novel methodology, titled Lattice-Based Multiscale Simulation (LBMS), creates a lattice structure of One-Dimensional Turbulence (ODT) models. ODT has been shown to capture turbulent combustion with high fidelity by fully resolving interactions between turbulence and diffusion. By creating a lattice of coupled ODT models, LBMS overcomes the main shortcoming of ODT, namely its inability to capture large-scale, three-dimensional flow structures. At the same time, by spacing the lattice lines far apart, LBMS avoids the curse of dimensionality that makes the computational cost of DNS untenable. This project has shown that LBMS is capable of reproducing statistics of isotropic turbulent flows while coarsening the spacing between lines significantly. It also investigates and resolves issues that arise when coupling ODT lines, such as flux reconstruction perpendicular to a given ODT line, preservation of conserved quantities when eddies cross a coarse cell volume, and boundary condition application. Robust parallelization is also investigated.
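
    The coupling of many one-dimensional solvers on a lattice can be illustrated with a toy analogue. The sketch below is emphatically not ODT or LBMS: it merely relaxes a 2D diffusion problem by alternating 1D diffusion solves along the rows and columns of a grid, showing how information propagates between perpendicular lines and how a conservative flux treatment keeps the total conserved quantity intact.

      import numpy as np

      # Toy analogue only: relax 2D diffusion by alternating 1D solves along
      # rows and columns, so perpendicular "lines" exchange information where
      # they intersect, while fluxes remain conservative.
      n, alpha, steps = 32, 0.2, 200
      u = np.zeros((n, n))
      u[n // 2, n // 2] = 1.0                     # initial hot spot

      def diffuse_line(v):
          # explicit 1D diffusion step with zero-flux (insulated) ends
          w = v.copy()
          w[1:-1] += alpha * (v[2:] - 2.0 * v[1:-1] + v[:-2])
          w[0] += alpha * (v[1] - v[0])
          w[-1] += alpha * (v[-2] - v[-1])
          return w

      for _ in range(steps):
          u = np.apply_along_axis(diffuse_line, 1, u)   # sweep horizontal lines
          u = np.apply_along_axis(diffuse_line, 0, u)   # vertical sweeps couple them
      print(f"total 'mass' after coupling sweeps: {u.sum():.6f}")   # stays 1.0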

  9. Study on Conversion of Municipal Plastic Wastes into Liquid Fuel Compounds, Analysis of Crdi Engine Performance and Emission Characteristics

    NASA Astrophysics Data System (ADS)

    Divakar Shetty, A. S.; Kumar, R. Ravi; Kumarappa, S.; Antony, A. J.

    2016-09-01

    The current rate of economic growth is untenable unless we stop wasting fossil fuels such as coal and crude oil, so we need to begin relying on alternative and renewable energy sources. In this experimental analysis, an attempt has been made to investigate the conversion of municipal plastic wastes into oil by pyrolysis, using milk covers and water bottles as feedstocks, to analyze the performance of a CRDI diesel engine running on the product, and to assess emission characteristics such as HC, CO, NOx and smoke for blends of diesel and plastic liquid fuel. After the purification process, the pH of the plastic fuel was measured with a pH meter and brought to neutral by adding KOH and NaOH. Blends of 0 to 100% plastic liquid fuel in diesel were tested for both performance and emissions. The experimental results show that the process efficiently converts municipal waste plastics into useful liquid hydrocarbon fuels at about 65% yield by weight without emitting significant pollutants.

  10. Issues in localization of brain function: The case of lateralized frontal cortex in cognition, emotion, and psychopathology.

    PubMed

    Miller, Gregory A; Crocker, Laura D; Spielberg, Jeffrey M; Infantolino, Zachary P; Heller, Wendy

    2013-01-01

    The appeal of simple, sweeping portraits of large-scale brain mechanisms relevant to psychological phenomena competes with a rich, complex research base. As a prominent example, two views of frontal brain organization have emphasized dichotomous lateralization as a function of either emotional valence (positive/negative) or approach/avoidance motivation. Compelling findings support each. The literature has struggled to choose between them for three decades, without success. Both views are proving untenable as comprehensive models. Evidence of other frontal lateralizations, involving distinctions among dimensions of depression and anxiety, makes a dichotomous view even more problematic. Recent evidence indicates that positive valence and approach motivation are associated with different areas in the left hemisphere. Findings that appear contradictory at the level of frontal lobes as the units of analysis can be accommodated because hemodynamic and electromagnetic neuroimaging studies suggest considerable functional differentiation, in specialization and activation, of subregions of frontal cortex, including their connectivity to each other and to other regions. Such findings contribute to a more nuanced understanding of functional localization that accommodates aspects of multiple theoretical perspectives.

  11. Issues in localization of brain function: The case of lateralized frontal cortex in cognition, emotion, and psychopathology

    PubMed Central

    Miller, Gregory A.; Crocker, Laura D.; Spielberg, Jeffrey M.; Infantolino, Zachary P.; Heller, Wendy

    2013-01-01

    The appeal of simple, sweeping portraits of large-scale brain mechanisms relevant to psychological phenomena competes with a rich, complex research base. As a prominent example, two views of frontal brain organization have emphasized dichotomous lateralization as a function of either emotional valence (positive/negative) or approach/avoidance motivation. Compelling findings support each. The literature has struggled to choose between them for three decades, without success. Both views are proving untenable as comprehensive models. Evidence of other frontal lateralizations, involving distinctions among dimensions of depression and anxiety, makes a dichotomous view even more problematic. Recent evidence indicates that positive valence and approach motivation are associated with different areas in the left hemisphere. Findings that appear contradictory at the level of frontal lobes as the units of analysis can be accommodated because hemodynamic and electromagnetic neuroimaging studies suggest considerable functional differentiation, in specialization and activation, of subregions of frontal cortex, including their connectivity to each other and to other regions. Such findings contribute to a more nuanced understanding of functional localization that accommodates aspects of multiple theoretical perspectives. PMID:23386814

  12. How prostate cancer support groups do and do not survive: British Columbian perspectives.

    PubMed

    Oliffe, John L; Halpin, Michael; Bottorff, Joan L; Hislop, T Gregory; McKenzie, Michael; Mroz, Lawrence

    2008-06-01

    Many prostate cancer support groups (PCSGs) have formed in North America during the past decade, yet their operation or factors influencing sustainability are poorly understood. This article reports micro (intragroup), meso (intergroup), and macro (group/structure) analyses drawn from the fieldwork and participant observations conducted for an ethnographic study of PCSGs based in British Columbia, Canada. The findings indicate that effective group leadership is integral to group sustainability and the recruitment and retention of attendees. At the meso level, intergroup connections and communication were often informal; however, the primary purpose of all the PCSGs was to provide information and support to men and their families. Many PCSGs were uncertain how formal associations with cancer fund-raising societies would influence group effectiveness. Macro issues such as prostate cancer activism resided with individual group "champions" through activities coordinated by provincial and national PCSG organizations. However, activism did not guarantee group sustainability. The study findings reveal why some groups flourish while others appear untenable, and form the basis for discussion about how PCSG sustainability might be best achieved.

  13. Painter and scribe: From model of mind to cognitive strategy.

    PubMed

    MacKisack, Matthew

    2017-12-06

    Since antiquity the mind has been conceived to operate via images and words. Pre-scientific thinkers (and some scientific) who presented the mind as operating in such a way tended to (i) privilege one representational mode over the other, and (ii) claim that the dominance of that mode holds universally. The rise of empirical psychological science in the late 19th century rehearses the word/image division of thought but makes universal statements - e.g., that recollection is a verbal process for everyone - untenable. Since then, the investigation of individual differences and case studies of imagery loss have shown rather that words and images present alternative cognitive "strategies" that individuals will be predisposed to employing - but which, should the necessity arise, can be relearned using the other representational mode. The following sketches out this historical shift in understanding, and concludes by inviting consideration of the wider context in which discussion of the relationships between 'images' and 'words' (as both internal and external forms of representation) must take place. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Assessing the Privacy Risks of Data Sharing in Genomics

    PubMed Central

    Heeney, C.; Hawkins, N.; de Vries, J.; Boddington, P.; Kaye, J.

    2010-01-01

    The protection of identity of participants in medical research has traditionally been guaranteed by the maintenance of the confidentiality of health information through mechanisms such as only releasing data in an aggregated form or after identifying variables have been removed. This protection of privacy is regarded as a fundamental principle of research ethics, through which the support of research participants and the public is maintained. Whilst this traditional model was adopted for genetics and genomics research, and was generally considered broadly fit for purpose, we argue that this approach is increasingly untenable in genomics. Privacy risk assessments need to have regard to the whole data environment, not merely the quality of the dataset to be released in isolation. As sources of data proliferate, issues of privacy protection are increasingly problematic in relation to the release of genomic data. However, we conclude that, by paying careful attention to potential pitfalls, scientific funders and researchers can take an important part in attempts to safeguard the public and ensure the continuation of potentially important scientific research. PMID:20339285

  15. Rethinking the western construction of the welfare state.

    PubMed

    Walker, A; Wong, C K

    1996-01-01

    This article employs case studies of China and Hong Kong to question the western ethnocentric construction of the welfare state that predominates in comparative social policy research. The authors argue that welfare regimes, and particularly the "welfare state," have been constructed as capitalist-democratic projects and that this has the damaging effect of excluding from analyses not only several advanced capitalist societies in the Asian-Pacific area but also the world's most populous country. If welfare state regimes can only coexist with western political democracies, then China and Hong Kong are excluded automatically. A similar result occurs if the traditional social administration approach is adopted whereby a "welfare state" is defined in terms only of direct state provision. The authors argue that such assumptions are untenable if state welfare is to be analyzed as a universal phenomenon. Instead of being trapped within an ethnocentric welfare statism, what social policy requires is a global political economy perspective that facilitates comparisons of the meaning of welfare and the state's role in producing it north, south, east and west.

  16. The Case of Abel: Religion as Boon and Bane for a Catholic Gay Man.

    PubMed

    Cerbone, Armand R; Danzer, Graham

    2017-08-01

    Conservative religions that condemn homosexual sexual orientation and acts as unnatural and sinful pose significant challenges for gay persons whose faith is a core part of their identity. The condemnation presents a serious barrier to the acceptance and integration of their sexuality, a primary task of psychosexual development. As a result, they can manifest depression, anxiety, suicidal ideation, and even suicide attempts. The ecclesiastical censure also imposes an untenable dilemma for homosexuals in that they feel pressed to reject their sexual identity or renounce their spiritual identity and heritage. Psychotherapists who treat gay persons caught in this quandary can find themselves facing a similar problem: how to help their homosexual client reconcile their proscribed sexuality with their spiritual commitments. The case presented here recounts the treatment over many years of a gay man suffering from such a conflict and his eventual accommodation of both his homosexuality and his faith. Recommendations are offered for constructive treatment with those torn between two conflicting core identities. © 2017 Wiley Periodicals, Inc.

  17. Power is only skin deep: an institutional ethnography of nurse-driven outpatient psoriasis treatment in the era of clinic web sites.

    PubMed

    Winkelman, Warren J; Halifax, Nancy V Davis

    2007-04-01

    We present an institutional ethnography of hospital-based psoriasis day treatment in the context of evaluating readiness to supplement services and support with a new web site. Through observation, interviews and a critical consideration of documents, forms and other textually-mediated discourses in the day-to-day work of nurses and physicians, we come to understand how the historical gender-determined power structure of nurses and physicians impacts nurses' work. On the one hand, nurses' work can have certain social benefits that would usually be considered untenable in traditional healthcare: nurses as primary decision-makers, nurses as experts in the treatment of disease, physicians as secondary consultants, and patients as co-facilitators in care delivery processes. However, benefits seem to have come at the nurses' expense, as they are required to maintain a cloak of invisibility for themselves and for their workplace, so that the Centre appears like all other outpatient clinics, and the nurses do not enjoy appropriate economic recognition. Implications for this negotiated invisibility on the implementation of new information systems in healthcare are discussed.

  18. Medieval Round Churches and the Shape of the Earth.

    PubMed

    Haagensen, Erling; Lind, Niels C

    2015-12-01

    There is a unique cluster of four medieval round churches, linked by a simple geometry, on Bornholm Island in the Baltic Sea. Why so many and why so close together? Immediate simple answers are "Just by chance" and "For no reason." Why are the churches round? "Defense." This essay proposes another hypothesis for this unique situation: the churches are astronomical observatories, meant to solve a scientific problem (Is the Earth really spherical?) and a practical problem (How far is it to sail west to the Orient?). The capacity and desire to find answers, together with other practical needs related to astronomy, can better explain these round churches' special architecture. The geometry that connects them fits the ideal pattern with an angular accuracy of 1 minute of a degree. The round churches may be the earliest astronomical observatories in Christian Europe; other hypotheses have been shown to be untenable. Their location provides for a good method to estimate the Earth's extent in the east-west direction, seemingly the earliest such measurements.

  19. The role of high-level calculations in the assignment of the Q-band spectra of chlorophyll

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reimers, Jeffrey R.; Cai, Zheng-Li; Kobayashi, Rika

    2014-10-06

    We recently established a novel assignment of the visible absorption spectrum of chlorophyll-a that sees the two components Q_x and Q_y of the low-energy Q band as being intrinsically mixed by non-adiabatic coupling. This ended 50 years of debate as to the nature of the Q bands, with prior discussion posed only in the language of the Born-Oppenheimer and Condon approximations. The new assignment presents significant ramifications for exciton transport and quantum coherence effects in photosystems. Results from state-of-the-art electronic structure calculations have always been used to justify assignments, but quantitative inaccuracies and systematic failures have historically limited their usefulness. We examine the role of CAM-B3LYP time-dependent density-functional theory (TD-DFT) and Symmetry Adapted Cluster-Configuration Interaction (SAC-CI) calculations in first showing that all previous assignments were untenable, in justifying the new assignment, in making some extraordinary predictions that were vindicated by the new assignment, and in then identifying small but significant anomalies in the extensive experimental data record.

  20. The hypothalamic slice approach to neuroendocrinology.

    PubMed

    Hatton, G I

    1983-07-01

    The magnocellular peptidergic cells of the supraoptic and paraventricular nuclei comprise much of what is known as the hypothalamo-neurohypophysial system, which is involved in several functions, including body fluid balance, parturition and lactation. While we have learned much from experiments in vivo, they have not produced a clear understanding of some of the crucial features associated with the functioning of this system. In particular, questions relating to the osmosensitivity of magnocellular neurones and the mechanism(s) by which their characteristic firing patterns are generated have not been answered using the older approaches. Electrophysiological studies with brain slices present direct evidence for osmosensitivity, and perhaps even osmoreceptivity, of magnocellular neurones. Other evidence indicates that the phasic bursting patterns of activity associated with vasopressin-releasing neurones (a) occur in the absence of patterned chemical synaptic input, (b) may be modulated by electrotonic conduction across gap junctions connecting magnocellular neurones and (c) are likely to be generated by endogenous membrane currents. These results make untenable the formerly held idea that phasic bursting activity is dependent upon recurrent synaptic inhibition.

  1. Size, time, and asynchrony matter: the species-area relationship for parasites of freshwater fishes.

    PubMed

    Zelmer, Derek A

    2014-10-01

    The tendency to attribute species-area relationships to "island biogeography" effectively bypasses the examination of specific mechanisms that act to structure parasite communities. Positive covariation between fish size and infrapopulation richness should not be examined within the typical extinction-based paradigm, but rather should be addressed from the standpoint of differences in colonization potential among individual hosts. Although most mechanisms producing the aforementioned pattern constitute some variation of passive sampling, the deterministic aspects of the accumulation of parasite individuals by fish hosts make untenable the suggestion that infracommunities of freshwater fishes are stochastic assemblages. At the component community level, application of extinction-dependent mechanisms might be appropriate, given sufficient time for colonization, but these structuring forces likely act indirectly through their effects on the host community to increase the probability of parasite persistence. At all levels, the passive sampling hypothesis is a relevant null model. The tendency for mechanisms that produce species-area relationships to produce nested subset patterns means that for most systems, the passive sampling hypothesis can be addressed through the application of appropriate null models of nested subset structure.
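
    As an illustration of passive sampling as a null model, the following Monte Carlo sketch (purely hypothetical numbers, not from the paper) draws parasite individuals for each host from a fixed, lognormally skewed species pool, with the number of individuals proportional to host size. Infracommunity richness rises with host size even though no extinction-based mechanism is in play.

      import numpy as np

      rng = np.random.default_rng(3)
      n_species = 30
      # skewed relative abundances of colonizing stages in the habitat
      p = rng.lognormal(sigma=1.5, size=n_species)
      p /= p.sum()

      def infracommunity_richness(n_individuals):
          # a host passively samples n parasite individuals from the pool
          counts = rng.multinomial(n_individuals, p)
          return int((counts > 0).sum())

      # larger (or longer-lived) hosts encounter more parasite individuals
      for size in (10, 30, 100, 300, 1000):
          mean_rich = np.mean([infracommunity_richness(size) for _ in range(200)])
          print(f"host 'size' {size:4d}: mean infracommunity richness {mean_rich:5.1f}")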

  2. The state of the "state" debate in hypnosis: a view from the cognitive-behavioral perspective.

    PubMed

    Chaves, J F

    1997-07-01

    For most of the past 50 years, hypnosis research has been driven by a debate about whether hypnotic phenomena can be best described and understood as the product of an altered state of consciousness. The meanings of some of the pivotal concepts in this debate and the nature of the phenomena that gave rise to them were ambiguous at the outset and led to misconceptions and surplus meanings that have obscured the debate through most of its history. The nature of the posited hypnotic state and its assumed consequences have changed during this period, reflecting the abandonment of untenable versions of hypnotic state theory. Carefully conducted studies in laboratories around the world have refined our understanding of hypnotic phenomena and helped identify the critical variables that interact to elicit them. With the maturation of the cognitive-behavioral perspective and the growing refinement of state conceptions of hypnosis, questions arise as to whether the state debate is still the axis about which hypnosis research and theory pivots. Although the heuristic value of this debate has been enormous, we must guard against the cognitive constraints of our own metaphors and conceptual frameworks.

  3. Cognitive architectures, rationality, and next-generation AI: a prolegomenon

    NASA Astrophysics Data System (ADS)

    Bello, Paul; Bringsjord, Selmer; Yang, Yingrui

    2004-08-01

    Computational models that give us insight into the behavior of individuals and the organizations to which they belong will be invaluable assets in our nation's war against terrorists, and state sponsorship of terror organizations. Reasoning and decision-making are essential ingredients in the formula for human cognition, yet the two have almost exclusively been studied in isolation from one another. While we have witnessed the emergence of strong traditions in both symbolic logic, and decision theory, we have yet to describe an acceptable interface between the two. Mathematical formulations of decision-making and reasoning have been developed extensively, but both fields make assumptions concerning human rationality that are untenable at best. True to this tradition, artificial intelligence has developed architectures for intelligent agents under these same assumptions. While these digital models of "cognition" tend to perform superbly, given their tremendous capacity for calculation, it is hardly reasonable to develop simulacra of human performance using these techniques. We will discuss some of the challenges associated with the problem of developing integrated cognitive systems for use in modelling, simulation, and analysis, along with some ideas for the future.

  4. Variable directionality of gene expression changes across generations does not constitute negative evidence of epigenetic inheritance.

    PubMed

    Sharma, Abhay

    2015-01-01

    Transgenerational epigenetic inheritance in mammals has been controversial due to inherent difficulties in its experimental demonstration. A recent report has, however, opened a new front in the ongoing debate by claiming that endocrine disrupting chemicals, contrary to previous findings, do not cause effects across generations. This claim is based on the observation that gene expression changes induced by these chemicals in the exposed and unexposed generations are mainly in opposite directions. The present analysis shows that the pattern of gene expression reported in the two generations is not expected by chance and is suggestive of transmission across generations. A meta-analysis of diverse data sets related to endocrine disruptor-induced transgenerational gene expression alterations, including the data provided in the said report, further suggests that effects of endocrine disrupting chemicals persist in unexposed generations. Given the prior evidence of phenotypic variability and of gene expression alterations in opposite directions between generations, it is argued here that it is untenable to treat mismatched directionality of gene expression, in experiments testing the potential of environmental agents to induce epigenetic inheritance of phenotypic traits, as negative evidence. This is expected to settle the newly raised doubts over epigenetic inheritance in mammals.

  5. Population and international security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNamara, R.S.

    The former Secretary of Defense examines data on population trends and military spending around the world and concludes that the industrialized world is spending a disproportionate amount of its resources on defense and not enough on intervention to lower the birth rate. Population projections of 11 billion and higher are felt to be untenable. Indications of a declining birth rate in the developing countries, while welcome, are felt to be too slow to achieve an acceptable population level. Government action is called for to bring about a reduction in fertility. Government interventions to encourage smaller families, however, must be accompanied by the means to do so. Several strategies are outlined that could influence family size by raising standards of health, education, income distribution, and the status of women. Supporting this demand by couples to limit their offspring should be improved family planning, delivery service, a broader choice of contraceptives, and more research on reproductive biology and contraceptive techniques. The government is in a position to encourage socio-economic changes through public information programs and a system of incentives and disincentives designed to achieve the desired goal. (DCK)

  6. A re-examination of the biphasic theory of skeletal muscle growth.

    PubMed Central

    Levine, A S; Hegarty, P V

    1977-01-01

    Because of the importance of fibre diameter measurements it was decided to re-evaluate the biphasic theory of skeletal muscle growth and development. This theory proposes an initial monophasic distribution of muscle fibres which changes to a biphasic distribution during development. The theory is based on observations made on certain muscles in mice, where two distinct populations of fibre diameters (20 and 40 μm) contribute to the biphasic distribution. In the present investigation cross sections of frozen biceps brachii of mice in rigor mortis were examined. The rigor state was used to avoid complications produced by thaw-rigor contraction. The diameters of the outermost and innermost fibres were found to be significantly different. However, if the outer and inner fibres were combined to form one group, no significant difference between this group and other random groups was found. The distributions of all groups were monophasic. The diameters of isolated fibres from mice and rats also displayed a monophasic distribution. This evidence leads to the conclusion that the biphasic theory of muscle growth is untenable. Some of the variables which may occur in fibre size and shape are discussed. PMID:858691

  7. [Scabies and the significance of "suriones" in the handwritten manuscripts of Hildegard von Bingen].

    PubMed

    Riethe, Peter

    2006-01-01

    In her studies on nature and medicine, the "Liber simplicis medicinae" (LSM or "Physica") and the "Liber compositae medicinae" (LCM or "Causae et Curae"), Hildegard von Bingen mentions scabies (mange) in several passages. She characterizes "suren aut (= or) sneuelzen" as the cause of the disease, which she also calls "gracillimi vermiculi", that is, tiny worms that burrow into the human skin ("ubi suren aut sneuelzen hominem comedendo ledunt"). In this context, the meanings of the German terms "suren aut sneuelzen", which are found in the Latin text concerning the "Alia Mynza", are still disputed. The author discusses the question of whether Hildegard knew the cause of scabies on the basis of ancient and medieval sources as well as modern medical-historical and philological/linguistic research approaches. He concludes that Hildegard was able not only to describe the symptoms exactly, but also to define the cause of the disease as a specific parasite. Consequently, she differentiates other diseases of the skin, such as "grint", from scabies. The proposed interpretation of "sneuelzen" as the tick is untenable. The assumption that both terms are synonyms for Sarcoptes scabiei can be confirmed by philological and medical-historical research.

  8. Action Centered Contextual Bandits.

    PubMed

    Greenewald, Kristjan; Tewari, Ambuj; Klasnja, Predrag; Murphy, Susan

    2017-12-01

    Contextual bandits have become popular as they offer a middle ground between very simple approaches based on multi-armed bandits and very complex approaches using the full power of reinforcement learning. They have demonstrated success in web applications and have a rich body of associated theoretical guarantees. Linear models are well understood theoretically and preferred by practitioners because they are not only easily interpretable but also simple to implement and debug. Furthermore, if the linear model is true, we get very strong performance guarantees. Unfortunately, in emerging applications in mobile health, the time-invariant linear model assumption is untenable. We provide an extension of the linear model for contextual bandits that has two parts: baseline reward and treatment effect. We allow the former to be complex but keep the latter simple. We argue that this model is plausible for mobile health applications. At the same time, it leads to algorithms with strong performance guarantees as in the linear model setting, while still allowing for complex nonlinear baseline modeling. Our theory is supported by experiments on data gathered in a recently concluded mobile health study.
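
    As a sketch of the action-centering idea, assuming a randomized binary action with known probability pi and a linear treatment effect (a toy setting, not the authors' full algorithm, which also handles adaptively chosen action probabilities and provides regret guarantees): the pseudo-outcome (a - pi) r / (pi (1 - pi)) has conditional mean theta' x, so ordinary least squares recovers the treatment effect while the arbitrarily complex baseline cancels out.

      import numpy as np

      rng = np.random.default_rng(0)
      d, T, pi = 5, 20000, 0.5                       # dims, rounds, P(action = 1)
      theta = np.array([0.5, -0.3, 0.2, 0.0, 0.4])   # assumed treatment effect

      X = rng.normal(size=(T, d))                    # contexts
      A = rng.binomial(1, pi, size=T)                # randomized binary actions
      baseline = np.sin(X.sum(axis=1)) + 0.5 * X[:, 0] ** 2   # complex, nonlinear
      R = baseline + A * (X @ theta) + 0.1 * rng.normal(size=T)

      # Action centering: E[(a - pi) * r | x] = pi * (1 - pi) * theta' x, so the
      # nonlinear baseline drops out and least squares recovers theta alone.
      y = (A - pi) * R / (pi * (1.0 - pi))
      theta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
      print(np.round(theta_hat, 2))                  # approximately theta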

  9. Training scientists as future industry leaders: teaching translational science from an industry executive’s perspective

    PubMed Central

    Lee, Gloria; Kranzler, Jay D; Ramasamy, Ravichandran; Gold-von Simson, Gabrielle

    2018-01-01

    PhDs and post-doctoral biomedical graduates, in greater numbers, are choosing industry-based careers. However, most scientists do not have formal training in business strategies and venture creation and may find senior management positions untenable. To fill this training gap, “Biotechnology Industry: Structure and Strategy” was offered at New York University School of Medicine (NYUSOM). The course focuses on the business aspects of translational medicine and research translation and incorporates the practice of business case discussions, mock negotiation, and direct interactions into the didactic. The goal is to teach scientists at an early career stage how to create solutions, whether at the molecular level or via the creation of devices or software, to benefit those with disease. In doing so, young, talented scientists can develop a congruent mindset with biotechnology/industry executives. Our data demonstrate that the course enhances students’ knowledge of the biotechnology industry. In turn, these learned skills may further encourage scientists to seek leadership positions in the field. Implementation of similar courses and educational programs will enhance scientists’ training and inspire them to become innovative leaders in the discovery and development of therapeutics. PMID:29657853

  10. Training scientists as future industry leaders: teaching translational science from an industry executive's perspective.

    PubMed

    Lee, Gloria; Kranzler, Jay D; Ramasamy, Ravichandran; Gold-von Simson, Gabrielle

    2018-01-01

    PhDs and post-doctoral biomedical graduates, in greater numbers, are choosing industry-based careers. However, most scientists do not have formal training in business strategies and venture creation and may find senior management positions untenable. To fill this training gap, "Biotechnology Industry: Structure and Strategy" was offered at New York University School of Medicine (NYUSOM). The course focuses on the business aspects of translational medicine and research translation and incorporates the practice of business case discussions, mock negotiation, and direct interactions into the didactic. The goal is to teach scientists at an early career stage how to create solutions, whether at the molecular level or via the creation of devices or software, to benefit those with disease. In doing so, young, talented scientists can develop a congruent mindset with biotechnology/industry executives. Our data demonstrate that the course enhances students' knowledge of the biotechnology industry. In turn, these learned skills may further encourage scientists to seek leadership positions in the field. Implementation of similar courses and educational programs will enhance scientists' training and inspire them to become innovative leaders in the discovery and development of therapeutics.

  11. Human Rights and the Excess of Identity

    PubMed Central

    Al Tamimi, Yussef

    2017-01-01

    Identity is a central theme in contemporary politics, but legal academia lacks a rigorous analysis of this concept. The aim of this article is twofold: (i) firstly, it aims to reveal presumptions on identity in human rights law by mapping how the European Court of Human Rights approaches identity and (ii) secondly, it seeks to analyse these presumptions using theoretical insights on identity. By merging legal and theoretical analysis, this article contributes a reading of the Court’s case law which suggests that the tension between the political and apolitical is visible as a common thread in the Court’s use of identity. In case law concerning paternity, the Court appears to hold a specific view of what is presented as an unquestionable part of identity. This ostensibly pre-political notion of identity becomes untenable in cases where the nature of an identity feature, such as the headscarf, is contended or a minority has adopted a national identity that conflicts with the majoritarian national identity. The Court’s approach to identity in such cases reflects a paradox that is inherent to identity; identity is personal while simultaneously constituted and shaped by overarching power mechanisms. PMID:29881144

  12. Obese persons' physical activity experiences and motivations across weight changes: a qualitative exploratory study.

    PubMed

    Bombak, Andrea E

    2015-11-14

    Obese individuals are encouraged to participate in physical activity. However, few qualitative studies have explored obese individuals' motivations for and experiences with physical activity. The physical activity experiences of self-identified obese or formerly obese persons (n = 15) were explored through in-depth, semi-structured, audio-taped, repeated interviews and ethnography over one year. Participant observation occurred at multiple sites identified by participants as meaningful to them as obese persons. Data from interview transcripts and fieldnotes were analyzed via thematic content analysis. Underlying goals for engaging in physical activity were diverse. Emergent motivation themes included: protection, pressure, and pleasure. Participants were protective of maintaining functional capacity, establishing fit identities, and achieving weight loss. Participants also discussed feelings of excessive pressure to continue progressing toward weight and fitness goals. Enjoyment in physical activity was often a by-product for all participants and could become a sought-after endpoint. Finding an environment in which participants felt safe, accepted, and encouraged to be active was extremely important for continual engagement. Obese individuals enjoyed physical activity and were concerned about maintaining functional fitness. Stigmatization and untenable goals and monitoring could disrupt physical activity.

  13. Working Memory Capacity and Fluid Intelligence: Maintenance and Disengagement.

    PubMed

    Shipstead, Zach; Harrison, Tyler L; Engle, Randall W

    2016-11-01

    Working memory capacity and fluid intelligence have been demonstrated to be strongly correlated traits. Typically, high working memory capacity is believed to facilitate reasoning through accurate maintenance of relevant information. In this article, we present a proposal reframing this issue, such that tests of working memory capacity and fluid intelligence are seen as measuring complementary processes that facilitate complex cognition. Respectively, these are the ability to maintain access to critical information and the ability to disengage from or block outdated information. In the realm of problem solving, high working memory capacity allows a person to represent and maintain a problem accurately and stably, so that hypothesis testing can be conducted. However, as hypotheses are disproven or become untenable, disengaging from outdated problem solving attempts becomes important so that new hypotheses can be generated and tested. From this perspective, the strong correlation between working memory capacity and fluid intelligence is due not to one ability having a causal influence on the other but to separate attention-demanding mental functions that can be contrary to one another but are organized around top-down processing goals. © The Author(s) 2016.

  14. Solid state light engines for bioanalytical instruments and biomedical devices

    NASA Astrophysics Data System (ADS)

    Jaffe, Claudia B.; Jaffe, Steven M.

    2010-02-01

    Lighting subsystems to drive 21st century bioanalysis and biomedical diagnostics face stringent requirements. Industrywide demands for speed, accuracy and portability mean illumination must be intense as well as spectrally pure, switchable, stable, durable and inexpensive. Ideally a common lighting solution could service these needs for numerous research and clinical applications. While this is a noble objective, the current technology of arc lamps, lasers, LEDs and most recently light pipes have intrinsic spectral and angular traits that make a common solution untenable. Clearly a hybrid solution is required to service the varied needs of the life sciences. Any solution begins with a critical understanding of the instrument architecture and specifications for illumination regarding power, illumination area, illumination and emission wavelengths and numerical aperture. Optimizing signal to noise requires careful optimization of these parameters within the additional constraints of instrument footprint and cost. Often the illumination design process is confined to maximizing signal to noise without the ability to adjust any of the above parameters. A hybrid solution leverages the best of the existing lighting technologies. This paper will review the design process for this highly constrained, but typical optical optimization scenario for numerous bioanalytical instruments and biomedical devices.

  15. Synthesis of phylogeny and taxonomy into a comprehensive tree of life.

    PubMed

    Hinchliff, Cody E; Smith, Stephen A; Allman, James F; Burleigh, J Gordon; Chaudhary, Ruchi; Coghill, Lyndon M; Crandall, Keith A; Deng, Jiabin; Drew, Bryan T; Gazis, Romina; Gude, Karl; Hibbett, David S; Katz, Laura A; Laughinghouse, H Dail; McTavish, Emily Jane; Midford, Peter E; Owen, Christopher L; Ree, Richard H; Rees, Jonathan A; Soltis, Douglas E; Williams, Tiffani; Cranston, Karen A

    2015-10-13

    Reconstructing the phylogenetic relationships that unite all lineages (the tree of life) is a grand challenge. The paucity of homologous character data across disparately related lineages currently renders direct phylogenetic inference untenable. To reconstruct a comprehensive tree of life, we therefore synthesized published phylogenies, together with taxonomic classifications for taxa never incorporated into a phylogeny. We present a draft tree containing 2.3 million tips, the Open Tree of Life. Realization of this tree required the assembly of two additional community resources: (i) a comprehensive global reference taxonomy and (ii) a database of published phylogenetic trees mapped to this taxonomy. Our open source framework facilitates community comment and contribution, enabling the tree to be continuously updated when new phylogenetic and taxonomic data become digitally available. Although data coverage and phylogenetic conflict across the Open Tree of Life illuminate gaps in both the underlying data available for phylogenetic reconstruction and the publication of trees as digital objects, the tree provides a compelling starting point for community contribution. This comprehensive tree will fuel fundamental research on the nature of biological diversity, ultimately providing up-to-date phylogenies for downstream applications in comparative biology, ecology, conservation biology, climate change, agriculture, and genomics.

  16. Geodynamo theory and simulations

    NASA Astrophysics Data System (ADS)

    Roberts, Paul H.; Glatzmaier, Gary A.

    2000-10-01

    Eighty years ago, Joseph Larmor planted the seed that grew into today's imposing body of knowledge about how the Earth's magnetic field is created. His simple idea, that the geomagnetic field is the result of dynamo action in the Earth's electrically conducting, fluid core, encountered many difficulties, but these have by now been largely overcome, while alternative proposals have been found to be untenable. The development of the theory and its current status are reviewed below. The basic electrodynamics are summarized, but the main focus is on dynamical questions. A special study is made of the energy and entropy requirements of the dynamo and in particular of how efficient it is, considered as a heat engine. Particular attention is paid to modeling core magnetohydrodynamics in a way that is tractable but nevertheless incorporates the dynamical effects of core turbulence in an approximate way. This theory has been tested by numerical integrations, some results from which are presented. The success of these simulations seems to be considerable, when measured against the known geomagnetic facts summarized here. Obstacles that still remain to be overcome are discussed, and some other future challenges are described.

  17. Risks, benefits, complications and harms: neglected factors in the current debate on non-therapeutic circumcision.

    PubMed

    Darby, Robert

    2015-03-01

    Much of the contemporary debate about the propriety of non-therapeutic circumcision of male infants and boys revolves around the question of risks vs. benefits. With its headline conclusion that the benefits outweigh the risks, the current circumcision policy of the American Academy of Pediatrics [AAP] (released 2012) is a typical instance of this line of thought. Since the AAP states that it cannot assess the true incidence of complications, however, critics have pointed out that this conclusion is unwarranted. In this paper it is argued that the AAP's conclusion is untenable not only for empirical reasons related to lack of data, but also for logical and conceptual reasons: the concept of risk employed (risk of surgical complications) is too narrow to be useful in the circumcision debate. Complications are not the only harms of circumcision: the AAP and other parties debating the pros and cons of circumcision should conceptualize their analysis more broadly as risk of harm vs. prospect of benefit, thereby factoring in the value of the foreskin to the individual and the physical and ethical harms of removing it from a non-consenting child.

  18. Hybrid Microgrid Model based on Solar Photovoltaics with Batteries and Fuel Cells system for intermittent applications

    NASA Astrophysics Data System (ADS)

    Patterson, Maxx

    Microgrids are a subset of the modern power structure, using distributed generation (DG) to supply power to communities rather than vast regions. The reduced scale mitigates losses, allowing the power produced to do more with better control, giving greater security, reliability, and design flexibility. This paper explores the performance and cost viability of a hybrid grid-tied microgrid that utilizes photovoltaic (PV), battery, and fuel cell (FC) technology. The concept proposes that each community home is equipped with more PV than is required for normal operation. As the homes are part of a microgrid, excess or unused energy from one home is collected for use elsewhere within the microgrid footprint. The surplus power that would have been discarded becomes a community asset and is used to run intermittent services. In the modeled community, homes do not have adjacent parking that would allow installation of privately owned, slower Level 2 chargers, making individual EV ownership untenable. A solution is to provide a Level 3 DC Quick Charger (DCQC) as the intermittent service. The addition of batteries and fuel cells is meant to increase load leveling and reliability and to provide limited islanding capability.
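
    A back-of-the-envelope sketch of the pooling idea (all numbers hypothetical, not from the paper): each home's surplus PV energy is aggregated hour by hour, and the pooled surplus determines how much of each hour the community DCQC can run at full rate (50 kW for one full hour consumes 50 kWh).

      import numpy as np

      rng = np.random.default_rng(7)
      homes, hours = 40, 24
      # assumed hourly output of one oversized rooftop PV array (kWh)
      pv = np.maximum(0.0, np.sin((np.arange(hours) - 6) / 12.0 * np.pi)) * 6.0
      # assumed hourly household loads (kWh)
      load = 1.0 + 0.8 * rng.random((homes, hours))
      # pooled community surplus, hour by hour
      surplus = np.maximum(0.0, pv[None, :] - load).sum(axis=0)

      dcqc_kw = 50.0   # a Level 3 DC quick charger drawing only pooled surplus
      full_rate_hours = np.minimum(surplus / dcqc_kw, 1.0)
      print(f"daily surplus available to the DCQC: {surplus.sum():.0f} kWh")
      print(f"equivalent hours of full-rate quick charging: {full_rate_hours.sum():.1f}")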

  19. Lepton jets and low-mass sterile neutrinos at hadron colliders

    NASA Astrophysics Data System (ADS)

    Dube, Sourabh; Gadkari, Divya; Thalapillil, Arun M.

    2017-09-01

    Sterile neutrinos, if they exist, are potential harbingers for physics beyond the Standard Model. They have the capacity to shed light on our flavor sector, grand unification frameworks, dark matter sector and the origins of baryon-antibaryon asymmetry. A few seminal studies have broached the subject of sterile neutrinos with low, electroweak-scale masses (i.e., Λ_QCD ≪ m_NR ≪ m_W±) and investigated their reach at hadron colliders using lepton jets. These preliminary studies nevertheless assume background-free scenarios after certain selection criteria, which are overly optimistic and untenable in realistic situations, and lead to incorrect projections. The unique signal topology and challenging hadronic environment also make this mass-scale regime ripe for a careful investigation. With the above motivations, we attempt to perform the first systematic study of low, electroweak-scale, right-handed neutrinos at hadron colliders in this unique signal topology. There are currently no active searches at hadron colliders for sterile neutrino states in this mass range, and we frame the study in the context of the 13 TeV high-luminosity Large Hadron Collider and the proposed FCC-hh/SppC 100 TeV pp collider.

  20. Human Rights and the Excess of Identity: A Legal and Theoretical Inquiry into the Notion of Identity in Strasbourg Case Law.

    PubMed

    Al Tamimi, Yussef

    2018-06-01

    Identity is a central theme in contemporary politics, but legal academia lacks a rigorous analysis of this concept. The aim of this article is twofold: (i) firstly, it aims to reveal presumptions on identity in human rights law by mapping how the European Court of Human Rights approaches identity and (ii) secondly, it seeks to analyse these presumptions using theoretical insights on identity. By merging legal and theoretical analysis, this article contributes a reading of the Court's case law which suggests that the tension between the political and apolitical is visible as a common thread in the Court's use of identity. In case law concerning paternity, the Court appears to hold a specific view of what is presented as an unquestionable part of identity. This ostensibly pre-political notion of identity becomes untenable in cases where the nature of an identity feature, such as the headscarf, is contended or a minority has adopted a national identity that conflicts with the majoritarian national identity. The Court's approach to identity in such cases reflects a paradox that is inherent to identity; identity is personal while simultaneously constituted and shaped by overarching power mechanisms.

  1. Dead tired and bone weary: Grandmothers as caregivers in drug affected inner city households

    PubMed Central

    Dunlap, Eloise; Tourigny, Sylvie C.; Johnson, Bruce D.

    2009-01-01

    At a time of unprecedented growth in the numbers of custodial grandparents, this case study of Emma’s household articulates the stresses inherent to the lives of many grandparents whose own children’s lives are governed by drug use and addiction. We contrast normative expectations traditionally integral to the culture of extended families with the counternormative demands that drug use imposes on households. This highlights the untenable nature of caregiving for Emma and countless others of her generation. Compelled by tradition and sentiment to help their own children, they are thus allowing drug-use-driven norms, values and beliefs to permeate the lives of the grandchildren in their care. Yet, they are also trying to protect those children from drugs and from the violence and conflict that drugs bring into the household. Emma’s own life illustrates the salience of norms of kinship, reciprocity and respect, and the trauma in her household demonstrates how their absence does, indeed, intensify demands and erode resources. We conclude that the imperatives of raising the next generation may necessitate a counternormative willingness on the part of grandparents to exclude their adult drug-using children from their households. PMID:20011671

  2. Against the inalienable right to withdraw from research.

    PubMed

    Chwang, Eric

    2008-08-01

    In this paper I argue, against the current consensus, that the right to withdraw from research is sometimes alienable. In other words, research subjects are sometimes morally permitted to waive their right to withdraw. The argument proceeds in three major steps. In the first step, I argue that rights typically should be presumed alienable, both because that is not illegitimately coercive and because the general paternalistic motivation for keeping them inalienable is untenable. In the second step of the argument, I consider three special characteristics of the right to withdraw, first that its waiver might be exploitative, second that research involves intimate bodily access, and third that it is irreversible. I argue that none of these characteristics justify an inalienable right to withdraw. In the third step, I examine four considerations often taken to justify various other allegedly inalienable rights: concerns about treating yourself merely as a means as might be the case in suicide, concerns about revoking all your future freedoms in slavery contracts, the resolution of coordination problems, and public interest. I argue that the motivations involved in these four types of situations do not apply to the right to withdraw from research.

  3. New Insights into Signed Path Coefficient Granger Causality Analysis

    PubMed Central

    Zhang, Jian; Li, Chong; Jiang, Tianzi

    2016-01-01

    Granger causality analysis, as a time series analysis technique derived from econometrics, has been applied in an ever-increasing number of publications in the field of neuroscience, including fMRI, EEG/MEG, and fNIRS. The present study mainly focuses on the validity of “signed path coefficient Granger causality,” a Granger-causality-derived analysis method that has been adopted by many fMRI studies in the last few years. This method generally estimates the causal effect among the time series by an order-1 autoregression, and interprets a positive or negative coefficient as an “excitatory” or “inhibitory” influence. In the current work we conducted a series of computations on resting-state fMRI data and simulation experiments to show that the signed path coefficient method is flawed and untenable, because the autoregressive coefficients are not always consistent with the real causal relationships, which inevitably leads to erroneous conclusions. Overall our findings suggest that the applicability of this kind of causality analysis is rather limited; researchers should therefore be more cautious in applying signed path coefficient Granger causality to fMRI data to avoid misinterpretation. PMID:27833547
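
    A small simulation makes the flaw easy to see. In the sketch below (hypothetical parameters, not the computations from the paper), the true influence of x on y is positive but acts at lag 2; because x is negatively autocorrelated, the order-1 autoregression assigns x a negative coefficient, which a signed path coefficient reading would miscall an "inhibitory" influence.

      import numpy as np

      rng = np.random.default_rng(42)
      T = 20000
      x = np.zeros(T)
      y = np.zeros(T)
      for t in range(2, T):
          x[t] = -0.7 * x[t - 1] + rng.normal()
          # the true influence of x on y is POSITIVE (+0.9) but acts at lag 2
          y[t] = 0.5 * y[t - 1] + 0.9 * x[t - 2] + rng.normal()

      # order-1 model y_t ~ y_{t-1} + x_{t-1}, as in the criticized method
      D = np.column_stack([y[1:-1], x[1:-1]])
      coef = np.linalg.lstsq(D, y[2:], rcond=None)[0]
      print(f"order-1 coefficient on x[t-1]: {coef[1]:+.3f}")
      # the fitted sign is negative, which a signed-path reading would call
      # "inhibitory", even though the true coupling is excitatory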

  4. New Insights into Signed Path Coefficient Granger Causality Analysis.

    PubMed

    Zhang, Jian; Li, Chong; Jiang, Tianzi

    2016-01-01

    Granger causality analysis, as a time series analysis technique derived from econometrics, has been applied in an ever-increasing number of publications in the field of neuroscience, including fMRI, EEG/MEG, and fNIRS. The present study mainly focuses on the validity of "signed path coefficient Granger causality," a Granger-causality-derived analysis method that has been adopted by many fMRI studies in the last few years. This method generally estimates the causal effect among the time series by an order-1 autoregression, and interprets a positive or negative coefficient as an "excitatory" or "inhibitory" influence. In the current work we conducted a series of computations on resting-state fMRI data and simulation experiments to show that the signed path coefficient method is flawed and untenable, because the autoregressive coefficients are not always consistent with the real causal relationships, which inevitably leads to erroneous conclusions. Overall our findings suggest that the applicability of this kind of causality analysis is rather limited; researchers should therefore be more cautious in applying signed path coefficient Granger causality to fMRI data to avoid misinterpretation.

  5. Vindicating virtue: a critical analysis of the situationist challenge against Aristotelian moral psychology.

    PubMed

    Croom, Adam M

    2014-03-01

    This article provides a critical analysis of the situationist challenge against Aristotelian moral psychology. It first outlines the details and results from four paradigmatic studies in psychology that situationists have heavily drawn upon in their critique of the Aristotelian conception of virtuous characteristics, including studies conducted by Hartshorne and May (1928), Darley and Batson (Journal of Personality and Social Psychology 27:100-108, 1973), Isen and Levin (Journal of Personality and Social Psychology 21:384-388, 1972), and Milgram (Journal of Abnormal and Social Psychology 67:371-378, 1963). It then presents ten problems with the way situationists have used these studies to challenge Aristotelian moral psychology. After challenging the situationists on these grounds, the article then proceeds to challenge the situationist presentation of the Aristotelian conception, showing that situationists have provided an oversimplified caricature of it that goes against the grain of much Aristotelian text. In evaluating the situationist challenge against the actual results from empirical research as well as primary Aristotelian text, it will be shown that the situationist debate has advanced both an extreme, untenable view about the nature of characteristics and situations, as well as an inaccurate presentation of the Aristotelian view.

  6. A New Spin to Exoplanet Habitability Criteria

    NASA Astrophysics Data System (ADS)

    Georgoulis, M. K.; Patsourakos, S.

    2017-12-01

    We describe a physically and statistically based method to infer the near-Sun magnetic field of coronal mass ejections (CMEs) and then extrapolate it to the inner heliosphere and beyond. Besides a ballpark agreement with in-situ observations of interplanetary CMEs (ICMEs) at L1, we use our estimates to show that Earth does not seem to be at risk of an extinction-level atmospheric erosion or stripping by the magnetic pressure of extreme solar eruptions, even well above a Carrington-type event. This does not seem to be the case with exoplanets, however, at least those orbiting in the classically defined habitability zones of magnetically active dwarf stars at orbital radii of a small fraction of 1 AU. We show that the combination of stellar ICMEs and the tidal locking zone of the host stars, which quite likely prevents these exoplanets from attaining Earth-like magnetic fields to shield themselves, probably renders the existence of a proper atmosphere on them untenable. We propose, therefore, a critical revision of habitability criteria in these cases, which would limit the number of target exoplanets considered as potential biosphere hosts.

  7. Informatics in radiology: automated Web-based graphical dashboard for radiology operational business intelligence.

    PubMed

    Nagy, Paul G; Warnock, Max J; Daly, Mark; Toland, Christopher; Meenan, Christopher D; Mezrich, Reuben S

    2009-11-01

    Radiology departments today are faced with many challenges to improve operational efficiency, performance, and quality. Many organizations rely on antiquated, paper-based methods to review their historical performance and understand their operations. With increased workloads, geographically dispersed image acquisition and reading sites, and rapidly changing technologies, this approach is increasingly untenable. A Web-based dashboard was constructed to automate the extraction, processing, and display of indicators and thereby provide useful and current data for twice-monthly departmental operational meetings. The feasibility of extracting specific metrics from clinical information systems was evaluated as part of a longer-term effort to build a radiology business intelligence architecture. Operational data were extracted from clinical information systems and stored in a centralized data warehouse. Higher-level analytics were performed on the centralized data, a process that generated indicators in a dynamic Web-based graphical environment that proved valuable in discussion and root cause analysis. Results aggregated over a 24-month period since implementation suggest that this operational business intelligence reporting system has provided significant data for driving more effective management decisions to improve productivity, performance, and quality of service in the department.
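
    As a minimal sketch of the kind of indicator such a dashboard computes, the following assumes a hypothetical warehouse table of exam timestamps (table and column names are illustrative, not those of the authors' system) and derives a mean report turnaround time per modality.

      import sqlite3

      # hypothetical mini-warehouse of exam completion and report signing times
      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE exams (accession TEXT, modality TEXT,"
                   " completed_ts TEXT, signed_ts TEXT)")
      conn.executemany("INSERT INTO exams VALUES (?, ?, ?, ?)", [
          ("A1", "CT", "2009-10-01 08:00", "2009-10-01 10:30"),
          ("A2", "CT", "2009-10-01 09:00", "2009-10-02 09:00"),
          ("A3", "MR", "2009-10-01 07:00", "2009-10-01 19:00"),
      ])

      # one dashboard indicator: mean report turnaround time per modality
      query = """
          SELECT modality,
                 COUNT(*) AS n_exams,
                 AVG((julianday(signed_ts) - julianday(completed_ts)) * 24.0)
                     AS mean_tat_hours
          FROM exams
          GROUP BY modality
          ORDER BY mean_tat_hours DESC
      """
      for modality, n_exams, tat in conn.execute(query):
          print(f"{modality}: {n_exams} exams, mean turnaround {tat:.1f} h")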

  8. Holocene alluvial stratigraphy and response to climate change in the Roaring River valley, Front Range, Colorado, USA

    USGS Publications Warehouse

    Madole, Richard F.

    2012-01-01

    Stratigraphic analyses and radiocarbon geochronology of alluvial deposits exposed along the Roaring River, Colorado, lead to three principal conclusions: (1) the opinion that stream channels in the higher parts of the Front Range are relics of the Pleistocene and nonalluvial under the present climate, as argued in a water-rights trial USA v. Colorado, is untenable, (2) beds of clast-supported gravel alternate in vertical succession with beds of fine-grained sediment (sand, mud, and peat) in response to centennial-scale changes in snowmelt-driven peak discharges, and (3) alluvial strata provide information about Holocene climate history that complements the history provided by cirque moraines, periglacial deposits, and paleontological data. Most alluvial strata are of late Holocene age and record, among other things, that: (1) the largest peak flows since the end of the Pleistocene occurred during the late Holocene; (2) a mid- to late Holocene interval (~2450–1630(?) cal yr BP) of warmer climate occurred, which is not clearly identified in palynological records; and (3) the Little Ice Age climate seems to have had little impact on stream channels, except perhaps for minor (~1 m) incision.

  9. Holocene alluvial stratigraphy and response to climate change in the Roaring River valley, Front Range, Colorado, USA

    NASA Astrophysics Data System (ADS)

    Madole, Richard F.

    2012-09-01

    Stratigraphic analyses and radiocarbon geochronology of alluvial deposits exposed along the Roaring River, Colorado, lead to three principal conclusions: (1) the opinion that stream channels in the higher parts of the Front Range are relics of the Pleistocene and nonalluvial under the present climate, as argued in a water-rights trial USA v. Colorado, is untenable, (2) beds of clast-supported gravel alternate in vertical succession with beds of fine-grained sediment (sand, mud, and peat) in response to centennial-scale changes in snowmelt-driven peak discharges, and (3) alluvial strata provide information about Holocene climate history that complements the history provided by cirque moraines, periglacial deposits, and paleontological data. Most alluvial strata are of late Holocene age and record, among other things, that: (1) the largest peak flows since the end of the Pleistocene occurred during the late Holocene; (2) a mid- to late Holocene interval (~2450-1630(?) cal yr BP) of warmer climate occurred, which is not clearly identified in palynological records; and (3) the Little Ice Age climate seems to have had little impact on stream channels, except perhaps for minor (~1 m) incision.

  10. Science education as an exercise in foreign affairs

    NASA Astrophysics Data System (ADS)

    Cobern, William W.

    1995-07-01

    In Kuhnian terms, science education has been a process of inducting students into the reigning paradigms of science. In 1985, Duschl noted that science education had not kept pace with developments in the history and philosophy of science. The claim of certainty for scientific knowledge, which science educators grounded in positivist philosophy, was rendered untenable years ago; it turns out, moreover, that social and cultural factors surrounding discovery may be at least as important as the justification of knowledge. Capitalizing on these developments, Duschl, Hamilton, and Grandy (1990) made a compelling argument for a joint research effort in science education involving the philosophy and history of science along with cognitive psychology. However, the issue of discovery compels the research community to go one step further. If the science education community has been guilty of neglecting historical and philosophical issues in science, let it not now be guilty of ignoring sociological issues. A collaborative view ought also to include the sociological study of the cultural milieu in which scientific ideas arise: in other words, an external sociological perspective on science. The logic of discovery, viewed sociologically, implies that conceptual change can also be viewed from a sociological perspective.

  11. Coupled local facilitation and global hydrologic inhibition drive landscape geometry in a patterned peatland

    NASA Astrophysics Data System (ADS)

    Acharya, S.; Kaplan, D. A.; Casey, S.; Cohen, M. J.; Jawitz, J. W.

    2015-05-01

    Self-organized landscape patterning can arise in response to multiple processes. Discriminating among alternative patterning mechanisms, particularly where experimental manipulations are untenable, requires process-based models. Previous modeling studies have attributed patterning in the Everglades (Florida, USA) to sediment redistribution and anisotropic soil hydraulic properties. In this work, we tested an alternate theory, the self-organizing canal (SOC) hypothesis, by developing a cellular automata model that simulates pattern evolution via local positive feedbacks (i.e., facilitation) coupled with a global negative feedback based on hydrology. The model is forced by global hydroperiod that drives stochastic transitions between two patch types: ridge (higher elevation) and slough (lower elevation). We evaluated model performance using multiple criteria based on six statistical and geostatistical properties observed in reference portions of the Everglades landscape: patch density, patch anisotropy, semivariogram ranges, power-law scaling of ridge areas, perimeter area fractal dimension, and characteristic pattern wavelength. Model results showed strong statistical agreement with reference landscapes, but only when anisotropically acting local facilitation was coupled with hydrologic global feedback, for which several plausible mechanisms exist. Critically, the model correctly generated fractal landscapes that had no characteristic pattern wavelength, supporting the invocation of global rather than scale-specific negative feedbacks.
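
    The coupling the authors describe can be conveyed with a toy cellular automaton (a deliberately simplified sketch, not the published model; the neighborhood weights, set point, and rate constants are assumptions):

        import random

        SIZE, STEPS = 50, 20000
        TARGET_RIDGE_FRACTION = 0.5   # assumed global hydrologic set point
        # True = ridge, False = slough; random initial landscape.
        grid = [[random.random() < 0.3 for _ in range(SIZE)] for _ in range(SIZE)]

        def facilitation(g, i, j):
            # Anisotropic local facilitation: neighbors along one axis count more.
            coords = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
            weights = [1.0, 1.0, 0.25, 0.25]
            return sum(w for (x, y), w in zip(coords, weights) if g[x % SIZE][y % SIZE]) / 2.5

        for _ in range(STEPS):
            ridge_fraction = sum(map(sum, grid)) / SIZE ** 2
            # Global negative feedback: ridge expansion slows past the set point.
            brake = TARGET_RIDGE_FRACTION - ridge_fraction
            i, j = random.randrange(SIZE), random.randrange(SIZE)
            p_ridge = min(max(0.5 + 0.1 * facilitation(grid, i, j) + brake, 0.0), 1.0)
            grid[i][j] = random.random() < p_ridge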

  12. Coupled local facilitation and global hydrologic inhibition drive landscape geometry in a patterned peatland

    NASA Astrophysics Data System (ADS)

    Acharya, S.; Kaplan, D. A.; Casey, S.; Cohen, M. J.; Jawitz, J. W.

    2015-01-01

    Self-organized landscape patterning can arise in response to multiple processes. Discriminating among alternative patterning mechanisms, particularly where experimental manipulations are untenable, requires process-based models. Previous modeling studies have attributed patterning in the Everglades (Florida, USA) to sediment redistribution and anisotropic soil hydraulic properties. In this work, we tested an alternate theory, the self-organizing canal (SOC) hypothesis, by developing a cellular automata model that simulates pattern evolution via local positive feedbacks (i.e., facilitation) coupled with a global negative feedback based on hydrology. The model is forced by global hydroperiod that drives stochastic transitions between two patch types: ridge (higher elevation) and slough (lower elevation). We evaluated model performance using multiple criteria based on six statistical and geostatistical properties observed in reference portions of the Everglades landscape: patch density, patch anisotropy, semivariogram ranges, power-law scaling of ridge areas, perimeter area fractal dimension, and characteristic pattern wavelength. Model results showed strong statistical agreement with reference landscapes, but only when anisotropically acting local facilitation was coupled with hydrologic global feedback, for which several plausible mechanisms exist. Critically, the model correctly generated fractal landscapes that had no characteristic pattern wavelength, supporting the invocation of global rather than scale-specific negative feedbacks.

  13. Stochastic gradient ascent outperforms gamers in the Quantum Moves game

    NASA Astrophysics Data System (ADS)

    Sels, Dries

    2018-04-01

    In a recent work on quantum state preparation, Sørensen and co-workers [Nature (London) 532, 210 (2016), 10.1038/nature17620] explore the possibility of using video games to help design quantum control protocols. The authors present a game called "Quantum Moves" (https://www.scienceathome.org/games/quantum-moves/) in which gamers have to move an atom from A to B by means of optical tweezers. They report that, "players succeed where purely numerical optimization fails." Moreover, by harnessing the player strategies, they can "outperform the most prominent established numerical methods." The aim of this Rapid Communication is to analyze the problem in detail and show that those claims are untenable. In fact, without any prior knowledge and starting from a random initial seed, a simple stochastic local optimization method finds near-optimal solutions which outperform all players. Counterdiabatic driving can even be used to generate protocols without resorting to numeric optimization. The analysis results in an accurate analytic estimate of the quantum speed limit which, apart from zero-point motion, is shown to be entirely classical in nature. The latter might explain why gamers are reasonably good at the game. A simple modification of the BringHomeWater challenge is proposed to test this hypothesis.
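
    The stochastic local optimization the author refers to amounts to greedy hill climbing over a discretized control protocol; a generic sketch follows (a toy quadratic objective stands in for the game's state-transfer fidelity, and the segment count and step size are assumptions):

        import random

        N = 40                      # piecewise-constant control segments
        target = [0.5] * N          # optimum of the toy objective (unknown to the optimizer)

        def objective(protocol):
            # Stand-in for fidelity: equals 1.0 when the protocol matches `target`.
            return 1.0 - sum((p - t) ** 2 for p, t in zip(protocol, target)) / N

        protocol = [random.uniform(-1.0, 1.0) for _ in range(N)]   # random initial seed
        best = objective(protocol)
        for _ in range(20000):
            k = random.randrange(N)
            trial = protocol[:]
            trial[k] += random.gauss(0.0, 0.05)   # local random perturbation
            score = objective(trial)
            if score > best:                      # keep only uphill moves
                protocol, best = trial, score
        print(f"final objective: {best:.4f}")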

  14. Erasmus Darwin, Herbert Spencer, and the origins of the evolutionary worldview in British provincial scientific culture, 1770-1850.

    PubMed

    Elliott, Paul

    2003-03-01

    The significance of Herbert Spencer's evolutionary philosophy has been generally recognized for over a century, as the familiarity of his phrase "survival of the fittest" indicates, yet accounts of the origins of his system still tend to follow too closely his own description, written many decades later. This essay argues that Spencer's own interpretation of his intellectual development gives an inadequate impression of the debt he owed to provincial scientific culture and its institutions. Most important, it shows that his evolutionism was originally stimulated by his association with the Derby philosophical community, for it was through this group--of which his father, who also appears to have espoused a deistic evolutionary theory, was a member--that he was first exposed to progressive Enlightenment social and educational philosophies and to the evolutionary worldview of Erasmus Darwin, the first president of the Derby Philosophical Society. Darwin's scheme was the first to incorporate biological evolution, associationist psychology, evolutionary geology, and cosmological developmentalism. Spencer's own implicit denials of the link with Darwin are shown to be implausible in the face of Darwin's continuing influence on the Derby savants; these denials were the product of insecurity in his later years, when he feared for his reputation as Lamarckism became increasingly untenable.

  15. Greater magnocellular saccadic suppression in high versus low autistic tendency suggests a causal path to local perceptual style.

    PubMed

    Crewther, David P; Crewther, Daniel; Bevan, Stephanie; Goodale, Melvyn A; Crewther, Sheila G

    2015-12-01

    Saccadic suppression, the reduction of visual sensitivity during rapid eye movements, has previously been proposed to reflect a specific suppression of the magnocellular visual system, with the initial neural site of that suppression at or prior to afferent visual information reaching striate cortex. Dysfunction in the magnocellular visual pathway has also been associated with perceptual and physiological anomalies in individuals with autism spectrum disorder or high autistic tendency, leading us to question whether saccadic suppression is altered in the broader autism phenotype. Here we show that individuals with high autistic tendency show greater saccadic suppression of low versus high spatial frequency gratings while those with low autistic tendency do not. In addition, those with high but not low autism spectrum quotient (AQ) demonstrated pre-cortical (35-45 ms) evoked potential differences (saccade versus fixation) to a large, low contrast, pseudo-randomly flashing bar. Both AQ groups showed similar differential visual evoked potential effects in later epochs (80-160 ms) at high contrast. Thus, the magnocellular theory of saccadic suppression appears untenable as a general description for the typically developing population. Our results also suggest that the bias towards local perceptual style reported in autism may be due to selective suppression of low spatial frequency information accompanying every saccadic eye movement.

  16. Synthesis of phylogeny and taxonomy into a comprehensive tree of life

    PubMed Central

    Hinchliff, Cody E.; Smith, Stephen A.; Allman, James F.; Burleigh, J. Gordon; Chaudhary, Ruchi; Coghill, Lyndon M.; Crandall, Keith A.; Deng, Jiabin; Drew, Bryan T.; Gazis, Romina; Gude, Karl; Hibbett, David S.; Katz, Laura A.; Laughinghouse, H. Dail; McTavish, Emily Jane; Midford, Peter E.; Owen, Christopher L.; Ree, Richard H.; Rees, Jonathan A.; Soltis, Douglas E.; Williams, Tiffani; Cranston, Karen A.

    2015-01-01

    Reconstructing the phylogenetic relationships that unite all lineages (the tree of life) is a grand challenge. The paucity of homologous character data across disparately related lineages currently renders direct phylogenetic inference untenable. To reconstruct a comprehensive tree of life, we therefore synthesized published phylogenies, together with taxonomic classifications for taxa never incorporated into a phylogeny. We present a draft tree containing 2.3 million tips—the Open Tree of Life. Realization of this tree required the assembly of two additional community resources: (i) a comprehensive global reference taxonomy and (ii) a database of published phylogenetic trees mapped to this taxonomy. Our open source framework facilitates community comment and contribution, enabling the tree to be continuously updated when new phylogenetic and taxonomic data become digitally available. Although data coverage and phylogenetic conflict across the Open Tree of Life illuminate gaps in both the underlying data available for phylogenetic reconstruction and the publication of trees as digital objects, the tree provides a compelling starting point for community contribution. This comprehensive tree will fuel fundamental research on the nature of biological diversity, ultimately providing up-to-date phylogenies for downstream applications in comparative biology, ecology, conservation biology, climate change, agriculture, and genomics. PMID:26385966
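
    The core synthesis step, attaching taxa known only from the taxonomy to a phylogenetic backbone, can be sketched with dictionary-based trees (illustrative names and data structures, not the Open Tree of Life codebase):

        # Backbone phylogeny from published trees: child -> parent.
        backbone = {"Homo": "Hominidae", "Pan": "Hominidae", "Hominidae": "Primates"}

        # Reference taxonomy, covering taxa absent from any published tree.
        taxonomy = {"Homo": "Hominidae", "Pan": "Hominidae", "Gorilla": "Hominidae",
                    "Pongo": "Hominidae", "Hominidae": "Primates"}

        def synthesize(backbone, taxonomy):
            """Graft taxonomy-only taxa onto the backbone at the most specific
            taxonomic parent that the backbone already contains."""
            tree = dict(backbone)
            known = set(tree) | set(tree.values())
            for taxon, parent in taxonomy.items():
                while parent not in known and parent in taxonomy:
                    parent = taxonomy[parent]   # climb toward the backbone
                if taxon not in known and parent in known:
                    tree[taxon] = parent
                    known.add(taxon)
            return tree

        # Gorilla and Pongo end up as children of Hominidae alongside Homo and Pan.
        print(synthesize(backbone, taxonomy))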

  17. Analytical Sociology: A Bungean Appreciation

    NASA Astrophysics Data System (ADS)

    Wan, Poe Yu-ze

    2012-10-01

    Analytical sociology, an intellectual project that has garnered considerable attention across a variety of disciplines in recent years, aims to explain complex social processes by dissecting them, accentuating their most important constituent parts, and constructing appropriate models to understand the emergence of what is observed. To achieve this goal, analytical sociologists demonstrate an unequivocal focus on the mechanism-based explanation grounded in action theory. In this article I attempt a critical appreciation of analytical sociology from the perspective of Mario Bunge's philosophical system, which I characterize as emergentist systemism. I submit that while the principles of analytical sociology and those of Bunge's approach share a lot in common, the latter brings to the fore the ontological status and explanatory importance of supra-individual actors (as concrete systems endowed with emergent causal powers) and macro-social mechanisms (as processes unfolding in and among social systems), and therefore it does not stipulate that every causal explanation of social facts has to include explicit references to individual-level actors and mechanisms. In this sense, Bunge's approach provides a reasonable middle course between the Scylla of sociological reification and the Charybdis of ontological individualism, and thus serves as an antidote to the untenable "strong program of microfoundations" to which some analytical sociologists are committed.

  18. The uncertain foundation of neo-Darwinism: metaphysical and epistemological pluralism in the evolutionary synthesis.

    PubMed

    Delisle, Richard G

    2009-06-01

    The Evolutionary Synthesis is often seen as a unification process in evolutionary biology, one which provided this research area with a solid common theoretical foundation. As such, neo-Darwinism is believed to constitute from this time onward a single, coherent, and unified movement offering research guidelines for investigations. While this may be true if evolutionary biology is solely understood as centred around evolutionary mechanisms, an entirely different picture emerges once other aspects of the founding neo-Darwinists' views are taken into consideration, aspects potentially relevant to the elaboration of an evolutionary worldview: the tree of life, the ontological distinctions of the main cosmic entities (inert matter, biological organisms, mind), the inherent properties of self-organizing matter, evolutionary ethics, and so on. Profound tensions and inconsistencies are immediately revealed in the neo-Darwinian movement once this broader perspective is adopted. This pluralism is such that it is possible to identify at least three distinct and quasi-incommensurable epistemological/metaphysical frameworks as providing a proper foundation for neo-Darwinism. The analysis of the views of Theodosius Dobzhansky, Bernhard Rensch, and Ernst Mayr will illustrate this untenable pluralism, one which requires us to conceive of the neo-Darwinian research agenda as being conducted in more than one research programme or research tradition at the same time.

  19. On the Historical and Conceptual Foundations of a Community Psychology of Social Transformation.

    PubMed

    Gokani, Ravi; Walsh, Richard T G

    2017-06-01

    We examine historical and conceptual literature in community psychology in order to understand the field's potential to be the socially transformative subdiscipline of psychology to which it aspires. By reviewing papers from two prominent journals and other literature, we conclude that the claim that community psychology is well-suited to social transformation, because it is a product of Sixties' radicalism and is theoretically equipped, is untenable. Systematic accounts of the subdiscipline's origins suggest that the transformative aspirations of current community psychologists do not correspond to the subdiscipline's reformist past. Furthermore, in analyzing three related concepts currently employed in the field-social justice, power, and praxis-we show that each suffers from conceptual ambiguity and a restricted political scope. These conceptual flaws, coupled with community psychology's historical inclination toward social reform, inhibit the possibility of contributing to radical social transformation. We conclude that neither questionable historical claims nor ambiguous and politically dubious concepts support a community psychology of social transformation. We offer solutions for the historical and conceptual problems we identify and, as a broader solution to the problem of engaging in socially transformative work, propose that community psychologists should seek direct political engagement in solidarity with other citizens as fellow citizens not as psychologists. © Society for Community Research and Action 2017.

  20. Judging without criteria? Sickness certification in Dutch disability schemes.

    PubMed

    Meershoek, Agnes; Krumeich, Anja; Vos, Rein

    2007-05-01

    The gate-keeping function that physicians perform in determining clients' physical and mental incapacities is widely assumed to be the main reason for the rising numbers of disabled people. The sharp rise in the number of disabled has led many to claim that the disability benefits schemes are untenable. In order to regain public control and to make disability eligibility procedures more transparent, guidelines have been introduced in which medical evaluations are conceptualised as formal rational decisions. It is, however, questionable whether such measures are helpful in achieving their stated aims. This paper is based on ethnographic research on the ways physicians evaluate the eligibility of clients for disability benefits. It argues that assessing incapacity involves much more than formal rational decision-making. Doctors' reasoning is contextual and deliberative in character, and thus their assessment of a client's incapacity is less a technical matter than a normative one. Instead of generating transparency, guidelines based on formal rationality make the complex deliberations on which such judgments are based invisible, because they deny the normative dimension of medical expert decision-making. Therefore, different measures have to be developed that allow this normative dimension to be articulated, since insight into this normative dimension is a necessary pre-condition to be able to criticise disability judgments at all.

  1. Horizontal Axis Wind Turbine Experiments at Full-Scale Reynolds Numbers

    NASA Astrophysics Data System (ADS)

    Miller, Mark; Kiefer, Janik; Nealon, Tara; Westergaard, Carsten; Hultmark, Marcus

    2017-11-01

    Achieving high Reynolds numbers on a wind turbine model remains a major challenge for experimentalists. Since Reynolds number effects need to be captured accurately, matching this parameter is of great importance. The challenge stems from the large scale ratio between model and full size, typically on the order of 1:100. Traditional wind tunnels are limited by their finite size, with velocity as the only free parameter available for increasing the Reynolds number. Unfortunately, increasing the velocity 100-fold is untenable because it violates Mach number matching with the full scale and results in infeasible rotation rates. Present work in Princeton University's high pressure wind tunnel makes it possible to evaluate the Reynolds number sensitivity of wind turbine aerodynamics. This facility, which uses compressed air as the working fluid, allows the Reynolds number to be adjusted via the fluid density, independent of the tip speed ratio (TSR) and Mach number. Power and thrust coefficients will be shown as a function of Reynolds number and TSR for a model wind turbine. The Reynolds number range investigated exceeds 10 × 10⁶ based on diameter and free-stream conditions, or 3 × 10⁶ based on the tip chord, matching those of the full scale. Funding: National Science Foundation and Andlinger Center for Energy and the Environment.
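
    The scaling argument is easy to make concrete. With Re = rho * V * D / mu, and viscosity nearly independent of pressure, density substitutes for the 100-fold velocity increase (the numbers below are illustrative operating points, not the facility's exact values):

        def reynolds(rho, v, d, mu=1.8e-5):
            """Re = rho * V * D / mu; mu of air changes little with pressure."""
            return rho * v * d / mu

        full_scale = reynolds(rho=1.2, v=10.0, d=100.0)    # 100 m rotor, ambient air
        # Velocity matching alone would need ~1000 m/s on a 1 m model:
        # roughly Mach 3, so Mach matching is violated and rotation rates explode.
        model_fast = reynolds(rho=1.2, v=1000.0, d=1.0)
        # Compressed air at ~200 atm raises density ~200x at modest velocity.
        model_dense = reynolds(rho=240.0, v=5.0, d=1.0)
        print(f"{full_scale:.2e} {model_fast:.2e} {model_dense:.2e}")   # all ~6.7e7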

  2. Advances in the quantification of mitochondrial function in primary human immune cells through extracellular flux analysis.

    PubMed

    Nicholas, Dequina; Proctor, Elizabeth A; Raval, Forum M; Ip, Blanche C; Habib, Chloe; Ritou, Eleni; Grammatopoulos, Tom N; Steenkamp, Devin; Dooms, Hans; Apovian, Caroline M; Lauffenburger, Douglas A; Nikolajczyk, Barbara S

    2017-01-01

    Numerous studies show that mitochondrial energy generation determines the effectiveness of immune responses. Furthermore, changes in mitochondrial function may regulate lymphocyte function in inflammatory diseases like type 2 diabetes. Analysis of lymphocyte mitochondrial function has been facilitated by introduction of 96-well format extracellular flux (XF96) analyzers, but the technology remains imperfect for analysis of human lymphocytes. Limitations in XF technology include the lack of practical protocols for analysis of archived human cells, and inadequate data analysis tools that require manual quality checks. Current analysis tools for XF outcomes are also unable to automatically assess data quality and delete untenable data from the relatively high number of biological replicates needed to power complex human cell studies. The objectives of work presented herein are to test the impact of common cellular manipulations on XF outcomes, and to develop and validate a new automated tool that objectively analyzes a virtually unlimited number of samples to quantitate mitochondrial function in immune cells. We present significant improvements on previous XF analyses of primary human cells that will be absolutely essential to test the prediction that changes in immune cell mitochondrial function and fuel sources support immune dysfunction in chronic inflammatory diseases like type 2 diabetes.
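
    One simple way to automate the kind of quality check described, flagging untenable replicate wells without manual review, is a robust outlier screen (a sketch under assumed thresholds and units, not the published tool):

        import statistics

        def drop_untenable(replicates, cutoff=3.5):
            """Discard replicates whose modified z-score, based on the median
            absolute deviation (MAD), exceeds `cutoff`."""
            med = statistics.median(replicates)
            mad = statistics.median(abs(x - med) for x in replicates) or 1e-9
            return [x for x in replicates if abs(0.6745 * (x - med) / mad) <= cutoff]

        # Oxygen consumption rates (pmol/min) for one condition across wells;
        # the 210.0 well is flagged and removed automatically.
        print(drop_untenable([52.1, 49.8, 55.0, 210.0, 51.2, 48.9]))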

  3. `The Wildest Speculation of All': Lemaître and the Primeval-Atom Universe

    NASA Astrophysics Data System (ADS)

    Kragh, Helge

    Although there is no logical connection between the expanding universe and the idea of a big bang, from a historical perspective the two concepts were intimately connected. Four years after his pioneering work on the expanding universe, Lemaître suggested that the entire universe had originated in a kind of explosive act from what he called a primeval atom and which he likened to a huge atomic nucleus. His theory of 1931 was the first realistic finite-age model based upon relativistic cosmology, but it presupposed a material proto-universe and thus avoided an initial singularity. What were the sources of Lemaître's daring proposal? Well aware that his new cosmological model needed to have testable consequences, he argued that the cosmic rays were fossils of the original radioactive explosion. However, this hypothesis turned out to be untenable. The first big-bang model ever was received with a mixture of indifference and hostility. Why? The answer is not that contemporary cosmologists failed to recognize Lemaître's genius, but rather that his model was scientifically unconvincing. Although Lemaître was indeed the father of big-bang cosmology, his brilliant idea was only turned into a viable cosmological theory by later physicists.

  4. The Johns Hopkins Hospital: identifying and addressing risks and safety issues.

    PubMed

    Paine, Lori A; Baker, David R; Rosenstein, Beryl; Pronovost, Peter J

    2004-10-01

    At The Johns Hopkins Hospital (JHH), a culture of safety refers to the presence of characteristics such as the belief that harm is untenable and the use of a systems approach to analyzing safety issues. The leadership of JHH provides strategic planning guidance for safety and improvement initiatives, involves the patient safety committee in capital investment allocation decisions and in designing and planning new hospital facilities, and ensures that safety and quality head the agenda of board-of-trustees meetings. Although JHH takes a systems approach, structures such as monitoring staff behavior trends are used to hold people accountable for job performance. JHH encountered three major hurdles in implementing and sustaining a culture of safety. First, JHH's decentralized organizational structure contributes to a silo effect that limits the spread of ideas, practices, and culture. JHH intends to create an internal collaborative of departmental safety initiatives to foster opportunities for units to share ideas and results. Second, in response to the challenge of encouraging teams to think and act in an interdisciplinary fashion, communication and teamwork training are being used to enhance the effectiveness of interdisciplinary teams. Further development of valid and meaningful safety-related measurement and data collection methodologies is JHH's largest remaining challenge.

  5. Biomarker combinations for diagnosis and prognosis in multicenter studies: Principles and methods.

    PubMed

    Meisner, Allison; Parikh, Chirag R; Kerr, Kathleen F

    2017-01-01

    Many investigators are interested in combining biomarkers to predict a binary outcome or detect underlying disease. This endeavor is complicated by the fact that many biomarker studies involve data from multiple centers. Depending upon the relationship between center, the biomarkers, and the target of prediction, care must be taken when constructing and evaluating combinations of biomarkers. We introduce a taxonomy to describe the role of center and consider how a biomarker combination should be constructed and evaluated. We show that ignoring center, which is frequently done by clinical researchers, is often not appropriate. The limited statistical literature proposes using random intercept logistic regression models, an approach that we demonstrate is generally inadequate and may be misleading. We instead propose using fixed intercept logistic regression, which appropriately accounts for center without relying on untenable assumptions. After constructing the biomarker combination, we recommend using performance measures that account for the multicenter nature of the data, namely the center-adjusted area under the receiver operating characteristic curve. We apply these methods to data from a multicenter study of acute kidney injury after cardiac surgery. Appropriately accounting for center, both in construction and evaluation, may increase the likelihood of identifying clinically useful biomarker combinations.
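
    A compact sketch of the recommended approach, fixed center intercepts plus a center-adjusted AUC, on simulated data (the simulation itself and all variable names are assumptions):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n, n_centers = 600, 3
        center = rng.integers(0, n_centers, n)
        x = rng.normal(size=(n, 2)) + 0.5 * center[:, None]    # biomarkers shift by center
        logit = x[:, 0] + 0.5 * x[:, 1] - 0.8 * center         # center-specific baseline risk
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        # Fixed intercepts: one indicator column per center, no shared intercept.
        X = np.hstack([x, np.eye(n_centers)[center]])
        model = LogisticRegression(fit_intercept=False, C=1e6).fit(X, y)  # large C ~ unpenalized

        # Center-adjusted AUC: weighted average of within-center AUCs.
        scores = model.decision_function(X)
        aucs, weights = [], []
        for c in range(n_centers):
            m = center == c
            if len(set(y[m])) == 2:            # need both outcomes within the center
                aucs.append(roc_auc_score(y[m], scores[m]))
                weights.append(m.sum())
        print(f"center-adjusted AUC: {np.average(aucs, weights=weights):.3f}")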

  6. Evolutionary history of true crabs (Crustacea: Decapoda: Brachyura) and the origin of freshwater crabs.

    PubMed

    Tsang, Ling Ming; Schubart, Christoph D; Ahyong, Shane T; Lai, Joelle C Y; Au, Eugene Y C; Chan, Tin-Yam; Ng, Peter K L; Chu, Ka Hou

    2014-05-01

    Crabs of the infra-order Brachyura are one of the most diverse groups of crustaceans, with approximately 7,000 described species in 98 families, occurring in marine, freshwater, and terrestrial habitats. The relationships among the brachyuran families are poorly understood due to the high morphological complexity of the group. Here, we reconstruct the most comprehensive phylogeny of Brachyura to date using sequence data of six nuclear protein-coding genes and two mitochondrial rRNA genes from more than 140 species belonging to 58 families. The gene tree confirms that the "Podotremata" are paraphyletic. Within the monophyletic Eubrachyura, the reciprocal monophyly of the two subsections, Heterotremata and Thoracotremata, is supported. Monophyly of many superfamilies, however, is not recovered, indicating the prevalence of morphological convergence and the need for further taxonomic studies. Freshwater crabs were derived early in the evolution of Eubrachyura and are shown to have at least two independent origins. Bayesian relaxed molecular clock methods estimate that freshwater crabs separated from their closest marine sister taxa ~135 Ma, that is, after the break-up of Pangaea (∼200 Ma), and that a Gondwanan origin of these freshwater representatives is untenable. Most extant families and superfamilies arose during the late Cretaceous and early Tertiary.

  7. Innovation and evaluation: taming and unleashing telecare technology.

    PubMed

    Pols, Jeannette; Willems, Dick

    2011-03-01

    Telecare is advocated in most European countries with great, if not grandiose, promises: improving healthcare, lowering costs, solving workforce shortage. This paper does not so much question these specific promises, but rather the 'register of promising' as such, by comparing the promises with actual processes of incorporating technologies in healthcare practices. The case we study is the use of webcams in follow-up care from a Dutch rehabilitation clinic for people with severe chronic obstructive pulmonary disease (COPD). This process shows many changes and contingencies, and corresponding shifts in goals and aims. The conclusion is that when innovative technologies such as telecare are actually put to work, 'the same' technology will perform differently. In order to function at all, technology has to be tamed, it has to be tinkered with to fit the practices of the users. The technology, however, is not meekly put to use (tamed), but is unleashed as well, affecting care practices in unforeseen ways. The untenability of pre-given promises and the fluidity of locally evolving goals has important implications for the way in which innovations are promoted, as well as for the way innovative technologies may be evaluated. © 2010 The Authors. Sociology of Health & Illness © 2010 Foundation for the Sociology of Health & Illness/Blackwell Publishing Ltd.

  8. Pangloss revisited: a critique of the dilution effect and the biodiversity-buffers-disease paradigm.

    PubMed

    Randolph, S E; Dobson, A D M

    2012-06-01

    The twin concepts of zooprophylaxis and the dilution effect originated with vector-borne diseases (malaria), were driven forward by studies on Lyme borreliosis and have now developed into the mantra "biodiversity protects against disease". The basic idea is that by diluting the assemblage of transmission-competent hosts with non-competent hosts, the probability of vectors feeding on transmission-competent hosts is reduced and so the abundance of infected vectors is lowered. The same principle has recently been applied to other infectious disease systems--tick-borne, insect-borne, indirectly transmitted via intermediate hosts, directly transmitted. It is claimed that the presence of extra species of various sorts, acting through a variety of distinct mechanisms, causes the prevalence of infectious agents to decrease. Examination of the theoretical and empirical evidence for this hypothesis reveals that it applies only in certain circumstances even amongst tick-borne diseases, and even less often if considering the correct metric--abundance rather than prevalence of infected vectors. Whether dilution or amplification occurs depends more on specific community composition than on biodiversity per se. We warn against raising a straw man, an untenable argument easily dismantled and dismissed. The intrinsic value of protecting biodiversity and ecosystem function outweighs this questionable utilitarian justification.

  9. A model of differential amygdala activation in psychopathy.

    PubMed

    Moul, Caroline; Killcross, Simon; Dadds, Mark R

    2012-10-01

    This article introduces a novel hypothesis regarding amygdala function in psychopathy. The first part of this article introduces the concept of psychopathy and describes the main cognitive and affective impairments demonstrated by this population; that is, a deficit in fear recognition, lower conditioned fear responses, and poor performance in passive avoidance and response-reversal learning tasks. Evidence for amygdala dysfunction in psychopathy is considered with regard to these deficits; however, the idea of unified amygdala function is untenable. A model of differential amygdala activation, in which the basolateral amygdala (BLA) is underactive while the activity of the central amygdala (CeA) is of average to above-average levels, is proposed to provide a more accurate and up-to-date account of the specific cognitive and emotional deficits found in psychopathy. In addition, the model provides a mechanism by which attentional-based models and emotion-based models of psychopathy can coexist. Data to support the differential amygdala activation model are provided from studies from both human and animal research. Supporting evidence concerning some of the neurochemicals implicated in psychopathy is then reviewed. Implications of the model and areas of future research are discussed. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  10. [ADHD, a 'fashion' that won't go out of fashion. An illustration of the many-sidedness of earlier psychiatry].

    PubMed

    Nieweg, E H

    2006-01-01

    This article reviews the older Dutch literature on what is now known as ADHD (attention deficit hyperactivity disorder). Because this older literature is not included in electronic databases, it was traced, in a systematic and selective manner, by means of bibliographic references. It is often assumed that ADHD is a purely Anglo-Saxon phenomenon that was introduced into The Netherlands along with the concept of MBD (minimal brain damage). This assumption is incorrect. In fact, ADHD has had quite a long, respectable history of its own in The Netherlands (and in surrounding countries). The antecedents of ADHD were regarded as very common phenomena. Over the years these concepts have often been the subject of considerable controversy. Biological factors were thought to be important in the etiology. It would be wrong to regard ADHD only as a fashionable ailment originating in the age of computers and working mothers. Equally untenable is the current view that pre-DSM-III child psychiatrists attributed psychiatric disorders solely to parental influences ('parent-blaming'). The same conclusion was drawn in other publications by the author. The history of ADHD thus demonstrates that, contrary to general belief, early psychiatry in The Netherlands was more balanced and many-sided.

  11. Traces of embryogenesis are the same in monozygotic and dizygotic twins: not compatible with double ovulation.

    PubMed

    Boklage, Charles E

    2009-06-01

    Common knowledge of over a century has it that monozygotic and dizygotic twinning events occur by unrelated mechanisms: monozygotic twinning 'splits' embryos, producing anomalously re-arranged embryogenic asymmetries; dizygotic twinning begins with independent ovulations yielding undisturbed parallel embryogeneses with no expectation of departures from singleton outcomes. The anomalies statistically associated with twin births are due to the re-arranged embryos of the monozygotics. Common knowledge further requires that dizygotic pairs are dichorionic; monochorionicity is exclusive to monozygotic pairs. These are fundamental certainties in the literature of twin biology. Multiple observations contradict those common knowledge understandings. The double ovulation hypothesis of dizygotic twinning is untenable. Girl-boy twins differ subtly from all other humans of either sex, absolutely not representative of all dizygotics. Embryogenesis of dizygotic twins differs from singleton development at least as much as monozygotic embryogenesis does, and in the same ways, and the differences between singletons and twins of both zygosities represent a coherent system of re-arranged embryogenic asymmetries. Dizygotic twinning and monozygotic twinning have the same list of consequences of anomalous embryogenesis. Those include an unignorable fraction of dizygotic pairs that are in fact monochorionic, plus many more sharing co-twins' cells in tissues other than a common chorion. The idea that monozygotic and dizygotic twinning events arise from the same embryogenic mechanism is the only plausible hypothesis that might explain all of the observations.

  12. Development of a fieldable rugged TATP surface-enhanced Raman spectroscopy sensor

    NASA Astrophysics Data System (ADS)

    Spencer, Kevin M.; Clauson, Susan L.; Sylvia, James M.

    2011-06-01

    Surface-enhanced Raman spectroscopy (SERS) has repeatedly been shown to be capable of single-molecule detection in laboratory-controlled environments. However, superior detection of desired compounds in complex situations requires optimization of factors in addition to sensitivity. For example, SERS sensors are metals with surface roughness on the nm scale. A surface at this roughness scale may not adsorb the analyte of interest but may instead catalyze a reaction unless stabilization is designed into the sensor interface. In addition, the SERS sensor needs to be engineered to be sensitive only to the desired analyte(s) or a small subset of analytes; detection of every analyte would saturate the sensor and make data interpretation untenable. Finally, the SERS sensor has to be a preferable adsorption site in passive sampling applications, whether vapor or liquid. In this paper, EIC Laboratories discusses modifications to SERS sensors that increase the likelihood of detection of the analyte of interest. We then demonstrate data collected for TATP, a compound that rapidly decomposes and goes undetected on standard silver SERS sensors. With the modified SERS sensor, ROC curves were generated for room-temperature TATP vapor detection, detection of TATP in a non-equilibrium vapor environment within 30 s, detection of TATP on a sensor exposed to a ventilation duct, and detection of TATP in the presence of fuel components; all are presented herein.

  13. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    PubMed

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology to Big Data analytics and services. A cloud computing system is often composed of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues. But profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visualization-based approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear, suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visualization-based approach is effective in identifying trends and anomalies in the systems.
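
    The backbone of such a view is a similarity measure over each node's multivariate profile time series; a minimal version (the metrics, normalization, and distance are assumed stand-ins for the paper's measures) is:

        import numpy as np

        def znorm(series):
            s = np.asarray(series, dtype=float)
            return (s - s.mean(axis=0)) / (s.std(axis=0) + 1e-9)

        def node_distance(a, b):
            """Distance between two nodes' profiles, each shaped (time, metrics),
            e.g. columns = CPU load, memory, network; z-normalized per metric."""
            return float(np.linalg.norm(znorm(a) - znorm(b)))

        rng = np.random.default_rng(1)
        base = rng.normal(size=(120, 3))                          # 120 samples, 3 metrics
        node_a = base + rng.normal(scale=0.1, size=base.shape)    # behaves like base
        node_b = rng.normal(size=(120, 3))                        # unrelated behavior
        print(node_distance(node_a, base), node_distance(node_b, base))

    Ordering or clustering nodes by such distances places similarly shaped behavioral lines next to each other, which is what makes shared bottleneck patterns visually contiguous.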

  14. Mechanics of evolutionary digit reduction in fossil horses (Equidae).

    PubMed

    McHorse, Brianna K; Biewener, Andrew A; Pierce, Stephanie E

    2017-08-30

    Digit reduction is a major trend that characterizes horse evolution, but its causes and consequences have rarely been quantitatively tested. Using beam analysis on fossilized centre metapodials, we tested how locomotor bone stresses changed with digit reduction and increasing body size across the horse lineage. Internal bone geometry was captured from 13 fossil horse genera that covered the breadth of the equid phylogeny and the spectrum of digit reduction and body sizes, from Hyracotherium to Equus. To account for the load-bearing role of side digits, a novel, continuous measure of digit reduction was also established: the toe reduction index (TRI). Our results show that without accounting for side digits, three-toed horses as late as Parahippus would have experienced physiologically untenable bone stresses. Conversely, when side digits are modelled as load-bearing, species at the base of the horse radiation through Equus probably maintained a similar safety factor to fracture stress. We conclude that the centre metapodial compensated for evolutionary digit reduction and body mass increases by becoming more resistant to bending through substantial positive allometry in internal geometry. These results lend support to two historical hypotheses: that increasing body mass selected for a single, robust metapodial rather than several smaller ones; and that, as horse limbs became elongated, the cost of inertia from the side toes outweighed their utility for stabilization or load-bearing. © 2017 The Author(s).
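
    The underlying beam calculation is standard engineering: bending stress sigma = M * c / I. A sketch for a solid elliptical section follows (all numbers are illustrative; the study used measured fossil cross-sections, and the load factor and side-digit load sharing here are assumptions):

        import math

        def bending_stress(force_n, lever_arm_m, a_m, b_m):
            """Max bending stress, sigma = M * c / I, for a solid ellipse with
            semi-axes a (along the neutral axis) and b (load direction):
            I = pi/4 * a * b**3 and c = b."""
            moment = force_n * lever_arm_m
            inertia = math.pi / 4.0 * a_m * b_m ** 3
            return moment * b_m / inertia

        body_weight = 2500.0            # N, illustrative mid-sized fossil horse
        peak_load = 2.0 * body_weight   # assumed locomotor load factor
        for side_share in (0.0, 0.3):   # fraction of load carried by side digits
            sigma = bending_stress(peak_load * (1.0 - side_share), 0.03, 0.012, 0.009)
            print(f"side-digit share {side_share:.0%}: {sigma / 1e6:.0f} MPa")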

  15. Controversies and priorities in amyotrophic lateral sclerosis

    PubMed Central

    Turner, Martin R; Hardiman, Orla; Benatar, Michael; Brooks, Benjamin R; Chio, Adriano; de Carvalho, Mamede; Ince, Paul G; Lin, Cindy; Miller, Robert G; Mitsumoto, Hiroshi; Nicholson, Garth; Ravits, John; Shaw, Pamela J; Swash, Michael; Talbot, Kevin; Traynor, Bryan J; van den Berg, Leonard H; Veldink, Jan H; Vucic, Steve; Kiernan, Matthew C

    2015-01-01

    Summary Two decades after the discovery that 20% of familial amyotrophic lateral sclerosis (ALS) cases were linked to mutations in the superoxide dismutase-1 (SOD1) gene, a substantial proportion of the remainder of cases of familial ALS have now been traced to an expansion of the intronic hexanucleotide repeat sequence in C9orf72. This breakthrough provides an opportunity to re-evaluate longstanding concepts regarding the cause and natural history of ALS, coming soon after the pathological unification of ALS with frontotemporal dementia through a shared pathological signature of cytoplasmic inclusions of the ubiquitinated protein TDP-43. However, with profound clinical, prognostic, neuropathological, and now genetic heterogeneity, the concept of ALS as one disease appears increasingly untenable. This background calls for the development of a more sophisticated taxonomy, and an appreciation of ALS as the breakdown of a wider network rather than a discrete vulnerable population of specialised motor neurons. Identification of C9orf72 repeat expansions in patients without a family history of ALS challenges the traditional division between familial and sporadic disease. By contrast, the 90% of apparently sporadic cases and incomplete penetrance of several genes linked to familial cases suggest that at least some forms of ALS arise from the interplay of multiple genes, poorly understood developmental, environmental, and age-related factors, as well as stochastic events. PMID:23415570

  16. Greater magnocellular saccadic suppression in high versus low autistic tendency suggests a causal path to local perceptual style

    PubMed Central

    Crewther, David P.; Crewther, Daniel; Bevan, Stephanie; Goodale, Melvyn A.; Crewther, Sheila G.

    2015-01-01

    Saccadic suppression—the reduction of visual sensitivity during rapid eye movements—has previously been proposed to reflect a specific suppression of the magnocellular visual system, with the initial neural site of that suppression at or prior to afferent visual information reaching striate cortex. Dysfunction in the magnocellular visual pathway has also been associated with perceptual and physiological anomalies in individuals with autism spectrum disorder or high autistic tendency, leading us to question whether saccadic suppression is altered in the broader autism phenotype. Here we show that individuals with high autistic tendency show greater saccadic suppression of low versus high spatial frequency gratings while those with low autistic tendency do not. In addition, those with high but not low autism spectrum quotient (AQ) demonstrated pre-cortical (35–45 ms) evoked potential differences (saccade versus fixation) to a large, low contrast, pseudo-randomly flashing bar. Both AQ groups showed similar differential visual evoked potential effects in later epochs (80–160 ms) at high contrast. Thus, the magnocellular theory of saccadic suppression appears untenable as a general description for the typically developing population. Our results also suggest that the bias towards local perceptual style reported in autism may be due to selective suppression of low spatial frequency information accompanying every saccadic eye movement. PMID:27019719

  17. [Cartesian misunderstanding as a cause of therapeutic failure].

    PubMed

    Isler, H

    1986-01-01

    Headache patients disassociate themselves from their own automatic responses, relying on the traditional separation of body and mind. On the other hand, patients who obtain voluntary control of automatic functions by biofeedback training modify not only vegetative but also voluntary behaviour patterns, losing "neurotic" traits. The basic misconception of the separation of body and mind, Cartesian dualism, is now ingrained in our culture. In the 17th century Descartes asserted that concepts applied to the soul must be entirely different from those used for the body in order to improve comprehension of the immortality of the soul. This dualism also led to "enlightenment" and to many later social and philosophical developments. But his basic neurophysiology was obsolete when he wrote it down. Other models from mainstream natural philosophy were better compatible with observation and experiments. Gassendi assumed a "body soul" consisting of energy as the functional principle of the nervous system, and Willis accommodated a series of anticipations of 19th century discoveries within this model. No comparable progress resulted from Descartes' own medieval model. Cartesian dualism has become untenable in view of recent neuropsychology but it still obstructs our management of functional patients. Instead of reinforcing the delusion of separation of psyche and soma, we ought to encourage patients to understand that their malfunctioning organs are on-line with their emotions, and with their mind.

  18. Identifying common traits among Australian irrigators using cluster analysis.

    PubMed

    Kuehne, G; Bjornlund, H; Cheers, B

    2008-01-01

    In Australia there is a growing awareness that the over-allocation of water entitlements to irrigators needs to be reduced so that environmental flow allocations can be increased. This means that some water will need to be acquired from irrigators and returned to the environment. Most current water reform policies assume that irrigators are solely motivated by profit and will be willing sellers of water, but this might be an untenable approach. Authorities will need to consider new ways of encouraging the participation of irrigators in water reform. The main aim of this research was to identify the non-commercial influences acting on irrigators' behaviour, especially the influence of the values that they hold toward family, land, water, community and lifestyle. The study also aimed to investigate whether it is possible to group irrigators according to these values and then use the groupings to describe how these might affect their willingness to participate in environmental reforms. We clustered the irrigators into three groups with differing orientations: (i) Investors (25%), profit oriented; (ii) Lifestylers (25%), lifestyle oriented; (iii) Providers (50%), family-succession oriented. This research indicates that when designing policy instruments to acquire water for environmental purposes, policy-makers should pay more attention to the factors influencing irrigators' decision making, especially non-commercial factors. (c) IWA Publishing 2008.
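
    As an illustration of the grouping step only (simulated value scores; the study's survey items and clustering algorithm may differ), a k-means partition into three orientations:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(2)
        # Simulated standardized scores per irrigator:
        # column 0 = profit orientation, column 1 = family/lifestyle orientation.
        scores = np.vstack([
            rng.normal([1.5, -0.5], 0.4, size=(25, 2)),   # investor-like
            rng.normal([-0.5, 1.5], 0.4, size=(25, 2)),   # lifestyler-like
            rng.normal([0.2, 0.8], 0.4, size=(50, 2)),    # provider-like
        ])
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
        for k in range(3):
            members = scores[labels == k]
            print(f"cluster {k}: n={len(members)}, mean={members.mean(axis=0).round(2)}")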

  19. Can one puff really make an adolescent addicted to nicotine? A critical review of the literature

    PubMed Central

    2010-01-01

    Rationale: In the past decade, there have been various attempts to understand the initiation and progression of tobacco smoking among adolescents. One line of research on these issues has made strong claims regarding the speed with which adolescents can become physically and mentally addicted to smoking. According to these claims, and in contrast to other models of smoking progression, adolescents can lose autonomy over their smoking behavior after having smoked one puff in their lifetime and never having smoked again, and can become mentally and physically "hooked on nicotine" even if they have never smoked a puff. Objectives: To critically examine the conceptual and empirical basis for the claims made by the "hooked on nicotine" thesis. Method: We reviewed the major studies on which the claims of the "hooked on nicotine" research program are based. Results: The studies we reviewed contained substantive conceptual and methodological flaws. These include an untenable and idiosyncratic definition of addiction, use of single items or of very lenient criteria for diagnosing nicotine dependence, reliance on responders' causal attributions in determining physical and mental addiction to nicotine, and biased coding and interpretation of the data. Discussion: The conceptual and methodological problems detailed in this review invalidate many of the claims made by the "hooked on nicotine" research program and undermine its contribution to the understanding of the nature and development of tobacco smoking in adolescents. PMID:21067587

  20. Evolutionary continuity and personhood: Legal and therapeutic implications of animal consciousness and human unconsciousness.

    PubMed

    Benvenuti, Anne

    Convergent lines of research in the biological sciences have made obsolete the commonly held assumption that humans are distinct from and superior to all other animals, a development predicted by evolutionary science. Cumulative evidence has both elevated other animals from the status of "dumb brutes" to that of fully sentient and intentional beings and has simultaneously discredited elevated claims of human rationality, intentionality, and freedom from the constraints experienced by other animals. It follows then that any theoretical model in which humans occupy the top of an imagined evolutionary hierarchy is untenable. This simple fact calls for a rethinking of foundational concepts in law and health sciences. A further cultural fallacy that is exposed by these converging lines of scientific evidence is the notion that the subjective inner and abstract dimension of human beings is the most true and valuable level of analysis for organizing human lives. In fact, our individual and collective minds are particularly vulnerable to elaborated false narratives that may be definitive of the particular forms of suffering that humans experience and seek to heal with modalities like psychoanalytic psychotherapies. I conclude with the suggestion that other animals may have the capacity to help us with this healing project, even as we are ethically bound to heal the suffering that we have collectively imposed upon them. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Health rights in the balance: the case against perinatal shackling of women behind bars.

    PubMed

    Dignam, Brett; Adashi, Eli Y

    2014-12-11

    Rationalized for decades on security grounds, perinatal shackling entails the application of handcuffs, leg irons, and/or waist shackles to the incarcerated woman prior to, during, and after labor and delivery. During labor and delivery proper, perinatal shackling may entail chaining women to the hospital bed by the ankle, wrist, or both. Medically untenable, legally challenged, and ever controversial, perinatal shackling remains the standard of practice in most US states despite sustained two-decades-long efforts by health rights legal advocates, human rights organizations, and medical professionals. Herein we review the current statutory, regulatory, legal, and medical framework undergirding the use of restraints on pregnant inmates and explore potential avenues of redress and relief to this challenge. We also recognize the courage of the women whose stories are being told. If history is any guide, the collective thrust of domestic and international law, attendant litigation, dedicated advocacy, and strength of argument bode well for continued progress toward restraint-free pregnancies in correctional settings. Copyright © 2014 Dignam and Adashi. This is an open access article distributed under the terms of the Creative Commons Attribution Non-Commercial License (http://creativecommons.org/licenses/by-nc/3.0/), which permits unrestricted non-commercial use, distribution, and reproduction in any medium, provided the original author and source are credited.

  2. Adjudicating non-knowledge in the Omnibus Autism Proceedings.

    PubMed

    Decoteau, Claire Laurier; Underman, Kelly

    2015-08-01

    After 5600 families of children diagnosed with autism filed claims with the National Vaccine Injury Compensation Program in the United States, the court selected 'test' cases consolidated into the Omnibus Autism Proceedings, held from 2007 to 2008, to examine claims that vaccines caused the development of autism. The court found all of the causation theories presented to be untenable and did not award damages to any parents. We analyze the Omnibus Autism Proceedings as a struggle within the scientific field between the scientific orthodoxy of the respondents and the heterodox position taken by the plaintiffs, suggesting that the ruling in these cases helped to shore up hegemony on autism causation. Drawing on the literature on non-knowledge, we suggest that only the respondents had enough scientific capital to strategically direct non-knowledge toward genetic research, thereby foreclosing the possibility of environmental causation of autism. The plaintiffs, who promote a non-standard ontology of autism, suggest that the science on autism remains undone and should not be circumscribed. In analyzing the Omnibus Autism Proceedings with field theory, we highlight the way in which scientific consensus-building and the setting of research agendas are the result of struggle, and we show that the strategic deployment of non-knowledge becomes a major stake in battles for scientific legitimacy and the settling of scientific controversies.

  3. Introducing Jus ante Bellum as a cosmopolitan approach to humanitarian intervention

    PubMed Central

    Brown, Garrett Wallace; Bohm, Alexandra

    2015-01-01

    Cosmopolitans often argue that the international community has a humanitarian responsibility to intervene militarily in order to protect vulnerable individuals from violent threats and to pursue the establishment of a condition of cosmopolitan justice based on the notion of a ‘global rule of law’. The purpose of this article is to argue that many of these cosmopolitan claims are incomplete and untenable on cosmopolitan grounds because they ignore the systemic and chronic structural factors that underwrite the root causes of these humanitarian threats. By way of examining cosmopolitan arguments for humanitarian military intervention and how systemic problems are further ignored in iterations of the Responsibility to Protect, this article suggests that many contemporary cosmopolitan arguments are guilty of focusing too narrowly on justifying a responsibility to respond to the symptoms of crisis versus demanding a similarly robust justification for a responsibility to alleviate persistent structural causes. Although this article recognizes that immediate principles of humanitarian intervention will, at times, be necessary, the article seeks to draw attention to what we are calling principles of Jus ante Bellum (right before war) and to stress that current cosmopolitan arguments about humanitarian intervention will remain insufficient without the incorporation of robust principles of distributive global justice that can provide secure foundations for a more thoroughgoing cosmopolitan condition of public right. PMID:29708128

  4. Farmer Attitudes and Livestock Disease: Exploring Citizenship Behaviour and Peer Monitoring across Two BVD Control Schemes in the UK.

    PubMed

    Heffernan, Claire; Azbel-Jackson, Lena; Brownlie, Joe; Gunn, George

    2016-01-01

    The eradication of BVD in the UK is technically possible but appears to be socially untenable. The following study explored farmer attitudes to BVD control schemes in relation to advice networks and information sharing, shared aims and goals, motivation and benefits of membership, notions of BVD as a priority disease and attitudes toward regulation. Two concepts from the organisational management literature framed the study: citizenship behaviour, where actions of individuals support the collective good (but are not explicitly recognised as such), and peer-to-peer monitoring (where individuals evaluate others' behaviour). Farmers from two BVD control schemes in the UK participated in the study: Orkney Livestock Association BVD Eradication Scheme and Norfolk and Suffolk Cattle Breeders Association BVD Eradication Scheme. In total 162 farmers participated in the research (109 in-scheme and 53 out of scheme). The findings revealed that group helping and information sharing among scheme members were low, with a positive BVD status subject to social censure. Peer monitoring in the form of gossip with regard to the animal health status of other farms was high. Interestingly, farmers across both schemes supported greater regulation with regard to animal health, largely due to the mistrust of fellow farmers following voluntary disease control measures. While group cohesiveness varied across the two schemes, without continued financial inducements, longer-term sustainability is questionable.

  5. Dietary protein and skeletal health: a review of recent human research.

    PubMed

    Kerstetter, Jane E; Kenny, Anne M; Insogna, Karl L

    2011-02-01

    Both dietary calcium and vitamin D are undoubtedly beneficial to skeletal health. In contrast, despite intense investigation, the impact of dietary protein on calcium metabolism and bone balance remains controversial. A widely held view is that high intakes of animal protein result in increased bone resorption, reduced bone mineral density, and increased fractures because of its ability to generate a high fixed metabolic acid load. The purpose of this review is to present the recent or most important epidemiological and clinical trials in humans that evaluated dietary protein's impact on skeletal health. Many epidemiological studies have found a significant positive relationship between protein intake and bone mass or density. Similarly, isotopic studies in humans have also demonstrated greater calcium retention and absorption by individuals consuming high-protein diets, particularly when the calcium content of the diet was limiting. High-protein intake may positively impact bone health by several mechanisms, including calcium absorption, stimulation of the secretion of insulin-like growth factor-1, and enhancement of lean body mass. The concept that an increase in dietary protein induces a large enough shift in systemic pH to increase osteoclastic bone resorption seems untenable. Recent epidemiological, isotopic and meta-analysis studies suggest that dietary protein works synergistically with calcium to improve calcium retention and bone metabolism. The recommendation to intentionally restrict dietary protein to improve bone health is unwarranted, and potentially even dangerous to those individuals who consume inadequate protein.

  6. Alternating carrier models of asymmetric glucose transport violate the energy conservation laws.

    PubMed

    Naftalin, Richard J

    2008-11-01

    Alternating access transporters with high-affinity externally facing sites and low-affinity internal sites relate substrate transit directly to the unliganded asymmetric "carrier" (Ci) distribution. When both bathing solutions contain equimolar concentrations of ligand, zero net flow of the substrate-carrier complex requires a higher proportion of unliganded low-affinity inside sites (∝ 1/KD(in)) and slower unliganded "free" carrier transit from inside to outside than in the reverse direction. However, asymmetric rates of unliganded carrier movement, kij, imply that an energy source, ΔG_carrier = RT ln(k_oi/k_io) = RT ln(C_in/C_out) = RT ln(KD(in)/KD(out)), where R is the universal gas constant (8.314 J/(mol·K)) and T is the temperature, assumed here to be 300 K, sustains the asymmetry. Without this invalid assumption, the constraints of carrier path cyclicity, combined with asymmetric ligand affinities and equimolarity at equilibrium, are irreconcilable, and any passive asymmetric uniporter or cotransporter model system, e.g., Na-glucose cotransporters, espousing this fundamental error is untenable. With glucose transport via GLUT1, the higher maximal rate and Km of net ligand exit compared to net ligand entry is only properly simulated if ligand transit occurs by serial dissociation-association reactions between external high-affinity and internal low-affinity immobile sites. Faster intersite transit rates occur from lower-affinity sites than from higher-affinity sites and require no other energy source to maintain equilibrium. Similar constraints must apply to cotransport.
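
    A minimal numerical sketch of the energy bookkeeping in the argument above (our illustration, not the paper's code; the dissociation constants are hypothetical):

    ```python
    import math

    R = 8.314   # J/(mol*K), universal gas constant
    T = 300.0   # K, temperature assumed in the abstract

    def carrier_energy(kd_in: float, kd_out: float) -> float:
        """Free energy (J/mol) per cycle implied by asymmetric unliganded carrier rates."""
        return R * T * math.log(kd_in / kd_out)

    # Hypothetical affinities (not from the paper): low-affinity inside site,
    # high-affinity outside site, both in mM.
    dG = carrier_energy(kd_in=25.0, kd_out=1.6)
    print(f"Implied hidden energy source: {dG / 1000:.2f} kJ/mol per transport cycle")
    ```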

  7. Fast and flexible gpu accelerated binding free energy calculations within the amber molecular dynamics package.

    PubMed

    Mermelstein, Daniel J; Lin, Charles; Nelson, Gard; Kretsch, Rachael; McCammon, J Andrew; Walker, Ross C

    2018-07-15

    Alchemical free energy (AFE) calculations based on molecular dynamics (MD) simulations are key tools in both improving our understanding of a wide variety of biological processes and accelerating the design and optimization of therapeutics for numerous diseases. Computing power and theory have, however, long been insufficient to enable AFE calculations to be routinely applied in early stage drug discovery. One of the major difficulties in performing AFE calculations is the length of time required for calculations to converge to an ensemble average. CPU implementations of MD-based free energy algorithms can effectively only reach tens of nanoseconds per day for systems on the order of 50,000 atoms, even running on massively parallel supercomputers. Therefore, converged free energy calculations on large numbers of potential lead compounds are often untenable, preventing researchers from gaining crucial insight into molecular recognition, potential druggability and other crucial areas of interest. Graphics Processing Units (GPUs) can help address this. We present here a seamless GPU implementation, within the PMEMD module of the AMBER molecular dynamics package, of thermodynamic integration (TI) capable of reaching speeds of >140 ns/day for a 44,907-atom system, with accuracy equivalent to the existing CPU implementation in AMBER. The implementation described here is currently part of the AMBER 18 beta code and will be an integral part of the upcoming version 18 release of AMBER. © 2018 Wiley Periodicals, Inc.
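
    As a rough illustration of what a TI calculation computes, the sketch below integrates ensemble averages of dU/dλ over the coupling parameter λ. The window values are invented for illustration; in practice they would come from the MD engine (e.g. AMBER's pmemd), and none of these numbers are from the paper:

    ```python
    import numpy as np

    # Coupling-parameter windows and their (invented) ensemble averages of dU/dlambda
    lambdas = np.array([0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0])
    dudl    = np.array([42.1, 30.5, 12.9, 1.8, -9.4, -19.2, -23.7])  # kcal/mol

    # Thermodynamic integration: dG = integral over lambda of <dU/dlambda>,
    # here approximated by trapezoidal quadrature over the windows.
    dG = float(np.sum(0.5 * (dudl[1:] + dudl[:-1]) * np.diff(lambdas)))
    print(f"TI estimate of the free-energy change: {dG:.2f} kcal/mol")
    ```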

  8. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
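
    Humphreys' asymmetry, which the abstract invokes, can be stated compactly; the formulation below is ours, not the paper's:

    ```latex
    % Bayes' theorem, the inversion that propensities fail to support:
    \[
      P(A \mid B) \;=\; \frac{P(B \mid A)\, P(A)}{P(B)}
    \]
    % Read causally, a propensity Pr(e | c) is a disposition of condition c to
    % produce outcome e; the inverted quantity Pr(c | e) would be a disposition
    % of an outcome to produce its cause, which has no propensity reading. Hence
    % propensity can obey the Law of Large Numbers yet fail to support Bayes' theorem.
    ```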

  9. Wildfire Suppression Costs for Canada under a Changing Climate

    PubMed Central

    Stocks, Brian J.; Gauthier, Sylvie

    2016-01-01

    Climate-influenced changes in fire regimes in northern temperate and boreal regions will have both ecological and economic ramifications. We examine possible future wildfire area burned and suppression costs using a recently compiled historical (i.e., 1980–2009) fire management cost database for Canada and several Intergovernmental Panel on Climate Change (IPCC) climate projections. Area burned was modelled as a function of a climate moisture index (CMI), and fire suppression costs then estimated as a function of area burned. Future estimates of area burned were generated from projections of the CMI under two emissions pathways for four General Circulation Models (GCMs); these estimates were constrained to ecologically reasonable values by incorporating a minimum fire return interval of 20 years. Total average annual national fire management costs are projected to increase to just under $1 billion (a 60% real increase from the 1980–2009 period) under the low greenhouse gas emissions pathway and $1.4 billion (119% real increase from the base period) under the high emissions pathway by the end of the century. For many provinces, annual costs that are currently considered extreme (i.e., occur once every ten years) are projected to become commonplace (i.e., occur once every two years or more often) as the century progresses. It is highly likely that evaluations of current wildland fire management paradigms will be necessary to avoid drastic and untenable cost increases as the century progresses. PMID:27513660
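
    The two-stage structure described above (area burned as a function of the CMI, suppression cost as a function of area burned) can be sketched as follows; the series, functional forms, and numbers here are invented for illustration and are not the paper's data:

    ```python
    import numpy as np

    # Hypothetical historical series (not the paper's data)
    cmi  = np.array([1.2, 0.8, 0.5, 1.5, 0.3, 0.9, 0.6, 1.1])   # climate moisture index
    area = np.array([0.9, 1.6, 2.8, 0.6, 3.9, 1.3, 2.2, 1.0])   # Mha burned per year
    cost = np.array([0.4, 0.6, 0.9, 0.3, 1.2, 0.5, 0.8, 0.45])  # $B per year

    # Stage 1: drier years (lower CMI) burn more area; fit log(area) ~ CMI
    b1, a1 = np.polyfit(cmi, np.log(area), 1)
    # Stage 2: suppression cost scales with area burned; fit cost ~ area
    b2, a2 = np.polyfit(area, cost, 1)

    # Push a drier projected climate through both fitted stages
    cmi_future = 0.4
    area_future = np.exp(a1 + b1 * cmi_future)
    print(f"Projected suppression cost: ${a2 + b2 * area_future:.2f}B per year")
    ```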

  10. Long-term reliability of the Athabasca River (Alberta, Canada) as the water source for oil sands mining

    PubMed Central

    Sauchyn, David J.; St-Jacques, Jeannine-Marie; Luckman, Brian H.

    2015-01-01

    Exploitation of the Alberta oil sands, the world’s third-largest crude oil reserve, requires fresh water from the Athabasca River, an allocation of 4.4% of the mean annual flow. This allocation takes into account seasonal fluctuations but not long-term climatic variability and change. This paper examines the decadal-scale variability in river discharge in the Athabasca River Basin (ARB) with (i) a generalized least-squares (GLS) regression analysis of the trend and variability in gauged flow and (ii) a 900-y tree-ring reconstruction of the water-year flow of the Athabasca River at Athabasca, Alberta. The GLS analysis removes confounding transient trends related to the Pacific Decadal Oscillation (PDO) and Pacific North American mode (PNA). It shows long-term declining flows throughout the ARB. The tree-ring record reveals a larger range of flows and severity of hydrologic deficits than those captured by the instrumental records that are the basis for surface water allocation. It includes periods of sustained low flow of multiple decades in duration, suggesting the influence of the PDO and PNA teleconnections. These results together demonstrate that low-frequency variability must be considered in ARB water allocation, which has not been the case. We show that the current and projected surface water allocations from the Athabasca River for the exploitation of the Alberta oil sands are based on an untenable assumption of the representativeness of the short instrumental record. PMID:26392554
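
    A sketch of the kind of GLS trend analysis described, using an AR(1)-error regression from statsmodels; the flow series and teleconnection indices below are synthetic stand-ins, not the paper's data:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    years = np.arange(1950, 2010)
    pdo = rng.normal(size=years.size)   # synthetic stand-in for the PDO index
    pna = rng.normal(size=years.size)   # synthetic stand-in for the PNA index
    # Synthetic water-year flows: declining trend plus teleconnection effects plus noise
    flow = (700.0 - 1.5 * (years - years[0]) + 40.0 * pdo + 25.0 * pna
            + rng.normal(0.0, 30.0, size=years.size))

    X = sm.add_constant(np.column_stack([years - years[0], pdo, pna]))
    # GLSAR fits a regression with AR(1) errors, appropriate for serially
    # correlated annual flows; confounding PDO/PNA variability is a covariate.
    res = sm.GLSAR(flow, X, rho=1).iterative_fit(maxiter=10)
    print(res.params[1])   # residual long-term trend after removing PDO/PNA
    ```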

  11. The mechanism of long phosphorescence of SrAl2-xBxO4 (0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nag, Abanti; Kutty, T.R.N

    2004-03-01

    The role of B2O3 in realizing the long phosphorescence of Eu(II)+Dy(III)-doped strontium aluminates has been investigated. IR and solid-state 27Al MAS NMR spectra show the incorporation of boron as BO4 in the AlO4 framework of SrAl2O4 and Sr4Al14O25. Phosphor, made free of glassy phases by washing with hot acetic acid + glycerol, did not show any photoconductivity under UV irradiation, indicating that the mechanism involving hole conduction in the valence band is untenable for long phosphorescence. EPR studies confirm the presence of both electron and hole trap centers. Dy3+ forms a substitutional defect complex with borate, [Dy-BO4-VSr]2-, and acts as a hole trap center. The electron centers are formed by the oxygen vacancies associated with BO3(3-), i.e. [BO3-VO]3-. Under indigo light or near-UV irradiation, the photoinduced electron centers are formed as [BO3-VO(e')]4-. The holes are released from [Dy-BO4-VSr(h·)]1- under thermal excitation at room temperature. The recombination of electrons with holes releases energy which is expended to excite Eu2+ to induce long phosphorescence.

  12. Tuberculosis in elephants-a reemergent disease: diagnostic dilemmas, the natural history of infection, and new immunological tools.

    PubMed

    Maslow, J N; Mikota, S K

    2015-05-01

    Tuberculosis (TB) in elephants has been described since ancient times. However, it was not until 1996 when infection with Mycobacterium tuberculosis was identified in a herd of circus elephants that significant research into this disease began. The epidemiology and natural history of TB were unknown in elephants since there had been no comprehensive screening programs, and diagnostic techniques developed for cervidae and bovidae were of unknown value. And, while precepts of test and slaughter were the norm for cattle and deer, this was considered untenable for an endangered species. With no precedent for the treatment of TB in animals, treatment regimens for elephants were extrapolated from human protocols, which guided changes to the Guidelines for the Control of Tuberculosis in Elephants. In the absence of diagnostic testing to confirm cure in elephants, the efficacy of these treatment regimens is only beginning to be understood as treated elephants die and are examined postmortem. However, because of pressures arising from public relations related to elephant husbandry and the added considerations of TB infection in animals (whether real or imagined), sharing of information to aid in research and treatment has been problematic. Here we review the challenges and successes of the diagnosis of tuberculosis in elephants and discuss the natural history of the disease to put the work of Landolfi et al on the immunological response to tuberculosis in elephants in perspective. © The Author(s) 2015.

  13. How useful is the concept of the 'harm threshold' in reproductive ethics and law?

    PubMed

    Smajdor, Anna

    2014-10-01

    In his book Reasons and Persons, Derek Parfit suggests that people are not harmed by being conceived with a disease or disability if they could not have existed without suffering that particular condition. He nevertheless contends that entities can be harmed if the suffering they experience is sufficiently severe. By implication, there is a threshold which divides harmful from non-harmful conceptions. The assumption that such a threshold exists has come to play a part in UK policy making. I argue that Parfit's distinction between harmful and non-harmful conceptions is untenable. Drawing on Kant's refutation of the ontological argument for God's existence, I suggest that the act of creation cannot be identical with the act of harming, nor indeed of benefiting, however great the offspring's suffering may be. I suggest that Parfit is right that bringing children into existence does not usually harm them, but I argue that this must be applied to all conceptions, since Parfit cannot show how the harm threshold can be operationalised. If we think certain conceptions are unethical or should be illegal, this must be on other grounds than that the child is harmed by them. I show that a Millian approach in this context fails to exemplify the empirical and epistemological advantages which are commonly associated with it, and that harm-based legislation would need to be based on broader harm considerations than those relating to the child who is conceived.

  14. Identifying pollination service hotspots and coldspots using citizen science data from the Great Sunflower Project

    NASA Astrophysics Data System (ADS)

    LeBuhn, G.; Schmucki, R.

    2016-12-01

    Identifying the spatial patterns of pollinator visitation rates is key to identifying the drivers of differences in pollination service and the areas where pollinator conservation will provide the highest return on investment. However, gathering pollinator abundance data at the appropriate regional and national scales is untenable. As a surrogate, habitat models have been developed to identify areas of pollinator losses but these models have been developed using expert opinion based on foraging and nesting requirements. Thousands of citizen scientists across the United States participating in The Great Sunflower Project (www.GreatSunflower.org) contribute timed counts of pollinator visits to a focal sunflower variety planted in local gardens and green spaces. While these data provide a more direct measure of pollination service to a standardized plant and include a measure of effort, the data are complicated. Each location is sampled at different dates, times and frequencies as well as different points across the local flight season. To overcome this complication, we have used a generalized additive model to generate regional flight curves to calibrate each individual data point and to attain better estimates of pollination service at each site. Using these flight season corrected data, we identify hotspots and cold spots in pollinator service across the United States, evaluate the drivers shaping the spatial patterns and observe how these data align with the results obtained from predictive models that are based on expert knowledge on foraging and nesting habitats.
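
    One way to implement the flight-curve calibration described above is a Poisson GAM over day of year; the sketch below uses the pygam package and synthetic counts (our choices, not the project's actual pipeline):

    ```python
    import numpy as np
    from pygam import PoissonGAM, s

    rng = np.random.default_rng(1)
    day = rng.integers(120, 280, size=500)            # day of year of each timed count
    rate = np.exp(1.5 - ((day - 200) / 40.0) ** 2)    # toy seasonal activity curve
    visits = rng.poisson(rate)                        # pollinator visits per watch

    # Smooth regional flight curve: E[visits] = exp(f(day_of_year))
    gam = PoissonGAM(s(0)).fit(day.reshape(-1, 1), visits)

    # Calibrate one site's count by the expected activity on its sampling date
    expected = gam.predict(np.array([[150]]))[0]
    observed = 3.0
    print(f"Season-corrected visitation index: {observed / expected:.2f}")
    ```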

  15. From railway spine to whiplash--the recycling of nervous irritation.

    PubMed

    Ferrari, Robert; Shorter, Edward

    2003-11-01

    The search for a specific structural basis for chronic whiplash and other chronic pain and fatigue syndromes has been in progress for decades, and yet currently there remains no "structural" solution to these enigmata. In light of the failure of research to identify the chronic "damage" or pathology as lying in muscular, bony, or "connective tissue" sites for many chronic pain syndromes like whiplash, fibromyalgia, et cetera, more recent attention has been paid to nervous system structures. Nerve irritation has been implicated as the basis for the pain and other symptoms that are common to many chronic disability syndromes. We postulate here, however, that the concept of nervous irritation has been prostituted for centuries whenever more concrete structural explanations for chronic pain and other controversial illness have been untenable. We suggest that, after each cycle of nervous irritation as a disease, and subsequent dismissal of the notion, the doctrine of irritation as a disease was too good to go away. First with the hypersthenic and asthenic diseases of the nineteenth century, then railway spine, whiplash, thoracic outlet syndrome, and now brachial plexus irritation, we detect the same pattern: patients with symptoms, but no objective evidence of nerve disease. Nervous irritation has repeatedly served this purpose for the last 200 years. It is our intent that bringing an understanding of this trend will encourage current clinicians and researchers to appreciate the need to abandon this form of speculation without historical insight when dealing with today's controversial syndromes.

  16. Smart drugs for cognitive enhancement: ethical and pragmatic considerations in the era of cosmetic neurology.

    PubMed

    Cakic, V

    2009-10-01

    Reports in the popular press suggest that smart drugs or "nootropics" such as methylphenidate, modafinil and piracetam are increasingly being used by the healthy to augment cognitive ability. Although current nootropics offer only modest improvements in cognitive performance, it appears likely that more effective compounds will be developed in the future and that their off-label use will increase. One sphere in which the use of these drugs may be commonplace is by healthy students within academia. This article reviews the ethical and pragmatic implications of nootropic use in academia by drawing parallels with issues relevant to the drugs in sport debate. It is often argued that performance-enhancing drugs should be prohibited because they create an uneven playing field. However, this appears dubious given that "unfair" advantages are already ubiquitous and generally tolerated by society. There are concerns that widespread use will indirectly coerce non-users also to employ nootropics in order to remain competitive. However, to restrict the autonomy of all people for fear that it may influence the actions of some is untenable. The use of potentially harmful drugs for the purposes of enhancement rather than treatment is often seen as unjustified, and libertarian approaches generally champion the rights of the individual in deciding if these risks are acceptable. Finally, whether the prohibition of nootropics can be effectively enforced is doubtful. As nootropics use becomes widespread among students in the future, discussion of this issue will become more pressing in the years to come.

  17. Price smarter on the Net.

    PubMed

    Baker, W; Marn, M; Zawada, C

    2001-02-01

    Companies generally have set prices on the Internet in two ways. Many start-ups have offered untenably low prices in a rush to capture first-mover advantage. Many incumbents have simply charged the same prices on-line as they do off-line. Either way, companies are missing a big opportunity. The fundamental value of the Internet lies not in lowering prices or making them consistent but in optimizing them. After all, if it's easy for customers to compare prices on the Internet, it's also easy for companies to track customers' behavior and adjust prices accordingly. The Net lets companies optimize prices in three ways. First, it lets them set and announce prices with greater precision. Different prices can be tested easily, and customers' responses can be collected instantly. Companies can set the most profitable prices, and they can tap into previously hidden customer demand. Second, because it's so easy to change prices on the Internet, companies can adjust prices in response to even small fluctuations in market conditions, customer demand, or competitors' behavior. Third, companies can use the clickstream data and purchase histories that they collect through the Internet to segment customers quickly. Then they can offer segment-specific prices or promotions immediately. By taking full advantage of the unique possibilities afforded by the Internet to set prices with precision, adapt to changing circumstances quickly, and segment customers accurately, companies can get their pricing right. It's one of the ultimate drivers of e-business success.
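
    A toy sketch of the third lever (segment-specific pricing): for each segment, pick the tested price with the highest expected profit. All numbers below are invented for illustration:

    ```python
    # Expected profit per visitor = (price - unit cost) * conversion rate
    unit_cost = 12.0

    # Conversion rates observed in quick on-line price tests, per segment (invented)
    tests = {
        "bargain_hunters": {15.0: 0.30, 18.0: 0.12, 22.0: 0.04},
        "convenience":     {15.0: 0.34, 18.0: 0.30, 22.0: 0.25},
    }

    for segment, curve in tests.items():
        best = max(curve, key=lambda p: (p - unit_cost) * curve[p])
        print(f"{segment}: charge {best:.2f}")
    ```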

  18. Approach to the obese adolescent with new-onset diabetes.

    PubMed

    Zeitler, Philip

    2010-12-01

    The prevalence of both type 1 and type 2 diabetes among children and adolescents has been steadily increasing over the last few decades. However, as the general pediatric population becomes more obese and more ethnically diverse, reliance on phenotypic characteristics for distinguishing between these types of diabetes is becoming increasingly untenable. Yet, the recognition of differences in treatment strategies, associated disorders, and both short- and long-term diabetes and cardiovascular outcomes supports the importance of diagnostic efforts to make a distinction between diabetes types. An approach to determination of diabetes type is discussed, focused on the presence or absence of autoimmunity and assessment of β-cell function. At the time of diagnosis, it is generally not possible to be certain of diabetes type, and therefore, initial treatment decisions must be made based on aspects of the presenting physiology, with adjustments in treatment approach made as the individual's course proceeds and additional information becomes available. The apparent overlap between type 1 and type 2 diabetes that occurs in obese adolescents has resulted in some controversy regarding mixed forms of diabetes that are ultimately semantic, but this does raise interesting questions about the treatment of type 1 diabetes in the presence of an insulin-resistant phenotype. Finally, the lack of information about the efficacy of treatment of cardiovascular risk factors, such as dyslipidemia and hypertension, along with the well-documented challenges in adherence to chronic illness treatment in this population, creates substantial challenges.

  19. Comment on "Weathering, plants, and the long-term carbon cycle" by Robert A. Berner

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, T.A.

    1993-05-01

    Berner (1992) has asserted that Jackson and Keller (1970a) misinterpreted the conspicuous reddish crust which forms on young lava flows in areas of rock surface colonised by the lichen Stereocaulon vulcani (but not in adjacent areas of bare rock) in regions of high rainfall on the Island of Hawaii. Jackson (1968) and Jackson and Keller (1970a,b) concluded from the results of a thorough interdisciplinary investigation employing a wide spectrum of techniques and information that this reddish coating is an intensely leached weathering crust formed in situ, and that biochemical activities of the lichen or its associated microflora not only accelerate the chemical weathering of the rock by orders of magnitude but also determine the specific mineralogical and chemical properties of the weathering products. Berner, however, maintained that the reddish crust is in reality a deposit of "wind-blown soil dust" entrapped by a sticky organic substance secreted by the lichen. Berner fixed his attention on just one aspect of the many-sided body of interrelated data on which the conclusions of Jackson and Keller are founded: the observation that the weathering crust is much thicker on lichen-covered rock surfaces than on lichen-free "control" areas of the same rock. The totality of published evidence overwhelmingly supports the conclusions of Jackson and Keller and demonstrates that Berner's rival hypothesis is untenable.

  20. Expert opinion on "best practices" in the delivery of health care services to immigrants in Denmark.

    PubMed

    Jensen, Natasja Koitzsch; Nielsen, Signe Smith; Krasnik, Allan

    2010-08-01

    Delivery of health care to immigrants is an emerging field of interest. Immigrants are frequently characterised by health outcomes that are inferior to those of other groups with regard to morbidity and mortality. In addition, health professionals report difficulties associated with the encounter with immigrant patients. A Delphi process with eight Danish experts from the field of immigrant health was performed as part of an EU project. The objective of the Delphi process was to investigate expert opinion on "best practice in the delivery of healthcare to immigrants". Initially, 60 factors were suggested by the experts. Next, these factors were summarised into 32 factors that the experts were invited to rate and, if possible, agree on. The top 11 factors identified in the Delphi process were access to interpreters, quality of interpretation, ensuring medication compliance, having sufficient consultation time, coherence of offers, interdisciplinary collaboration, allocation of resources, the role of the practitioner, acknowledgement of the individual patient, education of health professionals and students and access to telephone interpretation to supplement other services. The Delphi process can be a valuable tool in the investigation of expert opinion and may thereby help to guide future policy directives. In the light of the importance experts placed on access to interpreters and on the quality of the interpretation services offered, it seems an untenable strategy to introduce, from June 2011, self-payment for interpretation services provided to immigrants who have stayed in the country for more than seven years.

  1. Application of Stem Cell Technology in Dental Regenerative Medicine.

    PubMed

    Feng, Ruoxue; Lengner, Christopher

    2013-07-01

    In this review, we summarize the current literature regarding the isolation and characterization of dental tissue-derived stem cells and address the potential of these cell types for use in regenerative cell transplantation therapy. Looking forward, platforms for the delivery of stem cells via scaffolds and the use of growth factors and cytokines for enhancing dental stem cell self-renewal and differentiation are discussed. We aim to understand the developmental origins of dental tissues in an effort to elucidate the molecular pathways governing the genesis of somatic dental stem cells. The advantages and disadvantages of several dental stem cells are discussed, including the developmental stage and specific locations from which these cells can be purified. In particular, stem cells from human exfoliated deciduous teeth may act as a very practical and easily accessible reservoir for autologous stem cells and hold the most value in stem cell therapy. Dental pulp stem cells and periodontal ligament stem cells should also be considered for their triple lineage differentiation ability and relative ease of isolation. Further, we address the potentials and limitations of induced pluripotent stem cells as a cell source in dental regenerative medicine. From an economical and a practical standpoint, dental stem cell therapy would be most easily applied in the prevention of periodontal ligament detachment and bone atrophy, as well as in the regeneration of dentin-pulp complex. In contrast, cell-based tooth replacement due to decay or other oral pathology seems, at the current time, an untenable approach.

  2. Impact production of NO and reduced species

    NASA Technical Reports Server (NTRS)

    Zahnle, K.; Kasting, J.; Sleep, N.

    1988-01-01

    It has recently been suggested that a reported spike in seawater 87Sr/86Sr at the K-T boundary is the signature of an impact-generated acid deluge. However, the amount of acid required is implausibly large. About 3×10^15 moles of Sr must be weathered from silicates to produce the inferred Sr spike. The amount of acid required is at least 100 and probably 1000 times greater. Production of 3×10^18 moles of NO is clearly untenable. The atmosphere presently contains only 1.4×10^20 moles of N2 and 3.8×10^19 moles of O2. If the entire atmosphere were shocked to 2000 K and cooled within a second, the total NO produced would be about 3×10^18 moles. This is obviously unrealistic. A (still too short) cooling time of 10^3 s reduces NO production by an order of magnitude. In passing, we note that if the entire atmosphere had in fact been shocked to 2000 K, acid rain would have been the least of a dinosaur's problems. Acid rain as a mechanism poses other difficulties. Recently deposited carbonates would have been most susceptible to acid attack. The researchers' preferred explanation is simply increased continental erosion following ecological trauma, coupled with enhanced levels of CO2.
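
    The order-of-magnitude bookkeeping above is easy to reproduce; the short sketch below simply restates the abstract's figures:

    ```python
    # Order-of-magnitude check of the acid-deluge argument (figures from the abstract)
    sr_weathered  = 3e15                    # mol Sr needed for the inferred isotope spike
    acid_needed   = 1000 * sr_weathered     # mol acid, the abstract's upper factor
    no_ceiling    = 3e18                    # mol NO even if the whole atmosphere shocked to 2000 K
    n2_atmosphere = 1.4e20                  # mol N2 in the present atmosphere

    print(f"Acid required: {acid_needed:.1e} mol")
    print(f"NO ceiling:    {no_ceiling:.1e} mol "
          f"({100 * no_ceiling / (2 * n2_atmosphere):.2f}% of atmospheric N atoms)")
    ```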

  3. Hadronic and nuclear interactions in QCD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    Despite the evidence that QCD, or something close to it, gives a correct description of the structure of hadrons and their interactions, it seems paradoxical that the theory has thus far had very little impact in nuclear physics. One reason for this is that the application of QCD to distances larger than 1 fm involves coherent, non-perturbative dynamics which is beyond present calculational techniques. For example, in QCD the nuclear force can evidently be ascribed to quark interchange and gluon exchange processes. These, however, are as complicated to analyze from a fundamental point of view as is the analogous covalent bond in molecular physics. Since a detailed description of quark-quark interactions and the structure of hadronic wavefunctions is not yet well understood in QCD, it is evident that a quantitative first-principles description of the nuclear force will require a great deal of theoretical effort. Another reason for the limited impact of QCD in nuclear physics has been the conventional assumption that nuclear interactions can for the most part be analyzed in terms of an effective meson-nucleon field theory or potential model in isolation from the details of the short-distance quark and gluon structure of hadrons. These lectures argue that this view is untenable: in fact, there is no correspondence principle which yields traditional nuclear physics as a rigorous large-distance or non-relativistic limit of QCD dynamics. On the other hand, the distinctions between standard nuclear physics dynamics and QCD at nuclear dimensions are extremely interesting and illuminating for both particle and nuclear physics.

  4. Impact of concomitant trauma in the management of blunt splenic injuries.

    PubMed

    Lo, Albert; Matheson, Anne-Marie; Adams, Dave

    2004-09-10

    Conservative management of isolated blunt splenic injuries has become widely accepted for haemodynamically stable patients, but may be untenable in those with multiple injuries. A retrospective review was performed to evaluate our cumulative experience with non-operative management of splenic injuries, and to identify the risk factors for operative management. Eighty patients were identified. Demographics, mechanism of injury, injury severity score (ISS), clinical signs at presentation, utility of computed tomography scans and methods of treatment (operative management vs conservative management) were documented and statistically analysed to identify predictors for operative management. Initially, 45 patients (56%) were managed without operation, while 35 patients underwent urgent laparotomy, with 26 (74% of the operative group) having splenectomy performed. Two patients (out of 45) failed conservative management and required delayed splenectomy, a 96% success rate for intended conservative management. Thus, overall rates of 54% non-operative management and 65% splenic conservation were achieved. The mean ISS of the operative management group (ISS=30) was higher than that of the non-operative treatment group (ISS=13, p<0.05), reflecting not only the grade of the splenic injury but also the severity of concomitant trauma. Risk factors for patients with blunt splenic injuries requiring operative management include ISS ≥16, hypotension, GCS ≤13, and requirement for blood transfusion (p<0.05). Appropriate patient selection is the most important element of non-operative management. Patients with splenic injuries who are haemodynamically stable can be managed non-operatively with acceptable outcome. However, in the presence of concomitant trauma, there is an increasing trend towards operative management.

  5. Thalidomide induces apoptosis in undifferentiated human induced pluripotent stem cells.

    PubMed

    Tachikawa, Saoko; Nishimura, Toshinobu; Nakauchi, Hiromitsu; Ohnuma, Kiyoshi

    2017-10-01

    Thalidomide, which was formerly available commercially to control the symptoms of morning sickness, is a strong teratogen that causes fetal abnormalities. However, the mechanism of thalidomide teratogenicity is not fully understood; thalidomide toxicity is not apparent in rodents, and the use of human embryos is ethically and technically untenable. In this study, we designed an experimental system featuring human-induced pluripotent stem cells (hiPSCs) to investigate the effects of thalidomide. These cells exhibit the same characteristics as those of epiblasts originating from implanted fertilized ova, which give rise to the fetus. Therefore, theoretically, thalidomide exposure during hiPSC differentiation is equivalent to that in the human fetus. We examined the effects of thalidomide on undifferentiated hiPSCs and early-differentiated hiPSCs cultured in media containing bone morphogenetic protein-4, which correspond, respectively, to epiblast (future fetus) and trophoblast (future extra-embryonic tissue). We found that only the number of undifferentiated cells was reduced. In undifferentiated cells, application of thalidomide increased the number of apoptotic and dead cells at day 2 but not day 4. Application of thalidomide did not affect the cell cycle. Furthermore, immunostaining and flow cytometric analysis revealed that thalidomide exposure had no effect on the expression of specific markers of undifferentiated and early trophectodermal differentiated cells. These results suggest that the effect of thalidomide was successfully detected in our experimental system and that thalidomide eliminated a subpopulation of undifferentiated hiPSCs. This study may help to elucidate the mechanisms underlying thalidomide teratogenicity and reveal potential strategies for safely prescribing this drug to pregnant women.

  6. Surgical referral coordination from a first-level hospital: a prospective case study from rural Nepal.

    PubMed

    Fleming, Matthew; King, Caroline; Rajeev, Sindhya; Baruwal, Ashma; Schwarz, Dan; Schwarz, Ryan; Khadka, Nirajan; Pande, Sami; Khanal, Sumesh; Acharya, Bibhav; Benton, Adia; Rogers, Selwyn O; Panizales, Maria; Gyorki, David; McGee, Heather; Shaye, David; Maru, Duncan

    2017-09-25

    Patients in isolated rural communities typically lack access to surgical care. It is not feasible for most rural first-level hospitals to provide a full suite of surgical specialty services. Comprehensive surgical care thus depends on referral systems. There is minimal literature, however, on the functioning of such systems. We undertook a prospective case study of the referral and care coordination process for cardiac, orthopedic, plastic, gynecologic, and general surgical conditions at a district hospital in rural Nepal from 2012 to 2014. We assessed the referral process using the World Health Organization's Health Systems Framework. We followed the initial 292 patients referred for surgical services in the program. 152 patients (52%) received surgery and four (1%) suffered a complication (three deaths and one patient-reported complication). The three most common types of surgery performed were: orthopedics (43%), general (32%), and plastics (10%). The average direct and indirect cost per patient referred, including food, transportation, lodging, medications, diagnostic examinations, treatments, and human resources, was US$840, which was over 1.5 times the local district's per capita income. We identified and mapped challenges according to the World Health Organization's Health Systems Framework. Given the requirement of intensive human capital, poor quality control of surgical services, and the overall costs of the program, hospital leadership decided to terminate the referral coordination program and continue to build local surgical capacity. The results of our case study provide some context into the challenges of rural surgical referral systems. The high relative costs to the system and challenges in accountability rendered the program untenable for the implementing organization.

  7. Systematic Reviews of Animal Models: Methodology versus Epistemology

    PubMed Central

    Greek, Ray; Menache, Andre

    2013-01-01

    Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions. PMID:23372426

  8. Weak, strong, and coherent regimes of Fröhlich condensation and their applications to terahertz medicine and quantum consciousness.

    PubMed

    Reimers, Jeffrey R; McKemmish, Laura K; McKenzie, Ross H; Mark, Alan E; Hush, Noel S

    2009-03-17

    In 1968, Fröhlich showed that a driven set of oscillators can condense with nearly all of the supplied energy activating the vibrational mode of lowest frequency. This is a remarkable property usually compared with Bose-Einstein condensation, superconductivity, lasing, and other unique phenomena involving macroscopic quantum coherence. However, despite intense research, no unambiguous example has been documented. We determine the most likely experimental signatures of Fröhlich condensation and show that they are significant features remote from the extraordinary properties normally envisaged. Fröhlich condensates are classified into 3 types: weak condensates in which profound effects on chemical kinetics are possible, strong condensates in which an extremely large amount of energy is channeled into 1 vibrational mode, and coherent condensates in which this energy is placed in a single quantum state. Coherent condensates are shown to involve extremely large energies, to not be produced by the Wu-Austin dynamical Hamiltonian that provides the simplest depiction of Fröhlich condensates formed using mechanically supplied energy, and to be extremely fragile. They are inaccessible in a biological environment. Hence the Penrose-Hameroff orchestrated objective-reduction model and related theories for cognitive function that embody coherent Fröhlich condensation as an essential element are untenable. Weak condensates, however, may have profound effects on chemical and enzyme kinetics, and may be produced from biochemical energy or from radio frequency, microwave, or terahertz radiation. Pokorný's observed 8.085-MHz microtubulin resonance is identified as a possible candidate, with microwave reactors (green chemistry) and terahertz medicine appearing as other feasible sources.

  9. Relevance of the Flexner Report to contemporary medical education in South Asia.

    PubMed

    Amin, Zubair; Burdick, William P; Supe, Avinash; Singh, Tejinder

    2010-02-01

    A century after the publication of Medical Education in the United States and Canada: A Report to the Carnegie Foundation for the Advancement of Teaching (the Flexner Report), the quality of medical education in much of Asia is threatened by weak regulation, inadequate public funding, and explosive growth of private medical schools. Competition for students' fees and an ineffectual accreditation process have resulted in questionable admission practices, stagnant curricula, antiquated learning methods, and dubious assessment practices. The authors' purpose is to explore the relevance of Flexner's observations, as detailed in his report, to contemporary medical education in South Asia, to analyze the consequences of growth, and to recommend pragmatic changes. Major drivers for growth are the supply-demand mismatch for medical school positions, weak governmental regulation, private sector participation, and corruption. The consequences are urban-centric growth, shortage of qualified faculty, commercialization of postgraduate education, untenable assessment practices, emphasis on rote learning, and inadequate clinical exposure. Recommendations include strengthening accreditation standards and processes possibly by introducing regional or national student assessment, developing defensible student assessment systems, recognizing health profession education as a field of scholarship, and creating a tiered approach to faculty development in education. The relevance of Flexner's recommendations to the current status of medical education in South Asia is striking, in terms of both the progressive nature of his thinking in 1910 and the need to improve medical education in Asia today. In a highly connected world, the improvement of Asian medical education will have a global impact.

  10. Putting the "ecology" into environmental flows: ecological dynamics and demographic modelling.

    PubMed

    Shenton, Will; Bond, Nicholas R; Yen, Jian D L; Mac Nally, Ralph

    2012-07-01

    There have been significant diversions of water from rivers and streams around the world; natural flow regimes have been perturbed by dams, barriers and excessive extractions. Many aspects of the ecological 'health' of riverine systems have declined due to changes in water flows, which has stimulated the development of thinking about the maintenance and restoration of these systems, which we refer to as environmental flow methodologies (EFMs). Most existing EFMs cannot deliver information on the population viability of species because they: (1) use habitat suitability as a proxy for population status; (2) use historical time series (usually of short duration) to forecast future conditions and flow sequences; (3) cannot, or do not, handle extreme flow events associated with climate variability; and (4) assume process stationarity for flow sequences, which means the past sequences are treated as good indicators of the future. These assumptions undermine the capacity of EFMs to properly represent risks associated with different flow management options; assumption (4) is untenable given most climate-change predictions. We discuss these concerns and advocate the use of demographic modelling as a more appropriate tool for linking population dynamics to flow regime change. A 'meta-species' approach to demographic modelling is discussed as a useful step from habitat based models towards modelling strategies grounded in ecological theory when limited data are available on flow-demographic relationships. Data requirements of demographic models will undoubtedly expose gaps in existing knowledge, but, in so doing, will strengthen future efforts to link changes in river flows with their ecological consequences.
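
    As a minimal illustration of the demographic modelling the authors advocate, the sketch below projects a two-stage (juvenile/adult) population through a sequence of flow years; every rate and the flow sequence are invented for illustration:

    ```python
    import numpy as np

    def projection_matrix(flow_ok: bool) -> np.ndarray:
        """Stage-structured (Leslie-type) matrix whose vital rates depend on flows."""
        juvenile_survival = 0.40 if flow_ok else 0.15   # recruitment needs adequate flows
        adult_survival    = 0.80 if flow_ok else 0.70
        fecundity         = 2.5
        return np.array([[0.0,               fecundity * adult_survival],
                         [juvenile_survival, adult_survival]])

    pop = np.array([100.0, 50.0])   # juveniles, adults
    flows = [True, True, False, False, True, False, False, False]  # a drying sequence
    for ok in flows:
        pop = projection_matrix(ok) @ pop

    print(f"Population after {len(flows)} years: {pop.sum():.0f}")
    ```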

  11. Memory as embodiment: The case of modality and serial short-term memory.

    PubMed

    Macken, Bill; Taylor, John C; Kozlov, Michail D; Hughes, Robert W; Jones, Dylan M

    2016-10-01

    Classical explanations for the modality effect-superior short-term serial recall of auditory compared to visual sequences-typically recur to privileged processing of information derived from auditory sources. Here we critically appraise such accounts, and re-evaluate the nature of the canonical empirical phenomena that have motivated them. Three experiments show that the standard account of modality in memory is untenable, since auditory superiority in recency is often accompanied by visual superiority in mid-list serial positions. We explain this simultaneous auditory and visual superiority by reference to the way in which perceptual objects are formed in the two modalities and how those objects are mapped to speech motor forms to support sequence maintenance and reproduction. Specifically, stronger obligatory object formation operating in the standard auditory form of sequence presentation compared to that for visual sequences leads both to enhanced addressability of information at the object boundaries and reduced addressability for that in the interior. Because standard visual presentation does not lead to such object formation, such sequences do not show the boundary advantage observed for auditory presentation, but neither do they suffer loss of addressability associated with object information, thereby affording more ready mapping of that information into a rehearsal cohort to support recall. We show that a range of factors that impede this perceptual-motor mapping eliminate visual superiority while leaving auditory superiority unaffected. We make a general case for viewing short-term memory as an embodied, perceptual-motor process. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  12. Does the history of food energy units suggest a solution to "Calorie confusion"?

    PubMed

    Hargrove, James L

    2007-12-17

    The Calorie (kcal) of present U.S. food labels is similar to the original French definition of 1825. The original published source (now available on the internet) defined the Calorie as the quantity of heat needed to raise the temperature of 1 kg of water from 0 to 1 degrees C. The Calorie originated in studies concerning fuel efficiency for the steam engine and had entered dictionaries by 1840. It was the only energy unit in English dictionaries available to W.O. Atwater in 1887 for his popular articles on food and tables of food composition. Therefore, the Calorie became the preferred unit of potential energy in nutrition science and dietetics, but was displaced when the joule, g-calorie and kcal were introduced. This article will explain the context in which Nicolas Clément-Desormes defined the original Calorie and the depth of his collaboration with Sadi Carnot. It will review the history of other energy units and show how the original Calorie was usurped during the period of international standardization. As a result, no form of the Calorie is recognized as an SI unit. It is untenable to continue to use the same word for different thermal units (g-calorie and kg-calorie) and to use different words for the same unit (Calorie and kcal). The only valid use of the Calorie is in common speech and public nutrition education. To avoid ongoing confusion, scientists should complete the transition to the joule and cease using kcal in any context.
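
    For reference, the standard thermochemical relations among the units discussed, stated in a short note of our own rather than quoted from the article:

    ```latex
    % 1 g-calorie (cal) = 4.184 J (exact, thermochemical definition)
    % 1 kg-calorie (kcal) = 1 food Calorie = 4.184 kJ
    \[
      2000\ \text{Calories} \times 4.184\ \tfrac{\text{kJ}}{\text{kcal}}
      = 8368\ \text{kJ} \approx 8.4\ \text{MJ}
    \]
    ```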

  13. Planning For Multiple NASA Missions With Use Of Enabling Radioisotope Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S.G. Johnson; K.L. Lively; C.C. Dwight

    Since the early 1960s the Department of Energy (DOE) and its predecessor agencies have provided radioisotope power systems (RPS) to NASA as an enabling technology for deep space and various planetary missions. They provide reliable power in situations where solar and/or battery power sources are either untenable or would place an undue mass burden on the mission. In the modern era of the past twenty years there has been no time that multiple missions have been considered for launching from Kennedy Space Center (KSC) during the same year. The closest proximity of missions that involved radioisotope power systems would be that of Galileo (October 1989) and Ulysses (October 1990). The closest that involved radioisotope heater units would be the small rovers Spirit and Opportunity (May and July 2003) used in the Mars Exploration Rovers (MER) mission. It can be argued that the rovers sent to Mars in 2003 were essentially a special case since they staged in the same facility and used a pair of small launch vehicles (Delta II). This paper examines constraints on the frequency of use of radioisotope power systems with regard to launching them from Kennedy Space Center using currently available launch vehicles. This knowledge may be useful as NASA plans for its future deep space or planetary missions where radioisotope power systems are used as an enabling technology. Previous descriptions have focused on single-mission chronologies and have not analyzed the timelines with an emphasis on multiple missions.

  14. Circumcision Is Unethical and Unlawful.

    PubMed

    Svoboda, J Steven; Adler, Peter W; Van Howe, Robert S

    2016-06-01

    The foreskin is a complex structure that protects and moisturizes the head of the penis, and, being the most densely innervated and sensitive portion of the penis, is essential to providing the complete sexual response. Circumcision, the removal of this structure, is non-therapeutic, painful, irreversible surgery that also risks serious physical injury, psychological sequelae, and death. Men rarely volunteer for it, and increasingly circumcised men are expressing their resentment about it. Circumcision is usually performed for religious, cultural and personal reasons. Early claims about its medical benefits have been proven false. The American Academy of Pediatrics and the Centers for Disease Control and Prevention have made many scientifically untenable claims promoting circumcision that run counter to the consensus of Western medical organizations. Circumcision violates the cardinal principles of medical ethics, to respect autonomy (self-determination), to do good, to do no harm, and to be just. Without a clear medical indication, circumcision must be deferred until the child can provide his own fully informed consent. In 2012, a German court held that circumcision constitutes criminal assault. Under existing United States law and international human rights declarations as well, circumcision already violates boys' absolute rights to equal protection, bodily integrity, autonomy, and freedom to choose their own religion. A physician has a legal duty to protect children from unnecessary interventions. Physicians who obtain parental permission through spurious claims or omissions, or rely on the American Academy of Pediatrics' position, also risk liability for misleading parents about circumcision. © 2016 American Society of Law, Medicine & Ethics.

  15. Practical considerations in medical cannabis administration and dosing.

    PubMed

    MacCallum, Caroline A; Russo, Ethan B

    2018-03-01

    Cannabis has been employed medicinally throughout history, but its recent legal prohibition, biochemical complexity and variability, quality control issues, previous dearth of appropriately powered randomised controlled trials, and lack of pertinent education have conspired to leave clinicians in the dark as to how to advise patients pursuing such treatment. With the advent of pharmaceutical cannabis-based medicines (Sativex/nabiximols and Epidiolex), and liberalisation of access in certain nations, this ignorance of cannabis pharmacology and therapeutics has become untenable. In this article, the authors endeavour to present concise data on cannabis pharmacology related to tetrahydrocannabinol (THC), cannabidiol (CBD) and other cannabinoids, methods of administration (smoking, vaporisation, oral), and dosing recommendations. Adverse events of cannabis medicine pertain primarily to THC, whose total daily dose-equivalent should generally be limited to 30 mg/day or less, preferably in conjunction with CBD, to avoid psychoactive sequelae and development of tolerance. CBD, in contrast to THC, is less potent, and may require much higher doses for its adjunctive benefits on pain, inflammation, and attenuation of THC-associated anxiety and tachycardia. Dose initiation should commence at modest levels, and titration of any cannabis preparation should be undertaken slowly over a period of as much as two weeks. Suggestions are offered on cannabis-drug interactions, patient monitoring, and standards of care, while special cases for cannabis therapeutics are addressed: epilepsy, cancer palliation and primary treatment, chronic pain, use in the elderly, Parkinson disease, paediatrics, with concomitant opioids, and in relation to driving and hazardous activities. Copyright © 2018 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.

  16. Diabetes benefit management: evolving strategies for payers.

    PubMed

    Tzeel, Albert L

    2011-11-01

    Over the next quarter century, the burden of type 2 diabetes mellitus (T2DM) is expected to at least double. Currently, 1 in every 10 healthcare dollars is spent on diabetes management; by 2050, it has been projected that the annual costs of managing T2DM will rise to $336 billion. Without substantial, systemic changes, T2DM management costs will lead to a potentially untenable strain on the healthcare system. However, the appropriate management of diabetes can reduce associated mortality and delay comorbidities. In addition, adequate glycemic control can improve patient outcomes and significantly reduce diabetes-related complications. This article provides an overview of key concepts associated with a value-based insurance design (VBID) approach to T2DM coverage. By promoting the use of services or treatments that provide high benefits relative to cost, and by alternatively discouraging patients from utilizing services whose benefits do not justify their cost, VBID improves the quality of healthcare while simultaneously reining in spending. VBID initiatives tend to focus on chronic disease management and generally target prescription drug use. However, some programs have expanded their scope by incorporating services traditionally offered by wellness and disease management programs. The concept of VBID is growing, and it is increasingly being implemented by a diverse and growing number of public and private entities, including pharmacy benefit managers, health plans, and employers. This article provides key background on VBID strategies, with a focus on T2DM management. It also provides a road map for health plans seeking to implement VBID as part of their programs.

  17. Optimizing diabetes management: managed care strategies.

    PubMed

    Tzeel, E Albert

    2013-06-01

    Both the prevalence of type 2 diabetes mellitus (DM) and its associated costs have been rising over time and are projected to continue to escalate. Therefore, type 2 DM (T2DM) management costs represent a potentially untenable strain on the healthcare system unless substantial, systemic changes are made. Managed care organizations (MCOs) are uniquely positioned to attempt to make the changes necessary to reduce the burdens associated with T2DM by developing policies that align with evidence-based DM management guidelines and other resources. For example, MCOs can encourage members to implement healthy lifestyle choices, which have been shown to reduce DM-associated mortality and delay comorbidities. In addition, MCOs are exploring the strengths and weaknesses of several different benefit plan designs. Value-based insurance designs, sometimes referred to as value-based benefit designs, use both direct and indirect data to invest in incentives that change behaviors through health information technologies, communications, and services to improve health, productivity, quality, and financial trends. Provider incentive programs, sometimes referred to as "pay for performance," represent a payment/delivery paradigm that places emphasis on rewarding value instead of volume to align financial incentives and quality of care. Accountable care organizations emphasize an alignment between reimbursement and implementation of best practices through the use of disease management and/or clinical pathways and health information technologies. Consumer-directed health plans, or high-deductible health plans, combine lower premiums with high annual deductibles to encourage members to seek better value for health expenditures. Studies conducted to date on these different designs have produced mixed results.

  18. "This base stallion trade": he-whores and male sexuality on the early modern stage.

    PubMed

    Panek, Jennifer

    2010-01-01

    Recent scholarship on early modern male sexuality has stressed the threat that sexual relations with women were believed to pose to manhood. Focusing on such plays as Middleton's Your Five Gallants (c. 1608), Fletcher and Massinger's The Custom of The Country (c. 1620), and Davenant's The Just Italian (1630), this paper analyzes representations of male prostitutes for women to argue that cultural attitudes toward male sexual performance were more complex and self-contradictory than generally acknowledged. The patriarchal codes that warned against effeminating sexual desire and advocated parsimonious seminal “spending” are undermined by their own inherent corollary: the most masculine man is one who can demonstrate unlimited seminal capacity. Furthermore, it has been posited that the early modern period marked the beginning of a shift from “reproductive” to “performative” constructions of manhood, in which the manhood-affirming aspects of male sexuality gradually became unmoored from their traditional association with bloodlines and attached instead to penetrative sexual conquest. The class implications of this shift inform patriarchal anxieties about the superior sexual stamina of servant-class men and their bodily “service” to elite women. Representing a fantasy of empowering male sexuality that relies on detaching virile performance from effeminating desire—a physiologically absurd notion—and on providing sexual “service” while leaving intact both class and gender hierarchies, a successful he-whore like Middleton's Tailby or Davenant's Sciolto playfully challenges the dictates of patriarchal masculinity by fulfilling them in absurd and unorthodox ways. Ultimately, he illuminates just how untenable those dictates might be.

  19. What if Finding Data was as Easy as Subscribing to the News?

    NASA Astrophysics Data System (ADS)

    Duerr, R. E.

    2011-12-01

    Data are the "common wealth of humanity," the fuel that drives the sciences; but much of the data that exist are inaccessible, buried in one of numerous stove-piped data systems, or entirely hidden unless you have direct knowledge of and contact with the investigator that acquired them. Much of the "wealth" is squandered and overall scientific progress inhibited, a situation that is becoming increasingly untenable with the openness required by data-driven science. What is needed is a set of simple interoperability protocols and advertising mechanisms that allow data from disparate data systems to be easily discovered, explored, and accessed. The tools must be simple enough that individual investigators can use them without IT support. The tools cannot rely on centralized repositories or registries but must enable the development of ad-hoc or special purpose aggregations of data and services tailored to individual community needs. In addition, the protocols must scale to support the discovery of and access to the holdings of the global, interdisciplinary community, be they individual investigators or major data centers. NSIDC, in conjunction with other members of the Federation of Earth Science Information Partners and the Polar Information Commons, is working on just such a suite of tools and protocols. In this talk, I discuss data and service casting, aggregation, data badging, and OpenSearch - a suite of tools and protocols which, used in conjunction, has the potential to completely change the way that data and services worldwide are discovered and used.
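    As a concrete illustration of the OpenSearch piece of this suite: a service advertises a description document containing a URL template, and a client substitutes its search terms into that template. Below is a minimal sketch in Python; the service URL and description document are hypothetical, and only the standard {searchTerms} parameter is handled.

```python
import urllib.parse
import xml.etree.ElementTree as ET

OSD_NS = "{http://a9.com/-/spec/opensearch/1.1/}"

# A toy OpenSearch description document, as a service might advertise it.
# The endpoint is invented for illustration.
description = """<?xml version="1.0" encoding="UTF-8"?>
<OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
  <ShortName>Example data search</ShortName>
  <Url type="application/atom+xml"
       template="https://data.example.org/search?q={searchTerms}"/>
</OpenSearchDescription>"""

def build_query(osd_xml, terms):
    """Fill the {searchTerms} slot of the first Url template found."""
    root = ET.fromstring(osd_xml)
    template = root.find(f"{OSD_NS}Url").attrib["template"]
    return template.replace("{searchTerms}", urllib.parse.quote(terms))

print(build_query(description, "sea ice extent"))
# -> https://data.example.org/search?q=sea%20ice%20extent
```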

  20. Is steady-state capitalism viable? A review of the issues and an answer in the affirmative.

    PubMed

    Lawn, Philip

    2011-02-01

    Most ecological economists believe that the transition to a steady-state economy is necessary to ensure ecological sustainability and to maximize a nation's economic welfare. While some observers agree with the necessity of the steady-state economy, they are nonetheless critical of the suggestion made by ecological economists, in particular Herman Daly, that a steady-state economy is compatible with a capitalist system. First, they believe that steady-state capitalism is based on the untenable assumption that growth is an optional rather than in-built element of capitalism. Second, they argue that capitalist notions of efficient resource allocation are too restrictive to facilitate the transition to an "ecological" or steady-state economy. I believe these observers are outright wrong in their first criticism and, because they misunderstand Daly's vision of a steady-state economy, misplaced in their second. The nature of a capitalist system depends upon the institutional framework that supports and shapes it. Hence, a capitalist system can exist in a wide variety of forms. Unfortunately, many observers fail to recognize that the current "growth imperative" is the result of capitalist systems everywhere being institutionally designed to grow. They need not be designed this way to survive and thrive. Indeed, because continued growth is both existentially undesirable and ecologically unsustainable, redesigning capitalist systems through the introduction of Daly-like institutions would prove to be capitalism's savior. What's more, it would constitute humankind's best hope of achieving sustainable development. © 2011 New York Academy of Sciences.

  1. European Society of Veterinary Cardiology screening guidelines for dilated cardiomyopathy in Doberman Pinschers.

    PubMed

    Wess, G; Domenech, O; Dukes-McEwan, J; Häggström, J; Gordon, S

    2017-10-01

    Dilated cardiomyopathy (DCM) is the most common cardiac disease in large breed dogs and is inherited in Doberman Pinschers with a high prevalence (58%). The European Society for Veterinary Cardiology convened a task force to formulate screening guidelines for DCM in Dobermans. Screening for occult DCM in Dobermans should start at three years of age and use both Holter monitoring and echocardiography. Yearly screening over the life of the dog is recommended, as a one-time screening is not sufficient to rule out future development of DCM. The preferred echocardiographic method is the measurement of the left ventricular volume by Simpson's method of discs (SMOD). Less than 50 single ventricular premature complexes (VPCs) in 24 h are considered to be normal in Dobermans, although detection of any number of VPCs is cause for concern. Greater than 300 VPCs in 24 h, or two subsequent recordings within a year showing between 50 and 300 VPCs in 24 h, is considered diagnostic of occult DCM in Dobermans regardless of the concurrent echocardiographic findings. The guidelines also provide recommendations concerning ancillary tests that are not included in the standard screening protocol but which may have some utility when the recommended tests are not available or financially untenable on an annual basis. These tests include assay of cardiac biomarkers (Troponin I and N-Terminal pro-B-type Natriuretic Peptide) as well as a 5-min resting electrocardiogram (ECG). The current guidelines should help to establish an early diagnosis of DCM in Dobermans. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
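    The VPC criteria above amount to a simple decision rule. The sketch below is a toy encoding of those stated thresholds; the function and category names are illustrative, not part of the guidelines.

```python
def classify_holter(vpc_count_24h, prior_year_vpc_count=None):
    """Toy encoding of the stated VPC screening thresholds for Dobermans.

    vpc_count_24h: single VPCs in the current 24 h Holter recording.
    prior_year_vpc_count: count from a second recording within the same
    year, if available (None otherwise). Category labels are illustrative.
    """
    if vpc_count_24h > 300:
        return "diagnostic of occult DCM"
    if 50 <= vpc_count_24h <= 300:
        # Two recordings in the 50-300 range within a year are diagnostic.
        if prior_year_vpc_count is not None and 50 <= prior_year_vpc_count <= 300:
            return "diagnostic of occult DCM"
        return "suspect; repeat Holter within the year"
    if vpc_count_24h > 0:
        return "below 50 in 24 h, but any VPC is cause for concern"
    return "normal"

print(classify_holter(420))        # diagnostic
print(classify_holter(120, 80))    # two 50-300 recordings -> diagnostic
print(classify_holter(20))         # concern, but within stated normal limit
```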

  2. Putting the "Ecology" into Environmental Flows: Ecological Dynamics and Demographic Modelling

    NASA Astrophysics Data System (ADS)

    Shenton, Will; Bond, Nicholas R.; Yen, Jian D. L.; Mac Nally, Ralph

    2012-07-01

    There have been significant diversions of water from rivers and streams around the world; natural flow regimes have been perturbed by dams, barriers and excessive extractions. Many aspects of the ecological 'health' of riverine systems have declined due to changes in water flows, which has stimulated the development of approaches to the maintenance and restoration of these systems, which we refer to as environmental flow methodologies (EFMs). Most existing EFMs cannot deliver information on the population viability of species because they: (1) use habitat suitability as a proxy for population status; (2) use historical time series (usually of short duration) to forecast future conditions and flow sequences; (3) cannot, or do not, handle extreme flow events associated with climate variability; and (4) assume process stationarity for flow sequences, which means the past sequences are treated as good indicators of the future. These assumptions undermine the capacity of EFMs to properly represent risks associated with different flow management options; assumption (4) is untenable given most climate-change predictions. We discuss these concerns and advocate the use of demographic modelling as a more appropriate tool for linking population dynamics to flow regime change. A 'meta-species' approach to demographic modelling is discussed as a useful step from habitat-based models towards modelling strategies grounded in ecological theory when limited data are available on flow-demographic relationships. Data requirements of demographic models will undoubtedly expose gaps in existing knowledge, but, in so doing, will strengthen future efforts to link changes in river flows with their ecological consequences.
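    As a minimal illustration of the demographic modelling advocated here, the sketch below couples a flow-dependent juvenile survival rate to a two-stage matrix population model and projects abundance under contrasting flow sequences. All parameter values are hypothetical, not calibrated to any species.

```python
import numpy as np

def projection_matrix(flow):
    """Two-stage (juvenile, adult) matrix; juvenile survival rises with flow.
    Parameter values are illustrative only."""
    s_juv = 0.1 + 0.4 * min(flow, 1.0)   # flow scaled to [0, 1]
    s_ad, fecundity = 0.5, 2.0
    return np.array([[0.0,   fecundity],
                     [s_juv, s_ad     ]])

def project(flows, n0=(100.0, 100.0)):
    """Project (juvenile, adult) abundance through a sequence of annual flows."""
    n = np.array(n0)
    for f in flows:
        n = projection_matrix(f) @ n
    return n

rng = np.random.default_rng(1)
wet = rng.uniform(0.6, 1.0, size=20)   # 20 years of high flows
dry = rng.uniform(0.0, 0.3, size=20)   # 20 years of low flows
print("wet sequence ->", project(wet).round(1))   # population grows
print("dry sequence ->", project(dry).round(1))   # population declines
```

In a real application the flow-survival relationship and vital rates would be estimated from data; the point of the sketch is only that population trajectories, not habitat indices, become the output of the analysis.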

  3. Commentary on fast atmospheric pulsations. Technical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vampola, A.L.

    A recent paper proposed that Fast Atmospheric Light Pulsations (FAPs), which have been observed at L=1.5-2.2 in the northern hemisphere, are optical signatures of >2-MeV electrons associated with Lightning-induced Electron Precipitation (LEP) events produced by lightning strokes in the southern hemisphere. FAPs cannot be produced by >2-MeV electrons in the inner radiation belt because the upper limit for fluxes of such particles is only about 0.2% of the value that was used in the analysis and would lead to an unrealistically short electron lifetime. The discrepancy comes from using an electron model, AE-2, which included the Starfish fission electrons. Later inner-zone electron environment models show the inner zone to have negligible fluxes of electrons in excess of 2 MeV. The use of a model in which southern hemisphere lightning strokes result in northern hemisphere FAPs via a cyclotron mode interaction between magnetospheric electrons and lightning-generated waves is also untenable because it would result in FAP intensities two orders of magnitude greater in the southern hemisphere than in the northern hemisphere, leading to a further two orders of magnitude reduction in estimated inner-zone electron lifetimes. The estimated light intensity of FAPs is within acceptable bounds compared to the lifetime of inner-zone electrons if all electrons above 100 keV contribute to the light production, if southern hemisphere FAP intensity is no greater than the FAP intensity observed in the northern hemisphere, and if the light-production efficiency is of the order of 0.001.

  4. Informed consent: Enforcing pharmaceutical companies' obligations abroad.

    PubMed

    Lee, Stacey B

    2010-06-15

    The past several years have seen an evolution in the obligations of pharmaceutical companies conducting clinical trials abroad. Key players, such as international human rights organizations, multinational pharmaceutical companies, the United States government and courts, and the media, have played a significant role in defining these obligations. This article examines how such obligations have developed through the lens of past, present, and future recommendations for informed consent protections. In doing so, this article suggests that, no matter how robust obligations appear, they will continue to fall short of providing meaningful protection until they are accompanied by a substantive enforcement mechanism that holds multinational pharmaceutical companies accountable for their conduct. Issues of national sovereignty, particularly in the United States, will continue to prevent meaningful enforcement by an international tribunal or through one universally adopted code of ethics. This article argues that, rather than continuing to pursue an untenable international approach, the Alien Tort Statute (ATS) offers a viable enforcement mechanism, at least for US-based pharmaceutical companies. Recent federal appellate court precedent interpreting the ATS provides the mechanism for granting victims redress and enforcing accountability of sponsors (usually pharmaceutical companies and research and academic institutions) for informed consent misconduct. Substantive human rights protections are vital in order to ensure that every person can realize the "right to health." This article concludes that by building on the federal appellate court's ATS analysis, which grants foreign trial participants the right to pursue claims of human rights violations in US courts, a mechanism can be created for enforcing not only substantive informed consent, but also human rights protections.

  5. On the convexity of ROC curves estimated from radiological test results.

    PubMed

    Pesce, Lorenzo L; Metz, Charles E; Berbaum, Kevin S

    2010-08-01

    Although an ideal observer's receiver operating characteristic (ROC) curve must be convex (i.e., its slope must decrease monotonically), published fits to empirical data often display "hooks." Such fits sometimes are accepted on the basis of an argument that experiments are done with real, rather than ideal, observers. However, the fact that ideal observers must produce convex curves does not imply that convex curves describe only ideal observers. This article aims to identify the practical implications of nonconvex ROC curves and the conditions that can lead to empirical or fitted ROC curves that are not convex. This article views nonconvex ROC curves from historical, theoretical, and statistical perspectives, which we describe briefly. We then consider population ROC curves with various shapes and analyze the types of medical decisions that they imply. Finally, we describe how sampling variability and curve-fitting algorithms can produce ROC curve estimates that include hooks. We show that hooks in population ROC curves imply the use of an irrational decision strategy, even when the curve does not cross the chance line, and therefore usually are untenable in medical settings. Moreover, we sketch a simple approach to improve any nonconvex ROC curve by adding statistical variation to the decision process. Finally, we sketch how to test whether hooks present in ROC data are likely to have been caused by chance alone and how some hooked ROCs found in the literature can be easily explained as fitting artifacts or modeling issues. In general, ROC curve fits that show hooks should be looked on with suspicion unless other arguments justify their presence. Copyright © 2010 AUR. Published by Elsevier Inc. All rights reserved.
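    One standard repair for a "hook" is to replace the fitted curve with its upper convex hull, which is the curve obtained by letting the decision maker randomize between the operating points the hook connects, i.e., one way of adding statistical variation to the decision process. A minimal sketch, using a monotone-chain hull over illustrative (FPF, TPF) points:

```python
def upper_convex_hull(points):
    """Upper convex hull of ROC operating points (FPF, TPF).

    The hull is the ROC curve of a decision maker allowed to randomize
    between operating points; it dominates any curve with 'hooks'.
    """
    pts = sorted(set(points))            # by FPF, then TPF
    hull = []
    for p in pts:
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            # Drop the middle point if it lies on or below the chord
            # from hull[-2] to p (non-clockwise turn).
            if (x2 - x1) * (p[1] - y1) >= (y2 - y1) * (p[0] - x1):
                hull.pop()
            else:
                break
        hull.append(p)
    return hull

# A hooked curve: (0.5, 0.55) dips below the chord from (0.2, 0.6) to
# (0.6, 0.85), signalling an irrational operating point.
roc = [(0.0, 0.0), (0.2, 0.6), (0.5, 0.55), (0.6, 0.85), (1.0, 1.0)]
print(upper_convex_hull(roc))
# -> [(0.0, 0.0), (0.2, 0.6), (0.6, 0.85), (1.0, 1.0)]
```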

  6. The limits of scientific information for informing forest policy decisions under changing climate

    NASA Astrophysics Data System (ADS)

    McLachlan, J. S.

    2011-12-01

    The distribution of tree species is largely determined by climate, with important consequences for ecosystem function, biodiversity, and the human economy. In the past, conflicts about priority among these various goods have produced persistent debate about forest policy and management. Despite this history of conflict, there has been general agreement on the framework for the debate: Our benchmark for assessing human impact is generally some historical condition (in the New World, this is often pre-European settlement). Wilderness is to be managed with minimal human intervention. Native species are preferred over non-natives. And regional landscapes can be effectively partitioned into independent jurisdictions with different management priorities. Each of these principles was always somewhat mythical, but the dynamics of broad scale species range shifts under climate change make all of them untenable in the future. Managed relocation (MR, or assisted migration) is a controversial proposal partly because it demands scientific answers that we do not have: Are trees naturally capable of shifting their ranges as fast as climate will force them? Will deliberate introductions of species beyond their native ranges have adverse impacts on the receiving ecosystem? What are appropriate targets for hydrologic or fire management under novel no-analog climates? However, these demands on science mask a more fundamental concern: the ethical framework underlying existing forest policy is unsupported in the context of long-term non-stationary environmental trends. Whether or not we conclude that MR is a useful policy option, debate about MR is useful because it forces us to place the global change ecology agenda in a larger ethical debate about our goals when managing novel ecosystems.

  7. Weak, strong, and coherent regimes of Fröhlich condensation and their applications to terahertz medicine and quantum consciousness

    PubMed Central

    Reimers, Jeffrey R.; McKemmish, Laura K.; McKenzie, Ross H.; Mark, Alan E.; Hush, Noel S.

    2009-01-01

    In 1968, Fröhlich showed that a driven set of oscillators can condense with nearly all of the supplied energy activating the vibrational mode of lowest frequency. This is a remarkable property usually compared with Bose–Einstein condensation, superconductivity, lasing, and other unique phenomena involving macroscopic quantum coherence. However, despite intense research, no unambiguous example has been documented. We determine the most likely experimental signatures of Fröhlich condensation and show that they are significant features remote from the extraordinary properties normally envisaged. Fröhlich condensates are classified into 3 types: weak condensates in which profound effects on chemical kinetics are possible, strong condensates in which an extremely large amount of energy is channeled into 1 vibrational mode, and coherent condensates in which this energy is placed in a single quantum state. Coherent condensates are shown to involve extremely large energies, to not be produced by the Wu–Austin dynamical Hamiltonian that provides the simplest depiction of Fröhlich condensates formed using mechanically supplied energy, and to be extremely fragile. They are inaccessible in a biological environment. Hence the Penrose–Hameroff orchestrated objective-reduction model and related theories for cognitive function that embody coherent Fröhlich condensation as an essential element are untenable. Weak condensates, however, may have profound effects on chemical and enzyme kinetics, and may be produced from biochemical energy or from radio frequency, microwave, or terahertz radiation. Pokorný's observed 8.085-MHz microtubulin resonance is identified as a possible candidate, with microwave reactors (green chemistry) and terahertz medicine appearing as other feasible sources. PMID:19251667

  8. On the Viability of Conspiratorial Beliefs

    PubMed Central

    Grimes, David Robert

    2016-01-01

    Conspiratorial ideation is the tendency of individuals to believe that events and power relations are secretly manipulated by certain clandestine groups and organisations. Many of these ostensibly explanatory conjectures are non-falsifiable, lacking in evidence or demonstrably false, yet public acceptance remains high. Efforts to convince the general public of the validity of medical and scientific findings can be hampered by such narratives, which can create the impression of doubt or disagreement in areas where the science is well established. Conversely, historical examples of exposed conspiracies do exist and it may be difficult for people to differentiate between reasonable and dubious assertions. In this work, we establish a simple mathematical model for conspiracies involving multiple actors with time, which yields failure probability for any given conspiracy. Parameters for the model are estimated from literature examples of known scandals, and the factors influencing conspiracy success and failure are explored. The model is also used to estimate the likelihood of claims from some commonly-held conspiratorial beliefs; these are namely that the moon-landings were faked, climate-change is a hoax, vaccination is dangerous and that a cure for cancer is being suppressed by vested interests. Simulations of these claims predict that intrinsic failure would be imminent even with the most generous estimates for the secret-keeping ability of active participants—the results of this model suggest that large conspiracies (≥1000 agents) quickly become untenable and prone to failure. The theory presented here might be useful in counteracting the potentially deleterious consequences of bogus and anti-science narratives, and examining the hypothetical conditions under which sustainable conspiracy might be possible. PMID:26812482
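    The flavor of such a model can be captured in a few lines. The sketch below is a simplified constant-population version: each of N agents independently exposes the conspiracy with some small annual probability p. The value of p used here is only an order-of-magnitude assumption, and the published model is richer (for instance, it allows the number of conspirators to decay over time):

```python
def failure_probability(n_agents, years, p_leak=4e-4):
    """Probability that at least one of n_agents exposes the secret
    within `years`, assuming independent per-agent annual leak
    probability p_leak. A simplification of the published model, which
    also lets the conspirator population N(t) change over time."""
    # Secrecy survives only if every agent stays silent every year.
    return 1.0 - (1.0 - p_leak) ** (n_agents * years)

for n in (100, 1000, 10000, 100000):
    print(f"N = {n:6d}: P(failure within 5 yr) = "
          f"{failure_probability(n, 5):.3f}")
```

Even with this toy version, the qualitative conclusion of the paper is visible: the failure probability climbs rapidly with the number of agents, so conspiracies requiring thousands of silent participants become untenable within a few years.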

  9. DDT and Malaria Prevention: Addressing the Paradox

    PubMed Central

    Bouwman, Hindrik; van den Berg, Henk; Kylin, Henrik

    2011-01-01

    Background: The debate regarding dichlorodiphenyltrichloroethane (DDT) in malaria prevention and human health is polarized and can be classified into three positions: anti-DDT, centrist-DDT, and pro-DDT. Objective: We attempted to arrive at a synthesis by matching a series of questions on the use of DDT for indoor residual spraying (IRS) with literature and insights, and to identify options and opportunities. Discussion: Overall, community health is significantly improved through all available malaria control measures, which include IRS with DDT. Is DDT “good”? Yes, because it has saved many lives. Is DDT safe as used in IRS? Recent publications have increasingly raised concerns about the health implications of DDT. Therefore, an unqualified statement that DDT used in IRS is safe is untenable. Are inhabitants and applicators exposed? Yes, and to high levels. Should DDT be used? The fact that DDT is “good” because it saves lives, and “not safe” because it has health and environmental consequences, raises ethical issues. The evidence of adverse human health effects due to DDT is mounting. However, under certain circumstances, malaria control using DDT cannot yet be halted. Therefore, the continued use of DDT poses a paradox recognized by a centrist-DDT position. At the very least, it is now time to invoke precaution. Precautionary actions could include use and exposure reduction. Conclusions: There are situations where DDT will provide the best achievable health benefit, but maintaining that DDT is safe ignores the cumulative indications of many studies. In such situations, addressing the paradox from a centrist-DDT position and invoking precaution will help design choices for healthier lives. PMID:21245017

  10. British torture in the ‘war on terror’

    PubMed Central

    Blakeley, Ruth; Raphael, Sam

    2016-01-01

    Despite long-standing allegations of UK involvement in prisoner abuse during counterterrorism operations as part of the US-led ‘war on terror’, a consistent narrative emanating from British government officials is that Britain neither uses, condones nor facilitates torture or other cruel, inhuman or degrading treatment and punishment. We argue that such denials are untenable. We have established beyond reasonable doubt that Britain has been deeply involved in post-9/11 prisoner abuse, and we can now provide the most detailed account to date of the depth of this involvement. We argue that it is possible to identify a peculiarly British approach to torture in the ‘war on terror’, which is particularly well-suited to sustaining a narrative of denial. To explain the nature of UK involvement, we argue that it can be best understood within the context of how law and sovereign power have come to operate during the ‘war on terror’. We turn here to the work of Judith Butler, and explore the role of Britain as a ‘petty sovereign’, operating under the state of exception established by the US executive. UK authorities have not themselves suspended the rule of law so overtly; indeed, they have repeatedly insisted on their commitment to it. Nevertheless, they have been able to construct a rhetorical, legal and policy ‘scaffold’ that has enabled them to demonstrate at least procedural adherence to human rights norms while, at the same time, allowing UK officials to acquiesce in the arbitrary exercise of sovereignty over individuals who are denied any access to appropriate representation or redress in compliance with the rule of law. PMID:29708134

  12. Collaboration Between Medical Providers and Dental Hygienists in Pediatric Health Care.

    PubMed

    Braun, Patricia A; Cusick, Allison

    2016-06-01

    Basic preventive oral services for children can be provided within the medical home through the collaborative care of medical providers and dental hygienists to expand access for vulnerable populations. Because dental caries is a largely preventable disease, it is untenable that it remains the most common chronic disease of childhood. Leveraging the multiple visits children have with medical providers has potential to expand access to early preventive oral services. Developing interprofessional relationships between dental providers, including dental hygienists, and medical providers is a strategic approach to symbiotically expand access to dental care. Alternative care delivery models that provide dental services in the medical home expand access to these services for vulnerable populations. The purpose of this article is to explore 4 innovative care models aimed to expand access to dental care. Current activities in Colorado and around the nation are described regarding the provision of basic preventive oral health services (eg, fluoride varnish) by medical providers with referral to a dentist (expanded coordinated care), the colocation of dental hygiene services into the medical home (colocated care), the integration of a dental hygienist into the medical care team (integrated care), and the expansion of the dental home into the community setting through telehealth-enabled teams (virtual dental home). Gaps in evidence regarding the impacts of these models are elucidated. Bringing preventive and restorative dental services to the patient both in the medical home and in the community has potential to reduce long-standing barriers to receive these services, improve oral health outcomes of vulnerable patients, and decrease oral health disparities. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Foundational errors in the Neutral and Nearly-Neutral theories of evolution in relation to the Synthetic Theory: is a new evolutionary paradigm necessary?

    PubMed

    Valenzuela, Carlos Y

    2013-01-01

    The Neutral Theory of Evolution (NTE) proposes mutation and random genetic drift as the most important evolutionary factors. The most conspicuous feature of evolution is the genomic stability during paleontological eras and lack of variation among taxa; 98% or more of nucleotide sites are monomorphic within a species. NTE explains this homology by random fixation of neutral bases and negative selection (purifying selection) that does not contribute either to evolution or polymorphisms. Purifying selection is insufficient to account for this evolutionary feature, and the Nearly-Neutral Theory of Evolution (N-NTE) therefore included negative selection with coefficients as low as the mutation rate. These NTE and N-NTE propositions are untenable thermodynamically (the tendency to random distributions under the second law), biotically (recurrent mutation), and logically and mathematically (resilient equilibria instead of fixation by drift). Recurrent forward and backward mutation and random fluctuations of base frequencies alone in a site make life organization and fixations impossible. Drift is not a directional evolutionary factor, but a directional tendency of matter-energy processes (second law) which threatens biotic organization. Drift cannot drive evolution. In a site, the mutation rates among bases and selection coefficients determine the resilient equilibrium frequency of bases that genetic drift cannot change. The expected neutral random interaction among nucleotides is zero; however, huge interactions and periodicities were found between bases of dinucleotides separated by 1, 2... and more than 1,000 sites. Every base is co-adapted with the whole genome. Neutralists found that neutral evolution is independent of population size (N); thus neutral evolution should be independent of drift, because the effect of drift depends upon N. Also, chromosome size and shape as well as protein size are far from random.
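    The "resilient equilibrium" argument can be illustrated with a toy Wright-Fisher simulation: under recurrent forward and backward mutation, allele frequency fluctuates around the deterministic equilibrium nu/(mu + nu) instead of drifting to permanent fixation. The rates and population size below are illustrative, not taken from the paper:

```python
import numpy as np

def wright_fisher(n_pop=1000, mu=1e-3, nu=1e-3, generations=20000, seed=0):
    """Two-allele Wright-Fisher model with recurrent mutation.
    mu: A -> a rate; nu: a -> A rate. Returns the trajectory of freq(A)."""
    rng = np.random.default_rng(seed)
    p, traj = 0.5, []
    for _ in range(generations):
        p_mut = p * (1 - mu) + (1 - p) * nu                # mutation step
        p = rng.binomial(2 * n_pop, p_mut) / (2 * n_pop)   # drift step
        traj.append(p)
    return np.array(traj)

traj = wright_fisher()
print("mean freq(A), last 10k generations:", traj[-10000:].mean().round(3))
print("deterministic equilibrium nu/(mu+nu):", 1e-3 / (1e-3 + 1e-3))
# Neither allele fixes permanently: recurrent mutation keeps pulling the
# frequency back toward the resilient equilibrium.
```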

  14. Nothing prepared me to manage AIDS.

    PubMed

    Banas, G E

    1992-01-01

    Articles and seminars about AIDS in the workplace are not adequate preparation for the genuine problems faced by actual managers in real organizations. There are no easy, win-win solutions to the impossible dilemmas AIDS presents, only various forms of damage control and, at best, more or less humane compromises. Gary Banas knows. Over a period of four years, two of his direct reports developed AIDS, and he watched them suffer through debility, slowly deteriorating performance, and eventual death. He also watched the gradual decline of their subordinates' productivity and morale. He found that, to different degrees, both men refused to acknowledge their illness and their decreasing organizational effectiveness. One of them resisted the author's efforts to give him an easier job at no loss in salary. Both insisted on confidentiality long after the rumor mill had identified their problem. In the course of these two consecutive ordeals, Banas discovered that AIDS patients fall into no single, neat category. AIDS is not an issue but a disease, and the people who get it are human beings first and victims second. He also learned that AIDS affects everyone around the sick individual and that almost every choice a manager makes will injure someone. Finally, he came to understand that while managers have an unequivocal obligation to treat AIDS-afflicted employees with compassion and respect, they have an equally unequivocal obligation to keep their organizations functioning. "Don't let anyone kid you," Banas warns. "When you confront AIDS in the workplace, you will face untenable choices that seem to pit your obligation to humanity against your obligation to your organization." (ABSTRACT TRUNCATED AT 250 WORDS)

  15. Estimating Kinship in Admixed Populations

    PubMed Central

    Thornton, Timothy; Tang, Hua; Hoffmann, Thomas J.; Ochs-Balcom, Heather M.; Caan, Bette J.; Risch, Neil

    2012-01-01

    Genome-wide association studies (GWASs) are commonly used for the mapping of genetic loci that influence complex traits. A problem that is often encountered in both population-based and family-based GWASs is that of identifying cryptic relatedness and population stratification because it is well known that failure to appropriately account for both pedigree and population structure can lead to spurious association. A number of methods have been proposed for identifying relatives in samples from homogeneous populations. A strong assumption of population homogeneity, however, is often untenable, and many GWASs include samples from structured populations. Here, we consider the problem of estimating relatedness in structured populations with admixed ancestry. We propose a method, REAP (relatedness estimation in admixed populations), for robust estimation of identity by descent (IBD)-sharing probabilities and kinship coefficients in admixed populations. REAP appropriately accounts for population structure and ancestry-related assortative mating by using individual-specific allele frequencies at SNPs that are calculated on the basis of ancestry derived from whole-genome analysis. In simulation studies with related individuals and admixture from highly divergent populations, we demonstrate that REAP gives accurate IBD-sharing probabilities and kinship coefficients. We apply REAP to the Mexican Americans in Los Angeles, California (MXL) population sample of release 3 of phase III of the International Haplotype Map Project; in this sample, we identify third- and fourth-degree relatives who have not previously been reported. We also apply REAP to the African American and Hispanic samples from the Women's Health Initiative SNP Health Association Resource (WHI-SHARe) study, in which hundreds of pairs of cryptically related individuals have been identified. PMID:22748210
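    The core idea, centring each genotype by an individual-specific allele frequency derived from ancestry, can be sketched as follows. This is schematic only: the published REAP estimator differs in detail, and all simulation parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
M = 5000                                    # number of SNPs
p_anc = rng.uniform(0.05, 0.95, (2, M))     # allele freqs in 2 ancestral pops

def simulate_individual(q):
    """Genotype of an admixed individual with ancestry proportions q."""
    p_ind = q @ p_anc                       # individual-specific frequencies
    return rng.binomial(2, p_ind), p_ind

def kinship(g1, p1, g2, p2):
    """Schematic REAP-style kinship: centre genotypes by individual-specific
    frequencies and normalise by the corresponding variances (ratio of
    sums). The published estimator differs in detail."""
    num = np.sum((g1 - 2 * p1) * (g2 - 2 * p2))
    den = 4 * np.sum(np.sqrt(p1 * (1 - p1) * p2 * (1 - p2)))
    return num / den

qa = np.array([0.7, 0.3])
qb = np.array([0.2, 0.8])
g_a, p_a = simulate_individual(qa)
g_b, p_b = simulate_individual(qb)
print("unrelated pair, different admixture:", round(kinship(g_a, p_a, g_b, p_b), 3))
# Expect ~0: ancestry-driven allele sharing is absorbed by the
# individual-specific frequencies instead of inflating the kinship estimate,
# which is what a single population-average frequency would do.
```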

  16. Occupational health and safety in the least developed countries--a simple case of neglect.

    PubMed

    Ahasan, M R; Partanen, T

    2001-03-01

    In many of the least developed countries, working people are significantly exposed to a number of occupational problems that may result in a deterioration of their health, safety and well-being. These work-related problems are untenable not only in themselves but also because of the simultaneous exposure to heat, dust, noise, organo-chemicals, and biological and environmental pollution. This situation has existed for a long time due to various socioeconomic, geographical, cultural and local factors. The deteriorating situation of health and safety in the workplace may persist because of inadequate resource facilities, economic constraints and lack of opportunity to conduct research on the assessment of exposure-disease associations. Officials, who are employed by the state, are not able to implement work regulations and labour legislation easily. Generally, they are not professionally trained experts in occupational health, industrial hygiene or safety, and thus successful application and implementation of control measures are lacking. Steps to control work exposure limits have been ineffective, since national policies have been rare, owing to the multiple obstacles to preventing occupational problems. The major focus, however, is on practical solutions to workers' differing needs, consideration of which is very important and depends on what industrial entrepreneurs could reasonably be expected to afford. This paper explores why there is a lack of motivation and effort regarding the development of health and safety, aiming to focus public attention on the legacy of national and international efforts. Examples are likewise given to show the real situation of health and safety in the least developed countries.

  17. Why Do the Very Old Self-Harm? A Qualitative Study.

    PubMed

    Wand, Anne P F; Peisah, Carmelle; Draper, Brian; Brodaty, Henry

    2018-03-15

    To examine the perspectives of people aged 80 years or older who self-harmed regarding their reasons for self-harm and its consequences, and their perceptions of care. A qualitative study using in-depth interviews. Participants were recruited from two teaching hospitals and associated community services. People aged 80 years or older who had self-harmed within the previous month. Structured psychiatric assessment including cognitive testing, DSM-5 diagnosis, and an in-depth qualitative interview focusing upon the reasons for and consequences of self-harm. Narrative enquiry was used to guide the discussion. All interviews were undertaken by a geriatric psychiatrist, audio recorded, transcribed verbatim, and subjected to thematic analysis using NVivo. Themes that emerged for the reasons for self-harm included "enough is enough"; "loneliness"; "disintegration of self"; "being a burden"; "cumulative adversity"; "hopelessness and endless suffering"; "helplessness with rejection"; and "the untenable situation". Themes for the consequences of self-harm were "becoming engaged with or distanced from family"; "the problem was solved"; "gaining control"; "I'm worse off now"; "rejection by health professionals"; and "tension in the role of the inpatient clinical environment". Self-harm may communicate a need that cannot otherwise be expressed. An individualized person-centered approach is required to respond to self-harm, including a combination of practical, medical, and psychological approaches as indicated. Involvement of families in the process of understanding the meaning of and responding to self-harm through education and family therapy, as well as education of healthcare professionals beyond risk factor notation, may be indicated. Copyright © 2018 American Association for Geriatric Psychiatry. All rights reserved.

  18. Analysis of molecular mechanisms of ATP synthesis from the standpoint of the principle of electrical neutrality.

    PubMed

    Nath, Sunil

    2017-05-01

    Theories of biological energy coupling in oxidative phosphorylation (OX PHOS) and photophosphorylation (PHOTO PHOS) are reviewed and applied to ATP synthesis by an experimental system containing purified ATP synthase reconstituted into liposomes. The theories are critically evaluated from the standpoint of the principle of electrical neutrality. It is shown that the obligatory requirement to maintain overall electroneutrality of bulk aqueous phases imposes strong constraints on possible theories of energy coupling and molecular mechanisms of ATP synthesis. Mitchell's chemiosmotic theory is found to violate the electroneutrality of bulk aqueous phases and is shown to be untenable on these grounds. Purely electroneutral mechanisms or mechanisms where the anion/countercation gradient is dissipated or simply flows through the lipid bilayer are also shown to be inadequate. A dynamically electrogenic but overall electroneutral mode of ion transport postulated by Nath's torsional mechanism of energy transduction and ATP synthesis is shown to be consistent both with the experimental findings and the principle of electrical neutrality. It is concluded that the ATP synthase functions as a proton-dicarboxylic acid anion cotransporter in OX PHOS or PHOTO PHOS. A logical chemical explanation for the selection of dicarboxylic acids as intermediates in OX PHOS and PHOTO PHOS is suggested based on the pioneering classical thermodynamic work of Christensen, Izatt, and Hansen. The nonequilibrium thermodynamic consequences for theories in which the protons originate from water vis-a-vis weak organic acids are compared and contrasted, and several new mechanistic and thermodynamic insights into biological energy transduction by ATP synthase are offered. These considerations make the new theory of energy coupling more complete, and lead to a deeper understanding of the molecular mechanism of ATP synthesis. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. The dawn of science-based moral reasoning.

    PubMed

    Baschetti, Riccardo

    2007-01-01

    In 1998, Edward O. Wilson discussed the biological basis of morality, pointed out that the analysis of its material origins should enable us to fashion a wise ethical consensus, and predicted the dawn of science-based moral reasoning. This article testifies that his prediction was right. Morality, being based on altruism and collaboration, evolved as a socially advantageous biological phenomenon aimed at ensuring the survival of our species, which was structured in small groups at high risk of extinction for 99.5% of its existence. In the last 0.5%, the advent of agriculture resulted in a demographic explosion that impaired human beings' moral discernment, because the socially detrimental consequences of immoral actions, which had been recognised and condemned promptly in small groups consisting of a few tens of members, were diluted among millions of untouched individuals, thereby becoming less easily recognisable. Nowadays, to test the supposed morality of individual actions and government policies, we should use reason or, in doubtful cases, mathematical modelling to determine their predictable effects on the survival of small theoretical communities. Unless we untenably claim that the unlikelihood of extinction of today's immense societies entitles us to overturn the meaning of morality, all actions and policies that would cause the extinction of small communities should be regarded as indisputably immoral. This article also presents some examples of science-based moral arguments showing the immorality of restrictions and bans on research with human embryonic stem cells and demonstrates that the old concept of the "naturalistic fallacy", which philosophers frequently invoke to dismiss any scientific approach to morality, is no longer tenable, because it increasingly emerges to be a proof of what may well be defined the "philosophical fallacy".

  20. An X-linked three allele model of hand preference and hand posture for writing.

    PubMed

    McKeever, Walter F

    2004-04-01

    This paper describes a genetic model of hand preferences for writing and for handwriting posture (HWP). The challenge of devising an X-linked model for these aspects of human handedness was posed by the results of a large family handedness study (McKeever, 2000) that showed evidence of such linkage. Because X-linkage for handedness has been widely regarded as untenable, the prospects for developing such a model were not initially encouraging, but ultimately a viable model did suggest itself. Family studies of handedness and leading theories of handedness are briefly described, as is some of the research on HWP motivated by the theory of Levy and Nagylaki (1972). It is argued that there is evidence that HWP reflects a biological dictate and not just individual "choices" or "adaptations" to writing in a left-to-right direction with the left hand. The model proposes that inverted handwriting posture is not necessarily highly related to speech and language lateralities of sinistrals, but that it reveals an interhemispheric mediation of writing. It is hypothesised that it reflects a specialisation of the left angular gyrus (with some possible extension into the supramarginal gyrus) for the storage of movement and timing sequences of cursive writing, and right hemisphere motor programming of the motor output of writing. It is also argued that no family handedness study conducted to date is adequate for testing the predictions of extant handedness theories, and the often wide variations between the results of family handedness studies are noted. It is suggested that fMRI studies could definitively test the HWP hypotheses of the model and that the hypothesis of X-linkage could be tested definitively should studies of the human genome identify a gene for handedness.

  1. Human neutrophil kinetics: modeling of stable isotope labeling data supports short blood neutrophil half-lives.

    PubMed

    Lahoz-Beneytez, Julio; Elemans, Marjet; Zhang, Yan; Ahmed, Raya; Salam, Arafa; Block, Michael; Niederalt, Christoph; Asquith, Becca; Macallan, Derek

    2016-06-30

    Human neutrophils have traditionally been thought to have a short half-life in blood; estimates vary from 4 to 18 hours. This dogma was recently challenged by stable isotope labeling studies with heavy water, which yielded estimates in excess of 3 days. To investigate this disparity, we generated new stable isotope labeling data in healthy adult subjects using both heavy water (n = 4) and deuterium-labeled glucose (n = 9), a compound with more rapid labeling kinetics. To interpret results, we developed a novel mechanistic model and applied it to previously published (n = 5) and newly generated data. We initially constrained the ratio of the blood neutrophil pool to the marrow precursor pool (ratio = 0.26; from published values). Analysis of heavy water data sets yielded turnover rates consistent with a short blood half-life, but parameters, particularly marrow transit time, were poorly defined. Analysis of glucose-labeling data yielded more precise estimates of half-life (0.79 ± 0.25 days; 19 hours) and marrow transit time (5.80 ± 0.42 days). Substitution of this marrow transit time in the heavy water analysis gave a better-defined blood half-life of 0.77 ± 0.14 days (18.5 hours), close to glucose-derived values. Allowing the ratio of blood neutrophils to mitotic neutrophil precursors (R) to vary yielded a best-fit value of 0.19. Reanalysis of the previously published model and data also revealed the origin of their long estimates for neutrophil half-life: an implicit assumption that R is very large, which is physiologically untenable. We conclude that stable isotope labeling in healthy humans is consistent with a blood neutrophil half-life of less than 1 day. © 2016 by The American Society of Hematology.
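    The disparity hinges on how the marrow transit time and the blood loss rate are identified from labeling curves. The toy model below is a schematic stand-in for the paper's multi-compartment mechanistic model, but it shows the basic shape of the data: label accumulates during the pulse, surfaces in blood after a marrow transit delay, then washes out at a rate set by the blood half-life. The functional form and the fixed-delay simplification are assumptions of this sketch.

```python
import numpy as np

def blood_enrichment(t, delta, transit, pulse):
    """Toy label kinetics: precursors label during a pulse of length
    `pulse` (days), appear in blood after a fixed marrow `transit`
    (days), then are lost at rate `delta` (per day)."""
    t_eff = t - transit
    up = np.clip(t_eff, 0, pulse)           # time spent accumulating label
    down = np.clip(t_eff - pulse, 0, None)  # time spent washing out
    return (1 - np.exp(-delta * up)) * np.exp(-delta * down)

delta = np.log(2) / 0.79          # loss rate implied by a 0.79-day half-life
t = np.linspace(0, 15, 151)
y = blood_enrichment(t, delta, transit=5.8, pulse=1.0)
print("peak enrichment at day:", round(float(t[np.argmax(y)]), 2))  # ~ transit + pulse
print("blood half-life (days):", round(np.log(2) / delta, 2))       # 0.79 by construction
```

Note how the delay (marrow transit) and the decay rate (blood half-life) shape different parts of the curve; if the marrow transit time is poorly constrained, as with slow heavy-water labeling, the fitted half-life becomes poorly defined too, which is the identifiability problem the glucose data resolve.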

  2. Interactive Scripting for Analysis and Visualization of Arbitrarily Large, Disparately Located Climate Data Ensembles Using a Progressive Runtime Server

    NASA Astrophysics Data System (ADS)

    Christensen, C.; Summa, B.; Scorzelli, G.; Lee, J. W.; Venkat, A.; Bremer, P. T.; Pascucci, V.

    2017-12-01

    Massive datasets are becoming more common due to increasingly detailed simulations and higher resolution acquisition devices. Yet accessing and processing these huge data collections for scientific analysis is still a significant challenge. Solutions that rely on extensive data transfers are increasingly untenable and often impossible due to lack of sufficient storage at the client side as well as insufficient bandwidth to conduct such large transfers, which in some cases could entail petabytes of data. Large-scale remote computing resources can be useful, but utilizing such systems typically entails some form of offline batch processing with long delays, data replications, and substantial cost for any mistakes. Both types of workflows can severely limit the flexible exploration and rapid evaluation of new hypotheses that are crucial to the scientific process and thereby impede scientific discovery. In order to facilitate interactivity in both analysis and visualization of these massive data ensembles, we introduce a dynamic runtime system suitable for progressive computation and interactive visualization of arbitrarily large, disparately located spatiotemporal datasets. Our system includes an embedded domain-specific language (EDSL) that allows users to express a wide range of data analysis operations in a simple and abstract manner. The underlying runtime system transparently resolves issues such as remote data access and resampling while at the same time maintaining interactivity through progressive and interruptible processing. Computations involving large amounts of data can be performed remotely in an incremental fashion that dramatically reduces data movement, while the client receives updates progressively thereby remaining robust to fluctuating network latency or limited bandwidth. This system facilitates interactive, incremental analysis and visualization of massive remote datasets up to petabytes in size. Our system is now available for general use in the
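    The progressive, interruptible style described here can be sketched generically: a computation yields successively refined answers over growing subsamples, so a client can stop, display, or resume at any point. The sketch below is a toy stand-in for the actual runtime, which additionally handles remote access and resampling.

```python
import numpy as np

def progressive_mean(data, levels=6, seed=0):
    """Yield successively refined estimates of the mean of `data`,
    doubling the sample fraction at each level. A client consumes as
    many refinements as latency allows and keeps the latest one."""
    rng = np.random.default_rng(seed)
    shuffled = rng.permutation(data)
    n = len(shuffled)
    for level in range(1, levels + 1):
        k = int(max(1, n * 2 ** (level - levels)))  # fraction 2^(level-levels)
        yield level, k, float(shuffled[:k].mean())

# A fake 'climate field' standing in for a remote dataset.
field = np.random.default_rng(7).normal(15.0, 3.0, 1_000_000)
for level, k, estimate in progressive_mean(field):
    print(f"level {level}: n = {k:7d}, mean ~ {estimate:.3f}")
# An interactive client could break out of this loop at any time and
# still hold a usable estimate.
```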

  3. Fouling-Resistant Membranes for Treating Concentrated Brines for Water Reuse in Advanced Energy Systems- Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendren, Zachary; Choi, Young Chul

    The high total dissolved solids (TDS) levels in the wastewater generated from unconventional oil and gas development make the current state-of-the-art approach to water treatment and disposal untenable. Our proposed membrane technology approach addresses the two major challenges associated with this water: 1) the membrane distillation process removes the high TDS content, which is often 8 times higher than that of seawater, and 2) our novel membrane coating prevents the formation of scale that would otherwise pose a significant operational hurdle. This is accomplished through next-generation electrically conductive membranes that mitigate fouling beyond what is currently possible, and allow for the flexibility to treat the water to levels desirable for multiple reuse options, thus reducing fresh water withdrawal, all the way to direct disposal into the environment. The overall project objective was to demonstrate the efficacy of membrane distillation (MD) as a cost-savings technology to treat concentrated brines (such as, but not limited to, produced waters generated from fossil fuel extraction) that have high levels of TDS for beneficial water reuse in power production and other industrial operations as well as agricultural and municipal water uses. In addition, a novel fouling-resistant nanocomposite membrane was developed to reduce the need for chemicals to address membrane scaling due to the precipitation of divalent ions in high-TDS waters and improve overall MD performance via an electrically conductive membrane distillation process (ECMD). This anti-fouling membrane technology platform is based on incorporating carbon nanotubes (CNTs) into the surface layer of existing, commercially available MD membranes. The CNTs impart electrical conductivity to the membrane surface to prevent membrane scaling and fouling when an electrical potential is applied.

  4. Preclinical assessment of abuse liability of biologics: In defense of current regulatory control policies.

    PubMed

    Gauvin, David V; Zimmermann, Zachary J; Baird, Theodore J

    2015-10-01

    Current regulatory policies of both the US Food and Drug Administration and the Drug Enforcement Administration do not delineate automatic exceptions for biologics with respect to preclinical assessments for abuse liability of all new entities. As defined in current guidance documents and drug control policies, an exception may be given upon thorough review of available data and therapeutic target, and in consultation with the Controlled Substances Staff within the Center for Drug Evaluation and Research of the FDA, but a blanket exception for all biological entities is not currently available. We review the abuse liability testing of four known biologics with definitive positive abuse liability signals in the three core abuse liability assays (self-administration, drug discrimination, and dependence potential) described in the FDA draft guidance document. Interestingly, while all four exemplars have positive abuse liability signals in all three assays, two of these biologics are controlled under the Comprehensive Drug Abuse and Control Act (CSA, 1970) and the other two are not currently controlled. Admittedly, these four biologics are small-molecule entities. However, there is no reference to "molecular size" in the legally binding statutory definition of biologics under the FD&C Act or in the Controlled Substances Act. Neither of these drug control policy mandates has a bifurcated control status in which to make exceptions based solely on molecular size. With the current pharmaceutical focus on new technologies, such as "Trojan Horses", targeting the active transport of large molecule entities directly into the CNS, an argument to automatically exempt new molecular entities solely on the basis of molecular size is untenable. We argue that, for the safety and health of the general public, the current regulatory control status be maintained until definitive criteria for exceptions can be identified and amended to both the FD&CA and CSA, if warranted.

  5. Isles within islets: The lattice origin of small-world networks in pancreatic tissues

    NASA Astrophysics Data System (ADS)

    Barua, Amlan K.; Goel, Pranay

    2016-02-01

    The traditional computational model of the pancreatic islets of Langerhans is a lattice of β-cells connected with gap junctions. Numerous studies have investigated the behavior of networks of coupled β-cells and have shown that gap junctions synchronize bursting strongly. This simplistic architecture of islets, however, seems increasingly untenable in the face of recent experimental advances. In a microfluidics experiment on isolated islets, Rocheleau et al. (2004) showed a failure of penetration of excitation when one end received high glucose and the other end was not excited sufficiently; this suggested that gap junctions may not be efficient at inducing synchrony throughout the islet. Recently, Stozer et al. (2013) have argued that the functional networks of β-cells in an islet are small world. Their results implicate the existence of a few long-range connections among cells in the network. The physiological reason underlying this claim is not well understood. These studies cast doubt on the original lattice model, which largely predicts all-or-none synchrony among the cells. Here we have attempted to reconcile these observations in a unified framework. We assume that cells in the islet are coupled randomly to their nearest neighbors with some probability, p. We simulated detailed β-cell bursting in such islets. By varying p systematically, we were led to network parameters similar to those obtained by Stozer et al. (2013). We find that the networks within islets break up into components, giving rise to smaller isles within the superstructure (isles within islets, as it were). This structure can also account for the partial excitation seen by Rocheleau et al. (2004). Our updated view of islet architecture thus resolves the paradox of how islets can have strongly synchronizing gap junctions and yet be weakly coordinated at the same time.
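
    A toy illustration of the connectivity model described above, not the authors' code: on an n x n lattice, each nearest-neighbor gap junction is present with probability p, and union-find recovers the connected components, i.e., the "isles". The lattice size, p, and the seed are arbitrary choices for illustration.

    import random

    def islet_components(n=20, p=0.6, seed=1):
        random.seed(seed)
        parent = list(range(n * n))          # union-find over the n x n lattice

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]   # path halving
                i = parent[i]
            return i

        def union(i, j):
            parent[find(i)] = find(j)

        for r in range(n):
            for c in range(n):
                if c + 1 < n and random.random() < p:   # horizontal junction
                    union(r * n + c, r * n + c + 1)
                if r + 1 < n and random.random() < p:   # vertical junction
                    union(r * n + c, (r + 1) * n + c)

        sizes = {}
        for i in range(n * n):
            root = find(i)
            sizes[root] = sizes.get(root, 0) + 1
        return sorted(sizes.values(), reverse=True)

    print(islet_components())   # component sizes: a few large isles plus fragments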

  6. Nomenclature for very superficial squamous cell carcinoma of the skin and of the cervix: a critique in historical perspective.

    PubMed

    Kessler, Galen M; Ackerman, A Bernard

    2006-12-01

    Squamous-cell carcinoma is the most common of all cancers, and it develops in diverse organs of the body, among them the skin, lung, gastrointestinal tract, and genitourinary tract, the last including the cervix. Unfortunately, no unanimity exists for naming very superficial squamous-cell carcinoma; it has not been designated in consistent fashion in a single organ, let alone in all of them, thereby resulting in confusion, not only in regard to terminology per se, but concerning matters conceptual, not the least of those being what appellation to apply to that condition when it is encountered histopathologically. This vexing situation is illustrated graphically in the skin by diagnoses for very superficial squamous-cell carcinoma as disparate as solar keratosis (actinic keratosis, senile keratosis), arsenical keratosis, radiation keratosis, Bowen disease, bowenoid papulosis, squamous-cell carcinoma in situ, as well as variations on the theme of "keratinocytic intraepidermal neoplasia" and "dysplasia," and in the cervix by squamous-cell carcinoma in situ, leukoplakia, cervical intraepithelial neoplasia I-III, as well as variations on the theme of "squamous dysplasia." What follows is a recounting of the history of the subject under consideration here, a critique of the dizzying, opaque terms and phrases given to that subject, and a proposal for rectifying what currently is a thoroughly untenable situation because the language, and the ideas expressed by it, are impenetrable to physicians and, thereby, decidedly disadvantageous to patients. There is an urgent need for a single term for very superficial squamous-cell carcinoma in every organ of the body in which it develops, to wit, one that conveys the diagnosis in such logical, lucid, comprehensible fashion that it is understandable, readily and immediately, to clinicians. In that way, physicians charged with management of patients can plan therapy rationally.

  7. Are sustainable water resources possible in northwestern India?

    NASA Astrophysics Data System (ADS)

    Troy, T. J.; Devineni, N.; Perveen, S.; Robertson, A. W.; Lall, U.

    2012-12-01

    Sustainable water resources can have many definitions, the simplest being a supply-demand problem, with climate dictating the supply of water and human water use the demand. One sign of a system that is not sustainable is falling groundwater tables, as is the case in northwest India. This region serves as the country's breadbasket, and irrigated agriculture is ubiquitous. The state of Punjab alone produces 22% of the country's wheat and 13% of all the country's grains while accounting for only 1.5% of the country's area. Although the region receives an average precipitation of 600 mm per year, it is dominated by monsoonal rainfall, with streamflow augmented by upstream snowmelt and glacial melt in spring and summer that is released from a large dam into canals. Large agricultural water demands occur both during the rainy season and during the drier winter season. Water and food security are inextricably linked here, and when considering how to manage water sustainably, the consequences for agriculture must also be considered. In this study, we evaluate what a sustainable water resources system would look like in this region, accounting for current climate, crop water demands, and available reservoir storage. The effects of multiple water-saving scenarios are considered, such as crop choice, cropped area, and the use of forecasts in irrigation scheduling. We find that the current system is untenable, and hard decisions will have to be made by policymakers to halt the depletion of groundwater and manage the region's water resources in a sustainable, effective manner. This work serves as a prototype for evaluating water resources in other regions with high seasonal variability in rainfall and streamflow and large irrigation demands.

  8. Some relevant parameters for assessing fire hazards of combustible mine materials using laboratory scale experiments

    PubMed Central

    Litton, Charles D.; Perera, Inoka E.; Harteis, Samuel P.; Teacoach, Kara A.; DeRosa, Maria I.; Thomas, Richard A.; Smith, Alex C.

    2018-01-01

    When combustible materials ignite and burn, the potential for fire growth and flame spread represents an obvious hazard, but during these processes of ignition and flaming, other life hazards present themselves and should be included to ensure an effective overall analysis of the relevant fire hazards. In particular, the gases and smoke produced both during the smoldering stages of fires leading to ignition and during the advanced flaming stages of a developing fire serve to contaminate the surrounding atmosphere, potentially producing elevated levels of toxicity and high levels of smoke obscuration that render the environment untenable. In underground mines, these hazards may be exacerbated by the existing forced ventilation that can carry the gases and smoke to locations far-removed from the fire location. Clearly, materials that require high temperatures (above 1400 K) and that exhibit low mass loss during thermal decomposition, or that require high heat fluxes or heat transfer rates to ignite represent less of a hazard than materials that decompose at low temperatures or ignite at low levels of heat flux. In order to define and quantify some possible parameters that can be used to assess these hazards, small-scale laboratory experiments were conducted in a number of configurations to measure: 1) the toxic gases and smoke produced both during non-flaming and flaming combustion; 2) mass loss rates as a function of temperature to determine ease of thermal decomposition; and 3) mass loss rates and times to ignition as a function of incident heat flux. This paper describes the experiments that were conducted, their results, and the development of a set of parameters that could possibly be used to assess the overall fire hazard of combustible materials using small scale laboratory experiments. PMID:29599565

  9. Some relevant parameters for assessing fire hazards of combustible mine materials using laboratory scale experiments.

    PubMed

    Litton, Charles D; Perera, Inoka E; Harteis, Samuel P; Teacoach, Kara A; DeRosa, Maria I; Thomas, Richard A; Smith, Alex C

    2018-04-15

    When combustible materials ignite and burn, the potential for fire growth and flame spread represents an obvious hazard, but during these processes of ignition and flaming, other life hazards present themselves and should be included to ensure an effective overall analysis of the relevant fire hazards. In particular, the gases and smoke produced both during the smoldering stages of fires leading to ignition and during the advanced flaming stages of a developing fire serve to contaminate the surrounding atmosphere, potentially producing elevated levels of toxicity and high levels of smoke obscuration that render the environment untenable. In underground mines, these hazards may be exacerbated by the existing forced ventilation that can carry the gases and smoke to locations far-removed from the fire location. Clearly, materials that require high temperatures (above 1400 K) and that exhibit low mass loss during thermal decomposition, or that require high heat fluxes or heat transfer rates to ignite represent less of a hazard than materials that decompose at low temperatures or ignite at low levels of heat flux. In order to define and quantify some possible parameters that can be used to assess these hazards, small-scale laboratory experiments were conducted in a number of configurations to measure: 1) the toxic gases and smoke produced both during non-flaming and flaming combustion; 2) mass loss rates as a function of temperature to determine ease of thermal decomposition; and 3) mass loss rates and times to ignition as a function of incident heat flux. This paper describes the experiments that were conducted, their results, and the development of a set of parameters that could possibly be used to assess the overall fire hazard of combustible materials using small scale laboratory experiments.

  10. Climate dominated topography in a tectonically active mountain range

    NASA Astrophysics Data System (ADS)

    Adams, B. A.; Ehlers, T. A.

    2015-12-01

    Tests of the interactions between tectonic and climate forcing on Earth's topography often focus on the concept of steady state, whereby processes of rock deformation and erosion are opposing and equal. However, when conditions such as climate or tectonic rock uplift change, surface processes act to restore the balance between rock deformation and erosion by adjusting topography. Most examples of canonical steady-state mountain ranges lie within the northern hemisphere, which underwent a radical change in the Quaternary due to the onset of widespread glaciation. The activity of glaciers changed erosion rates and topography in many of these mountain ranges, which likely violates steady-state assumptions. With new topographic analysis, and existing patterns of climate and rock uplift, we explore a mountain range previously considered to be in steady state, the Olympic Mountains, USA. The broad spatial trend in channel steepness values suggests that the locus of high rock uplift rates is coincident with the rugged range core, in a similar position as high temperature and pressure lithologies, but not in the low-lying foothills as has been previously suggested by low-temperature thermochronometry. The details of our analysis suggest the dominant topographic signal in the Olympic Mountains is a spatial, and likely temporal, variation in erosional efficiency dictated by orographic precipitation and Pleistocene glacier ELA patterns. We demonstrate that the same topographic effects are recorded in the basin hypsometries of other Cenozoic mountain ranges around the world. The significant glacial overprint on topography makes the argument of mountain range steadiness untenable in significantly glaciated settings. Furthermore, our results suggest that most glaciated Cenozoic ranges are likely still in a mode of readjustment as fluvial systems change topography and erosion rates to equilibrate with rock uplift rates.

  11. Gating mass cytometry data by deep learning.

    PubMed

    Li, Huamin; Shaham, Uri; Stanton, Kelly P; Yao, Yi; Montgomery, Ruth R; Kluger, Yuval

    2017-11-01

    Mass cytometry or CyTOF is an emerging technology for high-dimensional multiparameter single-cell analysis that overcomes many limitations of fluorescence-based flow cytometry. New methods for analyzing CyTOF data attempt to improve automation, scalability, performance and interpretation of data generated in large studies. Assigning individual cells to discrete groups of cell types (gating) involves time-consuming sequential manual steps that are untenable for larger studies. We introduce DeepCyTOF, a standardization approach for gating based on deep learning techniques. DeepCyTOF requires labeled cells from only a single sample. It is based on domain adaptation principles and is a generalization of previous work that allows us to calibrate between a target distribution and a source distribution in an unsupervised manner. We show that DeepCyTOF is highly concordant (98%) with cell classification obtained by individual manual gating of each sample when applied to a collection of 16 biological replicates of primary immune blood cells, even when measured across several instruments. Further, DeepCyTOF achieves very high accuracy on the semi-automated gating challenge of the FlowCAP-I competition as well as on two CyTOF datasets generated from primary immune blood cells: (i) 14 subjects with a history of infection with West Nile virus (WNV) and (ii) 34 healthy subjects of different ages. We conclude that deep learning in general, and DeepCyTOF specifically, offers a powerful computational approach for semi-automated gating of CyTOF and flow cytometry data. Our codes and data are publicly available at https://github.com/KlugerLab/deepcytof.git. Contact: yuval.kluger@yale.edu. Supplementary data are available at Bioinformatics online.
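
    The following is a hedged sketch of the general idea, a neural network mapping per-cell marker intensities to gate labels trained on one manually gated sample; it is not the DeepCyTOF implementation, which additionally uses domain adaptation to calibrate across samples and instruments. All data here are synthetic stand-ins.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    n_cells, n_markers, n_gates = 5000, 25, 4

    # Synthetic stand-in for transformed CyTOF marker intensities per cell.
    labels = rng.integers(0, n_gates, size=n_cells)
    centers = rng.normal(scale=2.0, size=(n_gates, n_markers))
    X = centers[labels] + rng.normal(size=(n_cells, n_markers))

    clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
    clf.fit(X[:4000], labels[:4000])          # the "manually gated" reference sample
    print("held-out agreement:", clf.score(X[4000:], labels[4000:]))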

  12. [Is abortion murder?].

    PubMed

    Werning, C

    1995-09-01

    Discussions about Paragraph 218 of the German federal abortion law have spawned antithetical opinions: on the one hand, the full right of the mother or parents to decide about the incipient human life; on the other hand, under the dogma that abortion is murder, abortion is rejected even when the pregnancy is unwanted or the result of rape. Two questions are closely related to this issue: 1) what makes human beings human, and 2) when does human life begin. From a medical point of view, the function of the brain is fundamentally linked to being human. The brain controls almost all functions of the body and determines its psychological makeup, such as intellect and, in a theological sense, the soul. Without the brain such functioning is not possible, since brain death means the death of human life. Children born with anencephaly and microencephaly can never live a human life. At the end of life, various diseases (stroke, Alzheimer disease) can severely damage the brain. In these cases normal living is also no longer possible. Yet ethically it is untenable to actively kill these human beings. But when one considers that life-threatening diseases can require life-support intervention, then often the pragmatic intervention is not far removed from active euthanasia. The other question, concerning the beginning of human life, is even more difficult to answer. One answer is the fertilization of the egg cell; but a conglomeration of cells in the early phase of pregnancy can hardly be characterized as a human person. Human identity, personality, and worth are associated with the functioning of the brain, so only when the brain is fully developed can there be any talk of an unborn human being.

  13. Summoning compassion to address the challenges of conservation.

    PubMed

    Wallach, Arian D; Bekoff, Marc; Batavia, Chelsea; Nelson, Michael P; Ramp, Daniel

    2018-04-27

    Conservation practice is informed by science, but also reflects ethical beliefs about how we ought to value and interact with the Earth's biota. As human activities continue to drive extinctions and diminish critical life-sustaining ecosystem processes, achieving conservation goals becomes increasingly urgent. In our determination to react decisively, conservation challenges can be handled without due deliberation, particularly when wildlife individuals are sacrificed "for the greater good" of wildlife collectives (populations, species, ecosystems). With growing recognition of the widespread sentience and sapience of many nonhuman animals, standard conservation practices that categorically prioritize collectives without due consideration for the wellbeing of individuals are ethically untenable. Here we highlight three overarching ethical orientations characterizing current and historical practices in conservation that suppress compassion: instrumentalism, collectivism, and nativism. We illustrate how establishing a commitment to compassion could re-orient conservation in more ethically expansive directions, which incorporate recognition of the intrinsic value of wildlife, the sentience of nonhuman animals, and the values of novel ecosystems, introduced species and their members. A compassionate conservation approach allays practices that intentionally and unnecessarily harm wildlife individuals, while aligning with critical conservation goals. Although the urgency of achieving effective outcomes for solving major conservation problems may enhance the appeal of quick and harsh measures, the costs are too high. Continuing to justify moral indifference when causing the suffering of wildlife individuals, particularly those who possess sophisticated capacities for emotion, consciousness, and sociality, risks estranging conservation practice from prevailing, and appropriate, social values. As conservationists and compassionate beings, we must demonstrate concern for both

  14. On the convexity of ROC curves estimated from radiological test results

    PubMed Central

    Pesce, Lorenzo L.; Metz, Charles E.; Berbaum, Kevin S.

    2010-01-01

    Rationale and Objectives Although an ideal observer’s receiver operating characteristic (ROC) curve must be convex — i.e., its slope must decrease monotonically — published fits to empirical data often display “hooks.” Such fits sometimes are accepted on the basis of an argument that experiments are done with real, rather than ideal, observers. However, the fact that ideal observers must produce convex curves does not imply that convex curves describe only ideal observers. This paper aims to identify the practical implications of non-convex ROC curves and the conditions that can lead to empirical and/or fitted ROC curves that are not convex. Materials and Methods This paper views non-convex ROC curves from historical, theoretical and statistical perspectives, which we describe briefly. We then consider population ROC curves with various shapes and analyze the types of medical decisions that they imply. Finally, we describe how sampling variability and curve-fitting algorithms can produce ROC curve estimates that include hooks. Results We show that hooks in population ROC curves imply the use of an irrational decision strategy, even when the curve does not cross the chance line, and therefore usually are untenable in medical settings. Moreover, we sketch a simple approach to improve any non-convex ROC curve by adding statistical variation to the decision process. Finally, we sketch how to test whether hooks present in ROC data are likely to have been caused by chance alone and how some hooked ROCs found in the literature can be easily explained as fitting artifacts or modeling issues. Conclusion In general, ROC curve fits that show hooks should be looked upon with suspicion unless other arguments justify their presence. PMID:20599155
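
    A small sketch of the convexity criterion discussed above (illustrative points, not data from the paper): an empirical ROC curve is convex in this sense when the slopes of successive segments never increase as the operating point moves toward higher false-positive fractions.

    def is_convex_roc(points, tol=1e-9):
        """points: (FPR, TPR) pairs including (0, 0) and (1, 1), sorted by FPR."""
        slopes = []
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            if x1 == x0:                        # vertical segment: infinite slope
                slopes.append(float("inf"))
            else:
                slopes.append((y1 - y0) / (x1 - x0))
        # Convex (proper) iff slopes are monotonically non-increasing.
        return all(s0 >= s1 - tol for s0, s1 in zip(slopes, slopes[1:]))

    proper = [(0, 0), (0.1, 0.5), (0.4, 0.8), (1, 1)]
    hooked = [(0, 0), (0.2, 0.3), (0.5, 0.9), (1, 1)]   # slope rises: a "hook"
    print(is_convex_roc(proper), is_convex_roc(hooked))  # True False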

  15. SHARP pre-release v1.0 - Current Status and Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Vijay S.; Rahaman, Ronald O.

    The NEAMS Reactor Product Line effort aims to develop an integrated multiphysics simulation capability for the design and analysis of future generations of nuclear power plants. The Reactor Product Line code suite's multi-resolution hierarchy is being designed to ultimately span the full range of length and time scales present in relevant reactor design and safety analyses, as well as to scale from desktop to petaflop computing platforms. In this report, building on a previous report issued in September 2014, we describe our continued efforts to integrate thermal/hydraulics, neutronics, and structural mechanics modeling codes to perform coupled analysis of a representative fast sodium-cooled reactor core in preparation for a unified release of the toolkit. The work reported in the current document covers the software engineering aspects of managing the entire stack of components in the SHARP toolkit and the continuous integration efforts ongoing to prepare a release candidate for interested reactor analysis users. Here we report on the continued integration effort of PROTEUS/Nek5000 and Diablo into the NEAMS framework and the software processes that enable users to utilize the capabilities without losing scientific productivity. Due to the complexity of the individual modules and their necessary/optional dependency library chain, we focus on the configuration and build aspects of the SHARP toolkit, which includes the capability to auto-download dependencies and configure/install with optimal flags in an architecture-aware fashion. Such complexity is untenable without strong software engineering processes such as source management, source control, change reviews, unit tests, integration tests and continuous test suites. Details on these processes are provided in the report as a building step for a SHARP user guide that will accompany the first release, expected by March 2016.

  16. The assumption of equilibrium in models of migration.

    PubMed

    Schachter, J; Althaus, P G

    1993-02-01

    In recent articles, Evans (1990) and Harrigan and McGregor (1993) (hereafter HM) scrutinized the equilibrium model of migration presented in a 1989 paper by Schachter and Althaus. This model used standard microeconomics to analyze gross interregional migration flows based on the assumption that gross flows are in approximate equilibrium. HM criticized the model as theoretically untenable, while Evans summoned empirical as well as theoretical objections. HM claimed that equilibrium of gross migration flows could be ruled out on theoretical grounds. They argued that the absence of net migration requires either that all regions have equal populations or that unsustainable regional migration propensities obtain. In fact, some moves are interregional and others intraregional. It does not follow, however, that the number of interregional migrants will be larger for the more populous region. Alternatively, a country could be divided into a large number of small regions that have equal populations. With uniform propensities to move, each of these analytical regions would experience zero net migration in equilibrium. Hence, the condition that net migration equal zero is entirely consistent with unequal distributions of population across regions. The criticisms of Evans were based both on flawed reasoning and on misinterpretation of the results of a number of econometric studies. His reasoning assumed that the existence of demand shifts as found by Goldfarb and Yezer (1987) and Topel (1986) invalidated the equilibrium model. The equilibrium never really obtains exactly, but economic modeling of migration properly begins with a simple equilibrium model of the system. A careful reading of the papers Evans cited in support of his position showed that in fact they affirmed rather than denied the appropriateness of equilibrium modeling. Zero net migration together with nonzero gross migration is not theoretically incompatible with regional heterogeneity of population, wages, or

  17. Synchrotron microCT imaging of soft tissue in juvenile zebrafish reveals retinotectal projections

    NASA Astrophysics Data System (ADS)

    Xin, Xuying; Clark, Darin; Ang, Khai Chung; van Rossum, Damian B.; Copper, Jean; Xiao, Xianghui; La Riviere, Patrick J.; Cheng, Keith C.

    2017-02-01

    Biomedical research and clinical diagnosis would benefit greatly from full-volume determinations of anatomical phenotype. Comprehensive tools for morphological phenotyping are central for the emerging field of phenomics, which requires high-throughput, systematic, accurate, and reproducible data collection from organisms affected by genetic, disease, or environmental variables. Theoretically, complete anatomical phenotyping requires the assessment of every cell type in the whole organism, but this ideal is presently untenable due to the lack of an unbiased 3D imaging method that allows histopathological assessment of any cell type despite optical opacity. Histopathology, the current clinical standard for diagnostic phenotyping, involves the microscopic study of tissue sections to assess qualitative aspects of tissue architecture, disease mechanisms, and physiological state. However, quantitative features of tissue architecture such as cellular composition and cell counting in tissue volumes can only be approximated due to characteristics of tissue sectioning, including incomplete sampling and the constraints of 2D imaging of 5-micron-thick tissue slabs. We have used a small vertebrate organism, the zebrafish, to test the potential of microCT for systematic macroscopic and microscopic morphological phenotyping. While cell resolution is routinely achieved using methods such as light sheet fluorescence microscopy and optical tomography, these methods do not provide the pancellular perspective characteristic of histology, and are constrained by the limited penetration of visible light through pigmented and opaque specimens, as is characteristic of zebrafish juveniles. Here, we provide an example of neuroanatomy that can be studied by microCT of stained soft tissue at 1.43 micron isotropic voxel resolution. We conclude that synchrotron microCT is a form of 3D imaging that may potentially be adopted towards more reproducible, large-scale, morphological phenotyping of optically

  18. Continuity of Microblade Technology in the Indian Subcontinent Since 45 ka: Implications for the Dispersal of Modern Humans

    PubMed Central

    Mishra, Sheila; Chauhan, Naveen; Singhvi, Ashok K.

    2013-01-01

    We extend the continuity of microblade technology in the Indian Subcontinent to 45 ka, on the basis of optical dating of microblade assemblages from the site of Mehtakheri (22°13′44″ N, 76°01′36″ E) in Madhya Pradesh, India. Microblade technology in the Indian Subcontinent is continuously present from its first appearance until the Iron Age (~3 ka), making its association with modern humans undisputed. It has been suggested that microblade technology in the Indian Subcontinent was developed locally by modern humans after 35 ka. The dates reported here from Mehtakheri show this inference to be untenable and suggest instead that this technology arrived in the Indian Subcontinent with the earliest modern humans. It also shows that modern humans in the Indian Subcontinent and SE Asia were associated with differing technologies, and this calls into question the "southern dispersal" route of modern humans from Africa through India to SE Asia and then to Australia. We suggest that modern humans dispersed from Africa in two stages coinciding with the warmer interglacial conditions of MIS 5 and MIS 3. Competitive interactions between African modern humans and Indian archaics, who shared an adaptation to tropical environments, differed from those between modern humans and archaics like Neanderthals and Denisovans, who were adapted to temperate environments. Thus, while modern humans expanded into temperate regions during warmer climates, their expansion into tropical regions, like the Indian Subcontinent, in competition with similarly adapted populations, occurred during arid climates. Modern humans therefore probably entered the Indian Subcontinent during the arid climate of MIS 4, coinciding with their disappearance from the Middle East and Northern Africa. The out-of-phase expansion of modern humans into tropical versus temperate regions has been one of the factors affecting the dispersal of modern humans from Africa during the period 200–40 ka.

  19. Country of origin and racio-ethnicity: are there differences in perceived organizational cultural competency and job satisfaction among nursing assistants in long-term care?

    PubMed

    Allensworth-Davies, Donald; Leigh, Jennifer; Pukstas, Kim; Geron, Scott Miyake; Hardt, Eric; Brandeis, Gary; Engle, Ryann L; Parker, Victoria A

    2007-01-01

    Long-term care facilities nationwide are finding it difficult to train and retain sufficient numbers of nursing assistants, resulting in a dire staffing situation. Researchers, managers, and practitioners alike have been trying to determine the correlates of job satisfaction to address this increasingly untenable situation. One factor that has received little empirical attention in the long-term care literature is cultural competence. Cultural competence is defined as a set of skills, attitudes, behaviors, and policies that enable organizations and staff to work effectively in cross-cultural situations. To examine organizational cultural competence as perceived by nursing assistants and determine if this was related to differences in job satisfaction across countries of origin and racio-ethnic groups. Primary data collected from a cross-section of 135 nursing assistants at four New England nursing homes. Demographics, perceptions of organizational cultural competence, and ratings of job satisfaction were collected. A multivariate, generalized linear model was used to assess predictors of job satisfaction. A secondary analysis was then conducted to identify the most important components of organizational cultural competency. Perception of organizational cultural competence (p = .0005) and autonomy (p = .001) were the strongest predictors of job satisfaction among nursing assistants; as these increase, job satisfaction also increases. Neither country of origin nor racio-ethnicity was associated with job satisfaction, but racio-ethnicity was associated with perceived organizational cultural competence (p = .05). A comfortable work environment for employees of different races/cultures emerged as the strongest organizational cultural competency factor (p = .04). Developing and maintaining organizational cultural competency and employee autonomy are important managerial strategies for increasing job satisfaction and improving staff retention. Toward this end, creating a

  20. Glacial reorganization of topography in a tectonically active mountain range

    NASA Astrophysics Data System (ADS)

    Adams, Byron; Ehlers, Todd

    2016-04-01

    The significant glacial overprint on topography makes the argument of mountain range steadiness untenable in significantly glaciated settings. Furthermore, our results suggest that most glaciated Cenozoic ranges are likely still in a mode of readjustment as fluvial systems change topography and erosion rates to equilibrate with rock uplift rates.

  1. Can Horizontal Hydraulic Fracturing Lead to Less Expensive Achievement of More Natural River Flows?

    NASA Astrophysics Data System (ADS)

    Kern, J.; Characklis, G. W.

    2014-12-01

    High ramp rates and low costs make hydropower an extremely valuable resource for meeting "peak" hourly electricity demands, but dams that employ variable, stop-start reservoir releases can have adverse impacts on downstream riverine ecosystems. In recent years, efforts to mitigate the environmental impacts of hydropower peaking have relied predominantly on the use of ramp rate restrictions, or limits on the magnitude of hour-to-hour changes in reservoir discharge. These restrictions shift some hydropower production away from peak hours towards less valuable off-peak hours and impose a financial penalty on dam owners that is a function of: 1) the "spread" (difference) between peak and off-peak electricity prices; and 2) the total amount of generation shifted from peak to off-peak hours. In this study, we show how variability in both the price spread and reservoir inflows can cause large swings in the financial cost of ramp rate restrictions on a seasonal and annual basis. Of particular interest is determining whether current low natural gas prices (largely attributable to improvements in hydraulic fracturing) have reduced the cost of implementing ramp rate restrictions at dams by narrowing the spread between peak and off-peak electricity prices. We also examine the role that large year-to-year fluctuations in the cost of ramp rate restrictions may play in precluding downstream stakeholders (e.g., conservation trusts) from "purchasing" more natural streamflow patterns from dam owners. In recent years, similar arrangements between conservation trusts and consumptive water users have been put into practice in the U.S. for the purposes of supplementing baseflows in rivers. However, significant year-to-year uncertainty in the size of payments necessary to compensate hydropower producers for lost peaking production (i.e., uncertainty in the cost of ramp rate restrictions) makes transactions that aim to mitigate the environmental impacts of hydropower peaking untenable.
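
    A back-of-envelope sketch of the cost mechanism described above, with invented numbers: the penalty from a ramp rate restriction is roughly the peak/off-peak price spread forgone on every megawatt-hour of generation shifted, so a narrower spread under cheap natural gas directly shrinks the penalty.

    def restriction_cost(mwh_shifted, peak_price, offpeak_price):
        """Opportunity cost of a ramp rate restriction, in dollars."""
        spread = peak_price - offpeak_price    # $/MWh
        return mwh_shifted * spread

    # Illustrative: the same shifted volume under a wide vs. a narrowed spread.
    print(restriction_cost(50_000, peak_price=60.0, offpeak_price=35.0))  # 1,250,000
    print(restriction_cost(50_000, peak_price=45.0, offpeak_price=33.0))  #   600,000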

  2. Damage detection in hazardous waste storage tank bottoms using ultrasonic guided waves

    NASA Astrophysics Data System (ADS)

    Cobb, Adam C.; Fisher, Jay L.; Bartlett, Jonathan D.; Earnest, Douglas R.

    2018-04-01

    Detecting damage in storage tanks is performed commercially using a variety of techniques. The most commonly used inspection technologies are magnetic flux leakage (MFL), conventional ultrasonic testing (UT), and leak testing. MFL and UT typically involve manual or robotic scanning of a sensor along the metal surfaces to detect cracks or corrosion wall loss. For inspection of the tank bottom, however, the storage tank is commonly emptied to allow interior access for the inspection system. While there are costs associated with emptying a storage tank for inspection that can be justified in some scenarios, there are situations where emptying the tank is impractical. Robotic, submersible systems have been developed for inspecting these tanks, but there are some storage tanks whose contents are so hazardous that even the use of these systems is untenable. Thus, there is a need to develop an inspection strategy that does not require emptying the tank or insertion of the sensor system into the tank. This paper presents a guided wave system for inspecting the bottom of double-shelled storage tanks (DSTs), with the sensor located on the exterior side-wall of the vessel. The sensor used is an electromagnetic acoustic transducer (EMAT) that generates and receives shear-horizontal guided plate waves using magnetostriction principles. The system operates by scanning the sensor around the circumference of the storage tank and sending guided waves into the tank bottom at regular intervals. The data from multiple locations are combined using the synthetic aperture focusing technique (SAFT) to create a color-mapped image of the vessel thickness changes. The target application of the system described is inspection of DSTs located at the Hanford site, which are million-gallon vessels used to store nuclear waste. Other vessels whose exterior walls are accessible would also be candidates for inspection using the described approach. Experimental results are shown from tests on multiple
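
    A simplified delay-and-sum sketch of the SAFT step described above (illustrative geometry and units, not the deployed EMAT system): each A-scan is back-projected onto image pixels using round-trip travel time, and coherent summation across sensor positions concentrates energy at reflector locations.

    import numpy as np

    def saft_image(scans, positions, grid, c, fs):
        """scans: (n_pos, n_samples); positions/grid: (n, 2) coordinates in m;
        c: wave speed (m/s); fs: sampling rate (Hz). Returns per-pixel amplitude."""
        image = np.zeros(len(grid))
        for scan, pos in zip(scans, positions):
            d = np.linalg.norm(grid - pos, axis=1)        # sensor-to-pixel distance
            idx = np.round(2 * d / c * fs).astype(int)    # round-trip delay, samples
            valid = idx < scan.shape[0]
            image[valid] += scan[idx[valid]]              # coherent summation
        return np.abs(image)

    # Synthetic check: echoes consistent with one reflector sum coherently there.
    fs, c = 1e6, 3000.0                                   # 1 MHz sampling, shear speed
    positions = np.array([[x, 0.0] for x in np.linspace(0, 1, 8)])
    grid = np.array([[0.5, 0.3]])                         # single test pixel
    scans = np.zeros((8, 2048))
    for i, p in enumerate(positions):
        t = 2 * np.linalg.norm(grid[0] - p) / c
        scans[i, int(round(t * fs))] = 1.0
    print(saft_image(scans, positions, grid, c, fs))      # strong response: [8.]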

  3. A goodness-of-fit test for occupancy models with correlated within-season revisits

    USGS Publications Warehouse

    Wright, Wilson; Irvine, Kathryn M.; Rodhouse, Thomas J.

    2016-01-01

    Occupancy modeling is important for exploring species distribution patterns and for conservation monitoring. Within this framework, explicit attention is given to species detection probabilities estimated from replicate surveys of sample units. A central assumption is that replicate surveys are independent Bernoulli trials, but this assumption becomes untenable when ecologists serially deploy remote cameras and acoustic recording devices over days and weeks to survey rare and elusive animals. Proposed solutions involve modifying the detection-level component of the model (e.g., a first-order Markov covariate). Evaluating whether a model sufficiently accounts for correlation is imperative, but clear guidance for practitioners is lacking. Currently, an omnibus goodness-of-fit test using a chi-square discrepancy measure on unique detection histories is available for occupancy models (MacKenzie and Bailey, Journal of Agricultural, Biological, and Environmental Statistics, 9, 2004, 300; hereafter, MacKenzie–Bailey test). We propose a join count summary measure adapted from spatial statistics to directly assess correlation after fitting a model. We motivate our work with a dataset of multinight bat call recordings from a pilot study for the North American Bat Monitoring Program. We found in simulations that our join count test was more reliable than the MacKenzie–Bailey test for detecting inadequacy of a model that assumed independence, particularly when serial correlation was low to moderate. A model that included a Markov-structured detection-level covariate produced unbiased occupancy estimates except in the presence of strong serial correlation and a revisit design consisting only of temporal replicates. When applied to two common bat species, our approach illustrates that sophisticated models do not guarantee adequate fit to real data, underscoring the importance of model assessment. Our join count test provides a widely applicable goodness-of-fit test and
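
    One plausible reading of the join count idea, sketched with toy detection histories (the paper's exact statistic and reference distribution may differ): count adjacent detection-detection (1-1) survey pairs and compare the observed total against within-site permutations, which preserve each site's number of detections but destroy their serial order.

    import random

    def join_count(history):
        """Number of adjacent survey pairs that are both detections (1-1 joins)."""
        return sum(a == 1 and b == 1 for a, b in zip(history, history[1:]))

    def permutation_pvalue(histories, n_perm=2000, seed=0):
        random.seed(seed)
        observed = sum(join_count(h) for h in histories)
        exceed = 0
        for _ in range(n_perm):
            shuffled = 0
            for h in histories:
                hh = h[:]
                random.shuffle(hh)           # breaks serial order, keeps totals
                shuffled += join_count(hh)
            exceed += shuffled >= observed
        return (exceed + 1) / (n_perm + 1)

    sites = [[1, 1, 1, 0, 0], [0, 0, 1, 1, 1], [1, 1, 0, 0, 0]]  # clumped detections
    print(permutation_pvalue(sites))         # small p suggests serial correlation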

  4. The Importance of Isomorphism for Conclusions about Homology: A Bayesian Multilevel Structural Equation Modeling Approach with Ordinal Indicators.

    PubMed

    Guenole, Nigel

    2016-01-01

    We describe a Monte Carlo study examining the impact of assuming item isomorphism (i.e., equivalent construct meaning across levels of analysis) on conclusions about homology (i.e., equivalent structural relations across levels of analysis) under varying degrees of non-isomorphism in the context of ordinal indicator multilevel structural equation models (MSEMs). We focus on the condition where one or more loadings are higher on the between level than on the within level to show that while much past research on homology has ignored the issue of psychometric isomorphism, psychometric isomorphism is in fact critical to valid conclusions about homology. More specifically, when a measurement model with non-isomorphic items occupies an exogenous position in a multilevel structural model and the non-isomorphism of these items is not modeled, the within level exogenous latent variance is under-estimated leading to over-estimation of the within level structural coefficient, while the between level exogenous latent variance is overestimated leading to underestimation of the between structural coefficient. When a measurement model with non-isomorphic items occupies an endogenous position in a multilevel structural model and the non-isomorphism of these items is not modeled, the endogenous within level latent variance is under-estimated leading to under-estimation of the within level structural coefficient while the endogenous between level latent variance is over-estimated leading to over-estimation of the between level structural coefficient. The innovative aspect of this article is demonstrating that even minor violations of psychometric isomorphism render claims of homology untenable. We also show that posterior predictive p-values for ordinal indicator Bayesian MSEMs are insensitive to violations of isomorphism even when they lead to severely biased within and between level structural parameters. We highlight conditions where poor estimation of even correctly specified

  5. Towards a new moral paradigm in health care delivery: accounting for individuals.

    PubMed

    Katz, Meir

    2010-01-01

    For years, commentators have debated how to most appropriately allocate scarce medical resources over large populations. In this paper, I abstract the major rationing schema into three general approaches: rationing by price, quantity, and prioritization. Each has both normative appeal and considerable weakness. After exploring them, I present what some commentators have termed the "moral paradigm" as an alternative to broader philosophies designed to encapsulate the universe of options available to allocators (often termed the market, professional, and political paradigms). While not itself an abstraction of any specific viable rationing scheme, it provides a strong basis for the development of a new scheme that offers considerable moral and political appeal often absent from traditionally employed rationing schema. As I explain, the moral paradigm, in its strong, absolute, and uncompromising version, is economically untenable. This paper articulates a modified version of the moral paradigm that is pluralist in nature rather than absolute. It appeals to the moral, emotional, and irrational sensibilities of each individual person. The moral paradigm, so articulated, can complement any health care delivery system that policy-makers adopt. It functions by granting individuals the ability to appeal to an administrative adjudicatory board designated for this purpose. The adjudicatory board would have the expertise and power to act in response to the complaints of individual aggrieved patients, including those complaints that stem from the moral, religious, ethical, emotional, irrational, or other subjective positions of the patient, and would have plenary power to affirm the denial of access to medical care or to mandate the provision of such care. The board must be designed to facilitate its intended function while creating structural limitations on abuse of power and other excess. I make some specific suggestions on matters of structure and function in the hope of

  6. Evidence-based ethics? On evidence-based practice and the "empirical turn" from normative bioethics

    PubMed Central

    Goldenberg, Maya J

    2005-01-01

    Background The increase in empirical methods of research in bioethics over the last two decades is typically perceived as a welcome broadening of the discipline, with increased integration of social and life scientists into the field and of ethics consultants into the clinical setting; however, it also represents a loss of confidence in the typical normative and analytic methods of bioethics. Discussion The recent incipiency of "Evidence-Based Ethics" attests to this phenomenon and should be rejected as a solution to the current ambivalence toward the normative resolution of moral problems in a pluralistic society. While "evidence-based" is typically read in medicine and other life and social sciences as the empirically adequate standard of reasonable practice and a means for increasing certainty, I propose that the evidence-based movement in fact gains consensus by displacing normative discourse with aggregate or statistically derived empirical evidence as the "bottom line". Therefore, along with wavering on the fact/value distinction, evidence-based ethics threatens bioethics' normative mandate. The appeal of the evidence-based approach is that it offers a means of negotiating the demands of moral pluralism. Rather than appealing to explicit values that are likely not shared by all, "the evidence" is proposed to adjudicate between competing claims. Quantified measures are notably more "neutral" and democratic than liberal markers like "species normal functioning". Yet the positivist notion that claims stand or fall in light of the evidence is untenable; furthermore, the legacy of positivism entails the quieting of empirically non-verifiable (or at least non-falsifiable) considerations like moral claims and judgments. As a result, evidence-based ethics proposes to operate unchecked with the implicit normativity that accompanies the production and presentation of all biomedical and scientific facts. Summary The "empirical turn" in bioethics signals a need for

  7. The Binomial Model in Fluctuation Analysis of Quantal Neurotransmitter Release

    PubMed Central

    Quastel, D. M. J.

    1997-01-01

    The mathematics of the binomial model for quantal neurotransmitter release is considered in general terms, to explore what information might be extractable from statistical aspects of data. For an array of N statistically independent release sites, each with a release probability p, the compound binomial always pertains, with ⟨m⟩ = N⟨p⟩, p′ ≡ 1 - var(m)/⟨m⟩ = ⟨p⟩(1 + cvp²), and n′ ≡ ⟨m⟩/p′ = N/(1 + cvp²), where m is the output/stimulus and cvp² is var(p)/⟨p⟩². Unless n′ is invariant with ambient conditions or stimulation paradigms, the simple binomial (cvp = 0) is untenable and n′ is neither N nor the number of "active" sites or sites with a quantum available. At each site p = po·pA, where po is the output probability if a site is "eligible" or "filled" despite previous quantal discharge, and pA (eligibility probability) depends at least on the replenishment rate, po, and interstimulus time. Assuming stochastic replenishment, a simple algorithm allows calculation of the full statistical composition of outputs for any hypothetical combination of po's and refill rates, for any stimulation paradigm and spontaneous release. A rise in n′ (reduced cvp) tends to occur whenever po varies widely between sites, with a raised stimulation frequency or factors tending to increase po's. Unlike ⟨m⟩ and var(m) at equilibrium, output changes early in trains of stimuli, and covariances, potentially provide information about whether changes in ⟨m⟩ reflect changes in po or in pA. Formulae are derived for variance and third moments of postsynaptic responses, which depend on the quantal mix in the signals. A new, easily computed function, the area product, gives noise-unbiased variance of a series of synaptic signals and its peristimulus time distribution, which is modified by the unit channel composition of quantal responses and if the signals reflect mixed responses from synapses with different quantal time course. PMID:9017200
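
    A quick numerical check of the compound binomial relations reconstructed above (the beta-distributed site probabilities are an assumption for illustration): with heterogeneous release probabilities, the estimated n′ falls below the true site count N by the factor 1 + cvp².

    import numpy as np

    rng = np.random.default_rng(0)
    N, trials = 100, 200_000
    p = rng.beta(2.0, 5.0, size=N)                 # per-site release probabilities

    # Quantal output per stimulus: sum of independent Bernoulli sites.
    m = (rng.random((trials, N)) < p).sum(axis=1)
    mean_m, var_m = m.mean(), m.var()

    p_prime = 1 - var_m / mean_m                   # estimates <p>(1 + cvp^2)
    n_prime = mean_m / p_prime                     # estimates N/(1 + cvp^2)

    cv2 = p.var() / p.mean() ** 2
    print(f"n' = {n_prime:.1f} vs N/(1+cvp^2) = {N / (1 + cv2):.1f} (true N = {N})")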

  8. Galileon as a local modification of gravity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nicolis, Alberto; Rattazzi, Riccardo; Trincherini, Enrico

    2009-03-15

    In the Dvali-Gabadadze-Porrati (DGP) model, the 'self-accelerating' solution is plagued by a ghost instability, which makes the solution untenable. This fact, as well as all interesting departures from general relativity (GR), is fully captured by a four-dimensional effective Lagrangian, valid at distances smaller than the present Hubble scale. The 4D effective theory involves a relativistic scalar {pi}, universally coupled to matter and with peculiar derivative self-interactions. In this paper, we study the connection between self-acceleration and the presence of ghosts for a quite generic class of theories that modify gravity in the infrared. These theories are defined as those that, at distances shorter than cosmological, reduce to a certain generalization of the DGP 4D effective theory. We argue that for infrared modifications of GR locally due to a universally coupled scalar, our generalization is the only one that allows for a robust implementation of the Vainshtein effect--the decoupling of the scalar from matter in gravitationally bound systems--necessary to recover agreement with solar-system tests. Our generalization involves an internal Galilean invariance, under which {pi}'s gradient shifts by a constant. This symmetry constrains the structure of the {pi} Lagrangian so much so that in 4D there exist only five terms that can yield sizable nonlinearities without introducing ghosts. We show that for such theories there are in fact ''self-accelerating'' de Sitter solutions with no ghostlike instabilities. In the presence of compact sources, these solutions can support spherically symmetric, Vainshtein-like nonlinear perturbations that are also stable against small fluctuations. We investigate a possible infrared completion of these theories at scales of order of the Hubble horizon, and larger. There are however some features of our theories that may constitute a problem at the theoretical or phenomenological level: the presence of superluminal
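
    For concreteness, the Galilean symmetry and the lowest of the five invariant terms can be written schematically as follows (a sketch with normalizations omitted; the quartic and quintic terms involve higher powers of the second derivatives of the scalar):

    \pi(x) \;\to\; \pi(x) + c + b_\mu x^\mu ,
    \qquad
    \mathcal{L}_1 = \pi , \quad
    \mathcal{L}_2 = \partial_\mu \pi \, \partial^\mu \pi , \quad
    \mathcal{L}_3 = \Box\pi \, \partial_\mu \pi \, \partial^\mu \pi .

    Each such term changes only by a total derivative under the symmetry, which is why these operators yield second-order field equations and hence introduce no extra ghost mode.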

  9. No Evidence of Narrowly Defined Cognitive Penetrability in Unambiguous Vision

    PubMed Central

    Lammers, Nikki A.; de Haan, Edward H.; Pinto, Yair

    2017-01-01

    The classical notion of cognitive impenetrability suggests that perceptual processing is an automatic modular system and not under conscious control. Near consensus is now emerging that this classical notion is untenable. However, as recently pointed out by Firestone and Scholl, this consensus is built on quicksand. In most studies claiming perception is cognitively penetrable, it remains unclear which actual process has been affected (perception, memory, imagery, input selection or judgment). In fact, the only available “proofs” for cognitive penetrability are proxies for perception, such as behavioral responses and neural correlates. We suggest that one can interpret cognitive penetrability in two different ways, a broad sense and a narrow sense. In the broad sense, attention and memory are not considered as “just” pre- and post-perceptual systems but as part of the mechanisms by which top-down processes influence the actual percept. Although many studies have proven top-down influences in this broader sense, it is still debatable whether cognitive penetrability remains tenable in a narrow sense. The narrow sense states that cognitive penetrability only occurs when top-down factors are flexible and cause a clear illusion from a first person perspective. So far, there is no strong evidence from a first person perspective that visual illusions can indeed be driven by high-level flexible factors. One cannot be cognitively trained to see and unsee visual illusions. We argue that this lack of convincing proof for cognitive penetrability in the narrow sense can be explained by the fact that most research focuses on foveal vision only. This type of perception may be too unambiguous for transient high-level factors to control perception. Therefore, illusions in more ambiguous perception, such as peripheral vision, can offer a unique insight into the matter. They produce a clear subjective percept based on unclear, degraded visual input: the optimal basis to study

  10. High performance digital read out integrated circuit (DROIC) for infrared imaging

    NASA Astrophysics Data System (ADS)

    Mizuno, Genki; Olah, Robert; Oduor, Patrick; Dutta, Achyut K.; Dhar, Nibir K.

    2016-05-01

    Banpil Photonics has developed a high-performance Digital Read-Out Integrated Circuit (DROIC) for image sensors and camera systems targeting various military, industrial and commercial infrared (IR) imaging applications. The on-chip digitization of the pixel output eliminates the necessity for an external analog-to-digital converter (ADC), which not only cuts costs but also enables miniaturization of packaging to achieve SWaP-C camera systems. In addition, the DROIC offers new opportunities for greater on-chip processing intelligence that are not possible in the conventional analog ROICs prevalent today. Conventional ROICs, which typically can enhance only one high-performance attribute such as frame rate, power consumption or noise level, fail when simultaneously targeting the most aggressive performance requirements demanded in imaging applications today. Additionally, scaling analog readout circuits to meet such requirements leads to expensive, power-hungry, large, and complex systems that are untenable given the trend towards SWaP-C. We present the implementation of a VGA-format (640x512 pixels, 15μm pitch) capacitive transimpedance amplifier (CTIA) DROIC architecture that incorporates a 12-bit ADC at the pixel level. The CTIA pixel input circuitry has two gain modes with programmable full-well capacity values of 100K e- and 500K e-. The DROIC has been developed with a system-on-chip architecture in mind, where all the timing and biasing are generated internally without requiring any critical external inputs. The chip is configurable, with many parameters programmable through a serial programmable interface (SPI). It features a global shutter, low power, and high frame rates programmable from 30 up to 500 frames per second in full VGA format, supported through 24 LVDS outputs. This DROIC, suitable for hybridization with focal plane arrays (FPA), is ideal for high-performance uncooled camera applications ranging from near IR (NIR) and shortwave IR (SWIR) to mid
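
    Simple feasibility arithmetic for the readout figures quoted above (the even per-lane allocation is our assumption; the record does not specify lane rates): even at the top frame rate, the digital payload fits comfortably within 24 LVDS pairs.

    cols, rows, bits, fps, lanes = 640, 512, 12, 500, 24

    pixel_rate = cols * rows * fps                 # pixels per second
    payload = pixel_rate * bits                    # bits per second, no overhead
    per_lane = payload / lanes

    print(f"payload  : {payload / 1e9:.2f} Gb/s")  # ~1.97 Gb/s total
    print(f"per lane : {per_lane / 1e6:.0f} Mb/s") # ~82 Mb/s per LVDS pair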

  11. Modernizing Earth and Space Science Modeling Workflows in the Big Data Era

    NASA Astrophysics Data System (ADS)

    Kinter, J. L.; Feigelson, E.; Walker, R. J.; Tino, C.

    2017-12-01

    Modeling is a major aspect of Earth and space science research. The development of numerical models of the Earth system, planetary systems or astrophysical systems is essential to linking theory with observations. Optimal use of observations that are quite expensive to obtain and maintain typically requires data assimilation that involves numerical models. In the Earth sciences, models of the physical climate system are typically used for data assimilation, climate projection, and inter-disciplinary research, spanning applications from analysis of multi-sensor data sets to decision-making in climate-sensitive sectors with applications to ecosystems, hazards, and various biogeochemical processes. In space physics, most models are from first principles, require considerable expertise to run and are frequently modified significantly for each case study. The volume and variety of model output data from modeling Earth and space systems are rapidly increasing and have reached a scale where human interaction with the data is prohibitively inefficient. A major barrier to progress is that modeling workflows aren't deemed by practitioners to be a design problem. Existing workflows have been created by a slow accretion of software, typically based on undocumented, inflexible scripts haphazardly modified by a succession of scientists and students not trained in modern software engineering methods. As a result, existing modeling workflows suffer from an inability to onboard new datasets into models; an inability to keep pace with accelerating data production rates; and irreproducibility, among other problems. These factors are creating an untenable situation for those conducting and supporting Earth system and space science. Improving modeling workflows requires investments in hardware, software and human resources. This paper describes the critical path issues that must be targeted to accelerate modeling workflows, including script modularization, parallelization, and

  12. Mining big data sets of plankton images: a zero-shot learning approach to retrieve labels without training data

    NASA Astrophysics Data System (ADS)

    Orenstein, E. C.; Morgado, P. M.; Peacock, E.; Sosik, H. M.; Jaffe, J. S.

    2016-02-01

    Technological advances in instrumentation and computing have allowed oceanographers to develop imaging systems capable of collecting extremely large data sets. With the advent of in situ plankton imaging systems, scientists must now commonly deal with "big data" sets containing tens of millions of samples spanning hundreds of classes, making manual classification untenable. Automated annotation methods are now considered to be the bottleneck between collection and interpretation. Typically, such classifiers learn to approximate a function that predicts a predefined set of classes for which a considerable amount of labeled training data is available. The requirement that the training data span all the classes of concern is problematic for plankton imaging systems since they sample such diverse, rapidly changing populations. These data sets may contain relatively rare, sparsely distributed taxa that will not have associated training data; a classifier trained on a limited set of classes will miss these samples. The computer vision community, leveraging advances in Convolutional Neural Networks (CNNs), has recently attempted to tackle such problems using "zero-shot" object categorization methods. Under a zero-shot framework, a classifier is trained to map samples onto a set of attributes rather than a class label. These attributes can include visual and non-visual information such as what an organism is made of, where it is distributed globally, or how it reproduces. A second-stage classifier is then used to extrapolate a class. In this work, we demonstrate a zero-shot classifier, implemented with a CNN, to retrieve out-of-training-set labels from images. This method is applied to data from two continuously imaging, moored instruments: the Scripps Plankton Camera System (SPCS) and the Imaging FlowCytobot (IFCB). Results from simulated deployment scenarios indicate zero-shot classifiers could be successful at recovering samples of rare taxa in image sets. This
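
    The two-stage idea can be sketched compactly: stage one predicts per-image attributes, stage two assigns the class whose known attribute signature best matches the prediction. Everything below (features, attributes, signatures) is a synthetic toy, not the SPCS/IFCB pipeline or its CNN.

```python
# Toy two-stage zero-shot classifier in the spirit described: attribute
# predictors are trained on seen classes only, and an unseen class is
# recovered by matching predicted attributes to its known signature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Known attribute signatures (rows: classes, cols: attributes such as
# "has flagella", "forms chains"). Class 2 has no training images at all,
# but its signature is a novel combination of attributes seen in training.
signatures = np.array([[1.0, 0.0],
                       [0.0, 1.0],
                       [1.0, 1.0]])

# Synthetic "image features" correlated with the attributes.
W = rng.normal(size=(2, 16))
y_train = rng.integers(0, 2, size=400)            # only classes 0 and 1 seen
A_train = signatures[y_train]
X_train = A_train @ W + 0.3 * rng.normal(size=(400, 16))

# Stage 1: one classifier per attribute.
stage1 = [LogisticRegression(max_iter=1000).fit(X_train, A_train[:, k])
          for k in range(signatures.shape[1])]

# Stage 2: nearest attribute signature gives the class label.
def zero_shot_predict(x):
    probs = np.array([clf.predict_proba(x.reshape(1, -1))[0, 1]
                      for clf in stage1])
    return int(np.argmin(((signatures - probs) ** 2).sum(axis=1)))

x_unseen = signatures[2] @ W + 0.3 * rng.normal(size=16)   # image of class 2
print(zero_shot_predict(x_unseen))                          # expect 2
```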

  13. The Importance of Isomorphism for Conclusions about Homology: A Bayesian Multilevel Structural Equation Modeling Approach with Ordinal Indicators

    PubMed Central

    Guenole, Nigel

    2016-01-01

    We describe a Monte Carlo study examining the impact of assuming item isomorphism (i.e., equivalent construct meaning across levels of analysis) on conclusions about homology (i.e., equivalent structural relations across levels of analysis) under varying degrees of non-isomorphism in the context of ordinal indicator multilevel structural equation models (MSEMs). We focus on the condition where one or more loadings are higher on the between level than on the within level to show that, while much past research on homology has ignored the issue of psychometric isomorphism, psychometric isomorphism is in fact critical to valid conclusions about homology. More specifically, when a measurement model with non-isomorphic items occupies an exogenous position in a multilevel structural model and the non-isomorphism of these items is not modeled, the within-level exogenous latent variance is underestimated, leading to overestimation of the within-level structural coefficient, while the between-level exogenous latent variance is overestimated, leading to underestimation of the between-level structural coefficient. When a measurement model with non-isomorphic items occupies an endogenous position in a multilevel structural model and the non-isomorphism of these items is not modeled, the endogenous within-level latent variance is underestimated, leading to underestimation of the within-level structural coefficient, while the endogenous between-level latent variance is overestimated, leading to overestimation of the between-level structural coefficient. The innovative aspect of this article is demonstrating that even minor violations of psychometric isomorphism render claims of homology untenable. We also show that posterior predictive p-values for ordinal indicator Bayesian MSEMs are insensitive to violations of isomorphism even when they lead to severely biased within- and between-level structural parameters. We highlight conditions where poor estimation of even correctly specified
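
    The core of the non-isomorphism problem can be made concrete with a toy simulation: give one item a larger loading at the between level than at the within level, and the two levels imply different amounts of latent variance, which a single cross-level loading must misallocate. The numeric sketch below uses invented values and a simple variance decomposition, not the paper's Bayesian ordinal MSEM.

```python
# Toy illustration of a non-isomorphic item: loading 0.9 on the between-level
# latent but 0.5 on the within-level latent. All parameter values are invented.
import numpy as np

rng = np.random.default_rng(1)
n_clusters, n_per = 2000, 20
lam_b, lam_w, noise_sd = 0.9, 0.5, 0.5

eta_b = rng.normal(size=n_clusters)            # cluster-level latent scores
eta_w = rng.normal(size=(n_clusters, n_per))   # person-level latent scores
y = (lam_b * eta_b[:, None] + lam_w * eta_w
     + noise_sd * rng.normal(size=(n_clusters, n_per)))

# Decompose the item's variance across levels via cluster means.
cluster_means = y.mean(axis=1)
between_var = cluster_means.var()                 # ~ lam_b**2 (+ small bias)
within_var = (y - cluster_means[:, None]).var()   # ~ lam_w**2 + noise_sd**2

print(f"between-level item variance ~ {between_var:.2f} "
      f"(lam_b^2 = {lam_b**2:.2f})")
print(f"within-level item variance  ~ {within_var:.2f} "
      f"(lam_w^2 + noise^2 = {lam_w**2 + noise_sd**2:.2f})")
# A model forcing one common loading across levels must misallocate this
# variance, biasing the within- and between-level structural coefficients.
```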

  14. Testing the role of interspecific competition in the evolutionary origin of elevational zonation: an example with Buarremon brush-finches (Aves, Emberizidae) in the neotropical mountains.

    PubMed

    Cadena, Carlos Daniel

    2007-05-01

    Interspecific competition might drive the evolution of ecological niches and result in pairs of formerly competing species segregating along ecological gradients following a process of character displacement. This mechanism has been proposed to account for the replacement of related species along gradients of elevation in many areas of the world, but the fundamental issue of whether competition is responsible for the origin of elevational replacements has not been tested. To test hypotheses about the role of interspecific competition in the origin of complementary elevational ranges, I combined molecular phylogenetics, phylogeography, and population genetic analyses on Buarremon torquatus and B. brunneinucha (Aves, Emberizidae), whose patterns of elevational distribution suggest character displacement or ecological release. The hypothesis that elevational distributions in these species changed in opposite directions as a result of competition is untenable because: (1) a historical expansion of the range of B. brunneinucha into areas occupied by B. torquatus was not accompanied by a shift in the elevational range of the former species; (2) when B. brunneinucha colonized the range of B. torquatus, lineages of the latter with distinct elevational distributions had already diverged; and (3) historical trends in effective population size do not suggest that populations with elevational ranges abutting those of putative competitors have declined, as would be expected if competition caused range contractions. However, owing to uncertainty in coalescent estimates of historical population sizes, the hypothesis that some populations of B. torquatus have declined cannot be confidently rejected, which suggests asymmetric character displacement might have occurred. I suggest that the main role of competition in elevational zonation may be to act as a sorting mechanism that allows the coexistence along mountain slopes only of ecologically similar species that differ in elevational distributions prior to attaining

  15. Mohawk Lake or Mohawk meadow? Sedimentary facies and stratigraphy of Quaternary deposits in Mohawk Valley, upper Middle Fork of the Feather River, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yount, J.C.; Harwood, D.S.; Bradbury, J.P.

    1993-04-01

    Mohawk Valley (MV) contains thick, well-exposed sections of Quaternary basin-fill sediments, with abundant interbedded tephra and a diverse assemblage of sedimentary facies. The eastern arm of MV, extending from Clio to Portola, contains as much as 100 m of trough cross-bedded cobble to pebble gravel and planar and trough cross-bedded coarse and medium sand, interpreted as braided stream deposits. Sections exposed in the western arm of MV consist in their lower parts of massive organic-rich silt and clay interbedded with blocky to fissile peat beds up to 1 m thick. Diatom assemblages are dominated by benthic species indicating fresh marsh environments with very shallow water depths of one meter or less. Proglacial lacustrine deposits of limited lateral extent are present within the outwash complexes, as evidenced by varved fine sand and silt couplets, poorly sorted quartz-rich silt beds containing dropstones, and contorted beds of diamict grading laterally into slump blocks surrounded by wood-bearing silt and silty sand. The Rockland Ash (400 ka) is a prominent marker in the middle or lower part of many sections throughout MV, indicating that at least half of the basin-fill sequence is Late Quaternary in age. A log buried in diamict slumped into a proglacial lake lying approximately 3 km downstream from the Tioga Stage ice termini in Jamison and Gray Eagle Creeks yields an age of 18,715 ± 235 ¹⁴C years BP. Previous interpretations of MV deposits originating in a large, deep lake with water depths in excess of 150 m are untenable given the sedimentary facies and diatom floras that dominate the valley. Unexhumed valleys such as Sierra Valley to the east and Long Valley to the northwest, which contain large meadows traversed by braided streams, are probably good analogs for the conditions that existed during the accumulation of the Mohawk Valley deposits.

  16. From marine ecology to biological oceanography

    NASA Astrophysics Data System (ADS)

    Mills, Eric L.

    1995-03-01

    Looking back from the 1990s it seems natural to view the work done in the Biologische Anstalt Helgoland by Friedrich Heincke and his colleagues, beginning in 1892, as marine ecology or marine biology, and that done in Kiel, under Victor Hensen and Karl Brandt, as biological oceanography. But historical analysis shows this view to be untenable. Biological oceanography, as a research category and a profession, does not appear until at least the 1950's. In the German tradition of marine research, “Ozeanographie”, originating in 19th century physical geography, did not include the biological sciences. The categories “Meereskunde” and “Meeresforschung” covered all aspects of marine research in Germany from the 1890's to the present day. “Meeresbiologie” like that of Brandt, Heincke, and other German marine scientists, fitted comfortably into these. But in North America no such satisfactory professional or definitional structure existed before the late 1950's. G. A. Riley, one of the first biological oceanographers, fought against descriptive, nonquantitative American ecology. In 1951 he described biological oceanography as the “ecology of marine populations”, linking it with quantitative population ecology in the U.S.A. By the end of the 1960's the U.S. National Science Foundation had recognized biological oceanography as a research area supported separately from marine biology. There was no need for the category “biological oceanography” in German marine science because its subject matter lay under the umbrella of “Meereskunde” or “Meeresforschung”. But in North America, biological oceanography — a fundamental fusion of physics and chemistry with marine biology — was created to give this marine science a status higher than that of the conceptually overloaded ecological sciences. The sociologists Durkheim and Mauss claimed in 1903 that, “the classification of things reproduces the classification of men”; similarly, in science, the

  17. Relativity of Simultaneity and Eternalism: In Defense of the Block Universe

    NASA Astrophysics Data System (ADS)

    Peterson, Daniel; Silberstein, Michael

    Ever since Hermann Minkowski's now infamous comments in 1908 concerning the proper way to view space-time, the debate has raged as to whether the universe should be viewed as a four-dimensional, unified whole wherein the past, present, and future are regarded as equally real, or whether the views espoused by the possibilists, historicists, and presentists regarding the unreality of the future (and, for presentists, the past) are more accurate. Now, a century after Minkowski's proposed block universe first sparked debate, we present a new, more conclusive argument in favor of eternalism. Utilizing an argument based on the relativity of simultaneity in the tradition of Putnam and Rietdijk, together with explicit, novel but reasonable assumptions as to the nature of reality, we argue that the past, present, and future should be treated as equally real, thus concluding that presentism and other theories of time that bestow special ontological status on the past, present, or future are untenable. Finally, we respond to our critics who suggest that: (1) there is no metaphysical difference between the positions of eternalism and presentism, (2) the present must be defined as the "here" as well as the "now", or (3) presentism is correct and physicists' current understanding of relativity is incomplete because it does not incorporate a preferred frame. We call response 1 deflationary, since it purports to dissolve or deconstruct the age-old debate between the two views, and response 2 compatibilist, because it does nothing to alter special relativity (SR), arguing instead that SR unadorned has the resources to save presentism. Response 3 we call incompatibilist, because it adorns SR in some way in order to save presentism via some sort of preferred frame. We show that neither 1 nor 2 can save presentism, and 3 is not well motivated at this juncture except as an ad hoc device to refute eternalism.

  18. How does carbon dioxide permeate cell membranes? A discussion of concepts, results and methods

    PubMed Central

    Endeward, Volker; Al-Samir, Samer; Itel, Fabian; Gros, Gerolf

    2013-01-01

    We review briefly how the thinking about the permeation of gases, especially CO2, across cell and artificial lipid membranes has evolved during the last 100 years. We then describe how the recent finding of a drastic effect of cholesterol on the CO2 permeability of both biological and artificial membranes fundamentally alters the long-standing idea that CO2, as well as other gases, permeates all membranes with great ease. This requires revision of the widely accepted paradigm that membranes never offer a serious diffusion resistance to CO2 or other gases. Earlier observations of "CO2-impermeable membranes" can now be explained by the high cholesterol content of some membranes. Thus, cholesterol is a membrane component that nature can use to adapt membrane CO2 permeability to the functional needs of the cell. Since cholesterol serves many other cellular functions, it cannot be reduced indefinitely. We show, however, that cells that possess a high metabolic rate and/or a high rate of O2 and CO2 exchange do require very high CO2 permeabilities that may not be achievable merely by reduction of membrane cholesterol. The article then discusses the alternative possibility of raising the CO2 permeability of a membrane by incorporating protein CO2 channels. The highly controversial issue of gas and CO2 channels is systematically and critically reviewed. It is concluded that the majority of the results considered reliable favor the concept of the existence and functional relevance of protein gas channels. The effect of intracellular carbonic anhydrase, which has recently been proposed as an alternative mechanism to a membrane CO2 channel, is analysed quantitatively and the idea is considered untenable. After a brief review of the knowledge on permeation of O2 and NO through membranes, we present a summary of the 18O method used to measure the CO2 permeability of membranes and discuss quantitatively critical questions that may be addressed to this method.

  19. Tests of remote aftershock triggering by small mainshocks using Taiwan's earthquake catalog

    NASA Astrophysics Data System (ADS)

    Peng, W.; Toda, S.

    2014-12-01

    They locate mostly in areas of high geothermal gradient and are probably triggered by a small-scale aseismic process. This instead supports the argument of Richards-Dinger et al. that dynamic triggering by small mainshocks is untenable.

  20. Visual Basic, Excel-based fish population modeling tool - The pallid sturgeon example

    USGS Publications Warehouse

    Moran, Edward H.; Wildhaber, Mark L.; Green, Nicholas S.; Albers, Janice L.

    2016-02-10

    The model presented in this report is a spreadsheet-based model using Visual Basic for Applications within Microsoft Excel (http://dx.doi.org/10.5066/F7057D0Z) prepared in cooperation with the U.S. Army Corps of Engineers and U.S. Fish and Wildlife Service. It uses the same model structure and, initially, the same parameters as used by Wildhaber and others (2015) for pallid sturgeon. The difference between the model structure used for this report and that used by Wildhaber and others (2015) is that variance is not partitioned. For the model of this report, all variance is applied at the iteration and time-step levels of the model. Wildhaber and others (2015) partition variance into parameter variance (uncertainty about the value of a parameter itself) applied at the iteration level and temporal variance (uncertainty caused by random environmental fluctuations with time) applied at the time-step level. They included implicit individual variance (uncertainty caused by differences between individuals) within the time-step level. The interface developed for the model of this report is designed to allow the user the flexibility to change population model structure and parameter values and uncertainty separately for every component of the model. This flexibility makes the modeling tool potentially applicable to any fish species; however, it also makes it possible for the user to obtain spurious outputs. The value and reliability of the model outputs are only as good as the model inputs. Using this modeling tool with improper or inaccurate parameter values, or for species for which the structure of the model is inappropriate, could lead to untenable management decisions. By facilitating fish population modeling, this tool allows the user to evaluate a range of management options and implications. The goal is a user-friendly tool for developing fish population models useful to natural resource
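
    The distinction between the two variance treatments can be sketched with a toy stochastic projection. The growth rates, standard deviations and horizon below are invented illustration, not pallid sturgeon parameters from either report.

```python
# Hedged sketch contrasting the two variance treatments described: parameter
# uncertainty drawn once per iteration (partitioned, as in Wildhaber and
# others, 2015) versus all variance applied at every time step.
import numpy as np

rng = np.random.default_rng(42)
n_iter, n_years = 1000, 20
pop0 = 1000.0

def project(partition_parameter_variance):
    finals = np.empty(n_iter)
    for i in range(n_iter):
        # Iteration level: parameter uncertainty about the mean growth rate.
        mean_r = rng.normal(0.0, 0.02) if partition_parameter_variance else 0.0
        # Time-step level: temporal (environmental) variance; the unpartitioned
        # variant folds the parameter spread into the per-step deviation.
        sd = 0.10 if partition_parameter_variance else np.hypot(0.10, 0.02)
        pop = pop0
        for _ in range(n_years):
            pop *= np.exp(rng.normal(mean_r, sd))
        finals[i] = pop
    return finals

a = project(True)    # variance partitioned across levels
b = project(False)   # all variance at the time-step level
for label, x in (("partitioned", a), ("time-step only", b)):
    width = np.percentile(x, 95) - np.percentile(x, 5)
    print(f"{label}: median {np.median(x):.0f}, 90% interval width {width:.0f}")
```

    Parameter variance drawn once per iteration persists across all years of that run, so it widens the spread of outcomes more than the same variance re-drawn independently at each time step; this is the practical consequence of where variance is applied.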

  1. Analysis of the Non-LTE Lithium Abundance for a Large Sample of F-, G-, and K-Giants and Supergiants

    NASA Astrophysics Data System (ADS)

    Lyubimkov, L. S.; Petrov, D. V.

    2017-09-01

    A five-dimensional interpolation method and corresponding computer program are developed for using published calculations to determine the non-LTE correction ΔNLTE to the lithium abundance logɛ(Li) derived from the Li I 6707.8 Å line. The ΔNLTE value is determined from the following five parameters: the effective temperature Teff, the acceleration of gravity log g, the metallicity index [Fe/H], the microturbulent velocity Vt, and the LTE Li abundance logɛ(Li). The program is used to calculate values of ΔNLTE and the non-LTE Li abundance for 91 single bright giants from the list of Lebre et al. By combining these results with data for 55 stars from the previous paper, we obtain the non-LTE values of logɛ(Li) for 146 FGK-giants and supergiants. We confirm that, because of the absence of the Li line in the spectra of most of these stars, it is only possible to estimate for them an upper bound on the Li abundance. A large spread is confirmed in logɛ(Li) for stars with masses M ≤ 6 M☉. A comparison of these results with model calculations of stars confirms the unique sensitivity of the lithium abundance to the initial rotation velocity V0. We discuss the giants and supergiants with lithium abundances logɛ(Li) = 1.4 ± 0.3, which could have a rotational velocity V0 = 0 km/s and have already undergone deep convective mixing. Li-rich giants with lithium abundances logɛ(Li) ≥ 2, nearly up to the initial value of logɛ(Li) = 3.2 ± 0.1, are examined. It is shown that the fraction of Li-rich giants with V0 ≈ 0-50 km/s is consistent with current evolutionary models. The other stars of this type, as well as all of the "super Li-rich" giants, for which the standard theory is untenable, can be explained by invoking the hypothesis of recent lithium synthesis in the star or an alternative hypothesis according to which a giant planet is engulfed by the star.
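
    A five-dimensional linear interpolation of this kind can be sketched with a regular-grid interpolator; the grid axes, shapes and table values below are synthetic stand-ins, not the published ΔNLTE tables or the authors' program.

```python
# Sketch of five-dimensional interpolation of a tabulated correction on a
# regular grid. The axes and the synthetic table are assumptions for
# illustration only.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Axes: Teff (K), log g, [Fe/H], microturbulence Vt (km/s), LTE log eps(Li).
teff = np.linspace(4000, 6500, 6)
logg = np.linspace(0.5, 3.5, 4)
feh  = np.linspace(-1.0, 0.3, 4)
vt   = np.linspace(1.0, 4.0, 4)
li   = np.linspace(0.0, 3.5, 8)

# Synthetic table of corrections with the matching shape (6, 4, 4, 4, 8).
grid = np.fromfunction(
    lambda i, j, k, l, m: 0.1 * np.sin(i + j) - 0.02 * (m - 4),
    (6, 4, 4, 4, 8))

delta_nlte = RegularGridInterpolator((teff, logg, feh, vt, li), grid)

star = np.array([4800.0, 2.1, -0.2, 1.8, 1.2])   # one giant's parameters
print("Delta NLTE ~", float(delta_nlte(star)))    # added to the LTE abundance
```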

  2. Composition in the Quantum World

    NASA Astrophysics Data System (ADS)

    Hall, Edward Jonathan

    This thesis presents a problem for the foundations of quantum mechanics. It arises from the way that theory describes the composition of larger systems in terms of smaller ones, and it renders untenable a wide range of interpretations of quantum mechanics. That quantum mechanics is difficult to interpret is old news, given the well-known Measurement Problem. But the problem I raise is quite different, and in important respects more fundamental. In brief: The physical world exhibits mereological structure: physical objects have parts, which in turn have parts, and so on. A natural way to try to represent this structure is by means of a particle theory, according to which the physical world consists entirely of enduring physical objects which themselves have no proper parts, but aggregates of which are, or compose, all physical objects. Elementary, non-relativistic quantum mechanics can be cast in this mold--at least, according to the usual expositions of that theory. But herein lies the problem: the standard attempt to give a systematic particle interpretation to elementary quantum mechanics results in nonsense, thanks to the well-established principle of Permutation Invariance, which constrains the quantum-mechanical description of systems containing identical particles. Specifically, it follows from the most minimal principles of a particle interpretation (much weaker than those needed to generate the Measurement Problem), together with Permutation Invariance, that systems identical in composition must have the same physical state. In other words, systems which merely have the same numbers of the same types of particles are therefore, at all times, perfect physical duplicates. This conclusion is absurd: e.g., it is quite plausible that some of those particles which compose my body make up a system identical in composition to some pepperoni pizza. Yet no part of me is a qualitative physical duplicate of any pepperoni pizza. Perhaps "you are what you eat"--but not in

  3. Crustal strain accumulation on Southern Basin and Range Province faults modulated by distant plate boundary earthquakes? Evidence from geodesy, seismic imaging, and paleoseismology

    NASA Astrophysics Data System (ADS)

    Bennett, R. A.; Shirzaei, M.; Broermann, J.; Spinler, J. C.; Holland, A. A.; Pearthree, P.

    2014-12-01

    GPS in Arizona reveals a change in the pattern of crustal strain accumulation in 2010 that, based on viscoelastic modeling, appears to be associated with the distant M7.2 El Mayor-Cucapah (EMC) earthquake in Baja California, Mexico. GPS data collected between 1999 and 2009 near the Santa Rita normal fault in SE Arizona reveal a narrow zone of crustal deformation coincident with the fault trace, delineated by W-NW facing Pleistocene fault scarps of heights 1 to 7 m. The apparent deformation zone is also seen in a preliminary InSAR interferogram. Total motion across the zone inferred using an elastic block model constrained by the pre-2010 GPS measurements is ~1 mm/yr in a sense consistent with normal fault motion. However, continuous GPS measurements throughout Arizona reveal pronounced changes in crustal velocity following the EMC earthquake, such that the relative motion across the Santa Rita fault post-2010 is negligible. Paleoseismic evidence indicates that the mapped Santa Rita fault scarps were formed by two or more large magnitude (M6.7 to M7.6) surface-rupturing normal-faulting earthquakes 60 to 100 kyrs ago. Seismic refraction and reflection data constrained by deep (~800 m) well log data provide evidence of progressive, possibly intermittent, displacement on the fault through time. The rate of strain accumulation observed geodetically prior to 2010, if constant over the past 60 to 100 kyrs, would imply an untenable minimum slip rate deficit of 60 to 100 m (~1 mm/yr accumulating over 60 to 100 kyr) since the most recent earthquake. One explanation for the available geodetic, seismic, and paleoseismic evidence is that strain accumulation is modulated by viscoelastic relaxation associated with frequent large magnitude earthquakes in the Salton Trough region, episodically inhibiting the accumulation of elastic strain required to generate large earthquakes on the Santa Rita and possibly other faults in the Southern Basin and Range. An important question is thus for how long the postseismic velocity changes

  4. The contribution of Physician Assistants in primary care: a systematic review

    PubMed Central

    2013-01-01

    meta-analysis or meta-synthesis untenable. Conclusions The research evidence of the contribution of PAs to primary care was mixed and limited. However, the continued growth in employment of PAs in American primary care suggests that this professional group is judged to be of value by increasing numbers of employers. Further specific studies are needed to fill in the gaps in our knowledge about the effectiveness of PAs' contribution to the international primary care workforce. PMID:23773235

  5. Recent advances in the management of bovine tuberculosis in free-ranging wildlife.

    PubMed

    O'Brien, Daniel J; Schmitt, Stephen M; Rudolph, Brent A; Nugent, Graham

    2011-07-05

    hunting economy and of whitetails as a game animal have made such aggressive culling politically untenable. This has forced reliance upon publicly supported, and implemented, management tools, and so provided impetus to better understand social support for wildlife management policy, its limitations, and ways to employ it in disease control policy development. Copyright © 2011. Published by Elsevier B.V.

  6. Thresholds and the Evolution of Bedrock Channels on the Hawaiian Islands

    NASA Astrophysics Data System (ADS)

    Raming, L. W.; Whipple, K. X.

    2017-12-01

    Erosional thresholds are a key component of the non-linear dynamics of bedrock channel incision and long-term landscape evolution. Erosion thresholds, however, have remained difficult to quantify and to identify uniquely in landscape evolution. Here we present an analysis of the morphology of canyons on the Hawaiian Islands and put forth the hypothesis that they are threshold-dominated landforms. Geologic (USGS), topographic (USGS 10 m DEM), runoff (USGS) and meteorological data (Rainfall Atlas of Hawai`i) were used in an analysis of catchments on the islands of Hawai`i, Kaua`i, Lāna`i, Maui, and Moloka'i. Channel incision was estimated by differencing the present topography from reconstructed pre-incision volcanic surfaces. Four key results were obtained from our analysis: (1) Mean total incision ranged from 11 to 684 m and exhibited no correlation with incision duration. (2) In major canyons on the islands of Hawai`i and Kaua`i, rejuvenated-stage basalt flow outcrops at river level show that incision effectively ceased after a period no longer than 100 ka and 1.4 Ma, respectively. (3) Mean canyon wall gradient below knickpoints decreases with volcano age, with a median value of 1 measured on Hawai`i and of 0.7 on Kaua`i. (4) Downstream of major knickpoints, which demarcate the upper limits of deep canyons, channel profiles have near-uniform channel steepness, with most values ranging between 60 and 100. The presence of uniform channel steepness (KSN) implies uniform bed shear stress and typically is interpreted as a steady-state balance between uplift and incision in tectonically active landscapes. However, this is untenable for Hawaiian canyons, and consequently we posit that uniform KSN represents a condition where flood shear stress has been reduced to threshold values and incision reduced to near zero. Uniform KSN values decrease with rainfall, consistent with wetter regions generating threshold shear stress at lower KSN. This suggests that rapid incision occurred during

  7. 3D-calibration of three- and four-sensor hot-film probes based on collocated sonic using neural networks

    NASA Astrophysics Data System (ADS)

    Kit, Eliezer; Liberzon, Dan

    2016-09-01

    High resolution measurements of turbulence in the atmospheric boundary layer (ABL) are critical to the understanding of physical processes and the parameterization of important quantities, such as the turbulent kinetic energy dissipation. The low spatio-temporal resolution of standard atmospheric instruments, sonic anemometers and LIDARs, limits their suitability for fine-scale measurements of the ABL. The use of miniature hot-films is an alternative technique, although such probes require frequent calibration, which is logistically untenable in field setups. Accurate and truthful calibration is crucial for multi-hot-film applications in atmospheric studies, because the ability to conduct calibration in situ ultimately determines the quality of the turbulence measurements. Kit et al (2010 J. Atmos. Ocean. Technol. 27 23-41) described a novel methodology for calibration of hot-film probes using a collocated sonic anemometer combined with a neural network (NN) approach. An important step in the algorithm is the generation of a calibration set for NN training by appropriate low-pass filtering of the high resolution voltages measured by the hot-film sensors and the low resolution velocities acquired by the sonic. In Kit et al (2010 J. Atmos. Ocean. Technol. 27 23-41), Kit and Grits (2011 J. Atmos. Ocean. Technol. 28 104-10) and Vitkin et al (2014 Meas. Sci. Technol. 25 75801), the authors reported on the successful use of this approach for in situ calibration, but also on the method's limitations and restricted range of applicability. In their earlier work, a jet facility and a probe comprising two orthogonal x-hot-films were used for calibration and for full dataset generation. In the current work, a comprehensive laboratory study of 3D-calibration of two multi-hot-film probes (triple- and four-sensor) using a grid flow was conducted. The probes were embedded in a collocated sonic, and their relative pitch and yaw orientation to the mean flow was changed by means of motorized
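
    The filtering-and-training step can be sketched in a few lines: low-pass filter the fast hot-film voltages to the sonic bandwidth, fit a regressor from filtered voltages to sonic velocities, then apply it to the unfiltered voltages. The sampling rates, cutoff, network size and the synthetic signals below are assumptions, not the published algorithm's settings.

```python
# Hedged sketch of the collocated-sonic NN calibration idea with synthetic
# stand-in signals; instrument rates and the voltage-velocity map are invented.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.neural_network import MLPRegressor

fs_film, fs_sonic = 2000.0, 20.0            # Hz; assumed instrument rates
t = np.arange(0, 10, 1 / fs_film)           # 10 s of synthetic record

rng = np.random.default_rng(0)
volts = rng.normal(size=(t.size, 3)).cumsum(axis=0) * 0.01   # slow random walk
sonic_u = np.tanh(volts * 2.0) * 5.0        # fake nonlinear volts -> velocity

# Low-pass both signals to the sonic bandwidth to build the training set.
b, a = butter(4, fs_sonic / 2, fs=fs_film)  # cutoff at the sonic Nyquist, 10 Hz
volts_lp = filtfilt(b, a, volts, axis=0)
sonic_lp = filtfilt(b, a, sonic_u, axis=0)

# Train the NN on the matched low-resolution pair ...
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=500)
net.fit(volts_lp, sonic_lp)

# ... then apply it to the raw high-rate voltages for fine-scale velocities.
u_highres = net.predict(volts)
```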

  8. Harnessing a methane-fueled, sediment-free mixed microbial community for utilization of distributed sources of natural gas

    PubMed Central

    Marlow, Jeffrey J.; Kumar, Amit; Enalls, Brandon C.; Reynard, Linda M.; Tuross, Noreen

    2018-01-01

    Harnessing the metabolic potential of uncultured microbial communities is a compelling opportunity for the biotechnology industry, an approach that would vastly expand the portfolio of usable feedstocks. Methane is particularly promising because it is abundant and energy-rich, yet the most efficient methane-activating metabolic pathways involve mixed communities of anaerobic methanotrophic archaea and sulfate reducing bacteria. These communities oxidize methane at high catabolic efficiency and produce chemically reduced by-products at a comparable rate and in near-stoichiometric proportion to methane consumption. These reduced compounds can be used for feedstock and downstream chemical production, and at the production rates observed in situ they are an appealing, cost-effective prospect. Notably, the microbial constituents responsible for this bioconversion are most prominent in select deep-sea sediments, and while they can be kept active at surface pressures, they have not yet been cultured in the lab. In an industrial capacity, deep-sea sediments could be periodically recovered and replenished, but the associated technical challenges and substantial costs make this an untenable approach for full-scale operations. In this study, we present a novel method for incorporating methanotrophic communities into bioindustrial processes through abstraction onto low mass, easily transportable carbon cloth artificial substrates. Using Gulf of Mexico methane seep sediment as inoculum, optimal physicochemical parameters were established for methane-oxidizing, sulfide-generating mesocosm incubations. Metabolic activity required >~40% seawater salinity, peaking at 100% salinity and 35 °C. Microbial communities were successfully transferred to a carbon cloth substrate, and rates of methane-dependent sulfide production increased more than threefold per unit volume. Phylogenetic analyses indicated that carbon cloth-based communities were

  9. Applicability of Channel flow as an extrusion mechanism of the Higher Himalayan Shear Zone from Sutlej, Zanskar, Dhauliganga and Goriganga Sections, Indian Himalaya

    NASA Astrophysics Data System (ADS)

    Mukherjee, Soumyajit

    2010-05-01

    Mukherjee & Koyi (1,2) evaluated the applicability of channel flow extrusion of the Higher Himalayan Shear Zone (HHSZ) in the Zanskar and the Sutlej sections based on field and micro-structural studies and analytical and analog models. Further work on the Dhauliganga and the Goriganga sections of the HHSZ reveals complicated structural geology for which a simple channel flow explanation is untenable. For example, in the former section, flexure slip folds exist in a zone spatially separated from the upper strand of the South Tibetan Detachment System (STDSU). On the other hand, in the latter section, an STDSU (in the sense of Mukherjee and Koyi (1)) is absent. Instead, a steep extensional shear zone with a northeasterly dipping shear plane cuts the pre-existing shear fabrics throughout the HHSZ. However, the following common structural features were observed in these sections of the HHSZ. (1) S-C fabrics are the most ubiquitous ductile shear sense indicators in the field. (2) Brittle shearing along the preexisting ductile primary shear planes in a top-to-SW sense. (3) Less ubiquitous ductile compressional shearing in the upper part of the shear zone, including the STDSU. (4) A phase of local brittle-ductile extension throughout the shear zone, as revealed by boudins of various morphologies. (5) The shear zone is divisible into a southern non-migmatitic and a northern migmatitic zone. No special structural dissimilarity is observed across this lithological boundary. Keywords: Channel flow, Extrusion, Higher Himalaya, Structural Geology, Shear zone, Deformation References 1. Mukherjee S, Koyi HA (in press) Higher Himalayan Shear Zone, Sutlej section: structural geology

  10. An Alternative view of Earth's Tectonics : The Moon's explosive origin out of SE Asia.

    NASA Astrophysics Data System (ADS)

    Coleman, P. F.

    2017-12-01

    A lunar birth scar is typically considered untenable under the standard paradigm (GTS-4.6-0 Ga, Giant Impact/Plate Tectonics), since it would have been erased by a combination of Wilson recycling and erosion. This paradigm, while supported by robust absolute dating, is still provisional and, like all scientific paradigms, is open to refutation. It cannot, a priori, rule out such a scar. If empirical evidence were to be discovered in favor of a lunar birthmark, it would have profound implications for the standard view. Coleman (2015) proposed an alternative paradigm based on an internal explosion of Proto-Earth (PE) that ejected the Moon into orbit and left coeval global signatures from the high TP shock/seismic waves, such as ocean-continent antipodality, the global geoid, the origin of water, continents, trenches, fault lines, LIPs, hotspots, and seamount chains. The abrupt deceleration also led to inertial effects in PE's crustal layers, possibly explaining subduction/obduction and fold-and-thrust belts. One major, first-order line of evidence is the actual fission signature (4000+ km long) where the Moon was explosively thrust tangentially (to the core) through ductile mantle (see Fig B) to escape into orbit. The proposed path (locus of the Moon's center) is from (0°, 78.5°E) (Fig A), near present-day India, to (+14.4°, 119°E) out of SE Asia (see Fig C). Possible evidence in favor of this path includes (but is not limited to): the Indian Geoid Anomaly Low (the Moon's exhumation?), the Himalayas and Tibetan Plateau (generated by the Moon's NE collisional movement and temporary hole and mantle rebound), SE Asia with many minor plates and back-arc basins (the Moon's exit zone), and the East African Rifts (EARs), which form a NE-directed pull-apart region (explained as a set of explosive crustal fragments or "plates") moving towards this relic unconsolidated Asian sink hole (see Fig D). The existence of a fossilised lunar birth scar points to a recent Earth-Moon, since

  11. Methodological insights: fuzzy sets in medicine.

    PubMed

    Vineis, P

    2008-03-01

    In this paper I wish to introduce some ideas about scientific reasoning that have reached the epidemiological community only marginally. They have to do with how we classify things (diseases), and how we formulate hypotheses (causes). According to a simplified and currently untenable model, we come to define what a disease, or a proton or a chromosome, is by progressive simplification--that is, by extracting an essence from the individual characters of the disease. At the end of this inductive process a single element, which guarantees unequivocal inclusion in the category, is identified. This is what has been called the "Merkmal-definition" (Merkmal meaning distinctive sign)--that is, the definition of disease would be allowed by the isolation of a crucial property, a necessary and sufficient condition, which makes that disease unique (and a chair out of a chair, a proton out of a proton, etc). However, many objections have been raised by Wittgenstein, Eleanor Rosch and others to this idea: a Merkmal is not always identifiable, and more often a word is used to indicate not a homogeneous and unequivocal set of observations, but a confused constellation with blurred borders. This constellation has been called a fuzzy set and is at the basis of the semantic theory of metaphors proposed by MacCormac and the prototype theory proposed by Rosch. In this way the concept of disease, for example, abandons monothetic definitions, amenable to a necessary and sufficient characteristic, to become "polythetic." I explain how these concepts can help medicine and epidemiology clarify some open issues in the definition of disease and the identification of causes, through examples taken from oncology, psychiatry, cardiology and infectious diseases. The definition of a malignant tumour, for example, seems to correspond to the concept of "family resemblance," since there is no single criterion that allows us to define unequivocally the concept of cancer: not morphology (there are
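
    As a toy contrast (not drawn from the paper), a Merkmal-style definition is a single crisp predicate, whereas a fuzzy-set definition grades membership across several imperfect criteria. The criteria and weights below are invented purely for illustration.

```python
# Crisp (Merkmal-style) versus fuzzy-set classification, as a toy example.
def crisp_malignant(has_metastasis: bool) -> bool:
    # Merkmal-style: one necessary-and-sufficient criterion decides membership.
    return has_metastasis

def fuzzy_malignant(criteria: dict) -> float:
    # Fuzzy-set style: weighted degrees of membership in [0, 1]; no single
    # criterion is decisive. Weights and criteria are invented placeholders.
    weights = {"atypical_morphology": 0.3,
               "invasive_growth": 0.4,
               "high_mitotic_rate": 0.3}
    return sum(weights[k] * criteria.get(k, 0.0) for k in weights)

print(crisp_malignant(False))                           # crisp: simply "no"
print(fuzzy_malignant({"atypical_morphology": 0.8,
                       "invasive_growth": 0.2,
                       "high_mitotic_rate": 0.9}))      # 0.59: borderline member
```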

  12. Harnessing a methane-fueled, sediment-free mixed microbial community for utilization of distributed sources of natural gas.

    PubMed

    Marlow, Jeffrey J; Kumar, Amit; Enalls, Brandon C; Reynard, Linda M; Tuross, Noreen; Stephanopoulos, Gregory; Girguis, Peter

    2018-06-01

    Harnessing the metabolic potential of uncultured microbial communities is a compelling opportunity for the biotechnology industry, an approach that would vastly expand the portfolio of usable feedstocks. Methane is particularly promising because it is abundant and energy-rich, yet the most efficient methane-activating metabolic pathways involve mixed communities of anaerobic methanotrophic archaea and sulfate reducing bacteria. These communities oxidize methane at high catabolic efficiency and produce chemically reduced by-products at a comparable rate and in near-stoichiometric proportion to methane consumption. These reduced compounds can be used for feedstock and downstream chemical production, and at the production rates observed in situ they are an appealing, cost-effective prospect. Notably, the microbial constituents responsible for this bioconversion are most prominent in select deep-sea sediments, and while they can be kept active at surface pressures, they have not yet been cultured in the lab. In an industrial capacity, deep-sea sediments could be periodically recovered and replenished, but the associated technical challenges and substantial costs make this an untenable approach for full-scale operations. In this study, we present a novel method for incorporating methanotrophic communities into bioindustrial processes through abstraction onto low mass, easily transportable carbon cloth artificial substrates. Using Gulf of Mexico methane seep sediment as inoculum, optimal physicochemical parameters were established for methane-oxidizing, sulfide-generating mesocosm incubations. Metabolic activity required >∼40% seawater salinity, peaking at 100% salinity and 35 °C. Microbial communities were successfully transferred to a carbon cloth substrate, and rates of methane-dependent sulfide production increased more than threefold per unit volume. Phylogenetic analyses indicated that carbon cloth-based communities were substantially streamlined and were

  13. A novel nonparametric item response theory approach to measuring socioeconomic position: a comparison using household expenditure data from a Vietnam health survey, 2003

    PubMed Central

    2014-01-01

    Background Measures of household socio-economic position (SEP) are widely used in health research. There exist a number of approaches to their measurement, with Principal Components Analysis (PCA) applied to a basket of household assets being one of the most common. PCA, however, carries a number of assumptions about the distribution of the data which may be untenable, and alternative, non-parametric, approaches may be preferred. Mokken scale analysis is a non-parametric, item response theory approach to scale development which appears never to have been applied to household asset data. A Mokken scale can be used to rank order items (measures of wealth) as well as households. Using data on household asset ownership from a national sample of 4,154 consenting households in the World Health Survey from Vietnam, 2003, we construct two measures of household SEP. Seventeen items asking about assets, and utility and infrastructure use were used. Mokken Scaling and PCA were applied to the data. A single item measure of total household expenditure is used as a point of contrast. Results An 11 item scale, out of the 17 items, was identified that conformed to the assumptions of a Mokken Scale. All the items in the scale were identified as strong items (Hi > .5). Two PCA measures of SEP were developed as a point of contrast. One PCA measure was developed using all 17 available asset items, the other used the reduced set of 11 items identified in the Mokken scale analysis. The Mokken Scale measure of SEP and the 17 item PCA measure had a very high correlation (r = .98), and they both correlated moderately with total household expenditure: r = .59 and r = .57 respectively. In contrast the 11 item PCA measure correlated moderately with the Mokken scale (r = .68), and weakly with the total household expenditure (r = .18). Conclusion The Mokken scale measure of household SEP performed at least as well as PCA, and outperformed the PCA measure developed with
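
    The PCA comparator described here is easy to sketch: standardize the binary asset indicators and use the first principal component as the SEP index. The data below are synthetic stand-ins, not the Vietnam World Health Survey items.

```python
# Toy sketch of the common PCA asset-index approach the paper compares
# against; asset items are generated from an invented latent wealth variable.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
wealth = rng.normal(size=500)                    # latent SEP (unobserved)

# 11 binary asset items whose ownership probability rises with wealth.
cutoffs = np.linspace(-1.5, 1.5, 11)
assets = (wealth[:, None] + rng.normal(0, 0.8, size=(500, 11)) > cutoffs)
assets = assets.astype(float)

z = StandardScaler().fit_transform(assets)
sep_index = PCA(n_components=1).fit_transform(z).ravel()

# PCA component signs are arbitrary, so report the absolute correlation.
r = abs(np.corrcoef(sep_index, wealth)[0, 1])
print(f"|corr(PCA index, latent wealth)| = {r:.2f}")
```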

  14. The cosmological model of eternal inflation and the transition from chance to biological evolution in the history of life

    PubMed Central

    Koonin, Eugene V

    2007-01-01

    Background Recent developments in cosmology radically change the conception of the universe as well as the very notions of "probable" and "possible". The model of eternal inflation implies that all macroscopic histories permitted by laws of physics are repeated an infinite number of times in the infinite multiverse. In contrast to the traditional cosmological models of a single, finite universe, this worldview provides for the origin of an infinite number of complex systems by chance, even as the probability of complexity emerging in any given region of the multiverse is extremely low. This change in perspective has profound implications for the history of any phenomenon, and life on earth cannot be an exception. Hypothesis Origin of life is a chicken and egg problem: for biological evolution that is governed, primarily, by natural selection, to take off, efficient systems for replication and translation are required, but even barebones cores of these systems appear to be products of extensive selection. The currently favored (partial) solution is an RNA world without proteins in which replication is catalyzed by ribozymes and which serves as the cradle for the translation system. However, the RNA world faces its own hard problems as ribozyme-catalyzed RNA replication remains a hypothesis and the selective pressures behind the origin of translation remain mysterious. Eternal inflation offers a viable alternative that is untenable in a finite universe, i.e., that a coupled system of translation and replication emerged by chance, and became the breakthrough stage from which biological evolution, centered around Darwinian selection, took off. A corollary of this hypothesis is that an RNA world, as a diverse population of replicating RNA molecules, might have never existed. In this model, the stage for Darwinian selection is set by anthropic selection of complex systems that rarely but inevitably emerge by chance in the infinite universe (multiverse). Conclusion The

  15. The cosmological model of eternal inflation and the transition from chance to biological evolution in the history of life.

    PubMed

    Koonin, Eugene V

    2007-05-31

    Recent developments in cosmology radically change the conception of the universe as well as the very notions of "probable" and "possible". The model of eternal inflation implies that all macroscopic histories permitted by laws of physics are repeated an infinite number of times in the infinite multiverse. In contrast to the traditional cosmological models of a single, finite universe, this worldview provides for the origin of an infinite number of complex systems by chance, even as the probability of complexity emerging in any given region of the multiverse is extremely low. This change in perspective has profound implications for the history of any phenomenon, and life on earth cannot be an exception. Origin of life is a chicken and egg problem: for biological evolution that is governed, primarily, by natural selection, to take off, efficient systems for replication and translation are required, but even barebones cores of these systems appear to be products of extensive selection. The currently favored (partial) solution is an RNA world without proteins in which replication is catalyzed by ribozymes and which serves as the cradle for the translation system. However, the RNA world faces its own hard problems as ribozyme-catalyzed RNA replication remains a hypothesis and the selective pressures behind the origin of translation remain mysterious. Eternal inflation offers a viable alternative that is untenable in a finite universe, i.e., that a coupled system of translation and replication emerged by chance, and became the breakthrough stage from which biological evolution, centered around Darwinian selection, took off. A corollary of this hypothesis is that an RNA world, as a diverse population of replicating RNA molecules, might have never existed. In this model, the stage for Darwinian selection is set by anthropic selection of complex systems that rarely but inevitably emerge by chance in the infinite universe (multiverse). The plausibility of different models

  16. Optimal mask characterization by Surrogate Wafer Print (SWaP) method

    NASA Astrophysics Data System (ADS)

    Kimmel, Kurt R.; Hoellein, Ingo; Peters, Jan Hendrick; Ackmann, Paul; Connolly, Brid; West, Craig

    2008-10-01

    Traditionally, definition of mask specifications is done completely by the mask user, while characterization of the mask relative to the specifications is done completely by the mask maker. As the challenges of low-k1 imaging continue to grow in scope of designs and in absolute complexity, the inevitable partnership between wafer lithographers and mask makers has strengthened as well. This is reflected in jointly owned mask facilities and in device manufacturers' continued maintenance of fully captive mask shops, which foster closer mask-litho relationships. However, while some device manufacturers have leveraged this to optimize mask specifications before the mask is built and, therefore, to improve mask yield and cost, the opportunity for post-fabrication partnering on mask characterization is more apparent and compelling. The Advanced Mask Technology Center (AMTC) has been investigating the concept of assessing how a mask images, rather than the mask's physical attributes, as a technically superior and lower-cost method to characterize a mask. The idea of printing a mask under its intended imaging conditions, then characterizing the imaged wafer as a surrogate for traditional mask inspections and measurements, represents the ultimate method to characterize a mask's performance, which is most meaningful to the user. Surrogate wafer print (SWaP) is already done as part of leading-edge wafer fab mask qualification to validate defect and dimensional performance. In the past, the prospect of executing this concept has generally been summarily discarded as technically untenable and logistically intractable. The AMTC published a paper at BACUS 2007 successfully demonstrating the performance of SWaP for the characterization of defects as an alternative to traditional mask inspection [1]. It showed that this concept is not only feasible but, in some cases, desirable. This paper expands on last year's work at AMTC to assess the full implementation of SWaP as an

  17. Missing evidence for the LGM asynchrony in the Central Spanish Pyrenees in geomorphological, sedimentological and pedological archives

    NASA Astrophysics Data System (ADS)

    Hirsch, Florian; Raab, Thomas

    2016-04-01

    began after the sedimentation of the periglacial deposits, either implying a striking timeframe of more than 15 ka with a stable landscape without any pedogenesis, or the untenability of the MIS 3 age of the glacial sediments. Because we can clearly differentiate further phases of geomorphodynamics during the Holocene, with truncated soil profiles and the correlative sediments of soil erosion next to undisturbed soils in periglacial sediments of lateglacial age, we challenge the thesis of an asynchronous LGM in the Central Spanish Pyrenees and advocate a synchronous LGM in the Gallego and Aragon valleys, analogous to the Eastern Pyrenees.

  18. Reconciliation Ecology, Rewilding and the San Joaquin River Restoration

    NASA Astrophysics Data System (ADS)

    Kraus-Polk, A.

    2014-12-01

    Recent events, perhaps reaching their climactic convergence in the current drought, have exposed the fragility and imbalances of the socioecological system of the San Joaquin river. We see that our triumphant march of progress unfolds on a thin and unstable crust. What lies below is lava. Our agricultural systems progress only while extracting an increasingly untenable social and ecological debt. Our successive regimes of accumulation by appropriation have brought us to the brink of ecological exhaustion. Have we reached our day of reckoning? This is not the first time this question has been asked of this particular system of irrigated agriculture. "Insurmountable" ecological barriers have been eyed down and promptly obliterated through magnificent feats of physical and social engineering. But let us consider for a moment that we have at last reached some sort of edge, a threshold past which we experience a sudden socioecological regime shift. Staring out over this edge, can we begin to come to terms with the fallacies of our stories, our ignorance, our foolishness? We need an acknowledgement of the needs of the agricultural system, its connections and dependencies. What desperate measures are we willing to take in order to sustain this system? How much further can we go? How far is too far? Is there another way to produce and distribute food? We then turn to the past. We imagine the ecosystem as it once was: the pelagic fish species that formed the biological connection between this river system, the delta, the ocean, the mountains. What would it mean to restore this diversity and repair these relationships? What would it take to cede control to the non-human forces that sustain these connections? How do we reconcile restraint and the cessation of control with the human needs of the system? How do we rewild our river in such a way that our needs are met more resiliently and equitably? We will need systems of agriculture and flood control that serve

  19. Effective Risk Management in Innovative Projects: A Case Study of the Construction of Energy-efficient, Sustainable Building of the Laboratory of Intelligent Building in Cracow

    NASA Astrophysics Data System (ADS)

    Krechowicz, Maria

    2017-10-01

    Many construction projects fail to meet deadlines or exceed the assumed budget. This scenario is particularly common in the case of innovative projects, in which identifying too late a high risk of delays and cost overruns makes a potentially profitable project untenable. A high risk level, far exceeding the level of risk in standard non-innovative projects, is a characteristic feature of the realization phase of innovative projects. This is associated not only with the greater complexity of the design and construction phases, but also with problems in applying new technologies and prototype solutions, a lack of qualified personnel with suitable expertise in specialized areas, and difficulty in properly identifying the gaps between available and required knowledge and skills. This paper discusses the process of effective risk management in innovative projects using the example of the realization phase of an innovative, energy-efficient and sustainable building of the Laboratory of Intelligent Building in Cracow - DLJM Lab, from the point of view of DORBUD S.A., its general contractor. In this paper, a new approach to the risk management process for innovative construction projects is proposed. The risk management process was divided into five stages: gathering information, identification of the important unwanted events, first risk assessment, development and choice of risk reaction strategies, and assessment of the residual risk after introducing risk reactions. 18 unwanted events in an innovative construction project were identified. The first risk assessment was carried out using a two-parameter risk matrix, in which the probability of unwanted event occurrence and its consequences were analysed. Three levels of risk were defined: tolerable, controlled and uncontrolled. Risk reactions to each defined unwanted event were developed. The following risk reaction types were considered: risk retention, risk reduction, risk transfer and risk
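
    A two-parameter risk matrix of the kind described can be sketched in a few lines: a probability score and a consequence score map each unwanted event to one of the three levels named in the abstract. The score scales, thresholds and example events below are assumptions, not the paper's actual matrix or its 18 events.

```python
# Hedged sketch of a two-parameter (probability x consequence) risk matrix
# with the three risk levels named in the abstract; thresholds are invented.
def risk_level(probability: int, consequence: int) -> str:
    """Classify risk from 1-5 probability and 1-5 consequence scores."""
    score = probability * consequence       # simple multiplicative matrix
    if score <= 6:
        return "tolerable"
    if score <= 14:
        return "controlled"
    return "uncontrolled"

# Illustrative unwanted events (invented, not the paper's identified events).
events = {
    "prototype HVAC integration fails": (3, 5),
    "specialist subcontractor unavailable": (2, 4),
    "material delivery delayed": (4, 2),
}
for name, (p, c) in events.items():
    print(f"{name}: {risk_level(p, c)}")
```

    Risk reactions (retention, reduction, transfer) would then be chosen per event and the matrix re-evaluated to assess residual risk, mirroring the five-stage process the paper describes.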

  20. A novel nonparametric item response theory approach to measuring socioeconomic position: a comparison using household expenditure data from a Vietnam health survey, 2003.

    PubMed

    Reidpath, Daniel D; Ahmadi, Keivan

    2014-01-01

    Measures of household socio-economic position (SEP) are widely used in health research. There exist a number of approaches to their measurement, with Principal Components Analysis (PCA) applied to a basket of household assets being one of the most common. PCA, however, carries a number of assumptions about the distribution of the data which may be untenable, and alternative, non-parametric, approaches may be preferred. Mokken scale analysis is a non-parametric, item response theory approach to scale development which appears never to have been applied to household asset data. A Mokken scale can be used to rank order items (measures of wealth) as well as households. Using data on household asset ownership from a national sample of 4,154 consenting households in the World Health Survey from Vietnam, 2003, we construct two measures of household SEP. Seventeen items asking about assets, and utility and infrastructure use were used. Mokken Scaling and PCA were applied to the data. A single item measure of total household expenditure is used as a point of contrast. An 11 item scale, out of the 17 items, was identified that conformed to the assumptions of a Mokken Scale. All the items in the scale were identified as strong items (Hi > .5). Two PCA measures of SEP were developed as a point of contrast. One PCA measure was developed using all 17 available asset items, the other used the reduced set of 11 items identified in the Mokken scale analysis. The Mokken Scale measure of SEP and the 17 item PCA measure had a very high correlation (r = .98), and they both correlated moderately with total household expenditure: r = .59 and r = .57 respectively. In contrast the 11 item PCA measure correlated moderately with the Mokken scale (r = .68), and weakly with the total household expenditure (r = .18). The Mokken scale measure of household SEP performed at least as well as PCA, and outperformed the PCA measure developed with the 11 items used in the

  1. Promise and problems in using stress triggering models for time-dependent earthquake hazard assessment

    NASA Astrophysics Data System (ADS)

    Cocco, M.

    2001-12-01

    the stressing history perturbing the faults (such as dynamic stress changes, or post-seismic stress changes caused by viscoelastic relaxation or fluid flow). If, for instance, we believe that dynamic stress changes can trigger aftershocks or earthquakes years after the passing of the seismic waves through the fault, the prospect of calculating interaction probability is untenable. It is therefore clear that we have learned a great deal about earthquake interaction by incorporating fault constitutive properties, resolving existing controversies but leaving open questions for future research.

  2. Risk Management of Jettisoned Objects in LEO

    NASA Technical Reports Server (NTRS)

    Bacon, John B.; Gray, Charles

    2011-01-01

    accepted. Although ISS-related debris often presents untenable risks to the EVA crew, IVA crew, or to a departing cargo vehicle for a controlled disposal, such released objects also present a ballistic nuisance to the visiting vehicle traffic, and a potential fragmentation threat to the hundreds of other functional and debris objects whose perigees lie below the ISS orbital altitude. Thus, every such jettison decision is a conscious risk trade.

  3. How do Kakortokites form? Additional evidence from the Ilimaussaq Complex, S. Greenland

    NASA Astrophysics Data System (ADS)

    Hunt, E. J.; Finch, A. A.; Donaldson, C. H.

    2012-04-01

    The Ilímaussaq Complex, South Greenland, contains some of the most evolved igneous rocks in the world and is widely considered to represent one of the largest deposits of rare-earth elements, Ta, Nb and Zr. Our work is focused on the kakortokite layered series at the base of the complex. The layered series is composed of 29 repetitive 3-layer units (named -11 to +17, Bohse et al. 1971), successively enriched in arfvedsonite, eudialyte and nepheline. Despite a large body of work on the development of the kakortokite series, no consensus has been forthcoming on the process or processes that produced the layering. We present the preliminary findings of a combined petrographical, quantitative textural and geochemical analysis of the kakortokite series, initially focused on layer 0. Although many of the hypotheses for the formation of these rocks invoke a pressure change, the enrichment of the series in volatile constituents (CH4 and H2; Konnerup-Madsen, 2001) has led many authors to suggest that crystallisation occurred in a closed system, with processes of gravitational settling forming the layering. Crystal size distribution (CSD) analysis, performed on hand-digitised photomicrographs, provides insight into processes of crystal nucleation and growth. The results indicate that simple cumulate settling is untenable for layer 0. Instead, the plot gradients indicate that the arfvedsonite in the black kakortokite crystallised in situ above a sharp boundary with the white kakortokite. The CSD plots for the alkali feldspars indicate that secondary nucleation occurred, with the small crystal size fraction forming in situ. The feldspar phenocrysts also exhibit embayment textures indicating partial resorption. These graphs are consistent with a model whereby an influx of hotter magma results in the partial thermal erosion of the underlying white kakortokite, followed by in situ crystallisation of arfvedsonite above the melt infiltration boundary, followed by in situ crystallisation of
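    For readers unfamiliar with CSD analysis: the method bins crystal sizes measured from digitised photomicrographs, plots the natural log of population density against size, and reads process information from the shape of the curve (a straight line is consistent with steady in situ nucleation and growth; kinks suggest secondary nucleation, as reported here for the feldspars). A rough Python sketch of the core computation, ignoring the stereological corrections a full CSD study would apply:

        import numpy as np

        def csd(sizes, area, bins=10):
            """Crude crystal size distribution from 2D crystal sizes (mm)
            measured over a thin-section area (mm^2)."""
            counts, edges = np.histogram(sizes, bins=bins)
            mids = 0.5 * (edges[:-1] + edges[1:])
            n = counts / (np.diff(edges) * area)   # population density per size bin
            keep = counts > 0                      # skip empty bins before taking logs
            slope, intercept = np.polyfit(mids[keep], np.log(n[keep]), 1)
            # For a linear ln(n)-vs-L plot, slope = -1/(G*tau): growth rate x residence time
            return mids, n, slope, intercept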

  4. Anomalies in Trace Metal and Rare-Earth Loads below a Waste-Water Treatment Plant

    NASA Astrophysics Data System (ADS)

    Antweiler, R.; Writer, J. H.; Murphy, S.

    2013-12-01

    , it seems untenable as a hypothesis to suppose that the stream bed material can permanently supply the source of the in-stream load increases of a large group of inorganic elements. We propose that the anomalous increase in loads was more a function of the time of sampling (both diurnal and seasonal) and that sampling at different times of day or in different seasons would give results contradictory to those seen here. If so, inorganic loading studies must include multiple sampling, both over the course of a day and during different seasons and flow regimes.

  5. [Religion, morality and politics: the abortion debate].

    PubMed

    Ladriere, P

    1982-01-01

    defined social and family policy. Issues raised in the testimony of representatives of Protestant groups included the idea that each person is responsible for interpreting the will of God in complex situations, limits to the idea that life is a blessing of God, the right of women and couples to control their fertility, and abortion as a last resort. The Protestant position in favor of liberalization of the law held that existing repressive laws were untenable given the perils of illegal abortions and the fundamental modifications in relations between man and nature brought about by science. The Protestant church, a minority in France, took a more active role than the Catholic in suggesting specific legislation.

  6. Interventional cardiology, where real life and science do not necessarily meet.

    PubMed

    Meier, Bernhard

    2016-07-07

    Evidence-based diagnosis, decision-making, and therapy appear a must these days. Generating and publishing evidence is a tedious job under ever newer and tighter research practice regulations. Rules will never prevent the typical human tendency to show the new thing as shinier, and the old thing as dustier, than they really are. The medical community is solicited to concoct a meal that is palatable for patients, authorities, and third-party payers out of the available evidence (after applying some conversion factors correcting the common bias of the researchers), anticipation of what the evidence will be tomorrow, common sense, and digested experience. Examples of misguidance by poorly produced or misinterpreted evidence are plentiful in interventional cardiology, as they are in other disciplines. Coronary stents, for instance, were first underestimated because they were generally used in bailout situations where the outcome remained rather dismal in spite of the salvaging potential of stents. Then they were overused quite uncritically, rather to the detriment of the patient. Now, with the high quality of modern drug-eluting stents (DESs), the overuse persists but is no longer a concern. However, the enhanced potential of DESs compared with bare-metal stents was poorly exploited for >10 years because of reports that slipped through the meshes of good review and publication practice to convey the untenable message that bare-metal stents were preferable in many situations. As another example, use of the fractional flow reserve (FFR) for decision-making has to be questioned despite prominently published reports recommending it: fixing a lesion is today easier, and hardly more complication-prone, than assessing it with the FFR. Closure of the patent foramen ovale may never be properly applied, because the collection of the understandably requested evidence takes decades, a follow-up duration that makes research unattractive to physicians and

  7. Validity and Reliability of a Portable Balance Tracking System, BTrackS, in Older Adults.

    PubMed

    Levy, Susan S; Thralls, Katie J; Kviatkovsky, Shiloah A

    Falls are the leading cause of disability, injury, hospital admission, and injury-related death among older adults. Balance limitations have consistently been identified as predictors of falls and increased fall risk. Field measures of balance are limited by issues of subjectivity, ceiling effects, and low sensitivity to change. The gold standard for measuring balance is the force plate; however, its field use is untenable due to high cost and lack of portability. Thus, there is a critical need for valid, objective field measures of balance to accurately assess balance and identify limitations over time. The purpose of this study was to examine the concurrent validity and 3-day test-retest reliability of the Balance Tracking System (BTrackS) in community-dwelling older adults. Minimal detectable change values were also calculated to reflect changes in balance beyond measurement error. Postural sway data were collected from community-dwelling older adults (N = 49, mean [SD] age = 71.3 [7.3] years) with a force plate and BTrackS in multitrial eyes-open (EO) and eyes-closed (EC) static balance conditions. Force sensors transmitted BTrackS data via USB to a computer running custom software. Three approaches to concurrent validity were taken, including calculation of Pearson product-moment correlation coefficients, repeated-measures ANOVAs, and Bland-Altman plots. Three-day test-retest reliability of BTrackS was examined in a second sample of 47 community-dwelling older adults (mean [SD] age = 75.8 [7.7] years) using intraclass correlation coefficients, and minimal detectable change values at the 95% CI (MDC95) were calculated. BTrackS demonstrated good validity, with Pearson product-moment correlations r > 0.90. Repeated-measures ANOVA and Bland-Altman plots indicated some BTrackS bias, with center of pressure (COP) values higher than force plate COP values in the EO (mean [SD] bias = 4.0 [6.8]) and EC (mean [SD] bias = 9.6 [12.3]) conditions. Test-retest reliability using intraclass correlation
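    The reliability statistics reported here follow standard formulas and are easy to reproduce. A small sketch, assuming `day1` and `day2` are paired arrays of COP path lengths from the two sessions and `icc` has already been estimated elsewhere; SEM = SD*sqrt(1 - ICC) and MDC95 = 1.96*sqrt(2)*SEM are the conventional definitions:

        import numpy as np

        def bland_altman(a, b):
            """Bias and 95% limits of agreement between two instruments."""
            d = np.asarray(a) - np.asarray(b)
            bias = d.mean()
            half_width = 1.96 * d.std(ddof=1)
            return bias, (bias - half_width, bias + half_width)

        def mdc95(day1, day2, icc):
            """Minimal detectable change at the 95% confidence level."""
            scores = np.concatenate([day1, day2])
            sem = scores.std(ddof=1) * np.sqrt(1 - icc)   # standard error of measurement
            return 1.96 * np.sqrt(2) * sem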

  8. Strain effects on thermal conductivity of nanostructured silicon by Raman piezothermography

    NASA Astrophysics Data System (ADS)

    Murphy, Kathryn Fay

    , using the Raman laser as a heat source and the Raman spectrum as a measure of temperature, determine thermal transport properties. We show that uniaxial strain up to ~1% has a weak effect on Si nanowire or thin film thermal conductivity, but irradiation-induced defects in nanowires yield dramatic reductions due to increased phonon scattering. Such defects are accompanied by large strain gradients, but decoupling the effect of these gradients from local changes in mass and interatomic potential is experimentally untenable. To isolate the effect of strain gradients, we extend our method to Si micromeshes, which exhibit nonuniform strains upon loading. The complex strain states achieved cause more drastic reductions of thermal conductivity due to enhanced phonon-phonon scattering in the presence of a strain gradient. The directions suggested by our experiments, as well as the development of the method, will allow for more robust understanding and control of thermal transport in nanostructures.

  9. Water Planning and Climate Change: Actionable Intelligence Yet?

    NASA Astrophysics Data System (ADS)

    Milly, P.

    2008-05-01

    Within a rational planning framework, water planners design major water projects by evaluating tradeoffs of costs, benefits, and risks to life and property. The evaluation is based on anticipated future runoff and streamflow. Generally, planners have invoked the stationarity approximation: they have assumed that hydrologic conditions during the planned lifetime of a project will be similar to those observed in the past. Contemporary anthropogenic climate change arguably makes stationarity untenable. In principle, stationarity-based planning under non-stationarity leads to incorrect assessment of tradeoffs, sub-optimal decisions, and excessive financial and environmental costs (e.g., a reservoir that is too big ever to be filled) and/or insufficient benefits (e.g., levees that are too small to hold back the flood waters). As the reigning default assumption for planning, stationarity is an easy target for criticism; provision of a practical alternative is not so easy. The leading alternative, the use of quantitative climate-change projections from global climate models in conjunction with water planners' river-basin models, has serious shortcomings of its own. Climate models (1) neglect some terrestrial processes known to influence runoff and streamflow; (2) do not represent precipitation well at the finer resolved time and space scales; (3) do not resolve any processes at the even finer spatial scales of relevance to much of water planning; and (4) disagree among themselves about some changes. Even setting aside the issue of scale mismatch, for which various "downscaling" methods have been proposed, outputs from climate models generally are not directly transferable to river-basin models, and river-basin models commonly use empiricisms whose historical validity might not extrapolate well under climate change. So climate science is informing water management that stationarity is a flawed assumption, but it has not presented a universally and reliably superior

  10. Real-time 4D ERT monitoring of river water intrusion into a former nuclear disposal site using a transient warping-mesh water table boundary (Invited)

    NASA Astrophysics Data System (ADS)

    Johnson, T.; Hammond, G. E.; Versteeg, R. J.; Zachara, J. M.

    2013-12-01

    The Hanford 300 Area, located adjacent to the Columbia River in south-central Washington, USA, is the site of former research and uranium fuel rod fabrication facilities. Waste disposal practices at the site included discharging between 33 and 59 metric tons of uranium over a 40-year period into shallow infiltration galleries, resulting in persistent uranium contamination within the vadose and saturated zones. Uranium transport from the vadose zone to the saturated zone is intimately linked with water table fluctuations and river water intrusion driven by upstream dam operations. As river stage increases, the water table rises into the vadose zone and mobilizes contaminated pore water. At the same time, river water moves inland into the aquifer, and river water chemistry facilitates further mobilization by enabling uranium desorption from contaminated sediments. As river stage decreases, flow moves toward the river, ultimately discharging contaminated water at the river bed. River water specific conductance at the 300 Area varies around 0.018 S/m whereas groundwater specific conductance varies around 0.043 S/m. This contrast provides the opportunity to monitor groundwater/river water interaction by imaging changes in bulk conductivity within the saturated zone using time-lapse electrical resistivity tomography. Previous efforts have demonstrated this capability, but have also shown that disconnecting regularization constraints at the water table is critical for obtaining meaningful time-lapse images. Because the water table moves with time, the regularization constraints must also be transient to accommodate the water table boundary. This was previously accomplished with 2D time-lapse ERT imaging by using a finely discretized computational mesh within the water table interval, enabling a relatively smooth water table to be defined without modifying the mesh. However, in 3D this approach requires a computational mesh with an untenable number of elements. In order to
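    The central trick described, disconnecting regularization at the time-varying water table, can be sketched independently of any particular inversion code: when assembling first-difference smoothness equations, drop every equation that couples a cell above the current water table to one below it. A schematic Python sketch; the cell bookkeeping is hypothetical and not the API of the software actually used:

        import numpy as np
        from scipy.sparse import coo_matrix

        def smoothness_matrix(n_cells, neighbor_pairs, above_water_table):
            """First-difference regularization, disconnected across the water table.

            neighbor_pairs    : iterable of (i, j) adjacent-cell index pairs
            above_water_table : boolean array per cell, re-evaluated at each
                                time step as the water table moves
            """
            rows, cols, vals = [], [], []
            k = 0
            for i, j in neighbor_pairs:
                if above_water_table[i] != above_water_table[j]:
                    continue                     # no smoothing across the boundary
                rows += [k, k]
                cols += [i, j]
                vals += [1.0, -1.0]
                k += 1
            return coo_matrix((vals, (rows, cols)), shape=(k, n_cells))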

  11. On the Principles of Building a Layered Intrusion

    NASA Astrophysics Data System (ADS)

    Marsh, B. D.

    2009-12-01

    An accurate and realistic understanding of all magmatic processes involves knowing the combined physical and chemical fundamentals governing the overall process. Magmatic processes involve such a vast array of sub-processes (e.g., heat and mass transfer, crystal growth, slurry transport and sorting, annealing, resorption, etc.) that rarely is there any single feature or measurement that can be safely inverted to solve the problem. And each event, such as the formation of an intrusion, must at some level be treated, for heuristic purposes, as an isolated event. This is commonly done without much forethought, as is the absolutely critical assumption of the initial conditions defining the beginning of the event. Almost without exception, it is the initial conditions that determine the outcome of the entire process in all physical and biological systems. Automobile factories produce motorized vehicles, not watermelons or chimpanzees. Nucleosynthesis of H and He always gives the same set of elements. The initial conditions of the magma giving rise to the end product for mafic layered systems are especially difficult to discern and must be bounded by observing simpler, real-time magmatic and volcanic processes. Initial conditions come from posing a series of questions: What was the style and duration of filling? What was the rate of influx and the final volume of each delivery of magma? What was the compositional variation and phenocryst content of the individual magmatic deliveries? If phenocrysts were present, were they sorted prior to injection during ascent? What was the original and ongoing shape of the magmatic reservoir? A failure to appreciate or answer such basic questions leads to vastly untenable evolutionary scenarios. Unrealistic initial conditions necessarily lead to unrealistic magmatic scenarios. There are certain safe starting points. Eruptive and emplacement fluxes are limited. The larger an intrusion is the longer it took to build and the longer to build the

  12. Gravity is the Key Experiment to Address the Habitability of the Ocean in Jupiter's Moon Europa

    NASA Astrophysics Data System (ADS)

    Sessa, A. M.; Dombard, A. J.

    2013-12-01

    Life requires three constituents: a liquid solvent (i.e., water); a chemical system that can form large molecules to record genetic information (e.g., carbon-based), as well as chemical nutrients (e.g., nitrogen, phosphorus); and a chemical disequilibrium system that can provide metabolic energy. While it is believed that there is a saline water layer located between the rock and ice layers of Jupiter's moon Europa, which would satisfy the first requirement, it is unknown whether the other conditions are currently met. The likelihood that Europa is a haven for life in our Solar System skyrockets, however, if there is currently active volcanism at the rock-water interface, much as volcanic processes enable the chemosynthetic life that forms the basis of deep-sea-vent communities at the bottom of Earth's oceans. Exploring the volcanic activity on this interface is challenging, as direct observation via a submersible, or high-resolution indirect observation via a dense global seismic network on the surface, is at present technically (and fiscally!) untenable. Thus, gravity studies are currently the best way to explore the structure of this all-important interface. Though mostly a silicate body with only a relatively thin (~100 km) layer of water, Europa is different from the terrestrial planets in that this rock-water interface, and not the surface, represents the largest density contrast across the moon's near-surface layers, and thus topography on this interface could conceivably dominate the gravity. Here, we calculate the potential anomalies that arise from topography on the surface, the water-ice interface (at 20 km depth), and the rock-water interface, finding that the latter dominates the free-air gravity at the longest wavelengths (spherical harmonic degrees < 10) and the Bouguer gravity at intermediate wavelengths (degrees ~10-50), and only for the shortest wavelengths (degrees > 50) does the water-ice interface (and presumably mass-density anomalies
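    The degree-dependent ranking of the three interfaces can be illustrated with the standard attenuation rule for gravity from relief on a buried density interface: at spherical-harmonic degree l, a contrast of density Δρ at depth d contributes in proportion to Δρ·((R−d)/R)^(l+2). A back-of-envelope Python sketch with rounded layer densities (assumed values for illustration, not the authors' numbers):

        # Relative per-degree gravity weight of topography on each Europan interface
        R = 1560.8e3                                 # Europa's mean radius (m)
        interfaces = {                               # name: (depth m, density contrast kg/m^3)
            "surface (ice-vacuum)": (0.0, 920.0),
            "water-ice (20 km)": (20e3, 80.0),       # ~1000 water vs ~920 ice
            "rock-water (100 km)": (100e3, 2000.0),  # ~3000 rock vs ~1000 water
        }
        for l in (5, 30, 80):                        # long, intermediate, short wavelengths
            weights = {name: drho * ((R - d) / R) ** (l + 2)
                       for name, (d, drho) in interfaces.items()}
            print(l, max(weights, key=weights.get),
                  {name: round(w) for name, w in weights.items()})

    With these rough numbers, the deep rock-water contrast dominates at low degrees; its burial attenuates it fastest with increasing degree, so once the surface term is removed (Bouguer correction) the rock-water interface still leads at intermediate degrees, and the shallow water-ice interface takes over only at the shortest wavelengths, consistent with the degree ranges quoted above.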

  13. PSF Rotation with Changing Defocus and Applications to 3D Imaging for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Prasad, S.; Kumar, R.

    2013-09-01

    For a clear, well-corrected imaging aperture in space, the point-spread function (PSF) in its Gaussian image plane has the conventional, diffraction-limited, tightly focused Airy form. Away from that plane, however, the PSF broadens rapidly, resulting in a loss of sensitivity and transverse resolution that makes such a traditional best-optics approach untenable for rapid 3D image acquisition. One must scan in focus to maintain high sensitivity and resolution as one acquires image data, slice by slice, from a 3D volume with reduced efficiency. In this paper we describe a computational-imaging approach to overcome this limitation, one that uses pupil-phase engineering to fashion a PSF that, although not as tight as the Airy spot, maintains its shape and size while rotating uniformly with changing defocus over many waves of defocus phase at the pupil edge. As one of us has shown recently [1], the subdivision of a circular pupil aperture into M Fresnel zones, with the mth zone having an outer radius proportional to the square root of m and impressing a spiral phase profile of the form mφ on the light wave, where φ is the azimuthal angle coordinate measured from a fixed x axis (the dislocation line), yields a PSF that rotates with defocus while keeping its shape and size. Physically speaking, a nonzero defocus of a point source means a quadratic optical phase in the pupil that, because of the square-root dependence of the zone radius on the zone number, increases on average by the same amount from one zone to the next. This uniformly incrementing phase yields, in effect, a rotation of the dislocation line, and thus a rotated PSF. Since the zone-to-zone phase increment depends linearly on defocus to first order, the PSF rotates uniformly with changing defocus. For an M-zone pupil, a complete rotation of the PSF occurs when the defocus-induced phase at the pupil edge changes by M waves. Our recent simulations of reconstructions from image data for 3D image scenes comprised of point sources at
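    A minimal numerical sketch of the zone construction just described (unit pupil radius, M spiral Fresnel zones, quadratic defocus phase), assuming a simple FFT model of image formation; the parameter choices are illustrative:

        import numpy as np

        def rotating_psf(N=256, M=9, defocus_waves=0.0):
            """PSF of a pupil whose mth Fresnel zone (outer radius ~ sqrt(m/M))
            carries a spiral phase m*phi; the PSF keeps its shape and rotates
            with defocus, completing one turn per M waves of edge defocus."""
            y, x = np.mgrid[-1:1:N * 1j, -1:1:N * 1j]
            r, phi = np.hypot(x, y), np.arctan2(y, x)
            pupil = np.zeros((N, N), dtype=complex)
            for m in range(1, M + 1):
                zone = (r >= np.sqrt((m - 1) / M)) & (r < np.sqrt(m / M))
                pupil[zone] = np.exp(1j * m * phi[zone])
            pupil *= np.exp(2j * np.pi * defocus_waves * r ** 2)  # quadratic defocus
            psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2
            return psf / psf.sum()

        # Comparing rotating_psf(defocus_waves=0.0) with rotating_psf(defocus_waves=1.0)
        # shows the same PSF shape rotated by roughly 1/M of a full turn.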

  14. The Source of Proterozoic Anorthosites: Bringing It All Back Home

    NASA Astrophysics Data System (ADS)

    Scoates, J. S.

    2004-05-01

    plot along the plag+2-px cotectic at high pressures. The important thermal divide for the petrogenesis of Proterozoic anorthosites is the plag+olivine+cpx divide as it separates opx-absent from opx-present fractionation trends at mid-crustal pressures. The least fractionated ol-normative compositions project into the region of mantle-derived melts at relatively high pressures (1-1.3 GPa). Radiogenic isotopic studies (Pb, Nd, Sr, Os) are particularly useful for constraining crustal input to anorthosite and have successfully traced out different-aged crustal reservoirs beneath them, especially when the underlying crust is 1 byr or more older than the anorthosites (e.g. Nain). Os isotopic studies do not effectively constrain the source of Proterozoic anorthosites, but rather yield important information about additions of crustal sulfur to ascending and slowly-cooling anorthosite bodies. Although a lower crustal tongue melting origin for Proterozoic anorthosites is clearly untenable, it is likely that no magma associated with Proterozoic anorthosites escaped contamination during ascent through the crust. The lower crust may have acted as a highly effective near-solidus "reactive filter" capable of stabilizing plagioclase as a liquidus phase for the duration of these long-lived (tens of millions of years for the largest suites), low magma flux magmatic systems. Combined low magma productivity and flux are consistent with only small amounts of crustal extension implicating the compositionally heterogeneous continental lithospheric mantle as the dominant source component for Proterozoic anorthosites.

  15. Advancing Understanding of the Role of Belowground Processes in Terrestrial Carbon Sinks through Ground-Penetrating Radar. Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Day, Frank P.

    2015-02-06

    Coarse roots play a significant role in belowground carbon cycling and will likely play an increasingly crucial role in belowground carbon sequestration as atmospheric CO2 levels continue to rise, yet they are one of the most difficult ecosystem parameters to quantify. Despite promising results with ground-penetrating radar (GPR) as a nondestructive method of quantifying biomass of coarse roots, this application of GPR is in its infancy and neither the complete potential nor limitations of the technology have been fully evaluated. The primary goals and questions of this study fell into four groups: (1) GPR methods: Can GPR detect change in root biomass over time, differentiate live roots from dead roots, differentiate between coarse roots, fine roots bundled together, and a fine root mat, remain effective with varied soil moisture, and detect shadowed roots (roots hidden below larger roots)? (2) CO2 enrichment study at Kennedy Space Center in Brevard County, Florida: Are there post-fire legacy effects of CO2 fertilization on plant carbon pools following the end of CO2 application? (3) Disney Wilderness Study: What is the overall coarse root biomass and potential for belowground carbon storage in a restored longleaf pine flatwoods system? Can GPR effectively quantify coarse roots in soils that are wetter than the previous sites and that have a high percentage of saw palmetto rhizomes present? (4) Can GPR accurately represent root architecture in a three-dimensional model? When the user is familiar with the equipment and software in a setting that minimizes unsuitable conditions, GPR is a relatively precise, non-destructive, useful tool for estimating coarse root biomass. However, there are a number of cautions and guidelines that should be followed to minimize inaccuracies or situations that are untenable for GPR use. GPR appears to be precise as it routinely predicts highly similar values for a given area across multiple scanning events; however, it

  16. Light for the quantum. Entangled photons and their applications: a very personal perspective

    NASA Astrophysics Data System (ADS)

    Zeilinger, Anton

    2017-07-01

    The quantum physics of light is a most fascinating field. Here I present a very personal viewpoint, focusing on my own path to quantum entanglement and then on to applications. I have been fascinated by quantum physics ever since I heard about it for the first time in school. The theory struck me immediately for two reasons: (1) its immense mathematical beauty, and (2) the unparalleled precision to which its predictions have been verified again and again. Particularly fascinating for me were the predictions of quantum mechanics for individual particles, individual quantum systems. Surprisingly, the experimental realization of many of these fundamental phenomena has led to novel ideas for applications. Starting from my early experiments with neutrons, I later became interested in quantum entanglement, initially focusing on multi-particle entanglement like GHZ states. This work opened the experimental possibility of doing quantum teleportation and quantum hyper-dense coding. The latter became the first entanglement-based quantum experiment breaking a classical limitation. One of the most fascinating phenomena is entanglement swapping, the teleportation of an entangled state. This phenomenon is fundamentally interesting because it can entangle two pairs of particles which do not share any common past. Surprisingly, it also became an important ingredient in a number of applications, including quantum repeaters which will connect future quantum computers with each other. Another application is entanglement-based quantum cryptography, where I present some recent long-distance experiments. Entanglement swapping has also been applied in very recent so-called loophole-free tests of Bell's theorem. Within the physics community such loophole-free experiments are perceived as providing nearly definitive proof that local realism is untenable. While, as a matter of principle, local realism can never be excluded entirely, the 2015 achievements narrow down the remaining possibilities for

  17. Arrangement of Convection in the Earth by Lunar Gravity, II: Geotectonics Under a Minute Westward Tilt, With TPW

    NASA Astrophysics Data System (ADS)

    Bostrom, R. C.

    2003-12-01

    G. Darwin's lunar retarding torque is orders of magnitude too small to cause lateral motion in a viscous passive Earth [1]. Nevertheless, plate-motion data suggesting an apparent net lithosphere rotation continue to accumulate, confirming that, given convection under gravity, the convection can scarcely be immune to an asymmetrical field component. Investigative obstacles have lain in establishing an ITRF tying surface benchmarks to Earth's interior, and a dynamics quantitatively capable of shaping the convection. By delimiting the lunar orbital expansion (irrespective of whether it is due to marine or body-tide dissipation, or to yield under convection itself), LLR [2] delimits the secular, whole-Earth, day-averaged field under which mantle convection takes place. Thus a derived value of 600 seconds for the luni-tidal interval indicates that masses not reaching equilibrium add to the secular field a component tilted by arcsin[(600 s)/(25 h 24 min)] = 0.38 degrees (relative to symmetrical standard g, the latter pertinent only to an isolated Earth). The derived value also delimits the dissipation, and accords with the increase in length of day and Earth/Moon astronomic history. Conversely, were g_tot not minutely west-tilted, a couple would not exist, and hence the Earth-Moon distance would not increase. The assumption that the convection develops under a symmetrical tensor field g in strict accordance with NNR, neglecting the tilt inherent in observed tidal components, is thermodynamically untenable. Convection at all scales must be to some extent asymmetrical. How to assess the effect in a heterogeneous Earth of a system so minute, but operative throughout geological time? Plate motion and ocean development combined with paleomagnetically established TPW [3,4,5,6] display the following: during Mesozoic times until ~110 Ma the pole was located at 'quasi-still-stand' in extreme NE Siberia (present coordinates); the regime of convection then operative resulted in the birth of the North Atlantic, under NW-SE extension. Associated with a
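    The quoted tilt follows directly from the stated numbers; a one-line check:

        import math
        # arcsin( 600 s / (25 h 24 min) ), with 25*3600 + 24*60 = 91440 s
        print(round(math.degrees(math.asin(600.0 / 91440.0)), 2))  # 0.38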

  18. In vivo evidence for free radical involvement in the degeneration of rat brain 5-HT following administration of MDMA (‘ecstasy') and p-chloroamphetamine but not the degeneration following fenfluramine

    PubMed Central

    Colado, M I; O'Shea, E; Granados, R; Murray, T K; Green, A R

    1997-01-01

    been damaged by the prior fenfluramine injection. Administration of the free radical scavenging agent α-phenyl-N-tert-butyl nitrone (PBN; 120 mg kg−1, i.p.) 10 min before and 120 min after an MDMA (15 mg kg−1, i.p.) injection prevented the acute rise in the 2,3-DHBA concentration in the dialysate and attenuated by 30% the long-term damage to hippocampal 5-HT neurones (as indicated by a smaller MDMA-induced decrease in both the concentration of 5-HT and 5-HIAA and also the binding of [3H]-paroxetine). These data indicate that a major mechanism by which MDMA and PCA induce damage to 5-hydroxytryptaminergic neurones in rat brain is by increasing the formation of free radicals. These probably result from the degradation of catechol and quinone metabolites of these substituted amphetamines. In contrast, fenfluramine induces damage by another mechanism not involving free radicals, a proposal supported by some of our earlier indirect studies. We suggest that these different modes of action render untenable the recent suggestion that MDMA will not be neurotoxic in humans because fenfluramine appears safe at clinical doses. PMID:9222545

  19. Task Force 1. Report of the Task Force on Patient Expectations, Core Values, Reintegration, and the New Model of Family Medicine

    PubMed Central

    Green, Larry A.; Graham, Robert; Bagley, Bruce; Kilo, Charles M.; Spann, Stephen J.; Bogdewic, Stephen P.; Swanson, John

    2004-01-01

    should be reliably provided in family medicine practices, and an itemization of key attributes and core values that define the specialty. It also proposed and described a New Model of family medicine for people of all ages and both genders that emphasizes patient-centered, evidence-based, whole-person care provided through a multidisciplinary team approach in settings that reduce barriers to access and use advanced information systems and other new technologies. The task force recommended a time of active experimentation to redesign the work and workplace of family physicians; the development of revised financial models for family medicine, and a national resource to provide assistance to individual practices moving to New Model practice; and cooperation with others pursuing the transformation of frontline medicine to better serve the public. CONCLUSIONS Unless there are changes in the broader health care system and within the specialty, the position of family medicine in the United States will be untenable in a 10- to 20-year time frame. Even within the constraints of today’s flawed health care system, there are major opportunities for family physicians to realize improved results for patients and economic success. A period of aggressive experimentation and redevelopment of family medicine is needed now. The future success of the discipline and its impact on public well-being depends in large measure on family medicine’s ability to rearticulate its vision and competencies in a fashion that has greater resonance with the public while substantially revising the organization and processes by which care is delivered. When accomplished, family physicians will achieve more fully the aspirations articulated by the specialty’s core values and contribute to the solution of the nation’s serious health care problems.

  20. Carry-over fluency induced by extreme prolongations: A new behavioral paradigm.

    PubMed

    Briley, P M; Barnes, M P; Kalinowski, J S

    2016-04-01

    Extreme prolongations, which can be generated via extreme delayed auditory feedback (DAF) (e.g., 250-500 ms) or mediated cognitively with timing applications (e.g., an analog stopwatch) at 2 s per syllable, have long been behavioral techniques used to inhibit stuttering. Some therapies have used this rate solely to establish initial fluency, while others use extremely slowed speech to establish fluency and then add other strategic techniques such as easy onsets and diaphragmatic breathing. Extreme prolongations generate effective, efficient, and immediate forward-flowing fluent speech, removing the signature behaviors of discrete stuttering (i.e., syllable repetitions and audible and inaudible postural fixations). Prolonged use of extreme prolongations establishes carry-over fluency, which is spontaneous, effortless speech absent of most, if not all, overt and covert manifestations of stuttering. The creation of this immediate fluency and the immense potential of extreme prolongations to generate long periods of carry-over fluency have been overlooked by researchers and clinicians alike. Clinicians depart from these longer prolongation durations as they attempt to achieve the same fluent results at a near-normal rate of speech. Clinicians assume they are re-teaching fluency and that slow rates will give rise to more normal rates with less control; but without carry-over fluency, controls and cognitive mediation are always needed for the inherently unstable speech systems of persons who stutter to experience fluent speech. The assumption is that the speech system is untenable without some level of cognitive and motoric monitoring. The goal is omnipresent "near normal rate sounding fluency" with continuous mediation via cognitive and motoric processes. This pursuit of "normal sounding fluency" continues despite ever-present relapse. Relapse has become so common that acceptance of stuttering is the new therapy modality because relapse has come to be

  1. Three Smoking Guns Prove Falsity of Greenhouse Warming

    NASA Astrophysics Data System (ADS)

    Fong, P.

    2001-12-01

    Three observed facts: 1. cloud coverage increased 4.1% in 50 years; 2. precipitation increased 7.8% in 100 years; 3. the two rates are the same. Interpretation: 1. By the increased albedo of the clouds, heat dissipation is increased by 3.98 W/m2 by the time of CO2 doubling (2xCO2), canceling out the greenhouse warming of 4 W/m2. Thus no global warming. 2. The precipitation increase shows the increased release of latent heat of vaporization, which turns out to be equal to that absorbed by the ocean due to increased evaporation under the greenhouse forcing. Thus all greenhouse heat is used up in evaporation and the warming of the earth is zero. 3. The identity of the two rates double-checks the two independent proofs. Therefore, experimentally, no greenhouse warming is triply proved. A new branch of science, Pleistocene climatology, is developed to study the theoretical origin of no greenhouse warming. Climatology, like the mechanics of a large number of particles, is of course complex and unwieldy. If totally order-less, then there is no hope. However, if some regularity appears, then a systematic treatment can be done to simplify the complexity. Rigid bodies are subjected to a special simplifying condition (the distances between all particles are constant) and only 6 degrees of freedom are significant; all others are sidetracked. To study the spinning top there is no need to study the dynamics of every particle of the top by Newton's laws through a super-computer. It suffices to solve the Euler equations, without a computer. In climate study, the use of a super-computer to study all degrees of freedom of the climate is as untenable as the study of the spinning top by super-computer. Yet in spite of the complexity there is strict regularity, as seen in the ice ages, which works as the simplifying condition to establish a new science, Pleistocene climatology. See my book Greenhouse Warming and Nuclear Hazards, just published (www.PeterFongBook.com). This time the special condition is the presence of a

  2. Dynamic context discrimination : psychological evidence for the Sandia Cognitive Framework.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Speed, Ann Elizabeth

    Human behavior is a function of an iterative interaction between the stimulus environment and past experience. It is not simply a matter of the current stimulus environment activating the appropriate experience or rule from memory (e.g., if it is dark and I hear a strange noise outside, then I turn on the outside lights and investigate). Rather, it is a dynamic process that takes into account not only things one would generally do in a given situation, but things that have recently become known (e.g., there have recently been coyotes seen in the area and one is known to be rabid), as well as other immediate environmental characteristics (e.g., it is snowing outside, I know my dog is outside, I know the police are already outside, etc.). All of these factors combine to inform me of the most appropriate behavior for the situation. If it were the case that humans had a rule for every possible contingency, the amount of storage that would be required to enable us to fluidly deal with most situations we encounter would rapidly become biologically untenable. We can all deal with contingencies like the one above with fairly little effort, but if it isn't based on rules, what is it based on? The assertion of the Cognitive Systems program at Sandia for the past 5 years is that at the heart of this ability to effectively navigate the world is an ability to discriminate between different contexts (i.e., Dynamic Context Discrimination, or DCD). While this assertion in and of itself might not seem earthshaking, it is compelling that this ability and its components show up in a wide variety of paradigms across different subdisciplines in psychology. We begin by outlining, at a high functional level, the basic ideas of DCD. We then provide evidence from several different literatures and paradigms that support our assertion that DCD is a core aspect of cognitive functioning. Finally, we discuss DCD and the computational model that we have developed as an instantiation of

  3. Current issues in dental practice management. Part 1. The importance of shared values.

    PubMed

    Newsome, Philip R H

    2003-04-01

    There can be few who would argue with the notion that the nature of dental practice in the United Kingdom has changed dramatically over the last couple of decades. A variety of factors, including new clinical techniques, growing consumerism, a much greater awareness of health-related and well-being issues in the public at large, a marked deregulation within the dental profession, the development of vocational training and, recently, mandatory lifelong learning, the growing number of females working in the profession, and an increasing reluctance of young dentists to finance dental practices, have all combined to create an environment which has enabled and encouraged a move away from traditional forms of dental care delivery. Instead, there has been considerable growth in independently-funded practice and a commensurate growth in the number of practices operating under a corporate body umbrella of one form or another. Currently there are 27 corporate bodies registered with the General Dental Council (GDC), with the likelihood of more in the future given the proposed GDC review. This will no doubt take into consideration European law, under which the restriction within the Dentists Act on the number of corporate bodies is likely to be untenable. Although they still have only a small share of the dental market--with 4% of all dentists in the UK in 1999--they have expanded rapidly from a small base. The data available at the time the paper was written indicate that the global total of fees earned from dentistry in the UK in the financial year 2001/2002 was almost 3 billion Pounds, of which 1.9 billion Pounds (64%) came from NHS fees and 1.1 billion Pounds (36%) from private fees. Of the 1.9 billion Pounds received in NHS fees in 2001/2002, 0.55 billion Pounds were paid by patients who were not exempt from charges, bringing the total amount actually paid out of patients' pockets for dental treatment to 1.65 billion Pounds. Compare these figures with 1996

  4. Selenium- and tellurium-containing fluorescent molecular probes for the detection of biologically important analytes.

    PubMed

    Manjare, Sudesh T; Kim, Youngsam; Churchill, David G

    2014-10-21

    As scientists have discovered in recent decades, selenium is an important trace element in life. The element is now known to play an important role in biology as an enzymatic antioxidant: it sits at the active site and converts biological hydrogen peroxides to water. Mimicking this reaction, chemists have synthesized several organoselenium compounds that undergo redox transformations. As such, these types of compounds are important in the future of both medicinal and materials chemistry. One main challenge for organochalcogen chemists has been to synthesize molecular probes that are soluble in water, in which a selenium or tellurium center can best modify the electronics of the molecule upon a chemical oxidation or reduction event. In this Account, we discuss chemists' recent efforts to create chalcogen-based chemosensors through synthetic means and current photophysical understanding. Our work has focused on small chromophoric or fluorophoric molecules, in which we incorporate discrete organochalcogen atoms (e.g., R-Se-R, R-Te-R) at predesigned sites. These synthetic molecules, involving rational synthetic pathways, allow us to chemoselectively oxidize compounds and to study the level of analyte selectivity by way of their optical responses. All the reports we discuss here deal with well-defined and small synthetic molecular systems. Of the large number of reports published over the last few years, many have notably originated from the laboratory of K. Han (P. R. China). This growing body of research has given chemists new ideas for reversible reactive oxygen species detection, which was previously untenable. While reversibility of the probe is technically important from the standpoint of the chalcogen center, facile regenerability of the probe using a secondary analyte to recover the initial probe is a very promising avenue. This is because (bio)chalcogen chemistry is extremely rich and bioinspired and continues to yield important developments across many

  5. Environmental applications of chitosan and its derivatives.

    PubMed

    Yong, Soon Kong; Shrivastava, Manoj; Srivastava, Prashant; Kunhikrishnan, Anitha; Bolan, Nanthi

    2015-01-01

    Chitosan originates from the seafood processing industry and is one of the most abundant bio-waste materials. Chitosan is a by-product of the alkaline deacetylation of chitin. Chemically, chitosan is a polysaccharide that is soluble in acidic solution and precipitates at higher pH. It has great potential for certain environmental applications, such as remediation of organic and inorganic contaminants, including toxic metals and dyes in soil, sediment and water, and the development of contaminant sensors. Traditionally, seafood waste has been the primary source of chitin. More recently, alternative sources have emerged, such as fungal mycelium, mushroom and krill wastes, and these new sources of chitin and chitosan may overcome the seasonal supply limitations that have existed. The production of chitosan from the above-mentioned waste streams not only reduces waste volume, but also alleviates pressure on the landfills to which the waste would otherwise go. Chitosan production involves four major steps, viz., deproteination, demineralization, bleaching and deacetylation. These four processes require excessive usage of strong alkali at different stages, which drives up chitosan's production cost, potentially making the application of high-grade chitosan for commercial remediation untenable. Alternative chitosan processing techniques, such as microbial or enzymatic processes, may become more cost-effective due to lower energy consumption and waste generation. Chitosan has proved versatile for so many environmental applications because it possesses certain key functional groups, including -OH and -NH2. However, the efficacy of chitosan is diminished at low pH because of its increased solubility and instability. These deficiencies can be overcome by modifying chitosan's structure via crosslinking. Such modification not only enhances the structural stability of chitosan under low pH conditions, but also improves its physicochemical characteristics, such as porosity

  6. How the mainstream limits the spreading of alternative hypotheses

    NASA Astrophysics Data System (ADS)

    Kalenda, Pavel

    2014-05-01

    that Prof. Djuric had tried for more than 10 years to publish this article in various peer-reviewed journals. So Prof. Djuric got into the official book (list) of 'scientific dissidents', among hundreds of other professors and doctors of science (De Climont 2012). These 'scientific dissidents' do not have access to established journals and may possibly publish privately or, at best, on the web in marginal journals, a list of which was published by De Climont (2012). One such marginal journal in the field of geophysics and geology is New Concepts in Global Tectonics. This journal was established because the current hypothesis that the continents move due to convection currents in the mantle has become quite untenable under the weight of new observations. 4) Scientific consensus: history has known many hypotheses that were accepted as proven truth but that later, in the light of new knowledge, completely failed. No one has the right to decide which scientific hypotheses will be accepted and which will not get into print. Perhaps the worst situation is in climatology (due to global effects and impacts), where the plenary session of the IPCC consensually stated that the current global warming is mainly due to human activity. References: De Climont, J. (2012): The worldwide list of dissident scientists. http://astrojan.hostei.com/droa.htm. Djurič, J. (2006): Unification of Gravitation and Electromagnetism. http://jovandjuric.tripod.com/. Douglass, D. H., Christy, J. R., Pearson, B. D. and Singer, S. F. (2007): A comparison of tropical temperature trends with model predictions. International Journal of Climatology, Volume 28, Issue 13, 15 November 2008, Pages 1693-1701. http://onlinelibrary.wiley.com/doi/10.1002/joc.1651/pdf. Einstein, A.: List of scientific publications by Albert Einstein. /wiki/List_of_scientific_publications_by_Albert_Einstein. Kolínský, P., Valenta, J. and Gaždová, R. (2012): Seismicity, groundwater level variations and earth tides in

  7. Rockfall hazard assessment, risk quantification, and mitigation options for reef cove resort development, False Cape, Queensland, Australia

    NASA Astrophysics Data System (ADS)

    Schlotfeldt, P.

    2009-04-01

    Zones were identified using the digital elevation model set up in ARC GIS (Figure 4). The boundaries were field-verified as far as possible. The identified Zones formed the basis of all subsequent work. 4. Once calibrated, the rockfall trajectory modeling showed that only between 1% and, in the worst case, 28% of falling rocks (as a percentage of 1000 seeding events) per Zone would actually reach the development. While this indicated a reduced likelihood of an incident, and hence reduced risk, the kinetic energy in the case of an impact in most Zones was so high (for the given design block size) that the consequence would be untenable without some form of mitigation. 5. An event tree analysis showed that five of the eight Zones identified had risk profiles that fell above, or very close to, what was considered an acceptable annual probability of occurrence of a fatality or fatalities. CONCLUSIONS: Each Zone has unique characteristics that influence the risk profile associated with the rockfall hazard to the development. Mitigation options and recommendations needed to be adjusted accordingly to fit the physical characteristics and assessed risk profile of each Zone. These included: 1. The possible implementation of exclusion zones (no-build areas); 2. Scaling (including controlled blasting) to reduce the potential kinetic energy associated with identified potentially unstable boulders; and 3. The design and construction of berms and rockfall catch fences.
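    Schematically, the event-tree calculation behind point 5 multiplies conditional probabilities along each branch to obtain an annual probability of fatality, which is then compared against an acceptance criterion. A toy sketch; every number below is a placeholder for illustration, not a value from the study:

        # Event-tree sketch for one hazard zone (all probabilities hypothetical)
        p_rockfall = 0.2    # P(rockfall initiates in the zone in a given year)
        p_reach    = 0.28   # P(block reaches the development | rockfall), cf. the worst-case 28%
        p_impact   = 0.05   # P(a person or occupied structure is hit | block arrives)
        p_fatal    = 0.9    # P(fatality | impact), high given the computed kinetic energies

        annual_p_fatality = p_rockfall * p_reach * p_impact * p_fatal
        print(f"{annual_p_fatality:.1e} per year")  # compare with, e.g., a 1e-4 criterion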

  8. FORS am Very Large Telescope der Europäischen Südsternwarte

    NASA Astrophysics Data System (ADS)

    1998-09-01

    of images of various astronomical objects, some of which are reproduced here. They were all obtained with FORS at the standard resolution (field size 6.8 x 6.8 arcminutes, pixel size 0.20 arcseconds) and demonstrate some of the impressive capabilities offered by the new instrument. Spiral galaxy NGC 1288 (ESO PR Photo 37a/98): a colour image of the beautiful spiral galaxy NGC 1288 in the southern constellation Fornax, taken during FORS's first observing night (the 'first light' night). PR Photo 37a/98 covers the entire field imaged with the 2048 x 2048 pixel CCD camera. It was assembled from three CCD exposures taken in different colours under good seeing during the 'first light' night (15 September 1998). This galaxy, with a diameter of around 200,000 light-years, is about 300 million light-years away; its recession velocity is 4,500 km/sec. Technical information: Photo 37a/98 is a composite of three exposures in the filters B (420 nm, 6 minutes exposure), V (530 nm, 3 minutes) and I (800 nm, 3 minutes) during a period of 0.7 arcsecond seeing. The field shown is 6.8 x 6.8 arcminutes; north is to the left, east at the bottom. Distant galaxy cluster (ESO PR Photo 37b/98): an unusual cluster of galaxies in the vicinity of the quasar PB5763. ESO PR Photo 37c/98 is an enlargement of PR Photo 37b/98, showing more detail of the unusual cluster. The next photos were reproduced from a 5-minute near-infrared exposure, likewise taken during the 'first light' night

  9. Considerations on the mechanism of action of artemisinin antimalarials: part 1--the 'carbon radical' and 'heme' hypotheses.

    PubMed

    Haynes, Richard K; Cheu, Kwan-Wing; N'Da, David; Coghi, Paolo; Monti, Diego

    2013-08-01

    +) complements the action of artemisinins, to be discussed in Part 2; there is no need to posit a reaction of Fe(2+) with the artemisinins to account for their antimalarial activity. The ability of artemisinins and synthetic peroxides to elicit membrane damage is examined in the light of established processes of autoxidation. The oxidant character of the intraparasitic environment is incompatible with the reducing conditions required for generation of C-radicals, and in contrast to the expectation raised by the C-radical hypothesis, and indeed by the heme hypothesis outlined below, the antimalarial activities of artemisinins are enhanced under higher partial pressures of dioxygen. Structure-activity data from a wide variety of artemisinins and synthetic peroxides cannot be accommodated within the bounds of the C-radical hypothesis. Finally, the antimalarial C-radical construct sharply contrasts with that of the potently antitumour-active ene-diyne antibiotics such as neocarzinostatin. In an iron-free process, these compounds generate highly reactive aryl C-radicals that abstract H atoms from deoxyribose units in DNA to generate alkyl C-radicals. The latter do react with dioxygen in a normal intracellular environment to initiate DNA strand cleavage. Overall, it must be concluded that the C-radical hypothesis as the basis for the antimalarial activities of artemisinins and synthetic peroxides is untenable. Heme has been intensively studied as an 'activator' of artemisinins and other antimalarial peroxides, and indeed the hypothesis has seemingly become firmly embedded in the underlying brickwork of the scientific edifice. The locus of activity of the peroxides interacting with the heme is considered to be the parasite digestive vacuole. The basis for the nanomolar activities of artemisinins and synthetic peroxides is variously ascribed to heme-Fe(2+)-mediated generation of C-radicals from the peroxides, formation of heme-artemisinin adducts that are held either to engage in redox cycling

  10. BOOK REVIEW: Electron acceleration in the aurora and beyond

    NASA Astrophysics Data System (ADS)

    McClements, K. G.

    1999-08-01

    Duncan Bryant is a retired space plasma physicist who spent most of his career at the Rutherford Appleton Laboratory in Oxfordshire, England. For many years he has been challenging a widely accepted theory, that auroral electrons are accelerated by double layers, on the grounds that it contains a fundamental error (allegedly, an implicit assumption that charged particles can gain energy from conservative fields). It is, of course, right that models of particle acceleration in natural plasmas should be scrutinized carefully in terms of their consistency with basic physical principles, and I believe that Dr Bryant has performed a valuable service by highlighting this issue. He maintains that auroral electron acceleration by double layers is fundamentally untenable, and that acceleration takes place instead via resonant interactions with lower hybrid waves. In successive chapters, he asserts that essentially the same process can account for electron acceleration observed at the Earth's bow shock, in the neighbourhood of an 'artificial comet' produced as part of the Active Magnetospheric Particle Explorers (AMPTE) space mission in 1984/85, in the solar wind, at the Earth's magnetopause, and in the Earth's magnetosphere. The evidence for this is not always convincing: waves with frequencies of the order of the lower hybrid resonance are often observed in these plasma environments, but in general it is difficult to identify clearly which wave mode is being observed (whistlers, for example, have frequencies in approximately the same range as lower hybrid waves). Moreover, it is not at all clear that the waves which are observed, even if they were of the appropriate type, would have sufficient intensity to accelerate electrons to the extent observed. The author makes a persuasive case, however, that acceleration in the aurora, and in other plasma environments accessible to in situ measurements, involves some form of wave turbulence. In Chapter 2 it is pointed out that

  11. EDITORIAL: Incoming Editor-in-Chief

    NASA Astrophysics Data System (ADS)

    Lidström, Suzanne

    2012-04-01

    semiconductors and certain complex materials. His recent interests have extended towards quantum chemistry: he is now actively engaged in electronic structure theory and its applications, and in light-matter interactions in particular. The other new member of our editorial team, Professor David Keen, comes highly recommended by Professor Stephen Lovesey, a long-standing friend and former colleague, who was himself a condensed matter editor for the journal many years ago. Professor Keen works on structural disorder, typically studying the boundary between crystalline, amorphous and liquid phases using neutron and x-ray diffraction and atomistic modelling. Three examples of the areas in which he conducts research are 'liquid-like' disorder in superionic crystalline materials, solid-state amorphization transitions and disorder-induced properties, such as unusual negative thermal expansion. Through working at these boundaries, and at the ISIS neutron scattering facility at Rutherford Appleton Laboratory for over 20 years, he has gained wide experience of all areas of structural condensed matter physics, encompassing crystallography and the structure and simulation of liquid and amorphous materials. Professor Keen has been a Guest Editor for a number of special issues of Journal of Physics: Condensed Matter. My thanks are extended to Stephen for his advice and for recommending such an enthusiastic new editor to join us. Until recently, the extensive review process employed by Physica Scripta involved almost every manuscript being forwarded to several researchers for examination. The volume of material being received at present, however, makes this procedure untenable and undesirable, as it would be unfair to ask the researchers who participate in the peer review process to continue reviewing articles that are obviously destined for rejection. Thus, as a direct result of the increase in volume, a screening procedure has been

  12. BOOK REVIEW: Transport and Structural Formation in Plasmas

    NASA Astrophysics Data System (ADS)

    Thyagaraja, A.

    1999-06-01

    heart of the problem is a set of non-linear equations (fluid or kinetic) describing electromagnetic turbulence. These have linearly or non-linearly unstable steady solutions for experimental conditions of interest. The instabilities are invariably non-linearly saturated at sufficiently high fluctuation amplitude, resulting in approximately stationary (but not generally homogeneous or isotropic) turbulence. The turbulence, as a rule, tends to increase the radial transport of density, temperature, momentum and current (for given sources) and thereby lower their gradients. The authors call such gradients `order parameters' in analogy with condensed matter phenomenology (Ginzburg-Landau theory). Whatever one calls them, if the growth rates of turbulent plasma modes are proportional to such gradients (which they are in simple cases of linear instabilities), one has a `self-organizing' feedback loop. If this is all that the authors wanted to say, it is indeed unexceptionable and they should be applauded for pointing out that a clear conceptual understanding of these ideas is independent of any kinetic complications and reservations about so-called `quasi-linear estimates' of confinement which permeate the subject. Unfortunately, however, what is merely an illustrative model seems to be taken to empirically untenable and theoretically unjustifiable extremes. For example, we are told rather grandly that the book addresses ``a key to understanding the age old question of what occurred in the early stages of our universe and what is likely to occur in the final stages of our universe''. It is hard to discover where this feat is accomplished in this book! It is not clear that the models used by the authors, such as the `current diffusive mode', are really at all relevant to actual tokamaks. If they are, many more predictions (not postdictions) of the model should be made and verified in detail experimentally. At best, all one can say is that such models may not be
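    To make the `self-organizing' feedback loop described in the review concrete, a minimal schematic model can be written down. This sketch is our illustration only; it is not the authors' `current diffusive mode' nor any model taken from the book:

        % Schematic gradient-turbulence feedback; illustrative only.
        \[
          \frac{dA}{dt} = \gamma(g)\,A - \nu A^{3}, \qquad
          \frac{dg}{dt} = S - \bigl(\chi_{0} + \chi_{1}A^{2}\bigr)\,g, \qquad
          \gamma(g) = \alpha\,(g - g_{c}),
        \]

    where A is a fluctuation amplitude, g the driving gradient (the `order parameter'), S the external source, chi_0 the background transport and chi_1 A^2 the turbulence-enhanced transport. A steeper gradient raises the growth rate gamma and hence A; larger A enhances transport, which relaxes g back towards the critical value g_c, closing the loop at a saturated state near marginality. The reviewer's complaint is precisely that moving from this kind of schematic loop to claims about real tokamaks, let alone cosmology, requires quantitative, testable predictions.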