Sample records for undo seigyo model

  1. "Undoing" (or Symbolic Reversal) at Homicide Crime Scenes.

    PubMed

    Russell, Maria; Schlesinger, Louis B; Leon, Maria; Holdren, Samantha

    2018-03-01

    A closed case file review of a nonrandom national sample of 975 homicides disclosed 11 cases (1.13%) of undoing, wherein offenders engaged in crime scene behavior that has been considered an attempt to symbolically reverse the murder. The methods of undoing included use of blankets to cover the victim's body (55%), positioning the body (55%), use of a bed or couch (42%), washing the body (36%), using pillows (36%), and removing clothing and adding other types of adornments (27%). Ten of the 11 offenders were male, and one was female; all 12 victims were female. Ten of the 12 victims were family members or relationship intimates. These findings are consistent with prior reports which concluded that the motivation for undoing behavior is an attempt to compensate for guilt or remorse for having committed the homicide. © 2017 American Academy of Forensic Sciences.

  2. Constructing a New Vision: Undoing Gender through Secondary Education in Honduras

    ERIC Educational Resources Information Center

    Murphy-Graham, Erin

    2009-01-01

    This article presents results from a qualitative study on how the Honduran secondary education programme, "Sistema de Aprendizaje Tutorial" (SAT), attempts to "undo gender" (Deutsch 2007: 122) by encouraging students to rethink gender relations in their everyday lives in a way that reflects their increased consciousness of…

  3. "When You're in a Different Country, Things Are More Apparent": Gender and Study Abroad in Mexico

    ERIC Educational Resources Information Center

    McGivern, Martha B.

    2013-01-01

    This dissertation bridges the divide between comparative education and international education literature by examining student experiences in study abroad programs to make theoretical arguments about the role of culture in "doing" and "undoing" gender. The "undoing gender" framework in comparative education literature…

  4. Undoing Bad Upbringing through Contemplation: An Aristotelian Reconstruction

    ERIC Educational Resources Information Center

    Kristjánsson, Kristján

    2014-01-01

    The aim of this article is to reconstruct two counter-intuitive Aristotelian theses--about contemplation as the culmination of the good life and about the impossibility of undoing bad upbringing--to bring them into line with current empirical research, as well as with the essentials of an overall Aristotelian approach to moral education. I start…

  5. Gender, Narratives and Intersectionality: Can Personal Experience Approaches to Research Contribute to "Undoing Gender"?

    ERIC Educational Resources Information Center

    Cole, Barbara Ann

    2009-01-01

    This paper examines narrative methodologies as one approach to exploring issues of gender, education and social justice and, particularly, insights into "undoing gender". It furthermore examines the possibilities of exploring gender and its multiple intersections in a range of global and policy contexts through the use of personal…

  6. Un/Doing Gender? A Case Study of School Policy and Practice in Zambia

    ERIC Educational Resources Information Center

    Bajaj, Monisha

    2009-01-01

    This article explores an attempt to disrupt gender inequality in a unique, low-cost private school in Ndola, Zambia. It examines deliberate school policies aimed at "undoing gender" or fostering greater gender equity. These include efforts to maintain gender parity at all levels of the school and the requirement that both young men and…

  7. Computer memory management system

    DOEpatents

    Kirk, III, Whitson John

    2002-01-01

    A computer memory management system utilizing a memory structure system of "intelligent" pointers in which information related to the use status of the memory structure is designed into the pointer. Through this pointer system, the present invention provides essentially automatic memory management (often referred to as garbage collection) by allowing relationships between objects to have definite memory management behavior by use of a coding protocol which describes when relationships should be maintained and when the relationships should be broken. In one aspect, the present invention allows automatic breaking of strong links to facilitate object garbage collection, coupled with relationship adjectives which define deletion of associated objects. In another aspect, the present invention includes simple-to-use infinite undo/redo functionality in that it has the capability, through a simple function call, to undo all of the changes made to a data model since the previous `valid state` was noted.
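
    The undo capability described here (reverting a data model to its previous `valid state` through a single function call) can be illustrated with a small change-log sketch. This is a hypothetical Python illustration, not the patented pointer system; the class and method names are invented.

    ```python
    # Hypothetical sketch (not the patented implementation): a data model that
    # records every change and can undo everything back to the last noted
    # "valid state" with a single call.
    class UndoableModel:
        _MISSING = object()

        def __init__(self):
            self._data = {}
            self._changes = []   # (key, old value or _MISSING) since last valid state

        def set(self, key, value):
            self._changes.append((key, self._data.get(key, self._MISSING)))
            self._data[key] = value

        def mark_valid_state(self):
            """Note the current contents as the most recent valid state."""
            self._changes.clear()

        def undo_to_valid_state(self):
            """One call reverts every change made since the previous valid state."""
            while self._changes:
                key, old = self._changes.pop()
                if old is self._MISSING:
                    self._data.pop(key, None)
                else:
                    self._data[key] = old

    model = UndoableModel()
    model.set("x", 1)
    model.mark_valid_state()
    model.set("x", 2)
    model.set("y", 3)
    model.undo_to_valid_state()
    assert model._data == {"x": 1}
    ```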

  8. Postdecisional counterfactual thinking by actors and readers.

    PubMed

    Girotto, Vittorio; Ferrante, Donatella; Pighin, Stefania; Gonzalez, Michel

    2007-06-01

    How do individuals think counterfactually about the outcomes of their decisions? Most previous studies have investigated how readers think about fictional stories, rather than how actors think about events they have actually experienced. We assumed that differences in individuals' roles (actor vs. reader) can make different information available, which in turn can affect counterfactual thinking. Hence, we predicted an effect of role on postdecisional counterfactual thinking. Reporting the results of eight studies, we show that readers undo the negative outcome of a story by undoing the protagonist's choice to tackle a given problem, rather than the protagonist's unsuccessful attempt to solve it. But actors who make the same choice and experience the same negative outcome as the protagonist undo this outcome by altering features of the problem. We also show that this effect does not depend on motivational factors. These results contradict current accounts of counterfactual thinking and demonstrate the necessity of investigating the counterfactual thoughts of individuals in varied roles.

  9. Redundant Disk Arrays in Transaction Processing Systems. Ph.D. Thesis, 1993

    NASA Technical Reports Server (NTRS)

    Mourad, Antoine Nagib

    1994-01-01

    We address various issues dealing with the use of disk arrays in transaction processing environments. We look at the problem of transaction undo recovery and propose a scheme for using the redundancy in disk arrays to support undo recovery. The scheme uses twin page storage for the parity information in the array. It speeds up transaction processing by eliminating the need for undo logging for most transactions. The use of redundant arrays of distributed disks to provide recovery from disasters as well as temporary site failures and disk crashes is also studied. We investigate the problem of assigning the sites of a distributed storage system to redundant arrays in such a way that the cost of maintaining the redundant parity information is minimized. Heuristic algorithms for solving the site partitioning problem are proposed and their performance is evaluated using simulation. We also develop a heuristic for which an upper bound on the deviation from the optimal solution can be established.
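
    The twin-page parity idea (keeping both the old and the new parity page so that most transactions need no separate undo log) can be seen in a few lines of XOR arithmetic. The sketch below is a simplified model of the underlying parity math under my own assumptions, not Mourad's actual scheme.

    ```python
    # Simplified sketch: RAID-style parity is the XOR of the data blocks in a
    # stripe, so if both the pre- and post-update parity pages are retained
    # ("twin pages"), the pre-transaction value of a modified block can be
    # recomputed without an undo log:
    #   old_block = old_parity XOR new_parity XOR new_block
    from functools import reduce

    def parity(blocks):
        return reduce(lambda a, b: a ^ b, blocks)

    stripe = [0b1010, 0b0110, 0b1111]     # data blocks D1..D3
    old_parity = parity(stripe)

    old_block = stripe[1]                 # a transaction overwrites D2 in place;
    stripe[1] = 0b0001                    # the new parity goes to the twin page
    new_parity = parity(stripe)

    recovered = old_parity ^ new_parity ^ stripe[1]
    assert recovered == old_block         # undo recovery of the old block value
    ```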

  10. Gender, Narratives and Intersectionality: can Personal Experience Approaches to Research Contribute to "Undoing Gender"?

    NASA Astrophysics Data System (ADS)

    Cole, Barbara Ann

    2009-11-01

    This paper examines narrative methodologies as one approach to exploring issues of gender, education and social justice and, particularly, insights into "undoing gender". It furthermore examines the possibilities of exploring gender and its multiple intersections in a range of global and policy contexts through the use of personal experience approaches. The "storying" of lived experience is examined as a means of challenging dominant discourses which can construct and other individuals and groups in relation to many aspects of gender and education. Drawing on intersectionality, as a complex and developing feminist theory, the paper considers ways in which narrative can illuminate often hidden complexities while seeking to avoid generalisations and essentialisms. The difficulties of using narrative in relation to these aims are explored in the light of the warnings of feminist writers such as Michele Fine and bell hooks. The paper briefly considers narrative as both methodology and phenomenon, and finally, drawing on critical discourse analysis, discusses the potential of intersectionality and narrative in relation to undoing gender.

  11. Constructing a New Vision: Undoing Gender through Secondary Education in Honduras

    NASA Astrophysics Data System (ADS)

    Murphy-Graham, Erin

    2009-11-01

    This article presents results from a qualitative study on how the Honduran secondary education programme, Sistema de Aprendizaje Tutorial (SAT), attempts to "undo gender" (Deutsch 2007: 122) by encouraging students to rethink gender relations in their everyday lives in a way that reflects their increased consciousness of gender equality. My findings suggest that SAT increased women's gender consciousness and this heightened their desire for change in the domestic sphere. In some instances, women were able to negotiate a new sharing of responsibilities with their spouses. There are several features of SAT that make it a transformative innovation in education: (1) gender is mainstreamed into the curriculum; (2) gender is linked with the larger concept of justice; (3) students engage in reflection, dialogue and debate; (4) teachers are given the opportunity to reflect critically on their understanding of gender in professional development sessions; and (5) it emphasises that undoing gender requires change among individuals and in social structures such as the family.

  12. The Model Analyst’s Toolkit: Scientific Model Development, Analysis, and Validation

    DTIC Science & Technology

    2013-05-20

    Charles River’s Metronome framework. This framework is built on top of the same Equinox libraries that the popular Eclipse Development Environment uses... the names are fully visible (see Figure 8). The Metronome framework also provides functionality for undo and redo, so the user can easily correct... mistakes. Figure 8. Changing Pane sizes and layouts in the new Metronome-enhanced MAT. This period, we also improved the MAT project file format so
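
    The undo/redo facility mentioned in this snippet is a standard command-stack pattern. The Python sketch below shows only the general mechanism; Metronome's own Equinox/Eclipse-based API is not shown in this record, so all names here are illustrative.

    ```python
    # Generic undo/redo via paired do/undo callbacks on two stacks; an
    # illustrative sketch, not Metronome's actual API.
    class History:
        def __init__(self):
            self._undo, self._redo = [], []

        def do(self, apply_fn, revert_fn):
            apply_fn()
            self._undo.append((apply_fn, revert_fn))
            self._redo.clear()            # a new action invalidates the redo chain

        def undo(self):
            if self._undo:
                pair = self._undo.pop()
                pair[1]()                 # revert
                self._redo.append(pair)

        def redo(self):
            if self._redo:
                pair = self._redo.pop()
                pair[0]()                 # re-apply
                self._undo.append(pair)

    layout = {"pane_width": 200}
    h = History()
    h.do(lambda: layout.update(pane_width=350), lambda: layout.update(pane_width=200))
    h.undo(); assert layout["pane_width"] == 200
    h.redo(); assert layout["pane_width"] == 350
    ```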

  13. A Task Group Practitioner's Response to Waldo and Bauman's Article on Regrouping the Categorization of Group Work.

    ERIC Educational Resources Information Center

    Keel, Linda P.

    1998-01-01

    Argues that Waldo and Bauman's Goals and Process (GAP) matrix does not include task/work groups. Claims that it is not in the best interest of group work to undo or rework the Association for Specialists in Group Work's four core groups as a model. States that the field of group work needs a commonly shared framework/categorization from which to…

  14. Redistribution by insurance market regulation: Analyzing a ban on gender-based retirement annuities.

    PubMed

    Finkelstein, Amy; Poterba, James; Rothschild, Casey

    2009-01-01

    We illustrate how equilibrium screening models can be used to evaluate the economic consequences of insurance market regulation. We calibrate and solve a model of the United Kingdom's compulsory annuity market and examine the impact of gender-based pricing restrictions. We find that the endogenous adjustment of annuity contract menus in response to such restrictions can undo up to half of the redistribution from men to women that would occur with exogenous Social Security-like annuity contracts. Our findings indicate the importance of endogenous contract responses and illustrate the feasibility of employing theoretical insurance market equilibrium models for quantitative policy analysis.

  15. Redistribution by insurance market regulation: Analyzing a ban on gender-based retirement annuities

    PubMed Central

    Finkelstein, Amy; Poterba, James; Rothschild, Casey

    2009-01-01

    We illustrate how equilibrium screening models can be used to evaluate the economic consequences of insurance market regulation. We calibrate and solve a model of the United Kingdom’s compulsory annuity market and examine the impact of gender-based pricing restrictions. We find that the endogenous adjustment of annuity contract menus in response to such restrictions can undo up to half of the redistribution from men to women that would occur with exogenous Social Security-like annuity contracts. Our findings indicate the importance of endogenous contract responses and illustrate the feasibility of employing theoretical insurance market equilibrium models for quantitative policy analysis. PMID:20046907

  16. The doing and undoing of male household decision-making and economic authority in Rwanda and its implications for gender transformative programming.

    PubMed

    Stern, Erin; Heise, Lori; McLean, Lyndsay

    2017-12-01

    This paper explores two key norms that underpin intimate partner violence in Rwanda: men's roles as economic providers and decision-making authorities in the household. It describes the political, legal and socio-economic factors affecting these norms and how they create opportunities and barriers to 'undoing' restrictive gender norms. Findings are drawn from an evaluation of Indashyikirwa, an intimate partner violence prevention programme operating in Rwanda. Across three intervention sectors, 24 focus groups were conducted with unmarried and married men and women residing in intervention communities. Thirty interviews with couples and nine interviews with opinion leaders were conducted before they completed programme training designed to shift gender norms underlying intimate partner violence. The data indicate a strong awareness of and accountability to Rwandan laws and policies supporting women's economic empowerment and decision-making, alongside persisting traditional notions of men as household heads and primary breadwinners. Transgression of these norms could be accommodated in some circumstances, especially those involving economic necessity. The data also identified increasing recognition of the value of a more equitable partnership model. Findings highlight the importance of carefully assessing cracks in the existing gender order that can be exploited to support gender equality and non-violence.

  17. [An extreme case of undoing and posing in a case of murder-suicide. A forensic pathological approach to crime scene investigation].

    PubMed

    Guddat, Saskia S; Schalinski, Sarah; Püschel, Klaus; Tsokos, Michael; Schulz, Friedrich

    2007-01-01

    A 7-year-old boy was killed by his father by manual strangulation during a murder-suicide. After the killing of the son, the father showed typical "undoing" behaviour: He changed the boy's clothes and laid him down on the bed. Then he placed candles around his head, put pictures of the parents' wedding around him and a crucifix and a picture of the family into his hands. He broke off a rose in a vase next to the bed, lit the candles and took photographs of his dead son. Later he called his wife, threatened to kill the son and finally called the police to confess the murder and to announce his forthcoming suicide.

  18. Why Are Drugs So Hard to Quit?

    MedlinePlus

  19. Undoing Racism Through Genesee County's REACH Infant Mortality Reduction Initiative.

    PubMed

    Kruger, Daniel J; Carty, Denise C; Turbeville, Ashley R; French-Turner, Tonya M; Brownlee, Shannon

    2015-01-01

    Genesee County Racial and Ethnic Approaches to Community Health Program (REACH) is a Community-Based Public Health partnership for reducing African American infant mortality rates that hosts the Undoing Racism Workshop (URW). This study assessed the URW's effectiveness in promoting an understanding of racism, institutional racism, and how issues related to race/ethnicity can affect maternal and infant health. Recent URW participants (n=84) completed brief preassessment and postassessment forms; participants (n=101) also completed an on-line, long-term assessment (LTA). URWs promoted understanding of racism and institutional racism, although they were less effective in addressing racism as related to maternal and infant health. The URWs were most effective in the domains related to their standard content. Additional effort is necessary to customize URWs when they are utilized for activities beyond their original purpose of community mobilization.

  20. Emotional Reactions to the Outcomes of Decisions: The Role of Counterfactual Thought in the Experience of Regret and Disappointment.

    PubMed

    Zeelenberg; van Dijk WW; van der Pligt J; Manstead; van Empelen P; Reinderman

    1998-08-01

    Regret and disappointment are emotions that can be experienced in response to an unfavorable outcome of a decision. Previous research suggests that both emotions are related to the process of counterfactual thinking. The present research extends this idea by combining it with ideas from regret and disappointment theory. The results show that regret is related to behavior-focused counterfactual thought in which the decision-maker's own actions are changed, whereas disappointment is related to situation-focused counterfactual thought in which aspects of the situation are changed. In Study 1 participants (N = 130) were asked to recall an autobiographical episode of either a regretful or a disappointing event. When asked to undo this event, regret participants predominantly changed their own actions, whereas disappointment participants predominantly changed aspects of the situation. In Study 2 all participants (N = 50) read a scenario in which a person experiences a negative event. Participants who were instructed to undo the event by changing the person's actions reported more regret than disappointment, while participants who were instructed to undo the event by changing aspects of the situation reported more disappointment than regret. Study 3 (N = 140) replicated the findings from Study 2 with a different scenario, and a design in which regret and disappointment were measured between rather than within subjects. In the discussion we address the relation among counterfactual thinking, attributions and affective reactions to decision outcomes, and the implications for decision research. Copyright 1998 Academic Press.

  1. Un/doing Gender? a Case Study of School Policy and Practice in Zambia

    NASA Astrophysics Data System (ADS)

    Bajaj, Monisha

    2009-11-01

    This article explores an attempt to disrupt gender inequality in a unique, low-cost private school in Ndola, Zambia. It examines deliberate school policies aimed at "undoing gender" or fostering greater gender equity. These include efforts to maintain gender parity at all levels of the school and the requirement that both young men and women carry out cleaning tasks generally viewed as "women's work". Observations, interviews, student diaries and surveys from this school and from government schools provide the basis for a comparison, indicating how the former strives to interrupt the transmission of gender inequalities as well as how students respond to these practices. The findings suggest that the pedagogical practices deployed by this school have generally succeeded in destabilising norms of gender subordination and gender-based violence, though the replicability of these practices is interrogated given broader questions about the country's public resources and political will.

  2. Modelling Safe Interface Interactions in Web Applications

    NASA Astrophysics Data System (ADS)

    Brambilla, Marco; Cabot, Jordi; Grossniklaus, Michael

    Current Web applications embed sophisticated user interfaces and business logic. The original interaction paradigm of the Web based on static content pages that are browsed by hyperlinks is, therefore, not valid anymore. In this paper, we advocate a paradigm shift for browsers and Web applications that improves the management of user interaction and browsing history. Pages are replaced by States as basic navigation nodes, and Back/Forward navigation along the browsing history is replaced by a full-fledged interactive application paradigm, supporting transactions at the interface level and featuring Undo/Redo capabilities. This new paradigm offers a safer and more precise interaction model, protecting the user from unexpected behaviours of the applications and the browser.
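
    A minimal sketch of the paper's central idea, that browsing history is a stack of application States with full Undo/Redo rather than page-based Back/Forward, might look as follows. The Python class and field names are invented for illustration; the paper itself does not provide this code.

    ```python
    # States replace pages as navigation nodes; Undo/Redo replaces Back/Forward.
    class StateHistory:
        def __init__(self, initial_state):
            self.current = initial_state
            self._past, self._future = [], []

        def navigate(self, new_state):
            """Enter a new interaction state, e.g. after one step of a form."""
            self._past.append(self.current)
            self.current = new_state
            self._future.clear()          # branching discards forward history

        def undo(self):
            if self._past:
                self._future.append(self.current)
                self.current = self._past.pop()

        def redo(self):
            if self._future:
                self._past.append(self.current)
                self.current = self._future.pop()

    h = StateHistory({"view": "cart", "items": 1})
    h.navigate({"view": "checkout", "items": 1})
    h.undo(); assert h.current["view"] == "cart"
    h.redo(); assert h.current["view"] == "checkout"
    ```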

  3. Pre-Texts for Research: A Response to Robin Usher on Theory and Writing.

    ERIC Educational Resources Information Center

    Standish, Paul

    1994-01-01

    Responding to Usher's analysis of postmodernism and adult education, Standish argues that Usher misinterprets Derrida in a way that reinforces the relativism, skepticism, and rhetoric of oppression that Derrida seeks to undo. (SK)

  4. Back Talk.

    ERIC Educational Resources Information Center

    Ruben, Barbara

    1994-01-01

    Recently, three issues in particular have fueled controversies in environmental debate: ozone, dioxin, and global warming. This article examines how these issues have been characterized by journalists, scientists, and backlash authors. It is suggested that media backlash is threatening to undo public faith in scientific knowledge about global…

  5. Africentrism--Standing on Its Own Cultural Ground

    ERIC Educational Resources Information Center

    Tolliver, Derise E.

    2015-01-01

    Africentrism is a conceptual framework that is rooted in the epistemology, cosmology, and axiology of the indigenous African worldview. Understanding the basic principles and values of this transformative paradigm can inform doctoral programs' efforts to enhance inclusion by undoing practices of marginalization and hegemony.

  6. Breaking the solid ground of common sense: undoing "structure" with Michael Balint.

    PubMed

    Bonomi, Carlo

    2003-09-01

    Balint's great merit was to question what, in the classical perspective, was assumed as a prerequisite for analysis and thus located beyond analysis: the maturity of the ego. A fundamental premise of his work was Ferenczi's distrust for the structural model, which praised the maturity of the ego and its verbal, social, and adaptive abilities. Ferenczi's view of ego maturation as a trauma derivative was strikingly different from the theories of all other psychoanalytic schools and seems to be responsible for Balint's understanding of regression as a sort of inverted process that enables the undoing of the sheltering structures of the mature mind. Balint's understanding of the relation between mature ego and regression diverged not only from the ego psychologists, who emphasized the idea of therapeutic alliance, but also from most of the authors who embraced the object-relational view, like Klein (who considered regression a manifestation of the patient's craving for oral gratification), Fairbairn (who gave up the notion of regression), and Guntrip (who viewed regression as a schizoid phenomenon related to the ego weakness). According to Balint, the clinical appearance of a regression would "depend also on the way the regression is recognized, is accepted, and is responded to by the analyst." In this respect, his position was close to Winnicott's reformulation of the therapeutic action. Yet, the work of Balint reflects the persuasion that the progressive fluidification of the solid structure could be enabled only by the analyst's capacity for becoming himself or herself [unsolid].

  7. A Psychological Perspective of Teen Romances in Young Adult Literature.

    ERIC Educational Resources Information Center

    Dickson, Cheryl L.

    2001-01-01

    Suggests that just as violence on television is hypothesized to increase real-life violence, television romance can likely affect views of real-life romance. Hypothesizes that literature could undo television's mistakes and bridge the gap between real love and fantasy love. (SG)

  8. African American infant mortality and the Genesee County, MI REACH 2010 initiative: an evaluation of the Undoing Racism Workshop.

    PubMed

    Shultz, Cameron; Skorcz, Stephen

    2012-01-01

    The authors examine African American and White socioeconomic and infant mortality outcomes in Genesee County, Michigan, assess the stated effects of the Undoing Racism Workshop (URW) on its participants and the greater-Genesee County community, and introduce the ecological approach to the cycle of socialization as a tool to help identify sources of racially linked tension and sites for ameliorative intervention. Findings show that African Americans in Flint are geographically and socioeconomically isolated, have fewer resources to sustain health, and experience higher rates of infant mortality when compared to Whites in Flint's surrounding suburbs. Between two thirds and three fourths of URW follow-up survey respondents endorse the belief that the URW can help reduce infant mortality, and results suggest the workshop helps elicit individual and institutional/policy-related changes intended to lessen the disparity. Authors assert the URW offers a common language and framework for discussing racism as a structural phenomenon rather than merely racial prejudice within individuals.

  9. 3 CFR 9015 - Proclamation 9015 of September 10, 2013. Patriot Day and National Day of Service and Remembrance...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... never undo the pain and injustice borne that terrible morning, nor will we ever forget those we lost. On... such great evil. As we mark the anniversary of September 11, I invite all Americans to observe a...

  10. Undoing Suggestive Influence on Memory: The Reversibility of the Eyewitness Misinformation Effect

    ERIC Educational Resources Information Center

    Oeberst, Aileen; Blank, Hartmut

    2012-01-01

    Presenting inconsistent postevent information about a witnessed incident typically decreases the accuracy of memory reports concerning that event (the "misinformation effect"). Surprisingly, the "reversibility" of the effect (after an initial occurrence) has remained largely unexplored. Based on a "memory conversion" theoretical framework and…

  11. ("Un")Doing Standards in Education with Actor-Network Theory

    ERIC Educational Resources Information Center

    Fenwick, Tara J.

    2010-01-01

    Recent critiques have drawn important attention to the depoliticized consensus and empty promises embedded in network discourses of educational policy. While acceding this critique, this discussion argues that some forms of network analysis--specifically those adopting actor-network theory (ANT) approaches--actually offer useful theoretical…

  12. Mainstreaming Critical Disability Studies: Towards Undoing the Last Prejudice

    ERIC Educational Resources Information Center

    McDonald-Morken, Colleen Ann

    2014-01-01

    According to critical disability studies scholars, disablism may be the fundamental system of unearned advantaging and disadvantaging upon which all other notions of difference-as-deviance are constructed. If so, a deeply critical and intersectional investigation of enabled privilege/disablism prepares a grounding from which seeds of novel and…

  13. Can Money Undo the Past? A Canadian Example.

    ERIC Educational Resources Information Center

    Thomas, R. Murray

    2003-01-01

    In Canada, more than 9,000 lawsuits have been filed by American Indians and Inuits seeking reparations for the mistreatment Indigenous children suffered in residential schools operated by four religious groups and financed by the Canadian government. Although most suits allege "cultural damage" caused by schooling practices, little of…

  14. The Need for Precision

    ERIC Educational Resources Information Center

    Weinberg, David R.

    2012-01-01

    People have become accustomed to the imprecision of language, though imprecise language has a subtle way of misguiding thoughts and actions. In this article, the author argues that the term "teacher" in reference to the Montessori practitioner is a distortion of everything Maria Montessori tried to undo about traditional education. In dealing with…

  15. The Intentional Interim

    ERIC Educational Resources Information Center

    Nugent, Patricia A.

    2011-01-01

    The author spent years in central-office administration, most recently in an interim position. Some interim administrators simply see themselves as placeholders until the real deal is hired, giving the organization the opportunity to coast. There are others who see themselves as change agents and cannot wait to undo or redo what their predecessor…

  16. Undoing the Past: Restoration in the Monday Creek Watershed.

    ERIC Educational Resources Information Center

    Reed, Mary

    2000-01-01

    Monday Creek Restoration Project is a collaborative effort of 20 organizations to clean up an Appalachian Ohio stream fouled for generations by acid mine drainage and industrial waste. The grassroots effort has involved state and federal agencies, VISTA volunteers, community volunteers, and college students who monitor the watershed and share…

  18. Undoing Appropriateness: Raciolinguistic Ideologies and Language Diversity in Education

    ERIC Educational Resources Information Center

    Flores, Nelson; Rosa, Jonathan

    2015-01-01

    In this article, Nelson Flores and Jonathan Rosa critique appropriateness-based approaches to language diversity in education. Those who subscribe to these approaches conceptualize standardized linguistic practices as an objective set of linguistic forms that are appropriate for an academic setting. In contrast, Flores and Rosa highlight the…

  19. Doing "Business as Usual": Dynamics of Voice in Community Organizing Talk

    ERIC Educational Resources Information Center

    O'Connor, Kevin; Hanny, Courtney; Lewis, Cameron

    2011-01-01

    This article examines discourse in a community change project committed to undoing "business as usual"--attempts to "fix" problems within the community without involvement of residents in the process. We show how, despite commitments to recognizing community "voice," participants' orientation to powerful "centering institutions" (Jan Blommaert…

  20. Combating Climate Change through Quality Education. Policy Brief 2010-03

    ERIC Educational Resources Information Center

    Anderson, Allison

    2010-01-01

    Climate change threatens to undo and even reverse the progress made toward meeting the Millennium Development Goals (MDGs) and poses one of the most serious challenges to reducing global poverty for the international community. However, the education sector offers a currently untapped opportunity to combat climate change. There is a clear…

  1. ("un")Doing the Next Generation Science Standards: Climate Change Education Actor-Networks in Oklahoma

    ERIC Educational Resources Information Center

    Colston, Nicole M.; Ivey, Toni A.

    2015-01-01

    This exploratory research investigated how science education communities of practice in Oklahoma engage in translations of climate change education (CCE). Applications of actor-network theory to educational policymaking facilitate this analysis of the spaces of prescription and spaces of negotiation that characterize CCE in Oklahoma. Informed by…

  2. Teaching Reflexivity: Undoing or Reinscribing Habits of Gender?

    ERIC Educational Resources Information Center

    Bondi, Liz

    2009-01-01

    This paper outlines an approach used in a course designed to teach reflexivity as a research skill and explores what kind of gender intervention such teaching might constitute. Although inspired by feminist debates about the complex power dynamics of research relationships, the course in question does not focus specifically on gender issues.…

  3. Undoing the Knots: Identity Transformations in a Study Abroad Programme

    ERIC Educational Resources Information Center

    Ellwood, Constance

    2011-01-01

    In times of globalised flows of students, this paper offers an alternative way of conceptualising identity change in the experiences of students on study abroad or student exchange programmes. Despite the "identity turn" of recent years, modernist notions of identity continue to impact on the ways in which study abroad experiences are…

  4. Outrage and Engage: A Story of Eminent Domain

    ERIC Educational Resources Information Center

    Pattison, Patricia

    2014-01-01

    Numerous research studies clearly indicate the importance of first impressions. It is very likely that students will form their opinions of the class and the professor during the first class meeting. These first impression can be nearly impossible to reverse or undo, making those first encounters extremely important, for they set the tone for all…

  5. Learning by Undoing, "Democracy and Education," and John Dewey, the Colonial Traveler

    ERIC Educational Resources Information Center

    Papastephanou, Marianna

    2017-01-01

    The centennial anniversary of John Dewey's "Democracy and Education" has been celebrated this year in a reconstructive and utility-based spirit. The article considers this spirit and the need to complement it with a critical-deconstructive and "use-less" prism that will reveal shortcomings in Dewey's and our own political…

  6. "Doing and Undoing Gender": Female Higher Education in the Islamic Republic of Iran

    ERIC Educational Resources Information Center

    Mehran, Golnar

    2009-01-01

    Since the establishment of the Islamic Republic, female higher education has been characterised by a paradoxical combination of discrimination and exclusion, on the one hand, and increasing equality and empowerment, on the other. This study focuses on the triangle of education, equality and empowerment, using Sara Longwe's women's empowerment…

  7. Resegregation in Norfolk, Virginia. Does Restoring Neighborhood Schools Work?

    ERIC Educational Resources Information Center

    Meldrum, Christina; Eaton, Susan E.

    This report reviews school department data and interviews with officials and others involved in the Norfolk (Virginia) school resegregation plan designed to stem White flight and increase parental involvement. The report finds that all the basic assumptions the local community and the court had about the potential benefits of undoing the city's…

  8. The Circle Game: Shadows and Substance in the Indian Residential School Experience in Canada.

    ERIC Educational Resources Information Center

    Chrisjohn, Roland D; Young, Sherri L.; Maraun, Michael

    This book develops an alternative account of Canada's operation of Indian residential schools and provides recommendations for undoing what has been done. Derived from a report on residential schooling submitted to the Royal Commission on Aboriginal Peoples in October 1994, the book discusses the language and rhetoric surrounding residential…

  9. No Outsiders and "The Eternal Sunshine of the Spotless Child"

    ERIC Educational Resources Information Center

    Rasmussen, Mary Lou

    2011-01-01

    "Interrogating Heteronormativity in Primary Schools: The No Outsiders Project" is a book that reflects on a research project based in primary schools and funded by The Economic and Social Research Council of the UK. This text is accompanied by another practice-focused work: "Undoing Homophobia in Primary Schools". The project…

  10. Undoing Diversity: Knowledge and Neoliberal Discourses in Colleges of Education

    ERIC Educational Resources Information Center

    Matus, Claudia; Infante, Marta

    2011-01-01

    In this article we analyze discourses of "diversity" in colleges of education in Chile. We contend that the use of discourses of diversity, as reproducing the separation between mainstream subjectivities and those uncontained by the category of normal, is one of the ways universities align themselves with the rules of a democratic…

  11. Better to matter than merely count.

    PubMed

    Hay, N

    1989-07-01

    When at the age of 42 I gave up my accountancy job in a City bank and became a student nurse at one sixth of the salary I upset the material expectations of a lot of people. But it was money, the symbol of security for so many, that was partly the cause of my undoing.

  12. Social Justice for Human Development

    ERIC Educational Resources Information Center

    Jaramillo, Nathalia

    2010-01-01

    The topic of social justice in U.S. teacher education has a long and protracted history that harkens back to the civil rights movement of the mid-20th century, with its attendant legal rulings and constitutional amendments that sought to undo the legacy of discrimination against communities of color, women, and the poor. What is lost,…

  13. Education Policy and the Pursuit of Equality: Perspectives from South Africa

    ERIC Educational Resources Information Center

    Sayed, Yusuf; Vellanki, Vivek

    2013-01-01

    1994 is an important year in South African history. It brought about significant socio-political changes in an attempt to undo the unjust practices perpetuated during the apartheid regime. The apartheid government had severely impacted all spheres and institutions of society, including education. In this interview, Vivek Vellanki asks Doctor Yusuf…

  14. The Phrase of the Phallic Pheminine: Beyond the "Nurturing Mother" in Feminist Composition Pedagogy.

    ERIC Educational Resources Information Center

    Mowery, Diane

    Theories of phallic authority outlined by Jacques Lacan, Sigmund Freud, and Luce Irigaray suggest that one can effectively undo authority only from a position of authority, a position that traps feminists within the very phallic economy they hope to subvert. Attempting to avoid this trap, feminist pedagogues have made a distinction between…

  15. Operational Reserve: National Guard Readiness when Current Conflicts End

    DTIC Science & Technology

    2010-03-01

    toothpaste back in the tube”17 With probable post war reduction in DOD funding, it is not realistic to assume that the National Guard will obtain...necessitates that we don’t try to put the toothpaste back in the tube. We cannot undo the policies and procedures that have gotten us to the current state

  16. What Dead Schools Can Teach Us: Observations from the Independent School Cemetery

    ERIC Educational Resources Information Center

    McManus, Jim

    2012-01-01

    Why do so many independent schools fail? And what are the chief causes of death? When they can be performed, institutional "autopsies" are illuminating, but it should be noted that many schools disappear with few clues about their final undoing. However, when schools do leave paper trails that help understand the reasons for their…

  17. (Un)Doing Hegemony in Education: Disrupting School-to-Prison Pipelines for Black Males

    ERIC Educational Resources Information Center

    Dancy, T. Elon, II

    2014-01-01

    The school-to-prison pipeline refers to the disturbing national trend in which children are funneled out of public schools and into juvenile and criminal justice systems. The purpose of this article is to theorize how this pipeline fulfills societal commitments to black male over-incarceration. First, the author reviews the troublesome perceptions…

  18. Vulnerability: Self-Study's Contribution to Social Justice Education

    ERIC Educational Resources Information Center

    Knowles, Corinne

    2014-01-01

    Teaching, as a social justice project, seeks to undo and re-imagine oppressive pedagogies in order to transform teachers, their students, and the knowledge with which they work. In this article, I argue that self-study can contribute to social justice in a number of ways by, for instance, making the sometimes limiting norms that frame teaching and…

  19. Third Grade Students' Performance on Calculator and Calculator-Related Tasks. Technical Report No. 498.

    ERIC Educational Resources Information Center

    Weaver, J. Fred

    Refinements of work with calculator algorithms previously conducted by the author are reported. Work with "chaining" and the doing/undoing property in addition and subtraction was tested with 24 third-grade students. Results indicated the need for further instruction with both ideas. Students were able to manipulate the calculator keyboard, but…

  20. Computing environment logbook

    DOEpatents

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
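
    The behaviour described in this abstract (log events, search the history, undo selected past events) can be mimicked with a small event log in which each entry carries its own undo action. This is a hypothetical sketch, not the patented implementation; all names below are invented.

    ```python
    # Hypothetical logbook sketch: each logged event carries an undo callback,
    # the history can be searched, and selected past events can be undone.
    class Logbook:
        def __init__(self):
            self._events = []             # (description, undo_callback), oldest first

        def log(self, description, undo_callback):
            self._events.append((description, undo_callback))

        def search(self, text):
            return [i for i, (d, _) in enumerate(self._events) if text in d]

        def undo_events(self, indices):
            for i in sorted(indices, reverse=True):   # undo most recent first
                description, undo_callback = self._events[i]
                undo_callback()
                self._events[i] = (description + " (undone)", lambda: None)

    env = {"tool_dir_on_path": True}
    book = Logbook()
    book.log("added tool directory to PATH", lambda: env.update(tool_dir_on_path=False))
    book.undo_events(book.search("PATH"))
    assert env["tool_dir_on_path"] is False
    ```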

  1. Reconceptualising Elite Athlete Programmes: "Undoing" the Politics of Labelling in Health and Physical Education

    ERIC Educational Resources Information Center

    Brown, Seth

    2015-01-01

    High-performance sport is a big business, with nations such as Australia and New Zealand dedicating hundreds of millions of dollars in the development of facilities and in creating sporting centres of excellence. Historically, high-performance sport and elite athlete programmes (EAPs) were regulated to an extra-curricular space in schools or local…

  2. Masculinities and Resistance: High School Boys (Un)doing Boy

    ERIC Educational Resources Information Center

    Kehler, Michael D.

    2004-01-01

    In Australia, Canada, the United States, and the United Kingdom there has been a resurgence in attention directed at boys and schooling. The media and public discourse describes it as a burgeoning moral panic. Mainly grounded in public concerns about achievement levels and violence in schools, the response has been to develop quick fixes and…

  3. The Social Funding of Race: The Role of Schooling

    ERIC Educational Resources Information Center

    Ladson-Billings, Gloria

    2018-01-01

    Our nation is suffused in questions of race and racism. Despite much scholarly and public discussion we struggle to undo long-held assumptions about race and how it functions. This article looks at race from the perspective of a public commodity that the society "funds" in order to make it seem real and intractable. Throughout the…

  4. Learning to Lead for Racial Equity

    ERIC Educational Resources Information Center

    Ngounou, Gislaine; Gutierrez, Nancy

    2017-01-01

    If education leaders aspire to confront and undo the severe racial inequities that exist in so many of our schools and school systems, then they will have to create opportunities for teachers and staff to engage in productive discussions about questions that many of them will be reluctant to consider. Given how complex and how deeply felt are…

  5. "Undoing" the Self: Should Heterosexual Teachers "Come Out" in the University Classroom?

    ERIC Educational Resources Information Center

    Allen, Louisa

    2011-01-01

    The issue of whether to "come out" in class has a poignant history in the literature by gay, lesbian and bisexual educators on this topic. By comparison few heterosexuals have publicly written about whether they explicitly reveal their heterosexuality to students. This paper contributes to the enduring debate about whether to "come out" in class…

  6. A Recovery-Oriented Approach to Dependable Services: Repairing Past Errors with System-Wide Undo

    DTIC Science & Technology

    2003-12-01

    4.5.3 Handling propagating paradoxes: the squash interface; 4.6 Discussion... 6.3.3 Compensating for paradoxes; 6.3.4 Squashing propagating... the service and comparing the behavior of the replicas to detect and squash misbehaving replicas. While on paper Byzantine fault tolerance may seem to
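
    The chapter titles that survive in this record suggest a repair-and-replay style of system-wide undo: after an undo, history is repaired and re-executed, and any externally visible output that now differs from what users already saw is a "paradox" to be compensated for. The toy Python sketch below illustrates that pattern under my own assumptions; it is not the thesis's actual design, and every name in it is invented.

    ```python
    # Toy sketch of repair-and-replay undo with paradox compensation (invented
    # names; not the thesis's mechanism). Outputs that change after replay are
    # "paradoxes" and trigger a compensating action.
    def deposit(n):
        def op(state):
            state["balance"] += n
            return state["balance"]       # value shown to the user at the time
        return op

    def withdraw(n):
        def op(state):
            state["balance"] -= n
            return state["balance"]
        return op

    def replay(ops):
        state, outputs = {"balance": 0}, []
        for op in ops:
            outputs.append(op(state))
        return state, outputs

    def undo_repair_replay(ops, repair, seen_outputs, compensate):
        state, new_outputs = replay([repair(op) for op in ops])
        for old, new in zip(seen_outputs, new_outputs):
            if old != new:                # externally visible divergence
                compensate(old, new)
        return state

    ops = [deposit(100), withdraw(150)]   # the withdrawal turns out to be erroneous
    _, seen = replay(ops)                 # users already saw a balance of -50
    undo_repair_replay(
        ops,
        repair=lambda op: deposit(0) if op is ops[1] else op,  # drop the bad withdrawal
        seen_outputs=seen,
        compensate=lambda old, new: print(f"paradox: users saw {old}, history now says {new}"),
    )
    ```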

  7. "It Should Be Better All Together": Exploring Strategies for "Undoing" Gender in Coeducational Physical Education

    ERIC Educational Resources Information Center

    Hills, Laura A.; Croston, Amanda

    2012-01-01

    Physical education (PE) remains the subject in coeducational schools that is most likely to be delivered in gender segregated sessions. Decisions to offer single sex lessons are often underpinned by discourses and practices associated with doing gender that emphasise differences in boys' and girls' attitudes, behaviours, abilities and experiences.…

  8. Affective Association: An Effective Intervention in Countering Fragmentation and Dissociation

    ERIC Educational Resources Information Center

    Hart, Carolyn

    2008-01-01

    This paper is concerned with the processes, both psychoanalytic and neuroscientific, involved in the undoing of dissociation in a 3-year-old, who was seen weekly over a nine month period. A neuroscientific and psychoanalytic developmental framework is used to follow a sequence of phenomena that emerged over the duration of relatively brief once…

  9. Research-Practice Partnerships: Building Two-Way Streets of Engagement. Social Policy Report. Volume 30, Number 4

    ERIC Educational Resources Information Center

    Tseng, Vivian; Easton, John Q.; Supplee, Lauren H.

    2017-01-01

    People have long bemoaned the silos of research and practice. Researchers express frustration that practitioners do not use or misuse research. Practitioners respond that research is not relevant to their work, or is not easily accessible or understood. Research-Practice Partnerships (RPPs) across the country are seeking to undo these patterns.…

  10. Undoing Quantitative Easing: Janet Yellen's Tiger Ride

    ERIC Educational Resources Information Center

    Niederjohn, M. Scott; Schug, Mark C.; Wood, William C.

    2014-01-01

    "One who rides a tiger is afraid to dismount," says a colorful proverb from an earlier time. This may be an apt saying for the situation facing the new head of the Federal Reserve, Janet L. Yellen, who takes over at a time when successive rounds of Fed policy have taken the central bank into uncharted territory. By historical standards,…

  11. Undoing Gender through Legislation and Schooling: The Case of AB 537 and AB 394 in California, USA

    ERIC Educational Resources Information Center

    Knotts, Greg

    2009-01-01

    This article investigates California laws AB 537: The Student Safety and Violence Prevention Act of 2000, and the recently enacted AB 394: Safe Place to Learn Act. Both demand that gender identity and sexual orientation be added to the lexicon of anti-harassment protection in public education. However, despite these progressive measures, schools…

  12. "I'm Glad I Was Designed": Un/Doing Gender and Class in Susan Price's "Odin Trilogy"

    ERIC Educational Resources Information Center

    Lehtonen, Sanna

    2012-01-01

    Susan Price's "Odin Trilogy" (2005-2008) is a juvenile science fiction series that depicts a future where class relations have become polarised due to late capitalist and technological developments and where ways of doing gender continue to be strongly connected with class. The society in the novels is based on slavery: people are either…

  13. Federal Solutions to School Fiscal Crises: Lessons from Nixon's Failed National Sales Tax for Education

    ERIC Educational Resources Information Center

    Venters, Monoka; Hauptli, Meghan V.; Cohen-Vogel, Lora

    2012-01-01

    Applying a Multiple Streams framework, the article documents the development and ultimate undoing of what became known as the national sales tax plan for education. The authors identify four factors that coalesced to lead the Nixon administration to propose replacing local property taxes with a federal value-added tax to finance K-12 education.…

  14. Early Central Regulation, Slow Financial Participation: Relations between Primary Education and the Dutch State from ± 1750-1920

    ERIC Educational Resources Information Center

    van Gijlswijk, Dick

    2016-01-01

    The declining economy of the Dutch Republic obliged city governments in the eighteenth century to take measures to undo the effects of the social deterioration. They therefore founded schools for the poor and sometimes gave full financial support. After 1795, the Batavian Revolution proclaimed that primary education was a state affair, but after a…

  15. The Unintended Consequences of School Inspection: The Prevalence of Inspection Side-Effects in Austria, The Czech Republic, England, Ireland, The Netherlands, Sweden, and Switzerland

    ERIC Educational Resources Information Center

    Jones, Karen L.; Tymms, Peter; Kemethofer, David; O'Hara, Joe; McNamara, Gerry; Huber, Stephan; Myrberg, Eva; Skedsmo, Guri; Greger, David

    2017-01-01

    It has been widely documented that accountability systems, including school inspections, bring with them unintended side effects. These unintended effects are often negative and have the potential to undo the intended positive effects. However the empirical evidence is limited. Through a European comparative study we have had the rare opportunity…

  16. Black Radicals Make for Bad Citizens: Undoing the Myth of the School to Prison Pipeline

    ERIC Educational Resources Information Center

    Sojoyner, Damien M.

    2013-01-01

    Over the past ten years, the analytic formation of the school to prison pipeline has come to dominate the lexicon and general common sense with respect to the relationship between schools and prisons in the United States. The concept and theorization that undergirds its meaning and function do not address the root causes that are central to…

  17. Did States Use Implementation Discretion to Reduce the Stringency of NCLB? Evidence from a Database of State Regulations

    ERIC Educational Resources Information Center

    Wong, Vivian C.; Wing, Coady; Martin, David; Krishnamachari, Anandita

    2018-01-01

    When No Child Left Behind (NCLB) became law in 2002, it was viewed as an effort to create uniform standards for students and schools across the country. More than a decade later, we know surprisingly little about how states actually implemented NCLB and the extent to which state implementation decisions managed to undo the centralizing objectives…

  18. ADA and C++ Business Case Analysis

    DTIC Science & Technology

    1991-07-01

    executable mini-specs, to support import of existing code. Automated database population/change propagation. 9. Documentation generation: via FrameMaker... Backplane. ii. 4GLS H-20 I I IDE/Software through Pictures (StP) 12 June 1991 iii. Interleaf and FrameMaker publishing. 13. Output formats: PostScript... FrameMaker, WordPerfect. 12. User interface: Menu and mouse, windowing, color, on-line help, undo. Database browser via forms/tables component later

  19. Undoing Feudalism: A New Look at Communal Conflict Mediation

    DTIC Science & Technology

    1994-03-24

    social constructionists assert that each individual human plays a significant role in creating and influencing reality as it is perceived by his...concepts which, over time, have proven their worth by providing meaning to new, ever-emerging social realities. Levi’s second method for belief system...competing social groups is a gradual one, though varying in speed and method according to circumstance. This realization suggests the existence of a

  20. Undoing gender? The case of complementary and alternative medicine.

    PubMed

    Brenton, Joslyn; Elliott, Sinikka

    2014-01-01

    Despite a rich body of sociological research that examines the relationship between gender and health, scholars have paid little attention to the case of complementary and alternative medicine (CAM). One recent study (Sointu 2011) posits that men and women who use CAM challenge traditional ascriptions of femininity and masculinity through the exploration of self-care and emotions, respectively. Drawing on 25 in-depth interviews with middle-class Americans who use CAM, this article instead finds that men and women interpret their CAM use in ways that reproduce traditional gendered identities. Men frame their CAM use in terms of science and rationality, while simultaneously distancing themselves from feminine-coded components of CAM, such as emotions. Women seek CAM for problems such as abusive relationships, low self-esteem, and body image concerns, and frame their CAM use as a quest for self-reinvention that largely reflects and reproduces conventional femininity. Further, the reproduction of gendered identities is shaped by the participants' embrace of neoliberal tenets, such as the cultivation of personal control. This article contributes to ongoing theoretical debates about the doing, redoing and undoing of gender, as well as the literature on health and gender. © 2013 The Authors. Sociology of Health & Illness © 2013 Foundation for the Sociology of Health & Illness/Blackwell Publishing Ltd.

  1. I Don’t Want to Come Back Down: Undoing versus Maintaining of Reward Recovery in Older Adolescents

    PubMed Central

    Gilbert, Kirsten E.; Nolen-Hoeksema, Susan; Gruber, June

    2017-01-01

    Adolescence is characterized by heightened and sometimes impairing reward sensitivity, yet less is known about how adolescents recover from highly arousing positive states. This is particularly important given high onset rates of psychopathology associated with reward sensitivity during late adolescence and early adulthood. The current study thus utilized a novel reward sensitivity task in order to examine potential ways in which older adolescent females (ages 18–21; N = 83) might recover from high arousal positive reward sensitive states. Participants underwent a fixed incentive reward sensitivity task and subsequently watched a neutral, sad, or a low approach-motivated positive emotional film clip during which subjective and physiological recovery was assessed. Results indicated that the positive and negative film conditions were associated with maintained physiological arousal while the neutral condition facilitated faster physiological recovery from the reward sensitivity task. Interestingly, individual differences in self-reported positive emotion during the reward task were associated with faster recovery in the neutral condition. Findings suggest elicited emotion (regardless of valence) may serve to maintain reward sensitivity while self-reported positive emotional experience may be a key ingredient facilitating physiological recovery or undoing. Understanding the nuances of reward recovery provides a critical step in understanding the etiology and persistence of reward dysregulation more generally. PMID:26595439

  2. Networking among women snowboarders: a study of participants at an International Woman Snowboard Camp.

    PubMed

    Sisjord, M K

    2012-02-01

    The article focuses on women snowboarders' networking and relationships with national snowboard associations and commercial organizers. The study was conducted at an International Women Snowboard Camp, which attracted women snowboarders from five different countries. A qualitative interview was undertaken with participants from each country, eight in total, plus an interview with one of the organizers (a woman). The results indicate that participants from the Nordic countries adopt a more proactive stand to promote snowboarding by organizing specific groups in relation to national associations, particularly the Norwegians and the Finnish. Furthermore, some collaboration across national borders appeared. The only Swedish participant was associated with several snowboarding communities; whereas the Italian (only one) and the Latvian snowboarders had links with commercial organizers, apparently male dominated in structure. The findings are discussed in the light of Castells' network theory and identity construction in social movements, and gender perspectives. The participants' doing/undoing gender reveals different strategies in negotiating hegemonic masculinity and the power structure in the organizations. Narratives from the Nordic participants reflect undoing gender that impacts on identity constructions in terms of project and/or resistance identity. The Italians and Latvians seemingly do gender while undertaking a subordinate position in the male-dominated structure. © 2010 John Wiley & Sons A/S.

  3. Role of Retinocortical Processing in Spatial Vision

    DTIC Science & Technology

    1989-06-01

    its inverse transform. These are even-symmetric functions. Odd-symmetric Gabor functions would also be required for image coding (Daugman, 1987), but ... spectrum square; thus its horizontal and vertical scale factors may differ by a power of 2. Since the inverse transform undoes this distortion, it has ... [Figure 3: standard form of the even Gabor filter] ... order to inverse-transform correctly. We used Gabor functions with the standard shape of Daugman's "polar

  4. Reversible Experiments: Putting Geological Disposal to the Test.

    PubMed

    Bergen, Jan Peter

    2016-06-01

    Conceiving of nuclear energy as a social experiment gives rise to the question of what to do when the experiment is no longer responsible or desirable. To be able to appropriately respond to such a situation, the nuclear energy technology in question should be reversible, i.e. it must be possible to stop its further development and implementation in society, and it must be possible to undo its undesirable consequences. This paper explores these two conditions by applying them to geological disposal of high-level radioactive waste (GD). Despite the fact that considerations of reversibility and retrievability have received increased attention in GD, the analysis in this paper concludes that GD cannot be considered reversible. Firstly, it would be difficult to stop its further development and implementation, since its historical development has led to a point where GD is significantly locked-in. Secondly, the strategy it employs for undoing undesirable consequences is less-than-ideal: it relies on containment of severely radiotoxic waste rather than attempting to eliminate this waste or its radioactivity. And while it may currently be technologically impossible to turn high-level waste into benign substances, GD's containment strategy makes it difficult to eliminate this waste's radioactivity when the possibility would arise. In all, GD should be critically reconsidered if the inclusion of reversibility considerations in radioactive waste management has indeed become as important as is sometimes claimed.

  5. I don't want to come back down: Undoing versus maintaining of reward recovery in older adolescents.

    PubMed

    Gilbert, Kirsten E; Nolen-Hoeksema, Susan; Gruber, June

    2016-03-01

    Adolescence is characterized by heightened and sometimes impairing reward sensitivity, yet less is known about how adolescents recover from highly arousing positive states. This is particularly important given high onset rates of psychopathology associated with reward sensitivity during late adolescence and early adulthood. The current study thus utilized a novel reward sensitivity task in order to examine potential ways in which older adolescent females (ages 18-21; N = 83) might recover from high arousal positive reward sensitive states. Participants underwent a fixed incentive reward sensitivity task and subsequently watched a neutral, sad, or a low approach-motivated positive emotional film clip during which subjective and physiological recovery was assessed. Results indicated that the positive and negative film conditions were associated with maintained physiological arousal while the neutral condition facilitated faster physiological recovery from the reward sensitivity task. It is interesting to note that individual differences in self-reported positive emotion during the reward task were associated with faster recovery in the neutral condition. Findings suggest elicited emotion (regardless of valence) may serve to maintain reward sensitivity whereas self-reported positive emotional experience may be a key ingredient facilitating physiological recovery or undoing. Understanding the nuances of reward recovery provides a critical step in understanding the etiology and persistence of reward dysregulation more generally. (c) 2016 APA, all rights reserved).

  6. Indigenous knowledge and science revisited

    NASA Astrophysics Data System (ADS)

    Aikenhead, Glen S.; Ogawa, Masakata

    2007-07-01

    This article provides a guided tour through three diverse cultural ways of understanding nature: an Indigenous way (with a focus on Indigenous nations in North America), a neo-indigenous way (a concept proposed to recognize many Asian nations' unique ways of knowing nature; in this case, Japan), and a Euro-American scientific way. An exploration of these three ways of knowing unfolds in a developmental way such that some key terms change to become more authentic terms that better represent each culture's collective, yet heterogeneous, worldview, metaphysics, epistemology, and values. For example, the three ways of understanding nature are eventually described as Indigenous ways of living in nature, a Japanese way of knowing seigyo-shizen, and Eurocentric sciences (plural). Characteristics of a postcolonial or anti-hegemonic discourse are suggested for science education, but some inherent difficulties with this discourse are also noted.

  7. Can we undo our first impressions?: The role of reinterpretation in reversing implicit evaluations

    PubMed Central

    Mann, Thomas C.; Ferguson, Melissa J.

    2015-01-01

    Little work has examined whether implicit evaluations can be effectively “undone” after learning new revelations. Across 7 experiments, participants fully reversed their implicit evaluation of a novel target person after reinterpreting earlier information. Revision occurred across multiple implicit evaluation measures (Experiments 1a and 1b), and only when the new information prompted a reinterpretation of prior learning versus did not (Experiment 2). The updating required active consideration of the information, as it emerged only with at least moderate cognitive resources (Experiment 3). Self-reported reinterpretation predicted (Experiment 4) and mediated (Experiment 5) revised implicit evaluations beyond the separate influence of how thoughtfully participants considered the new information in general. Finally, the revised evaluations were durable three days later (Experiment 6). We discuss how these results inform existing theoretical models, and consider implications for future research. PMID:25798625

  8. Can we undo our first impressions? The role of reinterpretation in reversing implicit evaluations.

    PubMed

    Mann, Thomas C; Ferguson, Melissa J

    2015-06-01

    Little work has examined whether implicit evaluations can be effectively "undone" after learning new revelations. Across 7 experiments, participants fully reversed their implicit evaluation of a novel target person after reinterpreting earlier information. Revision occurred across multiple implicit evaluation measures (Experiments 1a and 1b), and only when the new information prompted a reinterpretation of prior learning versus did not (Experiment 2). The updating required active consideration of the information, as it emerged only with at least moderate cognitive resources (Experiment 3). Self-reported reinterpretation predicted (Experiment 4) and mediated (Experiment 5) revised implicit evaluations beyond the separate influence of how thoughtfully participants considered the new information in general. Finally, the revised evaluations were durable 3 days later (Experiment 6). We discuss how these results inform existing theoretical models, and consider implications for future research. (c) 2015 APA, all rights reserved).

  9. Company Command: The Bottom Line

    DTIC Science & Technology

    1990-01-01

    personal basis. Be direct and don't pull punches. Be sincere and objective. ... First sergeants, as a group, agree on one rule: good first sergeants make ... You must undo the confusion. LACK OF AGREEMENT ON UNIT GOALS AND STANDARDS: First sergeants and COs should decide jointly on the direction for ... Keep the interview informal. In fact, not every interview has to be in your office. Pull one of your mechanics aside in the motor pool for a chat

  10. Documentation Library Application (DLA) Version 2.0.0.1, User Guide

    DTIC Science & Technology

    2013-05-08

    [Screen-capture residue: DLA menu items for discarding changes, undoing a document checkout, and viewing the available libraries.] ... Windows XP and access to the DLA SQL Server database. To install the DLA, navigate to N:\Dept 161\3 - PRODUCTS\ Software Installation... Health Research Center. You should see the DLA menu item listed under the NHRC programs there. Contact the DLA software POC if you encounter any

  11. A Voice Enabled Procedure Browser for the International Space Station

    NASA Technical Reports Server (NTRS)

    Rayner, Manny; Chatzichrisafis, Nikos; Hockey, Beth Ann; Farrell, Kim; Renders, Jean-Michel

    2005-01-01

    Clarissa, an experimental voice enabled procedure browser that has recently been deployed on the International Space Station (ISS), is to the best of our knowledge the first spoken dialog system in space. This paper gives background on the system and the ISS procedures, then discusses the research developed to address three key problems: grammar-based speech recognition using the Regulus toolkit; SVM based methods for open microphone speech recognition; and robust side-effect free dialogue management for handling undos, corrections and confirmations.
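
    As an illustration of the kind of side-effect-free dialogue management with undo that the abstract mentions, the sketch below keeps the dialogue state immutable and implements undo by popping a history stack. All names (DialogueState, ProcedureBrowser) are hypothetical; this is a generic sketch of the technique, not Clarissa's actual design or code.

    ```python
    # Illustrative sketch only: one generic way to support "undo" in a dialogue
    # manager by keeping state immutable and recording a history of states.
    # Names (DialogueState, ProcedureBrowser) are hypothetical, not Clarissa's API.
    from dataclasses import dataclass, replace
    from typing import Tuple

    @dataclass(frozen=True)
    class DialogueState:
        procedure: str = ""
        step: int = 0
        confirmed: Tuple[int, ...] = ()   # steps the astronaut has confirmed

    class ProcedureBrowser:
        def __init__(self, initial: DialogueState):
            self._history = [initial]     # history of immutable states

        @property
        def state(self) -> DialogueState:
            return self._history[-1]

        def apply(self, new_state: DialogueState) -> None:
            """Every command produces a new state; nothing is mutated in place."""
            self._history.append(new_state)

        def next_step(self) -> None:
            self.apply(replace(self.state, step=self.state.step + 1))

        def confirm(self) -> None:
            self.apply(replace(self.state, confirmed=self.state.confirmed + (self.state.step,)))

        def undo(self) -> None:
            """Side-effect-free undo: just drop the most recent state."""
            if len(self._history) > 1:
                self._history.pop()

    # Example: "next", "next", "undo" leaves the browser at step 1.
    browser = ProcedureBrowser(DialogueState(procedure="water sampling"))
    browser.next_step()
    browser.next_step()
    browser.undo()
    assert browser.state.step == 1
    ```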

  12. Distributed Database Control and Allocation. Volume 1. Frameworks for Understanding Concurrency Control and Recovery Algorithms.

    DTIC Science & Technology

    1983-10-01

    an Abort_i, it forwards the operation directly to the recovery system. When the recovery system acknowledges that the operation has been processed, the ... list ... Abort_i: Write T_i into the abort list. Then undo all of T_i's writes by reading their before-images from the audit trail and writing them back into the stable database. [Ack] Then, delete T_i from the active list. Restart: Process Abort_i for each T_i on the active list. [Ack] In this algorithm
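
    For readers unfamiliar with before-image recovery, the following toy sketch illustrates the undo step described in the excerpt: each write first logs the overwritten value to an audit trail, and aborting a transaction restores those before-images in reverse order. It is a simplified illustration under assumed data structures, not the report's algorithm.

    ```python
    # Toy sketch of undo-by-before-image recovery (illustrative; variable names
    # are hypothetical and the real algorithm involves more bookkeeping).
    stable_db = {"x": 10, "y": 20}          # stable database
    audit_trail = []                         # (txn, key, before_image) records
    active, aborted = set(), set()

    def write(txn, key, value):
        active.add(txn)
        audit_trail.append((txn, key, stable_db[key]))  # log before-image first
        stable_db[key] = value

    def abort(txn):
        aborted.add(txn)                     # put txn on the abort list
        # Undo the txn's writes by restoring before-images, newest first.
        for t, key, before in reversed(audit_trail):
            if t == txn:
                stable_db[key] = before
        active.discard(txn)                  # finally drop it from the active list

    def restart():
        for txn in list(active):             # on restart, abort every active txn
            abort(txn)

    write("T1", "x", 99)
    write("T1", "y", 7)
    restart()
    assert stable_db == {"x": 10, "y": 20}
    ```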

  13. Investigating Advances in the Acquisition of Secure Systems Based on Open Architecture, Open Source Software, and Software Product Lines

    DTIC Science & Technology

    2012-01-27

    example is found in games converted to serve a purpose other than entertainment, such as the development and use of games for science, technology, and ... These play-session histories can then be further modded via video editing or remixing with other media (e.g., adding music) to better enable cinematic ... available OSS (e.g., the Linux Kernel on the Sony PS3 game console) that game system hackers seek to undo. Finally, games are one of the most commonly

  14. Trace-Driven Debugging of Message Passing Programs

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Hood, Robert; Lopez, Louis; Bailey, David (Technical Monitor)

    1998-01-01

    In this paper we report on features added to a parallel debugger to simplify the debugging of parallel message passing programs. These features include replay, setting consistent breakpoints based on interprocess event causality, a parallel undo operation, and communication supervision. These features all use trace information collected during the execution of the program being debugged. We used a number of different instrumentation techniques to collect traces. We also implemented trace displays using two different trace visualization systems. The implementation was tested on an SGI Power Challenge cluster and a network of SGI workstations.
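
    One common way to realize an undo operation from collected traces is checkpointing plus deterministic replay; the toy sketch below illustrates that general idea. It is an assumption about the technique, not the implementation used in the debugger described above.

    ```python
    # Illustrative sketch (not the debugger's actual implementation): "undo" in a
    # trace-driven debugger via checkpointing plus deterministic replay of the
    # recorded event trace.
    import copy

    class ReplayDebugger:
        def __init__(self, initial_state, trace):
            self.checkpoint = copy.deepcopy(initial_state)  # state before event 0
            self.trace = trace                               # recorded events
            self.position = 0                                # events applied so far
            self.state = copy.deepcopy(initial_state)

        def step(self):
            event = self.trace[self.position]
            event(self.state)                # re-apply one recorded event
            self.position += 1

        def undo(self):
            """Go back one event by replaying the trace from the checkpoint."""
            target = self.position - 1
            self.state = copy.deepcopy(self.checkpoint)
            self.position = 0
            while self.position < target:
                self.step()

    # Example trace: each event deterministically mutates a dict "process state".
    trace = [lambda s: s.update(recv=s["recv"] + 1),
             lambda s: s.update(sent=s["sent"] + 1),
             lambda s: s.update(recv=s["recv"] + 1)]
    dbg = ReplayDebugger({"recv": 0, "sent": 0}, trace)
    dbg.step(); dbg.step(); dbg.step()
    dbg.undo()                               # back to the state after two events
    assert dbg.state == {"recv": 1, "sent": 1}
    ```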

  15. The Intergenerational Transmission of Generosity

    PubMed Central

    Wilhelm, Mark O.; Brown, Eleanor; Rooney, Patrick M.; Steinberg, Richard

    2008-01-01

    This paper estimates the correlation between the generosity of parents and the generosity of their adult children using regression models of adult children’s charitable giving. New charitable giving data are collected in the Panel Study of Income Dynamics and used to estimate the regression models. The regression models are estimated using a wide variety of techniques and specification tests, and the strength of the intergenerational giving correlations are compared with intergenerational correlations in income, wealth, and consumption expenditure from the same sample using the same set of controls. We find the religious giving of parents and children to be strongly correlated, as strongly correlated as are their income and wealth. The correlation in the secular giving (e.g., giving to the United Way, educational institutions, for poverty relief) of parents and children is smaller, similar in magnitude to the intergenerational correlation in consumption. Parents’ religious giving is positively associated with children’s secular giving, but in a more limited sense. Overall, the results are consistent with generosity emerging at least in part from the influence of parental charitable behavior. In contrast to intergenerational models in which parental generosity towards their children can undo government transfer policy (Ricardian equivalence), these results suggest that parental generosity towards charitable organizations might reinforce government policies, such as tax incentives aimed at encouraging voluntary transfers. PMID:19802345

  16. Programs of religious/spiritual support in hospitals - five "Whies" and five "Hows".

    PubMed

    Saad, Marcelo; de Medeiros, Roberta

    2016-08-22

    A contemporary orientation of the hospital experience model must encompass the clients' religious-spiritual dimension. The objective of this paper is to share a previous experience, highlighting at least five reasons hospitals should invest in this direction, and an equal number of steps required to achieve it. In the first part, the text discusses five reasons to invest in religious-spiritual support programs: 1. Religious-spiritual wellbeing is related to better health; 2. Religious-spiritual appreciation is a standard for hospital accreditation; 3. To undo religious-spiritual misunderstandings that can affect treatment; 4. Patients demand a religious-spiritual outlook from the institution; and 5. Costs may be reduced with religious-spiritual support. In the second part, the text suggests five steps to implement religious-spiritual support programs: 1. Deep institutional involvement; 2. Formal staff training; 3. Infrastructure and resources; 4. Adjustment of institutional politics; and 5. Agreement with religious-spiritual leaders. The authors hope the information compiled here can inspire hospitals to adopt actions toward optimization of the healing experience.

  17. Social neuroscience: undoing the schism between neurology and psychiatry.

    PubMed

    Ibáñez, Agustín; García, Adolfo M; Esteves, Sol; Yoris, Adrián; Muñoz, Edinson; Reynaldo, Lucila; Pietto, Marcos Luis; Adolfi, Federico; Manes, Facundo

    2018-02-01

    Multiple disorders once jointly conceived as "nervous diseases" became segregated by the distinct institutional traditions forged in neurology and psychiatry. As a result, each field specialized in the study and treatment of a subset of such conditions. Here we propose new avenues for interdisciplinary interaction through a triangulation of both fields with social neuroscience. To this end, we review evidence from five relevant domains (facial emotion recognition, empathy, theory of mind, moral cognition, and social context assessment), highlighting their common disturbances across neurological and psychiatric conditions and discussing their multiple pathophysiological mechanisms. Our proposal is anchored in multidimensional evidence, including behavioral, neurocognitive, and genetic findings. From a clinical perspective, this work paves the way for dimensional and transdiagnostic approaches, new pharmacological treatments, and educational innovations rooted in a combined neuropsychiatric training. Research-wise, it fosters new models of the social brain and a novel platform to explore the interplay of cognitive and social functions. Finally, we identify new challenges for this synergistic framework.

  18. The art and science of integrating Undoing Racism with CBPR: challenges of pursuing NIH funding to investigate cancer care and racial equity.

    PubMed

    Yonas, Michael A; Jones, Nora; Eng, Eugenia; Vines, Anissa I; Aronson, Robert; Griffith, Derek M; White, Brandolyn; DuBose, Melvin

    2006-11-01

    In this nation, the unequal burden of disease among People of Color has been well documented. One starting point to eliminating health disparities is recognizing the existence of inequities in health care delivery and identifying the complexities of how institutional racism may operate within the health care system. In this paper, we explore the integration of community-based participatory research (CBPR) principles with an Undoing Racism process to conceptualize, design, apply for, and secure National Institutes of Health (NIH) funding to investigate the complexities of racial equity in the system of breast cancer care. Additionally, we describe the sequence of activities and "necessary conflicts" managed by our Health Disparities Collaborative to design and submit an application for NIH funding. This process of integrating CBPR principles with anti-racist community organizing presented unique challenges that were negotiated only by creating a strong foundation of trusting relationships that viewed conflict as being necessary. The process of developing a successful NIH grant proposal illustrated a variety of important lessons associated with the concepts of cultural humility and cultural safety. For successfully conducting CBPR, major challenges have included: assembling and mobilizing a partnership; the difficulty of establishing a shared vision and purpose for the group; the problem of maintaining trust; and the willingness to address differences in institutional cultures. Expectation, acceptance and negotiation of conflict were essential in the process of developing, preparing and submitting our NIH application. Central to negotiating these and other challenges has been the utilization of a CBPR approach.

  19. Understanding the Dynamics of Socio-Hydrological Environment: a Conceptual Framework

    NASA Astrophysics Data System (ADS)

    Woyessa, Y.; Welderufael, W.; Edossa, D.

    2011-12-01

    Human actions affect ecological systems and the services they provide through various activities, such as land use, water use, pollution and climate change. Climate change is perhaps one of the most important sustainable development challenges that threaten to undo many of the development efforts being made to reach the targets set for the Millennium Development Goals. Understanding the change of ecosystems under different scenarios of climate and biophysical conditions could assist in bringing the issue of ecosystem services into decision making process. Similarly, the impacts of land use change on ecosystems and biodiversity have received considerable attention from ecologists and hydrologists alike. Land use change in a catchment can impact on water supply by altering hydrological processes, such as infiltration, groundwater recharge, base flow and direct runoff. In the past a variety of models were used for predicting land-use changes. Recently the focus has shifted away from using mathematically oriented models to agent-based modelling (ABM) approach to simulate land use scenarios. A conceptual framework is being developed which integrates climate change scenarios and the human dimension of land use decision into a hydrological model in order to assess its impacts on the socio-hydrological dynamics of a river basin. The following figures present the framework for the analysis and modelling of the socio-hydrological dynamics. Keywords: climate change, land use, river basin

  20. [Dogs, man-wolves and full moon].

    PubMed

    Goddemeier, Christof

    2002-06-01

    According to a study in the British Medical Journal, the incidence of dog bites in England increases at the time of a full moon. The following article first reviews the myths dealing with the werewolf. By changing from delinquent to patient during the Enlightenment, the werewolf became important to the history of medicine and psychiatry. From the anthropological point of view, the so-called Lycorexia may be understood as an unconscious effort to undo evolution by transformation into a beast. The figure of the "Huckup" in recent variants of the werewolf subject expresses a psychological turn of the legend.

  1. Ontario’s plunging price-caps on generics: deeper dives may drown some drugs

    PubMed Central

    Anis, Aslam; Harvard, Stephanie; Marra, Carlo

    2011-01-01

    In April 2010, the Ontario government announced another reduction in the maximum price of generic drugs permitted under the Ontario Drug Benefit (ODB) program, demanding that generic drugs now be sold for no more than 25% of the branded product’s price. Other provinces are following Ontario in setting unprecedentedly low price-caps to reduce the cost of generic drugs. Generic product substitution legislation is vital to reducing costs to provincial drug plans, yet lower and lower price-caps may undo some of the benefits of substitution legislation if generics find it difficult to survive. PMID:22046229

  2. Ontario's plunging price-caps on generics: deeper dives may drown some drugs.

    PubMed

    Anis, Aslam; Harvard, Stephanie; Marra, Carlo

    2011-01-01

    In April 2010, the Ontario government announced another reduction in the maximum price of generic drugs permitted under the Ontario Drug Benefit (ODB) program, demanding that generic drugs now be sold for no more than 25% of the branded product's price. Other provinces are following Ontario in setting unprecedentedly low price-caps to reduce the cost of generic drugs. Generic product substitution legislation is vital to reducing costs to provincial drug plans, yet lower and lower price-caps may undo some of the benefits of substitution legislation if generics find it difficult to survive.

  3. Protein function prediction--the power of multiplicity.

    PubMed

    Rentzsch, Robert; Orengo, Christine A

    2009-04-01

    Advances in experimental and computational methods have quietly ushered in a new era in protein function annotation. This 'age of multiplicity' is marked by the notion that only the use of multiple tools, multiple evidence and considering the multiple aspects of function can give us the broad picture that 21st century biology will need to link and alter micro- and macroscopic phenotypes. It might also help us to undo past mistakes by removing errors from our databases and prevent us from producing more. On the downside, multiplicity is often confusing. We therefore systematically review methods and resources for automated protein function prediction, looking at individual (biochemical) and contextual (network) functions, respectively.

  4. streamgap-pepper: Effects of peppering streams with many small impacts

    NASA Astrophysics Data System (ADS)

    Bovy, Jo; Erkal, Denis; Sanders, Jason

    2017-02-01

    streamgap-pepper computes the effect of subhalo fly-bys on cold tidal streams based on the action-angle representation of streams. A line-of-parallel-angle approach is used to calculate the perturbed distribution function of a given stream segment by undoing the effect of all impacts. This approach allows one to compute the perturbed stream density and track in any coordinate system in minutes for realizations of the subhalo distribution down to 10^5 Msun, accounting for the stream's internal dispersion and overlapping impacts. This code uses galpy (ascl:1411.008) and the streampepperdf.py galpy extension, which implements the fast calculation of the perturbed stream structure.

  5. A New Look at Offshore Assembly: The Internationalization of Industry,

    DTIC Science & Technology

    1981-03-01

    [OCR residue of Spanish-language footnotes citing works on maquiladoras of core countries operating in the Third World and on the labor force; no abstract text is recoverable.]

  6. Two-Year Colleges and Vocational Schools as Sources of Military Manpower

    DTIC Science & Technology

    1984-10-01

    [OCR residue of the report documentation page.] ... enlistment rates by states and other geographic areas. The examination was undertaken to isolate economic and other factors related to ... [Table residue: enlistment and population figures for metropolitan areas such as Albuquerque, NM, and Amarillo, TX.]

  7. A survey on keeler’s theorem and application of symmetric group for swapping game

    NASA Astrophysics Data System (ADS)

    Pratama, Yohanssen; Prakasa, Yohenry

    2017-01-01

    An episode of Futurama features a two-body mind-switching machine that will not work more than once on the same pair of bodies. The problem is: can the switching be undone so as to restore all minds to their original bodies? Ken Keeler found an algorithm that undoes any mind-scrambling permutation, and Lihua Huang found a refinement of it. We examine how the puzzle can be modeled in terms of group theory, using the symmetric group to solve it and to find the most efficient approach. We then build an algorithm, implement it as a computer program, and study how the notion of transposition affects the algorithm's complexity. The number of steps given by each algorithm differs, and one of the algorithms has an advantage in terms of efficiency. We compare Ken Keeler's and Lihua Huang's algorithms to see whether there is any difference when they are run as computer programs, even though their complexity may remain the same.
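
    The flavor of such an unscrambling algorithm can be illustrated with a short program. The sketch below uses two fresh helper bodies and never repeats a pair; the particular swap sequence is one valid Keeler-style construction chosen for clarity, not necessarily the exact sequence analyzed in the paper or in Huang's refinement.

    ```python
    # A Keeler-style unscrambler (illustrative; one valid construction, not
    # necessarily the exact sequence from Keeler's proof or Huang's refinement).
    def swap(minds, a, b, used):
        assert frozenset((a, b)) not in used, "machine refuses a repeated pair"
        used.add(frozenset((a, b)))
        minds[a], minds[b] = minds[b], minds[a]

    def unscramble(minds, used, x, y):
        """Restore minds[b] == b for all original bodies using fresh helpers x, y."""
        helpers_home = True
        for start in list(minds):
            if start in (x, y) or minds[start] == start:
                continue
            # Extract the cycle b0 -> b1 -> ... containing `start`.
            cycle, b = [], start
            while b not in cycle:
                cycle.append(b)
                b = minds[b]
            b0, rest = cycle[0], cycle[1:]
            for bi in rest:                # park each displaced mind via x ...
                swap(minds, x, bi, used)
            swap(minds, y, b0, used)       # ... then use y to close the cycle
            swap(minds, x, b0, used)
            swap(minds, y, rest[0], used)
            helpers_home = not helpers_home
        if not helpers_home:               # x and y ended with each other's minds
            swap(minds, x, y, used)

    # Example: bodies 1..4 scrambled as the cycle (1 2 3 4); "x" and "y" are fresh.
    minds = {1: 2, 2: 3, 3: 4, 4: 1, "x": "x", "y": "y"}
    unscramble(minds, used=set(), x="x", y="y")
    assert all(minds[b] == b for b in minds)
    ```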

  8. Deconvolution of the vestibular evoked myogenic potential.

    PubMed

    Lütkenhöner, Bernd; Basel, Türker

    2012-02-07

    The vestibular evoked myogenic potential (VEMP) and the associated variance modulation can be understood by a convolution model. Two functions of time are incorporated into the model: the motor unit action potential (MUAP) of an average motor unit, and the temporal modulation of the MUAP rate of all contributing motor units, briefly called rate modulation. The latter is the function of interest, whereas the MUAP acts as a filter that distorts the information contained in the measured data. Here, it is shown how to recover the rate modulation by undoing the filtering using a deconvolution approach. The key aspects of our deconvolution algorithm are as follows: (1) the rate modulation is described in terms of just a few parameters; (2) the MUAP is calculated by Wiener deconvolution of the VEMP with the rate modulation; (3) the model parameters are optimized using a figure-of-merit function where the most important term quantifies the difference between measured and model-predicted variance modulation. The effectiveness of the algorithm is demonstrated with simulated data. An analysis of real data confirms the view that there are basically two components, which roughly correspond to the waves p13-n23 and n34-p44 of the VEMP. The rate modulation corresponding to the first, inhibitory component is much stronger than that corresponding to the second, excitatory component. But the latter is more extended so that the two modulations have almost the same equivalent rectangular duration. Copyright © 2011 Elsevier Ltd. All rights reserved.
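
    For orientation, the frequency-domain step at the heart of Wiener deconvolution can be written in a few lines. The sketch below is a generic illustration with made-up toy signals; the cited algorithm additionally parameterizes the rate modulation and optimizes a figure-of-merit function, which is not shown here.

    ```python
    # Generic Wiener-deconvolution sketch (illustrative; the cited algorithm
    # parameterizes and fits the rate modulation, this only shows the
    # frequency-domain division step that recovers one factor of a convolution).
    import numpy as np

    def wiener_deconvolve(measured, kernel, noise_to_signal=1e-2):
        """Estimate x from measured ~= x convolved with kernel, plus noise."""
        n = len(measured)
        H = np.fft.rfft(kernel, n)
        Y = np.fft.rfft(measured, n)
        # Wiener filter: conj(H) / (|H|^2 + NSR) attenuates bands where H is weak.
        G = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)
        return np.fft.irfft(Y * G, n)

    # Toy example: a biphasic "MUAP-like" kernel smearing a rate modulation.
    t = np.linspace(0, 1, 512)
    rate = np.exp(-((t - 0.3) / 0.05) ** 2) - 0.5 * np.exp(-((t - 0.5) / 0.08) ** 2)
    muap = np.diff(np.exp(-((t - 0.05) / 0.01) ** 2), prepend=0.0)
    vemp = np.convolve(rate, muap)[: len(t)] + 0.01 * np.random.randn(len(t))
    recovered = wiener_deconvolve(vemp, muap)
    ```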

  9. TOAD Editor

    NASA Technical Reports Server (NTRS)

    Bingle, Bradford D.; Shea, Anne L.; Hofler, Alicia S.

    1993-01-01

    Transferable Output ASCII Data (TOAD) computer program (LAR-13755) implements format designed to facilitate transfer of data across communication networks and dissimilar host computer systems. Any data file conforming to TOAD format standard called TOAD file. TOAD Editor is interactive software tool for manipulating contents of TOAD files. Commonly used to extract filtered subsets of data for visualization of results of computation. Also offers such user-oriented features as on-line help, clear English error messages, startup file, macroinstructions defined by user, command history, user variables, UNDO features, and full complement of mathematical, statistical, and conversion functions. Companion program, TOAD Gateway (LAR-14484), converts data files from variety of other file formats to that of TOAD. TOAD Editor written in FORTRAN 77.

  10. The next generation of solar panel substrates?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gledhill, K.M.; Boswell, R.L.; Paul, J.G.

    For over 25 years, satellite power system designers have used rigid honeycomb panels as solar array substrates. Those years have seen very little improvement in the performance of these rigid systems. A new technology under development at the Phillips Laboratory, however, may undo this stagnancy. Composite isogrid panel structures offer a number of potential advantages over honeycomb sandwich structures for solar array applications, including stiffness, weight, and cost improvements. Phillips Laboratory will be performing a series of evaluative tests on the isogrid structure to determine its suitability as a substitute for honeycomb sandwiches in solar panel applications. Testing will include three-point bending, thermal vacuum, and thermal cycling.

  11. Undoing an epidemiological paradox: the tobacco industry's targeting of US Immigrants.

    PubMed

    Acevedo-Garcia, Dolores; Barbeau, Elizabeth; Bishop, Jennifer Anne; Pan, Jocelyn; Emmons, Karen M

    2004-12-01

    We sought to ascertain whether the tobacco industry has conceptualized the US immigrant population as a separate market. We conducted a content analysis of major tobacco industry documents. The tobacco industry has engaged in 3 distinct marketing strategies aimed at US immigrants: geographically based marketing directed toward immigrant communities, segmentation based on immigrants' assimilation status, and coordinated marketing focusing on US immigrant groups and their countries of origin. Public health researchers should investigate further the tobacco industry's characterization of the assimilated and non-assimilated immigrant markets, and its specific strategies for targeting these groups, in order to develop informed national and international tobacco control countermarketing strategies designed to protect immigrant populations and their countries of origin.

  12. Transferable Output ASCII Data (TOAD) editor version 1.0 user's guide

    NASA Technical Reports Server (NTRS)

    Bingel, Bradford D.; Shea, Anne L.; Hofler, Alicia S.

    1991-01-01

    The Transferable Output ASCII Data (TOAD) editor is an interactive software tool for manipulating the contents of TOAD files. The TOAD editor is specifically designed to work with tabular data. Selected subsets of data may be displayed to the user's screen, sorted, exchanged, duplicated, removed, replaced, inserted, or transferred to and from external files. It also offers a number of useful features including on-line help, macros, a command history, an 'undo' option, variables, and a full complement of mathematical functions and conversion factors. Written in ANSI FORTRAN 77 and completely self-contained, the TOAD editor is very portable and has already been installed on SUN, SGI/IRIS, and CONVEX hosts.

  13. The ecology of markets.

    PubMed Central

    Nordhaus, W D

    1992-01-01

    Economies are sometimes viewed as analogous to ecological systems in which "everything is connected to everything else." In complex modern economies, the question arises whether the market mechanism can appropriately coordinate all the interconnections or whether instead some supramarket body is needed to coordinate the vast web of human activities. This study describes how an idealized decentralized competitive market in fact coordinates the different economic organisms in an efficient manner. The problems of pollution and other externalities can undo the efficient outcome unless corrected by appropriate property rights or corrective taxes. But in closing the economic circle, the internalized economy does not actually need to close the natural cycles by linking up all physical flows through recycling. PMID:11607264

  14. Approximate reversibility in the context of entropy gain, information gain, and complete positivity

    NASA Astrophysics Data System (ADS)

    Buscemi, Francesco; Das, Siddhartha; Wilde, Mark M.

    2016-06-01

    There are several inequalities in physics which limit how well we can process physical systems to achieve some intended goal, including the second law of thermodynamics, entropy bounds in quantum information theory, and the uncertainty principle of quantum mechanics. Recent results provide physically meaningful enhancements of these limiting statements, determining how well one can attempt to reverse an irreversible process. In this paper, we apply and extend these results to give strong enhancements to several entropy inequalities, having to do with entropy gain, information gain, entropic disturbance, and complete positivity of open quantum systems dynamics. Our first result is a remainder term for the entropy gain of a quantum channel. This result implies that a small increase in entropy under the action of a subunital channel is a witness to the fact that the channel's adjoint can be used as a recovery map to undo the action of the original channel. We apply this result to pure-loss, quantum-limited amplifier, and phase-insensitive quantum Gaussian channels, showing how a quantum-limited amplifier can serve as a recovery from a pure-loss channel and vice versa. Our second result regards the information gain of a quantum measurement, both without and with quantum side information. We find here that a small information gain implies that it is possible to undo the action of the original measurement if it is efficient. The result also has operational ramifications for the information-theoretic tasks known as measurement compression without and with quantum side information. Our third result shows that the loss of Holevo information caused by the action of a noisy channel on an input ensemble of quantum states is small if and only if the noise can be approximately corrected on average. We finally establish that the reduced dynamics of a system-environment interaction are approximately completely positive and trace preserving if and only if the data processing inequality holds approximately.
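
    For context, the exact (non-approximate) statement that such remainder terms refine is Petz's theorem on reversibility: relative entropy is preserved under a channel exactly when the Petz recovery map undoes the channel's action. A standard formulation is sketched below; the rotated and optimized recovery maps used in the refined, approximate bounds are omitted.

    ```latex
    % Orientation only: the exact (Petz) reversibility statement that the
    % paper's approximate remainder terms refine.
    \[
      \mathcal{R}_{\sigma,\mathcal{N}}(\omega)
        \;=\; \sigma^{1/2}\,\mathcal{N}^{\dagger}\!\Big(
              \mathcal{N}(\sigma)^{-1/2}\,\omega\,\mathcal{N}(\sigma)^{-1/2}\Big)\,\sigma^{1/2},
    \]
    \[
      D(\rho\|\sigma) \;=\; D\big(\mathcal{N}(\rho)\,\|\,\mathcal{N}(\sigma)\big)
      \quad\Longleftrightarrow\quad
      \mathcal{R}_{\sigma,\mathcal{N}}\big(\mathcal{N}(\rho)\big) \;=\; \rho .
    \]
    ```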

  15. Murder will out.

    PubMed

    Lewis, C N; Arsenian, J

    1977-04-01

    A 1974 showing of more than 200 oils and water colors at the Tate Gallery, London, has led to a revival of interest in the 19th century English painter, Richard Dadd (1817 to 1886). In 1843, Dadd killed his father, cutting his throat, because he believed him to be the devil in human form. On a trip to the Near East, Dadd became deluded that the Egyptian god Osiris was directing him to eliminate the devil's influence. Four months after he returned to London he murdered his father, and was institutionalized for the last 43 years of his life. We advance the hypothesis that one particular painting, 'The Fairy Feller's Master-Stroke,' symbolically re-enacts the murder and makes talion restitution. The painting shows minute attention to detail and altogether occupied the artist for 9 years. He also made a water color copy of it entirely from memory, and wrote a 22-page poem explaining the picture with the title, 'An Elimination.' We suggest that this hints at the same theme of undoing. Some art critics have seen in Dadd's other art works a projection of his inner feelings, especially a series of more than 30 water colors entitled 'Sketches to Illustrate the Passions,' among them 'Murder,' 'Anger,' 'Hatred,' 'Grief,' and 'Melancholy.' We construe these to support the thesis of redoing and undoing following the trauma of murder. We also mention Dadd's reminiscing, visible in his art, and its usefulness in reaffirming his self-identity. In the art work of Dadd's last 25 years, violent scenes are remarkably absent. Instead, imaginary landscapes and sea scenes--the subject matter of his earliest adolescent art--reflect an inward absorption in and continuity of lifelong interests. We suggest that the long process of painting 'The Fairy Feller's Master-Stroke' recapitulated and made restitution for the murder, encapsulating it so that compulsive expression of violent ideation was largely reduced, allowing other memories and activities to be engaged and expressed.

  16. Critical incident technique: an innovative participatory approach to examine and document racial disparities in breast cancer healthcare services

    PubMed Central

    Yonas, Michael A.; Aronson, Robert; Schaal, Jennifer; Eng, Eugenia; Hardy, Christina; Jones, Nora

    2013-01-01

    Disproportionate and persistent inequities in quality of healthcare have been observed among persons of color in the United States. To understand and ultimately eliminate such inequities, several public health institutions have issued calls for innovative methods and approaches that examine determinants from the social, organizational and public policy contexts to inform the design of systems change interventions. The authors, including academic and community research partners in a community-based participatory research (CBPR) study, reflected together on the use and value of the critical incident technique (CIT) for exploring racial disparities in healthcare for women with breast cancer. Academic and community partners used initial large group discussion involving a large partnership of 35 academic and community researchers guided by principles of CBPR, followed by the efforts of a smaller interdisciplinary manuscript team of academic and community researchers to reflect, document summarize and translate this participatory research process, lessons learned and value added from using the CIT with principles of CBPR and Undoing Racism. The finding of this article is a discussion of the process, strengths and challenges of utilizing CIT with CBPR. The participation of community members at all levels of the research process including development, collection of the data and analysis of the data was enhanced by the CIT process. As the field of CBPR continues to mature, innovative processes which combine the expertise of community and academic partners can enhance the success of such partnerships. This report contributes to existing literature by illustrating a unique and participatory research application of CIT with principles of CBPR and Undoing Racism. Findings highlight the collaborative process used to identify and implement this novel method and the adaptability of this technique in the interdisciplinary exploration of system-level changes to understand and address disparities in breast cancer and cancer care. PMID:24000307

  17. Do artists see their retinas?

    PubMed

    Perdreau, Florian; Cavanagh, Patrick

    2011-01-01

    Our perception starts with the image that falls on our retina and on this retinal image, distant objects are small and shadowed surfaces are dark. But this is not what we see. Visual constancies correct for distance so that, for example, a person approaching us does not appear to become a larger person. Interestingly, an artist, when rendering a scene realistically, must undo all these corrections, making distant objects again small. To determine whether years of art training and practice have conferred any specialized visual expertise, we compared the perceptual abilities of artists to those of non-artists in three tasks. We first asked them to adjust either the size or the brightness of a target to match it to a standard that was presented on a perspective grid or within a cast shadow. We instructed them to ignore the context, judging size, for example, by imagining the separation between their fingers if they were to pick up the test object from the display screen. In the third task, we tested the speed with which artists access visual representations. Subjects searched for an L-shape in contact with a circle; the target was an L-shape, but because of visual completion, it appeared to be a square occluded behind a circle, camouflaging the L-shape that is explicit on the retinal image. Surprisingly, artists were as affected by context as non-artists in all three tests. Moreover, artists took, on average, significantly more time to make their judgments, implying that they were doing their best to demonstrate the special skills that we, and they, believed they had acquired. Our data therefore support the proposal from Gombrich that artists do not have special perceptual expertise to undo the effects of constancies. Instead, once the context is present in their drawing, they need only compare the drawing to the scene to match the effect of constancies in both.

  18. Closing the Black-White Gap in Birth Outcomes: A Life-course Approach

    PubMed Central

    Lu, Michael C.; Kotelchuck, Milton; Hogan, Vijaya; Jones, Loretta; Wright, Kynna; Halfon, Neal

    2015-01-01

    In the United States, Black infants have significantly worse birth outcomes than White infants. Over the past decades, public health efforts to address these disparities have focused primarily on increasing access to prenatal care, however, this has not led to closing the gap in birth outcomes. We propose a 12-point plan to reduce Black-White disparities in birth outcomes using a life-course approach. The first four points (increase access to interconception care, preconception care, quality prenatal care, and healthcare throughout the life course) address the needs of African American women for quality healthcare across the lifespan. The next four points (strengthen father involvement, systems integration, reproductive social capital, and community building) go beyond individual-level interventions to address enhancing family and community systems that may influence the health of pregnant women, families, and communities. The last four points (close the education gap, reduce poverty, support working mothers, and undo racism) move beyond the biomedical model to address the social and economic inequities that underlie much of health disparities. Closing the Black-White gap in birth outcomes requires a life course approach which addresses both early life disadvantages and cumulative allostatic load over the life course. PMID:20629248

  19. Closing the Black-White gap in birth outcomes: a life-course approach.

    PubMed

    Lu, Michael C; Kotelchuck, Milton; Hogan, Vijaya; Jones, Loretta; Wright, Kynna; Halfon, Neal

    2010-01-01

    In the United States, Black infants have significantly worse birth outcomes than White infants. Over the past decades, public health efforts to address these disparities have focused primarily on increasing access to prenatal care, however, this has not led to closing the gap in birth outcomes. We propose a 12-point plan to reduce Black-White disparities in birth outcomes using a life-course approach. The first four points (increase access to interconception care, preconception care, quality prenatal care, and healthcare throughout the life course) address the needs of African American women for quality healthcare across the lifespan. The next four points (strengthen father involvement, systems integration, reproductive social capital, and community building) go beyond individual-level interventions to address enhancing family and community systems that may influence the health of pregnant women, families, and communities. The last four points (close the education gap, reduce poverty, support working mothers, and undo racism) move beyond the biomedical model to address the social and economic inequities that underlie much of health disparities. Closing the Black-White gap in birth outcomes requires a life course approach which addresses both early life disadvantages and cumulative allostatic load over the life course.

  20. Undoing an Epidemiological Paradox: The Tobacco Industry’s Targeting of US Immigrants

    PubMed Central

    Acevedo-Garcia, Dolores; Barbeau, Elizabeth; Bishop, Jennifer Anne; Pan, Jocelyn; Emmons, Karen M.

    2004-01-01

    Objectives. We sought to ascertain whether the tobacco industry has conceptualized the US immigrant population as a separate market. Methods. We conducted a content analysis of major tobacco industry documents. Results. The tobacco industry has engaged in 3 distinct marketing strategies aimed at US immigrants: geographically based marketing directed toward immigrant communities, segmentation based on immigrants’ assimilation status, and coordinated marketing focusing on US immigrant groups and their countries of origin. Conclusions. Public health researchers should investigate further the tobacco industry’s characterization of the assimilated and non-assimilated immigrant markets, and its specific strategies for targeting these groups, in order to develop informed national and international tobacco control countermarketing strategies designed to protect immigrant populations and their countries of origin. PMID:15569972

  1. Shared resource control between human and computer

    NASA Technical Reports Server (NTRS)

    Hendler, James; Wilson, Reid

    1989-01-01

    The advantages of an AI system that actively monitors human control of a shared resource (such as a telerobotic manipulator) are presented. A system is described in which a simple AI planning program gains efficiency by monitoring human actions and recognizing when the actions cause a change in the system's assumed state of the world. This enables the planner to recognize when an interaction occurs between human actions and system goals, and allows maintenance of an up-to-date knowledge of the state of the world. The system thus informs the operator when a human action would undo a goal achieved by the system or would render a system goal unachievable, and it efficiently replans the establishment of goals after human intervention.
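
    A minimal sketch of this monitoring idea follows: the planner keeps an assumed world state, and each observed human action is checked against achieved and pending goals. All names and rules here are hypothetical illustrations, not the system described in the paper.

    ```python
    # Toy sketch (hypothetical names; not the paper's system): a planner that
    # monitors human actions, updates its assumed world state, and flags when an
    # action undoes an achieved goal or makes a pending goal unachievable.
    class SharedResourceMonitor:
        def __init__(self, world, goals, achievable):
            self.world = dict(world)        # planner's assumed state of the world
            self.goals = set(goals)         # facts the system is trying to make true
            self.achievable = achievable    # fn(world, goal) -> bool

        def observe_human_action(self, effects):
            """effects: dict of facts the human action made true/false."""
            undone = {g for g in self.goals
                      if self.world.get(g) and not effects.get(g, self.world[g])}
            self.world.update(effects)
            blocked = {g for g in self.goals
                       if not self.world.get(g) and not self.achievable(self.world, g)}
            if undone:
                print("operator notice: human action undid goals", undone)
            if blocked:
                print("operator notice: goals now unachievable", blocked)
            return undone | blocked          # goals that need replanning

    monitor = SharedResourceMonitor(
        world={"arm_stowed": True, "camera_on": False},
        goals={"arm_stowed", "camera_on"},
        achievable=lambda w, g: g != "camera_on" or w.get("power_available", True))
    monitor.observe_human_action({"arm_stowed": False, "power_available": False})
    ```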

  2. Considering the reversibility of passive and reactive transport problems: Are forward-in-time and backward-in-time models ever equivalent?

    NASA Astrophysics Data System (ADS)

    Engdahl, N.

    2017-12-01

    Backward in time (BIT) simulations of passive tracers are often used for capture zone analysis, source area identification, and generation of travel time and age distributions. The BIT approach has the potential to become an immensely powerful tool for direct inverse modeling but the necessary relationships between the processes modeled in the forward and backward models have yet to be formally established. This study explores the time reversibility of passive and reactive transport models in a variety of 2D heterogeneous domains using particle-based random walk methods for the transport and nonlinear reaction steps. Distributed forward models are used to generate synthetic observations that form the initial conditions for the backward in time models and we consider both linear-flood and point injections. The results for passive travel time distributions show that forward and backward models are not exactly equivalent but that the linear-flood BIT models are reasonable approximations. Point based BIT models fall within the travel time range of the forward models, though their distributions can be distinctive in some cases. The BIT approximation is not as robust when nonlinear reactive transport is considered and we find that this reaction system is only exactly reversible under uniform flow conditions. We use a series of simplified, longitudinally symmetric, but heterogeneous, domains to illustrate the causes of these discrepancies between the two model types. Many of the discrepancies arise because diffusion is a "self-adjoint" operator, which causes mass to spread in the forward and backward models. This allows particles to enter low velocity regions in both models, which has opposite effects in the forward and reverse models. It may be possible to circumvent some of these limitations using an anti-diffusion model to undo mixing when time is reversed, but this is beyond the capabilities of the existing Lagrangian methods.
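
    The asymmetry attributed to diffusion can be demonstrated with a one-dimensional toy: reversing the advective velocity brings the plume's center of mass back, but the random-walk (diffusive) spreading continues to grow. The parameters below are arbitrary and purely illustrative.

    ```python
    # Toy illustration (1-D, illustrative parameters): reversing advection does
    # not reverse diffusion, so a backward-in-time random walk keeps spreading
    # instead of refocusing the plume; this is the asymmetry discussed above.
    import numpy as np

    rng = np.random.default_rng(0)
    n_particles, n_steps, dt = 5000, 200, 1.0
    v, D = 0.5, 0.1                      # advective velocity and dispersion coeff.

    x = np.zeros(n_particles)            # forward run: point source at x = 0
    for _ in range(n_steps):
        x += v * dt + np.sqrt(2 * D * dt) * rng.standard_normal(n_particles)
    spread_forward = x.std()

    for _ in range(n_steps):             # "backward" run: flip the velocity only
        x += -v * dt + np.sqrt(2 * D * dt) * rng.standard_normal(n_particles)
    spread_after_reversal = x.std()

    # The mean returns to roughly 0, but the spread grows from about
    # sqrt(2*D*n*dt) to about sqrt(2*D*2*n*dt).
    print(x.mean(), spread_forward, spread_after_reversal)
    ```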

  3. Climate Change and Socio-Hydrological Dynamics: Adaptations and Feedbacks

    NASA Astrophysics Data System (ADS)

    Woyessa, Yali E.; Welderufael, Worku A.

    2012-10-01

    A functioning ecological system results in ecosystem goods and services which are of direct value to human beings. Ecosystem services are the conditions and processes which sustain and fulfil human life, and maintain biodiversity and the production of ecosystem goods. However, human actions affect ecological systems and the services they provide through various activities, such as land use, water use, pollution and climate change. Climate change is perhaps one of the most important sustainable development challenges that threatens to undo many of the development efforts being made to reach the targets set for the Millennium Development Goals. Understanding the provision of ecosystem services and how they change under different scenarios of climate and biophysical conditions could assist in bringing the issue of ecosystem services into decision making process. Similarly, the impacts of land use change on ecosystems and biodiversity have received considerable attention from ecologists and hydrologists alike. Land use change in a catchment can impact on water supply by altering hydrological processes, such as infiltration, groundwater recharge, base flow and direct runoff. In the past a variety of models were used for predicting landuse changes. Recently, the focus has shifted away from using mathematically oriented models to agent-based modeling (ABM) approach to simulate land use scenarios. The agent-based perspective, with regard to land-use cover change, is centered on the general nature and rules of land-use decision making by individuals. A conceptual framework is developed to investigate the possibility of incorporating the human dimension of land use decision and climate change model into a hydrological model in order to assess the impact of future land use scenario and climate change on the ecological system in general and water resources in particular.
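
    To make the agent-based idea concrete, the toy below couples made-up farmer decision rules to a lumped runoff coefficient. It is a purely illustrative sketch of an ABM-to-hydrology coupling, with invented rules and coefficients, not the framework proposed by the authors.

    ```python
    # Purely illustrative toy (made-up rules and coefficients): farmer agents
    # decide each year whether to convert plots to cropland, and a lumped runoff
    # coefficient for the catchment responds to the resulting land-use mix.
    import random

    random.seed(1)
    RUNOFF_COEFF = {"forest": 0.10, "cropland": 0.35}   # assumed values

    class Farmer:
        def __init__(self, plot="forest"):
            self.plot = plot

        def decide(self, crop_price):
            # Simple rule: convert to cropland if expected profit looks attractive.
            if self.plot == "forest" and crop_price > random.uniform(0.5, 1.5):
                self.plot = "cropland"

    def catchment_runoff(farmers, rainfall_mm):
        c = sum(RUNOFF_COEFF[f.plot] for f in farmers) / len(farmers)
        return c * rainfall_mm                           # lumped direct runoff

    farmers = [Farmer() for _ in range(100)]
    for year in range(10):
        crop_price = 0.8 + 0.05 * year                   # a slowly rising price
        for f in farmers:
            f.decide(crop_price)
        print(year, round(catchment_runoff(farmers, rainfall_mm=600), 1))
    ```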

  4. Junking Good Science: Undoing Daubert v Merrill Dow Through Cross-Examination and Argument

    PubMed Central

    Givelber, Daniel; Strickler, Lori

    2006-01-01

    For more than 40 years, the tobacco industry prevailed in lawsuits brought by injured smokers, despite overwhelming epidemiological evidence that smoking caused lung cancer. Tobacco lawyers were able to create doubt about causation. They sought to persuade jurors that “everybody knew” smoking was harmful but “nobody knows” what causes cancer by recreating in court the scientific debate resolved by the 1964 Surgeon General’s Report. The particularistic structure of jury trials combined with the law’s mechanistic view of causation enables a defendant to contest virtually any claim concerning disease causation. Despite judicial efforts to eliminate “junk science” from lawsuits, a well-financed defendant may succeed in persuading jurors of the epidemiological equivalent of the proposition that the earth is flat. PMID:16317200

  5. Cigarette graphic warning labels increase both risk perceptions and smoking myth endorsement.

    PubMed

    Evans, Abigail T; Peters, Ellen; Shoben, Abigail B; Meilleur, Louise R; Klein, Elizabeth G; Tompkins, Mary Kate; Tusler, Martin

    2018-02-01

    Cigarette graphic warning labels elicit negative emotion, which increases risk perceptions through multiple processes. We examined whether this emotion simultaneously affects motivated cognitions like smoking myth endorsement (e.g. 'exercise can undo the negative effects of smoking') and perceptions of cigarette danger versus other products. 736 adult and 469 teen smokers/vulnerable smokers viewed one of three warning label types (text-only, low emotion graphic or high emotion graphic) four times over two weeks. Emotional reactions to the warnings were reported during the first and fourth exposures. Participants reported how often they considered the warnings, smoking myth endorsement, risk perceptions and perceptions of cigarette danger relative to smokeless tobacco and electronic cigarettes. In structural equation models, emotional reactions influenced risk perceptions and smoking myth endorsement through two processes. Emotion acted as information about risk, directly increasing smoking risk perceptions and decreasing smoking myth endorsement. Emotion also acted as a spotlight, motivating consideration of the warning information. Warning consideration increased risk perceptions, but also increased smoking myth endorsement. Emotional reactions to warnings decreased perceptions of cigarette danger relative to other products. Emotional reactions to cigarette warnings increase smoking risk perceptions, but also smoking myth endorsement and misperceptions that cigarettes are less dangerous than potentially harm-reducing tobacco products.

  6. Metabolite damage and repair in metabolic engineering design.

    PubMed

    Sun, Jiayi; Jeffryes, James G; Henry, Christopher S; Bruner, Steven D; Hanson, Andrew D

    2017-11-01

    The necessarily sharp focus of metabolic engineering and metabolic synthetic biology on pathways and their fluxes has tended to divert attention from the damaging enzymatic and chemical side-reactions that pathway metabolites can undergo. Although historically overlooked and underappreciated, such metabolite damage reactions are now known to occur throughout metabolism and to generate (formerly enigmatic) peaks detected in metabolomics datasets. It is also now known that metabolite damage is often countered by dedicated repair enzymes that undo or prevent it. Metabolite damage and repair are highly relevant to engineered pathway design: metabolite damage reactions can reduce flux rates and product yields, and repair enzymes can provide robust, host-independent solutions. Herein, after introducing the core principles of metabolite damage and repair, we use case histories to document how damage and repair processes affect efficient operation of engineered pathways - particularly those that are heterologous, non-natural, or cell-free. We then review how metabolite damage reactions can be predicted, how repair reactions can be prospected, and how metabolite damage and repair can be built into genome-scale metabolic models. Lastly, we propose a versatile 'plug and play' set of well-characterized metabolite repair enzymes to solve metabolite damage problems known or likely to occur in metabolic engineering and synthetic biology projects. Copyright © 2017 International Metabolic Engineering Society. All rights reserved.
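
    How damage and repair can enter a flux-balance model is easy to sketch: the toy below forces a damage side-reaction to carry flux and shows that adding a repair reaction restores the product yield. The mini-network and numbers are hypothetical, solved here with scipy's linear-programming routine rather than a genome-scale modeling package.

    ```python
    # Toy flux-balance sketch (hypothetical mini-network, not a real genome-scale
    # model): a forced "damage" side-reaction drains precursor A into damaged
    # form D; adding a repair reaction D -> A recovers the product yield.
    from scipy.optimize import linprog

    def max_product_flux(repair_capacity):
        # Reactions (columns): uptake ->A, A->product, damage A->D, repair D->A, waste D->
        stoich_A = [1, -1, -1,  1,  0]
        stoich_D = [0,  0,  1, -1, -1]
        bounds = [(0, 10),          # uptake limited to 10
                  (0, None),        # product formation (objective)
                  (2, 2),           # damage side-reaction forced to carry flux
                  (0, repair_capacity),
                  (0, None)]        # damaged metabolite wasted if not repaired
        res = linprog(c=[0, -1, 0, 0, 0],             # maximize product flux
                      A_eq=[stoich_A, stoich_D], b_eq=[0, 0],
                      bounds=bounds, method="highs")
        return res.x[1]

    print(max_product_flux(repair_capacity=0))   # ~8: damage drains yield
    print(max_product_flux(repair_capacity=10))  # ~10: repair restores yield
    ```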

  7. Metabolite damage and repair in metabolic engineering design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Jiayi; Jeffryes, James G.; Henry, Christopher S.

    The necessarily sharp focus of metabolic engineering and metabolic synthetic biology on pathways and their fluxes has tended to divert attention from the damaging enzymatic and chemical side-reactions that pathway metabolites can undergo. Although historically overlooked and underappreciated, such metabolite damage reactions are now known to occur throughout metabolism and to generate (formerly enigmatic) peaks detected in metabolomics datasets. It is also now known that metabolite damage is often countered by dedicated repair enzymes that undo or prevent it. Metabolite damage and repair are highly relevant to engineered pathway design: metabolite damage reactions can reduce flux rates and product yields, and repair enzymes can provide robust, host-independent solutions. Herein, after introducing the core principles of metabolite damage and repair, we use case histories to document how damage and repair processes affect efficient operation of engineered pathways - particularly those that are heterologous, non-natural, or cell-free. We then review how metabolite damage reactions can be predicted, how repair reactions can be prospected, and how metabolite damage and repair can be built into genome-scale metabolic models. Lastly, we propose a versatile 'plug and play' set of well-characterized metabolite repair enzymes to solve metabolite damage problems known or likely to occur in metabolic engineering and synthetic biology projects.

  8. [Chapter 7. Big Data or the illusion of a synthesis by aggregation. Epistemological, ethical and political critiques].

    PubMed

    Coutellec, Léo; Weil-Dubuc, Paul-Loup

    2017-10-27

    In this article, we propose a critical approach to the big data phenomenon by deconstructing the methodological principle that structures its logic: the principle of aggregation. Our hypothesis sits upstream of critiques that treat the use of big data as a new mode of government. Aggregation, as a way of processing heterogeneous data, structures big data thinking; it is its very logic: fragmenting in order to aggregate better, and aggregating in order to fragment better, a dialectic resting on a presumption of generalized aggregability and on the claim that aggregation is the preferred route to the production of new syntheses. We proceed in three steps to deconstruct this idea and to undo aggregation's claim to assert itself as a new way of producing knowledge, as a new synthesis of identity and, finally, as a new model of solidarity. In each case we show that these attempts at aggregation fail to produce their objects: no knowledge, no identity, no solidarity can result from a process of amalgamation. In all three cases, aggregation is always accompanied by a moment of fragmentation, of which dissociation, dislocation and separation are different figures. The wager we make, then, is to unsettle what presents itself as a new way of thinking about humanity and the world.

  9. Effect of science laboratory centrifuge of space station environment

    NASA Technical Reports Server (NTRS)

    Searby, Nancy

    1990-01-01

    It is argued that it is essential to have a centrifuge operating during manned space station operations. Background information and a rationale for the research centrifuge are given. It is argued that we must provide a controlled acceleration environment for comparison with microgravity studies. The lack of control groups in previous studies throws into question whether the observed effects were the result of microgravity or not. The centrifuge could be used to provide a 1-g environment to supply specimens free of launch effects for long-term studies. With the centrifuge, the specimens could be immediately transferred to microgravity without undergoing gradual acclimation. Also, the effects of artificial gravity on humans could be investigated. It is also argued that the presence of the centrifuge on the space station will not cause undue vibrations or other disturbing effects.

  10. Creating Simple Windchill Admin Tools Using Info*Engine

    NASA Technical Reports Server (NTRS)

    Jones, Corey; Kapatos, Dennis; Skradski, Cory

    2012-01-01

    Being a Windchill administrator often requires performing simple yet repetitive tasks on large sets of objects. These can include renaming, deleting, checking in, undoing checkout, and much more. This is especially true during a migration. Fortunately, PTC has provided a simple way to dynamically interact with Windchill using Info*Engine. This presentation will describe how to create simple Info*Engine tasks capable of saving Windchill 10.0 administrators hours of tedious work. It will also show how these tasks can be combined and displayed on a simple JSP page that acts as a "Windchill Administrator Dashboard/Toolbox". The attendee will learn some valuable tasks that Info*Engine is capable of performing, will gain a basic understanding of how to create and run Info*Engine tasks, and will learn what is involved in building a JSP page that displays Info*Engine tasks.

  11. The effect of Taiwan's national health insurance on mortality of the elderly: revisited.

    PubMed

    Chang, Simon

    2012-11-01

    A recent paper estimates the effects of Taiwan's National Health Insurance (NHI) on the elderly and concludes that NHI greatly increased the medical care utilization of the elderly but did not reduce their mortality. Using more recent and more accurate mortality data of the same group of elderly, this note re-estimates the NHI effect on mortality and finds that the mortality hazard of the previously uninsured elderly in the post-NHI period was on average 24% lower than it would have been in the absence of NHI. However, the NHI effect on the mortality hazard is only evident in the first 6 years following the enactment of NHI, suggesting that it may be difficult to undo the damage caused by the lack of insurance in early life. Copyright © 2011 John Wiley & Sons, Ltd.

  12. Undoing measurement-induced dephasing in circuit QED

    NASA Astrophysics Data System (ADS)

    Frisk Kockum, A.; Tornberg, L.; Johansson, G.

    2012-05-01

    We analyze the backaction of homodyne detection and photodetection on superconducting qubits in circuit quantum electrodynamics. Although both measurement schemes give rise to backaction in the form of stochastic phase rotations, which leads to dephasing, we show that this can be perfectly undone provided that the measurement signal is fully accounted for. This result improves on an earlier one [Phys. Rev. A 82, 012329 (2010)], showing that the method suggested can be made to realize a perfect two-qubit parity measurement. We propose a benchmarking experiment on a single qubit to demonstrate the method using homodyne detection. By analyzing the limited measurement efficiency of the detector and bandwidth of the amplifier, we show that the parameter values necessary to see the effect are within the limits of existing technology.
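
    Schematically (a generic sketch in our notation, not an equation taken from the paper): if the homodyne record j(t) fully determines the stochastic measurement-induced phase, that phase can be undone by a conditional counter-rotation,

        \phi[j] \;=\; \int_0^{t} K(t')\, j(t')\, \mathrm{d}t', \qquad
        U_{\mathrm{undo}} \;=\; \exp\!\bigl(+\,i\,\phi[j]\,\sigma_z/2\bigr),

    where K is a filter fixed by the measurement setup, so that averaging over measurement records no longer washes out the qubit coherence.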

  13. Deciding Not to Un-Do the "I Do:" Therapy Experiences of Women Who Consider Divorce But Decide to Remain Married.

    PubMed

    Kanewischer, Erica J W; Harris, Steven M

    2015-07-01

    This study explores women's experience of marital therapy while they navigated decision making around divorce. A qualitative method was used to gain a deeper understanding of the participants' therapy and relationship decision-making experiences. How are women's decisions whether or not to exit their marriage affected by therapy? The researchers interviewed 15 women who had considered initiating divorce before they turned 40 and had attended at least five marital therapy sessions but ultimately decided not to divorce. In general, participants reported that the therapy was helpful to them, their decision-making process and their marriages. Five main themes emerged from the interviews: Women Initiated Therapy, Therapist Was Experienced as Unbiased, Therapy was Helpful, Importance of Extra-therapeutic Factors, and Gradual Process. © 2014 American Association for Marriage and Family Therapy.

  14. Trump and the GOP agenda: implications for retirement policy.

    PubMed

    Madland, David; Rowell, Alex

    2018-04-11

    Policymakers need to act to protect Americans' retirement security. A significant portion of Americans are at risk of not being able to maintain their standard of living in retirement and research suggests that this percentage is likely to grow. This commentary provides background on the current state of American retirement, highlights recent efforts to reform retirement policy, and predicts what to expect under President Donald Trump. Retirement has not been a major focus of national policymakers in recent years. Early actions during the Trump administration to undo Obama administration policies may make it more difficult for individuals to save for retirement. While it is impossible to predict the future with any certainty, long standing trends and recent political developments suggest that major action will not be taken during the Trump presidency to boost retirement security.

  15. Ion Outflow Observations

    NASA Technical Reports Server (NTRS)

    Mellot, Mary (Technical Monitor)

    2002-01-01

    The characteristics of out-flowing ions have been investigated under various circumstances. In particular, the upwelling of ions from the cleft region has been studied in an attempt to determine source characteristics (e.g., temperature, altitude). High-altitude (6-8 Re) data tend to show ion species that have the same velocity and are adiabatically cooled. Such ions, while representative of their source, cannot provide an accurate picture. Ion observations from the TIDE detector on the Polar spacecraft show an energy (or, equivalently, velocity) spectrum of ions as they undergo the geomagnetic mass spectrometer effect due to convection-gravity separation of the different species. Consolidation of this type of data into a complete representation of the source spectrum can be attempted by building a set of maximum-phase-space-density-velocity pairs and attributing the total to the source.

  16. V and V of Lexical, Syntactic and Semantic Properties for Interactive Systems Through Model Checking of Formal Description of Dialog

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.; Martinie, Celia; Palanque, Philippe

    2013-01-01

    During early phases of the development of an interactive system, future system properties are identified (through interaction with end users in the brainstorming and prototyping phase of the application, or by other stakeholders), imposing requirements on the final system. They can be specific to the application under development or generic to all applications, such as usability principles. Instances of specific properties include visibility of the aircraft altitude, speed… in the cockpit and the continuous possibility of disengaging the autopilot in whatever state the aircraft is in. Instances of generic properties include availability of undo (for undoable functions) and availability of a progression bar for functions lasting more than four seconds. While behavioral models of interactive systems using formal description techniques provide complete and unambiguous descriptions of states and state changes, they do not provide an explicit representation of the absence or presence of properties. Assessing that the system that has been built is the right system remains a challenge, usually met through extensive use and acceptance tests. With an explicit representation of properties and tools to support checking them, it becomes possible to provide developers with means for systematic exploration of the behavioral models and assessment of the presence or absence of these properties. This paper proposes the synergistic use of two tools for checking both generic and specific properties of interactive applications: Petshop and Java PathFinder. Petshop is dedicated to the description of interactive system behavior. Java PathFinder is dedicated to the runtime verification of Java applications and, through an extension, to user interfaces. This approach is exemplified on a safety-critical application in the area of interactive cockpits for large civil aircraft.
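
    For illustration (our example, not a formula from the paper): a generic property such as "undo is available for every undoable function" can be written as a temporal-logic formula over the dialog model, which a model checker can then verify exhaustively on the behavioral description,

        \mathbf{G}\,\bigl(\mathit{actionPerformed} \wedge \mathit{undoable} \;\rightarrow\; \mathit{undoAvailable}\bigr),

    where G is the "globally" (in every reachable state) operator and actionPerformed, undoable and undoAvailable are hypothetical atomic propositions of the dialog model.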

  17. The dangerous role of silence in the relationship between trauma and violence: a group response.

    PubMed

    Phillips, Suzanne B

    2015-01-01

    This article considers that somewhere in the space between violence and trauma is dangerous silence. Silence intensifies the impact of trauma, and trauma that goes unspoken, un-witnessed, and unclaimed too often "outs itself" as more violence to self or others. Relevant empirical evidence on the impact of civilian interpersonal violence, combat trauma, school shootings, bullying, and domestic violence confirms this tragic cycle. Crucial to addressing the danger of silence in this cycle, the article examines the centrality of silence existentially, neuropsychologically, psychologically, developmentally, interpersonally, and culturally in relation to violence. The bridge to voicing and assimilating the unspeakable is empathic connection with others. Drawing upon two different types of group programs, the article demonstrates that group can serve as that bridge. Group process has the potential to undo the dangerous role of silence in the relationship of trauma and violence.

  18. Auto identification technology and its impact on patient safety in the Operating Room of the Future.

    PubMed

    Egan, Marie T; Sandberg, Warren S

    2007-03-01

    Automatic identification technologies, such as bar coding and radio frequency identification, are ubiquitous in everyday life but virtually nonexistent in the operating room. User expectations, based on everyday experience with automatic identification technologies, have generated much anticipation that these systems will improve readiness, workflow, and safety in the operating room, with minimal training requirements. We report, in narrative form, a multi-year experience with various automatic identification technologies in the Operating Room of the Future Project at Massachusetts General Hospital. In each case, the additional human labor required to make these 'labor-saving' technologies function in the medical environment has proved to be their undoing. We conclude that while automatic identification technologies show promise, significant barriers to realizing their potential still exist. Nevertheless, overcoming these obstacles is necessary if the vision of an operating room of the future in which all processes are monitored, controlled, and optimized is to be achieved.

  19. A Scientist's Guide to Science Denial

    NASA Astrophysics Data System (ADS)

    Rosenau, J.

    2012-12-01

    Why are so many scientifically uncontroversial topics, from evolution and the age of the earth to climate change and vaccines, so contentious in society? The American public respects science and scientists, yet seems remarkably unaware of - or resistant to accepting - what scientists have learned about the world around us. This resistance holds back science education and undermines public policy discussions. Scientists and science communicators often react to science denial as if it were a question of scientific knowledge, and respond by trying to correct false scientific claims. Many independent lines of evidence show that science denial is not primarily about science. People reject scientific claims which seem to conflict with their personal identity - often because they believe that accepting those claims would threaten some deeply-valued cultural, political, or religious affiliation. Only by identifying, addressing, and defusing the underlying political and cultural concerns can educators, scientists, and science communicators undo the harm done by science denial.

  20. Epistemological controversies in the analytic field elucidated by the theological realm.

    PubMed

    Squverer, Amos

    2015-08-01

    This article proposes to address certain epistemological controversies in psychoanalysis by elucidating them through the religious field. The theological field serves the author as the repressed, which indicates the latent stakes that continue to do work at the heart of these debates. The goal is to show how debates that take place on the epistemological level bring into confrontation different anthropological concepts and discursive traditions that have their roots in religious discourses. The principal hypothesis of the author is that the dissident theories of psychoanalysis can be understood as a return to a pre-monotheistic theological conception or to an idolatrous practice that aims, primarily, to undo castration. This hypothesis will be used to elucidate the debates with two authors: Adler and Rank. The author shows how these theorists, by leaving analytical ground, connect their theories to pre-monotheistic conceptions and highlight conceptual tools that are characteristic to them. Copyright © 2015 Institute of Psychoanalysis.

  1. Field of view advantage of conjugate adaptive optics in microscopy applications

    PubMed Central

    Mertz, Jerome; Paudel, Hari; Bifano, Thomas G.

    2015-01-01

    The imaging performance of an optical microscope can be degraded by sample-induced aberrations. A general strategy to undo the effect of these aberrations is to apply wavefront correction with a deformable mirror (DM). In most cases the DM is placed conjugate to the microscope pupil, called pupil adaptive optics (AO). When the aberrations are spatially variant an alternative configuration involves placing the DM conjugate to the main source of aberrations, called conjugate AO. We provide a theoretical and experimental comparison of both configurations for the simplified case where spatially variant aberrations are produced by a well defined phase screen. We pay particular attention to the resulting correction field of view (FOV). Conjugate AO is found to provide a significant FOV advantage. While this result is well known in the astronomy community, our goal here is to recast it specifically for the optical microscopy community. PMID:25967343
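
    A toy geometric-optics sketch of the field-of-view argument (our illustration, not the paper's wave-optics treatment): if the screen sits a distance z from the object plane and imposes a phase phi(x), a field point viewed at angle theta samples the screen at x + z*theta. A pupil-conjugate correction fixed for theta = 0 leaves a residual that grows with field angle, whereas a screen-conjugate (conjugate AO) correction travels with the beam footprint and cancels for every angle.

        # Toy ray-optics comparison of pupil AO vs conjugate AO (hypothetical numbers).
        import numpy as np

        rng = np.random.default_rng(1)
        x = np.linspace(-1e-3, 1e-3, 2048)                 # transverse coordinate on the screen (m)
        z = 200e-6                                         # screen-to-object distance (m)
        phi = np.cumsum(rng.normal(size=x.size)) * 0.05    # random-walk phase screen phi(x) (rad)

        def residual_rms(theta, conjugate):
            """RMS residual phase after correction, for field angle theta (rad)."""
            seen = np.interp(x + z * theta, x, phi)        # aberration sampled by that field point
            correction = seen if conjugate else phi        # conjugate AO corrects at the screen; pupil AO reuses the on-axis shape
            return np.std(seen - correction)

        for theta in (0.0, 0.02, 0.05, 0.1):
            print(f"theta={theta:5.2f} rad   pupil AO residual={residual_rms(theta, False):5.3f} rad"
                  f"   conjugate AO residual={residual_rms(theta, True):5.3f} rad")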

  2. The use of defence mechanisms as precursors to coming out in post-apartheid South Africa: a gay and lesbian youth perspective.

    PubMed

    Butler, Allister H; Astbury, Gaynor

    2008-01-01

    This article comprises one facet of a larger, three-year phenomenological study (1997-2000) of gay and lesbian youth coming out in post-apartheid South Africa. A nonprobability sample of 18 young people, aged between 16 and 21 years, was interviewed. The resultant data was content analyzed, and the trustworthiness of the information was ensured via member checking and utilizing an independent coder. Results consistently revealed that gay and lesbian youth use defense mechanisms, such as denial, avoidance, compartmentalization, suppression, compensation, sublimation, undoing, displacement, rationalization, and intellectualization, in a conscious manner during their coming out process. The young people in this study demonstrated resilience despite the prejudice and inner turmoil that they had experienced. Practice guidelines are suggested in terms of how health and social care practitioners can support gay and lesbian youth in coping with their coming out process.

  3. Epidemiology and challenges to the elimination of global tuberculosis.

    PubMed

    Jassal, Mandeep S; Bishai, William R

    2010-05-15

    Recent epidemiological indicators of tuberculosis (TB) indicate that the Millennium Development Goal of TB elimination by 2050 will not be achieved. The majority of incident cases are occurring in population-dense regions of Africa and Asia where TB is endemic. The persistence of TB in the setting of poor existing health infrastructure has led to an increase in drug-resistant cases, exacerbated by the strong association with human immunodeficiency virus coinfection. Spreading drug resistance threatens to undo decades of progress in controlling the disease. Several significant gaps can be identified in various aspects of national- and international-directed TB-control efforts. Various governing bodies and international organizations need to address the immediate challenges. This article highlights some of the major policies that lawmakers and funding institutions should consider. Existing economic and social obstacles must be overcome if TB elimination is to be a reachable goal.

  4. Shame and the motivation to change the self.

    PubMed

    Lickel, Brian; Kushlev, Kostadin; Savalei, Victoria; Matta, Shashi; Schmader, Toni

    2014-12-01

    A central question of human psychology is whether and when people change for the better. Although it has long been assumed that emotion plays a central role in self-regulation, the role of specific emotions in motivating a desire for self-change has been largely ignored. We report 2 studies examining people's lived experiences of self-conscious emotions, particularly shame, in motivating a desire for self-change. Study 1 revealed that when participants recalled experiences of shame, guilt, or embarrassment, shame (and, to some degree, guilt) predicted a motivation for self-change. Study 2 compared shame, guilt, and regret for events and found that although shame experiences often involved high levels of both regret and guilt, it was feelings of shame that uniquely predicted a desire for self-change, whereas regret predicted an interest in mentally undoing the past and repairing harm done. Implications for motivating behavior change are discussed.

  5. Looking Under the Hood of the Cadillac Tax.

    PubMed

    Glied, Sherry; Striar, Adam

    2016-06-01

    One effect of the Affordable Care Act's "Cadillac tax" (now delayed until 2020) is to undo part of the existing federal tax preference for employer-sponsored insurance. The specific features of this tax on high-cost health plans--notably, the inclusion of tax-favored savings vehicles such as health savings accounts (HSAs) in the formula for determining who is subject to the tax--are designed primarily to maximize revenue and minimize coverage disruptions, not to reduce health spending. Thus, at least initially, these savings accounts, rather than enrollee cost-sharing or other plan features, are likely to be affected most by the tax as employers act to limit their HSA contributions. Because high earners are the ones benefiting most from tax-preferred accounts, the high-cost plan tax will probably be more progressive than prior analyses have suggested, while having only a modest impact on total health spending.

  6. The upcycling of post-industrial PP/PET waste streams through in-situ microfibrillar preparation

    NASA Astrophysics Data System (ADS)

    Delva, Laurens; Ragaert, Kim; Cardon, Ludwig

    2015-12-01

    Post-industrial plastic waste streams can be re-used as secondary material streams for polymer processing by extrusion or injection moulding. One of the major commercially available waste streams contains polypropylene (PP) contaminated with polyesters (mostly polyethylene terephthalate, PET). An important practical hurdle for the direct implementation of this waste stream is the immiscibility of PP and PET in the melt, which leads to segregation within the polymer structure and adversely affects the reproducibility and mechanical properties of the manufactured parts. It has been indicated in the literature that the creation of PET microfibrils in the PP matrix could undo these drawbacks and upcycle the PP/PET combination. Within the current research, a commercially available virgin PP/PET combination was evaluated for the microfibrillar preparation. The mechanical (tensile and impact) properties, thermal properties and morphology of the composites were characterized at different stages of the microfibrillar preparation.

  7. From genius inverts to gendered intelligence: Lewis Terman and the power of the norm.

    PubMed

    Hegarty, Peter

    2007-05-01

    The histories of "intelligence" and "sexuality" have largely been narrated separately. In Lewis Terman's work on individual differences, they intersect. Influenced by G. Stanley Hall, Terman initially described atypically accelerated development as problematic. Borrowing from Galton, Terman later positioned gifted children as nonaverage but ideal. Attention to the gifted effeminate subjects used to exemplify giftedness and gender nonconformity in Terman's work shows the selective instantiation of nonaverageness as pathological apropos of effeminacy, and as ideal apropos of high intelligence. Throughout, high intelligence is conflated with health, masculinity, and heterosexuality. Terman's research located marital sexual problems in women's bodies, further undoing possibilities for evaluating heterosexual men's practices as different from a normative position. Terman's research modernized Galton's imperialist vision of a society led by a male cognitive elite. Psychologists continue to traffic in his logic that values and inculcates intelligence only in the service of sexual and gender conformity.

  8. Efficient Tracing for On-the-Fly Space-Time Displays in a Debugger for Message Passing Programs

    NASA Technical Reports Server (NTRS)

    Hood, Robert; Matthews, Gregory

    2001-01-01

    In this work we describe the implementation of a practical mechanism for collecting and displaying trace information in a debugger for message passing programs. We introduce a trace format that is highly compressible while still providing information adequate for debugging purposes. We make the mechanism convenient for users to access by incorporating the trace collection in a set of wrappers for the MPI (message passing interface) communication library. We implement several debugger operations that use the trace display: consistent stoplines, undo, and rollback. They all are implemented using controlled replay, which executes at full speed in target processes until the appropriate position in the computation is reached. They provide convenient mechanisms for getting to places in the execution where the full power of a state-based debugger can be brought to bear on isolating communication errors.
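
    A minimal sketch of the controlled-replay idea (our illustration, not the paper's MPI wrapper code): nondeterministic events are recorded during the original run, and during replay the process is driven by the recorded log at full speed until the chosen position is reached, where the state-based debugger takes over.

        # Toy record/replay sketch of controlled replay (illustration only).
        import random

        def record_run(steps):
            """Original execution: log every nondeterministic choice (e.g. message arrival order)."""
            trace, state = [], 0
            for _ in range(steps):
                choice = random.randint(0, 9)   # stand-in for a nondeterministic event
                trace.append(choice)
                state += choice
            return state, trace

        def replay_to(trace, stop_at):
            """Controlled replay: re-execute at full speed, driven by the trace, up to stop_at."""
            state = 0
            for step, choice in enumerate(trace):
                state += choice
                if step == stop_at:
                    break                        # hand control back to the debugger here
            return state

        final_state, trace = record_run(100)
        print(replay_to(trace, 41))              # deterministic: the same intermediate state every time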

  9. Apollo: a community resource for genome annotation editing

    PubMed Central

    Lee, Ed; Harris, Nomi; Gibson, Mark; Chetty, Raymond; Lewis, Suzanna

    2009-01-01

    Summary: Apollo is a genome annotation-editing tool with an easy to use graphical interface. It is a component of the GMOD project, with ongoing development driven by the community. Recent additions to the software include support for the generic feature format version 3 (GFF3), continuous transcriptome data, a full Chado database interface, integration with remote services for on-the-fly BLAST and Primer BLAST analyses, graphical interfaces for configuring user preferences and full undo of all edit operations. Apollo's user community continues to grow, including its use as an educational tool for college and high-school students. Availability: Apollo is a Java application distributed under a free and open source license. Installers for Windows, Linux, Unix, Solaris and Mac OS X are available at http://apollo.berkeleybop.org, and the source code is available from the SourceForge CVS repository at http://gmod.cvs.sourceforge.net/gmod/apollo. Contact: elee@berkeleybop.org PMID:19439563
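
    As a generic illustration of "full undo of all edit operations" (a standard command-pattern sketch with hypothetical names, not Apollo's actual code): each edit records how to reverse itself, and an undo stack applies those inverses in reverse order.

        # Generic undo-stack sketch (command pattern); not Apollo's implementation.
        class AnnotationEditor:
            def __init__(self):
                self.features = {}      # feature_id -> (start, end)
                self._undo = []         # stack of inverse operations

            def set_feature(self, feature_id, start, end):
                previous = self.features.get(feature_id)
                if previous is None:                                   # new feature: inverse is deletion
                    self._undo.append(lambda: self.features.pop(feature_id, None))
                else:                                                  # edit: inverse restores the old coordinates
                    self._undo.append(lambda: self.features.__setitem__(feature_id, previous))
                self.features[feature_id] = (start, end)

            def undo(self):
                if self._undo:
                    self._undo.pop()()   # apply the most recent inverse operation

        editor = AnnotationEditor()
        editor.set_feature("gene1", 100, 500)
        editor.set_feature("gene1", 120, 480)
        editor.undo()                    # back to (100, 500)
        editor.undo()                    # "gene1" removed again
        print(editor.features)           # {}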

  10. [Human resources for health in Chile: the reform's pending challenge].

    PubMed

    Méndez, Claudio A

    2009-09-01

    Omission of human resources from health policy development has been identified as a barrier in the health sector reform's adoption phase. Since 2002, Chile's health care system has been undergoing a transformation based on the principles of health as a human right, equity, solidarity, efficiency, and social participation. While the reform has set forth the redefinition of the medical professions, continuing education, scheduled accreditation, and the introduction of career development incentives, it has not considered management options tailored to the new setting, a human resources strategy that has the consensus of key players and sector policy, or a process for understanding the needs of health care staff and professionals. However, there is still time to undo the shortcomings, in large part because the reform's implementation phase only recently has begun. Overcoming this challenge is in the hands of the experts charged with designing public health strategies and policies.

  11. Apollo: a community resource for genome annotation editing.

    PubMed

    Lee, Ed; Harris, Nomi; Gibson, Mark; Chetty, Raymond; Lewis, Suzanna

    2009-07-15

    Apollo is a genome annotation-editing tool with an easy to use graphical interface. It is a component of the GMOD project, with ongoing development driven by the community. Recent additions to the software include support for the generic feature format version 3 (GFF3), continuous transcriptome data, a full Chado database interface, integration with remote services for on-the-fly BLAST and Primer BLAST analyses, graphical interfaces for configuring user preferences and full undo of all edit operations. Apollo's user community continues to grow, including its use as an educational tool for college and high-school students. Apollo is a Java application distributed under a free and open source license. Installers for Windows, Linux, Unix, Solaris and Mac OS X are available at http://apollo.berkeleybop.org, and the source code is available from the SourceForge CVS repository at http://gmod.cvs.sourceforge.net/gmod/apollo.

  12. Class advantage and the gender divide: flexibility on the job and at home.

    PubMed

    Gerstel, Naomi; Clawson, Dan

    2014-09-01

    Using a survey, interviews, and observations, the authors examine inequality in temporal flexibility at home and at work. They focus on four occupations to show that class advantage is deployed in the service of gendered notions of temporal flexibility while class disadvantage makes it difficult to obtain such flexibility. The class advantage of female nurses and male doctors enables them to obtain flexibility in their work hours; they use that flexibility in gendered ways: nurses to prioritize family and physicians to prioritize careers. Female nursing assistants and male emergency medical technicians can obtain little employee-based flexibility and, as a result, have more difficulty meeting conventional gendered expectations. Advantaged occupations "do gender" in conventional ways while disadvantaged occupations "undo gender." These processes operate through organizational rules and cultural schemas that sustain one another but may undermine the gender and class neutrality of family-friendly policies.

  13. Ways of Doing: Restorative Practices, Governmentality, and Provider Conduct in Post-Apartheid Health Care.

    PubMed

    Harris, Bronwyn; Eyles, John; Goudge, Jane

    2016-01-01

    In this article, we consider the conduct of post-apartheid health care in a policy context directed toward entrenching democracy, ensuring treatment-adherent patients, and creating a healthy populace actively responsible for their own health. We ask how tuberculosis treatment, antiretroviral therapy, and maternal services are delivered within South Africa's health system, an institutional site of colonial and apartheid injustice, and democratic reform. Using Foucauldian and post-Foucauldian notions of governmentality, we explore provider ways of doing to, for, and with patients in three health subdistricts. Although restorative provider engagements are expected in policy, older authoritarian and paternalistic norms persist in practice. These challenge and reshape, even 'undo' democratic assertions of citizenship, while producing compliant, self-responsible patients. Alongside the need to address pervasive structural barriers to health care, a restorative approach requires community participation, provider accountability, and a health system that does with providers as much as providers who do with patients.

  14. Fulfill Promises and Avoid Breaches to Retain Satisfied, Committed Nurses.

    PubMed

    Rodwell, John; Ellershaw, Julia

    2016-07-01

    This study examines two commonly proposed mechanisms, violation and trust, to see if they mediate the relationships between the components of the psychological contract (i.e., promises, fulfillment, and breach) and their impact on the work-related outcomes of job satisfaction, intent to quit, and organizational commitment. Online surveys were completed by 459 Australian nurses. Structural equation modeling revealed that breach and fulfillment have direct and mediated effects on the outcomes, whereas promises had no impact. Violation partially mediated the relationship between breach and job satisfaction and intent to quit, while trust partially mediated the relationships between fulfillment and organizational commitment, and breach and organizational commitment. Negative experiences (i.e., breaches) were related to both increased feelings of violation and decreased feelings of trust. In contrast, positive experiences (i.e., fulfillment) increased trust but did not significantly reduce feelings of violation. Nurse and organizational managers can use these findings to improve communication with nurses so as to minimize the negative effects of breach and maximize the positive effects of fulfillment and thus improve attitudes. Nurse managers need to be careful to make promises regarding their nurses' employment that they can fulfill and to particularly avoid breaking the psychological contract. The potentially disproportionate negative effect of breach means that a breach can undo a lot of efforts to fulfill employment-related promises. © 2016 Sigma Theta Tau International.

  15. Lensfree Computational Microscopy Tools and their Biomedical Applications

    NASA Astrophysics Data System (ADS)

    Sencan, Ikbal

    Conventional microscopy has been a revolutionary tool for biomedical applications since its invention several centuries ago. The ability to non-destructively observe very fine details of biological objects in real time has made it possible to answer many important questions about their structures and functions. Unfortunately, most of these advanced microscopes are complex, bulky, expensive, and/or hard to operate, so they have not reached beyond the walls of well-equipped laboratories. Recent improvements in optoelectronic components and computational methods allow the creation of imaging systems that better fulfill the specific needs of clinical or research-related biomedical applications. In this respect, lensfree computational microscopy aims to replace bulky and expensive optical components with compact and cost-effective alternatives through the use of computation, which can be particularly useful for lab-on-a-chip platforms as well as imaging applications in low-resource settings. Several high-throughput on-chip platforms have been built with this approach for applications including, but not limited to, cytometry, micro-array imaging, rare cell analysis, telemedicine, and water quality screening. The lack of optical complexity in these lensfree on-chip imaging platforms is compensated for by computational techniques. These computational methods serve various purposes in coherent, incoherent and fluorescent on-chip imaging platforms, e.g. improving the spatial resolution, undoing light diffraction without using lenses, localizing objects in a large volume, and retrieving the phase or the color/spectral content of the objects. For instance, pixel super-resolution approaches based on source shifting are used in lensfree imaging platforms to prevent undersampling, Bayer-pattern, and aliasing artifacts. Another method, iterative phase retrieval, is used to compensate for the lack of lenses by undoing the diffraction and removing the twin-image noise of in-line holograms. This technique enables recovery of the complex optical field from its intensity measurement(s) by using additional constraints in the iterations, such as spatial boundaries and other known properties of the objects. Another computational tool employed in lensfree imaging is compressive sensing (or decoding), a method that takes advantage of the fact that natural signals/objects are mostly sparse or compressible in known bases. This inherent property of objects enables better signal recovery when the number of measurements is low, even below the Nyquist rate, and increases the additive-noise immunity of the system.
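
    As a rough numerical sketch of the iterative phase-retrieval idea described above (a generic Gerchberg-Saxton-style in-line holography reconstruction with hypothetical parameters, not the dissertation's exact algorithm): the measured hologram amplitude is propagated back to the object plane, a known constraint is enforced there (here, no gain), and the field is propagated forward again with the measured amplitude re-imposed on every pass.

        # Minimal iterative phase-retrieval sketch for an in-line hologram (illustration only).
        import numpy as np

        def angular_spectrum(field, wavelength, dx, z):
            """Propagate a 2-D complex field a distance z with the angular-spectrum method."""
            n = field.shape[0]
            fx = np.fft.fftfreq(n, d=dx)
            FX, FY = np.meshgrid(fx, fx)
            arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
            kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
            return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

        # Hypothetical setup: 0.5 um illumination, 1 um pixels, sensor 1 mm from the object.
        wavelength, dx, z, n = 0.5e-6, 1.0e-6, 1.0e-3, 256
        obj = np.ones((n, n), dtype=complex)
        obj[100:140, 100:140] = 0.5                                      # small absorbing test object
        hologram_amp = np.abs(angular_spectrum(obj, wavelength, dx, z))  # only intensity is detected

        field = hologram_amp.astype(complex)                             # start with zero phase at the sensor
        for _ in range(50):
            at_object = angular_spectrum(field, wavelength, dx, -z)      # back-propagate to the object plane
            at_object = np.minimum(np.abs(at_object), 1.0) * np.exp(1j * np.angle(at_object))  # constraint: no gain
            field = angular_spectrum(at_object, wavelength, dx, z)       # forward-propagate to the sensor
            field = hologram_amp * np.exp(1j * np.angle(field))          # re-impose the measured amplitude

        print("residual object error:", float(np.mean((np.abs(at_object) - np.abs(obj)) ** 2)))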

  16. Managing gender diversity in healthcare: getting it right.

    PubMed

    Vanderbroeck, Paul; Wasserfallen, Jean-Blaise

    2017-02-06

    Purpose: Diversity, notably gender diversity, is growing in health care, both at the level of teams and the level of organizations. This paper aims to describe the challenges for team leaders and leaders of organizations to manage this diversity. The authors believe that more could be done to help leaders master these challenges in a way that makes diverse teams and organizations more productive. Design/methodology/approach: Drawing on previously published research, using gender diversity as an example, the paper first describes how diversity can both have a positive and a negative influence on team productivity. Next, it describes the challenge of gender diversity at an organizational level, using Switzerland as an example. Findings: The first part of the paper espouses the causes of gender diversity, undoes some of the myths surrounding diversity and presents a model for effective management of diversity in teams. The second part looks at gender diversity at an organizational level. Drawing from sources inside and outside healthcare, the effects of the "leaking pipeline", "glass wall" and "glass ceiling" that prevent health-care organizations from leveraging the potential of female talent are discussed. Practical implications: The authors propose a model developed for intercultural teamwork as a framework for leveraging gender diversity for better team productivity. Proposals are offered to health-care organizations on how they can tip the gender balance at senior levels into their favor, so as to get the maximum benefit from the available talent. Originality/value: Applying the "how to" ideas and recommendations from this general review will help leaders of health-care organizations gain a better return on investment from their talent development as well as to increase the productivity of their workforce by a better use of diverse talent.

  17. Even good bots fight: The case of Wikipedia

    PubMed Central

    Tsvetkova, Milena; García-Gavilanes, Ruth; Floridi, Luciano; Yasseri, Taha

    2017-01-01

    In recent years, there has been a huge increase in the number of bots online, varying from Web crawlers for search engines, to chatbots for online customer service, spambots on social media, and content-editing bots in online collaboration communities. The online world has turned into an ecosystem of bots. However, our knowledge of how these automated agents are interacting with each other is rather poor. Bots are predictable automatons that do not have the capacity for emotions, meaning-making, creativity, and sociality and it is hence natural to expect interactions between bots to be relatively predictable and uneventful. In this article, we analyze the interactions between bots that edit articles on Wikipedia. We track the extent to which bots undid each other’s edits over the period 2001–2010, model how pairs of bots interact over time, and identify different types of interaction trajectories. We find that, although Wikipedia bots are intended to support the encyclopedia, they often undo each other’s edits and these sterile “fights” may sometimes continue for years. Unlike humans on Wikipedia, bots’ interactions tend to occur over longer periods of time and to be more reciprocated. Yet, just like humans, bots in different cultural environments may behave differently. Our research suggests that even relatively “dumb” bots may give rise to complex interactions, and this carries important implications for Artificial Intelligence research. Understanding what affects bot-bot interactions is crucial for managing social media well, providing adequate cyber-security, and designing well functioning autonomous vehicles. PMID:28231323
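
    One common way to detect such mutual undo events in an edit history (a generic sketch with made-up data, not necessarily the paper's exact method) is to flag a revision as a revert when it restores the exact content, and hence the content hash, of an earlier revision; each such pair of editors then counts as one directed "undo" interaction.

        # Generic revert-detection sketch on a page's revision history (illustration only).
        import hashlib
        from collections import Counter

        def sha1(text):
            return hashlib.sha1(text.encode("utf-8")).hexdigest()

        # Hypothetical revision history: (editor, page text after the edit), oldest first.
        revisions = [
            ("BotA", "Paris is the capital of France."),
            ("BotB", "Paris is the capital of the French Republic."),
            ("BotA", "Paris is the capital of France."),                    # restores revision 0: reverts BotB
            ("BotB", "Paris is the capital of the French Republic."),       # restores revision 1: reverts BotA
        ]

        seen = {}            # content hash -> index of the earliest revision with that content
        reverts = Counter()  # (reverting editor, reverted editor) -> count
        for i, (editor, text) in enumerate(revisions):
            h = sha1(text)
            if h in seen and i - seen[h] >= 2:        # identical to an earlier state: intermediate edits were undone
                reverts[(editor, revisions[i - 1][0])] += 1
            seen.setdefault(h, i)
        print(reverts)       # Counter({('BotA', 'BotB'): 1, ('BotB', 'BotA'): 1})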

  18. Even good bots fight: The case of Wikipedia.

    PubMed

    Tsvetkova, Milena; García-Gavilanes, Ruth; Floridi, Luciano; Yasseri, Taha

    2017-01-01

    In recent years, there has been a huge increase in the number of bots online, varying from Web crawlers for search engines, to chatbots for online customer service, spambots on social media, and content-editing bots in online collaboration communities. The online world has turned into an ecosystem of bots. However, our knowledge of how these automated agents are interacting with each other is rather poor. Bots are predictable automatons that do not have the capacity for emotions, meaning-making, creativity, and sociality and it is hence natural to expect interactions between bots to be relatively predictable and uneventful. In this article, we analyze the interactions between bots that edit articles on Wikipedia. We track the extent to which bots undid each other's edits over the period 2001-2010, model how pairs of bots interact over time, and identify different types of interaction trajectories. We find that, although Wikipedia bots are intended to support the encyclopedia, they often undo each other's edits and these sterile "fights" may sometimes continue for years. Unlike humans on Wikipedia, bots' interactions tend to occur over longer periods of time and to be more reciprocated. Yet, just like humans, bots in different cultural environments may behave differently. Our research suggests that even relatively "dumb" bots may give rise to complex interactions, and this carries important implications for Artificial Intelligence research. Understanding what affects bot-bot interactions is crucial for managing social media well, providing adequate cyber-security, and designing well functioning autonomous vehicles.

  19. Rapid Expectation Adaptation during Syntactic Comprehension

    PubMed Central

    Fine, Alex B.; Jaeger, T. Florian; Farmer, Thomas A.; Qian, Ting

    2013-01-01

    When we read or listen to language, we are faced with the challenge of inferring intended messages from noisy input. This challenge is exacerbated by considerable variability between and within speakers. Focusing on syntactic processing (parsing), we test the hypothesis that language comprehenders rapidly adapt to the syntactic statistics of novel linguistic environments (e.g., speakers or genres). Two self-paced reading experiments investigate changes in readers’ syntactic expectations based on repeated exposure to sentences with temporary syntactic ambiguities (so-called “garden path sentences”). These sentences typically lead to a clear expectation violation signature when the temporary ambiguity is resolved to an a priori less expected structure (e.g., based on the statistics of the lexical context). We find that comprehenders rapidly adapt their syntactic expectations to converge towards the local statistics of novel environments. Specifically, repeated exposure to a priori unexpected structures can reduce, and even completely undo, their processing disadvantage (Experiment 1). The opposite is also observed: a priori expected structures become less expected (even eliciting garden paths) in environments where they are hardly ever observed (Experiment 2). Our findings suggest that, when changes in syntactic statistics are to be expected (e.g., when entering a novel environment), comprehenders can rapidly adapt their expectations, thereby overcoming the processing disadvantage that mistaken expectations would otherwise cause. Our findings take a step towards unifying insights from research in expectation-based models of language processing, syntactic priming, and statistical learning. PMID:24204909

  20. Undoing Gender Through Legislation and Schooling: The Case of AB 537 and AB 394 in California, USA

    NASA Astrophysics Data System (ADS)

    Knotts, Greg

    2009-11-01

    This article investigates California laws AB 537: The Student Safety and Violence Prevention Act of 2000, and the recently enacted AB 394: Safe Place to Learn Act. Both demand that gender identity and sexual orientation be added to the lexicon of anti-harassment protection in public education. However, despite these progressive measures, schools have an unconscious acceptance of heteronormativity and gendered norms, which undermines both the spirit and language of these laws. This paper examines how California schools can both change standard practices and realise the transformative social change that laws like AB 537 and AB 394 can instigate. I assert that the systemic implementation of these laws, through the adoption, enforcement and evaluation of existing AB 537 Task Force Recommendations, is necessary for their success. My second assertion is that AB 537 and AB 394 have the potential to change and reconstitute gender-based and heteronormative standards at school sites.

  1. "Doing and Undoing Gender": Female Higher Education in the Islamic Republic of Iran

    NASA Astrophysics Data System (ADS)

    Mehran, Golnar

    2009-11-01

    Since the establishment of the Islamic Republic, female higher education has been characterised by a paradoxical combination of discrimination and exclusion, on the one hand, and increasing equality and empowerment, on the other. This study focuses on the triangle of education, equality and empowerment, using Sara Longwe's women's empowerment framework to analyse the interplay between the three. State policies to Islamise the universities during the 1980-1983 Cultural Revolution determined the "gender appropriateness" of each specialisation and led to the exclusion of women from "masculine" fields of study during the early years of the revolution. Despite such discriminatory measures, women today represent the majority of students in all fields, except engineering. Women, however, remain underrepresented at graduate levels of education and as faculty members. An important challenge is to understand why men are not entering different specialisations and whether there is a possibility of "re-doing gender" - this time in addressing male inequality and disempowerment at undergraduate levels.

  2. Undoing the past in order to lie in the present: Counterfactual thinking and deceptive communication.

    PubMed

    Briazu, Raluca A; Walsh, Clare R; Deeprose, Catherine; Ganis, Giorgio

    2017-04-01

    This paper explores the proposal that there is a close link between counterfactual thinking and lying. Both require the imagination of alternatives to reality and we describe four studies which explore this link. In Study 1 we measured individual differences in both abilities and found that individuals with a tendency to generate counterfactual thoughts were also more likely to generate potential lies. Studies 2 and 3 showed that counterfactual availability influences people's ability to come up with lies and the extent to which they expect others to lie. Study 4 used a behavioural measure of deception to show that people tend to lie more in situations also known to elicit counterfactual thoughts. Overall, the results show that the imagination of alternatives to the past plays an important role in the generation of lies. We discuss the implications for the fields of counterfactual thinking and deception. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Handling Late Changes to Titan Science

    NASA Technical Reports Server (NTRS)

    Pitesky, Jo Eliza; Steadman, Kim; Ray, Trina; Burton, Marcia

    2014-01-01

    The Cassini mission has been in orbit for eight years, returning a wealth of scientific data from Titan and the Saturnian system. The mission, a cooperative undertaking between NASA, ESA and ASI, is currently in its second extension of the prime mission. The Cassini Solstice Mission (CSM) extends the mission's lifetime until Saturn's northern summer solstice in 2017. The Titan Orbital Science Team (TOST) has the task of integrating the science observations for all 56 targeted Titan flybys in the CSM. In order to balance Titan science across the entire set of flybys during the CSM, to optimize and influence the Titan flyby altitudes, and to decrease the future workload, TOST went through a "jumpstart" process before the start of the CSM. The "jumpstart" produced Master Timelines for each flyby, identifying prime science observations and allocating control of the spacecraft attitude to specific instrument teams. Three years after completing this long-range plan, TOST now faces a new challenge: incorporating changes into the Titan Science Plan without undoing the balance achieved during the jumpstart.

  4. Considering organizational factors in addressing health care disparities: two case examples.

    PubMed

    Griffith, Derek M; Yonas, Michael; Mason, Mondi; Havens, Betsy E

    2010-05-01

    Policy makers and practitioners have yet to successfully understand and eliminate persistent racial differences in health care quality. Interventions to address these racial health care disparities have largely focused on increasing cultural awareness and sensitivity, promoting culturally competent care, and increasing providers' adherence to evidence-based guidelines. Although these strategies have improved some proximal factors associated with service provision, they have not had a strong impact on racial health care disparities. Interventions to date have had limited impact on racial differences in health care quality, in part, because they have not adequately considered or addressed organizational and institutional factors. In this article, we describe an emerging intervention strategy to reduce health care disparities called dismantling (undoing) racism and how it has been adapted to a rural public health department and an urban medical system. These examples illustrate the importance of adapting interventions to the organizational and institutional context and have important implications for practitioners and policy makers.

  5. Democratic Republic of the Congo: undoing government by predation.

    PubMed

    Rackley, Edward B

    2006-12-01

    This paper draws on two periods of field research, conducted in 2004, to consider the state of governance in the Democratic Republic of the Congo (DRC). The first measures the paralysing impact of illegal taxation on riverine trade in the western provinces; the second documents civilian attempts to seek safety from violence in the troubled east, and evaluates third-party efforts to provide protection and security. Analysis of study findings suggests that the DRC's current governance crisis is neither historically novel nor driven exclusively by mineral resources, extraction rights or trafficking. Rather, government by predation is an endemic and systematic feature of the civil and military administration, ensuring the daily economic survival of soldiers and officials, who are able to wield their authority in a 'risk-free' environment, without oversight or accountability. The paper's conclusion tries to make sense of the persistence of corruption in social and political life, and to assess the capacity of ordinary citizens to reverse their predicament.

  6. Idiopathic condylar resorption: The current understanding in diagnosis and treatment

    PubMed Central

    Young, Andrew

    2017-01-01

    Idiopathic condylar resorption (ICR) is a condition with no known cause, which manifests as progressive malocclusion, esthetic changes, and often pain. Cone-beam computed tomography and magnetic resonance imaging are the most valuable imaging methods for diagnosis and tracking, compared with the less complete and more distorted images provided by panoramic radiographs and the higher radiation of technetium-99m methylene diphosphonate. ICR has findings that overlap with osteoarthritis, inflammatory arthritis, physiologic resorption/remodeling, and congenital disorders affecting the mandible, requiring thorough image analysis, physical examination, and history-taking. Correct diagnosis and determination of whether the ICR is active or inactive are essential when orthodontic or prosthodontic treatment is anticipated, as active ICR can undo those treatments. Several treatments for ICR have been reported with the goals of either halting the progression of ICR or correcting the deformities that it caused. These treatments have varying degrees of success and adverse effects, but the rarity of the condition prevents any evidence-based recommendations. PMID:28584413

  7. Darwin at Orchis Bank: Selection after the Origin.

    PubMed

    Tabb, Kathryn

    2016-02-01

    Darwin's first publication after the Origin of Species was a volume on orchids that expanded on the theory of adaptation through natural selection introduced in his opus. Here I argue that On the Various Contrivances by which British and Foreign Orchids are Fertilised by Insects (1862) is not merely an empirical confirmation of his theory. In response to immediate criticisms of his metaphor of natural selection, Darwin uses Orchids to present adaptation as the result of innumerable natural laws, rather than discrete acts analogous to conscious choices. The means of selection among polliniferous plants cannot be neatly classed under the Origin's categories of artificial, natural, or sexual selection. Along with Darwin's exploration of sexual selection in his later works, Orchids serves to undo the restrictive metaphor so firmly established by the Origin and to win over those of Darwin's contemporaries who were committed advocates of natural law but suspicious of evolution by natural selection. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. LGBTQ+ Young Adults on the Street and on Campus: Identity as a Product of Social Context.

    PubMed

    Schmitz, Rachel M; Tyler, Kimberly A

    2018-01-01

    Lesbian, gay, bisexual, queer, and other sexual and gender minority (LGBTQ+) young adults face unique identity-related experiences based on their immersion in distinctive social contexts. The predominant framework of performing separate analyses on samples of LGBTQ+ young people by their primary social status obfuscates more holistic understandings of the role of social context. Using 46 in-depth interviews with LGBTQ+ college students and LGBTQ+ homeless young adults, we ask: How are LGBTQ+ young adults' capacities for "doing" their gender and sexual identities shaped by their distinctive social contexts? In developing their identities, both groups of LGBTQ+ young adults navigated their social environments to seek out resources and support. Most college students described their educational contexts as conducive to helping them develop their identities, or "undo" rigid norms of gender and sexuality. Homeless young adults' social environments, meanwhile, imposed complex barriers to self-expression that reinforced more normative expectations of "doing" gender and sexual identities.

  9. [Relationships between defense mechanisms and coping strategies, facing exam anxiety performance].

    PubMed

    Grebot, E; Paty, B; Girarddephanix, N

    2006-01-01

    Defense mechanisms and coping strategies rest on different theoretical backgrounds and describe distinct psychological processes. Cramer bases the distinction on the following dimensions: conscious processes versus not; intentionality versus not; hierarchical conception versus not. In contrast to these distinctions, the two notions of defense mechanisms and coping strategies are treated as similar in the Diagnostic and Statistical Manual (DSM-IV). This assimilation of coping and defenses in the DSM-IV is not confirmed by some research, notably that of Callahan and Chabrol, which instead demonstrates a relationship between adaptive coping and mature defenses, as well as between maladaptive coping and immature defenses. Similarly, Plutchik proposed theoretical correspondences between eight defense mechanisms and eight coping strategies: (a) the defenses repression, isolation and introjection with coping escape; (b) the defense denial with coping minimization; (c) the defense undoing with coping substitution; (d) the defenses regression and acting out with coping social support; (e) the defenses compensation, identification and fantasy with coping replacement; (f) the defenses intellectualization, sublimation, undoing and rationalization with coping planification; (g) the defense projection with coping blame; (h) the defense reaction formation with coping inversion. This research aims to test the relations observed by Callahan and Chabrol and some of the theoretical correspondences proposed by Plutchik between defenses and coping strategies, in a population of students similar to the one used by Callahan and Chabrol. It also aims to study the relationships between coping strategies and conscious derivatives of defense mechanisms, as defined by Bond (1995). Defenses were evaluated on the first day of the examination week. The population comprised 184 female students in the human sciences (sociology and psychology). Defenses were assessed with Bond's Defense Style Questionnaire (DSQ-40); its French version comprises 40 items and was validated by Guelfi et al. It explores 20 defense mechanisms as well as three defense styles: (1) a "mature style", composed of four defenses (sublimation, humor, anticipation, repression); (2) a "neurotic style", composed of four defenses (undoing, reaction formation, altruism and idealization); (3) an "immature style", composed of 12 defenses. Coping strategies were measured with the French version of the Ways of Coping Checklist-Revised (WCC-R) by Lazarus and Folkman, validated by Graziani et al. It evaluates 10 factors: 1) Problem solving; 2) Evasion; 3) Social support; 4) Self-control; 5) Escape; 6) Responsabilization-Replanification; 7) Resignation; 8) Diplomacy; 9) Confrontation; 10) Personal evolution. Our results partially confirm Callahan and Chabrol's conclusions in favour of relationships between adaptive coping strategies and mature defenses, as well as between maladaptive coping strategies and immature defenses. They demonstrate three positive relationships: 1) a relation between Problem-solving coping and two mature defenses (Sublimation, Anticipation); 2) a relation between Evasion coping and neurotic and immature defenses; 3) a relation between Escape coping and immature defenses. The correspondences between defense mechanisms and coping strategies proposed by Plutchik's psycho-evolutionary emotional model are partly validated: links were confirmed between (a) the defense undoing and Escape or Evasion coping; (b) the defense fantasy and Responsabilization coping; (c) the defense sublimation and Problem-solving coping; (d) the defense sublimation and Responsabilization or Problem-solving coping; and (e) the defense undoing and Responsabilization coping.

  10. Artemisinin resistance--modelling the potential human and economic costs.

    PubMed

    Lubell, Yoel; Dondorp, Arjen; Guérin, Philippe J; Drake, Tom; Meek, Sylvia; Ashley, Elizabeth; Day, Nicholas P J; White, Nicholas J; White, Lisa J

    2014-11-23

    Artemisinin combination therapy is recommended as first-line treatment for falciparum malaria across the endemic world and is increasingly relied upon for treating vivax malaria where chloroquine is failing. Artemisinin resistance was first detected in western Cambodia in 2007, and is now confirmed in the Greater Mekong region, raising the spectre of a malaria resurgence that could undo a decade of progress in control, and threaten the feasibility of elimination. The magnitude of this threat has not been quantified. This analysis compares the health and economic consequences of two future scenarios occurring once artemisinin-based treatments are available with high coverage. In the first scenario, artemisinin combination therapy (ACT) is largely effective in the management of uncomplicated malaria and severe malaria is treated with artesunate, while in the second scenario ACTs are failing at a rate of 30%, and treatment of severe malaria reverts to quinine. The model is applied to all malaria-endemic countries using their specific estimates for malaria incidence, transmission intensity and GDP. The model describes the direct medical costs for repeated diagnosis and retreatment of clinical failures as well as admission costs for severe malaria. For productivity losses, the conservative friction costing method is used, which assumes a limited economic impact for individuals who are no longer economically active until they are replaced from the unemployment pool. Using conservative assumptions and parameter estimates, the model projects an excess of 116,000 deaths annually in the scenario of widespread artemisinin resistance. The predicted medical costs for retreatment of clinical failures and for management of severe malaria exceed US$32 million per year. Productivity losses resulting from excess morbidity and mortality were estimated at US$385 million for each year during which failing ACT remained in use as first-line treatment. These 'ballpark' figures for the magnitude of the health and economic threat posed by artemisinin resistance add weight to the call for urgent action to detect the emergence of resistance as early as possible and contain its spread from known locations in the Mekong region to elsewhere in the endemic world.
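
    The cost accounting sketched above has two components, direct medical costs of retreatment and friction-costed productivity losses; the toy calculation below shows their structure. Apart from the 116,000 excess deaths quoted in the abstract, every input (case counts, failure rate, unit costs, friction period, daily GDP) is a hypothetical placeholder, not one of the paper's parameter estimates.

    ```python
    # Illustrative accounting of the two cost components described above;
    # all numbers except the 116,000 excess deaths are hypothetical placeholders.

    def resistance_costs(cases, failure_rate, retreat_cost, severe_frac,
                         admission_cost, deaths, friction_days, daily_gdp):
        """Return (medical_cost, productivity_loss) for one year, in US$."""
        failures = cases * failure_rate                      # clinical failures needing retreatment
        medical = failures * (retreat_cost + severe_frac * admission_cost)
        # Friction costing: output is lost only until the worker is replaced
        # from the unemployment pool, i.e. for a limited "friction" period.
        productivity = deaths * friction_days * daily_gdp
        return medical, productivity

    if __name__ == "__main__":
        med, prod = resistance_costs(
            cases=10_000_000, failure_rate=0.30, retreat_cost=2.0,
            severe_frac=0.02, admission_cost=80.0,
            deaths=116_000, friction_days=90, daily_gdp=15.0,
        )
        print(f"medical: ${med/1e6:.1f} M/yr, productivity: ${prod/1e6:.1f} M/yr")
    ```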

  11. Free Energies of Quantum Particles: The Coupled-Perturbed Quantum Umbrella Sampling Method.

    PubMed

    Glover, William J; Casey, Jennifer R; Schwartz, Benjamin J

    2014-10-14

    We introduce a new simulation method called Coupled-Perturbed Quantum Umbrella Sampling that extends the classical umbrella sampling approach to reaction coordinates involving quantum mechanical degrees of freedom. The central idea in our method is to solve coupled-perturbed equations to find the response of the quantum system's wave function along a reaction coordinate of interest. This allows for propagation of the system's dynamics under the influence of a quantum biasing umbrella potential and provides a method to rigorously undo the effects of the bias to compute equilibrium ensemble averages. In this way, one can drag electrons into regions of high free energy where they would otherwise not go, thus enabling chemistry by fiat. We demonstrate the applicability of our method for two condensed-phase systems of interest. First, we consider the interaction of a hydrated electron with an aqueous sodium cation, and we calculate a potential of mean force that shows that an e(-):Na(+) contact pair is the thermodynamically favored product starting from either a neutral sodium atom or the separate cation and electron species. Second, we present the first determination of a hydrated electron's free-energy profile relative to an air/water interface. For the particular model parameters used, we find that the hydrated electron is more thermodynamically stable in the bulk rather than at the interface. Our analysis suggests that the primary driving force keeping the electron away from the interface is the long-range electron-solvent polarization interaction rather than the short-range details of the chosen pseudopotential.
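
    The "undoing" of the bias referred to above is, in any umbrella sampling scheme, a reweighting of the biased ensemble by exp(+beta*W) when forming averages. A minimal single-window classical sketch follows; the double-well potential, harmonic umbrella, sampler and observable are illustrative assumptions and do not reproduce the paper's coupled-perturbed quantum machinery. In a full calculation, many overlapping windows would be combined with WHAM or MBAR.

    ```python
    import numpy as np

    # Minimal single-window umbrella-sampling sketch on a toy 1D system.
    # Sample the biased ensemble under U(x) + W(x) with Metropolis Monte Carlo,
    # then undo the bias by weighting each sample with exp(+beta * W(x)).
    # Potentials, umbrella parameters and the observable are illustrative.

    beta = 1.0
    U = lambda x: (x**2 - 1.0)**2                 # toy physical potential (double well)
    x0, kspring = 1.0, 50.0                       # umbrella centre and stiffness (assumed)
    W = lambda x: 0.5 * kspring * (x - x0)**2     # harmonic biasing umbrella

    rng = np.random.default_rng(0)
    x, samples = x0, []
    for _ in range(100_000):                      # Metropolis sampling of the biased ensemble
        trial = x + rng.normal(scale=0.1)
        dE = (U(trial) + W(trial)) - (U(x) + W(x))
        if dE < 0 or rng.random() < np.exp(-beta * dE):
            x = trial
        samples.append(x)

    xs = np.array(samples[10_000:])               # discard burn-in
    w = np.exp(beta * W(xs))                      # undo the bias: weight by exp(+beta * W)
    print("biased <x>     =", xs.mean())
    print("reweighted <x> =", np.sum(w * xs) / np.sum(w))
    ```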

  12. Remains of care: opioid substitution treatment in the post-welfare state.

    PubMed

    Leppo, Anna; Perälä, Riikka

    2017-07-01

    This article examines how the amplified role of pharmaceutical substances in addiction treatment affects the everyday realisation of care, particularly the relationship between workers and patients, in so called austere environments. Theoretically the article draws firstly on the literature that links pharmaceuticalisation to the neoliberal undoing of central public structures and institutions of care, and secondly on Anne-Marie Mol's concept of the logic of care. Based on an ethnographic analysis of the everyday life at a Finnish opioid substitution treatment clinic we show the mechanisms through which the realisation of pharmacotherapy can, in the current political climate, result in a very narrow understanding of drug problems and minimal human contact between patients and professionals. Our analysis manifests an important shift in the logic of addiction treatment and health-care policy more broadly; namely, a growing tendency to emphasise the need for patients to care for themselves and make good choices with limited help from formal care institutions and professionals. We call this new ethos the logic of austerity. © 2017 Foundation for the Sociology of Health & Illness.

  13. Resignifying the sickle cell gene: Narratives of genetic risk, impairment and repair.

    PubMed

    Berghs, Maria; Dyson, Simon M; Atkin, Karl

    2017-03-01

    Connecting theoretical discussion with empirical qualitative work, this article examines how sickle cell became a site of public health intervention in terms of 'racialised' risks. Historically, sickle cell became socio-politically allied to ideas of repair, in terms of the state improving the health of a neglected ethnic minority population. Yet, we elucidate how partial improvements in care and education arose alongside preventative public health screening efforts. Using qualitative research based in the United Kingdom, we show how a focus on collective efforts of repair can lie in tension with how services and individuals understand and negotiate antenatal screening. We illustrate how screening for sickle cell disorder calls into question narrative identity, undoing paradigms in which ethnicity, disablement and genetic impairment become framed. Research participants noted that rather than 'choices', it is 'risks' and their negotiation that are a part of discourses of modernity and the new genetics. Furthermore, while biomedical paradigms are rationally and ethically (de)constructed by participants, this was never fully engaged with by professionals, contributing to overall perception of antenatal screening as disempowering and leading to disengagement.

  14. Everyday places, heterosexist spaces and risk in contemporary Sweden.

    PubMed

    Nygren, Katarina Giritli; Öhman, Susanna; Olofsson, Anna

    2016-01-01

    Subjective feelings of risk are a central feature of everyday life, and evidence shows that people who do not conform to contemporary normative notions are often more exposed to everyday risks than others. Despite this, normative notions are rarely acknowledged as risk objects. By drawing on the theory of 'doing' and 'undoing' risk, which combines intersectional and risk theory, this study contributes new perspectives on the everyday risks in contemporary society that face people who many would label as being 'at risk' - lesbian, gay, bisexual and transgender people. The study consists of five focus group interviews with lesbian, gay, bisexual and transgender people of different ages in Sweden. Findings pinpoint risks and how these are done and un-done in different spheres of interviewees' lives: the emotional risks prevailing in their private lives; the risk of discrimination at work and in relations with other institutions; and the risk of violence and harassment in public places. These risks are all related to the heteronormative order in which the mere fact of being lesbian, gay, bisexual and transgender is perceived as a risk.

  15. Baryon Acoustic Oscillations reconstruction with pixels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Obuljen, Andrej; Villaescusa-Navarro, Francisco; Castorina, Emanuele

    2017-09-01

    Gravitational non-linear evolution induces a shift in the position of the baryon acoustic oscillations (BAO) peak together with a damping and broadening of its shape, which bias and degrade the accuracy with which the position of the peak can be determined. BAO reconstruction is a technique developed to undo part of the effect of non-linearities. We present and analyse a reconstruction method that consists of displacing pixels instead of galaxies and whose implementation is easier than the standard reconstruction method. We show that this method is equivalent to the standard reconstruction technique in the limit where the number of pixels becomes very large. This method is particularly useful in surveys where individual galaxies are not resolved, as in 21cm intensity mapping observations. We validate this method by reconstructing mock pixelated maps, which we build from the distribution of matter and halos in real- and redshift-space, from a large set of numerical simulations. We find that this method is able to decrease the uncertainty in the BAO peak position by 30-50% over the typical angular resolution scales of 21 cm intensity mapping experiments.
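
    The pixel-displacement step shares its core operation with standard reconstruction: estimate the Zel'dovich displacement from the smoothed density field and move material back along it. A toy 2D sketch of that operation is below; the grid size, box size, smoothing scale and the white-noise input field are illustrative stand-ins for a real 3D 21cm intensity map, and the deposit scheme is simple nearest-grid-point.

    ```python
    import numpy as np

    # Toy 2D sketch of density-field reconstruction: estimate the Zel'dovich
    # displacement Psi from the smoothed overdensity (div Psi = -delta_s, i.e.
    # Psi_k = i k delta_k / k^2) and move the *pixels* back by -Psi.  All
    # parameters and the input field are illustrative.

    n, boxsize, smooth = 128, 1000.0, 10.0           # pixels per side, Mpc/h, Mpc/h (assumed)
    rng = np.random.default_rng(1)
    delta = rng.normal(size=(n, n))                  # stand-in for the pixelated overdensity map

    k = 2 * np.pi * np.fft.fftfreq(n, d=boxsize / n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                                   # avoid division by zero at k = 0

    delta_k = np.fft.fft2(delta) * np.exp(-0.5 * k2 * smooth**2)   # Gaussian-smoothed field
    psi_x = np.real(np.fft.ifft2(1j * kx * delta_k / k2))          # Zel'dovich displacement
    psi_y = np.real(np.fft.ifft2(1j * ky * delta_k / k2))

    # Move each pixel's content back by -Psi (nearest-grid-point deposit).
    cell = boxsize / n
    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    i_new = np.round(i - psi_x / cell).astype(int) % n
    j_new = np.round(j - psi_y / cell).astype(int) % n
    delta_rec = np.zeros_like(delta)
    np.add.at(delta_rec, (i_new, j_new), delta)      # reconstructed (pixel-displaced) map
    print("input rms:", delta.std(), " reconstructed rms:", delta_rec.std())
    ```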

  16. Comparison of ego defenses among physically abused children, neglected, and non-maltreated children.

    PubMed

    Finzi, Ricky; Har-Even, Dov; Weizman, Abraham

    2003-01-01

    The nature and level of ego functioning were assessed in 41 recently detected physically abused children, and in two control groups of 38 neglected and 35 non-abused/non-neglected children (aged 6 to 12 years), using the Child Suicidal Potential Scales (CSPS). The results obtained in this study support the hypothesis that the influences of parental violence on the child's ego functions are detrimental, as reflected by significantly higher impairments in affect regulation (like irritability, anger, passivity, depression), low levels of impulse control, distortions in reality testing, and extensive operation of immature defense mechanisms in the physically abused children in comparison to the controls. Significant differences between the physically abused and the non-abused/non-neglected children were found for all mechanisms except displacement. The differences between the physically abused and neglected children for regression, denial and splitting, projection, and introjection (high scores for the physically abused children), and for compensation and undoing (higher scores for the neglected children) were also significant. It is suggested that physically abused children should be distinguished as a high-risk population for future personality disorders.

  17. Linear phase conjugation for atmospheric aberration compensation

    NASA Astrophysics Data System (ADS)

    Grasso, Robert J.; Stappaerts, Eddy A.

    1998-01-01

    Atmospherically induced aberrations can seriously degrade laser performance, greatly affecting the beam that finally reaches the target. Lasers propagated over any distance in the atmosphere suffer from a significant decrease in fluence at the target due to these aberrations. This is especially so for propagation over long distances. It is due primarily to fluctuations in the atmosphere over the propagation path and to platform motion relative to the intended aimpoint. Also, delivery of high fluence to the target typically requires low beam divergence; thus, atmospheric turbulence, platform motion, or both result in a lack of fine aimpoint control to keep the beam directed at the target. To improve both the beam quality and the amount of laser energy delivered to the target, Northrop Grumman has developed the Active Tracking System (ATS), a novel linear phase conjugation aberration compensation technique. Utilizing a silicon spatial light modulator (SLM) as a dynamic wavefront reversing element, ATS undoes aberrations induced by the atmosphere, platform motion or both. ATS continually tracks the target as well as compensates for atmospheric and platform motion induced aberrations. This results in a high fidelity, near-diffraction-limited beam delivered to the target.
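
    The "wavefront reversal" performed by the SLM can be illustrated numerically: if a beam picks up a random phase screen, replaying the complex conjugate of the distorted field back through the same screen cancels the aberration. The screen statistics and array size below are arbitrary assumptions.

    ```python
    import numpy as np

    # Toy illustration of linear phase conjugation: a unit-amplitude phase screen
    # distorts a plane wave; sending the conjugate of the distorted field back
    # through the same screen cancels the aberration.  Screen strength is arbitrary.

    rng = np.random.default_rng(0)
    n = 256
    screen = np.exp(1j * rng.normal(scale=2.0, size=n))    # atmospheric phase screen

    outgoing = np.ones(n, dtype=complex)                    # ideal plane wave
    distorted = outgoing * screen                           # one pass through the atmosphere
    conjugated = np.conj(distorted)                         # wavefront reversal (the SLM's role)
    returned = conjugated * screen                          # second pass through the same screen

    # |screen| = 1, so returned = conj(outgoing): the phase aberration is undone.
    print("residual phase std:", np.angle(returned).std())  # ~0
    ```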

  18. (Un)doing gender in a rehabilitation context: a narrative analysis of gender and self in stories of chronic muscle pain.

    PubMed

    Ahlsen, Birgitte; Bondevik, Hilde; Mengshoel, Anne Marit; Solbrække, Kari Nyheim

    2014-01-01

    To explore how gender appears in the stories of self told by men and women undergoing rehabilitation for chronic muscle pain. The material, which consists of qualitative interviews with 10 men and 6 women with chronic neck pain, was analyzed from a gender-sensitive perspective using narrative method. The analysis was inspired by Arthur Frank's typologies of illness narratives (restitution, chaos and quest). The women's stories displayed selves that were actively trying to transcend their former identity and life conditions, in which their pain was embedded. Their stories tended to develop from "chaos" towards a quest narrative with a more autonomous self. The selves in the men's stories appeared to be actively seeking a solution to the pain within a medical context. Framed as a restitution narrative, rooted in a biomedical model of disease, the voice often heard in the men's stories was of a self dependent on future health care. Our findings contribute greater nuance to a dominant cultural conception that men are more independent than women in relation to health care. Understanding the significance of gender in the construction of selves in stories of chronic pain may help to improve the health care offered to patients suffering from chronic pain. Implications for Rehabilitation: Patients tell stories that powerfully communicate their particular illness experiences. Cultural expectations of femininity and masculinity play a significant role with regard to how the patients construct their stories, which may be important to health professionals' perceptions of the patients' problem. Health care professionals should listen carefully to the patient's own story and be sensitive to the significance of gender when trying to understand these people's health problem.

  19. Genetic modifications for personal enhancement: a defence.

    PubMed

    Murphy, Timothy F

    2014-04-01

    Bioconservative commentators argue that parents should not take steps to modify the genetics of their children even in the name of enhancement because of the damage they predict for values, identities and relationships. Some commentators have even said that adults should not modify themselves through genetic interventions. One commentator worries that genetic modifications chosen by adults for themselves will undermine moral agency, lead to less valuable experiences and fracture people's sense of self. These worries are not justified, however, since the effects of modification will not undo moral agency as such. Adults can still have valuable experiences, even if some prior choices no longer seem meaningful. Changes at the genetic level will not always, either, alienate people from their own sense of self. On the contrary, genetic modifications can help amplify choice, enrich lives and consolidate identities. Ultimately, there is no moral requirement that people value their contingent genetic endowment to the exclusion of changes important to them in their future genetic identities. Through weighing risks and benefits, adults also have the power to consent to, and assume the risks of, genetic modifications for themselves in a way not possible in prenatal genetic interventions.

  20. On the Duality of Forward and Inverse Light Transport.

    PubMed

    Chandraker, Manmohan; Bai, Jiamin; Ng, Tian-Tsong; Ramamoorthi, Ravi

    2011-10-01

    Inverse light transport seeks to undo global illumination effects, such as interreflections, that pervade images of most scenes. This paper presents the theoretical and computational foundations for inverse light transport as a dual of forward rendering. Mathematically, this duality is established through the existence of underlying Neumann series expansions. Physically, it can be shown that each term of our inverse series cancels an interreflection bounce, just as the forward series adds them. While the convergence properties of the forward series are well known, we show that the oscillatory convergence of the inverse series leads to more interesting conditions on material reflectance. Conceptually, the inverse problem requires the inversion of a large light transport matrix, which is impractical for realistic resolutions using standard techniques. A natural consequence of our theoretical framework is a suite of fast computational algorithms for light transport inversion--analogous to finite element radiosity, Monte Carlo and wavelet-based methods in forward rendering--that rely at most on matrix-vector multiplications. We demonstrate two practical applications, namely, separation of individual bounces of the light transport and fast projector radiometric compensation, to display images free of global illumination artifacts in real-world environments.
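
    The duality can be sketched with nothing more than matrix-vector products. As a schematic stand-in for the paper's operator decomposition, write the transport as a directly invertible part D plus an interreflection part R; the inverse then expands as an alternating series in which each extra term cancels one more bounce, mirroring the forward series that adds them. The 3x3 matrices below are made up and chosen so the series converges.

    ```python
    import numpy as np

    # Schematic inverse light transport using only matrix-vector products.
    # With A = D + R (direct part D, interreflections R), the alternating series
    #   A^{-1} y = D^{-1} y - D^{-1} R D^{-1} y + D^{-1} R D^{-1} R D^{-1} y - ...
    # cancels one interreflection bounce per term.  The matrices are made up.

    D = np.diag([1.0, 0.8, 1.2])                     # direct transport (trivially invertible)
    R = np.array([[0.00, 0.10, 0.05],
                  [0.10, 0.00, 0.15],
                  [0.05, 0.15, 0.00]])               # interreflections
    y = np.array([0.9, 0.7, 0.4])                    # observed radiance vector

    def inverse_transport(y, n_terms=30):
        term = np.linalg.solve(D, y)                 # D^{-1} y
        total = term.copy()
        for _ in range(1, n_terms):
            term = -np.linalg.solve(D, R @ term)     # multiply by -D^{-1} R
            total += term
        return total

    x_series = inverse_transport(y)
    x_exact = np.linalg.solve(D + R, y)
    print(np.allclose(x_series, x_exact))            # True once the series has converged
    ```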

  1. Lived Experience of Thai Women with Alcohol Addiction.

    PubMed

    Hanpatchaiyakul, Kulnaree; Eriksson, Henrik; Kijsomporn, Jureerat; Östlund, Gunnel

    2017-12-01

    This study explored the lived experiences of Thai women in relation to alcohol addiction in treatment. Twelve women aged 20 to 65 years participated. The participants were recruited from two special hospitals and one outpatient clinic in a general hospital. Descriptive phenomenology was applied to analyze the transcripts of the individual interviews. The explored phenomenon of Thai women experiencing alcohol addiction included four essential aspects: (1) feeling inferior and worthless, (2) feeling physically and emotionally hurt, (3) fearing physical deterioration and premature death, and (4) feeling superior and powerful. Through these different aspects of Thai women's lived experiences, the following essence was synthesized. The essence of the lived experience of alcohol addiction among the studied Thai women was ambivalence between feeling inferior and worthless and feeling superior and powerful when acting as a man. Drinking alcohol lessened life's difficulties and fears, for example of violence, bodily deterioration, premature death and marginalization from family and society. Thai women who experience alcohol addiction are treated with gender-related double standards when trying to undo traditional gender roles. Their marginalization from family and society deepens, making them even more vulnerable to the positive side effects of alcohol drinking. Copyright © 2017. Published by Elsevier B.V.

  2. Continuous deep sedation and homicide: an unsolved problem in law and professional morality.

    PubMed

    den Hartogh, Govert

    2016-06-01

    When a severely suffering dying patient is deeply sedated, and this sedated condition is meant to continue until his death, the doctor involved often decides to abstain from artificially administering fluids. For this dual procedure almost all guidelines require that the patient should not have a life expectancy beyond a stipulated maximum of days (4-14). The reason obviously is that in case of a longer life-expectancy the patient may die from dehydration rather than from his lethal illness. But no guideline tells us how we should describe the dual procedure in case of a longer life-expectancy. Many arguments have been advanced why we should not consider it to be a form of homicide, that is, ending the life of the patient (with or without his request). I argue that none of these arguments, taken separately or jointly, is persuasive. When a commission, even one that is not itself life-shortening, foreseeably renders a person unable to undo the life-shortening effects of another, simultaneous omission, the commission and the omission together should be acknowledged to kill her. I discuss the legal and ethical implications of this conclusion.

  3. The neurobiology of reward and cognitive control systems and their role in incentivizing health behavior.

    PubMed

    Garavan, Hugh; Weierstall, Karen

    2012-11-01

    This article reviews the neurobiology of cognitive control and reward processes and addresses their role in the treatment of addiction. We propose that the neurobiological mechanisms involved in treatment may differ from those involved in the etiology of addiction and consequently are worthy of increased investigation. We review the literature on reward and control processes and evidence of differences in these systems in drug addicted individuals. We also review the relatively small literature on neurobiological predictors of abstinence. We conclude that prefrontal control systems may be central to a successful recovery from addiction. The frontal lobes have been shown to regulate striatal reward-related processes, to be among the regions that predict treatment outcome, and to show elevated functioning in those who have succeeded in maintaining abstinence. The evidence of the involvement of the frontal lobes in recovery is consistent with the hypothesis that recovery is a distinct process that is more than the undoing of those processes involved in becoming addicted and a return to the pre-addiction state of the individual. The extent to which these frontal systems are engaged by treatment interventions may contribute to their efficacy. Copyright © 2012 Elsevier Inc. All rights reserved.

  4. Undoing climate warming by atmospheric carbon-dioxide removal: can a holocene-like climate be restored?

    NASA Astrophysics Data System (ADS)

    MacDougall, Andrew

    2013-04-01

    Understandably, most climate modelling studies of future climate have focused on the effects of carbon emissions in the present century or the long-term fate of anthropogenically emitted carbon. These studies make an assumption: that once net anthropogenic carbon emissions cease, humanity will make no further effort to intervene in atmospheric composition. There is a case to be made, however, that there will be a desire to return to a "safe" atmospheric concentration of CO2. Realistically this implies synthetically removing CO2 from the atmosphere and storing it in some geologically stable form. For this study, experiments were conducted using the University of Victoria Earth System Climate Model (UVic ESCM) forced with novel future atmospheric trace-gas concentration pathways to explore a gradual return to pre-industrial radiative forcing. The concentration pathways follow each RCP (2.6, 4.5, 6.0, and 8.5) exactly until the peak CO2 concentration of that RCP is reached, at which point atmospheric CO2 is reduced at the same rate it increased until the 1850 concentration of CO2 is reached. Non-CO2 greenhouse gas forcing follows the prescribed RCP path until the year of peak CO2, then is subsequently linearly reduced to pre-industrial forcing. Pasture and crop areas are also gradually reduced to their pre-industrial extent. Under the middle two concentration pathways (4.5 and 6.0) a climate resembling the 20th century climate can be restored by the 25th century, although surface temperature remains above the pre-industrial temperature until at least the 30th century. Due to carbon-cycle feedbacks the quantity of carbon that must be removed from the atmosphere is larger than the quantity that was originally emitted. For concentration pathways 2.6, 4.5, and 6.0 the sequestered CO2 is 115-190% of the original cumulative carbon emissions. These results suggest that even with monumental effort to remove CO2 from the atmosphere, humanity will be living with the consequences of fossil fuel emissions for a very long time.
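
    The mirrored concentration pathways are simple to construct: follow a trajectory to its CO2 peak, then retrace the rise at the same rate back down to the starting (1850) value. A few-line sketch is below; the linear ramp standing in for an RCP pathway is purely illustrative, not real RCP data.

    ```python
    import numpy as np

    # Build a "mirrored" CO2 pathway: follow the prescribed trajectory until its
    # peak, then decrease CO2 at the same rate it increased, back to the value at
    # the start of the series.  The input ramp is a toy stand-in for an RCP.

    def mirrored_pathway(start_year, co2):
        co2 = np.asarray(co2, dtype=float)
        peak = int(np.argmax(co2))
        rise = co2[:peak + 1]                        # follow the pathway up to its peak
        out = np.concatenate([rise, rise[::-1][1:]]) # retrace the rise back down
        return np.arange(start_year, start_year + len(out)), out

    years = np.arange(1850, 2101)
    toy_rcp = 284.0 + 1.0 * (years - 1850)           # illustrative ramp (ppm), not a real RCP
    t, pathway = mirrored_pathway(1850, toy_rcp)
    print(t[-1], pathway[-1])                        # back to ~284 ppm, centuries after the peak
    ```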

  5. Quality of life at the retirement transition: Life course pathways in an early 'baby boom' birth cohort.

    PubMed

    Wildman, Josephine M; Moffatt, Suzanne; Pearce, Mark

    2018-06-01

    Promoting quality of life (QoL) in later life is an important policy goal. However, studies using prospective data to explore the mechanisms by which earlier events influence QoL in older age are lacking. This study is the first to use prospective data to investigate pathways by which a range of measures of life-course socioeconomic status contribute to later-life QoL. The study uses data from the Newcastle Thousand Families Study cohort (N = 1142), an early 'baby-boom' birth cohort born in 1947 in Newcastle upon Tyne, an industrial city in north-east England. Using prospective survey data collected between birth and later adulthood (N = 393), a path analysis investigated the effects and relative contributions of a range of life-course socioeconomic factors to QoL at age 62-64 measured using the CASP-19 scale. Strong positive effects on later-life QoL were found for advantaged occupational status in mid-life and better self-reported health, employment and mortgage-freedom in later adulthood. Significant positive indirect effects on QoL were found from social class at birth and achieved education level, mediated through later-life socioeconomic advantage. Experiencing no adverse events by age five had a large total positive effect on QoL at age 62-64, comprising a direct effect and indirect effects, mediated through education, mid-life social class and later-life self-reported health. Results support a pathway model with the effects of factors in earlier life acting via later-life factors, and an accumulation model with earlier-life factors having large total, cumulative effects on later-life QoL. The presence of a direct effect of adverse childhood events by age five on QoL suggests a 'critical period' and indicates that policies across the life-course are needed to promote later-life QoL, with policies directed towards older adults perhaps too late to 'undo the damage' of earlier adverse events. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Nearly noiseless amplification of microwave signals with a Josephson parametric amplifier

    NASA Astrophysics Data System (ADS)

    Castellanos-Beltran, Manuel

    2009-03-01

    A degenerate parametric amplifier transforms an incident coherent state by amplifying one of its quadrature components while deamplifying the other. This transformation, when performed by an ideal parametric amplifier, is completely deterministic and reversible; therefore the amplifier in principle can be noiseless. We attempt to realize a noiseless amplifier of this type at microwave frequencies with a Josephson parametric amplifier (JPA). To this end, we have built a superconducting microwave cavity containing many dc-SQUIDs. This arrangement creates a non-linear medium in a cavity and it is closely analogous to an optical parametric amplifier. In my talk, I will describe the current performance of this circuit, where I show I can amplify signals with less added noise than a quantum-limited amplifier that amplifies both quadratures. In addition, the JPA also squeezes the electromagnetic vacuum fluctuations by 10 dB. Finally, I will discuss our effort to put two such amplifiers in series in order to undo the first stage of squeezing with a second stage of amplification, demonstrating that the amplification process is truly reversible. M. A. Castellanos-Beltran, K. D. Irwin, G. C. Hilton, L. R. Vale and K. W. Lehnert, Nature Physics, published online, http://dx.doi.org/10.1038/nphys1090 (2008).
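
    In quadrature language, the reversibility invoked above is simply that the ideal degenerate-amplifier transformation has an inverse of the same form, so a second stage can undo the squeezing produced by the first. A schematic, idealized (lossless, noiseless) statement, with G the amplitude gain:

    ```latex
    % Ideal degenerate parametric amplification with amplitude gain G:
    % one quadrature is amplified, the orthogonal one deamplified.
    \begin{aligned}
    \text{first stage:}\quad  & X_1' = G\,X_1, && X_2' = X_2/G,\\
    \text{second stage:}\quad & X_1'' = X_1'/G = X_1, && X_2'' = G\,X_2' = X_2.
    \end{aligned}
    ```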

  7. Reversing one's fortune by pushing away bad luck.

    PubMed

    Zhang, Yan; Risen, Jane L; Hosey, Christine

    2014-06-01

    Across cultures, people try to "undo" bad luck with superstitious rituals such as knocking on wood, spitting, or throwing salt. We suggest that these rituals reduce the perceived likelihood of anticipated negative outcomes because they involve avoidant actions that exert force away from one's representation of self, which simulates the experience of pushing away bad luck. Five experiments test this hypothesis by having participants tempt fate and then engage in avoidant actions that are either superstitious (Experiment 1, knocking on wood) or nonsuperstitious (Experiments 2-5, throwing a ball). We find that participants who knock down (away from themselves) or throw a ball think that a jinxed negative outcome is less likely than participants who knock up (toward themselves) or hold a ball. Experiments 3 and 4 provide evidence that after tempting fate, engaging in an avoidant action leads to less clear mental representations for the jinxed event, which, in turn, leads to lower perceived likelihoods. Finally, we demonstrate that engaging in an avoidant action-rather than creating physical distance-is critical for reversing the perceived effect of the jinx. Although superstitions are often culturally defined, the underlying psychological processes that give rise to them may be shared across cultures. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carron, Julien; Lewis, Antony; Challinor, Anthony, E-mail: j.carron@sussex.ac.uk, E-mail: Antony.Lewis@sussex.ac.uk, E-mail: a.d.challinor@ast.cam.ac.uk

    We present a first internal delensing of CMB maps, both in temperature and polarization, using the public foreground-cleaned (SMICA) Planck 2015 maps. After forming quadratic estimates of the lensing potential, we use the corresponding displacement field to undo the lensing on the same data. We build differences of the delensed spectra to the original data spectra specifically to look for delensing signatures. After taking into account reconstruction noise biases in the delensed spectra, we find an expected sharpening of the power spectrum acoustic peaks with a delensing efficiency of 29% (TT), 25% (TE) and 22% (EE). The detection significance of the delensing effects is very high in all spectra: 12σ in EE polarization; 18σ in TE; and 20σ in TT. The null hypothesis of no lensing in the maps is rejected at 26σ. While direct detection of the power in lensing B-modes themselves is not possible at high significance at Planck noise levels, we do detect (at 4.5σ, under the null hypothesis) delensing effects in the B-mode map, with a 7% reduction in lensing power. Our results provide a first demonstration of polarization delensing, and generally of internal CMB delensing, and stand in agreement with the baseline ΛCDM Planck 2015 cosmology expectations.
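
    The delensing operation itself is a remapping of the observed map along the (negative of the) estimated deflection field. A toy flat-sky sketch is below; the synthetic map and deflection field stand in for the SMICA data and the quadratic-estimator output, and all amplitudes are arbitrary.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter, map_coordinates

    # Toy flat-sky delensing.  Lensing remaps the unlensed map as
    # T_len(x) = T_unl(x + d(x)); given an estimate of the deflection d,
    # evaluating the lensed map at x - d(x) approximately undoes the remapping.
    # The map and deflection below are synthetic stand-ins, not Planck data.

    rng = np.random.default_rng(0)
    n = 128

    def smooth_unit_field(sigma):
        f = gaussian_filter(rng.normal(size=(n, n)), sigma)
        return f / f.std()

    unlensed = smooth_unit_field(3.0)                 # toy "CMB" map
    dx = 2.0 * smooth_unit_field(8.0)                 # deflection components, ~2 px rms (assumed)
    dy = 2.0 * smooth_unit_field(8.0)

    i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    lensed = map_coordinates(unlensed, [i + dx, j + dy], order=3, mode="wrap")
    delensed = map_coordinates(lensed, [i - dx, j - dy], order=3, mode="wrap")

    print("rms(lensed   - unlensed):", np.std(lensed - unlensed))
    print("rms(delensed - unlensed):", np.std(delensed - unlensed))   # smaller: lensing largely undone
    ```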

  9. New Day for Longest-Working Mars Rover

    NASA Image and Video Library

    2018-02-16

    NASA's Mars Exploration Rover Opportunity recorded the dawn of the rover's 4,999th Martian day, or sol, with its Panoramic Camera (Pancam) on Feb. 15, 2018, yielding this processed, approximately true-color scene. The view looks across Endeavour Crater, which is about 14 miles (22 kilometers) in diameter, from the inner slope of the crater's western rim. Opportunity has driven a little over 28.02 miles (45.1 kilometers) since it landed in the Meridiani Planum region of Mars in January 2004, for what was planned as a 90-sol mission. A sol lasts about 40 minutes longer than an Earth day. This view combines three separate Pancam exposures taken through filters centered on wavelengths of 601 nanometers (red), 535 nanometers (green) and 482 nanometers (blue). It was processed at Texas A&M University to correct for some of the oversaturation and glare, though it still includes some artifacts from pointing a camera with a dusty lens at the Sun. The processing includes radiometric correction, interpolation to fill in gaps in the data caused by saturation due to the Sun's brightness, and warping the red and blue images to undo the effects of time passing between each of the exposures through different filters. https://photojournal.jpl.nasa.gov/catalog/PIA22221

  10. Reversing the conventional leather processing sequence for cleaner leather production.

    PubMed

    Saravanabhavan, Subramani; Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari

    2006-02-01

    Conventional leather processing generally involves a combination of single and multistep processes that employs as well as expels various biological, inorganic, and organic materials. It involves nearly 14-15 steps and discharges a huge amount of pollutants. This is primarily due to the fact that conventional leather processing employs a "do-undo" process logic. In this study, the conventional leather processing steps have been reversed to overcome the problems associated with the conventional method. The charges of the skin matrix and of the chemicals and pH profiles of the process have been judiciously used for reversing the process steps. This reversed process eventually avoids several acidification and basification/neutralization steps used in conventional leather processing. The developed process has been validated through various analyses such as chromium content, shrinkage temperature, softness measurements, scanning electron microscopy, and physical testing of the leathers. Further, the performance of the leathers is shown to be on par with conventionally processed leathers through bulk property evaluation. The process enjoys a significant reduction in COD and TS by 53 and 79%, respectively. Water consumption and discharge is reduced by 65 and 64%, respectively. Also, the process benefits from significant reduction in chemicals, time, power, and cost compared to the conventional process.

  11. The relationship between trait emotional intelligence, resiliency, and mental health in older adults: the mediating role of savouring.

    PubMed

    Wilson, Claire A; Saklofske, Donald H

    2018-05-01

    The present study explores savouring, defined as the process of attending to positive experiences, as a mediator in the relationships between resiliency, trait emotional intelligence (EI), and subjective mental health in older adults. Following Fredrickson's Broaden and Build Theory of positive emotions, the present study aims to extend our understanding of the underlying processes that link resiliency and trait EI with self-reported mental health in older adulthood. A sample of 149 adults aged 65 and over (M = 73.72) were recruited from retirement homes and community groups. Participants completed measures of resiliency, savouring, trait EI, and subjective mental health either online or in a paper format. Path analysis revealed that savouring fully mediated the relationship between resiliency and mental health. However, trait EI did not significantly predict mental health in this sample. These findings provided partial support for the Broaden and Build Theory of positive emotions. As anticipated, savouring imitated the broadening effect of positive emotions by mediating the relationship between resiliency and mental health. However, savouring failed to reflect the undoing effect of positive emotions and did not mediate the relationship between EI and mental health. These findings have implications for positive psychology exercises and may be a simple, yet effective means of improving the life quality of older adults.

  12. Substantial Equivalence Standards in Tobacco Governance: Statutory Clarity and Regulatory Precedent for the FSPTCA.

    PubMed

    Carpenter, Daniel; Connolly, Gregory N; Lempert, Lauren Kass

    2017-08-01

    The Family Smoking Prevention and Tobacco Control Act (FSPTCA) of 2009 creates the first national system of premarket regulation of tobacco products in American history. The FDA must now review and give marketing authorization to all new tobacco products, based on a public health standard, before they can be legally marketed. Yet the law also contains an alternative pathway for market entry-the substantial equivalence (SE) clause-by which novel and altered tobacco products can be marketed by demonstrating their substantial equivalence to existing products. Over 99 percent of tobacco product applications sent to the FDA under the new law have used this mechanism, and loose application of the SE mechanism carries the risk of undoing the FDA's gatekeeping power under the law. We review the statutory and regulatory precedent for SE, examining the FSPTCA itself as well as regulatory precedent from drug and device regulation (from which the term substantial equivalence and much of the associated statutory language was derived). Our review of standards and scientific precedent demonstrates that exacting scrutiny under the public health standard should govern all SE reviews and that clinical data incorporating social scientific evidence should be routinely required for SE claims by tobacco product sponsors. Copyright © 2017 by Duke University Press.

  13. Global, 4D Differential Emission Measure Analysis of EIT 17.1, 19.5 and 28.4 nm Images

    NASA Astrophysics Data System (ADS)

    Frazin, R. A.; Vasquez, A. M.; Kamalabadi, F.

    2007-12-01

    We present for the first time the results of a method that combines 3D tomography and differential emission measure (DEM) analysis to determine the 3D local differential emission measure (LDEM), which is a measure of the amount of plasma as a function of electron temperature within each volume element of the computation grid. The volume elements are (3 deg X 3 deg X 0.02 Rs). The input data are a time series of EUV images taken in the 17.1, 19.5 and 28.4 nm bands. The method, developed theoretically in a previous paper [Frazin et al. 2005, ApJ v. 628, p. 1070], involves a combination of solar rotational tomography (SRT) and classical differential emission measure (DEM) analysis. SRT uses solar rotation to "undo" the line-of-sight integrals, while DEM analysis determines the temperature distribution (LDEM) in each voxel. Temporal variations of the solar corona limit the applicability of SRT to structures that remain relatively stable on the two-week time scale. We show results for certain structures that were judged to be stable by watching the EIT movies. We anticipate dramatic increases in the temperature resolution of this technique with the XRT instrument.
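
    The quantity being reconstructed can be stated compactly: each EUV pixel is a line-of-sight sum, over voxels, of the band's temperature response weighted by the voxel's local DEM. A schematic form (our notation, not the paper's):

    ```latex
    % Intensity of pixel p in band b as a sum over voxels v along its line of sight,
    % with band response R_b(T), local DEM \xi_v(T) and path length \Delta\ell_v:
    I_b(p) \;=\; \sum_{v \,\in\, \mathrm{LOS}(p)} \Delta\ell_v \int R_b(T)\,\xi_v(T)\,\mathrm{d}T .
    % SRT "undoes" the sum over voxels; per-voxel DEM analysis of the three
    % bands then recovers \xi_v(T).
    ```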

  14. An experimental 'Life' for an experimental life: Richard Waller's biography of Robert Hooke (1705).

    PubMed

    Moxham, Noah

    2016-03-01

    Richard Waller's 'Life of Dr Robert Hooke', prefixed to his edition of Hooke's Posthumous Works (1705), is an important source for the life of one of the most eminent members of the early Royal Society. It also has the distinction of being one of the earliest biographies of a man of science to be published in English. I argue that it is in fact the first biography to embrace the subject's natural-philosophical work as the centre of his life, and I investigate Waller's reasons for adopting this strategy and his struggle with the problem of how to represent an early experimental philosopher in print. I suggest that Waller eschews the 'Christian philosopher' tradition of contemporary biography - partly because of the unusually diverse and fragmentary nature of Hooke's intellectual output - and draws instead upon the structure of the Royal Society's archive as a means of organizing and understanding Hooke's life. The most quoted phrase from Waller's biography is that Hooke became 'to a crime close and reserved' in later life; this essay argues that Waller's biographical sketch was fashioned in order to undo the effects of that reserve. In modelling his approach very closely on the structure of the society's records he was principally concerned with making Hooke's work and biography accessible, intelligible and useful to the fellowship in a context familiar to them, a context which had provided the institutional framework for most of Hooke's adult life. I argue that Waller's 'Life' was also intended to make the largest claims for Hooke's intellectual standing that the author dared in the context of the enmity between Hooke and Isaac Newton once the latter became president of the Royal Society. However, I also adduce fresh manuscript evidence that Waller actually compiled, but did not publish, a defence of Hooke's claim to have discovered the inverse square law of gravity, allowing us to glimpse a much more assertive biography of Hooke than the published version.

  15. Image editing with Adobe Photoshop 6.0.

    PubMed

    Caruso, Ronald D; Postel, Gregory C

    2002-01-01

    The authors introduce Photoshop 6.0 for radiologists and demonstrate basic techniques of editing gray-scale cross-sectional images intended for publication and for incorporation into computerized presentations. For basic editing of gray-scale cross-sectional images, the Tools palette and the History/Actions palette pair should be displayed. The History palette may be used to undo a step or series of steps. The Actions palette is a menu of user-defined macros that save time by automating an action or series of actions. Converting an image to 8-bit gray scale is the first editing function. Cropping is the next action. Both decrease file size. Use of the smallest file size necessary for the purpose at hand is recommended. Final file size for gray-scale cross-sectional neuroradiologic images (8-bit, single-layer TIFF [tagged image file format] at 300 pixels per inch) intended for publication varies from about 700 Kbytes to 3 Mbytes. Final file size for incorporation into computerized presentations is about 10-100 Kbytes (8-bit, single-layer, gray-scale, high-quality JPEG [Joint Photographic Experts Group]), depending on source and intended use. Editing and annotating images before they are inserted into presentation software is highly recommended, both for convenience and flexibility. Radiologists should find that image editing can be carried out very rapidly once the basic steps are learned and automated. Copyright RSNA, 2002
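
    The same basic workflow (convert to 8-bit grayscale, crop, then save a print-resolution TIFF and a small JPEG for slides) can also be scripted outside Photoshop; below is a minimal sketch using the Pillow library, with the file names and crop box as placeholders.

    ```python
    from PIL import Image

    # Minimal scripted analogue of the editing workflow described above.
    # File names and the crop box are placeholders.

    img = Image.open("source_image.png")        # hypothetical exported cross-sectional image
    img = img.convert("L")                      # 8-bit grayscale: first step, shrinks file size
    img = img.crop((40, 30, 472, 462))          # crop away empty borders (left, top, right, bottom)

    # Publication copy: single-layer grayscale TIFF at 300 pixels per inch.
    img.save("figure1.tif", format="TIFF", dpi=(300, 300))

    # Presentation copy: small, high-quality JPEG for slide software.
    img.save("figure1.jpg", format="JPEG", quality=90)
    ```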

  16. What do we mean? On the importance of not abandoning scientific rigor when talking about science education.

    PubMed

    Klahr, David

    2013-08-20

    Although the "science of science communication" usually refers to the flow of scientific knowledge from scientists to the public, scientists direct most of their communications not to the public, but instead to other scientists in their field. This paper presents a case study on this understudied type of communication: within a discipline, among its practitioners. I argue that many of the contentious disagreements that exist today in the field in which I conduct my research--early science education--derive from a lack of operational definitions, such that when competing claims are made for the efficacy of one type of science instruction vs. another, the arguments are hopelessly disjointed. The aim of the paper is not to resolve the current claims and counterclaims about the most effective pedagogies in science education, but rather to note that the assessment of one approach vs. the other is all too often defended on the basis of strongly held beliefs, rather than on the results of replicable experiments, designed around operational definitions of the teaching methods being investigated. A detailed example of operational definitions from my own research on elementary school science instruction is provided. In addition, the paper addresses the issue of how casual use of labels-both within the discipline and when communicating with the public-may inadvertently "undo" the benefits of operational definitions.

  17. Environmental indivisibilities and information costs: fanaticism, agnosticism, and intellectual progress

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olson, M.

    1982-05-01

    This analysis suggests several distinctive policy recommendations about environmental problems. One is that some of the alarms about ecological catastrophes cannot simply be dismissed, even when some of those who sound the alarms seem almost fanatic. The information needed to be sure one way or another is simply lacking, and may not be attainable at reasonable cost for a long time. We are therefore left with inevitable risk. Ecological systems could also be incomparably more robust than the alarmists claim, so we might also be worrying needlessly. The implication for environmental and ecological research is that we should not expect that it will produce conclusive information, but should fund a lot of it anyhow. If previous research has produced few compelling results, valid information about these problems is scarce and therefore more valuable. The harvest of research in the areas characterized by indivisibilities is then poor but precious knowledge. If it is important to be able to change behavior quickly, when and if we finally get the information that the ecosystem can't take any more, then it is important that we have the open-mindedness needed to change our views and policies the moment decisive information arrives. Those who shout wolf too often, and those who are sure there are no wolves around, could be our undoing.

  18. Opportunistic biases: Their origins, effects, and an integrated solution.

    PubMed

    DeCoster, Jamie; Sparks, Erin A; Sparks, Jordan C; Sparks, Glenn G; Sparks, Cheri W

    2015-09-01

    Researchers commonly explore their data in multiple ways before deciding which analyses they will include in the final versions of their papers. While this improves the chances of researchers finding publishable results, it introduces an "opportunistic bias," such that the reported relations are stronger or otherwise more supportive of the researcher's theories than they would be without the exploratory process. The magnitudes of opportunistic biases can often be stronger than those of the effects being investigated, leading to invalid conclusions and a lack of clarity in research results. Authors typically do not report their exploratory procedures, so opportunistic biases are very difficult to detect just by reading the final version of a research report. In this article, we explain how a number of accepted research practices can lead to opportunistic biases, discuss the prevalence of these practices in psychology, consider the different effects that opportunistic biases have on psychological science, evaluate the strategies that methodologists have proposed to prevent or correct for the effects of these biases, and introduce an integrated solution to reduce the prevalence and influence of opportunistic biases. The recent prominence of articles discussing questionable research practices both in scientific journals and in the public media underscores the importance of understanding how opportunistic biases are created and how we might undo their effects. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
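
    A small simulation makes the mechanism concrete: when several exploratory analyses of null data are run and only the strongest is reported, the reported effect is inflated and the false-positive rate far exceeds the nominal 5%. The sample size and number of candidate analyses below are arbitrary illustrative choices.

    ```python
    import numpy as np
    from scipy import stats

    # Simulate opportunistic reporting: for each "study", draw null data, try
    # several candidate analyses (here, several outcome variables), and report
    # only the one with the largest effect.  All settings are illustrative.

    rng = np.random.default_rng(42)
    n_studies, n_per_group, n_analyses = 2000, 30, 8

    best_d, best_p = [], []
    for _ in range(n_studies):
        ds, ps = [], []
        for _ in range(n_analyses):
            a = rng.normal(size=n_per_group)         # null: no true group difference
            b = rng.normal(size=n_per_group)
            t, p = stats.ttest_ind(a, b)
            d = (a.mean() - b.mean()) / np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
            ds.append(abs(d))
            ps.append(p)
        k = int(np.argmax(ds))                       # report the "best" exploratory analysis
        best_d.append(ds[k])
        best_p.append(ps[k])

    print("mean reported |d| under the null:", round(float(np.mean(best_d)), 2))
    print("fraction 'significant' at p < .05:", round(float(np.mean(np.array(best_p) < 0.05)), 2))
    ```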

  19. Bacteria in combination with fertilizers promote root and shoot growth of maize in saline-sodic soil.

    PubMed

    Zafar-Ul-Hye, Muhammad; Farooq, Hafiz Muhammad; Hussain, Mubshar

    2015-03-01

    Salinity is the leading abiotic stress hampering maize (Zea mays L.) growth throughout the world, especially in Pakistan. During salinity stress, the endogenous ethylene level in plants increases, which retards proper root growth and consequent shoot growth of the plants. However, certain bacteria contain the enzyme 1-aminocyclopropane-1-carboxylate (ACC) deaminase, which converts 1-aminocyclopropane-1-carboxylic acid (an immediate precursor of ethylene biosynthesis in higher plants) into ammonia and α-ketobutyrate instead of ethylene. In the present study, two Pseudomonas bacterial strains containing ACC-deaminase were tested separately and in combinations with mineral fertilizers to determine their potential to minimize/undo the effects of salinity on maize plants grown under saline-sodic field conditions. The data recorded at 30, 50 and 70 days after sowing revealed that both the Pseudomonas bacterial strains improved root and shoot length, root and shoot fresh weight, and root and shoot dry weight by up to 34, 43, 35, 71, 55 and 68%, respectively, when applied without chemical fertilizers; these parameters were enhanced by up to 108, 95, 100, 131, 100 and 198%, respectively, when the strains were applied along with chemical fertilizers. It can be concluded that ACC-deaminase Pseudomonas bacterial strains applied alone and in conjunction with mineral fertilizers improved the root and shoot growth of maize seedlings grown in saline-sodic soil.

  20. Bacteria in combination with fertilizers promote root and shoot growth of maize in saline-sodic soil

    PubMed Central

    Zafar-ul-Hye, Muhammad; Farooq, Hafiz Muhammad; Hussain, Mubshar

    2015-01-01

    Salinity is the leading abiotic stress hampering maize (Zea mays L.) growth throughout the world, especially in Pakistan. During salinity stress, the endogenous ethylene level in plants increases, which retards proper root growth and consequent shoot growth of the plants. However, certain bacteria contain the enzyme 1-aminocyclopropane-1-carboxylate (ACC) deaminase, which converts 1-aminocyclopropane-1-carboxylic acid (an immediate precursor of ethylene biosynthesis in higher plants) into ammonia and α-ketobutyrate instead of ethylene. In the present study, two Pseudomonas bacterial strains containing ACC-deaminase were tested separately and in combinations with mineral fertilizers to determine their potential to minimize/undo the effects of salinity on maize plants grown under saline-sodic field conditions. The data recorded at 30, 50 and 70 days after sowing revealed that both the Pseudomonas bacterial strains improved root and shoot length, root and shoot fresh weight, and root and shoot dry weight by up to 34, 43, 35, 71, 55 and 68%, respectively, when applied without chemical fertilizers; these parameters were enhanced by up to 108, 95, 100, 131, 100 and 198%, respectively, when the strains were applied along with chemical fertilizers. It can be concluded that ACC-deaminase Pseudomonas bacterial strains applied alone and in conjunction with mineral fertilizers improved the root and shoot growth of maize seedlings grown in saline-sodic soil. PMID:26221093

  1. Academic Institutionalization of Community Health Services: Way Ahead in Medical Education Reforms

    PubMed Central

    Kumar, Raman

    2012-01-01

    Policy on medical education has a major bearing on the outcome of the health care delivery system. Countries plan and execute the development of human resources in health based on realistic assessments of health system needs. A closer observation of medical education and its impact on the delivery system in India reveals disturbing trends. Primary care forms the backbone of any system for health care delivery. One of the major challenges in India has been a chronic deficiency of trained human resources eager to work in the primary care setting. Attracting talent and employing a skilled workforce seems a distant dream. Looking specifically at medical education, there are large regional variations, an urban-rural divide and issues with the financing of infrastructure. The existing design of medical education is not compatible with the health care delivery system of India. The impact is visible at both the qualitative and quantitative levels. Medical education and the delivery system work independently of each other, leading to outcomes that are inequitable and unjust. Decades of neglect of the medical education regulatory mechanism have allowed multiple monopolies to crop up, governed by a complex set of conflicts of interest. Primary care physicians, supposed to be community-based team leaders, stand disenfranchised academically and professionally. To undo the distorted trajectory, a paradigm shift is required. In this paper, we propose an expansion of ownership in medical education through the academic institutionalization of community health services. PMID:24478994

  2. Universal coverage reforms in the USA: From Obamacare through Trump.

    PubMed

    Rice, Thomas; Unruh, Lynn Y; van Ginneken, Ewout; Rosenau, Pauline; Barnes, Andrew J

    2018-05-22

    Since the election of Donald Trump as President, momentum towards universal health care coverage in the United States has stalled, although efforts to repeal the Affordable Care Act (ACA) in its entirety failed. The ACA resulted in almost a halving of the percentage of the population under age 65 who are uninsured. In lieu of total repeal, the Republican-led Congress repealed the individual mandate to purchase health insurance, beginning in 2019. Moreover, the Trump administration is using its administrative authority to undo many of the requirements in the health insurance exchanges. Partly as a result, premium increases for the most popular plans will rise an average of 34% in 2018 and are likely to rise further after the mandate repeal goes into effect. Moreover, the administration is proposing other changes that, in providing states with more flexibility, may lead to the sale of cheaper and less comprehensive policies. In this volatile environment it is difficult to anticipate what will occur next. In the short-term there is proposed compromise legislation, where Republicans agree to provide funding for the cost-sharing subsidies if the Democrats agree to increase state flexibility in some areas and provide relief to small employers. Much will depend on the 2018 and 2020 elections. In the meantime, the prospects are that the number of uninsured will grow. Copyright © 2018 The Author(s). Published by Elsevier B.V. All rights reserved.

  3. Naval Observatory Vector Astrometry Software (NOVAS) Version 3.1, Introducing a Python Edition

    NASA Astrophysics Data System (ADS)

    Barron, Eric G.; Kaplan, G. H.; Bangert, J.; Bartlett, J. L.; Puatua, W.; Harris, W.; Barrett, P.

    2011-01-01

    The Naval Observatory Vector Astrometry Software (NOVAS) is a source-code library that provides common astrometric quantities and transformations. NOVAS calculations are accurate at the sub-milliarcsecond level. The library can supply, in one or two subroutine or function calls, the instantaneous celestial position of any star or planet in a variety of coordinate systems. NOVAS also provides access to all of the building blocks that go into such computations. NOVAS Version 3.1 introduces a Python edition alongside the Fortran and C editions. The Python edition uses the computational code from the C edition and, currently, mimics the function calls of the C edition. Future versions will expand the functionality of the Python edition to harness the object-oriented nature of the Python language, and will implement the ability to handle large quantities of objects or observers using the array functionality in NumPy (a third-party scientific package for Python). NOVAS 3.1 also adds a module to transform GCRS vectors to the ITRS; the ITRS to GCRS transformation was already provided in NOVAS 3.0. The module that corrects an ITRS vector for polar motion has been modified to undo that correction upon demand. In the C edition, the ephemeris-access functions have been revised for use on 64-bit systems and for improved performance in general. NOVAS, including documentation, is available from the USNO website (http://www.usno.navy.mil/USNO/astronomical-applications/software-products/novas).

  4. Mixing implants of differing metallic composition in the treatment of upper-extremity fractures.

    PubMed

    Acevedo, Daniel; Loy, Bo Nasmyth; Lee, Brian; Omid, Reza; Itamura, John

    2013-09-01

    Mixing implants with differing metallic compositions has been avoided for fear of galvanic corrosion and subsequent failure of the implants and of bone healing. The purpose of this study was to evaluate upper-extremity fractures treated with open reduction and internal fixation with metallic implants that differed in metallic composition placed on the same bone. The authors studied the effects of using both stainless steel and titanium implants on fracture healing, implant failure, and other complications associated with this method of fixation. Their hypothesis was that combining these metals on the same bone would not cause clinically significant nonunions or undue clinical effects from galvanic corrosion. A retrospective review was performed of 17 patients with upper-extremity fractures fixed with metal implants of differing metallic compositions. The primary endpoint was fracture union. Eight clavicles, 2 proximal humeri, 3 distal humeri, 3 olecranons, and 1 glenoid fracture with an average follow-up of 10 months were reviewed. All fractures healed. One patient experienced screw backout, which did not affect healing. This study implies that mixing implants with differing metallic compositions on the same bone for the treatment of fractures does not adversely affect bone healing. No evidence existed of corrosion or an increase in complications with this method of treatment. Contrary to prior belief, small modular hand stainless steel plates can be used to assist in reduction of smaller fracture fragments in combination with anatomic titanium plates to obtain anatomic reduction of the fracture without adversely affecting healing. Copyright 2013, SLACK Incorporated.

  5. Free cooling phase-diagram of hard-spheres with short- and long-range interactions

    NASA Astrophysics Data System (ADS)

    Gonzalez, S.; Thornton, A. R.; Luding, S.

    2014-10-01

    We study the stability, the clustering and the phase-diagram of free cooling granular gases. The systems consist of mono-disperse particles with additional non-contact (long-range) interactions, and are simulated here by the event-driven molecular dynamics algorithm with discrete (short-range shoulders or wells) potentials (in both 2D and 3D). Astonishingly good agreement is found with a mean field theory, where only the energy dissipation term is modified to account for both repulsive and attractive non-contact interactions. Attractive potentials enhance cooling and structure formation (clustering), whereas repulsive potentials reduce it, as intuition suggests. The system evolution is controlled by a single parameter: the non-contact potential strength scaled by the fluctuation kinetic energy (granular temperature). When this is small, as expected, the classical homogeneous cooling state is found. However, if the effective dissipation is strong enough, structure formation proceeds, before (in the repulsive case) non-contact forces get strong enough to undo the clustering (due to the ongoing dissipation of granular temperature). For both repulsive and attractive potentials, in the homogeneous regime, the cooling shows a universal behaviour when the (inverse) control parameter is used as evolution variable instead of time. The transition to a non-homogeneous regime, as predicted by stability analysis, is affected by both dissipation and potential strength. This can be cast into a phase diagram where the system changes with time, which leaves open many challenges for future research.

  6. Sex before the State: Civic Sex, Reproductive Innovations, and Gendered Parental Identity.

    PubMed

    Murphy, Timothy F

    2017-04-01

    Certain changes in the way that states classify people by sex as well as certain reproductive innovations undercut the rationale for state identification of people as male or female in signifying gendered parental relationships to children. At present, people known to the state as men may be genetic mothers to their children; people known to the state as women may be genetic fathers to their children. Synthetic gametes would make it possible for transgender men to be genetically related to children as fathers and transgender women to be genetically related to children as mothers, even if they have otherwise relied on naturally-occurring gametes to be genetic mothers and genetic fathers of children respectively. Synthetic gametes would presumably make it possible for any person to be the genetic father or genetic mother of children, even in a mix-and-match way. Other reproductive innovations will also undercut existing expectations of gendered parental identity. Uterus transplants would uncouple the maternal function of gestation from women, allowing men to share in maternity that way. Extracorporeal gestation (ExCG), gestation outside anyone's body, would also undercut the until-now absolute connection between female sex and maternity. In kind, effects such as these, undoing conventionally gendered parenthood, undercut the state's interest in knowing whether parents are male or female in relation to a given child, as against knowing simply whether someone stands in a parental relationship to that child, as a matter of rights and duties.

  7. Quantum reversibility is relative, or does a quantum measurement reset initial conditions?

    PubMed

    Zurek, Wojciech H

    2018-07-13

    I compare the role of information in classical and quantum dynamics by examining the relation between information flows in measurements and the ability of observers to reverse evolutions. I show that in Newtonian dynamics reversibility is unaffected by the observer's retention of the information about the measurement outcome. By contrast, even though quantum dynamics is unitary, hence reversible, reversing quantum evolution that led to a measurement becomes, in principle, impossible for an observer who keeps the record of its outcome. Thus, quantum irreversibility can result from the information gain rather than just its loss, that is, rather than just an increase of the (von Neumann) entropy. Recording of the outcome of the measurement resets, in effect, initial conditions within the observer's (branch of) the Universe. Nevertheless, I also show that the observer's friend, an agent who knows what measurement was successfully carried out and can confirm that the observer knows the outcome but resists his curiosity and does not find out the result, can, in principle, undo the measurement. This relativity of quantum reversibility sheds new light on the origin of the arrow of time and elucidates the role of information in classical and quantum physics. Quantum discord appears as a natural measure of the extent to which dissemination of information about the outcome affects the ability to reverse the measurement. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).
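
    As a purely illustrative numerical aside (not part of the article), the contrast drawn above can be seen with a single qubit: a unitary step is undone exactly by its adjoint, while a projective measurement that leaves a record collapses the state and cannot be undone by any operation on the system alone.

```python
# Minimal numerical illustration (generic quantum mechanics, not the paper's code):
# a unitary step is reversible, a recorded projective measurement is not.
import numpy as np

rng = np.random.default_rng(0)

# random pure qubit state
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)

# random unitary via QR decomposition
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
U, _ = np.linalg.qr(A)

# unitary evolution is reversed exactly by the adjoint
assert np.allclose(U.conj().T @ (U @ psi), psi)

# projective measurement in the computational basis collapses the state
p0 = abs(psi[0]) ** 2
outcome = 0 if rng.random() < p0 else 1
collapsed = np.zeros(2, dtype=complex)
collapsed[outcome] = 1.0

# the collapsed state no longer carries the pre-measurement amplitudes, so no
# unitary acting on it alone can restore psi for every input state
print(outcome, np.allclose(collapsed, psi))
```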

  8. Rewriting the epigenetic code for tumor resensitization: a review.

    PubMed

    Oronsky, Bryan; Oronsky, Neil; Scicinski, Jan; Fanger, Gary; Lybeck, Michelle; Reid, Tony

    2014-10-01

    In cancer chemotherapy, one axiom, which has practically solidified into dogma, is that acquired resistance to antitumor agents or regimens, nearly inevitable in all patients with metastatic disease, remains unalterable and irreversible, rendering therapeutic rechallenge futile. However, the introduction of epigenetic therapies, including histone deacetylase inhibitors (HDACis) and DNA methyltransferase inhibitors (DNMTIs), provides oncologists, like computer programmers, with new techniques to "overwrite" the modifiable software pattern of gene expression in tumors and challenge the "one and done" treatment prescription. Taking the epigenetic code-as-software analogy a step further, if chemoresistance is the product of multiple nongenetic alterations, which develop and accumulate over time in response to treatment, then the possibility to hack or tweak the operating system and fall back on a "system restore" or "undo" feature, like the arrow icon in the Windows XP toolbar, reconfiguring the tumor to its baseline nonresistant state, holds tremendous promise for turning advanced, metastatic cancer from a fatal disease into a chronic, livable condition. This review aims 1) to explore the potential mechanisms by which a group of small molecule agents including HDACis (entinostat and vorinostat), DNMTIs (decitabine and 5-azacytidine), and redox modulators (RRx-001) may reprogram the tumor microenvironment from a refractory to a nonrefractory state, 2) highlight some recent findings, and 3) discuss whether the current "once burned forever spurned" paradigm in the treatment of metastatic disease should be revised to promote active resensitization attempts with formerly failed chemotherapies.

  9. Natural leathers from natural materials: progressing toward a new arena in leather processing.

    PubMed

    Saravanabhavan, Subramani; Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari

    2004-02-01

    Globally, the leather industry is currently undergoing radical transformation due to pollution and discharge legislations. Thus, the leather industry is pressurized to look for cleaner options for processing the raw hides and skins. Conventional methods of pre-tanning, tanning and post-tanning processes are known to contribute more than 98% of the total pollution load from the leather processing. The conventional method of the tanning process involves the "do-undo" principle. Furthermore, the conventional methods employed in leather processing subject the skin/ hide to a wide variation in pH (2.8-13.0). This results in the emission of huge amounts of pollution loads such as BOD, COD, TDS, TS, sulfates, chlorides and chromium. In the approach illustrated here, the hair and flesh removal as well as fiber opening have been achieved using biocatalysts at pH 8.0, pickle-free natural tanning employing vegetable tannins, and post-tanning using environmentally friendly chemicals. Hence, this process involves dehairing, fiber opening, and pickle-free natural tanning followed by ecofriendly post-tanning. It has been found that the extent of hair removal and opening up of fiber bundles is comparable to that of conventionally processed leathers. This has been substantiated through scanning electron microscopic analysis and softness measurements. Performance of the leathers is shown to be on par with conventionally chrome-tanned leathers through physical and hand evaluation. The process also exhibits zero metal (chromium) discharge and significant reduction in BOD, COD, TDS, and TS loads by 83, 69, 96, and 96%, respectively. Furthermore, the developed process seems to be economically viable.

  10. Re-programming tumour cell metabolism to treat cancer: no lone target for lonidamine.

    PubMed

    Bhutia, Yangzom D; Babu, Ellappan; Ganapathy, Vadivel

    2016-06-01

    Tumour cell metabolism is very different from normal cell metabolism; cancer cells re-programme the metabolic pathways that occur in normal cells in such a manner that it optimizes their proliferation, growth and survival. Although this metabolic re-programming obviously operates to the advantage of the tumour, it also offers unique opportunities for effective cancer therapy. Molecules that target the tumour cell-specific metabolic pathways have potential as novel anti-cancer drugs. Lonidamine belongs to this group of molecules and is already in use in some countries for cancer treatment. It has been known for a long time that lonidamine interferes with energy production in tumour cells by inhibiting hexokinase II (HKII), a glycolytic enzyme. However, subsequent studies have uncovered additional pharmacological targets for the drug, which include the electron transport chain and the mitochondrial permeability transition pore, thus expanding the pharmacological effects of the drug on tumour cell metabolism. A study by Nancolas et al. in a recent issue of the Biochemical Journal identifies two additional new targets for lonidamine: the pyruvate transporter in the mitochondria and the H(+)-coupled monocarboxylate transporters in the plasma membrane (PM). It is thus becoming increasingly apparent that the anti-cancer effects of lonidamine do not occur through a single target; the drug works at multiple sites. Irrespective of the molecular targets, what lonidamine does in the end is to undo what the tumour cells have done in terms of re-programming cellular metabolism and mitochondrial function. © 2016 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.

  11. Investigating the Role of Coherence Effects on Jet Quenching in Pb-Pb Collisions at √sNN = 2.76 TeV using Jet Substructure

    NASA Astrophysics Data System (ADS)

    Zardoshti, Nima; Alice Collaboration

    2017-11-01

    We report measurements of two jet shapes, the ratio of 2-Subjettiness to 1-Subjettiness (τ2/τ1) and the opening angle between the two axes of the 2-Subjettiness jet shape, which is obtained by reclustering the jet with the exclusive-kT algorithm [S. D. Ellis and D. E. Soper, Phys. Rev. D 48, 3160] and undoing the final clustering step. The aim of this measurement is to explore a possible change in the rate of 2-pronged objects in Pb-Pb compared to pp due to colour coherence. Coherence effects [Y. Mehtar-Tani, C. A. Salgado and K. Tywoniuk, Phys. Rev. Lett. 106:122002, 2011] relate to the ability of the medium to resolve a jet's substructure, which has an impact on the energy loss magnitude and mechanism of the traversing jet. In both collision systems charged jets are found with the anti-kT algorithm [M. Cacciari, G. P. Salam and G. Soyez, JHEP 0804:063, 2008], a resolution parameter of R = 0.4 and a constituent cut-off of 0.15 GeV. This analysis uses hadron-jet coincidence techniques in Pb-Pb collisions to reject the combinatorial background and corrects further for background effects by employing various jet shape subtraction techniques and two-dimensional unfolding. Measurements of the N-subjettiness for jet momenta of 40-60 GeV/c in Pb-Pb collisions at √sNN = 2.76 TeV and pp collisions at √s = 7 TeV will be presented and compared to PYTHIA simulations.
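
    For readers unfamiliar with the observable, the standard N-subjettiness definition is τN = Σi pT,i min_k ΔR(i, axis_k) / (R0 Σi pT,i). The sketch below evaluates τ2/τ1 for an invented toy jet; the constituents and subjet axes are made up for illustration, whereas the real analysis obtains the axes by reclustering with the exclusive-kT algorithm and undoing the final clustering step.

```python
# Hedged sketch of the N-subjettiness ratio tau2/tau1, following the standard
# definition tau_N = sum_i pT_i * min_k dR(i, axis_k) / (R0 * sum_i pT_i).
# Constituents and subjet axes are made-up numbers for illustration only.
import numpy as np

def delta_r(eta1, phi1, eta2, phi2):
    dphi = np.angle(np.exp(1j * (phi1 - phi2)))   # wrap to (-pi, pi]
    return np.hypot(eta1 - eta2, dphi)

def tau_n(pt, eta, phi, axes, r0=0.4):
    # pT-weighted distance of each constituent to its nearest axis
    d = np.min([delta_r(eta, phi, ax_eta, ax_phi) for ax_eta, ax_phi in axes], axis=0)
    return np.sum(pt * d) / (r0 * np.sum(pt))

# toy jet: constituent pT (GeV/c), eta, phi
pt = np.array([20.0, 15.0, 5.0, 3.0])
eta = np.array([0.10, -0.15, 0.05, -0.20])
phi = np.array([1.00, 1.30, 1.05, 1.25])

jet_axis = [(np.average(eta, weights=pt), np.average(phi, weights=pt))]
subjet_axes = [(0.10, 1.00), (-0.15, 1.30)]            # e.g. from exclusive-kT

tau1 = tau_n(pt, eta, phi, jet_axis)
tau2 = tau_n(pt, eta, phi, subjet_axes)
print(tau2 / tau1)   # small values indicate a two-pronged jet
```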

  12. Post-Earthquake Reconstruction — in Context of Housing

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

    After each natural disaster, comprehensive rescue and relief operations are launched without delay, with the active participation of the army, government agencies, donor agencies, NGOs, and other voluntary organizations. Earthquakes are among the several natural disasters that occur around the world throughout the year. More than any other natural catastrophe, an earthquake represents the undoing of our most basic preconception of the earth as a source of stability, and the first distressing consequence is usually the collapse of our dwelling units. Earthquakes have affected buildings since people began constructing them. After each earthquake, a housing reconstruction programme is therefore essential, since housing provides shelter, one of the basic needs alongside food and clothing. It is well known that resettlement after an earthquake is often accompanied by the creation of ghettos and by ensuing problems in the provision of infrastructure and employment. A housing project after the Bhuj earthquake in Gujarat, India, illustrates all the negative aspects of resettlement in the context of reconstruction. The main theme of this paper is to consider a few issues associated with post-earthquake reconstruction in the context of housing, all of which are significant to communities that have had to rebuild after a catastrophe or that will face such a need in the future. These include: (1) Why is rebuilding time consuming? (2) What are the causes of failure in post-earthquake resettlement? (3) How can holistic post-earthquake planning be carried out? (4) What criteria should sustainable building materials meet? (5) What are the criteria for success in post-earthquake resettlement? (6) How can mitigation in post-earthquake housing be achieved using appropriate repair, restoration, and strengthening concepts?

  13. Bioclipse: an open source workbench for chemo- and bioinformatics.

    PubMed

    Spjuth, Ola; Helmus, Tobias; Willighagen, Egon L; Kuhn, Stefan; Eklund, Martin; Wagener, Johannes; Murray-Rust, Peter; Steinbeck, Christoph; Wikberg, Jarl E S

    2007-02-22

    There is a need for software applications that provide users with a complete and extensible toolkit for chemo- and bioinformatics accessible from a single workbench. Commercial packages are expensive and closed source, hence they do not allow end users to modify algorithms and add custom functionality. Existing open source projects are more focused on providing a framework for integrating existing, separately installed bioinformatics packages, rather than providing user-friendly interfaces. No open source chemoinformatics workbench has previously been published, and no successful attempts have been made to integrate chemo- and bioinformatics into a single framework. Bioclipse is an advanced workbench for resources in chemo- and bioinformatics, such as molecules, proteins, sequences, spectra, and scripts. It provides 2D-editing, 3D-visualization, file format conversion, calculation of chemical properties, and much more; all fully integrated into a user-friendly desktop application. Editing supports standard functions such as cut and paste, drag and drop, and undo/redo. Bioclipse is written in Java and based on the Eclipse Rich Client Platform with a state-of-the-art plugin architecture. This gives Bioclipse an advantage over other systems as it can easily be extended with functionality in any desired direction. Bioclipse is a powerful workbench for bio- and chemoinformatics as well as an advanced integration platform. The rich functionality, intuitive user interface, and powerful plugin architecture make Bioclipse the most advanced and user-friendly open source workbench for chemo- and bioinformatics. Bioclipse is released under Eclipse Public License (EPL), an open source license which sets no constraints on external plugin licensing; it is totally open for both open source plugins as well as commercial ones. Bioclipse is freely available at http://www.bioclipse.net.
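
    The undo/redo editing support mentioned above is conventionally implemented with a command stack. The following sketch shows that generic pattern in Python; it is an illustration of the concept, not Bioclipse's Java/Eclipse implementation.

```python
# Generic sketch of the undo/redo pattern (a command stack); illustrative Python,
# not Bioclipse's actual code.
class Editor:
    def __init__(self):
        self.text = ""
        self._undo_stack = []   # (do_fn, undo_fn) pairs already applied
        self._redo_stack = []

    def _apply(self, do_fn, undo_fn):
        do_fn()
        self._undo_stack.append((do_fn, undo_fn))
        self._redo_stack.clear()          # new edits invalidate the redo history

    def insert(self, s):
        old = self.text
        self._apply(lambda: setattr(self, "text", old + s),
                    lambda: setattr(self, "text", old))

    def undo(self):
        if self._undo_stack:
            do_fn, undo_fn = self._undo_stack.pop()
            undo_fn()
            self._redo_stack.append((do_fn, undo_fn))

    def redo(self):
        if self._redo_stack:
            do_fn, undo_fn = self._redo_stack.pop()
            do_fn()
            self._undo_stack.append((do_fn, undo_fn))

e = Editor()
e.insert("CCO")          # ethanol SMILES fragment
e.insert(" benzene")
e.undo()
print(e.text)            # "CCO"
e.redo()
print(e.text)            # "CCO benzene"
```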

  14. The moving-least-squares-particle hydrodynamics method (MLSPH)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dilts, G.

    1997-12-31

    An enhancement of the smooth-particle hydrodynamics (SPH) method has been developed using the moving-least-squares (MLS) interpolants of Lancaster and Salkauskas which simultaneously relieves the method of several well-known undesirable behaviors, including spurious boundary effects, inaccurate strain and rotation rates, pressure spikes at impact boundaries, and the infamous tension instability. The classical SPH method is derived in a novel manner by means of a Galerkin approximation applied to the Lagrangian equations of motion for continua using as basis functions the SPH kernel function multiplied by the particle volume. This derivation is then modified by simply substituting the MLS interpolants for the SPH Galerkin basis, taking care to redefine the particle volume and mass appropriately. The familiar SPH kernel approximation is now equivalent to a colocation-Galerkin method. Both classical conservative and recent non-conservative formulations of SPH can be derived and emulated. The non-conservative forms can be made conservative by adding terms that are zero within the approximation at the expense of boundary-value considerations. The familiar Monaghan viscosity is used. Test calculations of uniformly expanding fluids, the Swegle example, spinning solid disks, impacting bars, and spherically symmetric flow illustrate the superiority of the technique over SPH. In all cases it is seen that the marvelous ability of the MLS interpolants to add up correctly everywhere civilizes the noisy, unpredictable nature of SPH. Being a relatively minor perturbation of the SPH method, it is easily retrofitted into existing SPH codes. On the down side, computational expense at this point is significant, the Monaghan viscosity undoes the contribution of the MLS interpolants, and one-point quadrature (colocation) is not accurate enough. Solutions to these difficulties are being pursued vigorously.
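
    The key property the abstract attributes to the MLS interpolants (that they "add up correctly everywhere") is their exact reproduction of constant and linear fields on irregular particle sets. A minimal one-dimensional sketch of an MLS fit with a linear basis, written here as a generic illustration rather than the authors' code, follows.

```python
# Hedged 1-D sketch of a moving-least-squares (MLS) interpolant of the kind
# substituted for the SPH kernel basis: a local linear polynomial is fitted at
# each evaluation point with kernel weights, so constant and linear fields are
# reproduced exactly even for irregular particle positions.
import numpy as np

def gaussian_kernel(r, h):
    return np.exp(-(r / h) ** 2)

def mls_interpolate(x_eval, x_p, f_p, h):
    out = np.empty_like(x_eval)
    for j, x in enumerate(x_eval):
        w = gaussian_kernel(np.abs(x - x_p), h)            # particle weights
        P = np.column_stack([np.ones_like(x_p), x_p - x])  # linear basis about x
        A = P.T @ (w[:, None] * P)                         # weighted normal equations
        b = P.T @ (w * f_p)
        coeff = np.linalg.solve(A, b)
        out[j] = coeff[0]                                  # value of the local fit at x
    return out

rng = np.random.default_rng(1)
x_p = np.sort(rng.uniform(0.0, 1.0, 50))       # irregular "particle" positions
f_p = 2.0 * x_p + 1.0                          # a linear field
x_eval = np.linspace(0.1, 0.9, 5)

print(mls_interpolate(x_eval, x_p, f_p, h=0.1))   # reproduces 2x + 1 exactly
```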

  15. U.S. elementary and secondary schools: equalizing opportunity or replicating the status quo?

    PubMed

    Rouse, Cecilia Elena; Barrow, Lisa

    2006-01-01

    Although education pays off handsomely in the United States, children from low-income families attain less education than children from more advantaged families. In this article, Cecilia Elena Rouse and Lisa Barrow investigate why family background is so strongly linked to education. The authors show that family socioeconomic status affects such educational outcomes as test scores, grade retention, and high school graduation, and that educational attainment strongly affects adult earnings. They then go on to ask why children from more advantaged families get more or better schooling than those from less advantaged families. For low-income students, greater psychological costs, the cost of forgone income (continuing in school instead of getting a job), and borrowing costs all help to explain why these students attain less education than more privileged children. And these income-related differences in costs may themselves be driven by differences in access to quality schools. As a result, U.S. public schools tend to reinforce the transmission of low socioeconomic status from parents to children. Policy interventions aimed at improving school quality for children from disadvantaged families thus have the potential to increase social mobility. Despite the considerable political attention paid to increasing school accountability, as in the No Child Left Behind Act, along with charter schools and vouchers to help the children of poor families attend private school, to date the best evidence suggests that such programs will improve student achievement only modestly. Based on the best research evidence, smaller class sizes seem to be one promising avenue for improving school quality for disadvantaged students. High teacher quality is also likely to be important. However, advantaged families, by spending more money on education outside school, can and will partly undo policy attempts to equalize school quality for poor and nonpoor children.

  16. Feminist ecology: Doing, undoing, and redoing gender in science

    PubMed Central

    Teller, Amy S.; Porcelli, Apollonya M.

    2017-01-01

    Women continue to be underrepresented in STEM fields and also are more likely to leave academic careers than men. While much existing sociological research on gender in science focuses on structures, institutions, and policies, we take a cultural and phenomenological approach to the question. We focus on the interaction between structural and micro-sociological forces that uphold existing gender inequalities and drive new forms of inequality within the discipline of ecology by tracing the experience of female graduate students. Ecology in the United States and elsewhere is currently undergoing three shifts, well documented by previous studies—more female scientists, interdisciplinary work, and research in human-altered landscapes—that comprise a transition to what we call “feminist ecology.” We ask whether these disciplinary-level shifts in ecology are accompanied by renegotiations in the way ecologists “do gender” as they work. In this paper we argue that despite structural changes toward a feminist ecology, gender inequalities are not eliminated. Our data collected using ethnographic and autoethnographic methods during ecological fieldwork in the Northeastern United States, show that gender inequality persists through daily interactions, shaping the way that fieldwork is conducted and bodies are policed. We provide additional evidence of the way that ecologists and non-ecologists interact during fieldwork, highlighting the embeddedness of scientific disciplines within larger societal forces. Thus, the question of women in science cannot be understood strictly from within the bounds of science but extends to gender relations in society at large. We hope that this study can serve as a teaching tool for university efforts to increase the success, not just the prevalence, of women in science, and facilitate productive interdisciplinary research across disciplines. PMID:28989594

  17. Impact of variation in the BDNF gene on social stress sensitivity and the buffering impact of positive emotions: replication and extension of a gene-environment interaction.

    PubMed

    van Winkel, Mark; Peeters, Frenk; van Winkel, Ruud; Kenis, Gunter; Collip, Dina; Geschwind, Nicole; Jacobs, Nele; Derom, Catherine; Thiery, Evert; van Os, Jim; Myin-Germeys, Inez; Wichers, Marieke

    2014-06-01

    A previous study reported that social stress sensitivity is moderated by the brain-derived-neurotrophic-factor (Val66Met) (BDNF rs6265) genotype. Additionally, positive emotions partially neutralize this moderating effect. The current study aimed to: (i) replicate in a new independent sample of subjects with residual depressive symptoms the moderating effect of BDNF(Val66Met) genotype on social stress sensitivity, (ii) replicate the neutralizing impact of positive emotions, (iii) extend these analyses to other variations in the BDNF gene in the new independent sample and the original sample of non-depressed individuals. Previous findings were replicated in an experience sampling method (ESM) study. Negative Affect (NA) responses to social stress were stronger in "Val/Met" carriers of BDNF(Val66Met) compared to "Val/Val" carriers. Positive emotions neutralized the moderating effect of BDNF(Val66Met) genotype on social stress sensitivity in a dose-response fashion. Finally, two of four additional BDNF SNPs (rs11030101, rs2049046) showed similar moderating effects on social stress-sensitivity across both samples. The neutralizing effect of positive emotions on the moderating effects of these two additional SNPs was found in one sample. In conclusion, ESM has important advantages in gene-environment (GxE) research and may contribute to more consistent findings in future GxE research. This study shows how the impact of BDNF genetic variation on depressive symptoms may be explained by its impact on subtle daily life responses to social stress. Further, it shows that the generation of positive affect (PA) can buffer social stress sensitivity and partially undo the genetic susceptibility. Copyright © 2014 Elsevier B.V. and ECNP. All rights reserved.

  18. JSME: a free molecule editor in JavaScript.

    PubMed

    Bienfait, Bruno; Ertl, Peter

    2013-01-01

    A molecule editor, i.e. a program facilitating graphical input and interactive editing of molecules, is an indispensable part of every cheminformatics or molecular processing system. Today, when a web browser has become the universal scientific user interface, a tool to edit molecules directly within the web browser is essential. One of the most popular tools for molecular structure input on the web is the JME applet. Since its release nearly 15 years ago, however the web environment has changed and Java applets are facing increasing implementation hurdles due to their maintenance and support requirements, as well as security issues. This prompted us to update the JME editor and port it to a modern Internet programming language - JavaScript. The actual molecule editing Java code of the JME editor was translated into JavaScript with help of the Google Web Toolkit compiler and a custom library that emulates a subset of the GUI features of the Java runtime environment. In this process, the editor was enhanced by additional functionalities including a substituent menu, copy/paste, drag and drop and undo/redo capabilities and an integrated help. In addition to desktop computers, the editor supports molecule editing on touch devices, including iPhone, iPad and Android phones and tablets. In analogy to JME the new editor is named JSME. This new molecule editor is compact, easy to use and easy to incorporate into web pages. A free molecule editor written in JavaScript was developed and is released under the terms of permissive BSD license. The editor is compatible with JME, has practically the same user interface as well as the web application programming interface. The JSME editor is available for download from the project web page http://peter-ertl.com/jsme/

  19. [The disgrace of Antoine Daquin, first physician of Louis XIV (1693)].

    PubMed

    Peumery, J J

    1996-12-01

    Antoine Daquin, Principal Physician of Louis XIV and Earl of Jouy-en-Josas, was born in Paris. He was the son of Louis-Henri Daquin, Physician to Queen Marie de Médicis; his paternal grandfather, born into the Jewish religion, converted to Catholicism at Aquino, in Italy, whence the name d'Aquin, later Daquin. A. Daquin studied medicine at Montpellier and graduated on 18 May 1648. He married Marguerite Gayant, niece of Antoine Vallot, the Principal Physician of Louis XIV. This relationship allowed him to obtain the position of Principal Physician of the Queen and then, after Vallot's death, to succeed him, on 18 April 1672, as Principal Physician of the King. The kindliness of the King's mistress, Mme de Montespan, helped him to that appointment. Daquin was a good doctor, but he turned out awkward: "great courtier, but rich, miser, grasping, wanting to establish his family anyway" said the Duc de Saint-Simon. He dared ask the King for the Archbishopric of Tours for his son: "it was the rock on which he broke up" said Saint-Simon again. On 2 November 1693, the comte de Pontchartrain came to his home by order of the King to tell him that he was ordered to retire from Court without delay. He was forbidden to return or to write to the King. Guy-Crescent Fagon was designated "Premier Médecin" in his place; Fagon had worked at the undoing of Daquin, with a view to robbing him of his position, with the complicity of the King's new mistress, Mme de Maintenon. After his disgrace, Daquin probably retired to Moulins; he died in obscurity in Vichy, on 17 May 1696. Today, Daquin is regarded as a victim of court intrigues, which explains his celebrity.

  20. Biointervention makes leather processing greener: an integrated cleansing and tanning system.

    PubMed

    Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari

    2003-06-01

    The do-undo methods adopted in conventional leather processing generate huge amounts of pollutants. In other words, conventional methods employed in leather processing subject the skin/hide to wide variations in pH. Pretanning and tanning processes alone contribute more than 90% of the total pollution from leather processing. Included in this is a great deal of solid wastes such as lime and chrome sludge. In the approach described here, the hair and flesh removal as well as fiber opening have been achieved using biocatalysts at pH 8.0 for cow hides. This was followed by a pickle-free chrome tanning, which does not require a basification step. Hence, this tanning technique involves primarily three steps, namely, dehairing, fiber opening, and tanning. It has been found that the extent of hair removal, opening up of fiber bundles, and penetration and distribution of chromium are comparable to that produced by traditional methods. This has been substantiated through scanning electron microscopic, stratigraphic chrome distribution analysis, and softness measurements. Performance of the leathers is shown to be on par with conventionally processed leathers through physical and hand evaluation. Importantly, softness of the leathers is numerically proven to be comparable with that of control. The process also demonstrates reduction in chemical oxygen demand load by 80%, total solids load by 85%, and chromium load by 80% as compared to the conventional process, thereby leading toward zero discharge. The input-output audit shows that the biocatalytic three-step tanning process employs a very low amount of chemicals, thereby reducing the discharge by 90% as compared to the conventional multistep processing. Furthermore, it is also demonstrated that the process is technoeconomically viable.

  1. Monetizing the social benefits of landfill mining: Evidence from a Contingent Valuation survey in a rural area in Greece.

    PubMed

    Damigos, Dimitris; Menegaki, Maria; Kaliampakos, Dimitris

    2016-05-01

    Despite the emerging global attention towards promoting waste management policies that reduce environmental impacts and conserve natural resources, landfilling still remains the dominant waste management practice in many parts of the world. Owing to this situation, environmental burdens are bequeathed to and large amounts of potentially valuable materials are lost for future generations. As a means to undo these adverse effects, a process known as landfill mining (LFM) could be implemented provided that economic feasibility is ensured. So far, only a few studies have focused on the economic feasibility of LFM from a private point of view and even fewer studies have attempted to economically justify the need for LFM projects from a social point of view. This paper, aiming to add to the limited literature in the field, presents the results of a survey conducted in a rural district in Greece, by means of the Contingent Valuation method (CVM) in order to estimate society's willingness to pay for LFM programs. According to the empirical survey, more than 95% of the respondents recognize the need for LFM programs. Nevertheless, only one-fourth of the respondents are willing to pay through increased taxes for LFM, owing mainly to economic depression and unemployment. Those who accept the increased tax are willing to pay about €50 per household per year, on average, which results in a mean willingness to pay (WTP) for the entire population under investigation of around €12 per household per year. The findings of this research work provide useful insights about the 'dollar-based' benefits of LFM in the context of social cost-benefit analysis of LFM projects. Yet, it is evident that further research is necessary. Copyright © 2015 Elsevier Ltd. All rights reserved.
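
    The relation between the two figures quoted above is simple arithmetic: the population mean is approximately the share of willing households times their average stated amount, as the snippet below shows with the rounded numbers reported in the abstract.

```python
# Arithmetic implied by the abstract (numbers rounded as reported there): the
# population-mean WTP is roughly the share of households willing to pay times
# the mean amount stated by that group.
share_willing = 0.25          # about one-fourth of respondents accept the tax
mean_wtp_if_willing = 50.0    # EUR per household per year among those who accept

mean_wtp_population = share_willing * mean_wtp_if_willing
print(round(mean_wtp_population, 1))   # ~12.5 EUR, consistent with the ~12 reported
```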

  2. Iterative initial condition reconstruction

    NASA Astrophysics Data System (ADS)

    Schmittfull, Marcel; Baldauf, Tobias; Zaldarriaga, Matias

    2017-07-01

    Motivated by recent developments in perturbative calculations of the nonlinear evolution of large-scale structure, we present an iterative algorithm to reconstruct the initial conditions in a given volume starting from the dark matter distribution in real space. In our algorithm, objects are first moved back iteratively along estimated potential gradients, with a progressively reduced smoothing scale, until a nearly uniform catalog is obtained. The linear initial density is then estimated as the divergence of the cumulative displacement, with an optional second-order correction. This algorithm should undo nonlinear effects up to one-loop order, including the higher-order infrared resummation piece. We test the method using dark matter simulations in real space. At redshift z = 0, we find that after eight iterations the reconstructed density is more than 95% correlated with the initial density at k ≤ 0.35 h Mpc^-1. The reconstruction also reduces the power in the difference between reconstructed and initial fields by more than 2 orders of magnitude at k ≤ 0.2 h Mpc^-1, and it extends the range of scales where the full broadband shape of the power spectrum matches linear theory by a factor of 2-3. As a specific application, we consider measurements of the baryonic acoustic oscillation (BAO) scale that can be improved by reducing the degradation effects of large-scale flows. In our idealized dark matter simulations, the method improves the BAO signal-to-noise ratio by a factor of 2.7 at z = 0 and by a factor of 2.5 at z = 0.6, improving standard BAO reconstruction by 70% at z = 0 and 30% at z = 0.6, and matching the optimal BAO signal and signal-to-noise ratio of the linear density in the same volume. For BAO, the iterative nature of the reconstruction is the most important aspect.
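
    A schematic version of one step of such an algorithm, written as an illustration rather than the paper's code, is sketched below: a smoothed overdensity field is used to solve ∇²φ = δ in Fourier space, and ψ = -∇φ gives the displacement along which objects would be moved back. The full method iterates this with a progressively smaller smoothing scale and finally estimates the linear density from the divergence of the cumulative displacement.

```python
# Schematic single step of the reconstruction idea (illustrative only, not the
# authors' pipeline): solve nabla^2 phi = delta on a grid in Fourier space and
# take psi = -grad(phi) as the estimated displacement.
import numpy as np

def displacement_from_density(delta, boxsize, smooth):
    n = delta.shape[0]
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=boxsize / n)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0                                             # avoid division by zero
    delta_k = np.fft.fftn(delta) * np.exp(-0.5 * k2 * smooth**2)  # Gaussian smoothing
    phi_k = -delta_k / k2                                         # nabla^2 phi = delta
    phi_k[0, 0, 0] = 0.0
    # psi = -grad(phi), component by component
    psi = [np.real(np.fft.ifftn(-1j * ki * phi_k)) for ki in (kx, ky, kz)]
    return psi

rng = np.random.default_rng(2)
delta = rng.normal(size=(32, 32, 32))        # stand-in overdensity field
psi_x, psi_y, psi_z = displacement_from_density(delta, boxsize=100.0, smooth=10.0)
print(psi_x.shape)
```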

  3. OpenChrom: a cross-platform open source software for the mass spectrometric analysis of chromatographic data.

    PubMed

    Wenig, Philip; Odermatt, Juergen

    2010-07-30

    Today, data evaluation has become a bottleneck in chromatographic science. Analytical instruments equipped with automated samplers yield large amounts of measurement data, which needs to be verified and analyzed. Since nearly every GC/MS instrument vendor offers its own data format and software tools, the consequences are problems with data exchange and a lack of comparability between the analytical results. To challenge this situation a number of either commercial or non-profit software applications have been developed. These applications provide functionalities to import and analyze several data formats but have shortcomings in terms of the transparency of the implemented analytical algorithms and/or are restricted to a specific computer platform. This work describes a native approach to handle chromatographic data files. The approach can be extended in its functionality such as facilities to detect baselines, to detect, integrate and identify peaks and to compare mass spectra, as well as the ability to internationalize the application. Additionally, filters can be applied on the chromatographic data to enhance its quality, for example to remove background and noise. Extended operations like do, undo and redo are supported. OpenChrom is a software application to edit and analyze mass spectrometric chromatographic data. It is extensible in many different ways, depending on the demands of the users or the analytical procedures and algorithms. It offers a customizable graphical user interface. The software is independent of the operating system, due to the fact that the Rich Client Platform is written in Java. OpenChrom is released under the Eclipse Public License 1.0 (EPL). There are no license constraints regarding extensions. They can be published using open source as well as proprietary licenses. OpenChrom is available free of charge at http://www.openchrom.net.

  4. Spoken Language Processing in the Clarissa Procedure Browser

    NASA Technical Reports Server (NTRS)

    Rayner, M.; Hockey, B. A.; Renders, J.-M.; Chatzichrisafis, N.; Farrell, K.

    2005-01-01

    Clarissa, an experimental voice-enabled procedure browser that has recently been deployed on the International Space Station, is, as far as we know, the first spoken dialog system in space. We describe the objectives of the Clarissa project and the system's architecture. In particular, we focus on three key problems: grammar-based speech recognition using the Regulus toolkit; methods for open mic speech recognition; and robust side-effect free dialogue management for handling undos, corrections and confirmations. We first describe the grammar-based recogniser we have built using Regulus, and report experiments where we compare it against a class N-gram recogniser trained off the same 3297 utterance dataset. We obtained a 15% relative improvement in WER and a 37% improvement in semantic error rate. The grammar-based recogniser moreover outperforms the class N-gram version for utterances of all lengths from 1 to 9 words inclusive. The central problem in building an open-mic speech recognition system is being able to distinguish between commands directed at the system, and other material (cross-talk), which should be rejected. Most spoken dialogue systems make the accept/reject decision by applying a threshold to the recognition confidence score. We show how a simple and general method, based on standard approaches to document classification using Support Vector Machines, can give substantially better performance, and report experiments showing a relative reduction in the task-level error rate by about 25% compared to the baseline confidence threshold method. Finally, we describe a general side-effect free dialogue management architecture that we have implemented in Clarissa, which extends the "update semantics" framework by including task as well as dialogue information in the information state. We show that this enables elegant treatments of several dialogue management problems, including corrections, confirmations, querying of the environment, and regression testing.
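
    The accept/reject idea described above (classify the recognised text itself rather than threshold a confidence score) can be prototyped with an off-the-shelf SVM text classifier. The sketch below uses invented toy utterances and scikit-learn purely for illustration; it is not the Clarissa implementation or its data.

```python
# Hedged sketch of the general idea: treat the accept/reject decision for open-mic
# input as SVM text classification.  The utterances and labels below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# 1 = command directed at the system, 0 = cross-talk to be rejected
utterances = [
    "next step", "go to step three", "undo that", "repeat the last instruction",
    "set challenge verify mode on", "read the caution note",
    "could you hand me that wrench", "we should finish this before lunch",
    "tell houston we are ready", "that panel looks fine to me",
    "what time is the conference", "i think the pump is over there",
]
labels = [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
clf.fit(utterances, labels)

print(clf.predict(["go back one step", "let's grab lunch after this"]))
```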

  5. Post-traumatic stress and coping in an inner-city child. Traumatogenic witnessing of interparental violence and murder.

    PubMed

    Parson, E R

    1995-01-01

    Violence today appears to be ubiquitous: it even enters the clinical session, deeply internalized within child victims who were exposed to often unspeakable horror. Violence and its pernicious, horrific effects are observed in the streets, schools, parks, playgrounds, and homes of some inner-city communities. This article introduces the use of Anna Freud's Diagnostic Profile system with an inner-city child who, at the age of four, witnessed his mother fatally stab his father with a kitchen knife and at age eleven was assessed and treated by the author. Clinicians may wonder whether any kind of therapy could ever undo the serious fixations, regressions, developmental arrests, and integrate trauma-shattered ego functions observed in children exposed to visual horror and affective terror. Application of the Profile may offer some direction with these children: a panoramic view of their painful mood, their hypervigilance and distrust, fears, separation and annihilation anxieties, nightmares (with murder imagery), developmental anomalies and arrests is presented with clarity and force. The therapist uses countertransference responses to monitor the affect tolerance in the child and to determine the appropriate dosages of awareness the child can integrate from one moment to the next. The therapist also serves as the child's external stimulus barrier and explores feelings about media-driven portrayals of violence, stereotypes, and inner-city children and youths. The unsurpassed utility of the Profile as a diagnostic system that documents vital economic, dynamic, structural, genetic and adaptive-coping information about the child is discussed in detail as is the Profile's added benefit of possibly guarding against misdiagnosis and charting a course for psychotherapy in difficult city-violence trauma cases.

  6. From instinct to intellect: the challenge of maintaining healthy weight in the modern world.

    PubMed

    Peters, J C; Wyatt, H R; Donahoo, W T; Hill, J O

    2002-05-01

    The global obesity epidemic is being driven in large part by a mismatch between our environment and our metabolism. Human physiology developed to function within an environment where high levels of physical activity were needed in daily life and food was inconsistently available. For most of mankind's history, physical activity has 'pulled' appetite so that the primary challenge to the physiological system for body weight control was to obtain sufficient energy intake to prevent negative energy balance and body energy loss. The current environment is characterized by a situation whereby minimal physical activity is required for daily life and food is abundant, inexpensive, high in energy density and widely available. Within this environment, food intake 'pushes' the system, and the challenge to the control system becomes to increase physical activity sufficiently to prevent positive energy balance. There does not appear to be a strong drive to increase physical activity in response to excess energy intake and there appears to be only a weak adaptive increase in resting energy expenditure in response to excess energy intake. In the modern world, the prevailing environment constitutes a constant background pressure that promotes weight gain. We propose that the modern environment has taken body weight control from an instinctual (unconscious) process to one that requires substantial cognitive effort. In the current environment, people who are not devoting substantial conscious effort to managing body weight are probably gaining weight. It is unlikely that we would be able to build the political will to undo our modern lifestyle, to change the environment back to one in which body weight control again becomes instinctual. In order to combat the growing epidemic we should focus our efforts on providing the knowledge, cognitive skills and incentives for controlling body weight and at the same time begin creating a supportive environment to allow better management of body weight.

  7. Knowledge base rule partitioning design for CLIPS

    NASA Technical Reports Server (NTRS)

    Mainardi, Joseph D.; Szatkowski, G. P.

    1990-01-01

    This paper describes a knowledge base (KB) partitioning approach to solve the problem of real-time performance using the CLIPS AI shell when containing large numbers of rules and facts. This work is funded under the joint USAF/NASA Advanced Launch System (ALS) Program as applied research in expert systems to perform vehicle checkout for real-time controller and diagnostic monitoring tasks. The main objective of the Expert System advanced development project (ADP-2302) is to provide robust systems responding to new data frames at 0.1 to 1.0 second intervals. The intelligent system control must be performed within the specified real-time window, in order to meet the demands of the given application. Partitioning the KB reduces the complexity of the inferencing Rete net at any given time. This reduced complexity improves performance but without undue impacts during load and unload cycles. The second objective is to produce highly reliable intelligent systems. This requires simple and automated approaches to the KB verification & validation task. Partitioning the KB reduces rule interaction complexity overall. Reduced interaction simplifies the V&V testing necessary by focusing attention only on individual areas of interest. Many systems require a robustness that involves a large number of rules, most of which are mutually exclusive under different phases or conditions. The ideal solution is to control the knowledge base by loading rules that directly apply for that condition, while stripping out all rules and facts that are not used during that cycle. The practical approach is to cluster rules and facts into associated 'blocks'. A simple approach has been designed to control the addition and deletion of 'blocks' of rules and facts, while allowing real-time operations to run freely. Timing tests for real-time performance for specific machines under R/T operating systems have not been completed but are planned as part of the analysis process to validate the design.
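
    The block-management idea can be pictured as a small bookkeeping layer that, on each phase change, unloads the blocks the new phase does not need and loads the ones it does. The following Python sketch is a generic illustration of that bookkeeping, not the original CLIPS code.

```python
# Illustrative sketch of the "block" partitioning idea: rules and facts are grouped
# into named blocks, and switching phase loads only the blocks that phase needs
# while unloading the rest, keeping the active rule network small.
class PartitionedKB:
    def __init__(self, blocks, phase_map):
        self.blocks = blocks        # block name -> list of rules/facts
        self.phase_map = phase_map  # phase name -> block names required
        self.loaded = set()

    def switch_phase(self, phase):
        wanted = set(self.phase_map[phase])
        for name in self.loaded - wanted:
            self._unload(name)
        for name in wanted - self.loaded:
            self._load(name)

    def _load(self, name):
        # a real system would assert the block's rules/facts into the inference engine
        print(f"loading block {name} ({len(self.blocks[name])} items)")
        self.loaded.add(name)

    def _unload(self, name):
        print(f"unloading block {name}")
        self.loaded.discard(name)

kb = PartitionedKB(
    blocks={"prelaunch": ["rule-a", "rule-b"], "ascent": ["rule-c"],
            "diagnostics": ["rule-d", "rule-e", "rule-f"]},
    phase_map={"checkout": ["prelaunch", "diagnostics"],
               "flight": ["ascent", "diagnostics"]},
)
kb.switch_phase("checkout")
kb.switch_phase("flight")   # unloads "prelaunch", loads "ascent"
```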

  8. Roe v. Wade. On abortion.

    PubMed

    French, M

    1998-01-01

    In ancient Assyria, fathers held the right of life or death over their newborn infants, but women found to have performed an abortion on themselves or others were impaled and denied burial. This punishment was otherwise reserved for crimes against the state such as high treason or assault on the king. Likewise, in Babylon if a wife arranged her husband's death so that she could marry another man, she was convicted of treason and impaled or crucified. Thus, ancient thought paralleled the husband-wife relationship with that of the state-subject. The small group of men who generally dominate institutions such as the state, the church, or a corporation have a primary demand for obedience and deference to their supreme authority from their underlings. These groups did not condemn abortion because it involved questions of life or death. After all, many states have permitted infanticide, many still sanction execution, and all are willing to sacrifice the lives of their soldiers in war. Patriarchs condemn abortion because they consider it treasonous for a woman to assert the right to use her own judgement and to treat her body as if it were her own and not the property of her husband. This denies the supremacy of the male, which is the first principle of patriarchs. Because patriarchal institutions depend upon the subjection of women, women's bodies become important markers in the struggle for human freedom. This explains why patriarchal institutions in the US have continuously attacked women's right to abortion by fragmenting the statute allowing abortion and attempting to render the fragments illegal. While US women have won other rights that can be protected legally, women require the right to abortion in order to possess the right to physical integrity and to be able to undo what men have done to them. Otherwise, men would be able to create a set-back in women's human rights by forcing women into motherhood.

  9. Linear perturbation theory for tidal streams and the small-scale CDM power spectrum

    NASA Astrophysics Data System (ADS)

    Bovy, Jo; Erkal, Denis; Sanders, Jason L.

    2017-04-01

    Tidal streams in the Milky Way are sensitive probes of the population of low-mass dark matter subhaloes predicted in cold dark matter (CDM) simulations. We present a new calculus for computing the effect of subhalo fly-bys on cold streams based on the action-angle representation of streams. The heart of this calculus is a line-of-parallel-angle approach that calculates the perturbed distribution function of a stream segment by undoing the effect of all relevant impacts. This approach allows one to compute the perturbed stream density and track in any coordinate system in minutes for realizations of the subhalo distribution down to 10^5 M⊙, accounting for the stream's internal dispersion and overlapping impacts. We study the statistical properties of density and track fluctuations with large suites of simulations of the effect of subhalo fly-bys. The one-dimensional density and track power spectra along the stream trace the subhalo mass function, with higher mass subhaloes producing power only on large scales, while lower mass subhaloes cause structure on smaller scales. We also find significant density and track bispectra that are observationally accessible. We further demonstrate that different projections of the track all reflect the same pattern of perturbations, facilitating their observational measurement. We apply this formalism to data for the Pal 5 stream and make a first rigorous determination of 10^{+11}_{-6} dark matter subhaloes with masses between 10^6.5 and 10^9 M⊙ within 20 kpc from the Galactic centre [corresponding to 1.4^{+1.6}_{-0.9} times the number predicted by CDM-only simulations or to fsub(r < 20 kpc) ≈ 0.2 per cent] assuming that the Pal 5 stream is 5 Gyr old. Improved data will allow measurements of the subhalo mass function down to 10^5 M⊙, thus definitively testing whether dark matter is clumpy on the smallest scales relevant for galaxy formation.
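
    The one-dimensional density power spectrum referred to above is simply the Fourier power of the fractional density contrast along the stream. The toy example below, which is not the paper's pipeline, builds a synthetic density with a single subhalo-induced gap and computes that spectrum.

```python
# Hedged toy example: a one-dimensional density power spectrum along a stream.
# The density profile and gap parameters are invented for illustration.
import numpy as np

n = 512
x = np.linspace(0.0, 20.0, n, endpoint=False)               # coordinate along stream
density = np.ones(n)
density -= 0.4 * np.exp(-0.5 * ((x - 12.0) / 0.5) ** 2)     # a subhalo-induced gap

delta = density / density.mean() - 1.0                      # fractional density contrast
delta_k = np.fft.rfft(delta) / n
power = np.abs(delta_k) ** 2
freq = np.fft.rfftfreq(n, d=x[1] - x[0])                    # inverse-length modes

# larger gaps (higher-mass subhaloes) put power at smaller freq (larger scales)
print(freq[1:5], power[1:5])
```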

  10. MSLICE Sequencing

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas M.; Joswig, Joseph C.; Shams, Khawaja S.; Norris, Jeffrey S.; Morris, John R.

    2011-01-01

    MSLICE Sequencing is a graphical tool for writing sequences and integrating them into RML files, as well as for producing SCMF files for uplink. When operated in a testbed environment, it also supports uplinking these SCMF files to the testbed via Chill. This software features a free-form textual sequence editor with syntax coloring and automatic content assistance (including command and argument completion proposals), complete with types, value ranges, units, and descriptions from the command dictionary that appear as they are typed. The sequence editor also has a "field mode" that allows tabbing between arguments and displays type/range/units/description for each argument as it is edited. Color-coded error and warning annotations on problematic tokens are included, as well as indications of problems that are not visible in the current scroll range. "Quick Fix" suggestions are made for resolving problems, and all the features afforded by modern source editors are also included such as copy/cut/paste, undo/redo, and a sophisticated find-and-replace system optionally using regular expressions. The software offers a full XML editor for RML files, which features syntax coloring, content assistance and problem annotations as above. There is a form-based, "detail view" that allows structured editing of command arguments and sequence parameters when preferred. The "project view" shows the user's "workspace" as a tree of "resources" (projects, folders, and files) that can subsequently be opened in editors by double-clicking. Files can be added, deleted, dragged-dropped/copied-pasted between folders or projects, and these operations are undoable and redoable. A "problems view" contains a tabular list of all problems in the current workspace. Double-clicking on any row in the table opens an editor for the appropriate sequence, scrolling to the specific line with the problem, and highlighting the problematic characters. From there, one can invoke "quick fix" as described above to resolve the issue. Once resolved, saving the file causes the problem to be removed from the problem view.

  11. Population pressure, poverty and the environment.

    PubMed

    Camp, S L

    1992-06-01

    Using the agricultural revolution as a starting point, human population has grown 50 times since then. The amount of environmental and ecological damage inflicted by humans before the agricultural revolution pales in comparison to the damage done afterwards. It took until 1800, or approximately 9800 years from the beginning of the agricultural revolution, for world population to reach 1 billion. It took only 187 years to reach 5 billion and current projections estimate that it will take only 11 years to add the 6th billion. If the governments of the world do not work together during this decade and bring a family planning message to every couple of reproductive age, the results will be catastrophic. Every year 40-50 million acres of forest are cut down. On average, the people living in developing countries are cutting down forests twice as fast as they can grow back. Deforestation, combined with intensive agriculture, is turning the world's farm land into desert. Soil erosion and desertification threaten 1/3 of the total land surface which is home to 1/5 of the population. While high consumption levels in developed countries and industrial pollution worldwide do have a huge impact, the fact remains that increases in population place increased burdens on the ecology's carrying capacity. While the former problems urgently need to be addressed, reducing population growth rates eases pressure on all the aspects of the environment. China suffers from every kind of ecological problem and its reliance on high sulfur coal as a primary energy source threatens to undo all the efficiency improvements made in developed countries. Water shortages are common in China as they are in many other countries, again a problem that would be less severe if population growth were reduced. Urban areas are the fastest growing and least prepared to handle the increased demand for drinking water and sanitation control. The cost of universal family planning is only US$9 billion.

  12. Deep Sea Memory of High Atmospheric CO2 Concentration

    NASA Astrophysics Data System (ADS)

    Mathesius, Sabine; Hofmann, Matthias; Caldeira, Ken; Schellnhuber, Hans Joachim

    2015-04-01

    Carbon dioxide removal (CDR) from the atmosphere has been proposed as a powerful measure to mitigate global warming and ocean acidification. Planetary-scale interventions of that kind are often portrayed as "last-resort strategies", which need to weigh in if humankind keeps on enhancing the climate-system stock of CO2. Yet even if CDR could restore atmospheric CO2 to substantially lower concentrations, would it really qualify to undo the critical impacts of past emissions? In the study presented here, we employed an Earth System Model of Intermediate Complexity (EMIC) to investigate how CDR might erase the emissions legacy in the marine environment, focusing on pH, temperature and dissolved oxygen. Against a background of a world following the RCP8.5 emissions path ("business-as-usual") for centuries, we simulated the effects of two massive CDR interventions with CO2 extraction rates of 5 GtC yr-1 and 25 GtC yr-1, respectively, starting in 2250. We found that the 5 GtC yr-1 scheme would have only minor ameliorative influence on the oceans, even after several centuries of application. By way of contrast, the extreme 25 GtC yr-1 scheme eventually leads to tangible improvements. However, even with such an aggressive measure, past CO2 emissions leave a substantial legacy in the marine environment within the simulated period (i.e., until 2700). In summary, our study demonstrates that anthropogenic alterations of the oceans, caused by continued business-as-usual emissions, may not be reversed on a multi-centennial time scale by the most aspirational geoengineering measures. We also found that a transition from the RCP8.5 state to the state of a strong mitigation scenario (RCP2.6) is not possible, even under the assumption of extreme extraction rates (25 GtC yr-1). This is explicitly demonstrated by simulating additional scenarios, starting CDR already in 2150 and operating until the atmospheric CO2 concentration reaches 280 ppm and 180 ppm, respectively. The simulated massive CDR interventions eventually bring down the global mean pH value to the RCP2.6 level, yet cannot restore a similarly homogenous distribution - while the pH of the upper ocean returns to the preindustrial value or even exceed it (in the 180 ppm scenario), the deep ocean remains acidified. The deep ocean is out of contact with the atmosphere and therefore unreachable by atmospheric CDR. Our results suggest that the proposition that the marine consequences of early emissions reductions are comparable to those of delayed reductions plus CDR is delusive and that a policy that allows for emitting CO2 today in the hopes of removing it tomorrow is bound to generate substantial regrets.

  13. Incarceration in fragile families.

    PubMed

    Wildeman, Christopher; Western, Bruce

    2010-01-01

    Since the mid-1970s the U.S. imprisonment rate has increased roughly fivefold. As Christopher Wildeman and Bruce Western explain, the effects of this sea change in the imprisonment rate--commonly called mass imprisonment or the prison boom--have been concentrated among those most likely to form fragile families: poor and minority men with little schooling. Imprisonment diminishes the earnings of adult men, compromises their health, reduces familial resources, and contributes to family breakup. It also adds to the deficits of poor children, thus ensuring that the effects of imprisonment on inequality are transferred intergenerationally. Perversely, incarceration has its most corrosive effects on families whose fathers were involved in neither domestic violence nor violent crime before being imprisoned. Because having a parent go to prison is now so common for poor, minority children and so negatively affects them, the authors argue that mass imprisonment may increase future racial and class inequality--and may even lead to more crime in the long-term, thereby undoing any benefits of the prison boom. U.S. crime policy has thus, in the name of public safety, produced more vulnerable families and reduced the life chances of their children. Wildeman and Western advocate several policy reforms, such as limiting prison time for drug offenders and for parolees who violate the technical conditions of their parole, reconsidering sentence enhancements for repeat offenders, and expanding supports for prisoners and ex-prisoners. But Wildeman and Western argue that criminal justice reform alone will not solve the problems of school failure, joblessness, untreated addiction, and mental illness that pave the way to prison. In fact, focusing solely on criminal justice reforms would repeat the mistakes the nation made during the prison boom: trying to solve deep social problems with criminal justice policies. Addressing those broad problems, they say, requires a greater social commitment to education, public health, and the employment opportunities of low-skilled men and women. The primary sources of order and stability--public safety in its wide sense--are the informal social controls of family and work. Thus, broad social policies hold the promise not only of improving the wellbeing of fragile families, but also, by strengthening families and providing jobs, of contributing to public safety.

  14. Mucopolysaccharidosis VI in cats - clarification regarding genetic testing.

    PubMed

    Lyons, Leslie A; Grahn, Robert A; Genova, Francesca; Beccaglia, Michela; Hopwood, John J; Longeri, Maria

    2016-07-02

    The release of new DNA-based diagnostic tools has increased tremendously in companion animals. Over 70 different DNA variants are now known for the cat, including DNA variants in disease-associated genes and genes causing aesthetically interesting traits. The impact genetic tests have on animal breeding and health management is significant because of the ability to control the breeding of domestic cats, especially breed cats. If used properly, genetic testing can prevent the production of diseased animals, reducing the frequency of the causal variant in the population and, potentially, eventually eradicating the disease. However, testing of some identified DNA variants may be unwarranted and cause undue strife within the cat breeding community and unnecessary reduction of gene pools and availability of breeding animals. Testing for mucopolysaccharidosis Type VI (MPS VI) in cats, specifically the genetic testing of the L476P (c.1427T>C) and the D520N (c.1558G>A) variants in arylsulfatase B (ARSB), has come under scrutiny. No health problems are associated with the D520N (c.1558G>A) variant; however, breeders who obtain positive results for this variant speculate about possible correlations with health concerns. Birman cats already have a markedly reduced gene pool and have a high frequency of the MPS VI D520N variant. Further reduction of the gene pool by eliminating cats that are heterozygous or homozygous for only the MPS VI D520N variant could lead to more inbreeding depression effects on the breed population. Herein is debated the genetic testing of the MPS VI D520N variant in cats. Surveys from different laboratories suggest the L476P (c.1427T>C) disease-associated variant should be monitored in the cat breed populations, particularly breeds with Siamese derivations and outcrosses. However, there is no evidence that D520N is associated with disease in cats, and testing for it is not recommended in the absence of L476P genotyping. Selection against the D520N is not warranted in cat populations. More rigorous guidelines may be required to support the genetic testing of DNA variants in all animal species.

  15. Java Application Shell: A Framework for Piecing Together Java Applications

    NASA Technical Reports Server (NTRS)

    Miller, Philip; Powers, Edward I. (Technical Monitor)

    2001-01-01

    This session describes the architecture of Java Application Shell (JAS), a Swing-based framework for developing interactive Java applications. Java Application Shell is being developed by Commerce One, Inc. for NASA Goddard Space Flight Center Code 588. The purpose of JAS is to provide a framework for the development of Java applications, providing features that enable the development process to be more efficient, consistent and flexible. Fundamentally, JAS is based upon an architecture where an application is considered a collection of 'plugins'. In turn, a plug-in is a collection of Swing actions defined using XML and packaged in a jar file. Plug-ins may be local to the host platform or remotely-accessible through HTTP. Local and remote plugins are automatically discovered by JAS upon application startup; plugins may also be loaded dynamically without having to re-start the application. Using Extensible Markup Language (XML) to define actions, as opposed to hardcoding them in application logic, allows easier customization of application-specific operations by separating application logic from presentation. Through XML, a developer defines an action that may appear on any number of menus, toolbars, and buttons. Actions maintain and propagate enable/disable states and specify icons, tool-tips, titles, etc. Furthermore, JAS allows actions to be implemented using various scripting languages through the use of IBM's Bean Scripting Framework. Scripted action implementation is seamless to the end-user. In addition to action implementation, scripts may be used for application and unit-level testing. In the case of application-level testing, JAS has hooks to assist a script in simulating end-user input. JAS also provides property and user preference management, JavaHelp, Undo/Redo, Multi-Document Interface, Single-Document Interface, printing, and logging. Finally, Jini technology has also been included into the framework by means of a Jini services browser and the ability to associate services with actions. Several Java technologies have been incorporated into JAS, including Swing, Internal Frames, Java Beans, XML, JavaScript, JavaHelp, and Jini. Additional information is contained in the original extended abstract.

  16. Socio-economic inequalities in the incidence of four common cancers: a population-based registry study.

    PubMed

    Tweed, E J; Allardice, G M; McLoone, P; Morrison, D S

    2018-01-01

    To investigate the relationship between socio-economic circumstances and cancer incidence in Scotland in recent years. Population-based study using cancer registry data. Data on incident cases of colorectal, lung, female breast, and prostate cancer diagnosed between 2001 and 2012 were obtained from a population-based cancer registry covering a population of approximately 2.5 million people in the West of Scotland. Socio-economic circumstances were assessed based on postcode of residence at diagnosis, using the Scottish Index of Multiple Deprivation (SIMD). For each cancer, crude and age-standardised incidence rates were calculated by quintile of SIMD score, and the number of excess cases associated with socio-economic deprivation was estimated. 93,866 cases met inclusion criteria, comprising 21,114 colorectal, 31,761 lung, 23,757 female breast, and 15,314 prostate cancers. Between 2001 and 2006, there was no consistent association between socio-economic circumstances and colorectal cancer incidence, but 2006-2012 saw an emerging deprivation gradient in both sexes. The incidence rate ratio (IRR) for colorectal cancer between most deprived and least deprived increased from 1.03 (95% confidence interval [CI] 0.91-1.16) to 1.24 (95% CI 1.11-1.39) during the study period. The incidence of lung cancer showed the strongest relationship with socio-economic circumstances, with inequalities widening across the study period among women from IRR 2.66 (95% CI 2.33-3.05) to 2.91 (95% CI 2.54-3.33) in 2001-03 and 2010-12, respectively. Breast and prostate cancer showed an inverse relationship with socio-economic circumstances, with lower incidence among people living in more deprived areas. Significant socio-economic inequalities remain in cancer incidence in the West of Scotland, and in some cases are increasing. In particular, this study has identified an emerging, previously unreported, socio-economic gradient in colorectal cancer incidence among women as well as men. Actions to prevent, mitigate, and undo health inequalities should be a public health priority. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
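
    One common way to obtain an incidence rate ratio and its 95% confidence interval is the log-normal approximation for a ratio of Poisson rates; the abstract does not state exactly which estimator was used, and the Python sketch below ignores age-standardisation and uses hypothetical counts rather than the registry data.

        import math

        def incidence_rate_ratio(cases_a, py_a, cases_b, py_b):
            """Rate ratio of group A vs group B with a 95% CI (log-normal approximation).
            cases_*: incident case counts; py_*: person-years at risk."""
            irr = (cases_a / py_a) / (cases_b / py_b)
            se_log = math.sqrt(1.0 / cases_a + 1.0 / cases_b)  # SE of log(IRR) for Poisson counts
            return irr, irr * math.exp(-1.96 * se_log), irr * math.exp(1.96 * se_log)

        # Hypothetical example: most-deprived vs least-deprived SIMD quintile
        print(incidence_rate_ratio(cases_a=620, py_a=500_000, cases_b=500, py_b=500_000))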

  17. The influence of artificial radiation damage and thermal annealing on helium diffusion kinetics in apatite

    NASA Astrophysics Data System (ADS)

    Shuster, David L.; Farley, Kenneth A.

    2009-01-01

    Recent work [Shuster D. L., Flowers R. M. and Farley K. A. (2006) The influence of natural radiation damage on helium diffusion kinetics in apatite. Earth Planet. Sci. Lett. 249(3-4), 148-161] revealing a correlation between radiogenic 4He concentration and He diffusivity in natural apatites suggests that helium migration is retarded by radiation-induced damage to the crystal structure. If so, the He diffusion kinetics of an apatite is an evolving function of time and the effective uranium concentration in a cooling sample, a fact which must be considered when interpreting apatite (U-Th)/He ages. Here we report the results of experiments designed to investigate and quantify this phenomenon by determining He diffusivities in apatites after systematically adding or removing radiation damage. Radiation damage was added to a suite of synthetic and natural apatites by exposure to between 1 and 100 h of neutron irradiation in a nuclear reactor. The samples were then irradiated with a 220 MeV proton beam and the resulting spallogenic 3He used as a diffusant in step-heating diffusion experiments. In every sample, irradiation increased the activation energy (Ea) and the frequency factor (Do/a²) of diffusion and yielded a higher He closure temperature (Tc) than the starting material. For example, 100 h in the reactor caused the He closure temperature to increase by as much as 36 °C. For a given neutron fluence the magnitude of increase in closure temperature scales negatively with the initial closure temperature. This is consistent with a logarithmic response in which the neutron damage is additive to the initial damage present. In detail, the irradiations introduce correlated increases in Ea and ln(Do/a²) that lie on the same array as found in natural apatites. This strongly suggests that neutron-induced damage mimics the damage produced by U and Th decay in natural apatites. To investigate the potential consequences of annealing of radiation damage, samples of Durango apatite were heated in vacuum to temperatures up to 550 °C for between 1 and 350 h. After this treatment the samples were step-heated using the remaining natural 4He as the diffusant. At temperatures above 290 °C a systematic change in Tc was observed, with values becoming lower with increasing temperature and time. For example, reduction of Tc from the starting value of 71 to ~52 °C occurred in 1 h at 375 °C or 10 h at 330 °C. The observed variations in Tc are strongly correlated with the fission track length reduction predicted from the initial holding time and temperature. Furthermore, like the neutron irradiated apatites, these samples plot on the same Ea-ln(Do/a²) array as natural samples, suggesting that damage annealing is simply undoing the consequences of damage accumulation in terms of He diffusivity. Taken together these data provide unequivocal evidence that at these levels, radiation damage acts to retard He diffusion in apatite, and that thermal annealing reverses the process. The data provide support for the previously described radiation damage trapping kinetic model of Shuster et al. (2006) and can be used to define a model which fully accommodates damage production and annealing.
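
    The closure temperatures (Tc) discussed above follow from the Arrhenius parameters Ea and Do/a² through Dodson's (1973) formulation, which can be solved iteratively. The Python sketch below is only an illustration of that mapping: the geometry factor assumes a sphere, and the parameter values (roughly Durango-like Ea and Do/a², a 10 °C/Myr cooling rate) are assumptions for the example, not values taken from these experiments.

        import math

        R = 8.314  # gas constant [J/mol/K]

        def closure_temperature(Ea, D0_a2, cooling_rate, A=55.0, Tc=350.0):
            """Dodson (1973) closure temperature, iterated to convergence.
            Ea [J/mol], D0_a2 = Do/a^2 [1/s], cooling_rate [K/s], A = 55 for a sphere."""
            for _ in range(100):
                tau = R * Tc**2 / (Ea * cooling_rate)        # characteristic diffusion time
                Tc_new = (Ea / R) / math.log(A * tau * D0_a2)
                if abs(Tc_new - Tc) < 1e-6:
                    break
                Tc = Tc_new
            return Tc

        myr = 3.156e13  # seconds per Myr
        Tc = closure_temperature(Ea=138e3, D0_a2=1.4e6, cooling_rate=10.0 / myr)
        print(f"Tc ~ {Tc - 273.15:.0f} degC")   # lands near the ~70 degC quoted for the starting material

    Raising Ea and Do/a² together, as the neutron irradiations do, shifts the computed Tc upward, which is the qualitative behaviour reported above.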

  18. A 2 per cent distance to z = 0.35 by reconstructing baryon acoustic oscillations - I. Methods and application to the Sloan Digital Sky Survey

    NASA Astrophysics Data System (ADS)

    Padmanabhan, Nikhil; Xu, Xiaoying; Eisenstein, Daniel J.; Scalzo, Richard; Cuesta, Antonio J.; Mehta, Kushal T.; Kazin, Eyal

    2012-12-01

    We present the first application of density field reconstruction to a galaxy survey to undo the smoothing of the baryon acoustic oscillation (BAO) feature due to non-linear gravitational evolution and thereby improve the precision of the distance measurements possible. We apply the reconstruction technique to the clustering of galaxies from the Sloan Digital Sky Survey (SDSS) Data Release 7 (DR7) luminous red galaxy (LRG) sample, sharpening the BAO feature and achieving a 1.9 per cent measurement of the distance to z = 0.35. We update the reconstruction algorithm of Eisenstein et al. to account for the effects of survey geometry as well as redshift-space distortions and validate it on 160 LasDamas simulations. We demonstrate that reconstruction sharpens the BAO feature in the angle-averaged galaxy correlation function, reducing the non-linear smoothing scale Σnl from 8.1 to 4.4 Mpc h-1. Reconstruction also significantly reduces the effects of redshift-space distortions at the BAO scale, isotropizing the correlation function. This sharpened BAO feature yields an unbiased distance estimate (<0.2 per cent) and reduces the scatter from 3.3 to 2.1 per cent. We demonstrate the robustness of these results to the various reconstruction parameters, including the smoothing scale, the galaxy bias and the linear growth rate. Applying this reconstruction algorithm to the SDSS LRG DR7 sample improves the significance of the BAO feature in these data from 3.3σ for the unreconstructed correlation function to 4.2σ after reconstruction. We estimate a relative distance scale DV/rs at z = 0.35 of 8.88 ± 0.17, where rs is the sound horizon and DV ≡ (DA²H⁻¹)^(1/3) is a combination of the angular diameter distance DA and Hubble parameter H. Assuming a sound horizon of 154.25 Mpc, this translates into a distance measurement DV(z = 0.35) = 1.356 ± 0.025 Gpc. We find that reconstruction reduces the distance error in the DR7 sample from 3.5 to 1.9 per cent, equivalent to a survey with three times the volume of SDSS.

  19. A neural network-based method for spectral distortion correction in photon counting x-ray CT

    NASA Astrophysics Data System (ADS)

    Touch, Mengheng; Clark, Darin P.; Barber, William; Badea, Cristian T.

    2016-08-01

    Spectral CT using a photon counting x-ray detector (PCXD) shows great potential for measuring material composition based on energy dependent x-ray attenuation. Spectral CT is especially suited for imaging with K-edge contrast agents to address the otherwise limited contrast in soft tissues. We have developed a micro-CT system based on a PCXD. This system enables both 4 energy bins acquisition, as well as full-spectrum mode in which the energy thresholds of the PCXD are swept to sample the full energy spectrum for each detector element and projection angle. Measurements provided by the PCXD, however, are distorted due to undesirable physical effects in the detector and can be very noisy due to photon starvation in narrow energy bins. To address spectral distortions, we propose and demonstrate a novel artificial neural network (ANN)-based spectral distortion correction mechanism, which learns to undo the distortion in spectral CT, resulting in improved material decomposition accuracy. To address noise, post-reconstruction denoising based on bilateral filtration, which jointly enforces intensity gradient sparsity between spectral samples, is used to further improve the robustness of ANN training and material decomposition accuracy. Our ANN-based distortion correction method is calibrated using 3D-printed phantoms and a model of our spectral CT system. To enable realistic simulations and validation of our method, we first modeled the spectral distortions using experimental data acquired from 109Cd and 133Ba radioactive sources measured with our PCXD. Next, we trained an ANN to learn the relationship between the distorted spectral CT projections and the ideal, distortion-free projections in a calibration step. This required knowledge of the ground truth, distortion-free spectral CT projections, which were obtained by simulating a spectral CT scan of the digital version of a 3D-printed phantom. Once the training was completed, the trained ANN was used to perform distortion correction on any subsequent scans of the same system with the same parameters. We used joint bilateral filtration to perform noise reduction by jointly enforcing intensity gradient sparsity between the reconstructed images for each energy bin. Following reconstruction and denoising, the CT data was spectrally decomposed using the photoelectric effect, Compton scattering, and a K-edge material (i.e. iodine). The ANN-based distortion correction approach was tested using both simulations and experimental data acquired in phantoms and a mouse with our PCXD-based micro-CT system for 4 bins and full-spectrum acquisition modes. The iodine detectability and decomposition accuracy were assessed using the contrast-to-noise ratio and relative error in iodine concentration estimation metrics in images with and without distortion correction. In simulation, the material decomposition accuracy in the reconstructed data was vastly improved following distortion correction and denoising, with 50% and 20% reductions in material concentration measurement error in full-spectrum and 4 energy bins cases, respectively. Overall, experimental data confirms that full-spectrum mode provides superior results to 4-energy mode when the distortion corrections are applied. 
The material decomposition accuracy in the reconstructed data was vastly improved following distortion correction and denoising, with as much as a 41% reduction in material concentration measurement error for full-spectrum mode, while also bringing the iodine detectability to 4-6 mg ml-1. Distortion correction also improved the 4 bins mode data, but to a lesser extent. The results demonstrate the experimental feasibility and potential advantages of ANN-based distortion correction and joint bilateral filtration-based denoising for accurate K-edge imaging with a PCXD. Given the computational efficiency with which the ANN can be applied to projection data, the proposed scheme can be readily integrated into existing CT reconstruction pipelines.
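
    At its core, the correction step described above is a supervised regression from distorted to distortion-free spectral measurements, calibrated once and then applied to later scans. The Python sketch below is not the authors' network; it is a minimal scikit-learn stand-in on synthetic data (hypothetical layer sizes, a toy mixing matrix in place of the real detector distortion) that shows the calibrate-then-apply pattern.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        n_samples, n_bins = 5000, 4

        # Calibration data: ideal (distortion-free) projections are known from simulation.
        ideal = rng.uniform(0.0, 1.0, size=(n_samples, n_bins))
        distorted = ideal @ rng.uniform(0.6, 1.0, size=(n_bins, n_bins))   # toy spectral distortion
        distorted += rng.normal(scale=0.01, size=distorted.shape)          # photon noise

        # Calibration step: learn the mapping distorted -> ideal.
        ann = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
        ann.fit(distorted, ideal)

        # Application step: correct new measurements before reconstruction.
        corrected = ann.predict(distorted[:5])
        print(np.abs(corrected - ideal[:5]).mean())   # residual error after correction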

  20. Impact of magnetic fields on ram pressure stripping in disk galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruszkowski, M.; Brüggen, M.; Lee, D.

    Ram pressure stripping can remove significant amounts of gas from galaxies in clusters and massive groups and thus has a large impact on the evolution of cluster galaxies. Recent observations have shown that key properties of ram-pressure-stripped tails of galaxies, such as their width and structure, are in conflict with predictions by simulations. To increase the realism of existing simulations, we simulated for the first time a disk galaxy exposed to a uniformly magnetized wind including radiative cooling and self-gravity of the gas. We find that magnetic fields have a strong effect on the morphology of the gas in the tail of the galaxy. While in the purely hydrodynamical case the tail is very clumpy, the magnetohydrodynamical case shows very filamentary structures in the tail. The filaments can be strongly supported by magnetic pressure and, wherever this is the case, the magnetic field vectors tend to be aligned with the filaments. The ram pressure stripping process may lead to the formation of magnetized density tails that appear bifurcated in the plane of the sky and resemble the double tails observed in ESO 137-001 and ESO 137-002. Such tails can be formed under a variety of situations, both for the disks oriented face-on with respect to the intracluster medium (ICM) wind and for the tilted ones. While this bifurcation is the consequence of the generic tendency for the magnetic fields to produce very filamentary tail morphology, the tail properties are further shaped by the combination of the magnetic field orientation and the sliding of the field past the disk surface exposed to the wind. Despite the fact that the effect of the magnetic field on the morphology of the tail is strong, magnetic draping does not strongly change the rate of gas stripping. For a face-on galaxy, the field tends to reduce the amount of gas stripping compared to the pure hydrodynamical case, and is associated with the formation of a stable magnetic draping layer on the side of the galaxy exposed to the incoming ICM wind. For significantly tilted disks, the situation may be reversed and the stripping rate may be enhanced by the 'scraping' of the disk surface by the magnetic fields sliding past the ISM/ICM interface. Instabilities, such as gravitational instabilities, undo the protective effect of this layer and allow the gas to leak out of the galaxy.

  1. Synthesis of Joint Volumes, Visualization of Paths, and Revision of Viewing Sequences in a Multi-dimensional Seismic Data Viewer

    NASA Astrophysics Data System (ADS)

    Chen, D. M.; Clapp, R. G.; Biondi, B.

    2006-12-01

    Ricksep is a freely available interactive viewer for multi-dimensional data sets. The viewer is very useful for simultaneous display of multiple data sets from different viewing angles, animation of movement along a path through the data space, and selection of local regions for data processing and information extraction. Several new viewing features are added to enhance the program's functionality in three respects. First, two new data synthesis algorithms are created to adaptively combine information from a data set with mostly high-frequency content, such as seismic data, and another data set with mainly low-frequency content, such as velocity data. Using the algorithms, these two data sets can be synthesized into a single data set which resembles the high-frequency data set on a local scale and at the same time resembles the low-frequency data set on a larger scale. As a result, the originally separated high- and low-frequency details can now be more accurately and conveniently studied together. Second, a projection algorithm is developed to display paths through the data space. Paths are geophysically important because they represent wells into the ground. Two difficulties often associated with tracking paths are that they normally cannot be seen clearly inside multi-dimensional spaces and depth information is lost along the direction of projection when ordinary projection techniques are used. The new algorithm projects samples along the path in three orthogonal directions and effectively restores important depth information by using variable projection parameters which are functions of the distance away from the path. Multiple paths in the data space can be generated using different character symbols as positional markers, and users can easily create, modify, and view paths in real time. Third, a viewing history list is implemented which enables Ricksep's users to create, edit and save a recipe for the sequence of viewing states. Then, the recipe can be loaded into an active Ricksep session, after which the user can navigate to any state in the sequence and modify the sequence from that state. Typical uses of this feature are undoing and redoing viewing commands and animating a sequence of viewing states. A theoretical discussion is presented, and several examples using real seismic data are provided to show how these new Ricksep features provide more convenient, accurate ways to manipulate multi-dimensional data sets.
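
    The abstract does not spell out the synthesis algorithms, so the Python sketch below is only a generic illustration of the underlying idea: keep the small-scale content of one co-registered volume and the large-scale trend of another, here using Gaussian smoothing as the scale separator (the kernel width sigma is a hypothetical parameter).

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def synthesize(high_freq_data, low_freq_data, sigma=8.0):
            """Combine two co-registered volumes: local detail from the first,
            large-scale trend from the second. sigma sets the scale split (in samples)."""
            detail = high_freq_data - gaussian_filter(high_freq_data, sigma)   # e.g. seismic detail
            trend = gaussian_filter(low_freq_data, sigma)                      # e.g. velocity trend
            return trend + detail

        # Toy 3-D cubes standing in for seismic and velocity volumes
        seismic = np.random.default_rng(1).normal(size=(64, 64, 64))
        velocity = np.linspace(1500.0, 4500.0, 64)[None, None, :] * np.ones((64, 64, 64))
        print(synthesize(seismic, velocity).shape)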

  2. Entanglement and Metrology with Singlet-Triplet Qubits

    NASA Astrophysics Data System (ADS)

    Shulman, Michael Dean

    Electron spins confined in semiconductor quantum dots are emerging as a promising system to study quantum information science and to perform sensitive metrology. Their weak interaction with the environment leads to long coherence times and robust storage for quantum information, and the intrinsic tunability of semiconductors allows for controllable operations, initialization, and readout of their quantum state. These spin qubits are also promising candidates for the building block for a scalable quantum information processor due to their prospects for scalability and miniaturization. However, several obstacles limit the performance of quantum information experiments in these systems. For example, the weak coupling to the environment makes inter-qubit operations challenging, and a fluctuating nuclear magnetic field limits the performance of single-qubit operations. The focus of this thesis will be several experiments which address some of the outstanding problems in semiconductor spin qubits, in particular, singlet-triplet (S-T0) qubits. We use these qubits to probe both the electric field and magnetic field noise that limit the performance of these qubits. The magnetic noise bath is probed with high bandwidth and precision using novel techniques borrowed from the field of Hamiltonian learning, which are effective due to the rapid control and readout available in S-T 0 qubits. These findings allow us to effectively undo the undesired effects of the fluctuating nuclear magnetic field by tracking them in real-time, and we demonstrate a 30-fold improvement in the coherence time T2*. We probe the voltage noise environment of the qubit using coherent qubit oscillations, which is partially enabled by control of the nuclear magnetic field. We find that the voltage noise bath is frequency-dependent, even at frequencies as high as 1MHz, and it shows surprising and, as of yet, unexplained temperature dependence. We leverage this knowledge of the voltage noise environment, the nuclear magnetic field control, as well as new techniques for calibrated measurement of the density matrix in a singlet-triplet qubit to entangle two adjacent single-triplet qubits. We fully characterize the generated entangled states and prove that they are, indeed, entangled. This work opens new opportunities to use qubits as sensors for improved metrological capabilities, as well as for improved quantum information processing. The singlet-triplet qubit is unique in that it can be used to probe two fundamentally different noise baths, which are important for a large variety of solid state qubits. More specifically, this work establishes the singlet-triplet qubit as a viable candidate for the building block of a scalable quantum information processor.

  3. SE-FIT

    NASA Technical Reports Server (NTRS)

    Chen, Yongkang; Weislogel, Mark; Schaeffer, Ben; Semerjian, Ben; Yang, Lihong; Zimmerli, Gregory

    2012-01-01

    The mathematical theory of capillary surfaces has developed steadily over the centuries, but it was not until the last few decades that new technologies have put a more urgent demand on a substantially more qualitative and quantitative understanding of phenomena relating to capillarity in general. So far, the new theory development successfully predicts the behavior of capillary surfaces for special cases. However, an efficient quantitative mathematical prediction of capillary phenomena related to the shape and stability of geometrically complex equilibrium capillary surfaces remains a significant challenge. As one of many numerical tools, the open-source Surface Evolver (SE) algorithm has played an important role over the last two decades. The current effort was undertaken to provide a front-end to enhance the accessibility of SE for the purposes of design and analysis. Like SE, the new code is open-source and will remain under development for the foreseeable future. The ultimate goal of the current Surface Evolver Fluid Interface Tool (SEFIT) development is to build a fully integrated front-end with a set of graphical user interface (GUI) elements. Such a front-end enables the access to functionalities that are developed along with the GUIs to deal with pre-processing, convergence computation operation, and post-processing. In other words, SE-FIT is not just a GUI front-end, but an integrated environment that can perform sophisticated computational tasks, e.g. importing industry standard file formats and employing parameter sweep functions, which are both lacking in SE, and require minimal interaction by the user. These functions are created using a mixture of Visual Basic and the SE script language. These form the foundation for a high-performance front-end that substantially simplifies use without sacrificing the proven capabilities of SE. The real power of SE-FIT lies in its automated pre-processing, pre-defined geometries, convergence computation operation, computational diagnostic tools, and crash-handling capabilities to sustain extensive computations. SE-FIT performance is enabled by its so-called file-layer mechanism. During the early stages of SE-FIT development, it became necessary to modify the original SE code to enable capabilities required for an enhanced and synchronized communication. To this end, a file-layer was created that serves as a command buffer to ensure a continuous and sequential execution of commands sent from the front-end to SE. It also establishes a proper means for handling crashes. The file layer logs input commands and SE output; it also supports user interruption requests, back and forward operation (i.e. undo and redo), and others. It especially enables the batch mode computation of a series of equilibrium surfaces and the searching of critical parameter values in studying the stability of capillary surfaces. In this way, the modified SE significantly extends the capabilities of the original SE.
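
    The file-layer mechanism is described only at a high level, so the following Python fragment is a speculative, minimal sketch of the general idea of a file-backed command buffer with sequential execution and an undo stack; the class, file name and behaviour are hypothetical and are not SE-FIT's actual implementation.

        from pathlib import Path

        class FileLayer:
            """Minimal file-backed command buffer: commands are logged to a file,
            executed in order, and kept on a stack so the last one can be undone."""

            def __init__(self, log_path="commands.log"):
                self.log = Path(log_path)
                self.history = []

            def submit(self, command):
                with self.log.open("a") as f:      # log first, so a crash can be replayed
                    f.write(command + "\n")
                self.history.append(command)
                print(f"execute: {command}")       # stand-in for handing the command to the solver

            def undo(self):
                if self.history:
                    print(f"undo: {self.history.pop()}")

        fl = FileLayer()
        fl.submit("refine mesh")
        fl.submit("iterate 100")
        fl.undo()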

  4. Full Scale Advanced Systems Testbed (FAST): Capabilities and Recent Flight Research

    NASA Technical Reports Server (NTRS)

    Miller, Christopher

    2014-01-01

    At the NASA Armstrong Flight Research Center research is being conducted into flight control technologies that will enable the next generation of air and space vehicles. The Full Scale Advanced Systems Testbed (FAST) aircraft provides a laboratory for flight exploration of these technologies. In recent years novel but simple adaptive architectures for aircraft and rockets have been researched along with control technologies for improving aircraft fuel efficiency and control structural interaction. This presentation outlines the FAST capabilities and provides a snapshot of the research accomplishments to date. Flight experimentation allows researchers to substantiate or invalidate their assumptions and intuition about a new technology or innovative approach. Data early in a development cycle are invaluable for determining which technology barriers are real and which ones are imagined. Data for a technology at a low TRL can be used to steer and focus the exploration and fuel rapid advances based on real-world lessons learned. It is important to identify technologies that are mature enough to benefit from flight research data and not be tempted to wait until all the potential issues are solved prior to getting some data; sometimes a stagnated technology just needs a little real-world data to get it going. One trick to getting data for low-TRL technologies is finding an environment where it is okay to take risks, where occasional failure is an expected outcome; learning how things fail is often as valuable as showing that they work. FAST has been architected to facilitate this type of testing for control system technologies, specifically novel algorithms and sensors, with rapid prototyping and a quick turnaround in a fly-fix-fly paradigm; sometimes it is easier and cheaper to just go fly it than to analyze the problem to death. The goal is to find and test control technologies that would benefit from flight data and find solutions to the real barriers to innovation. The FAST vehicle is a flexible laboratory for nascent technologies that would benefit from early life-cycle flight research data. It provides a robust and safe environment where innovative techniques can be explored in a fly-fix-fly rapid prototyping paradigm. IRAC: simple adaptive control technologies can provide real benefits without undue complexity, and adverse pilot/adaptive-system interactions can be mitigated; tools have been developed to evaluate those interactions. ICP: substantial fuel savings can be achieved over a broad range of vehicles and configurations with intelligent control solutions. LVAC: the AAC design is robust and effective for the SLS mission, and promises to provide benefits to other platforms as well. OCLA: this work should show that structural feedback can be seamlessly integrated with performance and stability objectives. All of these control technologies have been implemented into the same baseline control law and could be combined into one control solution that answers many pressing questions for modern vehicle configurations.

  5. School Astronomy Club: from Project to Knowledge

    NASA Astrophysics Data System (ADS)

    Folhas, Alvaro

    2016-04-01

    Preparing a generation of young people for the challenges of the future is a task that forces us to rethink the school, not just because it is difficult, but also because students feel that the school has very little to offer them, especially anything that interests them. Thus the school is dysfunctional, is ill, and needs prompt treatment. Schools have to adjust to the new times, and this does not mean replacing the old blackboards with advanced interactive whiteboards. The school has to find its way to the students with something that seduces them: the Challenge. The Astronomy Club that I lead in my school is essentially a project space. Students who voluntarily join the club organize themselves, according to their interests, around projects whose outcome is not defined from the beginning, which requires them to do, undo and redo, and obliges them to ask mathematics or physics for help in reaching answers and to study with a genuine purpose of learning. Some examples of the work: the younger students are challenged to reproduce the historical astronomical experiments that opened the doors of knowledge, such as the Eratosthenes experiment to determine the circumference of the Earth (at the equinox), or determining the diameter of the Sun using similar triangles. These students are driven to establish distance scales in the solar system, which, to their astonishment, allows them to clear up misconceptions that arise from some textbook pictures and to form a scientifically correct idea of the planetary orbits and the distances separating the planets of the Solar System. For students from 15 to 18 years old, I have to raise the level of the challenges and use the natural tendency of this age bracket to assert themselves by making new and exciting things. For this purpose, I am fortunate to have the support of large organizations and projects such as NUCLIO, ESA, CERN, the Go-Lab Project, Inspiring Science Education, Open Discovery Space and Global Hands-On Universe. Through them the students have participated in various activities, such as scientific research on NEOs (Near Earth Objects) within the IASC Project (International Astronomical Search Collaboration, pronounced "Isaac"), an educational outreach program which provides high-quality astronomical data from several professional astronomical observatories, allowing students to scrutinize space with professional tools and make original astronomical discoveries. They also use professional, robotically controlled telescopes for astronomical research and education projects, such as the two telescopes of the Faulkes Telescope Project (2.0 m telescopes in Hawaii and Australia) and the Liverpool Telescope of the Astrophysics Research Institute of Liverpool John Moores University (2.0 m telescope, Canary Islands), to obtain pictures and data from galaxies and nebulae and to try to solve problems using real science data that they have either obtained themselves through their own observations or acquired from other sources. These students learn what science is and how it is done, develop their own skills and knowledge, transfer this enthusiasm to others, and promote the school culture we all desire.
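
    The Eratosthenes exercise mentioned above comes down to a single proportion: the difference in noon shadow angle between two sites on (roughly) the same meridian is the same fraction of 360° as the distance between the sites is of Earth's circumference. A short Python sketch with the classic textbook numbers (which are assumptions for the example, not the club's own measurements):

        def eratosthenes_circumference(angle_difference_deg, distance_km):
            """Earth's circumference from the difference in noon shadow angle between
            two sites on the same meridian separated by distance_km."""
            return 360.0 / angle_difference_deg * distance_km

        # Classic round numbers: ~7.2 degree angle difference over ~800 km
        print(eratosthenes_circumference(7.2, 800.0), "km")   # 40000 km, close to the true ~40,075 km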

  6. Real Time Control of CO2 Enrichment Experiments on the Sea Floor Enabled by the MARS Cabled Observatory

    NASA Astrophysics Data System (ADS)

    Brewer, P. G.; Mbari Foce Team

    2010-12-01

    We report on progress on FOCE (Free Ocean CO2 Enrichment) techniques designed to accomplish realistic (that is not contained within land-based aquaria) experiments on the response of deep-sea animals and biogeochemical cycles to ocean acidification. Such experiments have long been carried out on ecosystems on land, and the outcome has differed significantly from CO2 enrichment in enclosed greenhouse systems, thereby undoing much of the hope for an increase in the large-scale biosphere draw down of atmospheric CO2. It is a far bigger step if deep-sea animals and systems are removed from their cold, dark, high pressure and low oxygen native habitat. The equivalent problem in the ocean is far more difficult because of (1) the very different physical forcing; (2) the complex reaction rates between CO2 and water require delay times between addition and entry to the experimental space; (3) the lack of supporting infrastructure and of adequate sensors; and (4) the need for sophisticated and robust control techniques in both hardware and software. We have overcome almost all of these challenges, and related working systems have already been successfully deployed on the Great Barrier Reef coralline flats with Australian colleagues. We have used the MBARI MARS (Monterey Accelerated Research System) cabled observatory to carry out deep-ocean (880m depth) experiments. The basic experimental unit is a 1m x 1m x 50cm chamber with side arms of ~ 3m length to provide the required chemical delay times for the reaction between admixed CO2 enriched sea water and emergence of the flow into the main chamber. Controllable thrusters, operated by user commands, help maintain a steady flow of seawater through the experiment. The site is slightly below the depth of the O2 minimum where small changes in either O2 from ocean warming, or CO2 from ocean acidification can lead to the formation of dead zones. Shallow (near shore) experiments are now also in the late planning stages. We have developed extremely low noise pH sensors that show for the first time the scale and frequency of the tidally driven background pH fluctuations in the ocean. This helps establish the limits in background pH that deep-sea animals are adapted to. We have developed software to control this complex system in real time and to make control possible over the web. A graphical user interface allows operator observation of flow and background conditions, and full choice of experimental settings. CO2 enrichment is provided by ROV delivery of ~50-100 L of liquid CO2 which is contained by its buoyancy within a box set immediately above the side arm opening. The dissolution rate of liquid CO2 through the hydrate skin is ~0.5 μmol/cm2/sec thereby providing a working fluid in the reservoir which is drawn upon as needed. Experiments of 2-3 weeks duration are possible from a single filling. Figure 1. pH changes created in FOCE by a series of CO2 enriched sea water additions under varying flow conditions.
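
    The quoted 2-3 week experiment duration per filling is consistent with a back-of-envelope estimate from the dissolution flux, provided one assumes values for the exposed CO2 surface area and the liquid CO2 density at depth; both are assumptions in the Python sketch below, not numbers given in the abstract.

        # Rough lifetime of a 50-100 L liquid-CO2 reservoir dissolving through its hydrate skin
        flux = 0.5e-6          # mol CO2 per cm^2 per second (quoted dissolution rate)
        area_cm2 = 2500.0      # ASSUMED exposed area (~0.25 m^2 box opening)
        rho_g_per_l = 950.0    # ASSUMED liquid CO2 density at ~880 m depth
        molar_mass = 44.0      # g/mol

        for volume_l in (50.0, 100.0):
            moles = volume_l * rho_g_per_l / molar_mass
            days = moles / (flux * area_cm2) / 86400.0
            print(f"{volume_l:.0f} L -> ~{days:.0f} days")   # ~10 and ~20 days with these assumptions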

  7. Evaluation of Crew-Centric Onboard Mission Operations Planning and Execution Tool: Year 2

    NASA Technical Reports Server (NTRS)

    Hillenius, S.; Marquez, J.; Korth, D.; Rosenbaum, M.; Deliz, Ivy; Kanefsky, Bob; Zheng, Jimin

    2018-01-01

    Currently, mission planning for the International Space Station (ISS) is largely performed by ground operators in mission control. Creating a week-long mission plan for the ISS crew takes dozens of people multiple days to complete, and the plan is often finalized far in advance of its execution. As such, re-planning or adapting to changing real-time constraints or emergent issues is similarly taxing. As we design future mission operations concepts for other planets or areas with limited connectivity to Earth, more of these ground-based tasks will need to be handled autonomously by the crew onboard. There is a need for a highly usable (including low training time) tool that enables efficient self-scheduling and execution within a single package. The ISS Program has identified Playbook as a potential option. It already has high crew acceptance as a plan viewer from previous analogs and can now support a crew self-scheduling assessment on ISS or on another mission. The goals of this work, a collaboration between the Human Research Program and the ISS Program, are to inform the design of systems for more autonomous crew operations and provide a platform for research on crew autonomy for future deep space missions. The second year of the research effort has included new insights into the crew self-scheduling sessions performed on the HERA (Human Exploration Research Analog) and NEEMO (NASA Extreme Environment Mission Operations) analogs. Use on the NEEMO analog involved two self-scheduling strategies in which the crew planned and executed two days of EVAs (Extra-Vehicular Activities). Year two was the first HERA campaign in which we were able to perform research tasks; this involved selected flexible activities that the crew could schedule, mock timelines where the crew completed more complex planning exercises, usability evaluation of the crew self-scheduling features, and more insight into the limit of plan complexity that the crew could effectively self-schedule. In parallel we have added new features and functionality to the Playbook tool based on our insights from crew self-scheduling in the NASA analogs. In particular, this year we added the ability for the crew to add, edit, and remove their own activities in Playbook, expanding the type of planning and re-planning possible in the tool and opening up the ability for more free-form plan creation. The ability to group and manipulate groups of activities from the plan task list was also added, allowing crew members to add predefined sets of activities onto their mission timeline. We also added a way for crew members to roll back changes in their plan, providing an undo-like capability. These features expand and complement the initial self-scheduling features added in year one, with the goal of making crew autonomous planning more efficient. As part of this work we have also finished developing the first version of our Playbook Data Analysis Tool, a research tool built to interpret and analyze the data collected unobtrusively through Playbook during the NASA analog missions. These data, which include user click interactions as well as plan change information, allow us to play back a session as if a video camera had been mounted over the crew member's tablet.
    While the primary purpose of this tool is to support usability analysis of the crew self-scheduling sessions from the NASA analogs, because the collected data are structured the tool can also automatically derive metrics that would traditionally require tedious manual analysis of video playback. We will demonstrate and discuss the ability for further derived metrics to be added to the tool. In addition to the data and results gathered in year two, we will also discuss the preparation and goals of our International Space Station (ISS) onboard technology demonstration with Playbook. This technology demonstration will be performed as part of the CAST payload starting in late 2016.

  8. Radiation Environment Modeling for Spacecraft Design: New Model Developments

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Xapsos, Mike; Lauenstein, Jean-Marie; Ladbury, Ray

    2006-01-01

    A viewgraph presentation on various new space radiation environment models for spacecraft design is described. The topics include: 1) The Space Radiation Environment; 2) Effects of Space Environments on Systems; 3) Space Radiation Environment Model Use During Space Mission Development and Operations; 4) Space Radiation Hazards for Humans; 5) "Standard" Space Radiation Environment Models; 6) Concerns about Standard Models; 7) Inadequacies of Current Models; 8) Development of New Models; 9) New Model Developments: Proton Belt Models; 10) Coverage of New Proton Models; 11) Comparison of TPM-1, PSB97, AP-8; 12) New Model Developments: Electron Belt Models; 13) Coverage of New Electron Models; 14) Comparison of "Worst Case" POLE, CRESELE, and FLUMIC Models with the AE-8 Model; 15) New Model Developments: Galactic Cosmic Ray Model; 16) Comparison of NASA, MSU, CIT Models with ACE Instrument Data; 17) New Model Developments: Solar Proton Model; 18) Comparison of ESP, JPL91, King/Stassinopoulos, and PSYCHIC Models; 19) New Model Developments: Solar Heavy Ion Model; 20) Comparison of CREME96 to CREDO Measurements During 2000 and 2002; 21) PSYCHIC Heavy Ion Model; 22) Model Standardization; 23) Working Group Meeting on New Standard Radiation Belt and Space Plasma Models; and 24) Summary.

  9. Comparisons of Multilevel Modeling and Structural Equation Modeling Approaches to Actor-Partner Interdependence Model.

    PubMed

    Hong, Sehee; Kim, Soyoung

    2018-01-01

    There are basically two modeling approaches applicable to analyzing an actor-partner interdependence model: the multilevel modeling (hierarchical linear model) and the structural equation modeling. This article explains how to use these two models in analyzing an actor-partner interdependence model and how these two approaches work differently. As an empirical example, marital conflict data were used to analyze an actor-partner interdependence model. The multilevel modeling and the structural equation modeling produced virtually identical estimates for a basic model. However, the structural equation modeling approach allowed more realistic assumptions on measurement errors and factor loadings, rendering better model fit indices.
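
    For the multilevel (hierarchical linear model) route, the basic actor-partner specification regresses each person's outcome on their own predictor (actor effect) and their partner's predictor (partner effect), with a random effect for the dyad. A minimal Python sketch using statsmodels on simulated data; the variable names and effect sizes are hypothetical, not the marital conflict data used in the article.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(42)
        n_dyads = 200

        own = rng.normal(size=2 * n_dyads)                    # each member's own predictor
        partner = own.reshape(n_dyads, 2)[:, ::-1].ravel()    # the other member's predictor
        dyad = np.repeat(np.arange(n_dyads), 2)
        dyad_effect = np.repeat(rng.normal(scale=0.5, size=n_dyads), 2)
        outcome = 0.4 * own + 0.2 * partner + dyad_effect + rng.normal(size=2 * n_dyads)
        df = pd.DataFrame({"outcome": outcome, "own": own, "partner": partner, "dyad": dyad})

        # Random intercept per dyad; coefficients on 'own' and 'partner' are the actor and partner effects.
        print(smf.mixedlm("outcome ~ own + partner", df, groups=df["dyad"]).fit().summary())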

  10. [Analysis of the stability and adaptability of near infrared spectra qualitative analysis model].

    PubMed

    Cao, Wu; Li, Wei-jun; Wang, Ping; Zhang, Li-ping

    2014-06-01

    The stability and adaptability of near-infrared spectra qualitative analysis models were studied. Separate modeling can significantly improve model stability and adaptability, but its ability to improve adaptability is limited. Joint modeling improves both the adaptability and the stability of the model; at the same time, compared with separate modeling, it shortens modeling time, reduces the modeling workload, extends the period of validity of the model, and improves modeling efficiency. The adaptability experiment shows that the correct recognition rate of the separate modeling method is relatively low and cannot meet application requirements, whereas the joint modeling method reaches a correct recognition rate of 90% and markedly improves recognition. The stability experiment shows that the identification results of the jointly built model are better than those of the separately built model and have good application value.

  11. Evaluating the Bias of Alternative Cost Progress Models: Tests Using Aerospace Industry Acquisition Programs

    DTIC Science & Technology

    1992-12-01

    suspect that, to the extent prediction bias is positively correlated among the various models, the random walk, learning curve, fixed-variable and Bemis...Functions, Production Rate Adjustment Model, Learning Curve Model, Random Walk Model, Bemis Model, Evaluating Model Bias, Cost Prediction Bias, Cost...of four cost progress models--a random walk model, the traditional learning curve model, a production rate model (fixed-variable model), and a model

  12. Experience with turbulence interaction and turbulence-chemistry models at Fluent Inc.

    NASA Technical Reports Server (NTRS)

    Choudhury, D.; Kim, S. E.; Tselepidakis, D. P.; Missaghi, M.

    1995-01-01

    This viewgraph presentation discusses (1) turbulence modeling: challenges in turbulence modeling, desirable attributes of turbulence models, turbulence models in FLUENT, and examples using FLUENT; and (2) combustion modeling: turbulence-chemistry interaction and FLUENT equilibrium model. As of now, three turbulence models are provided: the conventional k-epsilon model, the renormalization group model, and the Reynolds-stress model. The renormalization group k-epsilon model has broadened the range of applicability of two-equation turbulence models. The Reynolds-stress model has proved useful for strongly anisotropic flows such as those encountered in cyclones, swirlers, and combustors. Issues remain, such as near-wall closure, with all classes of models.

  13. Leadership Models.

    ERIC Educational Resources Information Center

    Freeman, Thomas J.

    This paper discusses six different models of organizational structure and leadership, including the scalar chain or pyramid model, the continuum model, the grid model, the linking pin model, the contingency model, and the circle or democratic model. Each model is examined in a separate section that describes the model and its development, lists…

  14. SUMMA and Model Mimicry: Understanding Differences Among Land Models

    NASA Astrophysics Data System (ADS)

    Nijssen, B.; Nearing, G. S.; Ou, G.; Clark, M. P.

    2016-12-01

    Model inter-comparison and model ensemble experiments suffer from an inability to explain the mechanisms behind differences in model outcomes. We can clearly demonstrate that the models are different, but we cannot necessarily identify the reasons why, because most models exhibit myriad differences in process representations, model parameterizations, model parameters and numerical solution methods. This inability to identify the reasons for differences in model performance hampers our understanding and limits model improvement, because we cannot easily identify the most promising paths forward. We have developed the Structure for Unifying Multiple Modeling Alternatives (SUMMA) to allow for controlled experimentation with model construction, numerical techniques, and parameter values and therefore isolate differences in model outcomes to specific choices during the model development process. In developing SUMMA, we recognized that hydrologic models can be thought of as individual instantiations of a master modeling template that is based on a common set of conservation equations for energy and water. Given this perspective, SUMMA provides a unified approach to hydrologic modeling that integrates different modeling methods into a consistent structure with the ability to instantiate alternative hydrologic models at runtime. Here we employ SUMMA to revisit a previous multi-model experiment and demonstrate its use for understanding differences in model performance. Specifically, we implement SUMMA to mimic the spread of behaviors exhibited by the land models that participated in the Protocol for the Analysis of Land Surface Models (PALS) Land Surface Model Benchmarking Evaluation Project (PLUMBER) and draw conclusions about the relative performance of specific model parameterizations for water and energy fluxes through the soil-vegetation continuum. SUMMA's ability to mimic the spread of model ensembles and the behavior of individual models can be an important tool in focusing model development and improvement efforts.

  15. Seven Modeling Perspectives on Teaching and Learning: Some Interrelations and Cognitive Effects

    ERIC Educational Resources Information Center

    Easley, J. A., Jr.

    1977-01-01

    The categories of models associated with the seven perspectives are designated as combinatorial models, sampling models, cybernetic models, game models, critical thinking models, ordinary language analysis models, and dynamic structural models. (DAG)

  16. Pursuing the method of multiple working hypotheses to understand differences in process-based snow models

    NASA Astrophysics Data System (ADS)

    Clark, Martyn; Essery, Richard

    2017-04-01

    When faced with the complex and interdisciplinary challenge of building process-based land models, different modelers make different decisions at different points in the model development process. These modeling decisions are generally based on several considerations, including fidelity (e.g., what approaches faithfully simulate observed processes), complexity (e.g., which processes should be represented explicitly), practicality (e.g., what is the computational cost of the model simulations; are there sufficient resources to implement the desired modeling concepts), and data availability (e.g., are there sufficient data to force and evaluate models). Consequently the research community, comprising modelers of diverse background, experience, and modeling philosophy, has amassed a wide range of models, which differ in almost every aspect of their conceptualization and implementation. Model comparison studies have been undertaken to explore model differences, but have not been able to meaningfully attribute inter-model differences in predictive ability to individual model components because there are often too many structural and implementation differences among the different models considered. As a consequence, model comparison studies to date have provided limited insight into the causes of differences in model behavior, and model development has often relied on the inspiration and experience of individual modelers rather than on a systematic analysis of model shortcomings. This presentation will summarize the use of "multiple-hypothesis" modeling frameworks to understand differences in process-based snow models. Multiple-hypothesis frameworks define a master modeling template, and include a wide variety of process parameterizations and spatial configurations that are used in existing models. Such frameworks provide the capability to decompose complex models into the individual decisions that are made as part of model development, and evaluate each decision in isolation. It is hence possible to attribute differences in system-scale model predictions to individual modeling decisions, providing scope to mimic the behavior of existing models, understand why models differ, characterize model uncertainty, and identify productive pathways to model improvement. Results will be presented applying multiple-hypothesis frameworks to snow model comparison projects, including PILPS, SnowMIP, and the upcoming ESM-SnowMIP project.

  17. Research on Multi - Person Parallel Modeling Method Based on Integrated Model Persistent Storage

    NASA Astrophysics Data System (ADS)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying

    2018-03-01

    This paper studies a multi-person parallel modeling method based on persistent storage of an integrated model. The integrated model refers to a set of MDDT modeling graphics systems that can describe general-purpose aerospace embedded software from multiple angles, at multiple levels, and across multiple stages. Persistent storage refers to converting the in-memory data model into a storage model and converting the storage model back into an in-memory data model, where the data model is the object model and the storage model is a binary stream. Multi-person parallel modeling refers to the need for multi-person collaboration, separation of roles, and even real-time remote synchronized modeling.
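
    The persistence described here is ordinary serialization: an in-memory object model is turned into a binary stream and later turned back into objects. A minimal Python illustration of that round trip (the class below is a hypothetical stand-in, not the MDDT object model):

        import pickle
        from dataclasses import dataclass, field

        @dataclass
        class ModelElement:
            """Stand-in for one node of an in-memory object model."""
            name: str
            children: list = field(default_factory=list)

        root = ModelElement("system", [ModelElement("task_a"), ModelElement("task_b")])

        # Data model -> storage model (binary stream) written to disk
        with open("model.bin", "wb") as f:
            f.write(pickle.dumps(root))

        # Storage model -> data model read back into memory
        with open("model.bin", "rb") as f:
            restored = pickle.loads(f.read())
        print(restored.children[0].name)   # "task_a"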

  18. Constructive Epistemic Modeling: A Hierarchical Bayesian Model Averaging Method

    NASA Astrophysics Data System (ADS)

    Tsai, F. T. C.; Elshall, A. S.

    2014-12-01

    Constructive epistemic modeling is the idea that our understanding of a natural system through a scientific model is a mental construct that continually develops through learning about and from the model. Using the hierarchical Bayesian model averaging (HBMA) method [1], this study shows that segregating different uncertain model components through a BMA tree of posterior model probabilities, model prediction, within-model variance, between-model variance and total model variance serves as a learning tool [2]. First, the BMA tree of posterior model probabilities permits the comparative evaluation of the candidate propositions of each uncertain model component. Second, systemic model dissection is imperative for understanding the individual contribution of each uncertain model component to the model prediction and variance. Third, the hierarchical representation of the between-model variance facilitates the prioritization of the contribution of each uncertain model component to the overall model uncertainty. We illustrate these concepts using the groundwater modeling of a siliciclastic aquifer-fault system. The sources of uncertainty considered are from geological architecture, formation dip, boundary conditions and model parameters. The study shows that the HBMA analysis helps in advancing knowledge about the model rather than forcing the model to fit a particular understanding or merely averaging several candidate models. [1] Tsai, F. T.-C., and A. S. Elshall (2013), Hierarchical Bayesian model averaging for hydrostratigraphic modeling: Uncertainty segregation and comparative evaluation. Water Resources Research, 49, 5520-5536, doi:10.1002/wrcr.20428. [2] Elshall, A.S., and F. T.-C. Tsai (2014). Constructive epistemic modeling of groundwater flow with geological architecture and boundary condition uncertainty under Bayesian paradigm, Journal of Hydrology, 517, 105-119, doi: 10.1016/j.jhydrol.2014.05.027.
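
    A minimal numerical sketch of the variance decomposition behind a BMA tree may help fix ideas: for a hypothetical two-level hierarchy (an uncertain geological-architecture component whose branches each contain two candidate boundary-condition models), the snippet below propagates weighted means, within-model variance, and between-model variance up the tree. All probabilities, predictions, and variances are invented; this is not the HBMA implementation of Tsai and Elshall.

```python
import numpy as np

# Hypothetical two-level hierarchy; all numbers are invented for illustration.
tree = {
    "geology_A": {"p": 0.6, "models": [            # posterior probability of branch
        {"p": 0.5, "mean": 10.0, "var": 1.0},
        {"p": 0.5, "mean": 12.0, "var": 1.5},
    ]},
    "geology_B": {"p": 0.4, "models": [
        {"p": 0.7, "mean": 15.0, "var": 2.0},
        {"p": 0.3, "mean": 14.0, "var": 1.0},
    ]},
}

def branch_stats(models):
    """BMA mean, within-model and between-model variance for one branch."""
    w = np.array([m["p"] for m in models])
    mu = np.array([m["mean"] for m in models])
    var = np.array([m["var"] for m in models])
    mean = np.sum(w * mu)
    within = np.sum(w * var)
    between = np.sum(w * (mu - mean) ** 2)
    return mean, within, between

branch = {name: branch_stats(b["models"]) for name, b in tree.items()}
w_top = np.array([b["p"] for b in tree.values()])
means = np.array([s[0] for s in branch.values()])

top_mean = np.sum(w_top * means)
# Variance carried up from the lower level (within + between of each branch)...
lower_var = np.sum(w_top * np.array([s[1] + s[2] for s in branch.values()]))
# ...plus between-branch variance attributable to the "geology" component.
geology_var = np.sum(w_top * (means - top_mean) ** 2)

print(f"HBMA mean prediction           : {top_mean:.2f}")
print(f"variance from lower level      : {lower_var:.2f}")
print(f"variance from geology component: {geology_var:.2f}")
print(f"total predictive variance      : {lower_var + geology_var:.2f}")
```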

  19. The Influence of a Model's Reinforcement Contingency and Affective Response on Children's Perceptions of the Model

    ERIC Educational Resources Information Center

    Thelen, Mark H.; And Others

    1977-01-01

    Assesses the influence of model consequences on perceived model affect and, conversely, assesses the influence of model affect on perceived model consequences. Also appraises the influence of model consequences and model affect on perceived model attractiveness, perceived model competence, and perceived task attractiveness. (Author/RK)

  20. Bayesian Model Averaging of Artificial Intelligence Models for Hydraulic Conductivity Estimation

    NASA Astrophysics Data System (ADS)

    Nadiri, A.; Chitsazan, N.; Tsai, F. T.; Asghari Moghaddam, A.

    2012-12-01

    This research presents a Bayesian artificial intelligence model averaging (BAIMA) method that incorporates multiple artificial intelligence (AI) models to estimate hydraulic conductivity and evaluate estimation uncertainties. Uncertainty in the AI model outputs stems from error in model input as well as non-uniqueness in selecting different AI methods. Using one single AI model tends to bias the estimation and underestimate uncertainty. BAIMA employs the Bayesian model averaging (BMA) technique to address the issue of using one single AI model for estimation. BAIMA estimates hydraulic conductivity by averaging the outputs of AI models according to their model weights. In this study, the model weights were determined using the Bayesian information criterion (BIC), which follows the parsimony principle. BAIMA calculates the within-model variances to account for uncertainty propagation from input data to AI model output. Between-model variances are evaluated to account for uncertainty due to model non-uniqueness. We employed Takagi-Sugeno fuzzy logic (TS-FL), artificial neural network (ANN) and neurofuzzy (NF) models to estimate hydraulic conductivity for the Tasuj plain aquifer, Iran. BAIMA combined the three AI models and produced a better fit than the individual models. While NF was expected to be the best AI model owing to its utilization of both TS-FL and ANN models, the NF model was nearly discarded by the parsimony principle. The TS-FL model and the ANN model showed equal importance although their hydraulic conductivity estimates were quite different. This resulted in significant between-model variances that are normally ignored when using a single AI model.
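
    The sketch below illustrates, under simplifying assumptions, how BIC-based BMA weights can nearly discard a model with a clearly larger BIC while averaging the remaining estimates and separating within-model from between-model variance. The BIC values, estimates, and variances are invented, and the exp(-0.5*dBIC) weighting is a common approximation rather than the exact procedure used in BAIMA.

```python
import numpy as np

# Hypothetical BIC values for three AI models (TS-FL, ANN, NF); lower is better.
# All numbers are invented for illustration only.
bic = np.array([120.0, 121.0, 135.0])          # TS-FL, ANN, NF
means = np.array([3.2, 4.1, 3.6])              # per-model K estimates (m/day)
variances = np.array([0.4, 0.5, 0.3])          # per-model (within) variances

# Common BMA approximation: weight_k proportional to exp(-0.5 * deltaBIC_k)
delta = bic - bic.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()

bma_mean = np.sum(weights * means)
within_var = np.sum(weights * variances)
between_var = np.sum(weights * (means - bma_mean) ** 2)

for name, w in zip(["TS-FL", "ANN", "NF"], weights):
    print(f"{name:5s} weight = {w:.3f}")
print(f"BMA estimate = {bma_mean:.2f}, within-var = {within_var:.2f}, "
      f"between-var = {between_var:.2f}")
```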

  1. A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services

    NASA Astrophysics Data System (ADS)

    Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.

    2015-12-01

    Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be conserved in their own local environment, thus making it easy for modelers to maintain and update the models. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models as web services, (2) the model metadata describing the external features of a model (e.g., variable name, unit, computational grid, etc.) and (3) a model integration framework. We present the architecture of coupling web service models that are self-describing by utilizing a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing by uncovering models' metadata through BMI functions. After a BMI-enabled model is deployed as a service, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2015), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing model interfaces using BMI, as well as providing a set of utilities that smooth the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute and find the dependencies of the BMI-enabled web service models. By using the revised EMELI, an example will be presented on integrating a set of topoflow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. (2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014, 11th International Conf. on Hydroinformatics, New York, NY.
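
    To make the self-describing interface idea concrete, the sketch below wraps a toy linear-reservoir model in a class whose method names loosely follow the CSDMS BMI convention (initialize/update/finalize, get_value, get_input_var_names, and so on); exact BMI signatures differ between specification versions, and the model, its variables, and the client loop are invented. A web-service layer such as the one described above would expose these same calls over HTTP rather than in-process.

```python
# A minimal, illustrative BMI-style wrapper for a toy model.
class ToyRunoffBMI:
    def __init__(self):
        self._storage = 0.0
        self._runoff = 0.0
        self._time = 0.0
        self._dt = 1.0  # days

    # --- control functions ---------------------------------------------
    def initialize(self, config_file=None):
        self._storage, self._runoff, self._time = 10.0, 0.0, 0.0

    def update(self):
        # toy linear-reservoir step: runoff is 10% of storage per time step
        self._runoff = 0.1 * self._storage
        self._storage -= self._runoff
        self._time += self._dt

    def finalize(self):
        pass

    # --- self-description functions (what a service client would query) --
    def get_input_var_names(self):
        return ("reservoir_storage",)

    def get_output_var_names(self):
        return ("runoff_rate",)

    def get_current_time(self):
        return self._time

    def get_value(self, name):
        return {"reservoir_storage": self._storage, "runoff_rate": self._runoff}[name]

    def set_value(self, name, value):
        if name == "reservoir_storage":
            self._storage = value


# A client (or a web-service layer exposing these calls over the web) only
# needs the interface, not the model internals:
model = ToyRunoffBMI()
model.initialize()
for _ in range(3):
    model.update()
    print(model.get_current_time(), model.get_value("runoff_rate"))
model.finalize()
```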

  2. Maximum likelihood Bayesian model averaging and its predictive analysis for groundwater reactive transport models

    USGS Publications Warehouse

    Curtis, Gary P.; Lu, Dan; Ye, Ming

    2015-01-01

    While Bayesian model averaging (BMA) has been widely used in groundwater modeling, it is infrequently applied to groundwater reactive transport modeling because of multiple sources of uncertainty in the coupled hydrogeochemical processes and because of the long execution time of each model run. To resolve these problems, this study analyzed different levels of uncertainty in a hierarchical way, and used the maximum likelihood version of BMA, i.e., MLBMA, to improve the computational efficiency. This study demonstrates the applicability of MLBMA to groundwater reactive transport modeling in a synthetic case in which twenty-seven reactive transport models were designed to predict the reactive transport of hexavalent uranium (U(VI)) based on observations at a former uranium mill site near Naturita, CO. These reactive transport models contain three uncertain model components, i.e., parameterization of hydraulic conductivity, configuration of model boundary, and surface complexation reactions that simulate U(VI) adsorption. These uncertain model components were aggregated into the alternative models by integrating a hierarchical structure into MLBMA. The modeling results of the individual models and MLBMA were analyzed to investigate their predictive performance. The predictive logscore results show that MLBMA generally outperforms the best model, suggesting that using MLBMA is a sound strategy to achieve more robust model predictions relative to a single model. MLBMA works best when the alternative models are structurally distinct and have diverse model predictions. When correlation in model structure exists, two strategies were used to improve predictive performance by retaining structurally distinct models or assigning smaller prior model probabilities to correlated models. Since the synthetic models were designed using data from the Naturita site, the results of this study are expected to provide guidance for real-world modeling. Limitations of applying MLBMA to the synthetic study and future real-world modeling are discussed.

  3. Automated model integration at source code level: An approach for implementing models into the NASA Land Information System

    NASA Astrophysics Data System (ADS)

    Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.

    2014-12-01

    Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment unless they were designed for it. An example is implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise and may take a developer months to learn LIS and the model software structure. Debugging and testing of the model implementation is also time-consuming because neither LIS nor the model is fully understood at the outset. This time spent is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. With this in mind, a general model interface was designed to retrieve the forcing inputs, parameters, and state variables needed by the model and to provide state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development requires only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models. Code templates defined for this general model interface can be re-used with any specific model, so the model implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It allows model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80-90% of the development load is reduced. In this presentation, the automated model implementation approach is described along with the LIS programming interfaces, the general model interface, and five case studies, including a regression model, Noah-MP, FASST, SAC-HTET/SNOW-17, and FLake. These models vary in complexity and software structure. We will also describe how these complexities were overcome using this approach and present results of model benchmarks within LIS.
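
    The general idea of template-driven wrapper generation can be sketched in a few lines: given a small model specification (names of input and output variables), a code generator fills a wrapper template that moves data between a driver and the native model. The sketch below is written in Python with an invented FORTRAN-like template and invented driver calls; the actual toolkit described above uses Excel/VBA and LIS-specific FORTRAN 90 templates.

```python
# Minimal sketch of template-based wrapper generation; the template text,
# model spec, and driver_get/driver_put calls are all hypothetical.
WRAPPER_TEMPLATE = """\
subroutine {model_name}_wrapper(n)
  use {model_name}_module
  implicit none
  integer, intent(in) :: n
  ! --- retrieve forcing and state variables from the driver ---
{get_lines}
  ! --- call the native model ---
  call {model_name}_main(n)
  ! --- return updated states and outputs to the driver ---
{put_lines}
end subroutine {model_name}_wrapper
"""

def generate_wrapper(spec):
    get_lines = "\n".join(f"  call driver_get('{v}', {v})" for v in spec["inputs"])
    put_lines = "\n".join(f"  call driver_put('{v}', {v})" for v in spec["outputs"])
    return WRAPPER_TEMPLATE.format(model_name=spec["name"],
                                   get_lines=get_lines, put_lines=put_lines)

spec = {
    "name": "toy_lsm",
    "inputs": ["air_temperature", "precipitation", "soil_moisture"],
    "outputs": ["soil_moisture", "latent_heat_flux"],
}
print(generate_wrapper(spec))
```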

  4. Literature review of models on tire-pavement interaction noise

    NASA Astrophysics Data System (ADS)

    Li, Tan; Burdisso, Ricardo; Sandu, Corina

    2018-04-01

    Tire-pavement interaction noise (TPIN) becomes dominant at speeds above 40 km/h for passenger vehicles and 70 km/h for trucks. Several models have been developed to describe and predict the TPIN. However, these models do not fully reveal the physical mechanisms or predict TPIN accurately. It is well known that all the models have both strengths and weaknesses, and different models fit different investigation purposes or conditions. The numerous papers that present these models are widely scattered among thousands of journals, and it is difficult to get the complete picture of the status of research in this area. This review article aims at presenting the history and current state of TPIN models systematically, making it easier to identify and distribute the key knowledge and opinions, and providing insight into the future research trend in this field. In this work, over 2000 references related to TPIN were collected, and 74 models were reviewed from nearly 200 selected references; these were categorized into deterministic models (37), statistical models (18), and hybrid models (19). The sections explaining the models are self-contained with key principles, equations, and illustrations included. The deterministic models were divided into three sub-categories: conventional physics models, finite element and boundary element models, and computational fluid dynamics models; the statistical models were divided into three sub-categories: traditional regression models, principal component analysis models, and fuzzy curve-fitting models; the hybrid models were divided into three sub-categories: tire-pavement interface models, mechanism separation models, and noise propagation models. At the end of each category of models, a summary table is presented to compare these models with the key information extracted. Readers may refer to these tables to find models of their interest. The strengths and weaknesses of the models in different categories were then analyzed. Finally, the modeling trend and future direction in this area are given.

  5. Multi-Model Combination techniques for Hydrological Forecasting: Application to Distributed Model Intercomparison Project Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ajami, N K; Duan, Q; Gao, X

    2005-04-11

    This paper examines several multi-model combination techniques: the Simple Multi-model Average (SMA), the Multi-Model Super Ensemble (MMSE), Modified Multi-Model Super Ensemble (M3SE) and the Weighted Average Method (WAM). These model combination techniques were evaluated using the results from the Distributed Model Intercomparison Project (DMIP), an international project sponsored by the National Weather Service (NWS) Office of Hydrologic Development (OHD). All of the multi-model combination results were obtained using uncalibrated DMIP model outputs and were compared against the best uncalibrated as well as the best calibrated individual model results. The purpose of this study is to understand how different combination techniques affect the skill levels of the multi-model predictions. This study revealed that the multi-model predictions obtained from uncalibrated single model predictions are generally better than any single member model predictions, even the best calibrated single model predictions. Furthermore, more sophisticated multi-model combination techniques that incorporated bias correction steps work better than simple multi-model average predictions or multi-model predictions without bias correction.
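
    A minimal sketch of the two simplest combination schemes may be useful: the unweighted Simple Multi-model Average and a weighted average whose weights are obtained here by a least-squares fit to observations over a calibration period. The streamflow numbers are invented, and the least-squares weighting is only a stand-in for the calibration procedures used in the study.

```python
import numpy as np

# Illustrative streamflow predictions from three hypothetical uncalibrated
# models and the corresponding observations; all numbers are invented.
obs = np.array([10.0, 14.0, 20.0, 18.0, 12.0])
preds = np.array([
    [ 9.0, 13.0, 22.0, 17.0, 11.0],   # model A
    [12.0, 16.0, 18.0, 20.0, 14.0],   # model B
    [ 8.0, 12.0, 19.0, 16.0, 10.0],   # model C
])

# Simple Multi-model Average (SMA): unweighted mean of the member models.
sma = preds.mean(axis=0)

# Weighted average, with weights from a least-squares fit of the observations
# on the member predictions over the calibration period.
weights, *_ = np.linalg.lstsq(preds.T, obs, rcond=None)
wam = preds.T @ weights

def rmse(sim):
    return np.sqrt(np.mean((sim - obs) ** 2))

for name, sim in [("model A alone", preds[0]), ("SMA", sma), ("weighted", wam)]:
    print(f"{name:14s} RMSE = {rmse(sim):.2f}")
```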

  6. Expert models and modeling processes associated with a computer-modeling tool

    NASA Astrophysics Data System (ADS)

    Zhang, Baohui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-07-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using think aloud technique and video recording, we captured their computer screen modeling activities and thinking processes. We also interviewed them the day following their modeling sessions to further probe the rationale of their modeling practices. We analyzed both the audio-video transcripts and the experts' models. We found the experts' modeling processes followed the linear sequence built in the modeling program with few instances of moving back and forth. They specified their goals up front and spent a long time thinking through an entire model before acting. They specified relationships with accurate and convincing evidence. Factors (i.e., variables) in expert models were clustered, and represented by specialized technical terms. Based on the above findings, we made suggestions for improving model-based science teaching and learning using Model-It.

  7. Illustrating a Model-Game-Model Paradigm for Using Human Wargames in Analysis

    DTIC Science & Technology

    2017-02-01

    Working Paper: Illustrating a Model-Game-Model Paradigm for Using Human Wargames in Analysis. Paul K. Davis, RAND National Security Research... The paper proposes and illustrates an analysis-centric paradigm (model-game-model, or what might be better called model-exercise-model in some cases) for... to involve stakeholders in model development from the outset. The model-game-model paradigm was illustrated in an application to crisis planning.

  8. Multi-model analysis of terrestrial carbon cycles in Japan: limitations and implications of model calibration using eddy flux observations

    NASA Astrophysics Data System (ADS)

    Ichii, K.; Suzuki, T.; Kato, T.; Ito, A.; Hajima, T.; Ueyama, M.; Sasai, T.; Hirata, R.; Saigusa, N.; Ohtani, Y.; Takagi, K.

    2010-07-01

    Terrestrial biosphere models show large differences when simulating carbon and water cycles, and reducing these differences is a priority for developing more accurate estimates of the condition of terrestrial ecosystems and future climate change. To reduce uncertainties and improve the understanding of their carbon budgets, we investigated the utility of eddy flux datasets to improve model simulations and reduce the variability among multi-model outputs of terrestrial biosphere models in Japan. Using 9 terrestrial biosphere models (Support Vector Machine-based regressions, TOPS, CASA, VISIT, Biome-BGC, DAYCENT, SEIB, LPJ, and TRIFFID), we conducted two simulations: (1) point simulations at four eddy flux sites in Japan and (2) spatial simulations for Japan with a default model (based on original settings) and a modified model (based on model parameter tuning using eddy flux data). Generally, models using default settings showed large deviations of model outputs from observations, with large model-by-model variability. However, after we calibrated the model parameters using eddy flux data (GPP, RE and NEP), most models successfully simulated seasonal variations in the carbon cycle, with less variability among models. We also found that interannual variations in the carbon cycle are mostly consistent among models and observations. Spatial analysis also showed a large reduction in the variability among model outputs. This study demonstrated that careful validation and calibration of models with available eddy flux data reduced model-by-model differences. Yet site history, analysis of changes in model structure, and a more objective model calibration procedure should be included in further analysis.

  9. Conceptual and logical level of database modeling

    NASA Astrophysics Data System (ADS)

    Hunka, Frantisek; Matula, Jiri

    2016-06-01

    Conceptual and logical levels form the topmost levels of database modeling. Usually, ORM (Object Role Modeling) and ER diagrams are utilized to capture the corresponding schema. The final aim of business process modeling is to store its results in the form of a database solution. For this reason, value-oriented business process modeling, which utilizes ER diagrams to express the modeled entities and the relationships between them, is used. However, ER diagrams form the logical level of the database schema. To extend the possibilities of different business process modeling methodologies, the conceptual level of database modeling is needed. The paper deals with the REA value modeling approach to business process modeling using ER diagrams, and derives a conceptual model utilizing the ORM modeling approach. The conceptual model extends the possibilities of value modeling to other business modeling approaches.

  10. BiGG Models: A platform for integrating, standardizing and sharing genome-scale models

    DOE PAGES

    King, Zachary A.; Lu, Justin; Drager, Andreas; ...

    2015-10-17

    In this study, genome-scale metabolic models are mathematically structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data.

  11. BiGG Models: A platform for integrating, standardizing and sharing genome-scale models

    PubMed Central

    King, Zachary A.; Lu, Justin; Dräger, Andreas; Miller, Philip; Federowicz, Stephen; Lerman, Joshua A.; Ebrahim, Ali; Palsson, Bernhard O.; Lewis, Nathan E.

    2016-01-01

    Genome-scale metabolic models are mathematically-structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data. PMID:26476456

  12. Service-oriented model-encapsulation strategy for sharing and integrating heterogeneous geo-analysis models in an open web environment

    NASA Astrophysics Data System (ADS)

    Yue, Songshan; Chen, Min; Wen, Yongning; Lu, Guonian

    2016-04-01

    Earth environment is extremely complicated and constantly changing; thus, it is widely accepted that the use of a single geo-analysis model cannot accurately represent all details when solving complex geo-problems. Over several years of research, numerous geo-analysis models have been developed. However, a collaborative barrier between model providers and model users still exists. The development of cloud computing has provided a new and promising approach for sharing and integrating geo-analysis models across an open web environment. To share and integrate these heterogeneous models, encapsulation studies should be conducted that are aimed at shielding original execution differences to create services which can be reused in the web environment. Although some model service standards (such as Web Processing Service (WPS) and Geo Processing Workflow (GPW)) have been designed and developed to help researchers construct model services, various problems regarding model encapsulation remain. (1) The descriptions of geo-analysis models are complicated and typically require rich-text descriptions and case-study illustrations, which are difficult to fully represent within a single web request (such as the GetCapabilities and DescribeProcess operations in the WPS standard). (2) Although Web Service technologies can be used to publish model services, model users who want to use a geo-analysis model and copy the model service into another computer still encounter problems (e.g., they cannot access the model deployment dependencies information). This study presents a strategy for encapsulating geo-analysis models to reduce problems encountered when sharing models between model providers and model users and supports the tasks with different web service standards (e.g., the WPS standard). A description method for heterogeneous geo-analysis models is studied. Based on the model description information, the methods for encapsulating the model-execution program to model services and for describing model-service deployment information are also included in the proposed strategy. Hence, the model-description interface, model-execution interface and model-deployment interface are studied to help model providers and model users more easily share, reuse and integrate geo-analysis models in an open web environment. Finally, a prototype system is established, and the WPS standard is employed as an example to verify the capability and practicability of the model-encapsulation strategy. The results show that it is more convenient for modellers to share and integrate heterogeneous geo-analysis models in cloud computing platforms.

  13. Object-oriented biomedical system modelling--the language.

    PubMed

    Hakman, M; Groth, T

    1999-11-01

    The paper describes a new object-oriented biomedical continuous system modelling language (OOBSML). It is fully object-oriented and supports model inheritance, encapsulation, and model component instantiation and behaviour polymorphism. Besides the traditional differential and algebraic equation expressions the language includes also formal expressions for documenting models and defining model quantity types and quantity units. It supports explicit definition of model input-, output- and state quantities, model components and component connections. The OOBSML model compiler produces self-contained, independent, executable model components that can be instantiated and used within other OOBSML models and/or stored within model and model component libraries. In this way complex models can be structured as multilevel, multi-component model hierarchies. Technically the model components produced by the OOBSML compiler are executable computer code objects based on distributed object and object request broker technology. This paper includes both the language tutorial and the formal language syntax and semantic description.

  14. Fitting IRT Models to Dichotomous and Polytomous Data: Assessing the Relative Model-Data Fit of Ideal Point and Dominance Models

    ERIC Educational Resources Information Center

    Tay, Louis; Ali, Usama S.; Drasgow, Fritz; Williams, Bruce

    2011-01-01

    This study investigated the relative model-data fit of an ideal point item response theory (IRT) model (the generalized graded unfolding model [GGUM]) and dominance IRT models (e.g., the two-parameter logistic model [2PLM] and Samejima's graded response model [GRM]) to simulated dichotomous and polytomous data generated from each of these models.…

  15. Comparing and combining process-based crop models and statistical models with some implications for climate change

    NASA Astrophysics Data System (ADS)

    Roberts, Michael J.; Braun, Noah O.; Sinclair, Thomas R.; Lobell, David B.; Schlenker, Wolfram

    2017-09-01

    We compare predictions of a simple process-based crop model (Soltani and Sinclair 2012), a simple statistical model (Schlenker and Roberts 2009), and a combination of both models to actual maize yields on a large, representative sample of farmer-managed fields in the Corn Belt region of the United States. After statistical post-model calibration, the process model (Simple Simulation Model, or SSM) predicts actual outcomes slightly better than the statistical model, but the combined model performs significantly better than either model. The SSM, statistical model and combined model all show similar relationships with precipitation, while the SSM better accounts for temporal patterns of precipitation, vapor pressure deficit and solar radiation. The statistical and combined models show a more negative impact associated with extreme heat for which the process model does not account. Due to the extreme heat effect, predicted impacts under uniform climate change scenarios are considerably more severe for the statistical and combined models than for the process-based model.

  16. An empirical model to forecast solar wind velocity through statistical modeling

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Ridley, A. J.

    2013-12-01

    The accurate prediction of the solar wind velocity has been a major challenge in the space weather community. Previous studies proposed many empirical and semi-empirical models to forecast the solar wind velocity based on either historical observations, e.g. the persistence model, or instantaneous observations of the sun, e.g. the Wang-Sheeley-Arge model. In this study, we use the one-minute WIND data from January 1995 to August 2012 to investigate and compare the performances of 4 models often used in the literature, here referred to as the null model, the persistence model, the one-solar-rotation-ago model, and the Wang-Sheeley-Arge model. It is found that, measured by root mean square error, the persistence model gives the most accurate predictions within two days. Beyond two days, the Wang-Sheeley-Arge model serves as the best model, though it only slightly outperforms the null model and the one-solar-rotation-ago model. Finally, we apply least-squares regression to linearly combine the null model, the persistence model, and the one-solar-rotation-ago model into a 'general persistence model'. By comparing its performance against the 4 aforementioned models, it is found that the general persistence model outperforms the other 4 models within five days. Due to its great simplicity and superb performance, we believe that the general persistence model can serve as a benchmark in the forecast of solar wind velocity and has the potential to be modified to arrive at better models.
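
    The "general persistence model" idea, i.e., a least-squares linear combination of the null, persistence, and one-solar-rotation-ago predictors, can be sketched as below on a synthetic velocity series (the real study uses one-minute WIND data, which are not reproduced here). The series, lead time, and split between fitting and evaluation periods are all invented for illustration.

```python
import numpy as np

# Synthetic hourly solar wind speed series (km/s); not real WIND data.
rng = np.random.default_rng(0)
t = np.arange(3000)
v = 400 + 50 * np.sin(2 * np.pi * t / 650) + 20 * rng.standard_normal(t.size)

lead = 48                 # forecast lead time (hours)
rotation = 27 * 24        # one solar rotation (hours)

# Predict v[i + lead] from three simple candidates: the long-term mean (null
# model), the current value (persistence), and the value one solar rotation
# before the forecast time (recurrence).
idx = np.arange(rotation, v.size - lead)
X = np.column_stack([
    np.full(idx.size, v.mean()),          # null
    v[idx],                               # persistence
    v[idx + lead - rotation],             # one-solar-rotation-ago
])
y = v[idx + lead]

split = idx.size // 2
coef, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)  # "general persistence"

rmse = lambda pred: np.sqrt(np.mean((pred - y[split:]) ** 2))
print("null        RMSE:", round(rmse(X[split:, 0]), 1))
print("persistence RMSE:", round(rmse(X[split:, 1]), 1))
print("recurrence  RMSE:", round(rmse(X[split:, 2]), 1))
print("combined    RMSE:", round(rmse(X[split:] @ coef), 1))
```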

  17. A Primer for Model Selection: The Decisive Role of Model Complexity

    NASA Astrophysics Data System (ADS)

    Höge, Marvin; Wöhling, Thomas; Nowak, Wolfgang

    2018-03-01

    Selecting a "best" model among several competing candidate models poses an often encountered problem in water resources modeling (and other disciplines which employ models). For a modeler, the best model fulfills a certain purpose best (e.g., flood prediction), which is typically assessed by comparing model simulations to data (e.g., stream flow). Model selection methods find the "best" trade-off between good fit with data and model complexity. In this context, the interpretations of model complexity implied by different model selection methods are crucial, because they represent different underlying goals of modeling. Over the last decades, numerous model selection criteria have been proposed, but modelers who primarily want to apply a model selection criterion often face a lack of guidance for choosing the right criterion that matches their goal. We propose a classification scheme for model selection criteria that helps to find the right criterion for a specific goal, i.e., which employs the correct complexity interpretation. We identify four model selection classes which seek to achieve high predictive density, low predictive error, high model probability, or shortest compression of data. These goals can be achieved by following either nonconsistent or consistent model selection and by either incorporating a Bayesian parameter prior or not. We allocate commonly used criteria to these four classes, analyze how they represent model complexity and what this means for the model selection task. Finally, we provide guidance on choosing the right type of criteria for specific model selection tasks. (A quick guide through all key points is given at the end of the introduction.)

  18. Women's Endorsement of Models of Sexual Response: Correlates and Predictors.

    PubMed

    Nowosielski, Krzysztof; Wróbel, Beata; Kowalczyk, Robert

    2016-02-01

    Few studies have investigated endorsement of female sexual response models, and no single model has been accepted as a normative description of women's sexual response. The aim of the study was to establish how women from a population-based sample endorse current theoretical models of the female sexual response--the linear models and circular model (partial and composite Basson models)--as well as predictors of endorsement. Accordingly, 174 heterosexual women aged 18-55 years were included in a cross-sectional study: 74 women diagnosed with female sexual dysfunction (FSD) based on DSM-5 criteria and 100 non-dysfunctional women. The description of sexual response models was used to divide subjects into four subgroups: linear (Masters-Johnson and Kaplan models), circular (partial Basson model), mixed (linear and circular models in similar proportions, reflective of the composite Basson model), and a different model. Women were asked to choose which of the models best described their pattern of sexual response and how frequently they engaged in each model. Results showed that 28.7% of women endorsed the linear models, 19.5% the partial Basson model, 40.8% the composite Basson model, and 10.9% a different model. Women with FSD endorsed the partial Basson model and a different model more frequently than did non-dysfunctional controls. Individuals who were dissatisfied with a partner as a lover were more likely to endorse a different model. Based on the results, we concluded that the majority of women endorsed a mixed model combining the circular response with the possibility of an innate desire triggering a linear response. Further, relationship difficulties, not FSD, predicted model endorsement.

  19. The Use of Modeling-Based Text to Improve Students' Modeling Competencies

    ERIC Educational Resources Information Center

    Jong, Jing-Ping; Chiu, Mei-Hung; Chung, Shiao-Lan

    2015-01-01

    This study investigated the effects of a modeling-based text on 10th graders' modeling competencies. Fifteen 10th graders read a researcher-developed modeling-based science text on the ideal gas law that included explicit descriptions and representations of modeling processes (i.e., model selection, model construction, model validation, model…

  20. Performance and Architecture Lab Modeling Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-06-19

    Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into subproblems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules. Given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior. The model -- an executable program -- is a hierarchical composition of annotation functions, synthesized functions, statistics for runtime values, and performance measurements.

  1. Maximum likelihood Bayesian model averaging and its predictive analysis for groundwater reactive transport models

    DOE PAGES

    Lu, Dan; Ye, Ming; Curtis, Gary P.

    2015-08-01

    While Bayesian model averaging (BMA) has been widely used in groundwater modeling, it is infrequently applied to groundwater reactive transport modeling because of multiple sources of uncertainty in the coupled hydrogeochemical processes and because of the long execution time of each model run. To resolve these problems, this study analyzed different levels of uncertainty in a hierarchical way, and used the maximum likelihood version of BMA, i.e., MLBMA, to improve the computational efficiency. Our study demonstrates the applicability of MLBMA to groundwater reactive transport modeling in a synthetic case in which twenty-seven reactive transport models were designed to predict the reactive transport of hexavalent uranium (U(VI)) based on observations at a former uranium mill site near Naturita, CO. Moreover, these reactive transport models contain three uncertain model components, i.e., parameterization of hydraulic conductivity, configuration of model boundary, and surface complexation reactions that simulate U(VI) adsorption. These uncertain model components were aggregated into the alternative models by integrating a hierarchical structure into MLBMA. The modeling results of the individual models and MLBMA were analyzed to investigate their predictive performance. The predictive logscore results show that MLBMA generally outperforms the best model, suggesting that using MLBMA is a sound strategy to achieve more robust model predictions relative to a single model. MLBMA works best when the alternative models are structurally distinct and have diverse model predictions. When correlation in model structure exists, two strategies were used to improve predictive performance by retaining structurally distinct models or assigning smaller prior model probabilities to correlated models. Since the synthetic models were designed using data from the Naturita site, the results of this study are expected to provide guidance for real-world modeling. Finally, limitations of applying MLBMA to the synthetic study and future real-world modeling are discussed.

  2. Takagi-Sugeno-Kang fuzzy models of the rainfall-runoff transformation

    NASA Astrophysics Data System (ADS)

    Jacquin, A. P.; Shamseldin, A. Y.

    2009-04-01

    Fuzzy inference systems, or fuzzy models, are non-linear models that describe the relation between the inputs and the output of a real system using a set of fuzzy IF-THEN rules. This study deals with the application of Takagi-Sugeno-Kang type fuzzy models to the development of rainfall-runoff models operating on a daily basis, using a system based approach. The models proposed are classified in two types, each intended to account for different kinds of dominant non-linear effects in the rainfall-runoff relationship. Fuzzy models type 1 are intended to incorporate the effect of changes in the prevailing soil moisture content, while fuzzy models type 2 address the phenomenon of seasonality. Each model type consists of five fuzzy models of increasing complexity; the most complex fuzzy model of each model type includes all the model components found in the remaining fuzzy models of the respective type. The models developed are applied to data of six catchments from different geographical locations and sizes. Model performance is evaluated in terms of two measures of goodness of fit, namely the Nash-Sutcliffe criterion and the index of volumetric fit. The results of the fuzzy models are compared with those of the Simple Linear Model, the Linear Perturbation Model and the Nearest Neighbour Linear Perturbation Model, which use similar input information. Overall, the results of this study indicate that Takagi-Sugeno-Kang fuzzy models are a suitable alternative for modelling the rainfall-runoff relationship. However, it is also observed that increasing the complexity of the model structure does not necessarily produce an improvement in the performance of the fuzzy models. The relative importance of the different model components in determining the model performance is evaluated through sensitivity analysis of the model parameters in the accompanying study presented in this meeting. Acknowledgements: We would like to express our gratitude to Prof. Kieran M. O'Connor from the National University of Ireland, Galway, for providing the data used in this study.
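
    A minimal first-order Takagi-Sugeno-Kang example may clarify the model class: two IF-THEN rules with fuzzy antecedents on a soil-moisture index and linear consequents in rainfall, combined as a membership-weighted average. The membership functions, rule consequents, and numbers are invented and are far simpler than the catchment models developed in the study.

```python
import numpy as np

# Minimal first-order TSK model with two rules; everything here is invented.
def mu_dry(s):   # membership in "dry" (triangular shoulder on [0, 1])
    return np.clip((0.6 - s) / 0.6, 0.0, 1.0)

def mu_wet(s):   # membership in "wet"
    return np.clip((s - 0.3) / 0.7, 0.0, 1.0)

def tsk_runoff(rain, soil_moisture):
    # Rule 1: IF soil is dry THEN runoff = 0.05 * rain
    # Rule 2: IF soil is wet THEN runoff = 0.10 + 0.6 * rain
    w1, w2 = mu_dry(soil_moisture), mu_wet(soil_moisture)
    y1 = 0.05 * rain
    y2 = 0.10 + 0.6 * rain
    return (w1 * y1 + w2 * y2) / (w1 + w2)   # weighted average of rule outputs

for s in (0.1, 0.5, 0.9):
    print(f"soil moisture {s:.1f}: runoff for 10 mm rain = {tsk_runoff(10.0, s):.2f} mm")
```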

  3. A simple computational algorithm of model-based choice preference.

    PubMed

    Toyama, Asako; Katahira, Kentaro; Ohira, Hideki

    2017-08-01

    A broadly used computational framework posits that two learning systems operate in parallel during the learning of choice preferences-namely, the model-free and model-based reinforcement-learning systems. In this study, we examined another possibility, through which model-free learning is the basic system and model-based information is its modulator. Accordingly, we proposed several modified versions of a temporal-difference learning model to explain the choice-learning process. Using the two-stage decision task developed by Daw, Gershman, Seymour, Dayan, and Dolan (2011), we compared their original computational model, which assumes a parallel learning process, and our proposed models, which assume a sequential learning process. Choice data from 23 participants showed a better fit with the proposed models. More specifically, the proposed eligibility adjustment model, which assumes that the environmental model can weight the degree of the eligibility trace, can explain choices better under both model-free and model-based controls and has a simpler computational algorithm than the original model. In addition, the forgetting learning model and its variation, which assume changes in the values of unchosen actions, substantially improved the fits to the data. Overall, we show that a hybrid computational model best fits the data. The parameters used in this model succeed in capturing individual tendencies with respect to both model use in learning and exploration behavior. This computational model provides novel insights into learning with interacting model-free and model-based components.
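
    The snippet below is a loose sketch of a temporal-difference update with an eligibility trace whose decay weight is adjustable, which is the general mechanism the "eligibility adjustment" idea builds on; it is not the authors' model, and the states, rewards, and parameters are invented. Comparing a low and a high trace weight shows how much credit earlier states receive for a later reward.

```python
import numpy as np

# TD(lambda)-style update on a toy three-state chain; not the authors' model.
n_states, alpha, gamma = 3, 0.1, 0.95
values = np.zeros(n_states)
trace = np.zeros(n_states)

def td_step(state, next_state, reward, lam):
    """One TD update; `lam` plays the role of the adjustable trace weight."""
    global values, trace
    delta = reward + gamma * values[next_state] - values[state]
    trace *= gamma * lam          # decay existing eligibility
    trace[state] += 1.0           # mark the visited state as eligible
    values += alpha * delta * trace

# A toy episode visiting states 0 -> 1 -> 2 with a reward only at the end.
for lam in (0.0, 0.9):            # low vs. high credit assignment to earlier states
    values[:], trace[:] = 0.0, 0.0
    for _ in range(50):
        trace[:] = 0.0
        td_step(0, 1, 0.0, lam)
        td_step(1, 2, 1.0, lam)
    print(f"lambda={lam}: learned values = {np.round(values, 2)}")
```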

  4. Airborne Wireless Communication Modeling and Analysis with MATLAB

    DTIC Science & Technology

    2014-03-27

    The research develops a physical layer model that combines antenna modeling using computational electromagnetics and the two-ray propagation model to... predict the received signal strength. The antenna is modeled with triangular patches and analyzed by extending the antenna modeling algorithm by Sergey...

  5. Marginal and Random Intercepts Models for Longitudinal Binary Data with Examples from Criminology

    ERIC Educational Resources Information Center

    Long, Jeffrey D.; Loeber, Rolf; Farrington, David P.

    2009-01-01

    Two models for the analysis of longitudinal binary data are discussed: the marginal model and the random intercepts model. In contrast to the linear mixed model (LMM), the two models for binary data are not subsumed under a single hierarchical model. The marginal model provides group-level information whereas the random intercepts model provides…

  6. EpiModel: An R Package for Mathematical Modeling of Infectious Disease over Networks.

    PubMed

    Jenness, Samuel M; Goodreau, Steven M; Morris, Martina

    2018-04-01

    Package EpiModel provides tools for building, simulating, and analyzing mathematical models for the population dynamics of infectious disease transmission in R. Several classes of models are included, but the unique contribution of this software package is a general stochastic framework for modeling the spread of epidemics on networks. EpiModel integrates recent advances in statistical methods for network analysis (temporal exponential random graph models) that allow the epidemic modeling to be grounded in empirical data on contacts that can spread infection. This article provides an overview of both the modeling tools built into EpiModel , designed to facilitate learning for students new to modeling, and the application programming interface for extending package EpiModel , designed to facilitate the exploration of novel research questions for advanced modelers.

  7. EpiModel: An R Package for Mathematical Modeling of Infectious Disease over Networks

    PubMed Central

    Jenness, Samuel M.; Goodreau, Steven M.; Morris, Martina

    2018-01-01

    Package EpiModel provides tools for building, simulating, and analyzing mathematical models for the population dynamics of infectious disease transmission in R. Several classes of models are included, but the unique contribution of this software package is a general stochastic framework for modeling the spread of epidemics on networks. EpiModel integrates recent advances in statistical methods for network analysis (temporal exponential random graph models) that allow the epidemic modeling to be grounded in empirical data on contacts that can spread infection. This article provides an overview of both the modeling tools built into EpiModel, designed to facilitate learning for students new to modeling, and the application programming interface for extending package EpiModel, designed to facilitate the exploration of novel research questions for advanced modelers. PMID:29731699

  8. Model compilation: An approach to automated model derivation

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Baudin, Catherine; Iwasaki, Yumi; Nayak, Pandurang; Tanaka, Kazuo

    1990-01-01

    An approach is introduced to automated model derivation for knowledge based systems. The approach, model compilation, involves procedurally generating the set of domain models used by a knowledge based system. With an implemented example, how this approach can be used to derive models of different precision and abstraction is illustrated, and models are tailored to different tasks, from a given set of base domain models. In particular, two implemented model compilers are described, each of which takes as input a base model that describes the structure and behavior of a simple electromechanical device, the Reaction Wheel Assembly of NASA's Hubble Space Telescope. The compilers transform this relatively general base model into simple task specific models for troubleshooting and redesign, respectively, by applying a sequence of model transformations. Each transformation in this sequence produces an increasingly more specialized model. The compilation approach lessens the burden of updating and maintaining consistency among models by enabling their automatic regeneration.

  9. A composite computational model of liver glucose homeostasis. I. Building the composite model.

    PubMed

    Hetherington, J; Sumner, T; Seymour, R M; Li, L; Rey, M Varela; Yamaji, S; Saffrey, P; Margoninski, O; Bogle, I D L; Finkelstein, A; Warner, A

    2012-04-07

    A computational model of the glucagon/insulin-driven liver glucohomeostasis function, focusing on the buffering of glucose into glycogen, has been developed. The model exemplifies an 'engineering' approach to modelling in systems biology, and was produced by linking together seven component models of separate aspects of the physiology. The component models use a variety of modelling paradigms and degrees of simplification. Model parameters were determined by an iterative hybrid of fitting to high-scale physiological data, and determination from small-scale in vitro experiments or molecular biological techniques. The component models were not originally designed for inclusion within such a composite model, but were integrated, with modification, using our published modelling software and computational frameworks. This approach facilitates the development of large and complex composite models, although, inevitably, some compromises must be made when composing the individual models. Composite models of this form have not previously been demonstrated.

  10. The applicability of turbulence models to aerodynamic and propulsion flowfields at McDonnell-Douglas Aerospace

    NASA Technical Reports Server (NTRS)

    Kral, Linda D.; Ladd, John A.; Mani, Mori

    1995-01-01

    The objective of this viewgraph presentation is to evaluate turbulence models for integrated aircraft components such as the forebody, wing, inlet, diffuser, nozzle, and afterbody. The one-equation models have replaced the algebraic models as the baseline turbulence models. The Spalart-Allmaras one-equation model consistently performs better than the Baldwin-Barth model, particularly in the log-layer and free shear layers. Also, the Spalart-Allmaras model is not grid dependent like the Baldwin-Barth model. No general turbulence model exists for all engineering applications. The Spalart-Allmaras one-equation model and the Chien k-epsilon model are the preferred turbulence models. Although the two-equation models often better predict the flow field, they may take from two to five times the CPU time. Future directions include further benchmarking of the Menter blended k-omega/k-epsilon model and algorithmic improvements to reduce the CPU time of the two-equation models.

  11. The determination of third order linear models from a seventh order nonlinear jet engine model

    NASA Technical Reports Server (NTRS)

    Lalonde, Rick J.; Hartley, Tom T.; De Abreu-Garcia, J. Alex

    1989-01-01

    Results are presented that demonstrate how good reduced-order models can be obtained directly by recursive parameter identification using input/output (I/O) data of high-order nonlinear systems. Three different methods of obtaining a third-order linear model from a seventh-order nonlinear turbojet engine model are compared. The first method is to obtain a linear model from the original model and then reduce the linear model by standard reduction techniques such as residualization and balancing. The second method is to identify directly a third-order linear model by recursive least-squares parameter estimation using I/O data of the original model. The third method is to obtain a reduced-order model from the original model and then linearize the reduced model. Frequency responses are used as the performance measure to evaluate the reduced models. The reduced-order models along with their Bode plots are presented for comparison purposes.
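
    The second method, direct identification of a low-order linear model from input/output data by least squares, can be sketched as follows for a discrete-time (ARX) model; the high-order "truth" system here is a synthetic cascade of first-order filters rather than the seventh-order nonlinear turbojet model, and recursive estimation is replaced by a one-shot least-squares solve for brevity.

```python
import numpy as np

# Identify a 3rd-order linear (ARX) model from I/O data of a higher-order
# system using ordinary least squares. The "truth" system and signals are synthetic.
rng = np.random.default_rng(2)
n = 500
u = rng.standard_normal(n)

# Higher-order "truth": a cascade of seven stable first-order sections.
poles = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3]
signal = u.copy()
for p in poles:
    out = np.zeros(n)
    for k in range(1, n):
        out[k] = p * out[k - 1] + (1 - p) * signal[k]
    signal = out
y = signal                                       # measured output

# Fit a 3rd-order ARX model: y[k] ~ sum_i a_i*y[k-i] + sum_j b_j*u[k-j]
order = 3
Phi = np.array([np.r_[y[k - order:k][::-1], u[k - order:k][::-1]]
                for k in range(order, n)])
theta, *_ = np.linalg.lstsq(Phi, y[order:], rcond=None)
y_hat = Phi @ theta

print("identified AR coefficients   :", np.round(theta[:order], 3))
print("identified input coefficients:", np.round(theta[order:], 3))
print("one-step-ahead RMSE          :",
      round(float(np.sqrt(np.mean((y_hat - y[order:]) ** 2))), 4))
```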

  12. BioModels: expanding horizons to include more modelling approaches and formats

    PubMed Central

    Nguyen, Tung V N; Graesslin, Martin; Hälke, Robert; Ali, Raza; Schramm, Jochen; Wimalaratne, Sarala M; Kothamachu, Varun B; Rodriguez, Nicolas; Swat, Maciej J; Eils, Jurgen; Eils, Roland; Laibe, Camille; Chelliah, Vijayalakshmi

    2018-01-01

    BioModels serves as a central repository of mathematical models representing biological processes. It offers a platform to make mathematical models easily shareable across the systems modelling community, thereby supporting model reuse. To facilitate hosting a broader range of model formats derived from diverse modelling approaches and tools, a new infrastructure for BioModels has been developed that is available at http://www.ebi.ac.uk/biomodels. This new system allows submitting and sharing of a wide range of models with improved support for formats other than SBML. It also offers a version-control backed environment in which authors and curators can work collaboratively to curate models. This article summarises the features available in the current system and discusses the potential benefit they offer to the users over the previous system. In summary, the new portal broadens the scope of models accepted in BioModels and supports collaborative model curation which is crucial for model reproducibility and sharing. PMID:29106614

  13. Modelling, teachers' views on the nature of modelling, and implications for the education of modellers

    NASA Astrophysics Data System (ADS)

    Justi, Rosária S.; Gilbert, John K.

    2002-04-01

    In this paper, the role of modelling in the teaching and learning of science is reviewed. In order to represent what is entailed in modelling, a 'model of modelling' framework is proposed. Five phases in moving towards a full capability in modelling are established by a review of the literature: learning models; learning to use models; learning how to revise models; learning to reconstruct models; learning to construct models de novo. In order to identify the knowledge and skills that science teachers think are needed to produce a model successfully, a semi-structured interview study was conducted with 39 Brazilian serving science teachers: 10 teaching at the 'fundamental' level (6-14 years); 10 teaching at the 'medium'-level (15-17 years); 10 undergraduate pre-service 'medium'-level teachers; 9 university teachers of chemistry. Their responses are used to establish what is entailed in implementing the 'model of modelling' framework. The implications for students, teachers, and for teacher education, of moving through the five phases of capability, are discussed.

  14. Modelling land use change with generalized linear models--a multi-model analysis of change between 1860 and 2000 in Gallatin Valley, Montana.

    PubMed

    Aspinall, Richard

    2004-08-01

    This paper develops an approach to modelling land use change that links model selection and multi-model inference with empirical models and GIS. Land use change is frequently studied, and understanding gained, through a process of modelling that is an empirical analysis of documented changes in land cover or land use patterns. The approach here is based on analysis and comparison of multiple models of land use patterns using model selection and multi-model inference. The approach is illustrated with a case study of rural housing as it has developed for part of Gallatin County, Montana, USA. A GIS contains the location of rural housing on a yearly basis from 1860 to 2000. The database also documents a variety of environmental and socio-economic conditions. A general model of settlement development describes the evolution of drivers of land use change and their impacts in the region. This model is used to develop a series of different models reflecting drivers of change at different periods in the history of the study area. These period specific models represent a series of multiple working hypotheses describing (a) the effects of spatial variables as a representation of social, economic and environmental drivers of land use change, and (b) temporal changes in the effects of the spatial variables as the drivers of change evolve over time. Logistic regression is used to calibrate and interpret these models and the models are then compared and evaluated with model selection techniques. Results show that different models are 'best' for the different periods. The different models for different periods demonstrate that models are not invariant over time which presents challenges for validation and testing of empirical models. The research demonstrates (i) model selection as a mechanism for rating among many plausible models that describe land cover or land use patterns, (ii) inference from a set of models rather than from a single model, (iii) that models can be developed based on hypothesised relationships based on consideration of underlying and proximate causes of change, and (iv) that models are not invariant over time.
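
    A compact sketch of the modelling-plus-model-selection workflow described above: fit two candidate logistic regressions of a binary land-conversion outcome on different sets of spatial predictors and compare them with AIC. The predictors (distance to road, terrain slope), the data, and the gradient-ascent fitting routine are all invented stand-ins for the GIS variables and the statistical machinery used in the actual study.

```python
import numpy as np

# Synthetic land-conversion data with two invented spatial predictors.
rng = np.random.default_rng(3)
n = 400
dist_road = rng.uniform(0, 5, n)          # km to nearest road (invented)
slope = rng.uniform(0, 30, n)             # terrain slope in degrees (invented)
logit = 1.5 - 1.0 * dist_road - 0.05 * slope
developed = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(float)

def fit_logistic(X, y, iters=8000, lr=0.1):
    """Plain gradient-ascent logistic fit on standardized predictors; returns logL."""
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    Xb = np.column_stack([np.ones(len(y)), Xs])
    beta = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-Xb @ beta))
        beta += lr * Xb.T @ (y - p) / len(y)
    p = np.clip(1 / (1 + np.exp(-Xb @ beta)), 1e-9, 1 - 1e-9)
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

candidates = {
    "distance only":    dist_road[:, None],
    "distance + slope": np.column_stack([dist_road, slope]),
}
for name, X in candidates.items():
    loglik = fit_logistic(X, developed)
    aic = 2 * (X.shape[1] + 1) - 2 * loglik   # parameters = coefficients + intercept
    print(f"{name:17s}: logL = {loglik:8.2f}, AIC = {aic:8.2f}")
```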

  15. Investigation of prospective teachers' knowledge and understanding of models and modeling and their attitudes towards the use of models in science education

    NASA Astrophysics Data System (ADS)

    Aktan, Mustafa B.

    The purpose of this study was to investigate prospective science teachers' knowledge and understanding of models and modeling, and their attitudes towards the use of models in science teaching through the following research questions: What knowledge do prospective science teachers have about models and modeling in science? What understandings about the nature of models do these teachers hold as a result of their educational training? What perceptions and attitudes do these teachers hold about the use of models in their teaching? Two main instruments, semi-structured in-depth interviewing and an open-item questionnaire, were used to obtain data from the participants. The data were analyzed from an interpretative phenomenological perspective and grounded theory methods. Earlier studies on in-service science teachers' understanding about the nature of models and modeling revealed that variations exist among teachers' limited yet diverse understanding of scientific models. The results of this study indicated that variations also existed among prospective science teachers' understanding of the concept of model and the nature of models. Apparently the participants' knowledge of models and modeling was limited and they viewed models as materialistic examples and representations. I found that the teachers believed the purpose of a model is to make phenomena more accessible and more understandable. They defined models by referring to an example, a representation, or a simplified version of the real thing. I found no evidence of negative attitudes towards use of models among the participants. Although the teachers valued the idea that scientific models are important aspects of science teaching and learning, and showed positive attitudes towards the use of models in their teaching, certain factors like level of learner, time, lack of modeling experience, and limited knowledge of models appeared to be affecting their perceptions negatively. Implications for the development of science teaching and teacher education programs are discussed. Directions for future research are suggested. Overall, based on the results, I suggest that prospective science teachers should engage in more modeling activities through their preparation programs, gain more modeling experience, and collaborate with their colleagues to better understand and implement scientific models in science teaching.

  16. Validation of Groundwater Models: Meaningful or Meaningless?

    NASA Astrophysics Data System (ADS)

    Konikow, L. F.

    2003-12-01

    Although numerical simulation models are valuable tools for analyzing groundwater systems, their predictive accuracy is limited. People who apply groundwater flow or solute-transport models, as well as those who make decisions based on model results, naturally want assurance that a model is "valid." To many people, model validation implies some authentication of the truth or accuracy of the model. History matching is often presented as the basis for model validation. Although such model calibration is a necessary modeling step, it is simply insufficient for model validation. Because of parameter uncertainty and solution non-uniqueness, declarations of validation (or verification) of a model are not meaningful. Post-audits represent a useful means to assess the predictive accuracy of a site-specific model, but they require the existence of long-term monitoring data. Model testing may yield invalidation, but that is an opportunity to learn and to improve the conceptual and numerical models. Examples of post-audits and of the application of a solute-transport model to a radioactive waste disposal site illustrate deficiencies in model calibration, prediction, and validation.

  17. Hierarchical modeling and inference in ecology: The analysis of data from populations, metapopulations and communities

    USGS Publications Warehouse

    Royle, J. Andrew; Dorazio, Robert M.

    2008-01-01

    A guide to data collection, modeling and inference strategies for biological survey data using Bayesian and classical statistical methods. This book describes a general and flexible framework for modeling and inference in ecological systems based on hierarchical models, with a strict focus on the use of probability models and parametric inference. Hierarchical models represent a paradigm shift in the application of statistics to ecological inference problems because they combine explicit models of ecological system structure or dynamics with models of how ecological systems are observed. The principles of hierarchical modeling are developed and applied to problems in population, metapopulation, community, and metacommunity systems. The book provides the first synthetic treatment of many recent methodological advances in ecological modeling and unifies disparate methods and procedures. The authors apply principles of hierarchical modeling to ecological problems, including * occurrence or occupancy models for estimating species distribution * abundance models based on many sampling protocols, including distance sampling * capture-recapture models with individual effects * spatial capture-recapture models based on camera trapping and related methods * population and metapopulation dynamic models * models of biodiversity, community structure and dynamics.
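    One of the model classes listed above, the single-season occupancy model, can be sketched compactly: occupancy probability psi and detection probability p are estimated jointly from repeated detection/non-detection visits. The toy example below simulates data and fits the model by maximum likelihood; all values are invented and it is not the authors' code.

```python
# Toy sketch of a single-season occupancy model: sites are occupied with
# probability psi, and an occupied site is detected on each visit with
# probability p. Data are simulated; estimates should recover the truth.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def neg_log_lik(params, y):
    psi, p = np.clip(expit(params), 1e-6, 1 - 1e-6)   # keep probabilities in (0, 1)
    detections = y.sum(axis=1)
    J = y.shape[1]
    lik_detected = psi * p ** detections * (1 - p) ** (J - detections)
    lik_never = psi * (1 - p) ** J + (1 - psi)        # never detected: occupied-but-missed or absent
    lik = np.where(detections > 0, lik_detected, lik_never)
    return -np.log(lik).sum()

rng = np.random.default_rng(1)
true_psi, true_p, n_sites, n_visits = 0.6, 0.4, 200, 4
occupied = rng.random(n_sites) < true_psi
y = (rng.random((n_sites, n_visits)) < true_p) & occupied[:, None]

fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(y.astype(float),))
print("estimated psi, p:", np.round(expit(fit.x), 3))
```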

  18. Using the Model Coupling Toolkit to couple earth system models

    USGS Publications Warehouse

    Warner, J.C.; Perlin, N.; Skyllingstad, E.D.

    2008-01-01

    Continued advances in computational resources are providing the opportunity to operate more sophisticated numerical models. Additionally, there is an increasing demand for multidisciplinary studies that include interactions between different physical processes. Therefore there is a strong desire to develop coupled modeling systems that utilize existing models and allow efficient data exchange and model control. The basic system would entail model "1" running on "M" processors and model "2" running on "N" processors, with efficient exchange of model fields at predetermined synchronization intervals. Here we demonstrate two coupled systems: the coupling of the ocean circulation model Regional Ocean Modeling System (ROMS) to the surface wave model Simulating WAves Nearshore (SWAN), and the coupling of ROMS to the atmospheric model Coupled Ocean Atmosphere Prediction System (COAMPS). Both coupled systems use the Model Coupling Toolkit (MCT) as a mechanism for operation control and inter-model distributed memory transfer of model variables. In this paper we describe requirements and other options for model coupling, explain the MCT library, ROMS, SWAN and COAMPS models, methods for grid decomposition and sparse matrix interpolation, and provide an example from each coupled system. Methods presented in this paper are clearly applicable for coupling of other types of models. © 2008 Elsevier Ltd. All rights reserved.
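    The control flow described here (model "1" and model "2" advancing independently and exchanging fields at predetermined synchronization intervals) can be sketched as below. This is only an illustration of the coupling pattern; the actual MCT additionally handles distributed-memory transfers between the M and N processor pools and sparse-matrix regridding.

```python
# Minimal sketch of the coupling pattern described above: two toy models
# advance independently and exchange fields at fixed synchronization steps.
class ToyModel:
    def __init__(self, name, state=0.0):
        self.name, self.state, self.forcing = name, state, 0.0

    def advance(self, dt):
        # Placeholder physics: relax the state toward the imported forcing.
        self.state += dt * (self.forcing - self.state)

    def export_field(self):
        return self.state

    def import_field(self, value):
        self.forcing = value

ocean, waves = ToyModel("ROMS-like"), ToyModel("SWAN-like", state=1.0)
dt, n_steps, steps_per_sync = 0.1, 100, 10
for step in range(1, n_steps + 1):
    ocean.advance(dt)
    waves.advance(dt)
    if step % steps_per_sync == 0:
        # Synchronization point: swap fields between the two components.
        ocean_field, wave_field = ocean.export_field(), waves.export_field()
        ocean.import_field(wave_field)
        waves.import_field(ocean_field)
print(ocean.state, waves.state)
```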

  19. Generalized Multilevel Structural Equation Modeling

    ERIC Educational Resources Information Center

    Rabe-Hesketh, Sophia; Skrondal, Anders; Pickles, Andrew

    2004-01-01

    A unifying framework for generalized multilevel structural equation modeling is introduced. The models in the framework, called generalized linear latent and mixed models (GLLAMM), combine features of generalized linear mixed models (GLMM) and structural equation models (SEM) and consist of a response model and a structural model for the latent…

  20. Frequentist Model Averaging in Structural Equation Modelling.

    PubMed

    Jin, Shaobo; Ankargren, Sebastian

    2018-06-04

    Model selection from a set of candidate models plays an important role in many structural equation modelling applications. However, traditional model selection methods introduce extra randomness that is not accounted for by post-model selection inference. In the current study, we propose a model averaging technique within the frequentist statistical framework. Instead of selecting an optimal model, the contributions of all candidate models are acknowledged. Valid confidence intervals and a [Formula: see text] test statistic are proposed. A simulation study shows that the proposed method is able to produce a robust mean-squared error, a better coverage probability, and a better goodness-of-fit test compared to model selection. It is an interesting compromise between model selection and the full model.

  1. Premium analysis for copula model: A case study for Malaysian motor insurance claims

    NASA Astrophysics Data System (ADS)

    Resti, Yulia; Ismail, Noriszura; Jaaman, Saiful Hafizah

    2014-06-01

    This study performs premium analysis for copula models with regression marginals. For illustration purposes, the copula models are fitted to the Malaysian motor insurance claims data. In this study, we consider copula models from the Archimedean and Elliptical families, and marginal distributions of Gamma and Inverse Gaussian regression models. The simulated results from the independent model, which is obtained by fitting regression models separately to each claim category, and the dependent model, which is obtained by fitting copula models to all claim categories, are compared. The results show that the dependent model using the Frank copula is the best model, since the risk premiums estimated under this model most closely approximate the actual claims experience relative to the other copula models.
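    As a rough illustration of why the dependence structure matters for premiums, the sketch below simulates two dependent claim categories with a Gaussian copula and gamma marginals and compares an aggregate loss quantile with the independence assumption. The copula family, marginals and all parameter values are invented for illustration and differ from the paper's fitted models.

```python
# Illustrative sketch (not the paper's model): two dependent claim severities
# generated through a Gaussian copula with gamma marginals, compared against
# an independence benchmark. All parameters are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 100_000
rho = 0.6                                     # copula correlation between categories
corr = np.array([[1.0, rho], [rho, 1.0]])

# Step 1: correlated standard normals -> uniforms (the Gaussian copula).
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=n)
u = stats.norm.cdf(z)

# Step 2: push the uniforms through the marginal severity distributions.
own_damage = stats.gamma(a=2.0, scale=1500.0).ppf(u[:, 0])
third_party = stats.gamma(a=1.5, scale=2500.0).ppf(u[:, 1])
total = own_damage + third_party

# Independence benchmark: same marginals, dependence ignored.
indep = stats.gamma(a=2.0, scale=1500.0).rvs(n, random_state=rng) \
      + stats.gamma(a=1.5, scale=2500.0).rvs(n, random_state=rng)

print("mean aggregate claim (dependent): ", total.mean())
print("99.5% quantile (dependent):       ", np.quantile(total, 0.995))
print("99.5% quantile (independent):     ", np.quantile(indep, 0.995))
```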

  2. Utilizing Biological Models to Determine the Recruitment of the IRA by Modeling the Voting Behavior of Sinn Fein

    DTIC Science & Technology

    2006-03-01

    Drawing on the strengths and weaknesses of sociological and biological models, the thesis applies a biological model, the Lotka-Volterra predator-prey model, to a highly suggestive case study, that of the Irish Republican Army, estimating IRA recruitment by modeling the voting behavior of Sinn Féin. Subject terms: Model, Irish Republican Army, Sinn Féin, Lotka-Volterra Predator-Prey Model, Recruitment, British Army.
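    For reference, the underlying biological model is the classical Lotka-Volterra predator-prey system, which can be integrated numerically as in the sketch below. The coefficients are generic textbook values, not estimates from the Sinn Féin voting data.

```python
# Classical Lotka-Volterra predator-prey equations with generic coefficients.
import numpy as np
from scipy.integrate import solve_ivp

def lotka_volterra(t, y, alpha=1.0, beta=0.1, delta=0.075, gamma=1.5):
    prey, predator = y
    dprey = alpha * prey - beta * prey * predator          # prey growth minus predation
    dpredator = delta * prey * predator - gamma * predator # predator growth minus mortality
    return [dprey, dpredator]

sol = solve_ivp(lotka_volterra, t_span=(0, 50), y0=[10.0, 5.0],
                t_eval=np.linspace(0, 50, 500))
prey, predator = sol.y
print(f"final prey={prey[-1]:.2f}, final predator={predator[-1]:.2f}")
```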

  3. Right-Sizing Statistical Models for Longitudinal Data

    PubMed Central

    Wood, Phillip K.; Steinley, Douglas; Jackson, Kristina M.

    2015-01-01

    Arguments are proposed that researchers using longitudinal data should consider more and less complex statistical model alternatives to their initially chosen techniques in an effort to “right-size” the model to the data at hand. Such model comparisons may alert researchers who use poorly fitting overly parsimonious models to more complex better fitting alternatives, and, alternatively, may identify more parsimonious alternatives to overly complex (and perhaps empirically under-identified and/or less powerful) statistical models. A general framework is proposed for considering (often nested) relationships between a variety of psychometric and growth curve models. A three-step approach is proposed in which models are evaluated based on the number and patterning of variance components prior to selection of better-fitting growth models that explain both mean and variation/covariation patterns. The orthogonal, free-curve slope-intercept (FCSI) growth model is considered as a general model which includes, as special cases, many models including the Factor Mean model (FM, McArdle & Epstein, 1987), McDonald's (1967) linearly constrained factor model, Hierarchical Linear Models (HLM), Repeated Measures MANOVA, and the Linear Slope Intercept (LinearSI) Growth Model. The FCSI model, in turn, is nested within the Tuckerized factor model. The approach is illustrated by comparing alternative models in a longitudinal study of children's vocabulary and by comparison of several candidate parametric growth and chronometric models in a Monte Carlo study. PMID:26237507

  4. Right-sizing statistical models for longitudinal data.

    PubMed

    Wood, Phillip K; Steinley, Douglas; Jackson, Kristina M

    2015-12-01

    Arguments are proposed that researchers using longitudinal data should consider more and less complex statistical model alternatives to their initially chosen techniques in an effort to "right-size" the model to the data at hand. Such model comparisons may alert researchers who use poorly fitting, overly parsimonious models to more complex, better-fitting alternatives and, alternatively, may identify more parsimonious alternatives to overly complex (and perhaps empirically underidentified and/or less powerful) statistical models. A general framework is proposed for considering (often nested) relationships between a variety of psychometric and growth curve models. A 3-step approach is proposed in which models are evaluated based on the number and patterning of variance components prior to selection of better-fitting growth models that explain both mean and variation-covariation patterns. The orthogonal free curve slope intercept (FCSI) growth model is considered a general model that includes, as special cases, many models, including the factor mean (FM) model (McArdle & Epstein, 1987), McDonald's (1967) linearly constrained factor model, hierarchical linear models (HLMs), repeated-measures multivariate analysis of variance (MANOVA), and the linear slope intercept (linearSI) growth model. The FCSI model, in turn, is nested within the Tuckerized factor model. The approach is illustrated by comparing alternative models in a longitudinal study of children's vocabulary and by comparing several candidate parametric growth and chronometric models in a Monte Carlo study. (c) 2015 APA, all rights reserved).

  5. Model averaging techniques for quantifying conceptual model uncertainty.

    PubMed

    Singh, Abhishek; Mishra, Srikanta; Ruskauff, Greg

    2010-01-01

    In recent years a growing understanding has emerged regarding the need to expand the modeling paradigm to include conceptual model uncertainty for groundwater models. Conceptual model uncertainty is typically addressed by formulating alternative model conceptualizations and assessing their relative likelihoods using statistical model averaging approaches. Several model averaging techniques and likelihood measures have been proposed in the recent literature for this purpose with two broad categories--Monte Carlo-based techniques such as Generalized Likelihood Uncertainty Estimation or GLUE (Beven and Binley 1992) and criterion-based techniques that use metrics such as the Bayesian and Kashyap Information Criteria (e.g., the Maximum Likelihood Bayesian Model Averaging or MLBMA approach proposed by Neuman 2003) and Akaike Information Criterion-based model averaging (AICMA) (Poeter and Anderson 2005). These different techniques can often lead to significantly different relative model weights and ranks because of differences in the underlying statistical assumptions about the nature of model uncertainty. This paper provides a comparative assessment of the four model averaging techniques (GLUE, MLBMA with KIC, MLBMA with BIC, and AIC-based model averaging) mentioned above for the purpose of quantifying the impacts of model uncertainty on groundwater model predictions. Pros and cons of each model averaging technique are examined from a practitioner's perspective using two groundwater modeling case studies. Recommendations are provided regarding the use of these techniques in groundwater modeling practice.
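    The criterion-based techniques mentioned above share a common mechanic: information-criterion values are converted into normalized model weights, which are then used to average predictions across the candidate models. A generic sketch, with made-up criterion values and predictions, is given below.

```python
# Generic criterion-based model averaging: turn information-criterion values
# (AIC, BIC, KIC, ...) into normalized weights and average model predictions.
# The criterion values and predictions below are hypothetical.
import numpy as np

def ic_weights(ic_values):
    """Convert information-criterion values into model weights (lower IC = higher weight)."""
    ic = np.asarray(ic_values, dtype=float)
    delta = ic - ic.min()                 # differences from the best model
    w = np.exp(-0.5 * delta)
    return w / w.sum()

model_names = ["model_A", "model_B", "model_C", "model_D"]
aic = [1012.4, 1009.8, 1015.1, 1010.6]          # hypothetical criterion values
predictions = np.array([3.2, 2.7, 4.1, 2.9])     # hypothetical head predictions (m)

w = ic_weights(aic)
print(dict(zip(model_names, np.round(w, 3))))
print("model-averaged prediction:", float(np.dot(w, predictions)))
```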

  6. Examination of various turbulence models for application in liquid rocket thrust chambers

    NASA Technical Reports Server (NTRS)

    Hung, R. J.

    1991-01-01

    There is a large variety of turbulence models available. These models include direct numerical simulation, large eddy simulation, the Reynolds stress/flux model, zero-equation models, one-equation models, the two-equation k-epsilon model, the multiple-scale model, etc. Each turbulence model contains different physical assumptions and requirements. The defining characteristics of turbulence are randomness, irregularity, diffusivity and dissipation. The capabilities of the turbulence models, including their physical strengths, weaknesses and limitations, as well as numerical and computational considerations, are reviewed. Recommendations are made for the potential application of a turbulence model in thrust chamber and performance prediction programs. The full Reynolds stress model is recommended. In a workshop convened specifically to assess turbulence models for application in liquid rocket thrust chambers, most of the experts present also favored the Reynolds stress model.

  7. Comparative study of turbulence models in predicting hypersonic inlet flows

    NASA Technical Reports Server (NTRS)

    Kapoor, Kamlesh; Anderson, Bernhard H.; Shaw, Robert J.

    1992-01-01

    A numerical study was conducted to analyze the performance of different turbulence models when applied to the hypersonic NASA P8 inlet. Computational results from the PARC2D code, which solves the full two-dimensional Reynolds-averaged Navier-Stokes equation, were compared with experimental data. The zero-equation models considered for the study were the Baldwin-Lomax model, the Thomas model, and a combination of the Baldwin-Lomax and Thomas models; the two-equation models considered were the Chien model, the Speziale model (both low Reynolds number), and the Launder and Spalding model (high Reynolds number). The Thomas model performed best among the zero-equation models, and predicted good pressure distributions. The Chien and Speziale models compared very well with the experimental data, and performed better than the Thomas model near the walls.

  8. Comparative study of turbulence models in predicting hypersonic inlet flows

    NASA Technical Reports Server (NTRS)

    Kapoor, Kamlesh; Anderson, Bernhard H.; Shaw, Robert J.

    1992-01-01

    A numerical study was conducted to analyze the performance of different turbulence models when applied to the hypersonic NASA P8 inlet. Computational results from the PARC2D code, which solves the full two-dimensional Reynolds-averaged Navier-Stokes equation, were compared with experimental data. The zero-equation models considered for the study were the Baldwin-Lomax model, the Thomas model, and a combination of the Baldwin-Lomax and Thomas models; the two-equation models considered were the Chien model, the Speziale model (both low Reynolds number), and the Launder and Spalding model (high Reynolds number). The Thomas model performed best among the zero-equation models, and predicted good pressure distributions. The Chien and Speziale models compared very well with the experimental data, and performed better than the Thomas model near the walls.

  9. [The reliability of dento-maxillary models created by cone-beam CT and rapid prototyping:a comparative study].

    PubMed

    Lv, Yan; Yan, Bin; Wang, Lin; Lou, Dong-hua

    2012-04-01

    To analyze the reliability of dento-maxillary models created by cone-beam CT and rapid prototyping (RP). Plaster models were obtained from 20 orthodontic patients who had been scanned by cone-beam CT, and 3-D models were reconstructed from the scans with software. Computerized composite models (RP models) were then produced by the rapid prototyping technique. The crown widths, dental arch widths and dental arch lengths on each plaster model, 3-D model and RP model were measured, followed by statistical analysis with the SPSS 17.0 software package. For crown widths, dental arch lengths and crowding, there were significant differences (P<0.05) among the 3 models, whereas the dental arch widths showed no significant differences. Measurements on 3-D models were significantly smaller than those on the other two models (P<0.05). Compared with 3-D models, RP models yielded more measurements that did not differ significantly from those on plaster models (P>0.05). The regression coefficients among the three models differed significantly (P<0.01), ranging from 0.8 to 0.9, and the coefficient between RP and plaster models was larger than that between 3-D and plaster models. The three types of models show high consistency, and the remaining differences are clinically acceptable. Therefore, it is possible to substitute 3-D and RP models for plaster models in order to save storage space and improve efficiency.

  10. Smart Frameworks and Self-Describing Models: Model Metadata for Automated Coupling of Hydrologic Process Components (Invited)

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.

    2013-12-01

    Model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that allow heterogeneous sets of process models to be assembled in a plug-and-play manner to create composite "system models". These mechanisms facilitate code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers, e.g. by requiring them to provide their output in specific forms that meet the input requirements of other models. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can compare the answers to these queries with similar answers from other process models in a collection and then automatically call framework service components as necessary to mediate the differences between the coupled models. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. To illustrate the power of standardized model interfaces and metadata, a smart, light-weight modeling framework written in Python will be introduced that can automatically (without user intervention) couple a set of BMI-enabled hydrologic process components together to create a spatial hydrologic model. The same mechanisms could also be used to provide seamless integration (import/export) of data and models.
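    A highly simplified, hypothetical sketch of the "fully controllable and self-describing" idea is given below: a toy model exposes standardized control functions (initialize, update, finalize) plus description and value-access functions that a framework can call without knowing the model's internals. The real BMI defines many more query functions, and the variable names here only approximate CSDMS-style standard names.

```python
# Hypothetical sketch of a BMI-style interface: control + self-description.
class BmiLikeLinearReservoir:
    """Toy runoff store exposing standardized control and description functions."""

    _input_names = ("atmosphere_water__precipitation_volume_flux",)   # CSDMS-style name (approximate)
    _output_names = ("channel_water__outflow_volume_flux",)

    # --- control functions a framework calls ---
    def initialize(self, k=0.5, storage=0.0, dt=1.0):
        self._k, self._storage, self._dt, self._precip = k, storage, dt, 0.0

    def update(self):
        self._storage += self._dt * (self._precip - self._k * self._storage)

    def finalize(self):
        self._storage = 0.0

    # --- self-description functions a framework can query ---
    def get_input_var_names(self):
        return self._input_names

    def get_output_var_names(self):
        return self._output_names

    def get_var_units(self, name):
        return "m3 s-1"

    # --- value access used to exchange fields between coupled components ---
    def get_value(self, name):
        return self._k * self._storage

    def set_value(self, name, value):
        self._precip = value

# A "framework" needs only the standardized functions, not model internals:
model = BmiLikeLinearReservoir()
model.initialize()
model.set_value("atmosphere_water__precipitation_volume_flux", 2.0)
for _ in range(10):
    model.update()
print(model.get_output_var_names(), model.get_value("channel_water__outflow_volume_flux"))
model.finalize()
```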

  11. A model-averaging method for assessing groundwater conceptual model uncertainty.

    PubMed

    Ye, Ming; Pohlmann, Karl F; Chapman, Jenny B; Pohll, Greg M; Reeves, Donald M

    2010-01-01

    This study evaluates alternative groundwater models with different recharge and geologic components at the northern Yucca Flat area of the Death Valley Regional Flow System (DVRFS), USA. Recharge over the DVRFS has been estimated using five methods, and five geological interpretations are available at the northern Yucca Flat area. Combining the recharge and geological components together with additional modeling components that represent other hydrogeological conditions yields a total of 25 groundwater flow models. As all the models are plausible given available data and information, evaluating model uncertainty becomes inevitable. On the other hand, hydraulic parameters (e.g., hydraulic conductivity) are uncertain in each model, giving rise to parametric uncertainty. Propagation of the uncertainty in the models and model parameters through groundwater modeling causes predictive uncertainty in model predictions (e.g., hydraulic head and flow). Parametric uncertainty within each model is assessed using Monte Carlo simulation, and model uncertainty is evaluated using the model averaging method. Two model-averaging techniques (on the basis of information criteria and GLUE) are discussed. This study shows that contribution of model uncertainty to predictive uncertainty is significantly larger than that of parametric uncertainty. For the recharge and geological components, uncertainty in the geological interpretations has more significant effect on model predictions than uncertainty in the recharge estimates. In addition, weighted residuals vary more for the different geological models than for different recharge models. Most of the calibrated observations are not important for discriminating between the alternative models, because their weighted residuals vary only slightly from one model to another.

  12. Meta-Modeling: A Knowledge-Based Approach to Facilitating Model Construction and Reuse

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Dungan, Jennifer L.

    1997-01-01

    In this paper, we introduce a new modeling approach called meta-modeling and illustrate its practical applicability to the construction of physically-based ecosystem process models. As a critical adjunct to modeling codes, meta-modeling requires explicit specification of certain background information related to the construction and conceptual underpinnings of a model. This information formalizes the heretofore tacit relationship between the mathematical modeling code and the underlying real-world phenomena being investigated, and gives insight into the process by which the model was constructed. We show how the explicit availability of such information can make models more understandable and reusable and less subject to misinterpretation. In particular, background information enables potential users to better interpret an implemented ecosystem model without direct assistance from the model author. Additionally, we show how the discipline involved in specifying background information leads to improved management of model complexity and fewer implementation errors. We illustrate the meta-modeling approach in the context of the Scientists' Intelligent Graphical Modeling Assistant (SIGMA), a new model construction environment. As the user constructs a model using SIGMA, the system adds appropriate background information that ties the executable model to the underlying physical phenomena under investigation. Not only does this information improve the understandability of the final model, it also serves to reduce the overall time and programming expertise necessary to initially build and subsequently modify models. Furthermore, SIGMA's use of background knowledge helps eliminate coding errors resulting from scientific and dimensional inconsistencies that are otherwise difficult to avoid when building complex models. As a demonstration of SIGMA's utility, the system was used to reimplement and extend a well-known forest ecosystem dynamics model: Forest-BGC.

  13. 10. MOVABLE BED SEDIMENTATION MODELS. DOGTOOTH BEND MODEL (MODEL SCALE: ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. MOVABLE BED SEDIMENTATION MODELS. DOGTOOTH BEND MODEL (MODEL SCALE: 1' = 400' HORIZONTAL, 1' = 100' VERTICAL), AND GREENVILLE BRIDGE MODEL (MODEL SCALE: 1' = 360' HORIZONTAL, 1' = 100' VERTICAL). - Waterways Experiment Station, Hydraulics Laboratory, Halls Ferry Road, 2 miles south of I-20, Vicksburg, Warren County, MS

  14. Bayesian Data-Model Fit Assessment for Structural Equation Modeling

    ERIC Educational Resources Information Center

    Levy, Roy

    2011-01-01

    Bayesian approaches to modeling are receiving an increasing amount of attention in the areas of model construction and estimation in factor analysis, structural equation modeling (SEM), and related latent variable models. However, model diagnostics and model criticism remain relatively understudied aspects of Bayesian SEM. This article describes…

  15. Evolution of computational models in BioModels Database and the Physiome Model Repository.

    PubMed

    Scharm, Martin; Gebhardt, Tom; Touré, Vasundra; Bagnacani, Andrea; Salehzadeh-Yazdi, Ali; Wolkenhauer, Olaf; Waltemath, Dagmar

    2018-04-12

    A useful model is one that is being (re)used. The development of a successful model does not finish with its publication. During reuse, models are being modified, i.e. expanded, corrected, and refined. Even small changes in the encoding of a model can, however, significantly affect its interpretation. Our motivation for the present study is to identify changes in models and make them transparent and traceable. We analysed 13734 models from BioModels Database and the Physiome Model Repository. For each model, we studied the frequencies and types of updates between its first and latest release. To demonstrate the impact of changes, we explored the history of a Repressilator model in BioModels Database. We observed continuous updates in the majority of models. Surprisingly, even the early models are still being modified. We furthermore detected that many updates target annotations, which improves the information one can gain from models. To support the analysis of changes in model repositories we developed MoSt, an online tool for visualisations of changes in models. The scripts used to generate the data and figures for this study are available from GitHub https://github.com/binfalse/BiVeS-StatsGenerator and as a Docker image at https://hub.docker.com/r/binfalse/bives-statsgenerator/ . The website https://most.bio.informatik.uni-rostock.de/ provides interactive access to model versions and their evolutionary statistics. The reuse of models is still impeded by a lack of trust and documentation. A detailed and transparent documentation of all aspects of the model, including its provenance, will improve this situation. Knowledge about a model's provenance can avoid the repetition of mistakes that others already faced. More insights are gained into how the system evolves from initial findings to a profound understanding. We argue that it is the responsibility of the maintainers of model repositories to offer transparent model provenance to their users.

  16. Large-watershed flood simulation and forecasting based on different-resolution distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Li, J.

    2017-12-01

    Flood simulation and forecasting for large watersheds is an important application of distributed hydrological models, but it raises several challenges, including the effect of the model's spatial resolution on performance and accuracy. To examine the resolution effect, the distributed hydrological model (the Liuxihe model) was built at several resolutions: 1000 m × 1000 m, 600 m × 600 m, 500 m × 500 m, 400 m × 400 m and 200 m × 200 m grid cells. The purpose is to find the best resolution for the Liuxihe model in large-watershed flood simulation and forecasting. This study sets up a physically based distributed hydrological model for flood forecasting of the Liujiang River basin in south China. The terrain data (digital elevation model, DEM), soil type and land use type are downloaded freely from the web. The model parameters are optimized with an improved Particle Swarm Optimization (PSO) algorithm; parameter optimization reduces the parameter uncertainty that arises when model parameters are derived physically. Results show that the best spatial resolution for flood simulation and forecasting is the 200 m × 200 m grid cell, and that model performance and accuracy deteriorate as the spatial resolution is coarsened. At a resolution of 1000 m × 1000 m the simulation and forecasting results are the worst, and the river channel network derived at this resolution differs from the actual one. To keep the model at an acceptable performance, a minimum spatial resolution is needed; the suggested threshold resolution for modeling floods in the Liujiang River basin is a 500 m × 500 m grid cell, but a 200 m × 200 m grid cell is recommended in this study to keep the model at its best performance.
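    The calibration step mentioned above relies on particle swarm optimization. A bare-bones PSO sketch is shown below; it minimizes a stand-in quadratic objective rather than the Liuxihe model's calibration objective (which would typically be something like 1 minus the Nash-Sutcliffe efficiency of simulated versus observed discharge).

```python
# Bare-bones particle swarm optimization sketch (not the paper's improved PSO).
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))           # particle positions
    v = np.zeros_like(x)                                       # particle velocities
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                             # keep particles inside bounds
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Stand-in objective with known optimum at (2, -1):
best, best_val = pso(lambda p: ((p - np.array([2.0, -1.0])) ** 2).sum(),
                     bounds=[(-5, 5), (-5, 5)])
print(best, best_val)
```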

  17. Computational Models for Calcium-Mediated Astrocyte Functions.

    PubMed

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro , but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop the models. Thus, we would like to emphasize that only via reproducible research are we able to build better computational models for astrocytes, which truly advance science. Our study is the first to characterize in detail the biophysical and biochemical mechanisms that have been modeled for astrocytes.

  18. Computational Models for Calcium-Mediated Astrocyte Functions

    PubMed Central

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online which makes it difficult to reproduce the simulation results and further develop the models. Thus, we would like to emphasize that only via reproducible research are we able to build better computational models for astrocytes, which truly advance science. Our study is the first to characterize in detail the biophysical and biochemical mechanisms that have been modeled for astrocytes. PMID:29670517

  19. Assessing the impact of land use change on hydrology by ensemble modeling (LUCHEM). I: Model intercomparison with current land use

    USGS Publications Warehouse

    Breuer, L.; Huisman, J.A.; Willems, P.; Bormann, H.; Bronstert, A.; Croke, B.F.W.; Frede, H.-G.; Graff, T.; Hubrechts, L.; Jakeman, A.J.; Kite, G.; Lanini, J.; Leavesley, G.; Lettenmaier, D.P.; Lindstrom, G.; Seibert, J.; Sivapalan, M.; Viney, N.R.

    2009-01-01

    This paper introduces the project on 'Assessing the impact of land use change on hydrology by ensemble modeling (LUCHEM)' that aims at investigating the envelope of predictions on changes in hydrological fluxes due to land use change. As part of a series of four papers, this paper outlines the motivation and setup of LUCHEM, and presents a model intercomparison for the present-day simulation results. Such an intercomparison provides a valuable basis to investigate the effects of different model structures on model predictions and paves the ground for the analysis of the performance of multi-model ensembles and the reliability of the scenario predictions in companion papers. In this study, we applied a set of 10 lumped, semi-lumped and fully distributed hydrological models that have been previously used in land use change studies to the low mountainous Dill catchment, Germany. Substantial differences in model performance were observed with Nash-Sutcliffe efficiencies ranging from 0.53 to 0.92. Differences in model performance were attributed to (1) model input data, (2) model calibration and (3) the physical basis of the models. The models were applied with two sets of input data: an original and a homogenized data set. This homogenization of precipitation, temperature and leaf area index was performed to reduce the variation between the models. Homogenization improved the comparability of model simulations and resulted in a reduced average bias, although some variation in model data input remained. The effect of the physical differences between models on the long-term water balance was mainly attributed to differences in how models represent evapotranspiration. Semi-lumped and lumped conceptual models slightly outperformed the fully distributed and physically based models. This was attributed to the automatic model calibration typically used for this type of models. Overall, however, we conclude that there was no superior model if several measures of model performance are considered and that all models are suitable to participate in further multi-model ensemble set-ups and land use change scenario investigations. © 2008 Elsevier Ltd. All rights reserved.
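    For readers unfamiliar with the performance measure quoted above, the Nash-Sutcliffe efficiency can be computed as in the short sketch below (1 is a perfect fit, and values below 0 mean the model performs worse than the mean of the observations); the discharge series shown are invented.

```python
# Nash-Sutcliffe efficiency: NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)
import numpy as np

def nash_sutcliffe(observed, simulated):
    obs = np.asarray(observed, dtype=float)
    sim = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Hypothetical daily discharge series (m3/s) for illustration only.
obs = np.array([3.1, 4.0, 6.5, 9.2, 7.8, 5.5, 4.2])
sim = np.array([2.9, 4.4, 6.0, 8.7, 8.1, 5.9, 4.0])
print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")   # 1.0 would be a perfect fit
```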

  20. Benchmarking test of empirical root water uptake models

    NASA Astrophysics Data System (ADS)

    dos Santos, Marcos Alex; de Jong van Lier, Quirijn; van Dam, Jos C.; Freire Bezerra, Andre Herman

    2017-01-01

    Detailed physical models describing root water uptake (RWU) are an important tool for the prediction of RWU and crop transpiration, but the hydraulic parameters involved are hardly ever available, making them less attractive for many studies. Empirical models are more readily used because of their simplicity and the associated lower data requirements. The purpose of this study is to evaluate the capability of some empirical models to mimic the RWU distribution under varying environmental conditions predicted from numerical simulations with a detailed physical model. A review of some empirical models used as sub-models in ecohydrological models is presented, and alternative empirical RWU models are proposed. All these empirical models are analogous to the standard Feddes model, but differ in how RWU is partitioned over depth or how the transpiration reduction function is defined. The parameters of the empirical models are determined by inverse modelling of simulated depth-dependent RWU. The performance of the empirical models and their optimized empirical parameters depends on the scenario. The standard empirical Feddes model only performs well in scenarios with low root length density R, i.e. for scenarios with low RWU compensation. For medium and high R, the Feddes RWU model cannot mimic properly the root uptake dynamics as predicted by the physical model. The Jarvis RWU model in combination with the Feddes reduction function (JMf) only provides good predictions for low and medium R scenarios. For high R, it cannot mimic the uptake patterns predicted by the physical model. Incorporating a newly proposed reduction function into the Jarvis model improved RWU predictions. Regarding the ability of the models to predict plant transpiration, all models accounting for compensation show good performance. The Akaike information criterion (AIC) indicates that the Jarvis (2010) model (JMII), with no empirical parameters to be estimated, is the best model. The proposed models are better in predicting RWU patterns similar to the physical model. The statistical indices point to them as the best alternatives for mimicking RWU predictions of the physical model.
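    The Feddes-type transpiration reduction function referred to above is a piecewise-linear factor alpha(h) of the soil water pressure head. A sketch with typical, illustrative threshold heads (not the values used in the paper) is given below.

```python
# Piecewise-linear Feddes-type water-stress reduction factor alpha(h).
# Pressure head h is in cm (more negative = drier); thresholds are illustrative.
import numpy as np

def feddes_alpha(h, h1=-10.0, h2=-25.0, h3=-400.0, h4=-16000.0):
    """Return the RWU reduction factor alpha in [0, 1] for pressure head h."""
    h = np.asarray(h, dtype=float)
    alpha = np.zeros_like(h)
    wet = (h <= h1) & (h > h2)           # too wet: ramp up from 0 at h1 to 1 at h2
    alpha[wet] = (h1 - h[wet]) / (h1 - h2)
    optimal = (h <= h2) & (h >= h3)      # no stress
    alpha[optimal] = 1.0
    dry = (h < h3) & (h >= h4)           # drying: ramp down from 1 at h3 to 0 at h4
    alpha[dry] = (h[dry] - h4) / (h3 - h4)
    return alpha                         # 0 outside [h4, h1]

print(feddes_alpha([-5.0, -100.0, -5000.0, -20000.0]))
```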

  1. Modeling uncertainty: quicksand for water temperature modeling

    USGS Publications Warehouse

    Bartholow, John M.

    2003-01-01

    Uncertainty has been a hot topic relative to science generally, and modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, modelers themselves, and users of the results. This paper will address important components of uncertainty in modeling water temperatures, and discuss several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the reference, are meant to supplement the presentation given at this conference.

  2. Energy modeling. Volume 2: Inventory and details of state energy models

    NASA Astrophysics Data System (ADS)

    Melcher, A. G.; Underwood, R. G.; Weber, J. C.; Gist, R. L.; Holman, R. P.; Donald, D. W.

    1981-05-01

    An inventory of energy models developed by or for state governments is presented, and certain models are discussed in depth. These models address a variety of purposes such as: supply or demand of energy or of certain types of energy; emergency management of energy; and energy economics. Ten models are described. The purpose, use, and history of the model is discussed, and information is given on the outputs, inputs, and mathematical structure of the model. The models include five models dealing with energy demand, one of which is econometric and four of which are econometric-engineering end-use models.

  3. Advances in Geoscience Modeling: Smart Modeling Frameworks, Self-Describing Models and the Role of Standardized Metadata

    NASA Astrophysics Data System (ADS)

    Peckham, Scott

    2016-04-01

    Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders, time interpolators and unit converters) as necessary to mediate the differences between them so they can work together. This talk will first review two key products of the CSDMS project, namely a standardized model interface called the Basic Model Interface (BMI) and the CSDMS Standard Names. The standard names are used in conjunction with BMI to provide a semantic matching mechanism that allows output variables from one process model or data set to be reliably used as input variables to other process models in a collection. They include not just a standardized naming scheme for model variables, but also a standardized set of terms for describing the attributes and assumptions of a given model. Recent efforts to bring powerful uncertainty analysis and inverse modeling toolkits such as DAKOTA into modeling frameworks will also be described. This talk will conclude with an overview of several related modeling projects that have been funded by NSF's EarthCube initiative, namely the Earth System Bridge, OntoSoft and GeoSemantics projects.

  4. [A review on research of land surface water and heat fluxes].

    PubMed

    Sun, Rui; Liu, Changming

    2003-03-01

    Many field experiments have been conducted, and soil-vegetation-atmosphere transfer (SVAT) models have been established to estimate land surface heat fluxes. In this paper, the progress of experimental research on land surface water and heat fluxes is reviewed, and three kinds of SVAT model (single-layer, two-layer and multi-layer models) are analyzed. Remote sensing data are widely used to estimate land surface heat fluxes. Based on remote sensing and the energy balance equation, different models, such as the simplified model, single-layer model, extra resistance model, crop water stress index model and two-source resistance model, have been developed to estimate land surface heat fluxes and evapotranspiration. These models are also analyzed in this paper.

  5. Examination of simplified travel demand model. [Internal volume forecasting model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, R.L. Jr.; McFarlane, W.J.

    1978-01-01

    A simplified travel demand model, the Internal Volume Forecasting (IVF) model, proposed by Low in 1972 is evaluated as an alternative to the conventional urban travel demand modeling process. The calibration of the IVF model for a county-level study area in Central Wisconsin results in what appears to be a reasonable model; however, analysis of the structure of the model reveals two primary mis-specifications. Correction of the mis-specifications leads to a simplified gravity model version of the conventional urban travel demand models. Application of the original IVF model to 'forecast' 1960 traffic volumes based on the model calibrated for 1970 produces accurate estimates. Shortcut and ad hoc models may appear to provide reasonable results in both the base and horizon years; however, as shown by the IVF model, such models will not always provide a reliable basis for transportation planning and investment decisions.

  6. MPTinR: analysis of multinomial processing tree models in R.

    PubMed

    Singmann, Henrik; Kellen, David

    2013-06-01

    We introduce MPTinR, a software package developed for the analysis of multinomial processing tree (MPT) models. MPT models represent a prominent class of cognitive measurement models for categorical data with applications in a wide variety of fields. MPTinR is the first software for the analysis of MPT models in the statistical programming language R, providing a modeling framework that is more flexible than standalone software packages. MPTinR also introduces important features such as (1) the ability to calculate the Fisher information approximation measure of model complexity for MPT models, (2) the ability to fit models for categorical data outside the MPT model class, such as signal detection models, (3) a function for model selection across a set of nested and nonnested candidate models (using several model selection indices), and (4) multicore fitting. MPTinR is available from the Comprehensive R Archive Network at http://cran.r-project.org/web/packages/MPTinR/ .

  7. Latent log-linear models for handwritten digit classification.

    PubMed

    Deselaers, Thomas; Gass, Tobias; Heigold, Georg; Ney, Hermann

    2012-06-01

    We present latent log-linear models, an extension of log-linear models incorporating latent variables, and we propose two applications thereof: log-linear mixture models and image deformation-aware log-linear models. The resulting models are fully discriminative, can be trained efficiently, and the model complexity can be controlled. Log-linear mixture models offer additional flexibility within the log-linear modeling framework. Unlike previous approaches, the image deformation-aware model directly considers image deformations and allows for a discriminative training of the deformation parameters. Both are trained using alternating optimization. For certain variants, convergence to a stationary point is guaranteed and, in practice, even variants without this guarantee converge and find models that perform well. We tune the methods on the USPS data set and evaluate on the MNIST data set, demonstrating the generalization capabilities of our proposed models. Our models, although using significantly fewer parameters, are able to obtain competitive results with models proposed in the literature.

  8. Understanding and Predicting Urban Propagation Losses

    DTIC Science & Technology

    2009-09-01

    Excerpts from the report's table of contents indicate coverage of the Extended (COST) Hata model, a Modified Hata model, and the Walfisch-Ikegami model, which are evaluated in three urban scenarios: Scenario One (Walfisch-Ikegami model), Scenario Two (Modified Hata model), and Scenario Three (urban Hata model).
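    For context, the basic urban Okumura-Hata formula underlying several of these models can be written down directly; the sketch below computes the median path loss for a small/medium city with illustrative parameter values (it does not reproduce the report's modified or COST-extended variants).

```python
# Basic urban Okumura-Hata median path loss for a small/medium city
# (roughly valid for 150-1500 MHz, base heights 30-200 m, distances 1-20 km).
import math

def hata_urban_loss_db(f_mhz, h_base_m, h_mobile_m, d_km):
    """Median path loss (dB) in the basic urban Hata model."""
    a_hm = (1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m \
           - (1.56 * math.log10(f_mhz) - 0.8)          # mobile antenna correction
    return (69.55 + 26.16 * math.log10(f_mhz)
            - 13.82 * math.log10(h_base_m) - a_hm
            + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))

# Illustrative parameter values only.
print(f"{hata_urban_loss_db(f_mhz=900, h_base_m=50, h_mobile_m=1.5, d_km=5):.1f} dB")
```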

  9. A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service

    PubMed Central

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016

  10. A framework for sharing and integrating remote sensing and GIS models based on Web service.

    PubMed

    Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin

    2014-01-01

    Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a "black box" and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.

  11. Modeling pedestrian shopping behavior using principles of bounded rationality: model comparison and validation

    NASA Astrophysics Data System (ADS)

    Zhu, Wei; Timmermans, Harry

    2011-06-01

    Models of geographical choice behavior have been dominantly based on rational choice models, which assume that decision makers are utility-maximizers. Rational choice models may be less appropriate as behavioral models when modeling decisions in complex environments in which decision makers may simplify the decision problem using heuristics. Pedestrian behavior in shopping streets is an example. We therefore propose a modeling framework for pedestrian shopping behavior incorporating principles of bounded rationality. We extend three classical heuristic rules (conjunctive, disjunctive and lexicographic rule) by introducing threshold heterogeneity. The proposed models are implemented using data on pedestrian behavior in Wang Fujing Street, the city center of Beijing, China. The models are estimated and compared with multinomial logit models and mixed logit models. Results show that the heuristic models are the best for all the decisions that are modeled. Validation tests are carried out through multi-agent simulation by comparing simulated spatio-temporal agent behavior with the observed pedestrian behavior. The predictions of heuristic models are slightly better than those of the multinomial logit models.
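    The three classical heuristic rules extended in the paper can be illustrated with a tiny sketch: conjunctive and disjunctive rules screen alternatives against attribute thresholds, and the lexicographic rule compares attributes in order of importance. The attributes, thresholds and alternatives below are invented, and higher attribute scores are assumed to be better.

```python
# Toy versions of the conjunctive, disjunctive and lexicographic decision rules.
def conjunctive(alt, thresholds):
    """Accept only if every attribute clears its threshold."""
    return all(alt[a] >= t for a, t in thresholds.items())

def disjunctive(alt, thresholds):
    """Accept if at least one attribute clears its threshold."""
    return any(alt[a] >= t for a, t in thresholds.items())

def lexicographic(alternatives, attribute_order):
    """Keep the alternatives best on the first attribute; break ties with the next."""
    remaining = list(alternatives)
    for attr in attribute_order:
        best = max(alt[attr] for alt in remaining)
        remaining = [alt for alt in remaining if alt[attr] == best]
        if len(remaining) == 1:
            break
    return remaining

stores = [{"name": "A", "variety": 7, "accessibility": 4},
          {"name": "B", "variety": 9, "accessibility": 5},
          {"name": "C", "variety": 9, "accessibility": 3}]
print([s["name"] for s in stores if conjunctive(s, {"variety": 8, "accessibility": 4})])
print(lexicographic(stores, attribute_order=["variety", "accessibility"]))
```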

  12. The Sim-SEQ Project: Comparison of Selected Flow Models for the S-3 Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mukhopadhyay, Sumit; Doughty, Christine A.; Bacon, Diana H.

    Sim-SEQ is an international initiative on model comparison for geologic carbon sequestration, with an objective to understand and, if possible, quantify model uncertainties. Model comparison efforts in Sim-SEQ are at present focusing on one specific field test site, hereafter referred to as the Sim-SEQ Study site (or S-3 site). Within Sim-SEQ, different modeling teams are developing conceptual models of CO2 injection at the S-3 site. In this paper, we select five flow models of the S-3 site and provide a qualitative comparison of their attributes and predictions. These models are based on five different simulators or modeling approaches: TOUGH2/EOS7C, STOMP-CO2e, MoReS, TOUGH2-MP/ECO2N, and VESA. In addition to model-to-model comparison, we perform a limited model-to-data comparison, and illustrate how model choices impact model predictions. We conclude the paper by making recommendations for model refinement that are likely to result in less uncertainty in model predictions.

  13. Semi-automated Modular Program Constructor for physiological modeling: Building cell and organ models.

    PubMed

    Jardine, Bartholomew; Raymond, Gary M; Bassingthwaighte, James B

    2015-01-01

    The Modular Program Constructor (MPC) is an open-source Java based modeling utility, built upon JSim's Mathematical Modeling Language (MML) ( http://www.physiome.org/jsim/) that uses directives embedded in model code to construct larger, more complicated models quickly and with less error than manually combining models. A major obstacle in writing complex models for physiological processes is the large amount of time it takes to model the myriad processes taking place simultaneously in cells, tissues, and organs. MPC replaces this task with code-generating algorithms that take model code from several different existing models and produce model code for a new JSim model. This is particularly useful during multi-scale model development where many variants are to be configured and tested against data. MPC encodes and preserves information about how a model is built from its simpler model modules, allowing the researcher to quickly substitute or update modules for hypothesis testing. MPC is implemented in Java and requires JSim to use its output. MPC source code and documentation are available at http://www.physiome.org/software/MPC/.

  14. Comparison of dark energy models after Planck 2015

    NASA Astrophysics Data System (ADS)

    Xu, Yue-Yao; Zhang, Xin

    2016-11-01

    We make a comparison for ten typical, popular dark energy models according to their capabilities of fitting the current observational data. The observational data we use in this work include the JLA sample of type Ia supernovae observation, the Planck 2015 distance priors of cosmic microwave background observation, the baryon acoustic oscillations measurements, and the direct measurement of the Hubble constant. Since the models have different numbers of parameters, in order to make a fair comparison, we employ the Akaike and Bayesian information criteria to assess the worth of the models. The analysis results show that, according to the capability of explaining observations, the cosmological constant model is still the best one among all the dark energy models. The generalized Chaplygin gas model, the constant w model, and the α dark energy model are worse than the cosmological constant model, but still are good models compared to others. The holographic dark energy model, the new generalized Chaplygin gas model, and the Chevallier-Polarski-Linder model can still fit the current observations well, but, in terms of economy of parameters, they are less favored. The new agegraphic dark energy model, the Dvali-Gabadadze-Porrati model, and the Ricci dark energy model are excluded by the current observations.
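
    Editor's note: the Akaike and Bayesian information criteria used for this comparison are straightforward to compute once each model's best-fit chi-square is known. A minimal Python sketch follows; the chi-square values, parameter counts, and sample size are hypothetical placeholders, not the paper's results.

    ```python
    import numpy as np

    def aic(chi2_min, k):
        # AIC = chi2_min + 2k  (chi2_min = -2 ln L_max up to an additive constant)
        return chi2_min + 2 * k

    def bic(chi2_min, k, n):
        # BIC = chi2_min + k ln(n), where n is the number of data points
        return chi2_min + k * np.log(n)

    # hypothetical fits: (model name, best-fit chi-square, number of free parameters)
    fits = [("LCDM", 699.4, 1), ("wCDM", 699.3, 2), ("CPL", 698.9, 3)]
    n_data = 740  # illustrative data-sample size

    for name, chi2, k in fits:
        print(f"{name:5s}  AIC={aic(chi2, k):7.1f}  BIC={bic(chi2, k, n_data):7.1f}")
    ```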

  15. Parametric regression model for survival data: Weibull regression model as an example

    PubMed Central

    2016-01-01

    The Weibull regression model is one of the most popular forms of parametric regression model in that it provides an estimate of the baseline hazard function, as well as coefficients for covariates. Because of technical difficulties, the Weibull regression model is seldom used in the medical literature compared to the semi-parametric proportional hazards model. To make clinical investigators familiar with the Weibull regression model, this article introduces some basic knowledge on the model and then illustrates how to fit it with R software. The SurvRegCensCov package is useful for converting estimated coefficients to clinically relevant statistics such as the hazard ratio (HR) and event time ratio (ETR). Model adequacy can be assessed by inspecting Kaplan-Meier curves stratified by a categorical variable. The eha package provides an alternative method for fitting the Weibull regression model, and its check.dist() function helps to assess goodness of fit. Variable selection is based on the importance of a covariate, which can be tested using the anova() function; alternatively, backward elimination starting from a full model is an efficient way for model development. Visualization of the Weibull regression model after model development is useful in that it provides another way to report the findings. PMID:28149846
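
    Editor's note: the record above works in R; for readers who prefer a language-agnostic view of what is being fitted, the sketch below maximizes the right-censored Weibull log-likelihood directly in Python on synthetic data. The covariate, coefficients, and censoring scheme are illustrative assumptions; exp(b1) plays the role of the event time ratio (ETR) mentioned in the abstract.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)

    # synthetic right-censored data with one covariate (illustrative only)
    n = 300
    x = rng.binomial(1, 0.5, n)                    # e.g., a treatment indicator
    scale = np.exp(1.5 - 0.6 * x)                  # true accelerated-failure-time effect
    t_event = scale * rng.weibull(1.8, n)          # true Weibull shape k = 1.8
    t_cens = rng.uniform(0.5, 6.0, n)
    time = np.minimum(t_event, t_cens)
    event = (t_event <= t_cens).astype(float)      # 1 = observed event, 0 = censored

    def neg_loglik(params):
        b0, b1, log_k = params
        k = np.exp(log_k)                          # Weibull shape > 0
        lam = np.exp(b0 + b1 * x)                  # per-subject scale
        z = (time / lam) ** k
        log_h = np.log(k) - np.log(lam) + (k - 1) * (np.log(time) - np.log(lam))
        # log-likelihood = sum(event * log hazard) - sum(cumulative hazard)
        return -(np.sum(event * log_h) - np.sum(z))

    fit = minimize(neg_loglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
    b0, b1, log_k = fit.x
    print("estimated shape k =", np.exp(log_k))
    print("event time ratio (ETR) for x=1 vs x=0 =", np.exp(b1))
    ```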

  16. Inner Magnetosphere Modeling at the CCMC: Ring Current, Radiation Belt and Magnetic Field Mapping

    NASA Astrophysics Data System (ADS)

    Rastaetter, L.; Mendoza, A. M.; Chulaki, A.; Kuznetsova, M. M.; Zheng, Y.

    2013-12-01

    Modeling of the inner magnetosphere has entered center stage with the launch of the Van Allen Probes (RBSP) in 2012. The Community Coordinated Modeling Center (CCMC) has drastically improved its offerings of inner magnetosphere models that cover energetic particles in the Earth's ring current and radiation belts. Models added to the CCMC include the stand-alone Comprehensive Inner Magnetosphere-Ionosphere (CIMI) model by M.C. Fok, the Rice Convection Model (RCM) by R. Wolf and S. Sazykin, and numerous versions of the Tsyganenko magnetic field model (T89, T96, T01quiet, TS05). These models join the LANL* model by Y. Yu that was offered for instant run earlier in the year. In addition to these stand-alone models, the Comprehensive Ring Current Model (CRCM) by M.C. Fok and N. Buzulukova joined as a component of the Space Weather Modeling Framework (SWMF) in the magnetosphere model run-on-request category. We present modeling results of the ring current and radiation belt models and demonstrate tracking of satellites such as RBSP. Calculations using the magnetic field models include mappings to the magnetic equator or to minimum-B positions and the determination of foot points in the ionosphere.

  17. A diversity index for model space selection in the estimation of benchmark and infectious doses via model averaging.

    PubMed

    Kim, Steven B; Kodell, Ralph L; Moon, Hojin

    2014-03-01

    In chemical and microbial risk assessments, risk assessors fit dose-response models to high-dose data and extrapolate downward to risk levels in the range of 1-10%. Although multiple dose-response models may be able to fit the data adequately in the experimental range, the estimated effective dose (ED) corresponding to an extremely small risk can be substantially different from model to model. In this respect, model averaging (MA) provides more robustness than a single dose-response model in the point and interval estimation of an ED. In MA, accounting for both data uncertainty and model uncertainty is crucial, but addressing model uncertainty is not achieved simply by increasing the number of models in a model space. A plausible set of models for MA can be characterized by goodness of fit and diversity surrounding the truth. We propose a diversity index (DI) to balance between these two characteristics in model space selection. It addresses a collective property of a model space rather than individual performance of each model. Tuning parameters in the DI control the size of the model space for MA. © 2013 Society for Risk Analysis.
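
    Editor's note: one common way to weight models in MA (distinct from the diversity index proposed above) is by Akaike weights. The sketch below, with hypothetical AIC values and effective-dose estimates, shows how a model-averaged point estimate is formed.

    ```python
    import numpy as np

    # hypothetical per-model results: name, AIC, and estimated ED01 (dose at 1% extra risk)
    models = [("logistic", 135.2, 0.48),
              ("probit",   135.9, 0.55),
              ("weibull",  137.4, 0.31)]

    aics = np.array([m[1] for m in models])
    eds  = np.array([m[2] for m in models])

    delta = aics - aics.min()
    weights = np.exp(-0.5 * delta)
    weights /= weights.sum()            # Akaike weights sum to 1

    ed_ma = np.sum(weights * eds)       # model-averaged point estimate
    for (name, _, ed), w in zip(models, weights):
        print(f"{name:8s} weight={w:.3f} ED01={ed:.2f}")
    print("model-averaged ED01 =", round(ed_ma, 3))
    ```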

  18. Standard fire behavior fuel models: a comprehensive set for use with Rothermel's surface fire spread model

    Treesearch

    Joe H. Scott; Robert E. Burgan

    2005-01-01

    This report describes a new set of standard fire behavior fuel models for use with Rothermel's surface fire spread model and the relationship of the new set to the original set of 13 fire behavior fuel models. To assist with transition to using the new fuel models, a fuel model selection guide, fuel model crosswalk, and set of fuel model photos are provided.

  19. [Parameters modification and evaluation of two evapotranspiration models based on Penman-Monteith model for summer maize].

    PubMed

    Wang, Juan; Wang, Jian Lin; Liu, Jia Bin; Jiang, Wen; Zhao, Chang Xing

    2017-06-18

    The dynamic variations of evapotranspiration (ET) and weather data during the summer maize growing seasons in 2013-2015 were monitored with an eddy covariance system, and the applicability of two operational models (the FAO-PM model and the KP-PM model) based on the Penman-Monteith model was analyzed. Firstly, the key parameters in the two models were calibrated with the measured data from 2013 and 2014; secondly, the daily ET in 2015 calculated by the FAO-PM and KP-PM models was compared to the observed ET. Finally, the coefficients in the KP-PM model were further revised with coefficients calculated for the different growth stages, and the performance of the revised KP-PM model was also evaluated. The statistical measures indicated that the daily ET for 2015 calculated by the FAO-PM model was closer to the observed ET than that by the KP-PM model, and that the daily ET calculated from the revised KP-PM model was more accurate than that from the FAO-PM model. It was also found that the key parameters in the two models were correlated with weather conditions, so calibration is necessary before using the models to predict ET. These results provide guidelines on predicting ET with the two models.
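
    Editor's note: the FAO-PM model referred to above builds on the standard FAO-56 Penman-Monteith reference evapotranspiration equation. The Python sketch below implements that textbook form with illustrative inputs; it does not include the site-specific calibrated coefficients developed in the study.

    ```python
    import numpy as np

    def fao56_et0(t_mean, rn, g, u2, rh_mean, pressure=101.3):
        """Daily reference evapotranspiration (mm/day) from the FAO-56
        Penman-Monteith equation.
        t_mean: air temperature (degC), rn: net radiation (MJ m-2 day-1),
        g: soil heat flux (MJ m-2 day-1), u2: wind speed at 2 m (m/s),
        rh_mean: relative humidity (%), pressure: air pressure (kPa)."""
        es = 0.6108 * np.exp(17.27 * t_mean / (t_mean + 237.3))   # saturation vapour pressure, kPa
        ea = es * rh_mean / 100.0                                  # actual vapour pressure, kPa
        delta = 4098.0 * es / (t_mean + 237.3) ** 2                # slope of vapour pressure curve
        gamma = 0.000665 * pressure                                # psychrometric constant, kPa/degC
        num = 0.408 * delta * (rn - g) + gamma * (900.0 / (t_mean + 273.0)) * u2 * (es - ea)
        den = delta + gamma * (1.0 + 0.34 * u2)
        return num / den

    # illustrative mid-season day for a maize field (hypothetical values)
    print(round(fao56_et0(t_mean=26.0, rn=14.5, g=0.5, u2=2.1, rh_mean=62.0), 2), "mm/day")
    ```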

  20. Implementation of Dryden Continuous Turbulence Model into Simulink for LSA-02 Flight Test Simulation

    NASA Astrophysics Data System (ADS)

    Ichwanul Hakim, Teuku Mohd; Arifianto, Ony

    2018-04-01

    Turbulence is small-scale air movement in the atmosphere caused by instabilities in the pressure and temperature distribution. A turbulence model is integrated into a flight mechanics model as an atmospheric disturbance. Common turbulence models used in flight mechanics models are the Dryden and von Karman models. In this study, only the Dryden continuous turbulence model was implemented, following the military specification MIL-HDBK-1797. The model was built in MATLAB Simulink and will be integrated with a flight mechanics model to observe the response of the aircraft when it flies through a turbulence field. The turbulence model is formed by passing band-limited Gaussian white noise through filters derived from the power spectral densities. To verify the model, it was compared with the similar model provided in the Aerospace Blockset. The results show some differences for two linear velocities (vg and wg) and three angular rates (pg, qg and rg). The differences are caused by a different determination of the turbulence scale length used in the Aerospace Blockset. After adjusting the turbulence scale length in the implemented model, both models produce similar output.
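
    Editor's note: the core of the Dryden approach, shaping Gaussian white noise with first-order filters derived from the gust power spectral densities, can be sketched compactly. The Python example below generates only the longitudinal gust component u_g with assumed airspeed, scale length, and intensity; the full MIL-HDBK-1797 model also defines the lateral, vertical, and angular-rate components.

    ```python
    import numpy as np

    def dryden_u(v_true, L_u, sigma_u, dt, n_steps, seed=0):
        """Longitudinal Dryden gust component u_g as a first-order filter
        driven by unit-variance Gaussian white noise (sketch only)."""
        rng = np.random.default_rng(seed)
        a = v_true / L_u                       # filter break frequency, 1/s
        u_g = np.zeros(n_steps)
        for k in range(1, n_steps):
            noise = rng.standard_normal()
            # Euler discretisation of  du/dt = -a*u + sigma_u*sqrt(2*a)*white noise
            u_g[k] = u_g[k - 1] - a * u_g[k - 1] * dt + sigma_u * np.sqrt(2.0 * a * dt) * noise
        return u_g

    # assumed parameters: 50 m/s airspeed, 200 m scale length, 1.5 m/s intensity
    gust = dryden_u(v_true=50.0, L_u=200.0, sigma_u=1.5, dt=0.01, n_steps=5000)
    print("sample std of u_g:", gust.std())   # approaches sigma_u for long runs
    ```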

  1. THE EARTH SYSTEM PREDICTION SUITE: Toward a Coordinated U.S. Modeling Capability

    PubMed Central

    Theurich, Gerhard; DeLuca, C.; Campbell, T.; Liu, F.; Saint, K.; Vertenstein, M.; Chen, J.; Oehmke, R.; Doyle, J.; Whitcomb, T.; Wallcraft, A.; Iredell, M.; Black, T.; da Silva, AM; Clune, T.; Ferraro, R.; Li, P.; Kelley, M.; Aleinov, I.; Balaji, V.; Zadeh, N.; Jacob, R.; Kirtman, B.; Giraldo, F.; McCarren, D.; Sandgathe, S.; Peckham, S.; Dunlap, R.

    2017-01-01

    The Earth System Prediction Suite (ESPS) is a collection of flagship U.S. weather and climate models and model components that are being instrumented to conform to interoperability conventions, documented to follow metadata standards, and made available either under open source terms or to credentialed users. The ESPS represents a culmination of efforts to create a common Earth system model architecture, and the advent of increasingly coordinated model development activities in the U.S. ESPS component interfaces are based on the Earth System Modeling Framework (ESMF), community-developed software for building and coupling models, and the National Unified Operational Prediction Capability (NUOPC) Layer, a set of ESMF-based component templates and interoperability conventions. This shared infrastructure simplifies the process of model coupling by guaranteeing that components conform to a set of technical and semantic behaviors. The ESPS encourages distributed, multi-agency development of coupled modeling systems, controlled experimentation and testing, and exploration of novel model configurations, such as those motivated by research involving managed and interactive ensembles. ESPS codes include the Navy Global Environmental Model (NavGEM), HYbrid Coordinate Ocean Model (HYCOM), and Coupled Ocean Atmosphere Mesoscale Prediction System (COAMPS®); the NOAA Environmental Modeling System (NEMS) and the Modular Ocean Model (MOM); the Community Earth System Model (CESM); and the NASA ModelE climate model and GEOS-5 atmospheric general circulation model. PMID:29568125

  2. THE EARTH SYSTEM PREDICTION SUITE: Toward a Coordinated U.S. Modeling Capability.

    PubMed

    Theurich, Gerhard; DeLuca, C; Campbell, T; Liu, F; Saint, K; Vertenstein, M; Chen, J; Oehmke, R; Doyle, J; Whitcomb, T; Wallcraft, A; Iredell, M; Black, T; da Silva, A M; Clune, T; Ferraro, R; Li, P; Kelley, M; Aleinov, I; Balaji, V; Zadeh, N; Jacob, R; Kirtman, B; Giraldo, F; McCarren, D; Sandgathe, S; Peckham, S; Dunlap, R

    2016-07-01

    The Earth System Prediction Suite (ESPS) is a collection of flagship U.S. weather and climate models and model components that are being instrumented to conform to interoperability conventions, documented to follow metadata standards, and made available either under open source terms or to credentialed users. The ESPS represents a culmination of efforts to create a common Earth system model architecture, and the advent of increasingly coordinated model development activities in the U.S. ESPS component interfaces are based on the Earth System Modeling Framework (ESMF), community-developed software for building and coupling models, and the National Unified Operational Prediction Capability (NUOPC) Layer, a set of ESMF-based component templates and interoperability conventions. This shared infrastructure simplifies the process of model coupling by guaranteeing that components conform to a set of technical and semantic behaviors. The ESPS encourages distributed, multi-agency development of coupled modeling systems, controlled experimentation and testing, and exploration of novel model configurations, such as those motivated by research involving managed and interactive ensembles. ESPS codes include the Navy Global Environmental Model (NavGEM), HYbrid Coordinate Ocean Model (HYCOM), and Coupled Ocean Atmosphere Mesoscale Prediction System (COAMPS ® ); the NOAA Environmental Modeling System (NEMS) and the Modular Ocean Model (MOM); the Community Earth System Model (CESM); and the NASA ModelE climate model and GEOS-5 atmospheric general circulation model.

  3. The Earth System Prediction Suite: Toward a Coordinated U.S. Modeling Capability

    NASA Technical Reports Server (NTRS)

    Theurich, Gerhard; DeLuca, C.; Campbell, T.; Liu, F.; Saint, K.; Vertenstein, M.; Chen, J.; Oehmke, R.; Doyle, J.; Whitcomb, T.; ...

    2016-01-01

    The Earth System Prediction Suite (ESPS) is a collection of flagship U.S. weather and climate models and model components that are being instrumented to conform to interoperability conventions, documented to follow metadata standards, and made available either under open source terms or to credentialed users.The ESPS represents a culmination of efforts to create a common Earth system model architecture, and the advent of increasingly coordinated model development activities in the U.S. ESPS component interfaces are based on the Earth System Modeling Framework (ESMF), community-developed software for building and coupling models, and the National Unified Operational Prediction Capability (NUOPC) Layer, a set of ESMF-based component templates and interoperability conventions. This shared infrastructure simplifies the process of model coupling by guaranteeing that components conform to a set of technical and semantic behaviors. The ESPS encourages distributed, multi-agency development of coupled modeling systems, controlled experimentation and testing, and exploration of novel model configurations, such as those motivated by research involving managed and interactive ensembles. ESPS codes include the Navy Global Environmental Model (NavGEM), HYbrid Coordinate Ocean Model (HYCOM), and Coupled Ocean Atmosphere Mesoscale Prediction System (COAMPS); the NOAA Environmental Modeling System (NEMS) and the Modular Ocean Model (MOM); the Community Earth System Model (CESM); and the NASA ModelE climate model and GEOS-5 atmospheric general circulation model.

  4. The Earth System Prediction Suite: Toward a Coordinated U.S. Modeling Capability

    DOE PAGES

    Theurich, Gerhard; DeLuca, C.; Campbell, T.; ...

    2016-08-22

    The Earth System Prediction Suite (ESPS) is a collection of flagship U.S. weather and climate models and model components that are being instrumented to conform to interoperability conventions, documented to follow metadata standards, and made available either under open-source terms or to credentialed users. Furthermore, the ESPS represents a culmination of efforts to create a common Earth system model architecture, and the advent of increasingly coordinated model development activities in the United States. ESPS component interfaces are based on the Earth System Modeling Framework (ESMF), community-developed software for building and coupling models, and the National Unified Operational Prediction Capability (NUOPC) Layer, a set of ESMF-based component templates and interoperability conventions. Our shared infrastructure simplifies the process of model coupling by guaranteeing that components conform to a set of technical and semantic behaviors. The ESPS encourages distributed, multiagency development of coupled modeling systems; controlled experimentation and testing; and exploration of novel model configurations, such as those motivated by research involving managed and interactive ensembles. ESPS codes include the Navy Global Environmental Model (NAVGEM), the Hybrid Coordinate Ocean Model (HYCOM), and the Coupled Ocean–Atmosphere Mesoscale Prediction System (COAMPS); the NOAA Environmental Modeling System (NEMS) and the Modular Ocean Model (MOM); the Community Earth System Model (CESM); and the NASA ModelE climate model and the Goddard Earth Observing System Model, version 5 (GEOS-5), atmospheric general circulation model.

  5. The Earth System Prediction Suite: Toward a Coordinated U.S. Modeling Capability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theurich, Gerhard; DeLuca, C.; Campbell, T.

    The Earth System Prediction Suite (ESPS) is a collection of flagship U.S. weather and climate models and model components that are being instrumented to conform to interoperability conventions, documented to follow metadata standards, and made available either under open-source terms or to credentialed users. Furthermore, the ESPS represents a culmination of efforts to create a common Earth system model architecture, and the advent of increasingly coordinated model development activities in the United States. ESPS component interfaces are based on the Earth System Modeling Framework (ESMF), community-developed software for building and coupling models, and the National Unified Operational Prediction Capability (NUOPC) Layer, a set of ESMF-based component templates and interoperability conventions. Our shared infrastructure simplifies the process of model coupling by guaranteeing that components conform to a set of technical and semantic behaviors. The ESPS encourages distributed, multiagency development of coupled modeling systems; controlled experimentation and testing; and exploration of novel model configurations, such as those motivated by research involving managed and interactive ensembles. ESPS codes include the Navy Global Environmental Model (NAVGEM), the Hybrid Coordinate Ocean Model (HYCOM), and the Coupled Ocean–Atmosphere Mesoscale Prediction System (COAMPS); the NOAA Environmental Modeling System (NEMS) and the Modular Ocean Model (MOM); the Community Earth System Model (CESM); and the NASA ModelE climate model and the Goddard Earth Observing System Model, version 5 (GEOS-5), atmospheric general circulation model.

  6. An ontology for component-based models of water resource systems

    NASA Astrophysics Data System (ADS)

    Elag, Mostafa; Goodall, Jonathan L.

    2013-08-01

    Component-based modeling is an approach for simulating water resource systems where a model is composed of a set of components, each with a defined modeling objective, interlinked through data exchanges. Component-based modeling frameworks are used within the hydrologic, atmospheric, and earth surface dynamics modeling communities. While these efforts have been advancing, it has become clear that the water resources modeling community in particular, and arguably the larger earth science modeling community as well, faces a challenge of fully and precisely defining the metadata for model components. The lack of a unified framework for model component metadata limits interoperability between modeling communities and the reuse of models across modeling frameworks due to ambiguity about the model and its capabilities. To address this need, we propose an ontology for water resources model components that describes core concepts and relationships using the Web Ontology Language (OWL). The ontology that we present, which is termed the Water Resources Component (WRC) ontology, is meant to serve as a starting point that can be refined over time through engagement by the larger community until a robust knowledge framework for water resource model components is achieved. This paper presents the methodology used to arrive at the WRC ontology, the WRC ontology itself, and examples of how the ontology can aid in component-based water resources modeling by (i) assisting in identifying relevant models, (ii) encouraging proper model coupling, and (iii) facilitating interoperability across earth science modeling frameworks.

  7. Novel forecasting approaches using combination of machine learning and statistical models for flood susceptibility mapping.

    PubMed

    Shafizadeh-Moghadam, Hossein; Valavi, Roozbeh; Shahabi, Himan; Chapi, Kamran; Shirzadi, Ataollah

    2018-07-01

    In this research, eight individual machine learning and statistical models are implemented and compared, and based on their results, seven ensemble models for flood susceptibility assessment are introduced. The individual models included artificial neural networks, classification and regression trees, flexible discriminant analysis, generalized linear model, generalized additive model, boosted regression trees, multivariate adaptive regression splines, and maximum entropy, and the ensemble models were Ensemble Model committee averaging (EMca), Ensemble Model confidence interval Inferior (EMciInf), Ensemble Model confidence interval Superior (EMciSup), Ensemble Model to estimate the coefficient of variation (EMcv), Ensemble Model to estimate the mean (EMmean), Ensemble Model to estimate the median (EMmedian), and Ensemble Model based on weighted mean (EMwmean). The data set covered 201 flood events in the Haraz watershed (Mazandaran province in Iran) and 10,000 randomly selected non-occurrence points. Among the individual models, the Area Under the Receiver Operating Characteristic (AUROC), which showed the highest value, belonged to boosted regression trees (0.975) and the lowest value was recorded for generalized linear model (0.642). On the other hand, the proposed EMmedian resulted in the highest accuracy (0.976) among all models. In spite of the outstanding performance of some models, nevertheless, variability among the prediction of individual models was considerable. Therefore, to reduce uncertainty, creating more generalizable, more stable, and less sensitive models, ensemble forecasting approaches and in particular the EMmedian is recommended for flood susceptibility assessment. Copyright © 2018 Elsevier Ltd. All rights reserved.
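
    Editor's note: the EMmedian ensemble highlighted above is conceptually simple: take the per-location median of the individual models' susceptibility scores and evaluate it with AUROC. The Python sketch below uses synthetic scores for four hypothetical models rather than the eight calibrated models of the study.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    # hypothetical susceptibility scores from individual models (rows: locations)
    y_true = rng.binomial(1, 0.3, 500)                      # 1 = flood occurrence
    base = y_true * 0.55 + 0.2
    preds = np.column_stack([np.clip(base + rng.normal(0, s, 500), 0, 1)
                             for s in (0.15, 0.20, 0.25, 0.30)])  # 4 individual models

    em_median = np.median(preds, axis=1)                    # EMmedian-style ensemble

    for i in range(preds.shape[1]):
        print(f"model {i}: AUROC = {roc_auc_score(y_true, preds[:, i]):.3f}")
    print(f"median ensemble: AUROC = {roc_auc_score(y_true, em_median):.3f}")
    ```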

  8. Exploring Several Methods of Groundwater Model Selection

    NASA Astrophysics Data System (ADS)

    Samani, Saeideh; Ye, Ming; Asghari Moghaddam, Asghar

    2017-04-01

    Selecting reliable models for simulating groundwater flow and solute transport is essential to groundwater resources management and protection. This work is to explore several model selection methods for avoiding over-complex and/or over-parameterized groundwater models. We consider six groundwater flow models with different numbers (6, 10, 10, 13, 13 and 15) of model parameters. These models represent alternative geological interpretations, recharge estimates, and boundary conditions at a study site in Iran. The models were developed with Model Muse, and calibrated against observations of hydraulic head using UCODE. Model selection was conducted by using the following four approaches: (1) Rank the models using their root mean square error (RMSE) obtained after UCODE-based model calibration, (2) Calculate model probability using GLUE method, (3) Evaluate model probability using model selection criteria (AIC, AICc, BIC, and KIC), and (4) Evaluate model weights using the Fuzzy Multi-Criteria-Decision-Making (MCDM) approach. MCDM is based on the fuzzy analytical hierarchy process (AHP) and fuzzy technique for order performance, which is to identify the ideal solution by a gradual expansion from the local to the global scale of model parameters. The KIC and MCDM methods are superior to other methods, as they consider not only the fit between observed and simulated data and the number of parameter, but also uncertainty in model parameters. Considering these factors can prevent from occurring over-complexity and over-parameterization, when selecting the appropriate groundwater flow models. These methods selected, as the best model, one with average complexity (10 parameters) and the best parameter estimation (model 3).

  9. A comparative research of different ensemble surrogate models based on set pair analysis for the DNAPL-contaminated aquifer remediation strategy optimization.

    PubMed

    Hou, Zeyu; Lu, Wenxi; Xue, Haibo; Lin, Jin

    2017-08-01

    Surrogate-based simulation-optimization technique is an effective approach for optimizing the surfactant enhanced aquifer remediation (SEAR) strategy for clearing DNAPLs. The performance of the surrogate model, which is used to replace the simulation model for the aim of reducing computation burden, is the key of corresponding researches. However, previous researches are generally based on a stand-alone surrogate model, and rarely make efforts to improve the approximation accuracy of the surrogate model to the simulation model sufficiently by combining various methods. In this regard, we present set pair analysis (SPA) as a new method to build ensemble surrogate (ES) model, and conducted a comparative research to select a better ES modeling pattern for the SEAR strategy optimization problems. Surrogate models were developed using radial basis function artificial neural network (RBFANN), support vector regression (SVR), and Kriging. One ES model is assembling RBFANN model, SVR model, and Kriging model using set pair weights according their performance, and the other is assembling several Kriging (the best surrogate modeling method of three) models built with different training sample datasets. Finally, an optimization model, in which the ES model was embedded, was established to obtain the optimal remediation strategy. The results showed the residuals of the outputs between the best ES model and simulation model for 100 testing samples were lower than 1.5%. Using an ES model instead of the simulation model was critical for considerably reducing the computation time of simulation-optimization process and maintaining high computation accuracy simultaneously. Copyright © 2017 Elsevier B.V. All rights reserved.
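
    Editor's note: the essence of an ensemble surrogate is a performance-weighted combination of stand-alone surrogate outputs. The sketch below uses hypothetical validation predictions and simple inverse-RMSE weights as a stand-in for the set pair analysis weights developed in the paper.

    ```python
    import numpy as np

    # hypothetical outputs of three stand-alone surrogates on a validation set,
    # plus the simulation-model "truth" they are meant to approximate
    y_sim  = np.array([0.82, 0.64, 0.91, 0.55, 0.73])
    y_rbf  = np.array([0.80, 0.66, 0.93, 0.52, 0.70])
    y_svr  = np.array([0.85, 0.61, 0.88, 0.58, 0.75])
    y_krig = np.array([0.81, 0.65, 0.90, 0.56, 0.74])

    surrogates = np.vstack([y_rbf, y_svr, y_krig])
    rmse = np.sqrt(((surrogates - y_sim) ** 2).mean(axis=1))

    # performance-based weights (inverse RMSE, normalised) -- a stand-in for
    # the set-pair-analysis weights used in the paper
    w = (1.0 / rmse) / (1.0 / rmse).sum()
    y_ensemble = w @ surrogates

    print("weights:", np.round(w, 3))
    print("ensemble RMSE:", np.sqrt(((y_ensemble - y_sim) ** 2).mean()))
    ```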

  10. Models Archive and ModelWeb at NSSDC

    NASA Astrophysics Data System (ADS)

    Bilitza, D.; Papitashvili, N.; King, J. H.

    2002-05-01

    In addition to its large data holdings, NASA's National Space Science Data Center (NSSDC) also maintains an archive of space physics models for public use (ftp://nssdcftp.gsfc.nasa.gov/models/). The more than 60 model entries cover a wide range of parameters from the atmosphere, to the ionosphere, to the magnetosphere, to the heliosphere. The models are primarily empirical models developed by the respective model authors based on long data records from ground and space experiments. An online model catalog (http://nssdc.gsfc.nasa.gov/space/model/) provides information about these and other models and links to the model software if available. We will briefly review the existing model holdings and highlight some of their uses and users. In response to a growing need by the user community, NSSDC began to develop web interfaces for the most frequently requested models. These interfaces enable users to compute and plot model parameters online for the specific conditions that they are interested in. Currently included in the Modelweb system (http://nssdc.gsfc.nasa.gov/space/model/) are the following models: the International Reference Ionosphere (IRI) model, the Mass Spectrometer Incoherent Scatter (MSIS) E90 model, the International Geomagnetic Reference Field (IGRF) and the AP/AE-8 models for the radiation belt electrons and protons. User accesses to both systems have been steadily increasing over recent years, with occasional spikes prior to large scientific meetings. The current monthly rate is between 5,000 and 10,000 accesses for either system; in February 2002, 13,872 accesses were recorded to the Modelweb and 7,092 accesses to the models archive.

  11. Towards methodical modelling: Differences between the structure and output dynamics of multiple conceptual models

    NASA Astrophysics Data System (ADS)

    Knoben, Wouter; Woods, Ross; Freer, Jim

    2016-04-01

    Conceptual hydrologic models consist of a certain arrangement of spatial and temporal dynamics consisting of stores, fluxes and transformation functions, depending on the modeller's choices and intended use. They have the advantages of being computationally efficient, being relatively easy model structures to reconfigure and having relatively low input data demands. This makes them well-suited for large-scale and large-sample hydrology, where appropriately representing the dominant hydrologic functions of a catchment is a main concern. Given these requirements, the number of parameters in the model cannot be too high, to avoid equifinality and identifiability issues. This limits the number and level of complexity of dominant hydrologic processes the model can represent. Specific purposes and places thus require a specific model and this has led to an abundance of conceptual hydrologic models. No structured overview of these models exists and there is no clear method to select appropriate model structures for different catchments. This study is a first step towards creating an overview of the elements that make up conceptual models, which may later assist a modeller in finding an appropriate model structure for a given catchment. To this end, this study brings together over 30 past and present conceptual models. The reviewed model structures are simply different configurations of three basic model elements (stores, fluxes and transformation functions), depending on the hydrologic processes the models are intended to represent. Differences also exist in the inner workings of the stores, fluxes and transformations, i.e. the mathematical formulations that describe each model element's intended behaviour. We investigate the hypothesis that different model structures can produce similar behavioural simulations. This can clarify the overview of model elements by grouping elements which are similar, which can improve model structure selection.
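
    Editor's note: the three basic elements identified above (stores, fluxes, and transformation functions) can be illustrated with a one-store conceptual model. The Python sketch below uses a single linear store with synthetic forcing; parameter values are arbitrary.

    ```python
    import numpy as np

    def linear_bucket(precip, pet, k=0.1, s0=10.0):
        """Minimal conceptual model: one store S and two fluxes.
        dS/dt = P - E - Q, with actual evaporation E = min(PET, S)
        and a linear outflow transformation Q = k * S."""
        s = s0
        q_out = []
        for p, e_pot in zip(precip, pet):
            e = min(e_pot, s)          # evaporation limited by storage
            q = k * s                  # linear store: flux proportional to state
            s = max(s + p - e - q, 0.0)
            q_out.append(q)
        return np.array(q_out)

    rng = np.random.default_rng(3)
    precip = rng.gamma(0.6, 4.0, 30)   # 30 days of synthetic rainfall (mm/day)
    pet = np.full(30, 2.5)             # constant potential evaporation (mm/day)
    print(np.round(linear_bucket(precip, pet), 2))
    ```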

  12. Synthesizing models useful for ecohydrology and ecohydraulic approaches: An emphasis on integrating models to address complex research questions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brewer, Shannon K.; Worthington, Thomas A.; Mollenhauer, Robert

    Ecohydrology combines empiricism, data analytics, and the integration of models to characterize linkages between ecological and hydrological processes. A challenge for practitioners is determining which models best generalizes heterogeneity in hydrological behaviour, including water fluxes across spatial and temporal scales, integrating environmental and socio–economic activities to determine best watershed management practices and data requirements. We conducted a literature review and synthesis of hydrologic, hydraulic, water quality, and ecological models designed for solving interdisciplinary questions. We reviewed 1,275 papers and identified 178 models that have the capacity to answer an array of research questions about ecohydrology or ecohydraulics. Of these models, 43 were commonly applied due to their versatility, accessibility, user–friendliness, and excellent user–support. Forty–one of 43 reviewed models were linked to at least 1 other model especially: Water Quality Analysis Simulation Program (linked to 21 other models), Soil and Water Assessment Tool (19), and Hydrologic Engineering Center's River Analysis System (15). However, model integration was still relatively infrequent. There was substantial variation in model applications, possibly an artefact of the regional focus of research questions, simplicity of use, quality of user–support efforts, or a limited understanding of model applicability. Simply increasing the interoperability of model platforms, transformation of models to user–friendly forms, increasing user–support, defining the reliability and risk associated with model results, and increasing awareness of model applicability may promote increased use of models across subdisciplines. Furthermore, the current availability of models allows an array of interdisciplinary questions to be addressed, and model choice relates to several factors including research objective, model complexity, ability to link to other models, and interface choice.

  13. Synthesizing models useful for ecohydrology and ecohydraulic approaches: An emphasis on integrating models to address complex research questions

    USGS Publications Warehouse

    Brewer, Shannon K.; Worthington, Thomas; Mollenhauer, Robert; Stewart, David; McManamay, Ryan; Guertault, Lucie; Moore, Desiree

    2018-01-01

    Ecohydrology combines empiricism, data analytics, and the integration of models to characterize linkages between ecological and hydrological processes. A challenge for practitioners is determining which models best generalizes heterogeneity in hydrological behaviour, including water fluxes across spatial and temporal scales, integrating environmental and socio‐economic activities to determine best watershed management practices and data requirements. We conducted a literature review and synthesis of hydrologic, hydraulic, water quality, and ecological models designed for solving interdisciplinary questions. We reviewed 1,275 papers and identified 178 models that have the capacity to answer an array of research questions about ecohydrology or ecohydraulics. Of these models, 43 were commonly applied due to their versatility, accessibility, user‐friendliness, and excellent user‐support. Forty‐one of 43 reviewed models were linked to at least 1 other model especially: Water Quality Analysis Simulation Program (linked to 21 other models), Soil and Water Assessment Tool (19), and Hydrologic Engineering Center's River Analysis System (15). However, model integration was still relatively infrequent. There was substantial variation in model applications, possibly an artefact of the regional focus of research questions, simplicity of use, quality of user‐support efforts, or a limited understanding of model applicability. Simply increasing the interoperability of model platforms, transformation of models to user‐friendly forms, increasing user‐support, defining the reliability and risk associated with model results, and increasing awareness of model applicability may promote increased use of models across subdisciplines. Nonetheless, the current availability of models allows an array of interdisciplinary questions to be addressed, and model choice relates to several factors including research objective, model complexity, ability to link to other models, and interface choice.

  14. How does a three-dimensional continuum muscle model affect the kinematics and muscle strains of a finite element neck model compared to a discrete muscle model in rear-end, frontal, and lateral impacts.

    PubMed

    Hedenstierna, Sofia; Halldin, Peter

    2008-04-15

    A finite element (FE) model of the human neck with incorporated continuum or discrete muscles was used to simulate experimental impacts in rear, frontal, and lateral directions. The aim of this study was to determine how a continuum muscle model influences the impact behavior of a FE human neck model compared with a discrete muscle model. Most FE neck models used for impact analysis today include a spring element musculature and are limited to discrete geometries and nodal output results. A solid-element muscle model was thought to improve the behavior of the model by adding properties such as tissue inertia and compressive stiffness and by improving the geometry. It would also predict the strain distribution within the continuum elements. A passive continuum muscle model with nonlinear viscoelastic materials was incorporated into the KTH neck model together with active spring muscles and used in impact simulations. The resulting head and vertebral kinematics was compared with the results from a discrete muscle model as well as volunteer corridors. The muscle strain prediction was compared between the 2 muscle models. The head and vertebral kinematics were within the volunteer corridors for both models when activated. The continuum model behaved more stiffly than the discrete model and needed less active force to fit the experimental results. The largest difference was seen in the rear impact. The strain predicted by the continuum model was lower than for the discrete model. The continuum muscle model stiffened the response of the KTH neck model compared with a discrete model, and the strain prediction in the muscles was improved.

  15. Synthesizing models useful for ecohydrology and ecohydraulic approaches: An emphasis on integrating models to address complex research questions

    DOE PAGES

    Brewer, Shannon K.; Worthington, Thomas A.; Mollenhauer, Robert; ...

    2018-04-06

    Ecohydrology combines empiricism, data analytics, and the integration of models to characterize linkages between ecological and hydrological processes. A challenge for practitioners is determining which models best generalizes heterogeneity in hydrological behaviour, including water fluxes across spatial and temporal scales, integrating environmental and socio–economic activities to determine best watershed management practices and data requirements. We conducted a literature review and synthesis of hydrologic, hydraulic, water quality, and ecological models designed for solving interdisciplinary questions. We reviewed 1,275 papers and identified 178 models that have the capacity to answer an array of research questions about ecohydrology or ecohydraulics. Of these models, 43 were commonly applied due to their versatility, accessibility, user–friendliness, and excellent user–support. Forty–one of 43 reviewed models were linked to at least 1 other model especially: Water Quality Analysis Simulation Program (linked to 21 other models), Soil and Water Assessment Tool (19), and Hydrologic Engineering Center's River Analysis System (15). However, model integration was still relatively infrequent. There was substantial variation in model applications, possibly an artefact of the regional focus of research questions, simplicity of use, quality of user–support efforts, or a limited understanding of model applicability. Simply increasing the interoperability of model platforms, transformation of models to user–friendly forms, increasing user–support, defining the reliability and risk associated with model results, and increasing awareness of model applicability may promote increased use of models across subdisciplines. Furthermore, the current availability of models allows an array of interdisciplinary questions to be addressed, and model choice relates to several factors including research objective, model complexity, ability to link to other models, and interface choice.

  16. Designing and evaluating the MULTICOM protein local and global model quality prediction methods in the CASP10 experiment

    PubMed Central

    2014-01-01

    Background Protein model quality assessment is an essential component of generating and using protein structural models. During the Tenth Critical Assessment of Techniques for Protein Structure Prediction (CASP10), we developed and tested four automated methods (MULTICOM-REFINE, MULTICOM-CLUSTER, MULTICOM-NOVEL, and MULTICOM-CONSTRUCT) that predicted both local and global quality of protein structural models. Results MULTICOM-REFINE was a clustering approach that used the average pairwise structural similarity between models to measure the global quality and the average Euclidean distance between a model and several top ranked models to measure the local quality. MULTICOM-CLUSTER and MULTICOM-NOVEL were two new support vector machine-based methods of predicting both the local and global quality of a single protein model. MULTICOM-CONSTRUCT was a new weighted pairwise model comparison (clustering) method that used the weighted average similarity between models in a pool to measure the global model quality. Our experiments showed that the pairwise model assessment methods worked better when a large portion of models in the pool were of good quality, whereas single-model quality assessment methods performed better on some hard targets when only a small portion of models in the pool were of reasonable quality. Conclusions Since digging out a few good models from a large pool of low-quality models is a major challenge in protein structure prediction, single model quality assessment methods appear to be poised to make important contributions to protein structure modeling. The other interesting finding was that single-model quality assessment scores could be used to weight the models by the consensus pairwise model comparison method to improve its accuracy. PMID:24731387

  17. Designing and evaluating the MULTICOM protein local and global model quality prediction methods in the CASP10 experiment.

    PubMed

    Cao, Renzhi; Wang, Zheng; Cheng, Jianlin

    2014-04-15

    Protein model quality assessment is an essential component of generating and using protein structural models. During the Tenth Critical Assessment of Techniques for Protein Structure Prediction (CASP10), we developed and tested four automated methods (MULTICOM-REFINE, MULTICOM-CLUSTER, MULTICOM-NOVEL, and MULTICOM-CONSTRUCT) that predicted both local and global quality of protein structural models. MULTICOM-REFINE was a clustering approach that used the average pairwise structural similarity between models to measure the global quality and the average Euclidean distance between a model and several top ranked models to measure the local quality. MULTICOM-CLUSTER and MULTICOM-NOVEL were two new support vector machine-based methods of predicting both the local and global quality of a single protein model. MULTICOM-CONSTRUCT was a new weighted pairwise model comparison (clustering) method that used the weighted average similarity between models in a pool to measure the global model quality. Our experiments showed that the pairwise model assessment methods worked better when a large portion of models in the pool were of good quality, whereas single-model quality assessment methods performed better on some hard targets when only a small portion of models in the pool were of reasonable quality. Since digging out a few good models from a large pool of low-quality models is a major challenge in protein structure prediction, single model quality assessment methods appear to be poised to make important contributions to protein structure modeling. The other interesting finding was that single-model quality assessment scores could be used to weight the models by the consensus pairwise model comparison method to improve its accuracy.
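
    Editor's note: the clustering (pairwise) approach to global quality described above reduces to averaging each model's similarity to all other models in the pool. The Python sketch below applies this idea to a hypothetical pairwise similarity matrix; it is not the MULTICOM code.

    ```python
    import numpy as np

    # hypothetical pairwise similarity matrix for 5 candidate structural models,
    # e.g. GDT-TS-like scores in [0, 1]; the diagonal is self-similarity
    sim = np.array([
        [1.00, 0.72, 0.68, 0.30, 0.70],
        [0.72, 1.00, 0.75, 0.28, 0.74],
        [0.68, 0.75, 1.00, 0.25, 0.69],
        [0.30, 0.28, 0.25, 1.00, 0.27],
        [0.70, 0.74, 0.69, 0.27, 1.00],
    ])

    n = sim.shape[0]
    # clustering-style global quality: mean similarity of each model to all others
    global_quality = (sim.sum(axis=1) - np.diag(sim)) / (n - 1)

    ranking = np.argsort(global_quality)[::-1]
    for i in ranking:
        print(f"model {i}: predicted global quality = {global_quality[i]:.3f}")
    ```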

  18. Replicating Health Economic Models: Firm Foundations or a House of Cards?

    PubMed

    Bermejo, Inigo; Tappenden, Paul; Youn, Ji-Hee

    2017-11-01

    Health economic evaluation is a framework for the comparative analysis of the incremental health gains and costs associated with competing decision alternatives. The process of developing health economic models is usually complex, financially expensive and time-consuming. For these reasons, model development is sometimes based on previous model-based analyses; this endeavour is usually referred to as model replication. Such model replication activity may involve the comprehensive reproduction of an existing model or 'borrowing' all or part of a previously developed model structure. Generally speaking, the replication of an existing model may require substantially less effort than developing a new de novo model by bypassing, or undertaking in only a perfunctory manner, certain aspects of model development such as the development of a complete conceptual model and/or comprehensive literature searching for model parameters. A further motivation for model replication may be to draw on the credibility or prestige of previous analyses that have been published and/or used to inform decision making. The acceptability and appropriateness of replicating models depends on the decision-making context: there exists a trade-off between the 'savings' afforded by model replication and the potential 'costs' associated with reduced model credibility due to the omission of certain stages of model development. This paper provides an overview of the different levels of, and motivations for, replicating health economic models, and discusses the advantages, disadvantages and caveats associated with this type of modelling activity. Irrespective of whether replicated models should be considered appropriate or not, complete replicability is generally accepted as a desirable property of health economic models, as reflected in critical appraisal checklists and good practice guidelines. To this end, the feasibility of comprehensive model replication is explored empirically across a small number of recent case studies. Recommendations are put forward for improving reporting standards to enhance comprehensive model replicability.

  19. Reducing hydrologic model uncertainty in monthly streamflow predictions using multimodel combination

    NASA Astrophysics Data System (ADS)

    Li, Weihua; Sankarasubramanian, A.

    2012-12-01

    Model errors are inevitable in any prediction exercise. One approach that is currently gaining attention for reducing model errors is combining multiple models to develop improved predictions. The rationale behind this approach primarily lies in the premise that optimal weights could be derived for each model so that the resulting multimodel predictions are improved. A new dynamic approach (MM-1) to combine multiple hydrological models by evaluating their performance/skill contingent on the predictor state is proposed. We combine two hydrological models, the "abcd" model and the variable infiltration capacity (VIC) model, to develop multimodel streamflow predictions. To quantify precisely under what conditions the multimodel combination results in improved predictions, we compare multimodel scheme MM-1 with the optimal model combination scheme (MM-O) by employing them in predicting the streamflow generated from a known hydrologic model (abcd model or VIC model) with heteroscedastic error variance as well as from a hydrologic model that exhibits a different structure than that of the candidate models (i.e., "abcd" model or VIC model). Results from the study show that streamflow estimated from single models performed better than multimodels under almost no measurement error. However, under increased measurement errors and model structural misspecification, both multimodel schemes (MM-1 and MM-O) consistently performed better than the single model prediction. Overall, MM-1 performs better than MM-O in predicting the monthly flow values as well as in predicting extreme monthly flows. Comparison of the weights obtained from each candidate model reveals that as measurement errors increase, MM-1 assigns weights equally to all the models, whereas MM-O always assigns higher weights to the best-performing candidate model over the calibration period. Applying the multimodel algorithms for predicting streamflows over four different sites revealed that MM-1 performs better than all single models and the optimal model combination scheme, MM-O, in predicting the monthly flows as well as the flows during wetter months.
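
    Editor's note: a static, MM-O-style combination amounts to finding fixed weights that best reproduce observed flows over a calibration period; MM-1's state-contingent weighting is not reproduced here. The Python sketch below uses synthetic flows and least-squares weights for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # synthetic monthly flows: "observations" plus two imperfect model predictions
    q_obs = rng.gamma(2.0, 50.0, 120)                    # 10 years of monthly flow
    q_m1 = q_obs + rng.normal(0, 20, 120)                # e.g., an "abcd"-type model
    q_m2 = 0.8 * q_obs + rng.normal(0, 10, 120) + 15.0   # e.g., a VIC-type model

    # static optimal combination: least-squares weights over the calibration period
    X = np.column_stack([q_m1, q_m2])
    w, *_ = np.linalg.lstsq(X, q_obs, rcond=None)
    q_combo = X @ w

    def rmse(a, b):
        return np.sqrt(np.mean((a - b) ** 2))

    print("weights:", np.round(w, 3))
    print("RMSE model 1:", round(rmse(q_m1, q_obs), 1))
    print("RMSE model 2:", round(rmse(q_m2, q_obs), 1))
    print("RMSE combined:", round(rmse(q_combo, q_obs), 1))
    ```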

  20. Comparing the cognitive differences resulting from modeling instruction: Using computer microworld and physical object instruction to model real world problems

    NASA Astrophysics Data System (ADS)

    Oursland, Mark David

    This study compared the modeling achievement of students receiving mathematical modeling instruction using the computer microworld, Interactive Physics, and students receiving instruction using physical objects. Modeling instruction included activities where students applied the (a) linear model to a variety of situations, (b) linear model to two-rate situations with a constant rate, (c) quadratic model to familiar geometric figures. Both quantitative and qualitative methods were used to analyze achievement differences between students (a) receiving different methods of modeling instruction, (b) with different levels of beginning modeling ability, or (c) with different levels of computer literacy. Student achievement was analyzed quantitatively through a three-factor analysis of variance where modeling instruction, beginning modeling ability, and computer literacy were used as the three independent factors. The SOLO (Structure of the Observed Learning Outcome) assessment framework was used to design written modeling assessment instruments to measure the students' modeling achievement. The same three independent factors were used to collect and analyze the interviews and observations of student behaviors. Both methods of modeling instruction used the data analysis approach to mathematical modeling. The instructional lessons presented problem situations where students were asked to collect data, analyze the data, write a symbolic mathematical equation, and use equation to solve the problem. The researcher recommends the following practice for modeling instruction based on the conclusions of this study. A variety of activities with a common structure are needed to make explicit the modeling process of applying a standard mathematical model. The modeling process is influenced strongly by prior knowledge of the problem context and previous modeling experiences. The conclusions of this study imply that knowledge of the properties about squares improved the students' ability to model a geometric problem more than instruction in data analysis modeling. The uses of computer microworlds such as Interactive Physics in conjunction with cooperative groups are a viable method of modeling instruction.

  1. A physical data model for fields and agents

    NASA Astrophysics Data System (ADS)

    de Jong, Kor; de Bakker, Merijn; Karssenberg, Derek

    2016-04-01

    Two approaches exist in simulation modeling: agent-based and field-based modeling. In agent-based (or individual-based) simulation modeling, the entities representing the system's state are represented by objects, which are bounded in space and time. Individual objects, like an animal, a house, or a more abstract entity like a country's economy, have properties representing their state. In an agent-based model this state is manipulated. In field-based modeling, the entities representing the system's state are represented by fields. Fields capture the state of a continuous property within a spatial extent, examples of which are elevation, atmospheric pressure, and water flow velocity. With respect to the technology used to create these models, the domains of agent-based and field-based modeling have often been separate worlds. In environmental modeling, widely used logical data models include feature data models for point, line and polygon objects, and the raster data model for fields. Simulation models are often either agent-based or field-based, even though the modeled system might contain both entities that are better represented by individuals and entities that are better represented by fields. We think that the reason for this dichotomy in kinds of models might be that the traditional object and field data models underlying those models are relatively low level. We have developed a higher level conceptual data model for representing both non-spatial and spatial objects, and spatial fields (De Bakker et al. 2016). Based on this conceptual data model we designed a logical and physical data model for representing many kinds of data, including the kinds used in earth system modeling (e.g. hydrological and ecological models). The goal of this work is to be able to create high level code and tools for the creation of models in which entities are representable by both objects and fields. Our conceptual data model is capable of representing the traditional feature data models and the raster data model, among many other data models. Our physical data model is capable of storing a first set of kinds of data, like omnipresent scalars, mobile spatio-temporal points and property values, and spatio-temporal rasters. With our poster we will provide an overview of the physical data model expressed in HDF5 and show examples of how it can be used to capture both object- and field-based information. References De Bakker, M, K. de Jong, D. Karssenberg. 2016. A conceptual data model and language for fields and agents. European Geosciences Union, EGU General Assembly, 2016, Vienna.
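
    Editor's note: to make the idea of one physical store holding both fields and agents concrete, the Python/h5py sketch below writes a raster field and a set of agent properties into one HDF5 file. The group names, attributes, and array shapes are illustrative assumptions, not the authors' actual schema.

    ```python
    import numpy as np
    import h5py

    rng = np.random.default_rng(42)

    with h5py.File("fields_and_agents.h5", "w") as f:
        # a field: a 2-D raster of elevation values with grid metadata
        field = f.create_group("fields/elevation")
        dem = field.create_dataset("values", data=rng.normal(100, 10, (200, 200)))
        dem.attrs["cell_size_m"] = 25.0
        dem.attrs["units"] = "m"

        # agents: per-object properties, plus one time-varying property
        agents = f.create_group("agents/animals")
        agents.create_dataset("id", data=np.arange(50))
        agents.create_dataset("xy", data=rng.uniform(0, 5000, (50, 2)))
        agents.create_dataset("weight_kg", data=rng.normal(60, 5, (50, 10)))  # 10 time steps

    with h5py.File("fields_and_agents.h5", "r") as f:
        print(f["fields/elevation/values"].shape, dict(f["fields/elevation/values"].attrs))
        print(f["agents/animals/weight_kg"][:3, 0])
    ```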

  2. Students' Models of Curve Fitting: A Models and Modeling Perspective

    ERIC Educational Resources Information Center

    Gupta, Shweta

    2010-01-01

    The Models and Modeling Perspectives (MMP) has evolved out of research that began 26 years ago. MMP researchers use Model Eliciting Activities (MEAs) to elicit students' mental models. In this study MMP was used as the conceptual framework to investigate the nature of students' models of curve fitting in a problem-solving environment consisting of…

  3. Modeling Information Accumulation in Psychological Tests Using Item Response Times

    ERIC Educational Resources Information Center

    Ranger, Jochen; Kuhn, Jörg-Tobias

    2015-01-01

    In this article, a latent trait model is proposed for the response times in psychological tests. The latent trait model is based on the linear transformation model and subsumes popular models from survival analysis, like the proportional hazards model and the proportional odds model. Core of the model is the assumption that an unspecified monotone…

  4. Climate and atmospheric modeling studies

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The climate and atmosphere modeling research programs have concentrated on the development of appropriate atmospheric and upper ocean models, and preliminary applications of these models. Principal models are a one-dimensional radiative-convective model, a three-dimensional global model, and an upper ocean model. Principal applications were the study of the impact of CO2, aerosols, and the solar 'constant' on climate.

  5. Models in Science Education: Applications of Models in Learning and Teaching Science

    ERIC Educational Resources Information Center

    Ornek, Funda

    2008-01-01

    In this paper, I discuss different types of models in science education and applications of them in learning and teaching science, in particular physics. Based on the literature, I categorize models as conceptual and mental models according to their characteristics. In addition to these models, there is another model called "physics model" by the…

  6. Computer-Aided Modeling and Analysis of Power Processing Systems (CAMAPPS). Phase 1: Users handbook

    NASA Technical Reports Server (NTRS)

    Kim, S.; Lee, J.; Cho, B. H.; Lee, F. C.

    1986-01-01

    The EASY5 macro component models developed for the spacecraft power system simulation are described. A brief explanation of how to use the macro components with the EASY5 Standard Components to build a specific system is given through an example. The macro components are ordered according to the following functional groups: converter power stage models, compensator models, current-feedback models, constant frequency control models, load models, solar array models, and shunt regulator models. Major equations, a circuit model, and a program listing are provided for each macro component.

  7. Vector models and generalized SYK models

    DOE PAGES

    Peng, Cheng

    2017-05-23

    Here, we consider the relation between SYK-like models and vector models by studying a toy model where a tensor field is coupled with a vector field. By integrating out the tensor field, the toy model reduces to the Gross-Neveu model in 1 dimension. On the other hand, a certain perturbation can be turned on and the toy model flows to an SYK-like model at low energy. Furthermore, a chaotic-nonchaotic phase transition occurs as the sign of the perturbation is altered. We further study similar models that possess chaos and enhanced reparameterization symmetries.

  8. Validation of the PVSyst Performance Model for the Concentrix CPV Technology

    NASA Astrophysics Data System (ADS)

    Gerstmaier, Tobias; Gomez, María; Gombert, Andreas; Mermoud, André; Lejeune, Thibault

    2011-12-01

    The accuracy of the two-stage PVSyst model for the Concentrix CPV Technology is determined by comparing modeled to measured values. For both stages, i) the module model and ii) the power plant model, the underlying approaches are explained and methods for obtaining the model parameters are presented. The performance of both models is quantified using 19 months of outdoor measurements for the module model and 9 months of measurements at four different sites for the power plant model. Results are presented by giving statistical quantities for the model accuracy.

  9. Comparative Protein Structure Modeling Using MODELLER

    PubMed Central

    Webb, Benjamin; Sali, Andrej

    2016-01-01

    Comparative protein structure modeling predicts the three-dimensional structure of a given protein sequence (target) based primarily on its alignment to one or more proteins of known structure (templates). The prediction process consists of fold assignment, target-template alignment, model building, and model evaluation. This unit describes how to calculate comparative models using the program MODELLER and how to use the ModBase database of such models, and discusses all four steps of comparative modeling, frequently observed errors, and some applications. Modeling lactate dehydrogenase from Trichomonas vaginalis (TvLDH) is described as an example. The download and installation of the MODELLER software is also described. PMID:27322406
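
    As a companion to this unit's description of the workflow, the short script below sketches basic automodel usage in the style of the MODELLER tutorial's TvLDH example mentioned above. The alignment file name and the template code (1bdmA) are placeholders in that tutorial's style, MODELLER must be installed and licensed separately, and the many options beyond these defaults are described in the unit itself.

      # Minimal comparative-modeling sketch using MODELLER's automodel class, in the style
      # of the TvLDH tutorial example; file names and the template code are placeholders.
      from modeller import *
      from modeller.automodel import *

      env = environ()
      env.io.atom_files_directory = ['.']          # directory containing the template PDB file

      a = automodel(env,
                    alnfile='TvLDH-1bdmA.ali',     # target-template alignment (placeholder name)
                    knowns='1bdmA',                # known template structure (placeholder code)
                    sequence='TvLDH',              # target sequence identifier
                    assess_methods=(assess.DOPE,)) # score the resulting models
      a.starting_model = 1
      a.ending_model = 5                           # build five candidate models
      a.make()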

  10. A comparative study of turbulence models in predicting hypersonic inlet flows

    NASA Technical Reports Server (NTRS)

    Kapoor, Kamlesh

    1993-01-01

    A computational study has been conducted to evaluate the performance of various turbulence models. The NASA P8 inlet, which represents the cruise condition of a typical hypersonic air-breathing vehicle, was selected as a test case for the study; the PARC2D code, which solves the full two-dimensional Reynolds-averaged Navier-Stokes equations, was used. Results are presented for a total of six versions of zero- and two-equation turbulence models. Zero-equation models tested are the Baldwin-Lomax model, the Thomas model, and a combination of the two. Two-equation models tested are low-Reynolds number models (the Chien model and the Speziale model) and a high-Reynolds number model (the Launder and Spalding model).

  11. The evolution of process-based hydrologic models: historical challenges and the collective quest for physical realism

    NASA Astrophysics Data System (ADS)

    Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R. N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.

    2017-07-01

    The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.

  12. The evolution of process-based hydrologic models: historical challenges and the collective quest for physical realism

    NASA Astrophysics Data System (ADS)

    Clark, M. P.; Nijssen, B.; Wood, A.; Mizukami, N.; Newman, A. J.

    2017-12-01

    The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.

  13. Trapped Radiation Model Uncertainties: Model-Data and Model-Model Comparisons

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    2000-01-01

    The standard AP8 and AE8 models for predicting trapped proton and electron environments have been compared with several sets of flight data to evaluate model uncertainties. Model comparisons are made with flux and dose measurements made on various U.S. low-Earth orbit satellites (APEX, CRRES, DMSP, LDEF, NOAA) and Space Shuttle flights, on Russian satellites (Photon-8, Cosmos-1887, Cosmos-2044), and on the Russian Mir Space Station. This report gives the details of the model-data comparisons; summary results in terms of empirical model uncertainty factors that can be applied for spacecraft design applications are given in a companion report. The results of model-model comparisons are also presented from standard AP8 and AE8 model predictions compared with the European Space Agency versions of AP8 and AE8 and with Russian trapped radiation models.

  14. Trapped Radiation Model Uncertainties: Model-Data and Model-Model Comparisons

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    2000-01-01

    The standard AP8 and AE8 models for predicting trapped proton and electron environments have been compared with several sets of flight data to evaluate model uncertainties. Model comparisons are made with flux and dose measurements made on various U.S. low-Earth orbit satellites (APEX, CRRES, DMSP, LDEF, NOAA) and Space Shuttle flights, on Russian satellites (Photon-8, Cosmos-1887, Cosmos-2044), and on the Russian Mir Space Station. This report gives the details of the model-data comparisons; summary results in terms of empirical model uncertainty factors that can be applied for spacecraft design applications are given in a companion report. The results of model-model comparisons are also presented from standard AP8 and AE8 model predictions compared with the European Space Agency versions of AP8 and AE8 and with Russian trapped radiation models.

  15. Analysis of terahertz dielectric properties of pork tissue

    NASA Astrophysics Data System (ADS)

    Huang, Yuqing; Xie, Qiaoling; Sun, Ping

    2017-10-01

    Since water makes up about 70% of fresh biological tissue, many scientists try to use water models to describe the dielectric properties of biological tissues. The classical water dielectric models are the Debye model, the double Debye model and the Cole-Cole model. This work aims to determine a suitable model by comparing the three models above with experimental data. These models are applied to fresh pork tissue. By means of the least-squares method, the parameters of the different models are fitted to the experimental data. Comparing the different models against the measured dielectric function, the Cole-Cole model is verified to be the best at describing the pork tissue experiments. The correction factor α of the Cole-Cole model is an important modification for biological tissues, so the Cole-Cole model should be the preferred choice for describing the dielectric properties of biological tissues in the terahertz range.
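
    For reference, the three relaxation models named above are commonly written in the following textbook forms (the parameterization used in the paper may differ in detail); the Cole-Cole expression reduces to the single Debye form when the correction factor α is zero.

      % Standard forms of the single Debye, double Debye, and Cole-Cole permittivity models
      \begin{align}
        \varepsilon_{\mathrm{Debye}}(\omega)  &= \varepsilon_\infty
            + \frac{\varepsilon_s - \varepsilon_\infty}{1 + i\omega\tau} \\
        \varepsilon_{\mathrm{2Debye}}(\omega) &= \varepsilon_\infty
            + \frac{\varepsilon_s - \varepsilon_2}{1 + i\omega\tau_1}
            + \frac{\varepsilon_2 - \varepsilon_\infty}{1 + i\omega\tau_2} \\
        \varepsilon_{\mathrm{CC}}(\omega)     &= \varepsilon_\infty
            + \frac{\varepsilon_s - \varepsilon_\infty}{1 + (i\omega\tau)^{\,1-\alpha}}
      \end{align}
      % \varepsilon_s: static permittivity; \varepsilon_\infty: high-frequency limit;
      % \tau, \tau_1, \tau_2: relaxation times; \alpha: Cole-Cole correction factor.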

  16. Dealing with dissatisfaction in mathematical modelling to integrate QFD and Kano’s model

    NASA Astrophysics Data System (ADS)

    Retno Sari Dewi, Dian; Debora, Joana; Edy Sianto, Martinus

    2017-12-01

    The purpose of the study is to implement the integration of Quality Function Deployment (QFD) and Kano's model into a mathematical model. Voice of customer data in QFD was collected using a questionnaire, and the questionnaire was developed based on Kano's model. Then operational research methodology was applied to build the objective function and constraints in the mathematical model. The relationship between the voice of customer and the engineering characteristics was modelled using a linear regression model. The output of the mathematical model is the detail of the engineering characteristics. The objective function of this model is to maximize satisfaction and minimize dissatisfaction as well. The result of this model is 62%. The major contribution of this research is to implement the existing mathematical model to integrate QFD and Kano's model in the case study of a shoe cabinet.

  17. The Real and the Mathematical in Quantum Modeling: From Principles to Models and from Models to Principles

    NASA Astrophysics Data System (ADS)

    Plotnitsky, Arkady

    2017-06-01

    The history of mathematical modeling outside physics has been dominated by the use of classical mathematical models, C-models, primarily those of a probabilistic or statistical nature. More recently, however, quantum mathematical models, Q-models, based in the mathematical formalism of quantum theory have become more prominent in psychology, economics, and decision science. The use of Q-models in these fields remains controversial, in part because it is not entirely clear whether Q-models are necessary for dealing with the phenomena in question or whether C-models would still suffice. My aim, however, is not to assess the necessity of Q-models in these fields, but instead to reflect on what the possible applicability of Q-models may tell us about the corresponding phenomena there, vis-à-vis quantum phenomena in physics. In order to do so, I shall first discuss the key reasons for the use of Q-models in physics. In particular, I shall examine the fundamental principles that led to the development of quantum mechanics. Then I shall consider a possible role of similar principles in using Q-models outside physics. Psychology, economics, and decision science borrow already available Q-models from quantum theory, rather than derive them from their own internal principles, while quantum mechanics was derived from such principles, because there was no readily available mathematical model to handle quantum phenomena, although the mathematics ultimately used in quantum mechanics did in fact exist then. I shall argue, however, that the principle perspective on mathematical modeling outside physics might help us to understand better the role of Q-models in these fields and possibly to envision new models, conceptually analogous to but mathematically different from those of quantum theory, helpful or even necessary there or in physics itself. I shall suggest one possible type of such models, singularized probabilistic, SP, models, some of which are time-dependent, TDSP-models. The necessity of using such models may change the nature of mathematical modeling in science and, thus, the nature of science, as it happened in the case of Q-models, which not only led to a revolutionary transformation of physics but also opened new possibilities for scientific thinking and mathematical modeling beyond physics.

  18. Vertically-Integrated Dual-Continuum Models for CO2 Injection in Fractured Aquifers

    NASA Astrophysics Data System (ADS)

    Tao, Y.; Guo, B.; Bandilla, K.; Celia, M. A.

    2017-12-01

    Injection of CO2 into a saline aquifer leads to a two-phase flow system, with supercritical CO2 and brine being the two fluid phases. Various modeling approaches, including fully three-dimensional (3D) models and vertical-equilibrium (VE) models, have been used to study the system. Almost all of that work has focused on unfractured formations. 3D models solve the governing equations in three dimensions and are applicable to generic geological formations. VE models assume rapid and complete buoyant segregation of the two fluid phases, resulting in vertical pressure equilibrium and allowing integration of the governing equations in the vertical dimension. This reduction in dimensionality makes VE models computationally more efficient, but the associated assumptions restrict the applicability of VE model to formations with moderate to high permeability. In this presentation, we extend the VE and 3D models for CO2 injection in fractured aquifers. This is done in the context of dual-continuum modeling, where the fractured formation is modeled as an overlap of two continuous domains, one representing the fractures and the other representing the rock matrix. Both domains are treated as porous media continua and can be modeled by either a VE or a 3D formulation. The transfer of fluid mass between rock matrix and fractures is represented by a mass transfer function connecting the two domains. We have developed a computational model that combines the VE and 3D models, where we use the VE model in the fractures, which typically have high permeability, and the 3D model in the less permeable rock matrix. A new mass transfer function is derived, which couples the VE and 3D models. The coupled VE-3D model can simulate CO2 injection and migration in fractured aquifers. Results from this model compare well with a full-3D model in which both the fractures and rock matrix are modeled with 3D models, with the hybrid VE-3D model having significantly reduced computational cost. In addition to the VE-3D model, we explore simplifications of the rock matrix domain by using sugar-cube and matchstick conceptualizations and develop VE-dual porosity and VE-matchstick models. These vertically-integrated dual-permeability and dual-porosity models provide a range of computationally efficient tools to model CO2 storage in fractured saline aquifers.

  19. ATMOSPHERIC DISPERSAL AND DEPOSITION OF TEPHRA FROM A POTENTIAL VOLCANIC ERUPTION AT YUCCA MOUNTAIN, NEVADA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. Harrington

    2004-10-25

    The purpose of this model report is to provide documentation of the conceptual and mathematical model (Ashplume) for atmospheric dispersal and subsequent deposition of ash on the land surface from a potential volcanic eruption at Yucca Mountain, Nevada. This report also documents the ash (tephra) redistribution conceptual model. These aspects of volcanism-related dose calculation are described in the context of the entire igneous disruptive events conceptual model in ''Characterize Framework for Igneous Activity'' (BSC 2004 [DIRS 169989], Section 6.1.1). The Ashplume conceptual model accounts for incorporation and entrainment of waste fuel particles associated with a hypothetical volcanic eruption through the Yucca Mountain repository and downwind transport of contaminated tephra. The Ashplume mathematical model describes the conceptual model in mathematical terms to allow for prediction of radioactive waste/ash deposition on the ground surface given that the hypothetical eruptive event occurs. This model report also describes the conceptual model for tephra redistribution from a basaltic cinder cone. Sensitivity analyses and model validation activities for the ash dispersal and redistribution models are also presented. Analyses documented in this model report update the previous documentation of the Ashplume mathematical model and its application to the Total System Performance Assessment (TSPA) for the License Application (TSPA-LA) igneous scenarios. This model report also documents the redistribution model product outputs based on analyses to support the conceptual model. In this report, ''Ashplume'' is used when referring to the atmospheric dispersal model and ''ASHPLUME'' is used when referencing the code of that model. Two analysis and model reports provide direct inputs to this model report, namely ''Characterize Eruptive Processes at Yucca Mountain, Nevada'' and ''Number of Waste Packages Hit by Igneous Intrusion''. This model report provides direct inputs to the TSPA, which uses the ASHPLUME software described and used in this model report. Thus, ASHPLUME software inputs are inputs to this model report for ASHPLUME runs in this model report. However, ASHPLUME software inputs are outputs of this model report for ASHPLUME runs by TSPA.

  20. Predicting motor vehicle collisions using Bayesian neural network models: an empirical analysis.

    PubMed

    Xie, Yuanchang; Lord, Dominique; Zhang, Yunlong

    2007-09-01

    Statistical models have frequently been used in highway safety studies. They can be utilized for various purposes, including establishing relationships between variables, screening covariates and predicting values. Generalized linear models (GLM) and hierarchical Bayes models (HBM) have been the most common types of model favored by transportation safety analysts. Over the last few years, researchers have proposed the back-propagation neural network (BPNN) model for modeling the phenomenon under study. Compared to GLMs and HBMs, BPNNs have received much less attention in highway safety modeling. The reasons are attributed to the complexity for estimating this kind of model as well as the problem related to "over-fitting" the data. To circumvent the latter problem, some statisticians have proposed the use of Bayesian neural network (BNN) models. These models have been shown to perform better than BPNN models while at the same time reducing the difficulty associated with over-fitting the data. The objective of this study is to evaluate the application of BNN models for predicting motor vehicle crashes. To accomplish this objective, a series of models was estimated using data collected on rural frontage roads in Texas. Three types of models were compared: BPNN, BNN and the negative binomial (NB) regression models. The results of this study show that in general both types of neural network models perform better than the NB regression model in terms of data prediction. Although the BPNN model can occasionally provide better or approximately equivalent prediction performance compared to the BNN model, in most cases its prediction performance is worse than the BNN model. In addition, the data fitting performance of the BPNN model is consistently worse than the BNN model, which suggests that the BNN model has better generalization abilities than the BPNN model and can effectively alleviate the over-fitting problem without significantly compromising the nonlinear approximation ability. The results also show that BNNs could be used for other useful analyses in highway safety, including the development of accident modification factors and for improving the prediction capabilities for evaluating different highway design alternatives.
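
    The negative binomial (NB) baseline used in comparisons of this kind is a standard count-data GLM and is straightforward to fit; the sketch below does so on synthetic stand-in data with statsmodels. The covariates, coefficients and dispersion value are invented for illustration, and the Bayesian and back-propagation neural network models evaluated in the study are not reproduced here.

      # Sketch of an NB regression baseline on synthetic crash-count data (illustrative only;
      # the BNN/BPNN models from the study would require a dedicated neural-network library).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 200
      aadt = rng.uniform(500, 5000, n)              # hypothetical traffic volumes
      seg_len = rng.uniform(0.1, 2.0, n)            # hypothetical segment lengths (miles)
      mu = np.exp(-6.0 + 0.8 * np.log(aadt) + 1.0 * np.log(seg_len))
      crashes = rng.poisson(mu)                     # stand-in observed crash counts

      X = sm.add_constant(np.column_stack([np.log(aadt), np.log(seg_len)]))
      nb_model = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
      print(nb_model.summary())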

  1. Understanding seasonal variability of uncertainty in hydrological prediction

    NASA Astrophysics Data System (ADS)

    Li, M.; Wang, Q. J.

    2012-04-01

    Understanding uncertainty in hydrological prediction can be highly valuable for improving the reliability of streamflow prediction. In this study, a monthly water balance model, WAPABA, is combined within a Bayesian joint probability framework with different error models to investigate the seasonal dependency of the prediction error structure. A seasonally invariant error model, analogous to traditional time series analysis, uses constant parameters for the model error and accounts for no seasonal variation. In contrast, a seasonally variant error model uses a different set of parameters for bias, variance and autocorrelation for each individual calendar month. Potential connection amongst model parameters from similar months is not considered within the seasonally variant model and could result in over-fitting and over-parameterization. A hierarchical error model further applies some distributional restrictions on model parameters within a Bayesian hierarchical framework. An iterative algorithm is implemented to expedite the maximum a posteriori (MAP) estimation of the hierarchical error model. The three error models are applied to forecasting streamflow at a catchment in southeast Australia in a cross-validation analysis. This study also presents a number of statistical measures and graphical tools to compare the predictive skills of the different error models. From probability integral transform histograms and other diagnostic graphs, the hierarchical error model conforms better to reliability than the seasonally invariant error model. The hierarchical error model also generally provides the most accurate mean prediction in terms of the Nash-Sutcliffe model efficiency coefficient and the best probabilistic prediction in terms of the continuous ranked probability score (CRPS). The model parameters of the seasonally variant error model are very sensitive to each cross-validation, while the hierarchical error model produces much more robust and reliable model parameters. Furthermore, the results of the hierarchical error model show that most of the model parameters are not seasonally variant except for the error bias. The seasonally variant error model is likely to use more parameters than necessary to maximize the posterior likelihood. The model flexibility and robustness indicate that the hierarchical error model has great potential for future streamflow predictions.
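
    One of the scores used above, the continuous ranked probability score (CRPS), has a convenient closed form when the predictive distribution is Gaussian (Gneiting and Raftery, 2007). The sketch below is illustrative only and is not the authors' implementation; the forecast means, spreads and observation are invented numbers.

      # Closed-form CRPS for a Gaussian predictive distribution (lower is better);
      # illustrative scoring helper, not the authors' code.
      import numpy as np
      from scipy.stats import norm

      def crps_gaussian(mu, sigma, obs):
          """CRPS of a N(mu, sigma^2) forecast against a scalar observation."""
          z = (obs - mu) / sigma
          return sigma * (z * (2.0 * norm.cdf(z) - 1.0) + 2.0 * norm.pdf(z) - 1.0 / np.sqrt(np.pi))

      # Example: two forecasts with the same mean but different spread
      print(crps_gaussian(mu=10.0, sigma=2.0, obs=12.5))
      print(crps_gaussian(mu=10.0, sigma=5.0, obs=12.5))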

  2. [Suitability of four stomatal conductance models in agro-pastoral ecotone in North China: A case study for potato and oil sunflower].

    PubMed

    Huang, Ming Xia; Wang, Jing; Tang, Jian Zhao; Yu, Qiang; Zhang, Jun; Xue, Qing Yu; Chang, Qing; Tan, Mei Xiu

    2016-11-18

    The suitability of four popular empirical and semi-empirical stomatal conductance models (the Jarvis model, Ball-Berry model, Leuning model and Medlyn model) was evaluated based on parallel observation data of leaf stomatal conductance, leaf net photosynthetic rate and meteorological factors during the vigorous growing period of potato and oil sunflower at Wuchuan experimental station in the agro-pastoral ecotone in North China. It was found that there was a significant linear relationship between leaf stomatal conductance and leaf net photosynthetic rate for potato, whereas the linear relationship appeared weaker for oil sunflower. The results of the model evaluation showed that the Ball-Berry model performed best in simulating leaf stomatal conductance of potato, followed by the Leuning model and the Medlyn model, while the Jarvis model was last in the performance rating. The root-mean-square error (RMSE) was 0.0331, 0.0371, 0.0456 and 0.0794 mol·m⁻²·s⁻¹, the normalized root-mean-square error (NRMSE) was 26.8%, 30.0%, 36.9% and 64.3%, and R-squared (R²) was 0.96, 0.61, 0.91 and 0.88 between simulated and observed leaf stomatal conductance of potato for the Ball-Berry model, Leuning model, Medlyn model and Jarvis model, respectively. For leaf stomatal conductance of oil sunflower, the Jarvis model performed slightly better than the Leuning model, Ball-Berry model and Medlyn model. RMSE was 0.2221, 0.2534, 0.2547 and 0.2758 mol·m⁻²·s⁻¹, NRMSE was 40.3%, 46.0%, 46.2% and 50.1%, and R² was 0.38, 0.22, 0.23 and 0.20 between simulated and observed leaf stomatal conductance of oil sunflower for the Jarvis model, Leuning model, Ball-Berry model and Medlyn model, respectively. Path analysis was conducted to identify the effects of specific meteorological factors on leaf stomatal conductance. The diurnal variation of leaf stomatal conductance was principally affected by vapour pressure saturation deficit for both potato and oil sunflower. The model evaluation suggested that the stomatal conductance models for oil sunflower are to be improved in further research.
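
    The four models compared above are usually written in the following widely cited forms, shown here as a reference sketch; the exact parameterizations and symbols used by the authors may differ in detail.

      % Commonly cited forms of the Jarvis, Ball-Berry, Leuning and Medlyn stomatal conductance models
      \begin{align}
        \text{Jarvis:}      \quad & g_s = g_{s,\max}\, f(\mathrm{PAR})\, f(T)\, f(D)\, f(\psi) \\
        \text{Ball--Berry:} \quad & g_s = g_0 + g_1 \frac{A_n\, h_s}{C_s} \\
        \text{Leuning:}     \quad & g_s = g_0 + \frac{a_1 A_n}{(C_s - \Gamma)\left(1 + D_s/D_0\right)} \\
        \text{Medlyn:}      \quad & g_s = g_0 + 1.6\left(1 + \frac{g_1}{\sqrt{D}}\right)\frac{A_n}{C_a}
      \end{align}
      % A_n: net photosynthesis; h_s, D_s, D: humidity or vapour pressure deficit at the leaf surface;
      % C_s, C_a: CO2 concentration at the leaf surface / in ambient air; \Gamma: CO2 compensation point.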

  3. Evaluation of chiller modeling approaches and their usability for fault detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sreedharan, Priya

    Selecting the model is an important and essential step in model-based fault detection and diagnosis (FDD). Several factors must be considered in model evaluation, including accuracy, training data requirements, calibration effort, generality, and computational requirements. All modeling approaches fall somewhere between pure first-principles models and empirical models. The objective of this study was to evaluate different modeling approaches for their applicability to model-based FDD of vapor compression air conditioning units, which are commonly known as chillers. Three different models were studied: two are based on first principles and the third is empirical in nature. The first-principles models are the Gordon and Ng Universal Chiller model (2nd generation) and a modified version of the ASHRAE Primary Toolkit model. The DOE-2 chiller model as implemented in CoolTools was selected for the empirical category. The models were compared in terms of their ability to reproduce the observed performance of an older chiller operating in a commercial building, and a newer chiller in a laboratory. The DOE-2 and Gordon-Ng models were calibrated by linear regression, while a direct-search method was used to calibrate the Toolkit model. The ''CoolTools'' package contains a library of calibrated DOE-2 curves for a variety of different chillers, and was used to calibrate the building chiller to the DOE-2 model. All three models displayed similar levels of accuracy. Of the first-principles models, the Gordon-Ng model has the advantage of being linear in the parameters, which allows more robust parameter estimation methods to be used and facilitates estimation of the uncertainty in the parameter values. The ASHRAE Toolkit Model may have advantages when refrigerant temperature measurements are also available. The DOE-2 model can be expected to have advantages when very limited data are available to calibrate the model, as long as one of the previously identified models in the CoolTools library matches the performance of the chiller in question.
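
    Because the Gordon-Ng model is linear in its parameters, calibration by linear regression amounts to an ordinary least-squares fit. The sketch below shows that generic idea on synthetic data; the regressors and coefficients are invented placeholders and are not the actual Gordon-Ng or DOE-2 functional forms.

      # Generic sketch of calibrating a model that is linear in its parameters by ordinary
      # least squares; regressors and coefficients are invented placeholders.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 100
      x1 = rng.uniform(0.2, 1.0, n)        # e.g. part-load ratio (placeholder regressor)
      x2 = rng.uniform(25.0, 35.0, n)      # e.g. condenser water temperature, degC (placeholder)
      y = 0.15 + 0.6 * x1 + 0.01 * x2 + rng.normal(0, 0.01, n)   # synthetic response

      # Design matrix for the unknown parameters [b0, b1, b2]
      A = np.column_stack([np.ones(n), x1, x2])
      params, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
      print("calibrated parameters:", params)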

  4. PyMT: A Python package for model-coupling in the Earth sciences

    NASA Astrophysics Data System (ADS)

    Hutton, E.

    2016-12-01

    The current landscape of Earth-system models is not only broad in scientific scope, but also broad in type. On the one hand, the large variety of models is exciting, as it provides fertile ground for extending or linking models together in novel ways to answer new scientific questions. However, the heterogeneity in model type acts to inhibit model coupling, model development, or even model use. Existing models are written in a variety of programming languages, operate on different grids, use their own file formats (both for input and output), have different user interfaces, have their own time steps, etc. Each of these factors becomes an obstruction to scientists wanting to couple, extend - or simply run - existing models. For scientists whose main focus may not be computer science, these barriers become even larger and turn into significant logistical hurdles. And this is all before the scientific difficulties of coupling or running models are addressed. The CSDMS Python Modeling Toolkit (PyMT) was developed to help non-computer scientists deal with these sorts of modeling logistics. PyMT is the fundamental package the Community Surface Dynamics Modeling System uses for the coupling of models that expose the Basic Model Interface (BMI). It contains: tools necessary for coupling models of disparate time and space scales (including grid mappers); time-steppers that coordinate the sequencing of coupled models; exchange of data between BMI-enabled models; wrappers that automatically load BMI-enabled models into the PyMT framework; utilities that support open-source interfaces (UGRID, SGRID, CSDMS Standard Names, etc.); a collection of community-submitted models, written in a variety of programming languages, from a variety of process domains, but all usable from within the Python programming language; and a plug-in framework for adding additional BMI-enabled models to the framework. In this presentation we introduce the basics of PyMT as well as provide an example of coupling models of different domains and grid types, the general shape of which is sketched below.
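
    The sequencing that such a framework automates can be sketched in BMI-style pseudocode: two components are stepped in turn and exchange a variable each step. The model classes, configuration files and variable names below are hypothetical placeholders, the method names follow the BMI specification but their signatures are simplified for readability, and the real PyMT API wraps most of this bookkeeping (grid mapping, time alignment) for the user.

      # BMI-style coupling sketch with hypothetical components; not the actual PyMT API.
      from my_models import RiverModel, CoastModel   # hypothetical BMI-enabled components

      river, coast = RiverModel(), CoastModel()
      river.initialize("river_config.yaml")           # BMI: set up from a configuration file
      coast.initialize("coast_config.yaml")

      while river.get_current_time() < river.get_end_time():
          river.update()                              # BMI: advance one model time step
          discharge = river.get_value("channel_outflow__volume_flow_rate")   # placeholder name
          coast.set_value("channel_water__volume_flow_rate", discharge)      # placeholder name
          coast.update_until(river.get_current_time())  # keep the two model clocks aligned

      river.finalize()
      coast.finalize()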

  5. Can the super model (SUMO) method improve hydrological simulations? Exploratory tests with the GR hydrological models

    NASA Astrophysics Data System (ADS)

    Santos, Léonard; Thirel, Guillaume; Perrin, Charles

    2017-04-01

    Errors made by hydrological models may come from a problem in parameter estimation, uncertainty on observed measurements, numerical problems and from the model conceptualization that simplifies the reality. Here we focus on this last issue of hydrological modeling. One of the solutions to reduce structural uncertainty is to use a multimodel method, taking advantage of the great number and the variability of existing hydrological models. In particular, because different models are not similarly good in all situations, using multimodel approaches can improve the robustness of modeled outputs. Traditionally, in hydrology, multimodel methods are based on the output of the model (the simulated flow series). The aim of this poster is to introduce a different approach based on the internal variables of the models. The method is inspired by the SUper MOdel (SUMO, van den Berge et al., 2011) developed for climatology. The idea of the SUMO method is to correct the internal variables of a model taking into account the values of the internal variables of (an)other model(s). This correction is made bilaterally between the different models. The ensemble of the different models constitutes a super model in which all the models exchange information on their internal variables with each other at each time step. Due to this continuity in the exchanges, this multimodel algorithm is more dynamic than traditional multimodel methods. The method will be first tested using two GR4J models (in a state-space representation) with different parameterizations. The results will be presented and compared to traditional multimodel methods that will serve as benchmarks. In the future, other rainfall-runoff models will be used in the super model. References van den Berge, L. A., Selten, F. M., Wiegerinck, W., and Duane, G. S. (2011). A multi-model ensemble method that combines imperfect models through learning. Earth System Dynamics, 2(1) :161-177.

  6. Downscaling GISS ModelE Boreal Summer Climate over Africa

    NASA Technical Reports Server (NTRS)

    Druyan, Leonard M.; Fulakeza, Matthew

    2015-01-01

    The study examines the perceived added value of downscaling atmosphere-ocean global climate model simulations over Africa and adjacent oceans by a nested regional climate model. NASA/Goddard Institute for Space Studies (GISS) coupled ModelE simulations for June-September 1998-2002 are used to form lateral boundary conditions for synchronous simulations by the GISS RM3 regional climate model. The ModelE computational grid spacing is 2° latitude by 2.5° longitude and the RM3 grid spacing is 0.44°. ModelE precipitation climatology for June-September 1998-2002 is shown to be a good proxy for 30-year means, so results based on the 5-year sample are presumed to be generally representative. Comparison with observational evidence shows several discrepancies in the ModelE configuration of the boreal summer inter-tropical convergence zone (ITCZ). One glaring shortcoming is that ModelE simulations do not advance the West African rain band northward during the summer to represent monsoon precipitation onset over the Sahel. Results for 1998-2002 show that onset simulation is an important added value produced by downscaling with RM3. ModelE Eastern South Atlantic Ocean computed sea-surface temperatures (SST) are some 4 K warmer than reanalysis, contributing to large positive biases in overlying surface air temperatures (Tsfc). ModelE Tsfc are also too warm over most of Africa. RM3 downscaling somewhat mitigates the magnitude of Tsfc biases over the African continent; it eliminates the ModelE double ITCZ over the Atlantic, and it produces more realistic orographic precipitation maxima. Parallel ModelE and RM3 simulations with observed SST forcing (in place of the predicted ocean) lower Tsfc errors but have mixed impacts on circulation and precipitation biases. Downscaling improvements of the meridional movement of the rain band over West Africa and the configuration of orographic precipitation maxima are realized irrespective of the SST biases.

  7. A tool for multi-scale modelling of the renal nephron

    PubMed Central

    Nickerson, David P.; Terkildsen, Jonna R.; Hamilton, Kirk L.; Hunter, Peter J.

    2011-01-01

    We present the development of a tool, which provides users with the ability to visualize and interact with a comprehensive description of a multi-scale model of the renal nephron. A one-dimensional anatomical model of the nephron has been created and is used for visualization and modelling of tubule transport in various nephron anatomical segments. Mathematical models of nephron segments are embedded in the one-dimensional model. At the cellular level, these segment models use models encoded in CellML to describe cellular and subcellular transport kinetics. A web-based presentation environment has been developed that allows the user to visualize and navigate through the multi-scale nephron model, including simulation results, at the different spatial scales encompassed by the model description. The Zinc extension to Firefox is used to provide an interactive three-dimensional view of the tubule model and the native Firefox rendering of scalable vector graphics is used to present schematic diagrams for cellular and subcellular scale models. The model viewer is embedded in a web page that dynamically presents content based on user input. For example, when viewing the whole nephron model, the user might be presented with information on the various embedded segment models as they select them in the three-dimensional model view. Alternatively, the user chooses to focus the model viewer on a cellular model located in a particular nephron segment in order to view the various membrane transport proteins. Selecting a specific protein may then present the user with a description of the mathematical model governing the behaviour of that protein—including the mathematical model itself and various simulation experiments used to validate the model against the literature. PMID:22670210

  8. An online model composition tool for system biology models

    PubMed Central

    2013-01-01

    Background There are multiple representation formats for Systems Biology computational models, and the Systems Biology Markup Language (SBML) is one of the most widely used. SBML is used to capture, store, and distribute computational models by Systems Biology data sources (e.g., the BioModels Database) and researchers. Therefore, there is a need for all-in-one web-based solutions that support advanced SBML functionalities such as uploading, editing, composing, visualizing, simulating, querying, and browsing computational models. Results We present the design and implementation of the Model Composition Tool (Interface) within the PathCase-SB (PathCase Systems Biology) web portal. The tool helps users compose systems biology models to facilitate the complex process of merging systems biology models. We also present three tools that support the model composition tool, namely, (1) the Model Simulation Interface that generates a visual plot of the simulation according to the user's input, (2) the iModel Tool as a platform for users to upload their own models to compose, and (3) the SimCom Tool that provides a side-by-side comparison of models being composed in the same pathway. Finally, we provide a web site that hosts BioModels Database models and a separate web site that hosts SBML Test Suite models. Conclusions The model composition tool (and the other three tools) can be used with little or no knowledge of the SBML document structure. For this reason, students or anyone who wants to learn about systems biology will benefit from the described functionalities. SBML Test Suite models will be a nice starting point for beginners. And, for more advanced purposes, users will be able to access and employ models of the BioModels Database as well. PMID:24006914

  9. A parsimonious dynamic model for river water quality assessment.

    PubMed

    Mannina, Giorgio; Viviani, Gaspare

    2010-01-01

    Water quality modelling is of crucial importance for the assessment of physical, chemical, and biological changes in water bodies. Mathematical approaches to water modelling have become more prevalent over recent years. Different model types ranging from detailed physical models to simplified conceptual models are available. Actually, a possible middle ground between detailed and simplified models may be parsimonious models that represent the simplest approach that fits the application. The appropriate modelling approach depends on the research goal as well as on data available for correct model application. When there is inadequate data, it is mandatory to focus on a simple river water quality model rather than detailed ones. The study presents a parsimonious river water quality model to evaluate the propagation of pollutants in natural rivers. The model is made up of two sub-models: a quantity one and a quality one. The model employs a river schematisation that considers different stretches according to the geometric characteristics and to the gradient of the river bed. Each stretch is represented with a conceptual model of a series of linear channels and reservoirs. The channels determine the delay in the pollution wave and the reservoirs cause its dispersion. To assess the river water quality, the model employs four state variables: DO, BOD, NH(4), and NO. The model was applied to the Savena River (Italy), which is the focus of a European-financed project in which quantity and quality data were gathered. A sensitivity analysis of the model output to the model input or parameters was done based on the Generalised Likelihood Uncertainty Estimation methodology. The results demonstrate the suitability of such a model as a tool for river water quality management.
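
    The quantity sub-model described above represents each stretch as a series of linear channels and reservoirs. As a rough illustration of the reservoir part, which produces the dispersion of the pollution wave, the sketch below routes an inflow pulse through a small cascade of linear reservoirs; the number of reservoirs, residence time and series length are invented values, not the calibrated Savena parameters.

      # Sketch of the linear-reservoir idea: an inflow pulse is routed through a cascade of
      # linear reservoirs (outflow = storage / k), which disperses the wave over time.
      import numpy as np

      def route_cascade(inflow, n_reservoirs=3, k=4.0, dt=1.0):
          """Route an inflow series through n linear reservoirs in series (residence time k)."""
          storages = np.zeros(n_reservoirs)
          outflow = np.zeros_like(inflow, dtype=float)
          for t, q_in in enumerate(inflow):
              for i in range(n_reservoirs):
                  q_out = storages[i] / k                    # linear reservoir: Q = S / k
                  storages[i] += dt * (q_in - q_out)         # water balance for this reservoir
                  q_in = q_out                               # outflow feeds the next reservoir
              outflow[t] = q_in
          return outflow

      pulse = np.zeros(48)
      pulse[0] = 100.0                                       # unit pulse entering the reach
      print(np.round(route_cascade(pulse), 2))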

  10. The cost of simplifying air travel when modeling disease spread.

    PubMed

    Lessler, Justin; Kaufman, James H; Ford, Daniel A; Douglas, Judith V

    2009-01-01

    Air travel plays a key role in the spread of many pathogens. Modeling the long-distance spread of infectious disease in these cases requires an air travel model. Highly detailed air transportation models can be overdetermined and computationally problematic. We compared the predictions of a simplified air transport model with those of a model of all routes and assessed the impact of differences on models of infectious disease. Using U.S. ticket data from 2007, we compared a simplified "pipe" model, in which individuals flow in and out of the air transport system based on the number of arrivals and departures from a given airport, to a fully saturated model where all routes are modeled individually. We also compared the pipe model to a "gravity" model where the probability of travel is scaled by physical distance; the gravity model did not differ significantly from the pipe model. The pipe model roughly approximated actual air travel, but tended to overestimate the number of trips between small airports and underestimate travel between major east and west coast airports. For most routes, the maximum number of false (or missed) introductions of disease is small (<1 per day), but for a few routes this rate is greatly underestimated by the pipe model. If our interest is in large-scale regional and national effects of disease, the simplified pipe model may be adequate. If we are interested in specific effects of interventions on particular air routes or the time for the disease to reach a particular location, a more complex point-to-point model will be more accurate. For many problems a hybrid model that independently models some frequently traveled routes may be the best choice. Regardless of the model used, the effect of simplifications and sensitivity to errors in parameter estimation should be analyzed.
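
    The contrast between the two simplifications can be sketched in a few lines: a pipe-style allocation distributes trips in proportion to departures and arrivals regardless of distance, while a gravity-style allocation damps the same product by distance. All numbers below are invented for illustration and do not reproduce the paper's calibrated models.

      # Toy contrast between "pipe"-style and "gravity"-style trip allocation (illustrative only).
      import numpy as np

      departures = np.array([1000.0, 400.0, 50.0])      # daily departures from airports A, B, C
      arrivals   = np.array([900.0, 450.0, 100.0])      # daily arrivals at airports A, B, C
      distance   = np.array([[1.0, 4000.0, 300.0],
                             [4000.0, 1.0, 3800.0],
                             [300.0, 3800.0, 1.0]])      # km; 1.0 on the diagonal to avoid /0

      # Pipe-style: trips from i to j proportional to departures_i * arrivals_j, distance-blind
      pipe = np.outer(departures, arrivals) / arrivals.sum()

      # Gravity-style: same product, damped by distance^2 and rescaled to the same total
      gravity = np.outer(departures, arrivals) / distance**2
      np.fill_diagonal(pipe, 0.0)
      np.fill_diagonal(gravity, 0.0)
      gravity *= pipe.sum() / gravity.sum()

      print(np.round(pipe, 1))
      print(np.round(gravity, 1))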

  11. Risk prediction models of breast cancer: a systematic review of model performances.

    PubMed

    Anothaisintawee, Thunyarat; Teerawattananon, Yot; Wiratkapun, Chollathip; Kasamesup, Vijj; Thakkinstian, Ammarin

    2012-05-01

    An increasing number of risk prediction models have been developed for estimating the risk of breast cancer in individual women. However, the performance of those models is questionable. We therefore conducted a study with the aim of systematically reviewing previous risk prediction models. The results from this review help to identify the most reliable model and indicate the strengths and weaknesses of each model for guiding future model development. We searched MEDLINE (PubMed) from 1949 and EMBASE (Ovid) from 1974 until October 2010. Observational studies which constructed models using regression methods were selected. Information about model development and performance was extracted. Twenty-five out of 453 studies were eligible. Of these, 18 developed prediction models and 7 validated existing prediction models. Up to 13 variables were included in the models and sample sizes for each study ranged from 550 to 2,404,636. Internal validation was performed in four models, while five models had external validation. The Gail model and the Rosner and Colditz model were the significant models which were subsequently modified by other scholars. Calibration performance of most models was fair to good (expected/observed ratio: 0.87-1.12), but discriminatory accuracy was poor to fair both in internal validation (concordance statistics: 0.53-0.66) and in external validation (concordance statistics: 0.56-0.63). Most models yielded relatively poor discrimination in both internal and external validation. This poor discriminatory accuracy of existing models might be because of a lack of knowledge about risk factors, heterogeneous subtypes of breast cancer, and different distributions of risk factors across populations. In addition, the concordance statistic itself is insensitive for measuring improvements in discrimination. Therefore, newer methods such as the net reclassification index should be considered to evaluate the improvement in performance of newly developed models.
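
    The two performance measures reported above are simple to compute once predicted risks and observed outcomes are available: calibration as the expected/observed (E/O) ratio and discrimination as the concordance statistic, which for a binary outcome equals the area under the ROC curve. The sketch below uses simulated risks and outcomes purely for illustration.

      # Calibration (E/O ratio) and discrimination (c-statistic) on simulated data.
      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(42)
      predicted_risk = rng.uniform(0.0, 0.2, 5000)              # hypothetical risk predictions
      outcome = rng.binomial(1, predicted_risk)                  # simulated observed cases

      eo_ratio = predicted_risk.sum() / outcome.sum()            # calibration: expected / observed
      c_statistic = roc_auc_score(outcome, predicted_risk)       # discrimination

      print(f"E/O ratio:   {eo_ratio:.2f}")                      # ~1.0 means well calibrated
      print(f"c-statistic: {c_statistic:.2f}")                   # 0.5 = no discrimination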

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. A. Wasiolek

    The purpose of this report is to document the biosphere model, the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), which describes radionuclide transport processes in the biosphere and associated human exposure that may arise as the result of radionuclide release from the geologic repository at Yucca Mountain. The biosphere model is one of the process models that support the Yucca Mountain Project (YMP) Total System Performance Assessment (TSPA) for the license application (LA), the TSPA-LA. The ERMYN model provides the capability of performing human radiation dose assessments. This report documents the biosphere model, which includes: (1) Describing the reference biosphere, human receptor, exposure scenarios, and primary radionuclides for each exposure scenario (Section 6.1); (2) Developing a biosphere conceptual model using site-specific features, events, and processes (FEPs), the reference biosphere, the human receptor, and assumptions (Section 6.2 and Section 6.3); (3) Building a mathematical model using the biosphere conceptual model and published biosphere models (Sections 6.4 and 6.5); (4) Summarizing input parameters for the mathematical model, including the uncertainty associated with input values (Section 6.6); (5) Identifying improvements in the ERMYN model compared with the model used in previous biosphere modeling (Section 6.7); (6) Constructing an ERMYN implementation tool (model) based on the biosphere mathematical model using GoldSim stochastic simulation software (Sections 6.8 and 6.9); (7) Verifying the ERMYN model by comparing output from the software with hand calculations to ensure that the GoldSim implementation is correct (Section 6.10); and (8) Validating the ERMYN model by corroborating it with published biosphere models; comparing conceptual models, mathematical models, and numerical results (Section 7).

  13. Microphysics in the Multi-Scale Modeling Systems with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lang, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2011-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km² in three dimensions. Cloud-resolving models now provide statistical information useful for developing more realistic, physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud-resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE model), (2) a regional-scale model (the NASA unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, longwave and shortwave radiative transfer, land processes, and explicit cloud-radiation and cloud-surface interactive processes are applied throughout this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, the microphysics developments of the multi-scale modeling system will be presented. In particular, the results from using the multi-scale modeling system to study heavy precipitation processes will be presented.

  14. Intelligent Decisions Need Intelligent Choice of Models and Data - a Bayesian Justifiability Analysis for Models with Vastly Different Complexity

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Schöniger, A.; Wöhling, T.; Illman, W. A.

    2016-12-01

    Model-based decision support requires justifiable models with good predictive capabilities. This, in turn, calls for a fine adjustment between predictive accuracy (small systematic model bias that can be achieved with rather complex models), and predictive precision (small predictive uncertainties that can be achieved with simpler models with fewer parameters). The implied complexity/simplicity trade-off depends on the availability of informative data for calibration. If not available, additional data collection can be planned through optimal experimental design. We present a model justifiability analysis that can compare models of vastly different complexity. It rests on Bayesian model averaging (BMA) to investigate the complexity/performance trade-off dependent on data availability. Then, we disentangle the complexity component from the performance component. We achieve this by replacing actually observed data by realizations of synthetic data predicted by the models. This results in a "model confusion matrix". Based on this matrix, the modeler can identify the maximum model complexity that can be justified by the available (or planned) amount and type of data. As a side product, the matrix quantifies model (dis-)similarity. We apply this analysis to aquifer characterization via hydraulic tomography, comparing four models with a vastly different number of parameters (from a homogeneous model to geostatistical random fields). As a testing scenario, we consider hydraulic tomography data. Using subsets of these data, we determine model justifiability as a function of data set size. The test case shows that geostatistical parameterization requires a substantial amount of hydraulic tomography data to be justified, while a zonation-based model can be justified with more limited data set sizes. The actual model performance (as opposed to model justifiability), however, depends strongly on the quality of prior geological information.
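
    The justifiability analysis above rests on Bayesian model averaging, in which each model's posterior weight follows from its marginal likelihood (evidence) and its prior probability. The sketch below computes such weights from invented log-evidence values for four models labelled loosely after the ones mentioned in the abstract; it is illustrative only and does not reproduce the authors' analysis.

      # Bayesian model averaging weights from (made-up) log marginal likelihoods and equal priors,
      # using a log-sum-exp shift for numerical stability.
      import numpy as np

      models = ["homogeneous", "zonation", "interpolation", "geostatistical"]   # illustrative labels
      log_evidence = np.array([-410.2, -395.7, -396.9, -401.3])                 # invented log p(D | M_k)
      log_prior = np.log(np.full(len(models), 1.0 / len(models)))               # equal prior probabilities

      log_post = log_evidence + log_prior
      log_post -= np.max(log_post)                                              # stabilize before exponentiating
      weights = np.exp(log_post) / np.exp(log_post).sum()                       # posterior weights P(M_k | D)

      for name, w in zip(models, weights):
          print(f"{name:15s} {w:.3f}")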

  15. Model-based economic evaluation in Alzheimer's disease: a review of the methods available to model Alzheimer's disease progression.

    PubMed

    Green, Colin; Shearer, James; Ritchie, Craig W; Zajicek, John P

    2011-01-01

    To consider the methods available to model Alzheimer's disease (AD) progression over time to inform on the structure and development of model-based evaluations, and the future direction of modelling methods in AD. A systematic search of the health care literature was undertaken to identify methods to model disease progression in AD. Modelling methods are presented in a descriptive review. The literature search identified 42 studies presenting methods or applications of methods to model AD progression over time. The review identified 10 general modelling frameworks available to empirically model the progression of AD as part of a model-based evaluation. Seven of these general models are statistical models predicting progression of AD using a measure of cognitive function. The main concerns with models are on model structure, around the limited characterization of disease progression, and on the use of a limited number of health states to capture events related to disease progression over time. None of the available models have been able to present a comprehensive model of the natural history of AD. Although helpful, there are serious limitations in the methods available to model progression of AD over time. Advances are needed to better model the progression of AD and the effects of the disease on peoples' lives. Recent evidence supports the need for a multivariable approach to the modelling of AD progression, and indicates that a latent variable analytic approach to characterising AD progression is a promising avenue for advances in the statistical development of modelling methods. Copyright © 2011 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  16. Nursing resources and responsibilities according to hospital organizational model for management of inflammatory bowel disease in Spain.

    PubMed

    Marín, Laura; Torrejón, Antonio; Oltra, Lorena; Seoane, Montserrat; Hernández-Sampelayo, Paloma; Vera, María Isabel; Casellas, Francesc; Alfaro, Noelia; Lázaro, Pablo; García-Sánchez, Valle

    2011-06-01

    Nurses play an important role in the multidisciplinary management of inflammatory bowel disease (IBD), but little is known about this role and the associated resources. To improve knowledge of resource availability for health care activities and the different organizational models in managing IBD in Spain. Cross-sectional study with data obtained by questionnaire directed at Spanish Gastroenterology Services (GS). Five GS models were identified according to whether they have: no specific service for IBD management (Model A); IBD outpatient office for physician consultations (Model B); general outpatient office for nurse consultations (Model C); both, Model B and Model C (Model D); and IBD Unit (Model E) when the hospital has a Comprehensive Care Unit for IBD with telephone helpline, computer, including a Model B. Available resources and activities performed were compared according to GS model (chi-square test and test for linear trend). Responses were received from 107 GS: 33 Model A (31%), 38 Model B (36%), 4 Model C (4%), 16 Model D (15%) and 16 Model E (15%). The model in which nurses have the most resources and responsibilities is the Model E. The more complete the organizational model, the more frequent the availability of nursing resources (educational material, databases, office, and specialized software) and responsibilities (management of walk-in appointments, provision of emotional support, health education, follow-up of drug treatment and treatment adherence) (p<0.05). Nurses have more resources and responsibilities the more complete is the organizational model for IBD management. Development of these areas may improve patient outcomes. Copyright © 2011 European Crohn's and Colitis Organisation. Published by Elsevier B.V. All rights reserved.

  17. Template-free modeling by LEE and LEER in CASP11.

    PubMed

    Joung, InSuk; Lee, Sun Young; Cheng, Qianyi; Kim, Jong Yun; Joo, Keehyoung; Lee, Sung Jong; Lee, Jooyoung

    2016-09-01

    For the template-free modeling of human targets of CASP11, we utilized two of our modeling protocols, LEE and LEER. The LEE protocol took CASP11-released server models as the input and used some of them as templates for 3D (three-dimensional) modeling. The template selection procedure was based on the clustering of the server models aided by a community detection method applied to a server-model network. Restraining energy terms generated from the selected templates together with physical and statistical energy terms were used to build 3D models. Side-chains of the 3D models were rebuilt using a target-specific consensus side-chain library along with the SCWRL4 rotamer library, which completed the LEE protocol. The first success factor of the LEE protocol was efficient server-model screening. The average backbone accuracy of the selected server models was similar to that of the top 30% of server models. The second factor was that a proper energy function along with our optimization method guided us, so that we successfully generated better-quality models than the input template models. In 10 out of 24 cases, better backbone structures than the best of the input template structures were generated. LEE models were further refined by performing restrained molecular dynamics simulations to generate LEER models. CASP11 results indicate that LEE models were better than the average template models in terms of both backbone structures and side-chain orientations. LEER models were of improved physical realism and stereo-chemistry compared to LEE models, and they were comparable to LEE models in backbone accuracy. Proteins 2016; 84(Suppl 1):118-130. © 2015 Wiley Periodicals, Inc.

  18. Plausible combinations: An improved method to evaluate the covariate structure of Cormack-Jolly-Seber mark-recapture models

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; McDonald, Trent L.; Amstrup, Steven C.

    2013-01-01

    Mark-recapture models are extensively used in quantitative population ecology, providing estimates of population vital rates, such as survival, that are difficult to obtain using other methods. Vital rates are commonly modeled as functions of explanatory covariates, adding considerable flexibility to mark-recapture models, but also increasing the subjectivity and complexity of the modeling process. Consequently, model selection and the evaluation of covariate structure remain critical aspects of mark-recapture modeling. The difficulties involved in model selection are compounded in Cormack-Jolly-Seber models because they are composed of separate sub-models for survival and recapture probabilities, which are conceptualized independently even though their parameters are not statistically independent. The construction of models as combinations of sub-models, together with multiple potential covariates, can lead to a large model set. Although desirable, estimation of the parameters of all models may not be feasible. Strategies to search a model space and base inference on a subset of all models exist and enjoy widespread use. However, even though the methods used to search a model space can be expected to influence parameter estimation, the assessment of covariate importance, and therefore the ecological interpretation of the modeling results, the performance of these strategies has received limited investigation. We present a new strategy for searching the space of a candidate set of Cormack-Jolly-Seber models and explore its performance relative to existing strategies using computer simulation. The new strategy provides an improved assessment of the importance of covariates and covariate combinations used to model survival and recapture probabilities, while requiring only a modest increase in the number of models on which inference is based in comparison to existing techniques.
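
    The combinatorial growth of the candidate model set described above arises from crossing survival (phi) and recapture (p) sub-models. A minimal sketch of how such a candidate set can be enumerated in Python; the covariate names are hypothetical, and this enumeration is only the starting point, not the authors' search strategy.

        from itertools import combinations, product

        # Hypothetical covariates for the survival (phi) and recapture (p) sub-models.
        phi_covariates = ["sex", "age", "winter_severity"]
        p_covariates = ["effort", "trap_type"]

        def all_subsets(covs):
            """Every additive covariate combination, including the intercept-only sub-model."""
            return [list(c) for r in range(len(covs) + 1) for c in combinations(covs, r)]

        # A Cormack-Jolly-Seber model is a pair (phi sub-model, p sub-model), so the
        # candidate set is the Cartesian product of the two sub-model lists.
        candidate_models = list(product(all_subsets(phi_covariates), all_subsets(p_covariates)))

        print(f"{len(candidate_models)} candidate CJS models")   # 8 x 4 = 32 in this toy example
        for phi, p in candidate_models[:5]:
            print("phi ~", " + ".join(phi) or "1", "| p ~", " + ".join(p) or "1")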

  19. Framework for Understanding Structural Errors (FUSE): A modular framework to diagnose differences between hydrological models

    USGS Publications Warehouse

    Clark, Martyn P.; Slater, Andrew G.; Rupp, David E.; Woods, Ross A.; Vrugt, Jasper A.; Gupta, Hoshin V.; Wagener, Thorsten; Hay, Lauren E.

    2008-01-01

    The problems of identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure remain outstanding research challenges for the discipline of hydrology. Progress on these problems requires understanding of the nature of differences between models. This paper presents a methodology to diagnose differences in hydrological model structures: the Framework for Understanding Structural Errors (FUSE). FUSE was used to construct 79 unique model structures by combining components of 4 existing hydrological models. These new models were used to simulate streamflow in two of the basins used in the Model Parameter Estimation Experiment (MOPEX): the Guadalupe River (Texas) and the French Broad River (North Carolina). Results show that the new models produced simulations of streamflow that were at least as good as the simulations produced by the models that participated in the MOPEX experiment. Our initial application of the FUSE method for the Guadalupe River exposed relationships between model structure and model performance, suggesting that the choice of model structure is just as important as the choice of model parameters. However, further work is needed to evaluate model simulations using multiple criteria to diagnose the relative importance of model structural differences in various climate regimes and to assess the amount of independent information in each of the models. This work will be crucial to both identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure. To facilitate research on these problems, the FORTRAN‐90 source code for FUSE is available upon request from the lead author.

  20. Moving alcohol prevention research forward-Part II: new directions grounded in community-based system dynamics modeling.

    PubMed

    Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller

    2018-02-01

    Given the complexity of factors contributing to alcohol misuse, appropriate epistemologies and methodologies are needed to understand and intervene meaningfully. We aimed to (1) provide an overview of computational modeling methodologies, with an emphasis on system dynamics modeling; (2) explain how community-based system dynamics modeling can forge new directions in alcohol prevention research; and (3) present a primer on how to build alcohol misuse simulation models using system dynamics modeling, with an emphasis on stakeholder involvement, data sources and model validation. Throughout, we use alcohol misuse among college students in the United States as a heuristic example for demonstrating these methodologies. System dynamics modeling employs a top-down aggregate approach to understanding dynamically complex problems. Its three foundational properties-stocks, flows and feedbacks-capture non-linearity, time-delayed effects and other system characteristics. As a methodological choice, system dynamics modeling is amenable to participatory approaches; in particular, community-based system dynamics modeling has been used to build impactful models for addressing dynamically complex problems. The process of community-based system dynamics modeling consists of numerous stages: (1) creating model boundary charts, behavior-over-time-graphs and preliminary system dynamics models using group model-building techniques; (2) model formulation; (3) model calibration; (4) model testing and validation; and (5) model simulation using learning-laboratory techniques. Community-based system dynamics modeling can provide powerful tools for policy and intervention decisions that can result ultimately in sustainable changes in research and action in alcohol misuse prevention. © 2017 Society for the Study of Addiction.
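
    To make the stock/flow/feedback vocabulary concrete, the sketch below simulates a single hypothetical stock (students currently misusing alcohol) with an initiation inflow that feeds back on the stock and a constant-rate cessation outflow, integrated with a simple Euler step. All parameter values are illustrative assumptions, not results from the studies described above.

        # Hypothetical one-stock system dynamics model:
        # stock   = number of students currently misusing alcohol
        # inflow  = initiation, driven by social exposure (a feedback on the stock)
        # outflow = cessation at a constant fractional rate
        population = 20000.0
        stock = 2000.0            # initial misusing students (assumed)
        initiation_rate = 0.5     # initiations per susceptible per unit exposure per year (assumed)
        cessation_rate = 0.25     # fraction stopping per year (assumed)

        dt, years = 0.1, 10
        for _ in range(int(years / dt)):
            exposure = stock / population          # feedback: more misuse -> more exposure
            inflow = initiation_rate * exposure * (population - stock)
            outflow = cessation_rate * stock
            stock += dt * (inflow - outflow)       # Euler update of the stock

        print(f"Stock after {years} years: {stock:.0f} students")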

  1. A Comparison of Two Mathematical Modeling Frameworks for Evaluating Sexually Transmitted Infection Epidemiology.

    PubMed

    Johnson, Leigh F; Geffen, Nathan

    2016-03-01

    Different models of sexually transmitted infections (STIs) can yield substantially different conclusions about STI epidemiology, and it is important to understand how and why models differ. Frequency-dependent models make the simplifying assumption that STI incidence is proportional to STI prevalence in the population, whereas network models calculate STI incidence more realistically by classifying individuals according to their partners' STI status. We assessed a deterministic frequency-dependent model approximation to a microsimulation network model of STIs in South Africa. Sexual behavior and demographic parameters were identical in the 2 models. Six STIs were simulated using each model: HIV, herpes, syphilis, gonorrhea, chlamydia, and trichomoniasis. For all 6 STIs, the frequency-dependent model estimated a higher STI prevalence than the network model, with the difference between the 2 models being relatively large for the curable STIs. When the 2 models were fitted to the same STI prevalence data, the best-fitting parameters differed substantially between models, with the frequency-dependent model suggesting more immunity and lower transmission probabilities. The fitted frequency-dependent model estimated that the effects of a hypothetical elimination of concurrent partnerships and a reduction in commercial sex were both smaller than estimated by the fitted network model, whereas the latter model estimated a smaller impact of a reduction in unprotected sex in spousal relationships. The frequency-dependent assumption is problematic when modeling short-term STIs. Frequency-dependent models tend to underestimate the importance of high-risk groups in sustaining STI epidemics, while overestimating the importance of long-term partnerships and low-risk groups.
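
    The frequency-dependent assumption discussed above can be written as a force of infection proportional to prevalence. Below is a minimal sketch of a frequency-dependent SIS model; the parameters are illustrative assumptions, not the fitted South African values. A network model would instead track each individual's partnerships explicitly, which is why the two approaches diverge most for short-duration, curable STIs.

        # Frequency-dependent SIS model: incidence = beta_c * S * (I / N)
        N = 100000.0          # population size (assumed)
        beta_c = 1.2          # transmission probability x partner change rate, per year (assumed)
        recovery = 0.8        # recovery/treatment rate per year (assumed)

        S, I = N - 100.0, 100.0
        dt, years = 0.01, 20
        for _ in range(int(years / dt)):
            prevalence = I / N
            incidence = beta_c * S * prevalence      # the frequency-dependent assumption
            recoveries = recovery * I
            S += dt * (recoveries - incidence)
            I += dt * (incidence - recoveries)

        print(f"Equilibrium prevalence ~ {I / N:.1%}")   # analytic value: 1 - recovery / beta_c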

  2. Using Multivariate Adaptive Regression Spline and Artificial Neural Network to Simulate Urbanization in Mumbai, India

    NASA Astrophysics Data System (ADS)

    Ahmadlou, M.; Delavar, M. R.; Tayyebi, A.; Shafizadeh-Moghadam, H.

    2015-12-01

    Land use change (LUC) models used for modelling urban growth are different in structure and performance. Local models divide the data into separate subsets and fit distinct models on each of the subsets. Non-parametric models are data driven and usually do not have a fixed model structure, or their model structure is unknown before the modelling process. On the other hand, global models perform modelling using all the available data. In addition, parametric models have a fixed structure before the modelling process and they are model driven. Since few studies have compared local non-parametric models with global parametric models, this study compares a local non-parametric model called multivariate adaptive regression spline (MARS) and a global parametric model called artificial neural network (ANN) to simulate urbanization in Mumbai, India. Both models determine the relationship between a dependent variable and multiple independent variables. We used the receiver operating characteristic (ROC) to compare the power of both models for simulating urbanization. Landsat images of 1991 (TM) and 2010 (ETM+) were used for modelling the urbanization process. The drivers considered for urbanization in this area were distance to urban areas, urban density, distance to roads, distance to water, distance to forest, distance to railway, distance to central business district, number of agricultural cells in a 7 by 7 neighbourhood, and slope in 1991. The results showed that the area under the ROC curve for MARS and ANN was 94.77% and 95.36%, respectively. Thus, ANN performed slightly better than MARS in simulating urban areas in Mumbai, India.
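
    The area-under-ROC comparison reported above can be reproduced for any pair of classifiers with scikit-learn. A minimal sketch, assuming synthetic data in place of the Landsat-derived drivers and a gradient-boosting classifier standing in for MARS (which is not part of scikit-learn):

        from sklearn.datasets import make_classification
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        # Synthetic stand-in for the nine urbanization drivers (distance to roads, slope, ...).
        X, y = make_classification(n_samples=5000, n_features=9, n_informative=6, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        models = {
            "ANN (MLP)": MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0),
            "GBM (MARS stand-in)": GradientBoostingClassifier(random_state=0),
        }
        for name, model in models.items():
            model.fit(X_tr, y_tr)
            auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
            print(f"{name}: AUC = {auc:.3f}")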

  3. ModelMuse - A Graphical User Interface for MODFLOW-2005 and PHAST

    USGS Publications Warehouse

    Winston, Richard B.

    2009-01-01

    ModelMuse is a graphical user interface (GUI) for the U.S. Geological Survey (USGS) models MODFLOW-2005 and PHAST. This software package provides a GUI for creating the flow and transport input file for PHAST and the input files for MODFLOW-2005. In ModelMuse, the spatial data for the model is independent of the grid, and the temporal data is independent of the stress periods. Being able to input these data independently allows the user to redefine the spatial and temporal discretization at will. This report describes the basic concepts required to work with ModelMuse. These basic concepts include the model grid, data sets, formulas, objects, the method used to assign values to data sets, and model features. The ModelMuse main window has a top, front, and side view of the model that can be used for editing the model, and a 3-D view of the model that can be used to display properties of the model. ModelMuse has tools to generate and edit the model grid. It also has a variety of interpolation methods and geographic functions that can be used to help define the spatial variability of the model. ModelMuse can be used to execute both MODFLOW-2005 and PHAST and can also display the results of MODFLOW-2005 models. An example of using ModelMuse with MODFLOW-2005 is included in this report. Several additional examples are described in the help system for ModelMuse, which can be accessed from the Help menu.

  4. Transient PVT measurements and model predictions for vessel heat transfer. Part II.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felver, Todd G.; Paradiso, Nicholas Joseph; Winters, William S., Jr.

    2010-07-01

    Part I of this report focused on the acquisition and presentation of transient PVT data sets that can be used to validate gas transfer models. Here in Part II we focus primarily on describing models and validating these models using the data sets. Our models are intended to describe the high speed transport of compressible gases in arbitrary arrangements of vessels, tubing, valving and flow branches. Our models fall into three categories: (1) network flow models in which flow paths are modeled as one-dimensional flow and vessels are modeled as single control volumes, (2) CFD (Computational Fluid Dynamics) models in which flow in and between vessels is modeled in three dimensions and (3) coupled network/CFD models in which vessels are modeled using CFD and flows between vessels are modeled using a network flow code. In our work we utilized NETFLOW as our network flow code and FUEGO for our CFD code. Since network flow models lack three-dimensional resolution, correlations for heat transfer and tube frictional pressure drop are required to resolve important physics not being captured by the model. Here we describe how vessel heat transfer correlations were improved using the data and present direct model-data comparisons for all tests documented in Part I. Our results show that our network flow models have been substantially improved. The CFD modeling presented here describes the complex nature of vessel heat transfer and for the first time demonstrates that flow and heat transfer in vessels can be modeled directly without the need for correlations.

  5. Comparison of chiller models for use in model-based fault detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sreedharan, Priya; Haves, Philip

    Selecting the model is an important and essential step in model-based fault detection and diagnosis (FDD). Factors that are considered in evaluating a model include accuracy, training data requirements, calibration effort, generality, and computational requirements. The objective of this study was to evaluate different modeling approaches for their applicability to model-based FDD of vapor compression chillers. Three different models were studied: the Gordon and Ng Universal Chiller model (2nd generation) and a modified version of the ASHRAE Primary Toolkit model, which are both based on first principles, and the DOE-2 chiller model, as implemented in CoolTools, which is empirical. The models were compared in terms of their ability to reproduce the observed performance of an older, centrifugal chiller operating in a commercial office building and a newer centrifugal chiller in a laboratory. All three models displayed similar levels of accuracy. Of the first principles models, the Gordon-Ng model has the advantage of being linear in the parameters, which allows more robust parameter estimation methods to be used and facilitates estimation of the uncertainty in the parameter values. The ASHRAE Toolkit Model may have advantages when refrigerant temperature measurements are also available. The DOE-2 model can be expected to have advantages when very limited data are available to calibrate the model, as long as one of the previously identified models in the CoolTools library matches the performance of the chiller in question.

  6. Are Model Transferability And Complexity Antithetical? Insights From Validation of a Variable-Complexity Empirical Snow Model in Space and Time

    NASA Astrophysics Data System (ADS)

    Lute, A. C.; Luce, Charles H.

    2017-11-01

    The related challenges of predictions in ungauged basins and predictions in ungauged climates point to the need to develop environmental models that are transferable across both space and time. Hydrologic modeling has historically focused on modelling one or only a few basins using highly parameterized conceptual or physically based models. However, model parameters and structures have been shown to change significantly when calibrated to new basins or time periods, suggesting that model complexity and model transferability may be antithetical. Empirical space-for-time models provide a framework within which to assess model transferability and any tradeoff with model complexity. Using 497 SNOTEL sites in the western U.S., we develop space-for-time models of April 1 SWE and Snow Residence Time based on mean winter temperature and cumulative winter precipitation. The transferability of the models to new conditions (in both space and time) is assessed using non-random cross-validation tests with consideration of the influence of model complexity on transferability. As others have noted, the algorithmic empirical models transfer best when minimal extrapolation in input variables is required. Temporal split-sample validations use pseudoreplicated samples, resulting in the selection of overly complex models, which has implications for the design of hydrologic model validation tests. Finally, we show that low to moderate complexity models transfer most successfully to new conditions in space and time, providing empirical confirmation of the parsimony principle.
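
    A minimal sketch of the complexity-versus-transferability question: polynomial models of increasing complexity are trained on one part of a synthetic input space and validated on another, using a deliberately non-random split that forces extrapolation. The data and variable names are illustrative assumptions, not the SNOTEL models of the study.

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import r2_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures

        rng = np.random.default_rng(0)
        n = 500
        temp = rng.uniform(-10, 5, n)          # mean winter temperature, degC (assumed)
        precip = rng.uniform(200, 1500, n)     # cumulative winter precipitation, mm (assumed)
        swe = 0.6 * precip / (1 + np.exp(temp)) + rng.normal(0, 40, n)   # synthetic April 1 SWE
        X = np.column_stack([temp, precip])

        # Non-random split: train on colder sites, validate on warmer ones,
        # forcing extrapolation in the input space (a space-for-time style test).
        train, test = temp < 0, temp >= 0

        for degree in (1, 2, 3, 5):
            model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
            model.fit(X[train], swe[train])
            print(f"degree {degree}: transfer R^2 = {r2_score(swe[test], model.predict(X[test])):.2f}")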

  7. Geospace environment modeling 2008--2009 challenge: Dst index

    USGS Publications Warehouse

    Rastätter, L.; Kuznetsova, M.M.; Glocer, A.; Welling, D.; Meng, X.; Raeder, J.; Wittberger, M.; Jordanova, V.K.; Yu, Y.; Zaharia, S.; Weigel, R.S.; Sazykin, S.; Boynton, R.; Wei, H.; Eccles, V.; Horton, W.; Mays, M.L.; Gannon, J.

    2013-01-01

    This paper reports the metrics-based results of the Dst index part of the 2008–2009 GEM Metrics Challenge. The 2008–2009 GEM Metrics Challenge asked modelers to submit results for four geomagnetic storm events and five different types of observations that can be modeled by statistical, climatological or physics-based models of the magnetosphere-ionosphere system. We present the results of 30 model settings that were run at the Community Coordinated Modeling Center and at the institutions of various modelers for these events. To measure the performance of each of the models against the observations, we use comparisons of 1 hour averaged model data with the Dst index issued by the World Data Center for Geomagnetism, Kyoto, Japan, and direct comparison of 1 minute model data with the 1 minute Dst index calculated by the United States Geological Survey. The latter index can be used to calculate spectral variability of model outputs in comparison to the index. We find that model rankings vary widely by skill score used. None of the models consistently perform best for all events. We find that empirical models perform well in general. Magnetohydrodynamics-based models of the global magnetosphere with inner magnetosphere physics (ring current model) included and stand-alone ring current models with properly defined boundary conditions perform well and are able to match or surpass results from empirical models. Unlike in similar studies, the statistical models used in this study found their challenge in the weakest events rather than the strongest events.
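
    A minimal sketch of the kind of metric comparison described above, computing RMSE and prediction efficiency (skill relative to the observed mean) for hourly model output against an observed Dst series; both time series here are synthetic placeholders.

        import numpy as np

        rng = np.random.default_rng(1)
        hours = np.arange(96)
        # Synthetic "observed" storm-time Dst and synthetic model output (placeholders).
        dst_obs = -80 * np.exp(-((hours - 30) / 12.0) ** 2) + rng.normal(0, 5, hours.size)
        dst_model = 0.8 * dst_obs + rng.normal(0, 10, hours.size)

        def rmse(obs, mod):
            return np.sqrt(np.mean((obs - mod) ** 2))

        def prediction_efficiency(obs, mod):
            """1 - MSE / variance of observations; 1 is perfect, <= 0 is no better than the mean."""
            return 1.0 - np.mean((obs - mod) ** 2) / np.var(obs)

        print(f"RMSE = {rmse(dst_obs, dst_model):.1f} nT")
        print(f"Prediction efficiency = {prediction_efficiency(dst_obs, dst_model):.2f}")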

  8. Hybrid Forecasting of Daily River Discharges Considering Autoregressive Heteroscedasticity

    NASA Astrophysics Data System (ADS)

    Szolgayová, Elena Peksová; Danačová, Michaela; Komorniková, Magda; Szolgay, Ján

    2017-06-01

    It is widely acknowledged that in the hydrological and meteorological communities, there is a continuing need to improve the quality of quantitative rainfall and river flow forecasts. A hybrid (combined deterministic-stochastic) modelling approach is proposed here that combines the advantages offered by modelling the system dynamics with a deterministic model and a deterministic forecasting error series with a data-driven model in parallel. Since the processes to be modelled are generally nonlinear and the model error series may exhibit nonstationarity and heteroscedasticity, GARCH-type nonlinear time series models are considered here. The fitting, forecasting and simulation performance of such models have to be explored on a case-by-case basis. The goal of this paper is to test and develop an appropriate methodology for model fitting and forecasting applicable to daily river discharge forecast error data from the GARCH family of time series models. We concentrated on verifying whether the use of a GARCH-type model is suitable for modelling and forecasting a hydrological model error time series on the Hron and Morava Rivers in Slovakia. For this purpose we verified the presence of heteroscedasticity in the simulation error series of the KLN multilinear flow routing model; then we fitted the GARCH-type models to the data and compared their fit with that of an ARMA-type model. We produced one-step-ahead forecasts from the fitted models and again provided comparisons of the models' performance.
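
    A minimal sketch of fitting a GARCH-type model to a deterministic model's forecast-error series, assuming the third-party Python arch package and a synthetic error series with volatility clustering in place of the KLN routing-model residuals:

        import numpy as np
        from arch import arch_model   # third-party package: pip install arch

        rng = np.random.default_rng(0)
        # Synthetic stand-in for a daily discharge forecast-error series with
        # volatility clustering (conditional heteroscedasticity).
        n, errors, sigma2 = 2000, [], 1.0
        for _ in range(n):
            sigma2 = 0.1 + 0.1 * (errors[-1] ** 2 if errors else 0.0) + 0.85 * sigma2
            errors.append(np.sqrt(sigma2) * rng.standard_normal())
        errors = np.asarray(errors)

        # Fit an AR(1) mean model with a GARCH(1,1) conditional variance.
        am = arch_model(errors, mean="AR", lags=1, vol="GARCH", p=1, q=1)
        res = am.fit(disp="off")
        print(res.params)

        # One-step-ahead forecast of the conditional variance of the error.
        forecast = res.forecast(horizon=1)
        print("one-step-ahead forecast variance:", float(forecast.variance.iloc[-1, 0]))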

  9. RECURSIVE PROTEIN MODELING: A DIVIDE AND CONQUER STRATEGY FOR PROTEIN STRUCTURE PREDICTION AND ITS CASE STUDY IN CASP9

    PubMed Central

    CHENG, JIANLIN; EICKHOLT, JESSE; WANG, ZHENG; DENG, XIN

    2013-01-01

    After decades of research, protein structure prediction remains a very challenging problem. In order to address the different levels of complexity of structural modeling, two types of modeling techniques — template-based modeling and template-free modeling — have been developed. Template-based modeling can often generate a moderate- to high-resolution model when a similar, homologous template structure is found for a query protein but fails if no template or only incorrect templates are found. Template-free modeling, such as fragment-based assembly, may generate models of moderate resolution for small proteins of low topological complexity. Seldom have the two techniques been integrated to improve protein modeling. Here we develop a recursive protein modeling approach to selectively and collaboratively apply template-based and template-free modeling methods to model template-covered (i.e. certain) and template-free (i.e. uncertain) regions of a protein. A preliminary implementation of the approach was tested on a number of hard modeling cases during the 9th Critical Assessment of Techniques for Protein Structure Prediction (CASP9) and successfully improved the quality of modeling in most of these cases. Recursive modeling can significantly reduce the complexity of protein structure modeling and integrate template-based and template-free modeling to improve the quality and efficiency of protein structure prediction. PMID:22809379

  10. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating the model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is also simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
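
    One ingredient of the index described above is the share of output variance attributable to the choice of process model, which follows from the law of total variance. A minimal, generic sketch with two hypothetical recharge process models and assumed model-averaging weights (not the models or weights of the study):

        import numpy as np

        rng = np.random.default_rng(0)
        n = 50000

        def recharge_model_a(p):   # hypothetical process model 1 (assumed)
            return 0.2 * p
        def recharge_model_b(p):   # hypothetical process model 2 (assumed)
            return 0.05 * p ** 1.3

        model_weights = [0.6, 0.4]            # assumed model-averaging weights
        models = [recharge_model_a, recharge_model_b]

        # Monte Carlo sample: first pick a process model, then draw its random parameter.
        choice = rng.choice(len(models), size=n, p=model_weights)
        precip = rng.uniform(200, 1200, n)    # random parameter (precipitation, assumed)
        output = np.where(choice == 0, recharge_model_a(precip), recharge_model_b(precip))

        # Between-model variance as a fraction of total variance (law of total variance):
        # one component of a variance-based process sensitivity index.
        total_var = output.var()
        conditional_means = np.array([output[choice == k].mean() for k in range(len(models))])
        between_model_var = np.average((conditional_means - output.mean()) ** 2, weights=model_weights)
        print(f"Fraction of output variance from model choice alone: {between_model_var / total_var:.2f}")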

  11. Comparison of childbirth care models in public hospitals, Brazil.

    PubMed

    Vogt, Sibylle Emilie; Silva, Kátia Silveira da; Dias, Marcos Augusto Bastos

    2014-04-01

    To compare collaborative and traditional childbirth care models. Cross-sectional study with 655 primiparous women in four public health system hospitals in Belo Horizonte, MG, Southeastern Brazil, in 2011 (333 women for the collaborative model and 322 for the traditional model, including those with induced or premature labor). Data were collected using interviews and medical records. The Chi-square test was used to compare the outcomes and multivariate logistic regression to determine the association between the model and the interventions used. Paid work and schooling showed significant differences in distribution between the models. Oxytocin (50.2% collaborative model and 65.5% traditional model; p < 0.001), amniotomy (54.3% collaborative model and 65.9% traditional model; p = 0.012) and episiotomy (collaborative model 16.1% and traditional model 85.2%; p < 0.001) were less used in the collaborative model, with increased application of non-pharmacological pain relief (85.0% collaborative model and 78.9% traditional model; p = 0.042). The association between the collaborative model and the reduction in the use of oxytocin, artificial rupture of membranes and episiotomy remained after adjustment for confounding. The care model was not associated with complications in newborns or mothers, nor with the use of spinal or epidural analgesia. The results suggest that the collaborative model may reduce interventions performed in labor care with similar perinatal outcomes.

  12. Developing and upgrading of solar system thermal energy storage simulation models. Technical progress report, March 1, 1979-February 29, 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhn, J K; von Fuchs, G F; Zob, A P

    1980-05-01

    Two water tank component simulation models have been selected and upgraded. These models are called the CSU Model and the Extended SOLSYS Model. The models have been standardized and links have been provided for operation in the TRNSYS simulation program. The models are described in analytical terms as well as in computer code. Specific water tank tests were performed for the purpose of model validation. Agreement between model data and test data is excellent. A description of the limitations has also been included. Streamlining results and criteria for the reduction of computer time have also been shown for both water tank computer models. Computer codes for the models and instructions for operating these models in TRNSYS have also been included, making the models readily available for DOE and industry use. Rock bed component simulation models have been reviewed and a model selected and upgraded. This model is a logical extension of the Mumma-Marvin model. Specific rock bed tests have been performed for the purpose of validation. Data have been reviewed for consistency. Details of the test results concerned with rock characteristics and pressure drop through the bed have been explored and are reported.

  13. Modeling approaches in avian conservation and the role of field biologists

    USGS Publications Warehouse

    Beissinger, Steven R.; Walters, J.R.; Catanzaro, D.G.; Smith, Kimberly G.; Dunning, J.B.; Haig, Susan M.; Noon, Barry; Stith, Bradley M.

    2006-01-01

    This review grew out of our realization that models play an increasingly important role in conservation but are rarely used in the research of most avian biologists. Modelers are creating models that are more complex and mechanistic and that can incorporate more of the knowledge acquired by field biologists. Such models require field biologists to provide more specific information, larger sample sizes, and sometimes new kinds of data, such as habitat-specific demography and dispersal information. Field biologists need to support model development by testing key model assumptions and validating models. The best conservation decisions will occur where cooperative interaction enables field biologists, modelers, statisticians, and managers to contribute effectively. We begin by discussing the general form of ecological models—heuristic or mechanistic, "scientific" or statistical—and then highlight the structure, strengths, weaknesses, and applications of six types of models commonly used in avian conservation: (1) deterministic single-population matrix models, (2) stochastic population viability analysis (PVA) models for single populations, (3) metapopulation models, (4) spatially explicit models, (5) genetic models, and (6) species distribution models. We end by considering their unique attributes, determining whether the assumptions that underlie the structure are valid, and testing the ability of the model to predict the future correctly.

  14. Review: Regional groundwater flow modeling in heavily irrigated basins of selected states in the western United States

    NASA Astrophysics Data System (ADS)

    Rossman, Nathan R.; Zlotnik, Vitaly A.

    2013-09-01

    Water resources in agriculture-dominated basins of the arid western United States are stressed due to long-term impacts from pumping. A review of 88 regional groundwater-flow modeling applications from seven intensively irrigated western states (Arizona, California, Colorado, Idaho, Kansas, Nebraska and Texas) was conducted to provide hydrogeologists, modelers, water managers, and decision makers insight about past modeling studies that will aid future model development. Groundwater models were classified into three types: resource evaluation models (39 %), which quantify water budgets and act as preliminary models intended to be updated later, or constitute re-calibrations of older models; management/planning models (55 %), used to explore and identify management plans based on the response of the groundwater system to water-development or climate scenarios, sometimes under water-use constraints; and water rights models (7 %), used to make water administration decisions based on model output and to quantify water shortages incurred by water users or climate changes. Results for 27 model characteristics are summarized by state and model type, and important comparisons and contrasts are highlighted. Consideration of modeling uncertainty and the management focus toward sustainability, adaptive management and resilience are discussed, and future modeling recommendations, in light of the reviewed models and other published works, are presented.

  15. Interpreting Musculoskeletal Models and Dynamic Simulations: Causes and Effects of Differences Between Models.

    PubMed

    Roelker, Sarah A; Caruthers, Elena J; Baker, Rachel K; Pelz, Nicholas C; Chaudhari, Ajit M W; Siston, Robert A

    2017-11-01

    With more than 29,000 OpenSim users, several musculoskeletal models with varying levels of complexity are available to study human gait. However, how different model parameters affect estimated joint and muscle function between models is not fully understood. The purpose of this study is to determine the effects of four OpenSim models (Gait2392, Lower Limb Model 2010, Full-Body OpenSim Model, and Full Body Model 2016) on gait mechanics and estimates of muscle forces and activations. Using OpenSim 3.1 and the same experimental data for all models, six young adults were scaled in each model, gait kinematics were reproduced, and static optimization estimated muscle function. Simulated measures differed between models by up to 6.5° knee range of motion, 0.012 Nm/Nm peak knee flexion moment, 0.49 peak rectus femoris activation, and 462 N peak rectus femoris force. Differences in coordinate system definitions between models altered joint kinematics, influencing joint moments. Muscle parameter and joint moment discrepancies altered muscle activations and forces. Additional model complexity yielded greater error between experimental and simulated measures; therefore, this study suggests Gait2392 is a sufficient model for studying walking in healthy young adults. Future research is needed to determine which model(s) is best for tasks with more complex motion.

  16. Inter-sectoral comparison of model uncertainty of climate change impacts in Africa

    NASA Astrophysics Data System (ADS)

    van Griensven, Ann; Vetter, Tobias; Piontek, Franzisca; Gosling, Simon N.; Kamali, Bahareh; Reinhardt, Julia; Dinkneh, Aklilu; Yang, Hong; Alemayehu, Tadesse

    2016-04-01

    We present the model results and their uncertainties from an inter-sectoral impact model inter-comparison initiative (ISI-MIP) for climate change impacts in Africa. The study includes results on hydrological, crop and health aspects. The impact models used ensemble inputs consisting of 20 time series of daily rainfall and temperature data obtained from 5 Global Circulation Models (GCMs) and 4 representative concentration pathways (RCPs). In this study, we analysed model uncertainty for the regional hydrological models, global hydrological models, malaria models and crop models. For the regional hydrological models, we used 2 African test cases: the Blue Nile in Eastern Africa and the Niger in Western Africa. For both basins, the main sources of uncertainty originate from the GCMs and RCPs, while the uncertainty of the regional hydrological models is relatively low. The hydrological model uncertainty becomes more important when predicting changes in low flows compared to mean or high flows. For the other sectors, the impact models have the largest share of uncertainty compared to GCM and RCP, especially for malaria and crop modelling. The overall conclusion of the ISI-MIP is that it is strongly advised to use an ensemble modelling approach for climate change impact studies throughout the whole modelling chain.

  17. Extended behavioural modelling of FET and lattice-mismatched HEMT devices

    NASA Astrophysics Data System (ADS)

    Khawam, Yahya; Albasha, Lutfi

    2017-07-01

    This study presents an improved large signal model that can be used for high electron mobility transistors (HEMTs) and field effect transistors using measurement-based behavioural modelling techniques. The steps for accurate large and small signal modelling of transistors are also discussed. The proposed DC model is based on the Fager model, since it balances the number of model parameters against accuracy. The objective is to increase the accuracy of the drain-source current model with respect to any change in gate or drain voltages, and to extend the improved DC model to account for soft breakdown and the kink effect found in some variants of HEMT devices. A hybrid Newton's-Genetic algorithm is used in order to determine the unknown parameters in the developed model. In addition to accurate modelling of a transistor's DC characteristics, the complete large signal model is built using multi-bias s-parameter measurements, combining a hybrid multi-objective optimisation technique (Non-dominated Sorting Genetic Algorithm II) with a local minimum search (multivariable Newton's method) for parasitic element extraction. Finally, the results of DC modelling and multi-bias s-parameter modelling are presented, and three recommendations for device modelling are discussed.
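
    A minimal sketch of the global-then-local parameter-extraction idea (an evolutionary global search refined by a local gradient method), using scipy in place of the authors' Newton's-Genetic implementation and a simple hypothetical drain-current curve rather than the Fager model:

        import numpy as np
        from scipy.optimize import differential_evolution, minimize

        # Hypothetical "measured" drain current vs gate voltage to be fitted (synthetic data).
        vgs = np.linspace(-1.0, 0.5, 40)
        true_params = (0.08, 2.5, -0.4)       # Ipk (A), slope (1/V), Vth (V); illustrative values
        def ids_model(v, ipk, slope, vth):
            """Simple saturating tanh-type drain-current model (illustrative, not the Fager model)."""
            return ipk * (1.0 + np.tanh(slope * (v - vth))) / 2.0
        measured = ids_model(vgs, *true_params) + np.random.default_rng(0).normal(0, 1e-3, vgs.size)

        def sse(params):
            return np.sum((ids_model(vgs, *params) - measured) ** 2)

        bounds = [(0.01, 0.2), (0.5, 6.0), (-1.0, 0.5)]
        global_fit = differential_evolution(sse, bounds, seed=0)    # global (GA-like) stage
        local_fit = minimize(sse, global_fit.x, method="BFGS")      # local (Newton-like) refinement
        print("extracted parameters:", np.round(local_fit.x, 3))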

  18. The regionalization of national-scale SPARROW models for stream nutrients

    USGS Publications Warehouse

    Schwarz, Gregory E.; Alexander, Richard B.; Smith, Richard A.; Preston, Stephen D.

    2011-01-01

    This analysis modifies the parsimonious specification of recently published total nitrogen (TN) and total phosphorus (TP) national-scale SPAtially Referenced Regressions On Watershed attributes models to allow each model coefficient to vary geographically among three major river basins of the conterminous United States. Regionalization of the national models reduces the standard errors in the prediction of TN and TP loads, expressed as a percentage of the predicted load, by about 6 and 7%. We develop and apply a method for combining national-scale and regional-scale information to estimate a hybrid model that imposes cross-region constraints that limit regional variation in model coefficients, effectively reducing the number of free model parameters as compared to a collection of independent regional models. The hybrid TN and TP regional models have improved model fit relative to the respective national models, reducing the standard error in the prediction of loads, expressed as a percentage of load, by about 5 and 4%. Only 19% of the TN hybrid model coefficients and just 2% of the TP hybrid model coefficients show evidence of substantial regional specificity (more than ±100% deviation from the national model estimate). The hybrid models have much greater precision in the estimated coefficients than do the unconstrained regional models, demonstrating the efficacy of pooling information across regions to improve regional models.

  19. Modeling of Stiffness and Strength of Bone at Nanoscale.

    PubMed

    Abueidda, Diab W; Sabet, Fereshteh A; Jasiuk, Iwona M

    2017-05-01

    Two distinct geometrical models of bone at the nanoscale (collagen fibril and mineral platelets) are analyzed computationally. In the first model (model I), minerals are periodically distributed in a staggered manner in a collagen matrix, while in the second model (model II), minerals form continuous layers outside the collagen fibril. Elastic modulus and strength of bone at the nanoscale, represented by these two models under longitudinal tensile loading, are studied using the finite element (FE) software Abaqus. The analysis employs a traction-separation law (cohesive surface modeling) at various interfaces in the models to account for interfacial delaminations. Plane stress, plane strain, and axisymmetric versions of the two models are considered. Model II is found to have a higher stiffness than model I for all cases. For strength, the relative performance of the two models alternates depending on the inputs and assumptions used. For model II, the axisymmetric case gives higher results than the plane stress and plane strain cases, while an opposite trend is observed for model I. For the axisymmetric case, model II shows greater strength and stiffness compared to model I. The collagen-mineral arrangement of bone at the nanoscale forms a basic building block of bone. Thus, knowledge of its mechanical properties is of high scientific and clinical interest.

  20. The Use of Behavior Models for Predicting Complex Operations

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2010-01-01

    Modeling and simulation (M&S) plays an important role when complex human-system notions are being proposed, developed and tested within the system design process. The National Aeronautics and Space Administration (NASA) uses many different types of M&S approaches for predicting human-system interactions, especially early in the development phase of a conceptual design. NASA Ames Research Center possesses a number of M&S capabilities, including airflow, flight path, aircraft, scheduling, human performance (HPM), and bioinformatics models, among a host of other M&S capabilities used for predicting whether proposed designs will meet the specific mission criteria. The Man-Machine Integration Design and Analysis System (MIDAS) is a NASA ARC HPM software tool that integrates many models of human behavior with environment models, equipment models, and procedural/task models. The challenge to model comprehensibility is heightened as the number of models that are integrated and the requisite fidelity of the procedural sets are increased. Model transparency is needed for some of the more complex HPMs to maintain comprehensibility of the integrated model performance. This will be exemplified in a recent MIDAS v5 application model, and plans for future model refinements will be presented.

  1. Processing Speed in Children: Examination of the Structure in Middle Childhood and Its Impact on Reading

    ERIC Educational Resources Information Center

    Gerst, Elyssa H.

    2017-01-01

    The primary aim of this study was to examine the structure of processing speed (PS) in middle childhood by comparing five theoretically driven models of PS. The models consisted of two conceptual models (a unitary model, a complexity model) and three methodological models (a stimulus material model, an output modality model, and a timing modality…

  2. The Application of Various Nonlinear Models to Describe Academic Growth Trajectories: An Empirical Analysis Using Four-Wave Longitudinal Achievement Data from a Large Urban School District

    ERIC Educational Resources Information Center

    Shin, Tacksoo

    2012-01-01

    This study introduced various nonlinear growth models, including the quadratic conventional polynomial model, the fractional polynomial model, the Sigmoid model, the growth model with negative exponential functions, the multidimensional scaling technique, and the unstructured growth curve model. It investigated which growth models effectively…

  3. Competency Modeling in Extension Education: Integrating an Academic Extension Education Model with an Extension Human Resource Management Model

    ERIC Educational Resources Information Center

    Scheer, Scott D.; Cochran, Graham R.; Harder, Amy; Place, Nick T.

    2011-01-01

    The purpose of this study was to compare and contrast an academic extension education model with an Extension human resource management model. The academic model of 19 competencies was similar across the 22 competencies of the Extension human resource management model. There were seven unique competencies for the human resource management model.…

  4. Defining a Family of Cognitive Diagnosis Models Using Log-Linear Models with Latent Variables

    ERIC Educational Resources Information Center

    Henson, Robert A.; Templin, Jonathan L.; Willse, John T.

    2009-01-01

    This paper uses log-linear models with latent variables (Hagenaars, in "Loglinear Models with Latent Variables," 1993) to define a family of cognitive diagnosis models. In doing so, the relationship between many common models is explicitly defined and discussed. In addition, because the log-linear model with latent variables is a general model for…

  5. A toolbox and a record for scientific model development

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1994-01-01

    Scientific computation can benefit from software tools that facilitate construction of computational models, control the application of models, and aid in revising models to handle new situations. Existing environments for scientific programming provide only limited means of handling these tasks. This paper describes a two-pronged approach for handling these tasks: (1) designing a 'Model Development Toolbox' that includes a basic set of model constructing operations; and (2) designing a 'Model Development Record' that is automatically generated during model construction. The record is subsequently exploited by tools that control the application of scientific models and revise models to handle new situations. Our two-pronged approach is motivated by our belief that the model development toolbox and record should be highly interdependent. In particular, a suitable model development record can be constructed only when models are developed using a well defined set of operations. We expect this research to facilitate rapid development of new scientific computational models, to help ensure appropriate use of such models and to facilitate sharing of such models among working computational scientists. We are testing this approach by extending SIGMA, an existing knowledge-based scientific software design tool.

  6. A decision support model for investment on P2P lending platform.

    PubMed

    Zeng, Xiangxiang; Liu, Li; Leung, Stephen; Du, Jiangze; Wang, Xun; Li, Tao

    2017-01-01

    Peer-to-peer (P2P) lending, as a novel economic lending model, has triggered new challenges on making effective investment decisions. In a P2P lending platform, one lender can invest in N loans and a loan may be accepted by M investors, thus forming a bipartite graph. Based on the bipartite graph model, we built an iterative computation model to evaluate the unknown loans. To validate the proposed model, we performed extensive experiments on real-world data from the largest American P2P lending marketplace, Prosper. By comparing our experimental results with those obtained by Bayes and Logistic Regression, we show that our computation model can help borrowers select good loans and help lenders make good investment decisions. Experimental results also show that the Logistic classification model is a good complement to our iterative computation model, which motivates us to integrate the two classification models. The experimental results of the hybrid classification model demonstrate that the logistic classification model and our iterative computation model are complementary to each other. We conclude that the hybrid model (i.e., the integration of the iterative computation model and the Logistic classification model) is more efficient and stable than either individual model alone.
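
    A minimal sketch of iterative score propagation on a lender-loan bipartite graph; the HITS-style update rule and the toy adjacency matrix are illustrative assumptions, not the authors' exact formulation:

        import numpy as np

        # Bipartite adjacency: rows = lenders, columns = loans (1 = lender invested in loan).
        A = np.array([
            [1, 1, 0, 0],
            [0, 1, 1, 0],
            [0, 0, 1, 1],
            [1, 0, 0, 1],
        ], dtype=float)

        loan_quality = np.ones(A.shape[1])      # initial guess for the unknown loan scores
        for _ in range(50):
            # A lender looks skilled if the loans they picked score well...
            lender_skill = A @ loan_quality
            lender_skill /= np.linalg.norm(lender_skill)
            # ...and a loan scores well if skilled lenders invested in it (mutual reinforcement).
            loan_quality = A.T @ lender_skill
            loan_quality /= np.linalg.norm(loan_quality)

        print("relative loan scores:", np.round(loan_quality, 3))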

  7. A decision support model for investment on P2P lending platform

    PubMed Central

    Liu, Li; Leung, Stephen; Du, Jiangze; Wang, Xun; Li, Tao

    2017-01-01

    Peer-to-peer (P2P) lending, as a novel economic lending model, has triggered new challenges on making effective investment decisions. In a P2P lending platform, one lender can invest in N loans and a loan may be accepted by M investors, thus forming a bipartite graph. Based on the bipartite graph model, we built an iterative computation model to evaluate the unknown loans. To validate the proposed model, we performed extensive experiments on real-world data from the largest American P2P lending marketplace, Prosper. By comparing our experimental results with those obtained by Bayes and Logistic Regression, we show that our computation model can help borrowers select good loans and help lenders make good investment decisions. Experimental results also show that the Logistic classification model is a good complement to our iterative computation model, which motivates us to integrate the two classification models. The experimental results of the hybrid classification model demonstrate that the logistic classification model and our iterative computation model are complementary to each other. We conclude that the hybrid model (i.e., the integration of the iterative computation model and the Logistic classification model) is more efficient and stable than either individual model alone. PMID:28877234

  8. First-Order Model Management With Variable-Fidelity Physics Applied to Multi-Element Airfoil Optimization

    NASA Technical Reports Server (NTRS)

    Alexandrov, N. M.; Nielsen, E. J.; Lewis, R. M.; Anderson, W. K.

    2000-01-01

    First-order approximation and model management is a methodology for a systematic use of variable-fidelity models or approximations in optimization. The intent of model management is to attain convergence to high-fidelity solutions with minimal expense in high-fidelity computations. The savings in terms of computationally intensive evaluations depend on the ability of the available lower-fidelity model or a suite of models to predict the improvement trends for the high-fidelity problem. Variable-fidelity models can be represented by data-fitting approximations, variable-resolution models, variable-convergence models, or variable physical-fidelity models. The present work considers the use of variable-fidelity physics models. We demonstrate the performance of model management on an aerodynamic optimization of a multi-element airfoil designed to operate in the transonic regime. Reynolds-averaged Navier-Stokes equations represent the high-fidelity model, while the Euler equations represent the low-fidelity model. An unstructured mesh-based analysis code FUN2D evaluates functions and sensitivity derivatives for both models. Model management for the present demonstration problem yields fivefold savings in terms of high-fidelity evaluations compared to optimization done with high-fidelity computations alone.

  9. Macro-level pedestrian and bicycle crash analysis: Incorporating spatial spillover effects in dual state count models.

    PubMed

    Cai, Qing; Lee, Jaeyoung; Eluru, Naveen; Abdel-Aty, Mohamed

    2016-08-01

    This study attempts to explore the viability of dual-state models (i.e., zero-inflated and hurdle models) for traffic analysis zone (TAZ) based pedestrian and bicycle crash frequency analysis. Additionally, spatial spillover effects are explored in the models by employing exogenous variables from neighboring zones. The dual-state models such as zero-inflated negative binomial and hurdle negative binomial models (with and without spatial effects) are compared with the conventional single-state model (i.e., negative binomial). The model comparison for pedestrian and bicycle crashes revealed that the models that considered observed spatial effects perform better than the models that did not consider the observed spatial effects. Across the models with spatial spillover effects, the dual-state models, especially the zero-inflated negative binomial model, offered better performance compared to single-state models. Moreover, the model results clearly highlighted the importance of various traffic, roadway, and sociodemographic characteristics of the TAZ as well as neighboring TAZs on pedestrian and bicycle crash frequency. Copyright © 2016 Elsevier Ltd. All rights reserved.
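
    A minimal sketch of the dual-state idea, fitting a zero-inflated Poisson log-likelihood with scipy as a simplified stand-in for the zero-inflated negative binomial models used in the study; the single covariate and all data are synthetic:

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit, gammaln

        rng = np.random.default_rng(0)
        n = 2000
        x = rng.normal(size=n)                          # one synthetic zone-level covariate
        true_pi, true_b0, true_b1 = 0.3, 0.5, 0.8       # assumed zero-inflation prob. and count coefficients
        structural_zero = rng.random(n) < true_pi
        counts = np.where(structural_zero, 0, rng.poisson(np.exp(true_b0 + true_b1 * x)))

        def negloglik(theta):
            logit_pi, b0, b1 = theta
            pi, mu = expit(logit_pi), np.exp(b0 + b1 * x)
            # Zero-inflated Poisson: P(0) = pi + (1 - pi) e^{-mu}; P(y > 0) = (1 - pi) Poisson(y; mu)
            ll_zero = np.log(pi + (1 - pi) * np.exp(-mu))
            ll_pos = np.log(1 - pi) - mu + counts * np.log(mu) - gammaln(counts + 1)
            return -np.sum(np.where(counts == 0, ll_zero, ll_pos))

        fit = minimize(negloglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
        logit_pi, b0, b1 = fit.x
        print(f"estimated zero-inflation prob. = {expit(logit_pi):.2f}, b0 = {b0:.2f}, b1 = {b1:.2f}")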

  10. BioModels Database: a repository of mathematical models of biological processes.

    PubMed

    Chelliah, Vijayalakshmi; Laibe, Camille; Le Novère, Nicolas

    2013-01-01

    BioModels Database is a public online resource that allows storing and sharing of published, peer-reviewed quantitative, dynamic models of biological processes. The model components and behaviour are thoroughly checked to correspond to the original publication and manually curated to ensure reliability. Furthermore, the model elements are annotated with terms from controlled vocabularies as well as linked to relevant external data resources. This greatly helps in model interpretation and reuse. Models are stored in SBML format, accepted in SBML and CellML formats, and are available for download in various other common formats such as BioPAX, Octave, SciLab, VCML, XPP and PDF, in addition to SBML. The reaction network diagram of the models is also available in several formats. BioModels Database features a search engine, which provides simple and more advanced searches. Features such as online simulation and creation of smaller models (submodels) from the selected model elements of a larger one are provided. BioModels Database can be accessed both via a web interface and programmatically via web services. New models are available in BioModels Database at regular releases, about every 4 months.

  11. Documenting Models for Interoperability and Reusability ...

    EPA Pesticide Factsheets

    Many modeling frameworks compartmentalize science via individual models that link sets of small components to create larger modeling workflows. Developing integrated watershed models increasingly requires coupling multidisciplinary, independent models, as well as collaboration between scientific communities, since component-based modeling can integrate models from different disciplines. Integrated Environmental Modeling (IEM) systems focus on transferring information between components by capturing a conceptual site model; establishing local metadata standards for input/output of models and databases; managing data flow between models and throughout the system; facilitating quality control of data exchanges (e.g., checking units, unit conversions, transfers between software languages); warning and error handling; and coordinating sensitivity/uncertainty analyses. Although many computational software systems facilitate communication between, and execution of, components, there are no common approaches, protocols, or standards for turn-key linkages between software systems and models, especially if modifying components is not the intent. Using a standard ontology, this paper reviews how models can be described for discovery, understanding, evaluation, access, and implementation to facilitate interoperability and reusability. In the proceedings of the International Environmental Modelling and Software Society (iEMSs), 8th International Congress on Environmental Mod

  12. CSR Model Implementation from School Stakeholder Perspectives

    ERIC Educational Resources Information Center

    Herrmann, Suzannah

    2006-01-01

    Despite comprehensive school reform (CSR) model developers' best intentions to make school stakeholders adhere strictly to the implementation of model components, school stakeholders implementing CSR models inevitably make adaptations to the CSR model. Adaptations are made to CSR models because school stakeholders internalize CSR model practices…

  13. A comparison of simple global kinetic models for coal devolatilization with the CPD model

    DOE PAGES

    Richards, Andrew P.; Fletcher, Thomas H.

    2016-08-01

    Simulations of coal combustors and gasifiers generally cannot incorporate the complexities of advanced pyrolysis models, and hence there is interest in evaluating simpler models over ranges of temperature and heating rate that are applicable to the furnace of interest. In this paper, six different simple model forms are compared to predictions made by the Chemical Percolation Devolatilization (CPD) model. The model forms included three modified one-step models, a simple two-step model, and two new modified two-step models. These simple model forms were compared over a wide range of heating rates (5 × 10^3 to 10^6 K/s) at final temperatures up to 1600 K. Comparisons were made of total volatiles yield as a function of temperature, as well as the ultimate volatiles yield. Advantages and disadvantages for each simple model form are discussed. In conclusion, a modified two-step model with distributed activation energies seems to give the best agreement with CPD model predictions (with the fewest tunable parameters).
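
    A minimal sketch of the simplest of the model forms compared above: a single-step, first-order devolatilization model with an Arrhenius rate constant, integrated over a constant heating rate. The kinetic parameters are illustrative assumptions, not fitted values from the paper.

        import numpy as np

        # Single first-order reaction: dV/dt = k(T) * (V_ultimate - V), with k = A * exp(-E / (R T))
        A = 2.0e5          # pre-exponential factor, 1/s (assumed)
        E = 7.0e4          # activation energy, J/mol (assumed)
        R = 8.314          # gas constant, J/(mol K)
        V_ult = 0.55       # ultimate volatiles yield, mass fraction (assumed)

        def devolatilize(heating_rate, T0=300.0, T_final=1600.0, dT=0.1):
            """Integrate volatiles yield along a constant heating ramp from T0 to T_final."""
            dt = dT / heating_rate           # time spent in each temperature step
            V = 0.0
            for T in np.arange(T0, T_final, dT):
                k = A * np.exp(-E / (R * T))
                V += dt * k * (V_ult - V)    # explicit Euler update
            return V

        for hr in (5e3, 1e5, 1e6):   # heating rates (K/s) spanning the range in the study
            print(f"heating rate {hr:.0e} K/s -> yield at 1600 K: {devolatilize(hr):.3f}")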

  14. [Bone remodeling and modeling/mini-modeling].

    PubMed

    Hasegawa, Tomoka; Amizuka, Norio

    Modeling, which adapts structures to loading by changing bone size and shape, often takes place in bone during the fetal and developmental stages, while bone remodeling, the replacement of old bone with new bone, is predominant in the adult stage. Modeling can be divided into macro-modeling (macroscopic modeling) and mini-modeling (microscopic modeling). In the cellular process of mini-modeling, unlike bone remodeling, bone lining cells, i.e., resting flattened osteoblasts covering bone surfaces, become the active form of osteoblasts and then deposit new bone onto the old bone without mediating osteoclastic bone resorption. Among the drugs for osteoporotic treatment, eldecalcitol (a vitamin D3 analog) and teriparatide (human PTH[1-34]) can show mini-modeling-based bone formation. Histologically, the mature, active form of osteoblasts is localized on the new bone induced by mini-modeling; however, only a few cell layers of preosteoblasts are formed over the newly formed bone, and accordingly, few osteoclasts are present in the region of mini-modeling. In this review, the histological characteristics of bone remodeling and modeling, including mini-modeling, are introduced.

  15. An Introduction to Markov Modeling: Concepts and Uses

    NASA Technical Reports Server (NTRS)

    Boyd, Mark A.; Lau, Sonie (Technical Monitor)

    1998-01-01

    Markov modeling is a modeling technique that is widely useful for dependability analysis of complex fault tolerant systems. It is very flexible in the type of systems and system behavior it can model. It is not, however, the most appropriate modeling technique for every modeling situation. The first task in obtaining a reliability or availability estimate for a system is selecting which modeling technique is most appropriate to the situation at hand. A person performing a dependability analysis must confront the question: is Markov modeling most appropriate to the system under consideration, or should another technique be used instead? The need to answer this gives rise to other more basic questions regarding Markov modeling: what are the capabilities and limitations of Markov modeling as a modeling technique? How does it relate to other modeling techniques? What kind of system behavior can it model? What kinds of software tools are available for performing dependability analyses with Markov modeling techniques? These questions and others will be addressed in this tutorial.
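
    A minimal sketch of the kind of dependability model the tutorial describes: a continuous-time Markov chain for a two-component redundant system with repair, solved for its steady-state availability; the failure and repair rates are illustrative assumptions.

        import numpy as np

        # States: 0 = both components up, 1 = one failed, 2 = both failed (system down).
        lam = 1e-3    # failure rate per component, per hour (assumed)
        mu = 1e-1     # repair rate, per hour (assumed)

        # Generator matrix Q of the continuous-time Markov chain (rows sum to zero).
        Q = np.array([
            [-2 * lam,      2 * lam,   0.0],
            [      mu, -(mu + lam),    lam],
            [     0.0,          mu,    -mu],
        ])

        # Steady-state distribution pi solves pi Q = 0 with sum(pi) = 1.
        A = np.vstack([Q.T, np.ones(3)])
        b = np.array([0.0, 0.0, 0.0, 1.0])
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)

        print("steady-state probabilities:", np.round(pi, 6))
        print(f"availability (system not in state 2): {1.0 - pi[2]:.6f}")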

  16. The cerebro-cerebellum: Could it be loci of forward models?

    PubMed

    Ishikawa, Takahiro; Tomatsu, Saeka; Izawa, Jun; Kakei, Shinji

    2016-03-01

    It is widely accepted that the cerebellum acquires and maintains internal models for motor control. An internal model simulates the mapping between a set of causes and effects. There are two candidate types of cerebellar internal model: forward models and inverse models. A forward model transforms a motor command into a prediction of the sensory consequences of a movement. In contrast, an inverse model inverts the information flow of the forward model. Despite the clearly different formulations of the two internal models, it is still controversial whether the cerebro-cerebellum, the phylogenetically newer part of the cerebellum, provides inverse models or forward models for voluntary limb movements or other higher brain functions. In this article, we review physiological and morphological evidence that suggests the existence in the cerebro-cerebellum of a forward model for limb movement. We also discuss how the characteristic input-output organization of the cerebro-cerebellum may contribute to forward models for non-motor higher brain functions. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  17. Second Generation Crop Yield Models Review

    NASA Technical Reports Server (NTRS)

    Hodges, T. (Principal Investigator)

    1982-01-01

    Second generation yield models, including crop growth simulation models and plant process models, may be suitable for large area crop yield forecasting in the yield model development project. Subjective and objective criteria for model selection are defined and models which might be selected are reviewed. Models may be selected to provide submodels as input to other models; for further development and testing; or for immediate testing as forecasting tools. A plant process model may range in complexity from several dozen submodels simulating (1) energy, carbohydrates, and minerals; (2) change in biomass of various organs; and (3) initiation and development of plant organs, to a few submodels simulating key physiological processes. The most complex models cannot be used directly in large area forecasting but may provide submodels which can be simplified for inclusion into simpler plant process models. Both published and unpublished models which may be used for development or testing are reviewed. Several other models, currently under development, may become available at a later date.

  18. Microphysics in Multi-scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2012-01-01

    Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interactive processes are applied throughout this multi-scale modeling system. The modeling system has been coupled with a multi-satellite simulator so that NASA high-resolution satellite data can be used to identify the strengths and weaknesses of the cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the microphysics development and its performance within the multi-scale modeling system will be presented.

  19. Mechanical model development of rolling bearing-rotor systems: A review

    NASA Astrophysics Data System (ADS)

    Cao, Hongrui; Niu, Linkai; Xi, Songtao; Chen, Xuefeng

    2018-03-01

    The rolling bearing rotor (RBR) system is the kernel of many rotating machines, which affects the performance of the whole machine. Over the past decades, extensive research work has been carried out to investigate the dynamic behavior of RBR systems. However, to the best of the authors' knowledge, no comprehensive review on RBR modelling has been reported yet. To address this gap in the literature, this paper reviews and critically discusses the current progress of mechanical model development of RBR systems, and identifies future trends for research. Firstly, five kinds of rolling bearing models, i.e., the lumped-parameter model, the quasi-static model, the quasi-dynamic model, the dynamic model, and the finite element (FE) model are summarized. Then, the coupled modelling between bearing models and various rotor models including De Laval/Jeffcott rotor, rigid rotor, transfer matrix method (TMM) models and FE models are presented. Finally, the paper discusses the key challenges of previous works and provides new insights into understanding of RBR systems for their advanced future engineering applications.

  20. `Models of' versus `Models for'. Toward an Agent-Based Conception of Modeling in the Science Classroom

    NASA Astrophysics Data System (ADS)

    Gouvea, Julia; Passmore, Cynthia

    2017-03-01

    The inclusion of the practice of "developing and using models" in the Framework for K-12 Science Education and in the Next Generation Science Standards provides an opportunity for educators to examine the role this practice plays in science and how it can be leveraged in a science classroom. Drawing on conceptions of models in the philosophy of science, we bring forward an agent-based account of models and discuss the implications of this view for enacting modeling in science classrooms. Models, according to this account, can only be understood with respect to the aims and intentions of a cognitive agent (models for), not solely in terms of how they represent phenomena in the world (models of). We present this contrast as a heuristic, "models of" versus "models for", that can be used to help educators notice and interpret how models are positioned in standards, curriculum, and classrooms.

  1. Model Hierarchies in Edge-Based Compartmental Modeling for Infectious Disease Spread

    PubMed Central

    Miller, Joel C.; Volz, Erik M.

    2012-01-01

    We consider the family of edge-based compartmental models for epidemic spread developed in [11]. These models allow for a range of complex behaviors, and in particular allow us to explicitly incorporate duration of a contact into our mathematical models. Our focus here is to identify conditions under which simpler models may be substituted for more detailed models, and in so doing we define a hierarchy of epidemic models. In particular we provide conditions under which it is appropriate to use the standard mass action SIR model, and we show what happens when these conditions fail. Using our hierarchy, we provide a procedure leading to the choice of the appropriate model for a given population. Our result about the convergence of models to the Mass Action model gives clear, rigorous conditions under which the Mass Action model is accurate. PMID:22911242
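    The simplest member of the hierarchy referred to in the abstract is the standard mass action SIR model. As a hedged sketch (with placeholder parameter values, not the paper's), it can be written and integrated as follows.

```python
# Sketch of the standard mass-action SIR model mentioned as the simplest member
# of the model hierarchy; beta and gamma are placeholder values, not from the paper.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma = 0.3, 0.1   # transmission and recovery rates (per day)

def sir(t, y):
    S, I, R = y
    return [-beta * S * I, beta * S * I - gamma * I, gamma * I]

sol = solve_ivp(sir, (0, 160), [0.999, 0.001, 0.0], dense_output=True)
t = np.linspace(0, 160, 5)
print(np.round(sol.sol(t).T, 4))   # columns: S, I, R at the sampled times
```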

  2. The evolution of process-based hydrologic models: historical challenges and the collective quest for physical realism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis

    The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. Here, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We also illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.

  3. The evolution of process-based hydrologic models: historical challenges and the collective quest for physical realism

    DOE PAGES

    Clark, Martyn P.; Bierkens, Marc F. P.; Samaniego, Luis; ...

    2017-07-11

    The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. Here, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We also illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.

  4. Modeling of near-wall turbulence

    NASA Technical Reports Server (NTRS)

    Shih, T. H.; Mansour, N. N.

    1990-01-01

    An improved k-epsilon model and a second order closure model are presented for low Reynolds number turbulence near a wall. For the k-epsilon model, a modified form of the eddy viscosity having the correct asymptotic near-wall behavior is suggested, and a model for the pressure diffusion term in the turbulent kinetic energy equation is proposed. For the second order closure model, the existing models for the Reynolds stress equations are modified to have proper near-wall behavior. A dissipation rate equation for the turbulent kinetic energy is also reformulated. The proposed models satisfy realizability and will not produce unphysical behavior. Fully developed channel flows are used for model testing. The calculations are compared with direct numerical simulations. It is shown that the present models, both the k-epsilon model and the second order closure model, perform well in predicting the behavior of the near-wall turbulence. Significant improvements over previous models are obtained.

  5. [Modeling in value-based medicine].

    PubMed

    Neubauer, A S; Hirneiss, C; Kampik, A

    2010-03-01

    Modeling plays an important role in value-based medicine (VBM). It allows decision support by predicting potential clinical and economic consequences, frequently combining different sources of evidence. Based on relevant publications and examples focusing on ophthalmology, the key economic modeling methods are explained and definitions are given. The most frequently applied model types are decision trees, Markov models, and discrete event simulation (DES) models. Model validation includes, besides verifying internal validity, comparison with other models (external validity) and, ideally, validation of a model's predictive properties. The uncertainty inherent in any modeling should be clearly stated. This is true for economic modeling in VBM as well as for the use of disease risk models to support clinical decisions. In economic modeling, uni- and multivariate sensitivity analyses are usually applied; the key concepts here are tornado plots and cost-effectiveness acceptability curves. Given the existing uncertainty, modeling helps to make better informed decisions than would be possible without this additional information.

  6. Sequential Modelling of Building Rooftops by Integrating Airborne LIDAR Data and Optical Imagery: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Sohn, G.; Jung, J.; Jwa, Y.; Armenakis, C.

    2013-05-01

    This paper presents a sequential rooftop modelling method that refines initial rooftop models derived from airborne LiDAR data by integrating them with linear cues retrieved from single imagery. Cue integration between the two datasets is facilitated by creating new topological features connecting the initial model and the image lines, with which new model hypotheses (variants of the initial model) are produced. We adopt the Minimum Description Length (MDL) principle to compare the competing model candidates and select the optimal model, considering the balanced trade-off between model closeness and model complexity. Our preliminary results, obtained with the Vaihingen data provided by ISPRS WG III/4, demonstrate that the image-driven modelling cues can compensate for the limitations posed by LiDAR data in rooftop modelling.

  7. ModelMate - A graphical user interface for model analysis

    USGS Publications Warehouse

    Banta, Edward R.

    2011-01-01

    ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.

  8. [Model-based biofuels system analysis: a review].

    PubMed

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  9. An Immuno-epidemiological Model of Paratuberculosis

    NASA Astrophysics Data System (ADS)

    Martcheva, M.

    2011-11-01

    The primary objective of this article is to introduce an immuno-epidemiological model of paratuberculosis (Johne's disease). To develop the immuno-epidemiological model, we first develop an immunological model and an epidemiological model. Then, we link the two models through time-since-infection structure and parameters of the epidemiological model. We use the nested approach to compose the immuno-epidemiological model. Our immunological model captures the switch between the T-cell immune response and the antibody response in Johne's disease. The epidemiological model is a time-since-infection model and captures the variability of transmission rate and the vertical transmission of the disease. We compute the immune-response-dependent epidemiological reproduction number. Our immuno-epidemiological model can be used for investigation of the impact of the immune response on the epidemiology of Johne's disease.

  10. Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration

    NASA Technical Reports Server (NTRS)

    Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.

    1993-01-01

    Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions is examined.

  11. FacetModeller: Software for manual creation, manipulation and analysis of 3D surface-based models

    NASA Astrophysics Data System (ADS)

    Lelièvre, Peter G.; Carter-McAuslan, Angela E.; Dunham, Michael W.; Jones, Drew J.; Nalepa, Mariella; Squires, Chelsea L.; Tycholiz, Cassandra J.; Vallée, Marc A.; Farquharson, Colin G.

    2018-01-01

    The creation of 3D models is commonplace in many disciplines. Models are often built from a collection of tessellated surfaces. To apply numerical methods to such models it is often necessary to generate a mesh of space-filling elements that conforms to the model surfaces. While there are meshing algorithms that can do so, they place restrictive requirements on the surface-based models that are rarely met by existing 3D model building software. Hence, we have developed a Java application named FacetModeller, designed for efficient manual creation, modification and analysis of 3D surface-based models destined for use in numerical modelling.

  12. ModelTest Server: a web-based tool for the statistical selection of models of nucleotide substitution online

    PubMed Central

    Posada, David

    2006-01-01

    ModelTest server is a web-based application for the selection of models of nucleotide substitution using the program ModelTest. The server takes as input a text file with likelihood scores for the set of candidate models. Models can be selected with hierarchical likelihood ratio tests, or with the Akaike or Bayesian information criteria. The output includes several statistics for the assessment of model selection uncertainty, for model averaging or to estimate the relative importance of model parameters. The server can be accessed at . PMID:16845102
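    The ranking ModelTest performs with the Akaike and Bayesian information criteria can be illustrated with a short, hedged sketch. The log-likelihoods, parameter counts, and model names below are made up for illustration, and the snippet does not reproduce ModelTest's input or output format.

```python
# Sketch of information-criterion model ranking of the kind ModelTest performs.
# The log-likelihoods and parameter counts below are made up for illustration.
import math

n_sites = 1000                      # alignment length (sample size for BIC)
candidates = {                      # model: (log-likelihood, free parameters)
    "JC69":  (-4000.0, 0),
    "HKY85": (-3900.0, 4),
    "GTR":   (-3890.0, 8),
    "GTR+G": (-3850.0, 9),
}

def aic(lnL, k):
    return -2.0 * lnL + 2.0 * k

def bic(lnL, k, n):
    return -2.0 * lnL + k * math.log(n)

# rank candidate substitution models by AIC, reporting BIC alongside
for name, (lnL, k) in sorted(candidates.items(), key=lambda kv: aic(*kv[1])):
    print(f"{name:7s}  AIC={aic(lnL, k):9.1f}  BIC={bic(lnL, k, n_sites):9.1f}")
```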

  13. Application of surface complexation models to anion adsorption by natural materials

    USDA-ARS?s Scientific Manuscript database

    Various chemical models of ion adsorption will be presented and discussed. Chemical models, such as surface complexation models, provide a molecular description of anion adsorption reactions using an equilibrium approach. Two such models, the constant capacitance model and the triple layer model w...

  14. Space Environments and Effects: Trapped Proton Model

    NASA Technical Reports Server (NTRS)

    Huston, S. L.; Kauffman, W. (Technical Monitor)

    2002-01-01

    An improved model of the Earth's trapped proton environment has been developed. This model, designated Trapped Proton Model version 1 (TPM-1), determines the omnidirectional flux of protons with energy between 1 and 100 MeV throughout near-Earth space. The model also incorporates a true solar cycle dependence. The model consists of several data files and computer software to read them. There are three versions of the model: a FORTRAN-callable library, a stand-alone model, and a Web-based model.

  15. The NASA Marshall engineering thermosphere model

    NASA Technical Reports Server (NTRS)

    Hickey, Michael Philip

    1988-01-01

    Described is the NASA Marshall Engineering Thermosphere (MET) Model, which is a modified version of the MSFC/J70 Orbital Atmospheric Density Model as currently used in the J70MM program at MSFC. The modifications to the MSFC/J70 model required for the MET model are described; graphical and numerical examples of the models are included, as is a listing of the MET model computer program. Major differences between the numerical output of the MET model and the MSFC/J70 model are discussed.

  16. Wind turbine model and loop shaping controller design

    NASA Astrophysics Data System (ADS)

    Gilev, Bogdan

    2017-12-01

    A model of a wind turbine is developed, consisting of a wind speed model, a mechanical and electrical model of the generator, and a tower oscillation model. The model of the whole system is linearized around a nominal operating point. Using the linear model with uncertainties, an uncertain model is synthesized. From the uncertain model, an H∞ controller is developed that stabilizes the rotor frequency and damps the tower oscillations. Finally, operation of the nonlinear system with the H∞ controller is simulated.

  17. Simulated Students and Classroom Use of Model-Based Intelligent Tutoring

    NASA Technical Reports Server (NTRS)

    Koedinger, Kenneth R.

    2008-01-01

    Two educational uses of models and simulations: 1) students create models and use simulations; and 2) researchers create models of learners to guide the development of reliably effective materials. Cognitive tutors simulate and support tutoring; data are crucial to create an effective model. Pittsburgh Science of Learning Center: resources for modeling, authoring, and experimentation; a repository of data and theory. Examples of advanced modeling efforts: SimStudent learns a rule-based model; a help-seeking model tutors metacognition; Scooter uses machine-learning detectors of student engagement.

  18. Modeling for Battery Prognostics

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan S.; Goebel, Kai; Khasin, Michael; Hogge, Edward; Quach, Patrick

    2017-01-01

    For any battery-powered vehicle (be it an unmanned aerial vehicle, a small passenger aircraft, or an asset in exoplanetary operations) to operate at maximum efficiency and reliability, it is critical to monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL). To fulfil these needs, it is important to capture the battery's inherent characteristics as well as operational knowledge in the form of models that can be used by monitoring, diagnostic, and prognostic algorithms. Several battery modeling methodologies have been developed in the last few years as the understanding of the underlying electrochemical mechanisms has advanced. The models can generally be classified as empirical models, electrochemical engineering models, multi-physics models, and molecular/atomistic models. Empirical models are based on fitting certain functions to past experimental data, without making use of any physicochemical principles; electrical equivalent-circuit models are an example of such empirical models. Electrochemical engineering models are typically continuum models that include electrochemical kinetics and transport phenomena. Each model type has its advantages and disadvantages. The former has the advantage of being computationally efficient, but has limited accuracy and robustness due to the approximations used in the developed model, and, as a result of such approximations, cannot represent aging well. The latter has the advantage of being very accurate, but is often computationally inefficient, having to solve complex sets of partial differential equations, and is thus not well suited for online prognostic applications. In addition, both multi-physics and atomistic models are computationally expensive and hence even less suited to online application. An electrochemistry-based model of Li-ion batteries has been developed that captures crucial electrochemical processes, captures the effects of aging, is computationally efficient, and is of suitable accuracy for reliable EOD prediction in a variety of operational profiles. The model can be considered an electrochemical engineering model, but unlike most such models found in the literature, certain approximations are made that retain computational efficiency for online implementation. Although the focus here is on Li-ion batteries, the model is quite general and can be applied to different chemistries through a change of model parameter values. Progress on model development is presented, including model validation results and EOD prediction results.
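    For orientation, the sketch below shows the kind of simple empirical equivalent-circuit discharge model the abstract contrasts with electrochemistry-based models: an assumed open-circuit-voltage curve minus an ohmic drop, stepped forward to find EOD at a voltage cutoff. All parameter values are placeholders, and this is not the authors' model.

```python
# Illustrative sketch of a simple empirical equivalent-circuit discharge model
# (terminal voltage = OCV(SOC) - I*R0), the kind of model the abstract contrasts
# with electrochemistry-based models. All parameters are placeholder values.
import numpy as np

capacity_As = 2.0 * 3600.0        # 2 Ah cell, in ampere-seconds
R0 = 0.05                         # internal resistance, ohms
v_cutoff = 3.0                    # EOD threshold, volts
current = 2.0                     # constant discharge current, amperes
dt = 1.0                          # time step, seconds

def ocv(soc):
    # crude placeholder open-circuit-voltage curve vs. state of charge
    return 3.0 + 1.2 * soc - 0.1 * np.exp(-20.0 * soc)

soc, t = 1.0, 0.0
while soc > 0.0:
    v = ocv(soc) - current * R0
    if v <= v_cutoff:
        print(f"predicted EOD at t = {t/60:.1f} min (SOC = {soc:.2f})")
        break
    soc -= current * dt / capacity_As   # coulomb counting
    t += dt
else:
    print("cutoff not reached before full depletion")
```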

  19. Comparison of modeling methods to predict the spatial distribution of deep-sea coral and sponge in the Gulf of Alaska

    NASA Astrophysics Data System (ADS)

    Rooper, Christopher N.; Zimmermann, Mark; Prescott, Megan M.

    2017-08-01

    Deep-sea coral and sponge ecosystems are widespread throughout most of Alaska's marine waters and are associated with many different species of fishes and invertebrates. These ecosystems are vulnerable to the effects of commercial fishing activities and climate change. We compared four commonly used species distribution models (general linear models, generalized additive models, boosted regression trees and random forest models) and an ensemble model to predict the presence or absence and abundance of six groups of benthic invertebrate taxa in the Gulf of Alaska. All four model types performed adequately on training data for predicting presence and absence, with random forest models having the best overall performance measured by the area under the receiver-operating-curve (AUC). The models also performed well on the test data for presence and absence, with average AUCs ranging from 0.66 to 0.82. For the test data, ensemble models performed the best. For abundance data, there was an obvious demarcation in performance between the two regression-based methods (general linear models and generalized additive models) and the tree-based models. The boosted regression tree and random forest models out-performed the other models by a wide margin on both the training and testing data. However, there was a significant drop-off in performance for all models of invertebrate abundance ( 50%) when moving from the training data to the testing data. Ensemble model performance was between the tree-based and regression-based methods. The maps of predictions from the models for both presence and abundance agreed very well across model types, with an increase in variability in predictions for the abundance data. We conclude that where data conform well to the modeled distribution (such as the presence-absence data and binomial distribution in this study), the four types of models will provide similar results, although the regression-type models may be more consistent with biological theory. For data with highly zero-inflated and non-normal distributions, such as the abundance data from this study, the tree-based methods performed better. Ensemble models that averaged predictions across the four model types performed better than the GLM or GAM models but slightly poorer than the tree-based methods, suggesting ensemble models might be more robust to overfitting than tree methods, while mitigating some of the disadvantages in predictive performance of regression methods.
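    A hedged sketch of this kind of presence/absence model comparison is shown below, using synthetic data and scikit-learn. A GLM is represented by logistic regression, GAMs are omitted, and the actual Gulf of Alaska survey data are not reproduced; the point is only the workflow of fitting several model types, scoring each by AUC, and averaging predictions as a simple ensemble.

```python
# Sketch of a presence/absence model comparison on synthetic data with scikit-learn.
# Logistic regression stands in for the GLM; GAMs and the survey data are omitted.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=8, n_informative=4,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "GLM (logistic)": LogisticRegression(max_iter=1000),
    "Boosted trees":  GradientBoostingClassifier(random_state=0),
    "Random forest":  RandomForestClassifier(n_estimators=300, random_state=0),
}
preds = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    preds[name] = model.predict_proba(X_te)[:, 1]
    print(f"{name:15s} test AUC = {roc_auc_score(y_te, preds[name]):.3f}")

# simple ensemble: average the predicted probabilities across the fitted models
ens = np.mean(list(preds.values()), axis=0)
print(f"{'Ensemble':15s} test AUC = {roc_auc_score(y_te, ens):.3f}")
```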

  20. A toy terrestrial carbon flow model

    NASA Technical Reports Server (NTRS)

    Parton, William J.; Running, Steven W.; Walker, Brian

    1992-01-01

    A generalized carbon flow model for the major terrestrial ecosystems of the world is reported. The model is a simplification of the Century model and the Forest-Biogeochemical model. Topics covered include plant production, decomposition and nutrient cycling, biomes, the utility of the carbon flow model for predicting carbon dynamics under global change, and possible applications to state-and-transition models and environmentally driven global vegetation models.

  1. BioModels Database: An enhanced, curated and annotated resource for published quantitative kinetic models

    PubMed Central

    2010-01-01

    Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely-accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. PMID:20587024

  2. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P. Dixon

    The purpose of this Model Report (REV02) is to document the unsaturated zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrological-chemical (THC) processes on UZ flow and transport. This Model Report has been developed in accordance with the ''Technical Work Plan for: Performance Assessment Unsaturated Zone'' (Bechtel SAIC Company, LLC (BSC) 2002 [160819]). The technical work plan (TWP) describes planning information pertaining to the technical scope, content, and management of this Model Report in Section 1.12, Work Package AUZM08, ''Coupled Effects on Flow and Seepage''. The plan for validation of the models documented in this Model Report is given in Attachment I, Model Validation Plans, Section I-3-4, of the TWP. Except for variations in acceptance criteria (Section 4.2), there were no deviations from this TWP. This report was developed in accordance with AP-SIII.10Q, ''Models''. This Model Report documents the THC Seepage Model and the Drift Scale Test (DST) THC Model. The THC Seepage Model is a drift-scale process model for predicting the composition of gas and water that could enter waste emplacement drifts and the effects of mineral alteration on flow in rocks surrounding drifts. The DST THC Model is a drift-scale process model relying on the same conceptual model and much of the same input data (i.e., physical, hydrological, thermodynamic, and kinetic) as the THC Seepage Model. The DST THC Model is the primary method for validating the THC Seepage Model. The DST THC Model compares predicted water and gas compositions, as well as mineral alteration patterns, with observed data from the DST. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal-loading conditions, and predict the evolution of mineral alteration and fluid chemistry around potential waste emplacement drifts. The DST THC Model is used solely for the validation of the THC Seepage Model and is not used for calibration to measured data.

  3. Review: To be or not to be an identifiable model. Is this a relevant question in animal science modelling?

    PubMed

    Muñoz-Tamayo, R; Puillet, L; Daniel, J B; Sauvant, D; Martin, O; Taghipoor, M; Blavy, P

    2018-04-01

    What is a good (useful) mathematical model in animal science? For models constructed for prediction purposes, the question of model adequacy (usefulness) has been traditionally tackled by statistical analysis applied to observed experimental data relative to model-predicted variables. However, little attention has been paid to analytic tools that exploit the mathematical properties of the model equations. For example, in the context of model calibration, before attempting a numerical estimation of the model parameters, we might want to know if we have any chance of success in estimating a unique best value of the model parameters from available measurements. This question of uniqueness is referred to as structural identifiability; a mathematical property that is defined on the sole basis of the model structure within a hypothetical ideal experiment determined by a setting of model inputs (stimuli) and observable variables (measurements). Structural identifiability analysis applied to dynamic models described by ordinary differential equations (ODEs) is a common practice in control engineering and system identification. This analysis demands mathematical technicalities that are beyond the academic background of animal science, which might explain the lack of pervasiveness of identifiability analysis in animal science modelling. To fill this gap, in this paper we address the analysis of structural identifiability from a practitioner perspective by capitalizing on the use of dedicated software tools. Our objectives are (i) to provide a comprehensive explanation of the structural identifiability notion for the community of animal science modelling, (ii) to assess the relevance of identifiability analysis in animal science modelling and (iii) to motivate the community to use identifiability analysis in the modelling practice (when the identifiability question is relevant). We focus our study on ODE models. By using illustrative examples that include published mathematical models describing lactation in cattle, we show how structural identifiability analysis can contribute to advancing mathematical modelling in animal science towards the production of useful models and, moreover, highly informative experiments via optimal experiment design. Rather than attempting to impose a systematic identifiability analysis to the modelling community during model developments, we wish to open a window towards the discovery of a powerful tool for model construction and experiment design.
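    A minimal textbook-style illustration of the structural identifiability question (not one of the lactation models analysed in the paper) is sketched below: when only the output y is observed, two rate constants that enter only through their sum cannot be identified individually.

```latex
% Minimal illustration (not one of the paper's models): with only y observed,
% k_1 and k_2 enter the output solely through their sum, so neither parameter
% is structurally identifiable on its own.
\begin{aligned}
  \dot{x}(t) &= -(k_1 + k_2)\, x(t), \qquad x(0) = x_0,\\
  y(t) &= x(t) = x_0\, e^{-(k_1 + k_2)\, t}.
\end{aligned}
% Any pair (k_1, k_2) with the same sum produces an identical output, whereas the
% reparameterised model with \theta = k_1 + k_2 is structurally identifiable.
```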

  4. Ecosystem Model Skill Assessment. Yes We Can!

    PubMed Central

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S.

    2016-01-01

    Need to assess the skill of ecosystem models: Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for just a limited set of some biophysical models. A range of skill assessment methods have been reviewed, but skill assessment of full marine ecosystem models has not yet been attempted. Northeast US Atlantis marine ecosystem model: We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation) and a suite of time series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. Skill assessment is both possible and advisable: We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development. We show that it is not only possible to assess the skill of a complicated marine ecosystem model, but also necessary to do so to instill confidence in model results and encourage their use for strategic management. Our methods are applicable to any type of predictive model, and should be considered for use in fields outside ecology (e.g. economics, climate change, and risk assessment). PMID:26731540
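    The four skill metrics named in the abstract can be computed in a few lines; the sketch below uses a small synthetic forecast/observation pair rather than the NEUS Atlantis output.

```python
# Sketch of the four skill metrics named in the abstract, computed for a
# synthetic forecast/observation pair (not the NEUS Atlantis output).
import numpy as np
from scipy.stats import spearmanr

obs  = np.array([10.0, 12.0,  9.0, 14.0, 11.0, 13.0, 15.0, 12.5])
fcst = np.array([11.0, 11.5, 10.0, 13.0, 12.0, 12.0, 14.0, 13.0])

aae  = np.mean(np.abs(fcst - obs))                  # average absolute error
rmse = np.sqrt(np.mean((fcst - obs) ** 2))          # root mean squared error
mef  = 1.0 - np.sum((fcst - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)  # modeling efficiency
rho, _ = spearmanr(obs, fcst)                       # Spearman rank correlation

print(f"AAE={aae:.2f}  RMSE={rmse:.2f}  MEF={mef:.2f}  rho={rho:.2f}")
```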

  5. Challenges and opportunities for integrating lake ecosystem modelling approaches

    USGS Publications Warehouse

    Mooij, Wolf M.; Trolle, Dennis; Jeppesen, Erik; Arhonditsis, George; Belolipetsky, Pavel V.; Chitamwebwa, Deonatus B.R.; Degermendzhy, Andrey G.; DeAngelis, Donald L.; Domis, Lisette N. De Senerpont; Downing, Andrea S.; Elliott, J. Alex; Ruberto, Carlos Ruberto; Gaedke, Ursula; Genova, Svetlana N.; Gulati, Ramesh D.; Hakanson, Lars; Hamilton, David P.; Hipsey, Matthew R.; Hoen, Jochem 't; Hulsmann, Stephan; Los, F. Hans; Makler-Pick, Vardit; Petzoldt, Thomas; Prokopkin, Igor G.; Rinke, Karsten; Schep, Sebastiaan A.; Tominaga, Koji; Van Dam, Anne A.; Van Nes, Egbert H.; Wells, Scott A.; Janse, Jan H.

    2010-01-01

    A large number and wide variety of lake ecosystem models have been developed and published during the past four decades. We identify two challenges for making further progress in this field. One such challenge is to avoid developing more models largely following the concept of others ('reinventing the wheel'). The other challenge is to avoid focusing on only one type of model, while ignoring new and diverse approaches that have become available ('having tunnel vision'). In this paper, we aim at improving the awareness of existing models and knowledge of concurrent approaches in lake ecosystem modelling, without covering all possible model tools and avenues. First, we present a broad variety of modelling approaches. To illustrate these approaches, we give brief descriptions of rather arbitrarily selected sets of specific models. We deal with static models (steady state and regression models), complex dynamic models (CAEDYM, CE-QUAL-W2, Delft 3D-ECO, LakeMab, LakeWeb, MyLake, PCLake, PROTECH, SALMO), structurally dynamic models and minimal dynamic models. We also discuss a group of approaches that could all be classified as individual based: super-individual models (Piscator, Charisma), physiologically structured models, stage-structured models and trait-based models. We briefly mention genetic algorithms, neural networks, Kalman filters and fuzzy logic. Thereafter, we zoom in, as an in-depth example, on the multi-decadal development and application of the lake ecosystem model PCLake and related models (PCLake Metamodel, Lake Shira Model, IPH-TRIM3D-PCLake). In the discussion, we argue that while the historical development of each approach and model is understandable given its 'leading principle', there are many opportunities for combining approaches. We take the point of view that a single 'right' approach does not exist and should not be strived for. Instead, multiple modelling approaches, applied concurrently to a given problem, can help develop an integrative view on the functioning of lake ecosystems. We end with a set of specific recommendations that may be of help in the further development of lake ecosystem models.

  6. Combination of Alternative Models by Mutual Data Assimilation: Supermodeling With A Suite of Primitive Equation Models

    NASA Astrophysics Data System (ADS)

    Duane, G. S.; Selten, F.

    2016-12-01

    Different models of climate and weather commonly give projections/predictions that differ widely in their details. While averaging of model outputs almost always improves results, nonlinearity implies that further improvement can be obtained from model interaction in run time, as has already been demonstrated with toy systems of ODEs and idealized quasigeostrophic models. In the supermodeling scheme, models effectively assimilate data from one another and partially synchronize with one another. Spread among models is manifest as a spread in possible inter-model connection coefficients, so that the models effectively "agree to disagree". Here, we construct a supermodel formed from variants of the SPEEDO model, a primitive-equation atmospheric model (SPEEDY) coupled to ocean and land. A suite of atmospheric models, coupled to the same ocean and land, is chosen to represent typical differences among climate models by varying model parameters. Connections are introduced between all pairs of corresponding independent variables at synoptic-scale intervals. Strengths of the inter-atmospheric connections can be considered to represent inverse inter-model observation error. Connection strengths are adapted based on an established procedure that extends the dynamical equations of a pair of synchronizing systems to synchronize parameters as well. The procedure is applied to synchronize the suite of SPEEDO models with another SPEEDO model regarded as "truth", adapting the inter-model connections along the way. The supermodel with trained connections gives marginally lower error in all fields than any weighted combination of the separate model outputs when used in "weather-prediction mode", i.e. with constant nudging to truth. Stronger results are obtained if a supermodel is used to predict the formation of coherent structures or the frequency of such. Partially synchronized SPEEDO models give a better representation of the blocked-zonal index cycle than does a weighted average of the constituent model outputs. We have thus shown that supermodeling and the synchronization-based procedure to adapt inter-model connections give results superior to output averaging not only with highly nonlinear toy systems, but with smaller nonlinearities as occur in climate models.
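    The supermodeling idea is easiest to see in the "toy systems of ODEs" the abstract alludes to. The sketch below couples two imperfect Lorenz-63 models to each other through diffusive connection terms; the connection coefficient is fixed here rather than trained by the adaptive, synchronization-based procedure described in the abstract, and the parameter values are arbitrary.

```python
# Toy illustration of supermodeling: two imperfect Lorenz-63 models nudged toward
# each other through connection terms C*(s_j - s_i). C is fixed here, not trained
# as in the paper, and the mismatched parameters are arbitrary.
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(state, sigma, rho, beta):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def supermodel(t, s, C):
    s1, s2 = s[:3], s[3:]
    d1 = lorenz(s1, sigma=9.0,  rho=29.0, beta=8/3) + C * (s2 - s1)
    d2 = lorenz(s2, sigma=11.0, rho=27.0, beta=8/3) + C * (s1 - s2)
    return np.concatenate([d1, d2])

s0 = np.array([1.0, 1.0, 1.0, 1.1, 0.9, 1.0])
sol = solve_ivp(supermodel, (0.0, 20.0), s0, args=(2.0,), max_step=0.01)
spread = np.abs(sol.y[:3, -1] - sol.y[3:, -1]).max()
print(f"max inter-model spread at t=20: {spread:.3e}")  # bounded spread indicates partial synchronization
```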

  7. [Analyzing and modeling methods of near infrared spectroscopy for in-situ prediction of oil yield from oil shale].

    PubMed

    Liu, Jie; Zhang, Fu-Dong; Teng, Fei; Li, Jun; Wang, Zhi-Hong

    2014-10-01

    In order to detect the oil yield of oil shale in situ, based on portable near infrared spectroscopy analytical technology, the modeling and analysis methods for in-situ detection were studied with 66 rock core samples from well No. 2 of the Fuyu oil shale base in Jilin. With the developed portable spectrometer, spectra in 3 data formats (reflectance, absorbance and K-M function) were acquired. With 4 different modeling data optimization methods: principal component analysis-Mahalanobis distance (PCA-MD) for eliminating abnormal samples, uninformative variables elimination (UVE) for wavelength selection, and their combinations PCA-MD + UVE and UVE + PCA-MD; 2 modeling methods: partial least squares (PLS) and back propagation artificial neural network (BPANN); and the same data pre-processing, modeling and analysis experiments were performed to determine the optimum analysis model and method. The results show that the data format, the modeling data optimization method and the modeling method all affect the analysis precision of the model. Whether or not an optimization method is used, reflectance or the K-M function is the proper spectrum format for the modeling database for both modeling methods. Using the two modeling methods and the four data optimization methods, the model precisions of the same modeling database differ. For the PLS modeling method, the PCA-MD and UVE + PCA-MD data optimization methods can improve the modeling precision of the database using the K-M function spectrum data format. For the BPANN modeling method, the UVE, UVE + PCA-MD and PCA-MD + UVE data optimization methods can improve the modeling precision of the database using any of the 3 spectrum data formats. Except when using the reflectance spectra and the PCA-MD data optimization method, modeling precision by the BPANN method is better than that by the PLS method. Modeling with reflectance spectra, the UVE optimization method and the BPANN modeling method gives the highest analysis precision: its correlation coefficient (Rp) is 0.92, and its standard error of prediction (SEP) is 0.69%.
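    The PLS calibration step described in the abstract can be sketched as follows, using synthetic "spectra" and scikit-learn; the UVE and PCA-MD optimization steps and the BPANN model are not reproduced here, and the SEP/Rp values printed are illustrative only.

```python
# Sketch of a PLS calibration of oil yield on synthetic NIR-like spectra.
# The UVE / PCA-MD variable selection and the BPANN model are not reproduced.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 66, 200
X = rng.normal(size=(n_samples, n_wavelengths)).cumsum(axis=1)  # smooth fake spectra
true_coef = rng.normal(size=n_wavelengths) * 0.01
y = X @ true_coef + rng.normal(scale=0.3, size=n_samples)        # "oil yield", %

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=6).fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()

sep = np.sqrt(np.mean((y_hat - y_te) ** 2))      # standard error of prediction
rp = np.corrcoef(y_te, y_hat)[0, 1]              # correlation coefficient Rp
print(f"SEP = {sep:.2f} %,  Rp = {rp:.2f}")
```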

  8. Bayesian multimodel inference of soil microbial respiration models: Theory, application and future prospective

    NASA Astrophysics Data System (ADS)

    Elshall, A. S.; Ye, M.; Niu, G. Y.; Barron-Gafford, G.

    2015-12-01

    Models in biogeoscience involve uncertainties in observation data, model inputs, model structure, model processes and modeling scenarios. To accommodate different sources of uncertainty, multimodel analyses such as model combination, model selection, model elimination or model discrimination are becoming more popular. To illustrate the theoretical and practical challenges of multimodel analysis, we use an example from microbial soil respiration modeling. Global soil respiration releases more than ten times more carbon dioxide to the atmosphere than all anthropogenic emissions. Thus, improving our understanding of microbial soil respiration is essential for improving climate change models. This study focuses on a poorly understood phenomenon, soil microbial respiration pulses in response to episodic rainfall pulses (the "Birch effect"). We hypothesize that the "Birch effect" is generated by three mechanisms. To test our hypothesis, we developed and assessed five evolving microbial-enzyme models against field measurements from a semiarid savannah that is characterized by pulsed precipitation. The five models evolve stepwise, such that the first model includes none of the three mechanisms, while the fifth model includes all three. The basic component of Bayesian multimodel analysis is the estimation of the marginal likelihood to rank the candidate models based on their overall likelihood with respect to the observation data. The first part of the study focuses on using this Bayesian scheme to discriminate between the five candidate models. The second part discusses some theoretical and practical challenges, mainly the effects of likelihood function selection and of the marginal likelihood estimation method on both model ranking and Bayesian model averaging. The study shows that making valid inference from scientific data is not a trivial task, since we are uncertain not only about the candidate scientific models, but also about the statistical methods used to discriminate between them.

  9. Ecosystem Model Skill Assessment. Yes We Can!

    PubMed

    Olsen, Erik; Fay, Gavin; Gaichas, Sarah; Gamble, Robert; Lucey, Sean; Link, Jason S

    2016-01-01

    Accelerated changes to global ecosystems call for holistic and integrated analyses of past, present and future states under various pressures to adequately understand current and projected future system states. Ecosystem models can inform management of human activities in a complex and changing environment, but are these models reliable? Ensuring that models are reliable for addressing management questions requires evaluating their skill in representing real-world processes and dynamics. Skill has been evaluated for just a limited set of some biophysical models. A range of skill assessment methods have been reviewed, but skill assessment of full marine ecosystem models has not yet been attempted. We assessed the skill of the Northeast U.S. (NEUS) Atlantis marine ecosystem model by comparing 10-year model forecasts with observed data. Model forecast performance was compared to that obtained from a 40-year hindcast. Multiple metrics (average absolute error, root mean squared error, modeling efficiency, and Spearman rank correlation) and a suite of time series (species biomass, fisheries landings, and ecosystem indicators) were used to adequately measure model skill. Overall, the NEUS model performed above average and thus better than expected for the key species that had been the focus of the model tuning. Model forecast skill was comparable to the hindcast skill, showing that model performance does not degenerate in a 10-year forecast mode, an important characteristic for an end-to-end ecosystem model to be useful for strategic management purposes. We identify best-practice approaches for end-to-end ecosystem model skill assessment that would improve both operational use of other ecosystem models and future model development. We show that it is not only possible to assess the skill of a complicated marine ecosystem model, but also necessary to do so to instill confidence in model results and encourage their use for strategic management. Our methods are applicable to any type of predictive model, and should be considered for use in fields outside ecology (e.g. economics, climate change, and risk assessment).

  10. iMarNet: an ocean biogeochemistry model inter-comparison project within a common physical ocean modelling framework

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, L.; Yool, A.; Allen, J. I.; Anderson, T. R.; Barciela, R.; Buitenhuis, E. T.; Butenschön, M.; Enright, C.; Halloran, P. R.; Le Quéré, C.; de Mora, L.; Racault, M.-F.; Sinha, B.; Totterdell, I. J.; Cox, P. M.

    2014-07-01

    Ocean biogeochemistry (OBGC) models span a wide range of complexities from highly simplified, nutrient-restoring schemes, through nutrient-phytoplankton-zooplankton-detritus (NPZD) models that crudely represent the marine biota, through to models that represent a broader trophic structure by grouping organisms as plankton functional types (PFT) based on their biogeochemical role (Dynamic Green Ocean Models; DGOM) and ecosystem models which group organisms by ecological function and trait. OBGC models are now integral components of Earth System Models (ESMs), but they compete for computing resources with higher resolution dynamical setups and with other components such as atmospheric chemistry and terrestrial vegetation schemes. As such, the choice of OBGC in ESMs needs to balance model complexity and realism alongside relative computing cost. Here, we present an inter-comparison of six OBGC models that were candidates for implementation within the next UK Earth System Model (UKESM1). The models cover a large range of biological complexity (from 7 to 57 tracers) but all include representations of at least the nitrogen, carbon, alkalinity and oxygen cycles. Each OBGC model was coupled to the Nucleus for the European Modelling of the Ocean (NEMO) ocean general circulation model (GCM), and results from physically identical hindcast simulations were compared. Model skill was evaluated for biogeochemical metrics of global-scale bulk properties using conventional statistical techniques. The computing cost of each model was also measured in standardised tests run at two resource levels. No model is shown to consistently outperform or underperform all other models across all metrics. Nonetheless, the simpler models that are easier to tune are broadly closer to observations across a number of fields, and thus offer a high-efficiency option for ESMs that prioritise high resolution climate dynamics. However, simpler models provide limited insight into more complex marine biogeochemical processes and ecosystem pathways, and a parallel approach of low resolution climate dynamics and high complexity biogeochemistry is desirable in order to provide additional insights into biogeochemistry-climate interactions.

  11. iMarNet: an ocean biogeochemistry model intercomparison project within a common physical ocean modelling framework

    NASA Astrophysics Data System (ADS)

    Kwiatkowski, L.; Yool, A.; Allen, J. I.; Anderson, T. R.; Barciela, R.; Buitenhuis, E. T.; Butenschön, M.; Enright, C.; Halloran, P. R.; Le Quéré, C.; de Mora, L.; Racault, M.-F.; Sinha, B.; Totterdell, I. J.; Cox, P. M.

    2014-12-01

    Ocean biogeochemistry (OBGC) models span a wide variety of complexities, including highly simplified nutrient-restoring schemes, nutrient-phytoplankton-zooplankton-detritus (NPZD) models that crudely represent the marine biota, models that represent a broader trophic structure by grouping organisms as plankton functional types (PFTs) based on their biogeochemical role (dynamic green ocean models) and ecosystem models that group organisms by ecological function and trait. OBGC models are now integral components of Earth system models (ESMs), but they compete for computing resources with higher resolution dynamical setups and with other components such as atmospheric chemistry and terrestrial vegetation schemes. As such, the choice of OBGC in ESMs needs to balance model complexity and realism alongside relative computing cost. Here we present an intercomparison of six OBGC models that were candidates for implementation within the next UK Earth system model (UKESM1). The models cover a large range of biological complexity (from 7 to 57 tracers) but all include representations of at least the nitrogen, carbon, alkalinity and oxygen cycles. Each OBGC model was coupled to the ocean general circulation model Nucleus for European Modelling of the Ocean (NEMO) and results from physically identical hindcast simulations were compared. Model skill was evaluated for biogeochemical metrics of global-scale bulk properties using conventional statistical techniques. The computing cost of each model was also measured in standardised tests run at two resource levels. No model is shown to consistently outperform all other models across all metrics. Nonetheless, the simpler models are broadly closer to observations across a number of fields and thus offer a high-efficiency option for ESMs that prioritise high-resolution climate dynamics. However, simpler models provide limited insight into more complex marine biogeochemical processes and ecosystem pathways, and a parallel approach of low-resolution climate dynamics and high-complexity biogeochemistry is desirable in order to provide additional insights into biogeochemistry-climate interactions.

  12. Design of Soil Salinity Policies with Tinamit, a Flexible and Rapid Tool to Couple Stakeholder-Built System Dynamics Models with Physically-Based Models

    NASA Astrophysics Data System (ADS)

    Malard, J. J.; Baig, A. I.; Hassanzadeh, E.; Adamowski, J. F.; Tuy, H.; Melgar-Quiñonez, H.

    2016-12-01

    Model coupling is a crucial step to constructing many environmental models, as it allows for the integration of independently-built models representing different system sub-components to simulate the entire system. Model coupling has been of particular interest in combining socioeconomic System Dynamics (SD) models, whose visual interface facilitates their direct use by stakeholders, with more complex physically-based models of the environmental system. However, model coupling processes are often cumbersome and inflexible and require extensive programming knowledge, limiting their potential for continued use by stakeholders in policy design and analysis after the end of the project. Here, we present Tinamit, a flexible Python-based model-coupling software tool whose easy-to-use API and graphical user interface make the coupling of stakeholder-built SD models with physically-based models rapid, flexible and simple for users with limited to no coding knowledge. The flexibility of the system allows end users to modify the SD model as well as the linking variables between the two models themselves with no need for recoding. We use Tinamit to couple a stakeholder-built socioeconomic model of soil salinization in Pakistan with the physically-based soil salinity model SAHYSMOD. As climate extremes increase in the region, policies to slow or reverse soil salinity buildup are increasing in urgency and must take both socioeconomic and biophysical spheres into account. We use the Tinamit-coupled model to test the impact of integrated policy options (economic and regulatory incentives to farmers) on soil salinity in the region in the face of future climate change scenarios. Use of the Tinamit model allowed for rapid and flexible coupling of the two models, allowing the end user to continue making model structure and policy changes. In addition, the clear interface (in contrast to most model coupling code) makes the final coupled model easily accessible to stakeholders with limited technical background.

  13. Bayesian Model Selection under Time Constraints

    NASA Astrophysics Data System (ADS)

    Hoege, M.; Nowak, W.; Illman, W. A.

    2017-12-01

    Bayesian model selection (BMS) provides a consistent framework for rating and comparing models in multi-model inference. When models of vastly different complexity compete with each other, they also come with vastly different computational runtimes. For instance, a time series of a quantity of interest can be simulated by an autoregressive process model that takes less than a second per run, or by a model based on partial differential equations with runtimes of several hours or even days. Classical BMS is based on a quantity called Bayesian model evidence (BME), which determines the model weights in the selection process and expresses a trade-off between the bias of a model and its complexity. In practice, however, model runtime is another factor relevant to the weights used in model selection. Hence, we believe it should be included, leading to an overall trade-off between bias, variance and computing effort. We approach this triple trade-off from the viewpoint of our ability to generate realizations of the models under a given computational budget. One way to obtain BME values is through sampling-based integration techniques. Under time constraints, more expensive models can be sampled far less often than faster models, in direct proportion to their runtime. Because sampling-based strategies are always subject to statistical sampling error, the computed evidence in favor of a more expensive model is statistically less significant than the evidence computed in favor of a faster model. We present a straightforward way to include this imbalance in the model weights that are the basis for model selection. Our approach follows directly from the idea of insufficient significance: it is based on a computationally cheap bootstrap error estimate of the model evidence and is easy to implement. The approach is illustrated in a small synthetic modeling study.
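
    The abstract does not give the authors' exact weighting formula; the sketch below only illustrates the ingredients it describes: a sampling-based (Monte Carlo) estimate of BME, a cheap bootstrap standard error for that estimate, and a sample budget that shrinks in proportion to model runtime. The models, likelihoods, runtimes and time budget are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def estimate_bme(loglik, prior_sampler, n_samples, n_boot=200):
    """Monte Carlo estimate of Bayesian model evidence with a bootstrap
    standard error: BME is approximated by the mean likelihood over prior samples."""
    theta = prior_sampler(n_samples)
    lik = np.exp(loglik(theta))
    bme = lik.mean()
    boot = np.array([rng.choice(lik, size=n_samples, replace=True).mean()
                     for _ in range(n_boot)])
    return bme, boot.std()

# Two toy "models" of a single datum y = 0.4 (unnormalized likelihoods, illustrative).
y = 0.4
loglik_fast = lambda th: -0.5 * (y - th) ** 2 / 0.1       # cheap model
loglik_slow = lambda th: -0.5 * (y - th ** 2) ** 2 / 0.1  # expensive model
prior = lambda n: rng.uniform(-1, 1, size=n)

budget_seconds = 60.0
runtimes = {"fast": 0.001, "slow": 1.0}          # assumed seconds per model run
n_fast = int(budget_seconds / runtimes["fast"])  # samples affordable per model
n_slow = int(budget_seconds / runtimes["slow"])

bme_f, se_f = estimate_bme(loglik_fast, prior, n_fast)
bme_s, se_s = estimate_bme(loglik_slow, prior, n_slow)
print(f"fast: BME={bme_f:.3f} +/- {se_f:.3f} ({n_fast} runs)")
print(f"slow: BME={bme_s:.3f} +/- {se_s:.3f} ({n_slow} runs)")
```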

  14. Prediction-error variance in Bayesian model updating: a comparative study

    NASA Astrophysics Data System (ADS)

    Asadollahi, Parisa; Li, Jian; Huang, Yong

    2017-04-01

    In Bayesian model updating, the likelihood function is commonly formulated by stochastic embedding, in which the maximum-information-entropy probability model of the prediction error plays an important role; it is a Gaussian distribution constrained by its first two moments. The selection of prediction error variances can be formulated as a model class selection problem, which automatically involves a trade-off between the average data-fit of the model class and the information it extracts from the data. The choice is therefore critical to the robustness of structural model updating, especially in the presence of modeling errors. To date, three ways of treating prediction error variances have been seen in the literature: 1) setting constant values empirically, 2) estimating them based on the goodness-of-fit to the measured data, and 3) updating them as uncertain parameters by applying Bayes' theorem at the model class level. In this paper, the effect of these different strategies on model updating performance is investigated explicitly. A six-story shear building model with six uncertain stiffness parameters is employed as an illustrative example. Transitional Markov chain Monte Carlo is used to draw samples from the posterior probability density function of the structural model parameters as well as the uncertain prediction error variances. Different levels of modeling uncertainty and complexity are represented by three FE models: a true model, a model with added complexity, and a model with modeling error. Bayesian updating is performed for the three FE models under the three aforementioned treatments of the prediction error variances. The effect of the number of measurements on updating performance is also examined. The results are compared based on model class assessment and indicate that updating the prediction error variances as uncertain parameters at the model class level produces the most robust results, especially when the number of measurements is small.
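
    For orientation, the expression below is the generic Gaussian likelihood that such a maximum-entropy stochastic embedding yields, with a single prediction-error variance σ² that can be fixed empirically, fitted to the data, or updated as an uncertain parameter; the paper's exact parameterization is not given in the abstract, so the notation is assumed.

```latex
% Gaussian likelihood from the maximum-entropy prediction-error model,
% for measured responses d_i and model predictions m_i(\theta):
p(\mathbf{d} \mid \boldsymbol{\theta}, \sigma^2)
  = \prod_{i=1}^{N} \frac{1}{\sqrt{2\pi\sigma^2}}
    \exp\!\left[-\frac{\left(d_i - m_i(\boldsymbol{\theta})\right)^2}{2\sigma^2}\right]
```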

  15. Comparison and Analysis of Geometric Correction Models of Spaceborne SAR

    PubMed Central

    Jiang, Weihao; Yu, Anxi; Dong, Zhen; Wang, Qingsong

    2016-01-01

    Following the development of synthetic aperture radar (SAR), SAR images have become increasingly common. Many researchers have conducted extensive studies on geolocation models, but little work has been done on which models are suitable for the geometric correction of SAR images of different terrain. To address the terrain issue, four different models were compared and are described in this paper: a rigorous range-Doppler (RD) model, a rational polynomial coefficients (RPC) model, a revised polynomial (PM) model and an elevation derivation (EDM) model. The results of comparisons of the geolocation capabilities of the models show that a proper model for a SAR image of a specific terrain can be determined. A solution table was obtained to recommend a suitable model for users. Three TerraSAR-X images, two ALOS-PALSAR images and one Envisat-ASAR image were used for the experiment, including flat terrain and mountain terrain SAR images as well as two large area images. Geolocation accuracies of the models for different terrain SAR images were computed and analyzed. The comparisons of the models show that the RD model was accurate but was the least efficient; therefore, it is not the ideal model for real-time implementations. The RPC model is sufficiently accurate and efficient for the geometric correction of SAR images of flat terrain, with a precision below 0.001 pixels. The EDM model is suitable for the geolocation of SAR images of mountainous terrain, and its precision can reach 0.007 pixels. Although the PM model does not produce results as precise as the other models, its efficiency is excellent and its potential should not be underestimated. With respect to the geometric correction of SAR images over large areas, the EDM model achieves higher accuracy (better than one pixel), whereas the RPC model consumes one third of the time of the EDM model. PMID:27347973
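
    The abstract names the rigorous range-Doppler model without stating its equations; as a reminder, the standard RD geolocation system couples a range equation, a Doppler equation and an Earth-surface constraint, written here in commonly used (assumed) notation rather than the paper's own.

```latex
% Range equation: slant range R between sensor position R_s and target R_t
R = \left| \mathbf{R}_s - \mathbf{R}_t \right|
% Doppler equation: centroid frequency f_d for wavelength \lambda and
% sensor/target velocities V_s, V_t (sign convention varies by author)
f_d = -\frac{2}{\lambda R}\,(\mathbf{R}_s - \mathbf{R}_t)\cdot(\mathbf{V}_s - \mathbf{V}_t)
% Earth model: the target lies on an ellipsoid with semi-axes a + h and b,
% where h is the target height above the reference ellipsoid
\frac{x_t^2 + y_t^2}{(a + h)^2} + \frac{z_t^2}{b^2} = 1
```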

  16. Towards policy relevant environmental modeling: contextual validity and pragmatic models

    USGS Publications Warehouse

    Miles, Scott B.

    2000-01-01

    "What makes for a good model?" In various forms, this question is a question that, undoubtedly, many people, businesses, and institutions ponder with regards to their particular domain of modeling. One particular domain that is wrestling with this question is the multidisciplinary field of environmental modeling. Examples of environmental models range from models of contaminated ground water flow to the economic impact of natural disasters, such as earthquakes. One of the distinguishing claims of the field is the relevancy of environmental modeling to policy and environment-related decision-making in general. A pervasive view by both scientists and decision-makers is that a "good" model is one that is an accurate predictor. Thus, determining whether a model is "accurate" or "correct" is done by comparing model output to empirical observations. The expected outcome of this process, usually referred to as "validation" or "ground truthing," is a stamp on the model in question of "valid" or "not valid" that serves to indicate whether or not the model will be reliable before it is put into service in a decision-making context. In this paper, I begin by elaborating on the prevailing view of model validation and why this view must change. Drawing from concepts coming out of the studies of science and technology, I go on to propose a contextual view of validity that can overcome the problems associated with "ground truthing" models as an indicator of model goodness. The problem of how we talk about and determine model validity has much to do about how we perceive the utility of environmental models. In the remainder of the paper, I argue that we should adopt ideas of pragmatism in judging what makes for a good model and, in turn, developing good models. From such a perspective of model goodness, good environmental models should facilitate communication, convey—not bury or "eliminate"—uncertainties, and, thus, afford the active building of consensus decisions, instead of promoting passive or self-righteous decisions.

  17. On Using Meta-Modeling and Multi-Modeling to Address Complex Problems

    ERIC Educational Resources Information Center

    Abu Jbara, Ahmed

    2013-01-01

    Models, created using different modeling techniques, usually serve different purposes and provide unique insights. While each modeling technique might be capable of answering specific questions, complex problems require multiple models interoperating to complement/supplement each other; we call this Multi-Modeling. To address the syntactic and…

  18. Preparing the Model for Prediction Across Scales (MPAS) for global retrospective air quality modeling

    EPA Science Inventory

    The US EPA has a plan to leverage recent advances in meteorological modeling to develop a "Next-Generation" air quality modeling system that will allow consistent modeling of problems from global to local scale. The meteorological model of choice is the Model for Predic...

  19. Model Comparison of Bayesian Semiparametric and Parametric Structural Equation Models

    ERIC Educational Resources Information Center

    Song, Xin-Yuan; Xia, Ye-Mao; Pan, Jun-Hao; Lee, Sik-Yum

    2011-01-01

    Structural equation models have wide applications. One of the most important issues in analyzing structural equation models is model comparison. This article proposes a Bayesian model comparison statistic, namely the "L[subscript nu]"-measure for both semiparametric and parametric structural equation models. For illustration purposes, we consider…

  20. National Centers for Environmental Prediction

    Science.gov Websites

    NCEP Environmental Modeling Center (EMC) mesoscale modeling pages: mission, models, R&D, collaborators, cyclone tracks and verification, implementation information (SREF).

  1. Computer Models of Personality: Implications for Measurement

    ERIC Educational Resources Information Center

    Cranton, P. A.

    1976-01-01

    Current research on computer models of personality is reviewed and categorized under five headings: (1) models of belief systems; (2) models of interpersonal behavior; (3) models of decision-making processes; (4) prediction models; and (5) theory-based simulations of specific processes. The use of computer models in personality measurement is…

  2. Uses of Computer Simulation Models in Ag-Research and Everyday Life

    USDA-ARS?s Scientific Manuscript database

    When the news media talks about models they could be talking about role models, fashion models, conceptual models like the auto industry uses, or computer simulation models. A computer simulation model is a computer code that attempts to imitate the processes and functions of certain systems. There ...

  3. A Framework of Operating Models for Interdisciplinary Research Programs in Clinical Service Organizations

    ERIC Educational Resources Information Center

    King, Gillian; Currie, Melissa; Smith, Linda; Servais, Michelle; McDougall, Janette

    2008-01-01

    A framework of operating models for interdisciplinary research programs in clinical service organizations is presented, consisting of a "clinician-researcher" skill development model, a program evaluation model, a researcher-led knowledge generation model, and a knowledge conduit model. Together, these models comprise a tailored, collaborative…

  4. Modelling Students' Visualisation of Chemical Reaction

    ERIC Educational Resources Information Center

    Cheng, Maurice M. W.; Gilbert, John K.

    2017-01-01

    This paper proposes a model-based notion of "submicro representations of chemical reactions". Based on three structural models of matter (the simple particle model, the atomic model and the free electron model of metals), we suggest there are two major models of reaction in school chemistry curricula: (a) reactions that are simple…

  5. Multilevel and Latent Variable Modeling with Composite Links and Exploded Likelihoods

    ERIC Educational Resources Information Center

    Rabe-Hesketh, Sophia; Skrondal, Anders

    2007-01-01

    Composite links and exploded likelihoods are powerful yet simple tools for specifying a wide range of latent variable models. Applications considered include survival or duration models, models for rankings, small area estimation with census information, models for ordinal responses, item response models with guessing, randomized response models,…

  6. Planning Major Curricular Change.

    ERIC Educational Resources Information Center

    Kirkland, Travis P.

    Decision-making and change models can take many forms. One researcher (Nordvall, 1982) has suggested five conceptual models for introducing change: a political model; a rational decision-making model; a social interaction decision model; the problem-solving method; and an adaptive/linkage model which is an amalgam of each of the other models.…

  7. UNITED STATES METEOROLOGICAL DATA - DAILY AND HOURLY FILES TO SUPPORT PREDICTIVE EXPOSURE MODELING

    EPA Science Inventory

    ORD numerical models for pesticide exposure include a model of spray drift (AgDisp), a cropland pesticide persistence model (PRZM), a surface water exposure model (EXAMS), and a model of fish bioaccumulation (BASS). A unified climatological database for these models has been asse...

  8. Enhancement of the Acquisition Process for a Combat System-A Case Study to Model the Workflow Processes for an Air Defense System Acquisition

    DTIC Science & Technology

    2009-12-01

    Business Process Modeling; BPMN, Business Process Modeling Notation; SoA, Service-oriented Architecture; UML, Unified Modeling Language; CSP... system developers. Supporting technologies include Business Process Modeling Notation (BPMN), Unified Modeling Language (UML), and model-driven architecture.

  9. Improving a complex finite-difference ground water flow model through the use of an analytic element screening model

    USGS Publications Warehouse

    Hunt, R.J.; Anderson, M.P.; Kelson, V.A.

    1998-01-01

    This paper demonstrates that analytic element models have potential as powerful screening tools that can facilitate or improve calibration of more complicated finite-difference and finite-element models. We demonstrate how a two-dimensional analytic element model was used to identify errors in a complex three-dimensional finite-difference model caused by incorrect specification of boundary conditions. An improved finite-difference model was developed using boundary conditions developed from a far-field analytic element model. Calibration of a revised finite-difference model was achieved using fewer zones of hydraulic conductivity and lake bed conductance than the original finite-difference model. Calibration statistics were also improved in that simulated base-flows were much closer to measured values. The improved calibration is due mainly to improved specification of the boundary conditions made possible by first solving the far-field problem with an analytic element model.
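
    As a hedged illustration of how a far-field analytic element solution can supply boundary conditions for a local finite-difference model, the sketch below superposes a uniform regional gradient and Thiem wells and evaluates heads along one grid edge. The aquifer parameters, well and grid geometry are invented for illustration and are not the values used in the study.

```python
import numpy as np

def analytic_head(x, y, T=500.0, h0=100.0, gradient=0.002,
                  wells=((1500.0, 800.0, 1000.0),), r_influence=2000.0):
    """Far-field analytic element solution: a uniform regional gradient plus
    Thiem wells, superposed for a confined aquifer of transmissivity T (m2/d).
    All parameter values are illustrative, not taken from the paper."""
    h = h0 - gradient * x                        # uniform regional flow in +x
    for xw, yw, Q in wells:                      # well location (m) and rate (m3/d)
        r = np.maximum(np.hypot(x - xw, y - yw), 1.0)
        h -= Q / (2.0 * np.pi * T) * np.log(r_influence / r)
    return h

# Prescribe heads along the western edge (x = 0) of a local finite-difference
# grid from the far-field solution, as a screening-level boundary condition.
yb = np.linspace(0.0, 2000.0, 21)
boundary_heads = analytic_head(np.zeros_like(yb), yb)
print(np.round(boundary_heads, 2))
```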

  10. A stochastic model for tumor geometry evolution during radiation therapy in cervical cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yifang; Lee, Chi-Guhn; Chan, Timothy C. Y., E-mail: tcychan@mie.utoronto.ca

    2014-02-15

    Purpose: To develop mathematical models to predict the evolution of tumor geometry in cervical cancer undergoing radiation therapy. Methods: The authors develop two mathematical models to estimate tumor geometry change: a Markov model and an isomorphic shrinkage model. The Markov model describes tumor evolution by investigating the change in state (either tumor or nontumor) of voxels on the tumor surface. It assumes that the evolution follows a Markov process. Transition probabilities are obtained using maximum likelihood estimation and depend on the states of neighboring voxels. The isomorphic shrinkage model describes tumor shrinkage or growth in terms of layers of voxels on the tumor surface, instead of modeling individual voxels. The two proposed models were applied to data from 29 cervical cancer patients treated at Princess Margaret Cancer Centre and then compared to a constant volume approach. Model performance was measured using sensitivity and specificity. Results: The Markov model outperformed both the isomorphic shrinkage and constant volume models in terms of the trade-off between sensitivity (target coverage) and specificity (normal tissue sparing). Generally, the Markov model achieved a few percentage points in improvement in either sensitivity or specificity compared to the other models. The isomorphic shrinkage model was comparable to the Markov approach under certain parameter settings. Convex tumor shapes were easier to predict. Conclusions: By modeling tumor geometry change at the voxel level using a probabilistic model, improvements in target coverage and normal tissue sparing are possible. Our Markov model is flexible and has tunable parameters to adjust model performance to meet a range of criteria. Such a model may support the development of an adaptive paradigm for radiation therapy of cervical cancer.
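
    A minimal sketch of the kind of voxel-level Markov update the abstract describes, in which a surface voxel's probability of regressing to non-tumour depends on how many of its face neighbours are already non-tumour. The transition probabilities and geometry are illustrative placeholders, not the maximum-likelihood estimates fitted in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def markov_step(tumor, p_regress):
    """One Markov update of a 3-D binary tumour mask.  Each surface voxel
    switches to non-tumour with probability p_regress[k], where k is the
    number of its 6 face neighbours that are already non-tumour.
    Probabilities are illustrative placeholders."""
    padded = np.pad(tumor, 1)
    # Count non-tumour face neighbours for every voxel.
    non_tumor_nbrs = np.zeros_like(tumor, dtype=int)
    for axis in range(3):
        for shift in (-1, 1):
            non_tumor_nbrs += 1 - np.roll(padded, shift, axis=axis)[1:-1, 1:-1, 1:-1]
    surface = (tumor == 1) & (non_tumor_nbrs > 0)
    flip = rng.random(tumor.shape) < p_regress[np.minimum(non_tumor_nbrs, 6)]
    out = tumor.copy()
    out[surface & flip] = 0
    return out

# Toy example: a 9x9x9 cubic "tumour" shrinking over 5 treatment fractions.
tumor = np.zeros((15, 15, 15), dtype=int)
tumor[3:12, 3:12, 3:12] = 1
p_regress = np.array([0.0, 0.1, 0.2, 0.35, 0.5, 0.65, 0.8])
for fraction in range(5):
    tumor = markov_step(tumor, p_regress)
print("Remaining tumour voxels:", tumor.sum())
```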

  11. The Radiative Forcing Model Intercomparison Project (RFMIP): Assessment and characterization of forcing to enable feedback studies

    NASA Astrophysics Data System (ADS)

    Pincus, R.; Stevens, B. B.; Forster, P.; Collins, W.; Ramaswamy, V.

    2014-12-01

    An enormous amount of attention has been paid to the diversity of responses in the CMIP and other multi-model ensembles. This diversity is normally interpreted as a distribution in climate sensitivity driven by some distribution of feedback mechanisms. Identification of these feedbacks relies on precise identification of the forcing to which each model is subject, including distinguishing true error from model diversity. The Radiative Forcing Model Intercomparison Project (RFMIP) aims to disentangle the role of forcing from model sensitivity as determinants of varying climate model response by carefully characterizing the radiative forcing to which such models are subject and by coordinating experiments in which it is specified. RFMIP consists of four activities: 1) an assessment of accuracy in flux and forcing calculations for greenhouse gases under past, present, and future climates, using off-line radiative transfer calculations in specified atmospheres with climate model parameterizations and reference models; 2) characterization and assessment of model-specific historical forcing by anthropogenic aerosols, based on coordinated diagnostic output from climate models and off-line radiative transfer calculations with reference models; 3) characterization of model-specific effective radiative forcing, including contributions of model climatology and rapid adjustments, using coordinated climate model integrations and off-line radiative transfer calculations with a single fast model; and 4) assessment of climate model response to precisely-characterized radiative forcing over the historical record, including efforts to infer true historical forcing from patterns of response, by direct specification of non-greenhouse-gas forcing in a series of coordinated climate model integrations. This talk discusses the rationale for RFMIP, provides an overview of the four activities, and presents preliminary motivating results.

  12. Greedy Sampling and Incremental Surrogate Model-Based Tailoring of Aeroservoelastic Model Database for Flexible Aircraft

    NASA Technical Reports Server (NTRS)

    Wang, Yi; Pant, Kapil; Brenner, Martin J.; Ouellette, Jeffrey A.

    2018-01-01

    This paper presents a data analysis and modeling framework to tailor and develop a linear parameter-varying (LPV) aeroservoelastic (ASE) model database for flexible aircraft across a broad 2D flight parameter space. A Kriging surrogate model is constructed using ASE models at a fraction of the grid points within the original model database, and the ASE model at any flight condition can then be obtained simply through surrogate model interpolation. A greedy sampling algorithm is developed to select, as the next sample point, the grid point carrying the worst relative error between the surrogate model prediction and the benchmark model in the frequency domain among all input-output channels. The process is iterated to incrementally improve surrogate model accuracy until a pre-determined tolerance or iteration budget is met. The methodology is applied to the ASE model database of a flexible aircraft currently being tested at NASA/AFRC for flutter suppression and gust load alleviation. Our studies indicate that the proposed method can reduce the number of models in the original database by 67%. Even so, the ASE models obtained through Kriging interpolation match the models in the original database, constructed directly from the physics-based tool, with a worst relative error far below 1%. The interpolated ASE model exhibits continuously varying gains along a set of prescribed flight conditions. More importantly, the selected grid points are distributed non-uniformly in the parameter space, a) capturing the distinctly different dynamic behaviors and their dependence on flight parameters, and b) reiterating the need for and utility of adaptive space-sampling techniques for ASE model database compaction. The present framework extends directly to high-dimensional flight parameter spaces and can be used to guide ASE model development, model order reduction, robust control synthesis and novel vehicle design for flexible aircraft.
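
    A minimal sketch of the greedy sampling loop described above, using scikit-learn's GaussianProcessRegressor as the Kriging surrogate and a scalar response per flight condition as a stand-in for the full frequency-domain ASE model comparison. The grid, benchmark function, kernel and tolerance are illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def greedy_kriging(grid, truth, tol=0.01, seed_idx=(0, -1)):
    """Greedy sampling for a Kriging surrogate: repeatedly add the grid point
    with the worst relative error between surrogate and benchmark, until the
    error falls below tol.  'truth' plays the role of the benchmark model
    database (here a scalar per flight condition, purely illustrative)."""
    sampled = list(seed_idx)
    while True:
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                                      normalize_y=True)
        gp.fit(grid[sampled], truth[sampled])
        pred = gp.predict(grid)
        rel_err = np.abs(pred - truth) / np.maximum(np.abs(truth), 1e-9)
        rel_err[sampled] = 0.0                   # points already in the database
        worst = int(np.argmax(rel_err))
        if rel_err[worst] < tol:
            return sampled, rel_err.max()
        sampled.append(worst)

# 1-D "flight parameter" grid (e.g. dynamic pressure); benchmark = smooth response.
q = np.linspace(0.0, 1.0, 101).reshape(-1, 1)
benchmark = np.sin(3 * np.pi * q).ravel() + 2.0
kept, err = greedy_kriging(q, benchmark)
print(f"kept {len(kept)} of {len(q)} grid points, worst rel. error {err:.4f}")
```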

  13. Mind the Noise When Identifying Computational Models of Cognition from Brain Activity.

    PubMed

    Kolossa, Antonio; Kopp, Bruno

    2016-01-01

    The aim of this study was to analyze how measurement error affects the validity of modeling studies in computational neuroscience. A synthetic validity test was created using simulated P300 event-related potentials as an example. The model space comprised four computational models of single-trial P300 amplitude fluctuations that differed in complexity and dependency. The single-trial fluctuation of simulated P300 amplitudes was computed on the basis of one of the models, at various levels of measurement error and at various numbers of data points. Bayesian model selection was performed based on exceedance probabilities. At very low numbers of data points, the least complex model generally outperformed the data-generating model. Invalid model identification also occurred at low levels of data quality and under low numbers of data points if the winning model's predictors were closely correlated with the predictors from the data-generating model. Given sufficient data quality and numbers of data points, the data-generating model could be correctly identified, even against models which were very similar to the data-generating model. Thus, several variables affect the validity of computational modeling studies, and data quality and numbers of data points are among the main factors relevant to the issue. Further, the nature of the model space (i.e., model complexity, model dependency) should not be neglected. This study provided quantitative results which show the importance of ensuring the validity of computational modeling via adequately prepared studies. Synthetic validity tests are recommended for future applications. Beyond that, we propose to render the demonstration of sufficient validity via adequate simulations mandatory for computational modeling studies.
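
    The abstract does not detail how the exceedance probabilities were computed; assuming the random-effects Bayesian model selection scheme commonly used for this purpose, the sketch below estimates them by sampling the Dirichlet posterior over model frequencies. The Dirichlet parameters are illustrative, not results from the study.

```python
import numpy as np

rng = np.random.default_rng(7)

def exceedance_probabilities(alpha, n_draws=100_000):
    """Exceedance probability of each model: the probability that its posterior
    frequency exceeds that of every competing model, estimated by sampling the
    Dirichlet posterior over model frequencies.  (Assumes the common
    random-effects BMS scheme; alpha values below are illustrative.)"""
    draws = rng.dirichlet(alpha, size=n_draws)
    winners = np.argmax(draws, axis=1)
    return np.bincount(winners, minlength=len(alpha)) / n_draws

# Dirichlet parameters for four candidate P300 models (illustrative).
alpha = np.array([12.0, 5.0, 3.0, 2.0])
print(np.round(exceedance_probabilities(alpha), 3))
```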

  14. Chasing Perfection: Should We Reduce Model Uncertainty in Carbon Cycle-Climate Feedbacks

    NASA Astrophysics Data System (ADS)

    Bonan, G. B.; Lombardozzi, D.; Wieder, W. R.; Lindsay, K. T.; Thomas, R. Q.

    2015-12-01

    Earth system model simulations of the terrestrial carbon (C) cycle show large multi-model spread in the carbon-concentration and carbon-climate feedback parameters. Large differences among models are also seen in their simulation of global vegetation and soil C stocks and other aspects of the C cycle, prompting concern about model uncertainty and our ability to faithfully represent fundamental aspects of the terrestrial C cycle in Earth system models. Benchmarking analyses that compare model simulations with common datasets have been proposed as a means to assess model fidelity with observations, and various model-data fusion techniques have been used to reduce model biases. While such efforts will reduce multi-model spread, they may not help reduce uncertainty (and increase confidence) in projections of the C cycle over the twenty-first century. Many ecological and biogeochemical processes represented in Earth system models are poorly understood at both the site scale and across large regions, where biotic and edaphic heterogeneity are important. Our experience with the Community Land Model (CLM) suggests that large uncertainty in the terrestrial C cycle and its feedback with climate change is an inherent property of biological systems. The challenge of representing life in Earth system models, with the rich diversity of lifeforms and complexity of biological systems, may necessitate a multitude of modeling approaches to capture the range of possible outcomes. Such models should encompass a range of plausible model structures. We distinguish between model parameter uncertainty and model structural uncertainty. Focusing on improved parameter estimates may, in fact, limit progress in assessing model structural uncertainty associated with realistically representing biological processes. Moreover, higher confidence may be achieved through better process representation, but this does not necessarily reduce uncertainty.

  15. Clarity versus complexity: land-use modeling as a practical tool for decision-makers

    USGS Publications Warehouse

    Sohl, Terry L.; Claggett, Peter

    2013-01-01

    The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the link between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.

  16. Modeling Methods

    USGS Publications Warehouse

    Healy, Richard W.; Scanlon, Bridget R.

    2010-01-01

    Simulation models are widely used in all types of hydrologic studies, and many of these models can be used to estimate recharge. Models can provide important insight into the functioning of hydrologic systems by identifying factors that influence recharge. The predictive capability of models can be used to evaluate how changes in climate, water use, land use, and other factors may affect recharge rates. Most hydrological simulation models, including watershed models and groundwater-flow models, are based on some form of water-budget equation, so the material in this chapter is closely linked to that in Chapter 2. Empirical models that are not based on a water-budget equation have also been used for estimating recharge; these models generally take the form of simple estimation equations that define annual recharge as a function of precipitation and possibly other climatic data or watershed characteristics.Model complexity varies greatly. Some models are simple accounting models; others attempt to accurately represent the physics of water movement through each compartment of the hydrologic system. Some models provide estimates of recharge explicitly; for example, a model based on the Richards equation can simulate water movement from the soil surface through the unsaturated zone to the water table. Recharge estimates can be obtained indirectly from other models. For example, recharge is a parameter in groundwater-flow models that solve for hydraulic head (i.e. groundwater level). Recharge estimates can be obtained through a model calibration process in which recharge and other model parameter values are adjusted so that simulated water levels agree with measured water levels. The simulation that provides the closest agreement is called the best fit, and the recharge value used in that simulation is the model-generated estimate of recharge.
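
    For reference, hedged forms of the two ingredients the chapter contrasts: a generic water-budget estimate of recharge and the one-dimensional Richards equation used by physics-based unsaturated-zone models. The notation is assumed and may differ from the chapter's own symbols.

```latex
% Water-budget estimate of recharge R from precipitation P, runoff Q_\mathrm{off},
% evapotranspiration ET, and change in storage \Delta S:
R = P - Q_\mathrm{off} - ET - \Delta S
% One-dimensional (vertical) Richards equation for water content \theta,
% pressure head h, depth z, and unsaturated hydraulic conductivity K(\theta):
\frac{\partial \theta}{\partial t}
  = \frac{\partial}{\partial z}\!\left[K(\theta)\left(\frac{\partial h}{\partial z} + 1\right)\right]
```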

  17. Emulating a System Dynamics Model with Agent-Based Models: A Methodological Case Study in Simulation of Diabetes Progression

    DOE PAGES

    Schryver, Jack; Nutaro, James; Shankar, Mallikarjun

    2015-10-30

    An agent-based simulation model hierarchy emulating disease states and behaviors critical to progression of diabetes type 2 was designed and implemented in the DEVS framework. The models are translations of basic elements of an established system dynamics model of diabetes. In this hierarchy, the system dynamics model, which mimics diabetes progression over an aggregated U.S. population, was dis-aggregated and reconstructed bottom-up at the individual (agent) level. Four levels of model complexity were defined in order to systematically evaluate which parameters are needed to mimic outputs of the system dynamics model. Moreover, the four estimated models attempted to replicate stock counts representing disease states in the system dynamics model, while estimating impacts of an elderliness factor, an obesity factor and health-related behavioral parameters. Health-related behavior was modeled as a simple realization of the Theory of Planned Behavior, a joint function of individual attitude and diffusion of social norms that spread over each agent's social network. Although the most complex agent-based simulation model contained 31 adjustable parameters, all models were considerably less complex than the system dynamics model, which required numerous time series inputs to make its predictions. All three elaborations of the baseline model provided significantly improved fits to the output of the system dynamics model. The performances of the baseline agent-based model and its extensions illustrate a promising approach to translate complex system dynamics models into agent-based model alternatives that are both conceptually simpler and capable of capturing main effects of complex local agent-agent interactions.
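
    A minimal sketch of the health-behaviour rule sketched above: each agent's probability of healthy behaviour is a simple function of its own attitude and the prevalence of the behaviour among its network neighbours. The logistic form, weights and ring network are illustrative, not the model's actual specification.

```python
import numpy as np

rng = np.random.default_rng(3)

def step_behavior(attitude, behavior, neighbors, w_att=2.0, w_norm=2.0, bias=-2.0):
    """One update of a simplified Theory-of-Planned-Behaviour rule: an agent's
    probability of healthy behaviour rises with its own attitude and with the
    share of its network neighbours already behaving healthily.  The logistic
    form and weights are illustrative, not the model's actual parameters."""
    norm = np.array([behavior[nbrs].mean() if len(nbrs) else 0.0
                     for nbrs in neighbors])
    p = 1.0 / (1.0 + np.exp(-(bias + w_att * attitude + w_norm * norm)))
    return (rng.random(len(attitude)) < p).astype(int)

# Toy population of 200 agents on a ring network (each agent has 2 neighbours).
n = 200
attitude = rng.uniform(0.0, 1.0, n)
behavior = rng.integers(0, 2, n)
neighbors = [np.array([(i - 1) % n, (i + 1) % n]) for i in range(n)]
for month in range(12):
    behavior = step_behavior(attitude, behavior, neighbors)
print("Healthy-behaviour prevalence after 12 months:", behavior.mean())
```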

  18. Forecasting plant phenology: evaluating the phenological models for Betula pendula and Padus racemosa spring phases, Latvia.

    PubMed

    Kalvāns, Andis; Bitāne, Māra; Kalvāne, Gunta

    2015-02-01

    A historical phenological record and meteorological data for the period 1960-2009 are used to analyse the ability of seven phenological models to predict leaf unfolding and the beginning of flowering for two tree species, silver birch (Betula pendula) and bird cherry (Padus racemosa), in Latvia. Model stability is estimated by performing multiple model fitting runs using half of the data for model training and the other half for evaluation. Correlation coefficient, mean absolute error and mean squared error are used to evaluate model performance. UniChill (a model using a sigmoidal relationship between development rate and temperature and accounting for the necessity of dormancy release) and DDcos (a simple degree-day model considering diurnal temperature fluctuations) are found to be the best models for describing the considered spring phases. A strong collinearity between base temperature and required heat sum is found for several model fitting runs of the simple degree-day based models. Large variation of the model parameters between different model fitting runs in the case of the more complex models indicates similar collinearity and over-parameterization of these models. It is suggested that model performance can be improved by incorporating the resolved daily temperature fluctuations of the DDcos model into the framework of the more complex models (e.g. UniChill). The average base temperature, as found by the DDcos model, is 5.6 °C for B. pendula leaf unfolding and 6.7 °C for the start of flowering; for P. racemosa, the respective base temperatures are 3.2 °C and 3.4 °C.
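
    A minimal sketch of the plain degree-day prediction underlying the simpler models: daily mean temperature excesses over a base temperature are accumulated from a fixed start day, and the phase is predicted when a required heat sum is reached. The base temperature 5.6 °C is the fitted value reported for B. pendula leaf unfolding; the heat sum, start day and synthetic temperatures are illustrative. (The DDcos variant additionally resolves diurnal temperature fluctuations with a cosine curve, omitted here.)

```python
import numpy as np

def degree_day_onset(daily_mean_temp, t_base=5.6, heat_sum=90.0, start_doy=32):
    """Simple degree-day phenology model: accumulate (T - t_base) for days with
    T > t_base from a fixed start day and return the day of year on which the
    required heat sum is reached.  t_base = 5.6 degC is the reported value for
    B. pendula leaf unfolding; heat_sum and start_doy are illustrative."""
    gdd = 0.0
    for doy, temp in enumerate(daily_mean_temp, start=1):
        if doy < start_doy:
            continue
        gdd += max(temp - t_base, 0.0)
        if gdd >= heat_sum:
            return doy
    return None  # heat sum never reached

# Synthetic daily mean temperatures for one spring (sinusoidal warming trend).
doys = np.arange(1, 181)
temps = -2.0 + 18.0 * np.sin(np.pi * (doys - 15) / 330)
print("Predicted leaf-unfolding day of year:", degree_day_onset(temps))
```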

  19. A toolbox and record for scientific models

    NASA Technical Reports Server (NTRS)

    Ellman, Thomas

    1994-01-01

    Computational science presents a host of challenges for the field of knowledge-based software design. Scientific computation models are difficult to construct. Models constructed by one scientist are easily misapplied by other scientists to problems for which they are not well-suited. Finally, models constructed by one scientist are difficult for others to modify or extend to handle new types of problems. Construction of scientific models actually involves much more than the mechanics of building a single computational model. In the course of developing a model, a scientist will often test a candidate model against experimental data or against a priori expectations. Test results often lead to revisions of the model and a consequent need for additional testing. During a single model development session, a scientist typically examines a whole series of alternative models, each using different simplifying assumptions or modeling techniques. A useful scientific software design tool must support these aspects of the model development process as well. In particular, it should propose and carry out tests of candidate models. It should analyze test results and identify models and parts of models that must be changed. It should determine what types of changes can potentially cure a given negative test result. It should organize candidate models, test data, and test results into a coherent record of the development process. Finally, it should exploit the development record for two purposes: (1) automatically determining the applicability of a scientific model to a given problem; (2) supporting revision of a scientific model to handle a new type of problem. Existing knowledge-based software design tools must be extended in order to provide these facilities.

  20. More than a name: Heterogeneity in characteristics of models of maternity care reported from the Australian Maternity Care Classification System validation study.

    PubMed

    Donnolley, Natasha R; Chambers, Georgina M; Butler-Henderson, Kerryn A; Chapman, Michael G; Sullivan, Elizabeth A

    2017-08-01

    Without a standard terminology to classify models of maternity care, it is problematic to compare and evaluate clinical outcomes across different models. The Maternity Care Classification System is a novel system developed in Australia to classify models of maternity care based on their characteristics and an overarching broad model descriptor (Major Model Category). This study aimed to assess the extent of variability in the defining characteristics of models of care grouped to the same Major Model Category, using the Maternity Care Classification System. All public hospital maternity services in New South Wales, Australia, were invited to complete a web-based survey classifying two local models of care using the Maternity Care Classification System. A descriptive analysis of the variation in 15 attributes of models of care was conducted to evaluate the level of heterogeneity within and across Major Model Categories. Sixty-nine out of seventy hospitals responded, classifying 129 models of care. There was wide variation in a number of important attributes of models classified to the same Major Model Category. The category of 'Public hospital maternity care' contained the most variation across all characteristics. This study demonstrated that although models of care can be grouped into a distinct set of Major Model Categories, there are significant variations in models of the same type. This could result in seemingly 'like' models of care being incorrectly compared if grouped only by the Major Model Category. Copyright © 2017 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.
