Addressing Students' Difficulties with Faraday's Law: A Guided Problem Solving Approach
ERIC Educational Resources Information Center
Zuza, Kristina; Almudí, José-Manuel; Leniz, Ane; Guisasola, Jenaro
2014-01-01
In traditional teaching, the fundamental concepts of electromagnetic induction are usually quickly analyzed, spending most of the time solving problems in a more or less rote manner. However, physics education research has shown that the fundamental concepts of the electromagnetic induction theory are barely understood by students. This article…
NASA Astrophysics Data System (ADS)
Levin, Alan R.; Zhang, Deyin; Polizzi, Eric
2012-11-01
In a recent article Polizzi (2009) [15], the FEAST algorithm has been presented as a general purpose eigenvalue solver which is ideally suited for addressing the numerical challenges in electronic structure calculations. Here, FEAST is presented beyond the “black-box” solver as a fundamental modeling framework which can naturally address the original numerical complexity of the electronic structure problem as formulated by Slater in 1937 [3]. The non-linear eigenvalue problem arising from the muffin-tin decomposition of the real-space domain is first derived and then reformulated to be solved exactly within the FEAST framework. This new framework is presented as a fundamental and practical solution for performing both accurate and scalable electronic structure calculations, bypassing the various issues of using traditional approaches such as linearization and pseudopotential techniques. A finite element implementation of this FEAST framework along with simulation results for various molecular systems is also presented and discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iliev, Metodi
The goals of this project are to identify fundamental and practical problems and features with SiPMs as they relate to IAEA detector needs, Identify published results and implementations of scintillation detectors tat use SiPMs that are of interest to IAEA, asses how effectively the fundamental problems were addresses, and perform simulations and experiments as needed to reproduce crucial results and make recommendations.
ERIC Educational Resources Information Center
Walker, Decker F.
This paper addresses the reasons that it is difficult to find good educational software and proposes measures for coping with this problem. The fundamental problem is a shortange of educational software that can be used as a major part of the teaching of academic subjects in elementary and secondary schools--a shortage that is both the effect and…
Devouring the Other: Democracy in Music Education
ERIC Educational Resources Information Center
Gould, Elizabeth
2008-01-01
In this essay, the author builds on Val Plumwood's (1993, p. 192) notion of "devouring the other" to address fundamental problems of social justice and difference in liberal democracies and music education. The problem with liberal democracies is that they assimilate (devour) difference; consensual treatment of its citizens is predicated on the…
The Applicability of Market-Based Solutions to Public Sector Problems.
ERIC Educational Resources Information Center
Kelley, Carolyn
This paper examines the ways in which private- and public-sector location affects organizational structure and functions, and the implications for school reform. It identifies the differences that are often overlooked when policymakers utilize market-based organizational reform models to address public school problems. Two fundamental questions…
Residualization is not the answer: Rethinking how to address multicollinearity.
York, Richard
2012-11-01
Here I show that a commonly used procedure to address problems stemming from collinearity and multicollinearity among independent variables in regression analysis, "residualization", leads to biased coefficient and standard error estimates and does not address the fundamental problem of collinearity, which is a lack of information. I demonstrate this using visual representations of collinearity, hypothetical experimental designs, and analyses of both artificial and real world data. I conclude by noting the importance of examining methodological practices to ensure that their validity can be established based on rational criteria. Copyright © 2012 Elsevier Inc. All rights reserved.
Temporal planning for transportation planning and scheduling
NASA Technical Reports Server (NTRS)
Frederking, Robert E.; Muscettola, Nicola
1992-01-01
In this paper we describe preliminary work done in the CORTES project, applying the Heuristic Scheduling Testbed System (HSTS) to a transportation planning and scheduling domain. First, we describe in more detail the transportation problems that we are addressing. We then describe the fundamental characteristics of HSTS and we concentrate on the representation of multiple capacity resources. We continue with a more detailed description of the transportation planning problem that we have initially addressed in HSTS and of its solution. Finally we describe future directions for our research.
3D Data Acquisition Platform for Human Activity Understanding
2016-03-02
address fundamental research problems of representation and invariant description of3D data, human motion modeling and applications of human activity analysis, and computational optimization of large-scale 3D data.
3D Data Acquisition Platform for Human Activity Understanding
2016-03-02
address fundamental research problems of representation and invariant description of 3D data, human motion modeling and applications of human activity analysis, and computational optimization of large-scale 3D data.
MEVTV study: Early tectonic evolution of Mars: Crustal dichotomy to Valles Marineris
NASA Technical Reports Server (NTRS)
Frey, Herbert V.; Schultz, Richard A.
1990-01-01
Several fundamental problems were addressed in the early impact, tectonic, and volcanic evolution of the martian lithosphere: (1) origin and evolution of the fundamental crustal dichotomy, including development of the highland/lowland transition zone; (2) growth and evolution of the Valles Marineris; and (3) nature and role of major resurfacing events in early martian history. The results in these areas are briefly summarized.
Conceptual Problems in the Foundations of Mechanics
ERIC Educational Resources Information Center
Coelho, Ricardo Lopes
2012-01-01
There has been much research on principles and fundamental concepts of mechanics. Problems concerning the law of inertia, the concepts of force, fictitious force, weight, mass and the distinction between inertial and gravitational mass are addressed in the first part of the present paper. It is argued in the second that the law of inertia is the…
The report summarizes information on how bilding systems -- especially the heating, ventilating, and air-conditioning (HVAC) system -- inclurence radon entry into large buildings and can be used to mitigate radon problems. It addresses the fundamentals of large building HVAC syst...
Cheyenne/Laramie County MX Impact Human Service System Refinements Project. Refinements Manual
1986-01-01
following are but four of many possible examples ofthese types of questions. A. Assume that your agency has decided to address the problem of hunger . Should...they do not represent a long-term solution to problems . Conversely, community problem solving and attempts to bring about fundamental changes may be...are victims of acts of violence in the home... Problem solving approaches include education, the provision of food and temporary shel~er, counseling
Commuter choice managers and parking managers coordination
DOT National Transportation Integrated Search
2002-11-01
Shared use park and ride represents a unique approach for addressing parking problems, and can offer substantial savings in land and development costs. One of the fundamental factors that determines the success of this approach is the level of coordi...
ERIC Educational Resources Information Center
Michaelsen, Larry K.; Davidson, Neil; Major, Claire Howell
2014-01-01
The authors address three questions: (1) What are the foundational practices of team-based learning (TBL)? (2) What are the fundamental principles underlying TBL's foundational practices? and (3) In what ways are TBL's foundational practices similar to and/or different from the practices employed by problem-based learning (PBL) and…
Education in Environmental Chemistry: Setting the Agenda and Recommending Action
ERIC Educational Resources Information Center
Zoller, Uri
2005-01-01
The effective utilization of Education in Environmental Chemistry (EEC) in addressing global and societal environmental problems requires integration between educational, technical, financial, ethical and societal considerations. An interdisciplinary approach is fundamental to efforts to achieve long-term solutions.
Policy Board Proposals Ignore Real Problems.
ERIC Educational Resources Information Center
Hawley, Willis D.
1989-01-01
The recent National Policy Board for Educational Administration report ("Improving the Preparation of School Administrators: An Agenda for Reform") does not address fundamental questions or make convincing proposals concerning the preparation of school administrators. The report's nine overall recommendations for improving school administration…
ERIC Educational Resources Information Center
Drengenberg, Nicholas; Bain, Alan
2017-01-01
This paper addresses the wicked problem of measuring the productivity of learning and teaching in higher education. We show how fundamental validity issues and difficulties identified in educational productivity research point to the need for a qualitatively different framework when considering the entire question. We describe the work that needs…
Explicit Low-Thrust Guidance for Reference Orbit Targeting
NASA Technical Reports Server (NTRS)
Lam, Try; Udwadia, Firdaus E.
2013-01-01
The problem of a low-thrust spacecraft controlled to a reference orbit is addressed in this paper. A simple and explicit low-thrust guidance scheme with constrained thrust magnitude is developed by combining the fundamental equations of motion for constrained systems from analytical dynamics with a Lyapunov-based method. Examples are given for a spacecraft controlled to a reference trajectory in the circular restricted three body problem.
Fundamental solutions to the bioheat equation and their application to magnetic fluid hyperthermia.
Giordano, Mauricio A; Gutierrez, Gustavo; Rinaldi, Carlos
2010-01-01
Methods of predicting temperature profiles during local hyperthermia treatment are very important to avoid damage to healthy tissue. With this aim, fundamental solutions of Pennes' bioheat equation are derived in rectangular, cylindrical, and spherical coordinates. The medium is idealised as isotropic with effective thermal properties. Temperature distributions due to space- and time-dependent heat sources are obtained by the solution method presented. Applications of the fundamental solutions are addressed with emphasis on a particular problem of Magnetic Fluid Hyperthermia (MFH) consisting of a thin shell of magnetic nanoparticles in the outer surface of a spherical solid tumour. It is observed from the solution of this particular problem that the temperature profiles are strongly dependent on the distribution of the magnetic nanoparticles within the tissue. An almost uniform temperature profile is obtained inside the tumour with little penetration of therapeutic temperatures to the outer region of healthy tissue. The fundamental solutions obtained can be used to develop boundary element methods to predict temperature profiles with more complicated geometries.
Wrestling with Equity: Reauthorization of IDEA.
ERIC Educational Resources Information Center
Mead, Julie Fisher
1997-01-01
Explores six proposed changes and the controversies that have stalled Congress's reauthorization of the Individuals with Disabilities Education Act. Although IDEA's fundamental characteristics will remain unchanged, there is likely to be an increased focus on outcomes, an augmented appeals process, and provisions addressing discipline problems.…
Crustal evolution of the early earth: The role of major impacts
NASA Technical Reports Server (NTRS)
Frey, H.
1979-01-01
The role of major impact basins (such as those which formed on the moon before 4 billion years ago) is examined to determine the effects of such impacts on the early crustal evolution of the earth. Specifically addressed is the fundamental problem of what is the origin of the earth's fundamental crustal dichotomy of low density continental and high density oceanic crust and its relationship to the superficially similar highlands/maria crustal dichotomies of the moon, Mercury and Mars.
ERIC Educational Resources Information Center
National Catholic Rural Life Conference, Des Moines, IA.
Written in 1939, this book outlines fundamental Catholic principles and policies that address problems associated with the agricultural system and rural living during the early 20th century. The manifesto was derived from Catholic social philosophy and espouses the benefits of an occupation in agriculture, including the development of private…
Nexus: Intellectual Capital--The Most Strategic Asset.
ERIC Educational Resources Information Center
Kirk, Camille M.
2000-01-01
Discusses the importance of a higher education institution's intellectual capital as a strategic asset in long-range planning. Addresses problems of part-time and graduate student instructors in the context of teaching quality as the institution's fundamental mission. Suggests that tenure encourages research, builds institutional strength, and…
Creationism as Science: What Every Teacher-Scientist Should Know.
ERIC Educational Resources Information Center
Gatzke, Ken W.
1985-01-01
Addresses philosophical problems of the evolution/creationism debate (including underlying assumptions of creationism and nature of science), suggesting that creationism cannot be presented as science in science courses because it fails to qualify as a science. Prediction and explanation, absolute creationism, and a fundamental difficulty in…
NASA Astrophysics Data System (ADS)
Stoller, Martin Reid
Rhetoric, in the Aristotelian sense of "the available means of persuasion," is a crucial, often determining component of the process of making public policy generally, and environmental policy specifically. Environmental crises which have been addressed by the governmental, industrial, and social policy -making establishments have tended to be treated in a manner similar to that in which social, political, economic, military, and other problems have been commonly treated, utilizing a traditional rhetoric, including long-proven persuasive language and arguments. Such problems as air pollution and water pollution have been, to some degree, successfully addressed in this manner. A new and fundamentally different cluster of environmental problems has recently been recognized by elements of the policy making establishment as a legitimate candidate for consideration and policy formation. These environmental problems differ from the more familiar type in a variety of ways, each of which, to a greater or lesser degree, make problematic for those activists concerned with these crises the production of an effective crisis-oriented rhetoric. This study addresses two such closely related phenomena, the Greenhouse Effect and ozone depletion, and identifies those characteristics which contribute to their rhetorical complexity. Using traditional techniques of rhetorical examination, primarily neo-Aristotelian analysis, this study demonstrates the inadequacy of current crisis-oriented rhetoric, and identifies the causes of this rhetorical ineffectiveness. The study concludes that the mediation of such crises as the Greenhouse Effect and ozone depletion cannot be significantly facilitated by traditional environmental-oriented rhetoric, and may in fact be hindered by the use of rhetoric associated with fundamentally different (i.e., easier to solve) environmental problems.
Toxic red tides and harmful algal blooms: A practical challenge in coastal oceanography
NASA Astrophysics Data System (ADS)
Anderson, Donald M.
1995-07-01
The debate over the relative value of practical or applied versus fundamental research has heated up considerably in recent years, and oceanography has not been spared this re-evaluation of science funding policy. Some federal agencies with marine interests have always focused their resources on practical problems, but those with a traditional commitment to basic research such as the National Science Foundation have increasingly had to fight to maintain their freedom to fund quality science without regard to practical or commercial applications. Within this context, it is instructive to highlight the extent to which certain scientific programs can satisfy both sides of this policy dilemma—i.e. address important societal issues through advances in fundamental or basic research. One clear oceanographic example of such a program involves the phenomena called "red tides" or "harmful algal blooms". This paper describes the nature and extent of the problems caused by these outbreaks, emphasizing the alarming expansion in their incidence and their impacts in recent years, both in the U.S. and worldwide. The objective is to highlight fundamental physical, biological, and chemical oceanographic question that must be addressed if we are to achieve the practical goal of scientifically based management of fisheries resources, public health, and ecosystem health in regions threatened by toxic and harmful algae.
Open Educational Resources: Education for the World?
ERIC Educational Resources Information Center
Richter, Thomas; McPherson, Maggie
2012-01-01
Education is widely seen as an important means of addressing both national and international problems, such as political or religious extremism, poverty, and hunger. However, if developing countries are to become societies that can compete properly with Western industrialized countries, not only is a fundamental shift in thinking with regard to…
Design in Four Diagnostic Language Assessments
ERIC Educational Resources Information Center
Cumming, Alister
2015-01-01
The studies documented in the four articles in this special issue uniquely exemplify principles of design-based research as follows: by taking innovative approaches to significant problems in the contexts of real educational practices; by addressing fundamental pedagogical and policy issues related to language, learning, and teaching; and, in the…
Youth in the Workplace. The Dynamics of Learner Needs and Work Roles. Summary.
ERIC Educational Resources Information Center
Miguel, Richard J.
A two-year study addressed the problem of developing a typology of experiential education programs theoretically based and empirically tested that could guide systematic research on questions fundamental to workplace-based experiential education programs. The research question focused on was, "Can experiential education programs be classified…
ERIC Educational Resources Information Center
Edgecomb, Philip L.; Shapiro, Marion
Addressed to vocational, or academic middle or high school students, this book reviews mathematics fundamentals using metric units of measurement. It utilizes a common-sense approach to the degree of accuracy needed in solving actual trade and every-day problems. Stress is placed on reading off metric measurements from a ruler or tape, and on…
The Powers That Be: Environmental Education and the Transcendent
ERIC Educational Resources Information Center
Bonnett, Michael
2015-01-01
This paper argues that with regard to addressing the potentially catastrophic environmental problems recognized by many as now confronting us, the most fundamental disaster that threatens is a deep-seated and increasing inability in Western style societies to think properly about the issues involved. The highly anthropocentric motives embedded in…
NASA Astrophysics Data System (ADS)
Head, J. W.; Pieters, C. M.; Scott, D. R.
2018-02-01
We outline an Orientale Basin Human/Robotic Architecture that can be facilitated by a Deep Space Gateway International Science Operations Center (DSG-ISOC) (like McMurdo/Antarctica) to address fundamental scientific problems about the Moon and Mars.
Exploring New Physics Frontiers Through Numerical Relativity.
Cardoso, Vitor; Gualtieri, Leonardo; Herdeiro, Carlos; Sperhake, Ulrich
2015-01-01
The demand to obtain answers to highly complex problems within strong-field gravity has been met with significant progress in the numerical solution of Einstein's equations - along with some spectacular results - in various setups. We review techniques for solving Einstein's equations in generic spacetimes, focusing on fully nonlinear evolutions but also on how to benchmark those results with perturbative approaches. The results address problems in high-energy physics, holography, mathematical physics, fundamental physics, astrophysics and cosmology.
Emergence of Fundamental Limits in Spatially Distributed Dynamical Networks and Their Tradeoffs
2017-05-01
It is shown that the resulting non -convex optimization problem can be equivalently reformulated into a rank-constrained problem. We then...display a current ly valid OMB control number. PLEASE DO NOT RETURN YOUR FORM TO THE ABOVE ADDRESS. 1. REPORT DATE (DD-MM- YYYY) ,2. REPORT TYPE 3...robustness in distributed control and dynamical systems. Our research re- sults are highly relevant for analysis and synthesis of engineered and natural
A Perspective of the Science and Mission Challenges in Aeronomy
NASA Technical Reports Server (NTRS)
Spann, James F.
2010-01-01
There are significant fundamental problems for which aeronomy can provide solutions and a critical role in applied science and space weather that only aeronomy can address. Examples of unresolved problems include the interaction of neutral and charged, the role of mass and energy transfer across Earth's interface with space, and the predictability of ionospheric density and composition variability. These and other problems impact the productivity of space assets and thus have a tangible applied dimension. This talk will explore open science problems and barriers to potential mission solutions in an era of constrained resources.
NASA Astrophysics Data System (ADS)
Turyshev, S. G.
2009-01-01
Einstein's general theory of relativity is the standard theory of gravity, especially where the needs of astronomy, astrophysics, cosmology, and fundamental physics are concerned. As such, this theory is used for many practical purposes involving spacecraft navigation, geodesy, and time transfer. We review the foundations of general relativity, discuss recent progress in tests of relativistic gravity, and present motivations for the new generation of high-accuracy tests of new physics beyond general relativity. Space-based experiments in fundamental physics are presently capable of uniquely addressing important questions related to the fundamental laws of nature. We discuss the advances in our understanding of fundamental physics that are anticipated in the near future and evaluate the discovery potential of a number of recently proposed space-based gravitational experiments.
Brains in Jars: The Problem of Language in Neuroscientific Research
ERIC Educational Resources Information Center
Scott, Jessica A.; Curran, Christopher M.
2010-01-01
Neuroscience is a rapidly expanding scientific field, and its influence on our perceptions of fundamental aspects of human life is becoming widespread, particularly in the social and behavioral sciences. This influence has many philosophical implications, only one of which will be addressed in this article. For many centuries, philosophers have…
A Case Study of the Perceptions of Secondary School Counselors Regarding Cyberbullying
ERIC Educational Resources Information Center
King, Angela Anne Adair
2014-01-01
Cyberbullying is a relatively new phenomenon, and research in the area has been limited. Many of the fundamental concepts and definitions associated with cyberbullying are still being developed. The problem addressed was the difficulty school counselors have in ascertaining the extent of cyberbullying, in identifying incidents of cyberbullying,…
Health, Physical Education, Recreation, and Dance for the Older Adult: A Modular Approach.
ERIC Educational Resources Information Center
American Alliance for Health, Physical Education, Recreation and Dance (AAHPERD).
This book is addressed to the teacher of health, physical education, recreation, and dance courses for older adults. The first section provides the foundation for understanding gerontology. It includes fundamental concepts within the areas of sociological, physiological, and psychological aspects of aging, health problems, and nutritional status…
ERIC Educational Resources Information Center
Dhar, Vasant
1998-01-01
Shows how counterfactuals and machine learning methods can be used to guide exploration of large databases that addresses some of the fundamental problems that organizations face in learning from data. Discusses data mining, particularly in the financial arena; generating useful knowledge from data; and the evaluation of counterfactuals. (LRW)
An Arduino-Based Experiment Designed to Clarify the Transition to Total Internal Reflection
ERIC Educational Resources Information Center
Atkin, Keith
2018-01-01
The topic of refraction and reflection of light at the boundary of transparent media is a fundamentally important one. The special case of total internal reflection is however commonly misrepresented in elementary textbooks. This paper addresses the problem and describes an experimental procedure for measuring and displaying reflected and…
Revitalizing Adversary Evaluation: Deep Dark Deficits or Muddled Mistaken Musings
ERIC Educational Resources Information Center
Thurston, Paul
1978-01-01
The adversary evaluation model consists of utilizing the judicial process as a metaphor for educational evaluation. In this article, previous criticism of the model is addressed and its fundamental problems are detailed. It is speculated that the model could be improved by borrowing ideas from other legal forms of inquiry. (Author/GC)
Shao, Jing-jing; Yu, Jing-jin; Yu, Ming-zhu; Duan, Yong; Gong, Xiangguang; Chen, Zheng; Wang, Hua; Shi, Peiwu; Liang, Zhankai; Yang, Feng; Wang, Dunzhi; Yue, Jianning; Luo, Shi; Luo, Li; Wang, Weicheng; Wang, Ying; Sun, Mei; Su, Zhongxin; Ma, Ning; Xie, Hongbin; Hao, Mo
2005-03-01
To develop and demonstrate the strategies to solve the problem of public health service delivery insufficiency of disease prevention and control system of China. 205 literatures in 8 national academic journals concerning health service management have been reviewed. The method of boundary analysis has been employed to conclude the various reform strategies. Based on the causes and mechanism of public health service delivery insufficiency of disease prevention and control system, the logic analysis has been employed to develop fundamental strategies, which has been demonstrated by 154 CDC using intention questionnaires. There are fundamental strategies to which the agreeing rate for sampling CDC was over 95%: to make sure government should afford the financing function of disease prevention and control and secure the feasible investment for centers of disease prevention and control. Meanwhile, the working efficiency of CDC should be improved through strengthening management and reforming government investing manner.
Information needs related to extension service and community outreach.
Bottcher, Robert W
2003-06-01
Air quality affects everyone. Some people are affected by air quality impacts, regulations, and technological developments in several ways. Stakeholders include the medical community, ecologists, government regulators, industries, technology providers, academic professionals, concerned citizens, the news media, and elected officials. Each of these groups may perceive problems and opportunities differently, but all need access to information as it is developed. The diversity and complexity of air quality problems contribute to the challenges faced by extension and outreach professionals who must communicate with stakeholders having diverse backgrounds. Gases, particulates, biological aerosols, pathogens, and odors all require expensive and relatively complex technology to measure and control. Economic constraints affect the ability of regulators and others to measure air quality, and industry and others to control it. To address these challenges, while communicating air quality research results and concepts to stakeholders, three areas of information needs are evident. (1) A basic understanding of the fundamental concepts regarding air pollutants and their measurement and control is needed by all stakeholders; the Extension Specialist, to be effective, must help people move some distance up the learning curve. (2) Each problem or set of problems must be reasonably well defined since comprehensive solution of all problems simultaneously may not be feasible; for instance, the solution of an odor problem associated with animal production may not address atmospheric effects due to ammonia emissions. (3) The integrity of the communication process must be preserved by avoiding prejudice and protectionism; although stakeholders may seek to modify information to enhance their interests, extension and outreach professionals must be willing to present unwelcome information or admit to a lack of information. A solid grounding in fundamental concepts, careful and fair problem definition, and a resolute commitment to integrity and credibility will enable effective communication of air quality information to and among diverse stakeholders.
The National Security Strategy Under the United Nations and International Law
2004-03-19
a result of that war." This was addressed in 1951 by Hans Kelsen in a legal analysis of fundamental problems with the UN Charter. He concluded that...www.zmag.org/content/print_article.cfm>; Internet; accessed 31 January 2004. 36 Charter of the United Nations, Article 107. 37 Kearly, 27–28. 38 Hans Kelsen
Ecological Democracy: An Environmental Approach to Citizenship Education
ERIC Educational Resources Information Center
Houser, Neil O.
2009-01-01
Civic educators strive to develop the kinds of citizens who can identify and address the significant challenges of life in society. A case can be made that we have failed in this fundamental task. In spite of our efforts, contemporary societies seem ill-equipped to cope with the enormous social and environmental issues of our age. The problem is…
Building systemic capacity for nutrition: training towards a professionalised workforce for Africa.
Ellahi, Basma; Annan, Reginald; Sarkar, Swrajit; Amuna, Paul; Jackson, Alan A
2015-11-01
The fundamental role played by good nutrition in enabling personal, social and economic development is now widely recognised as presenting a fundamental global challenge that has to be addressed if major national and international problems are to be resolved in the coming decades. The recent focus provided by the Millennium Development Goals and the Scaling-Up-Nutrition (SUN) movement has been towards reducing the extent of nutrition-related malnutrition in high-burden countries. This has served to emphasise that there is a problem of inadequate professional capacity in nutrition that is sufficiently widespread to severely limit all attempts at the effective delivery and sustainability of nutrition-related and nutrition-enabling interventions that have impact at scale. Many high-burden countries are in sub-Saharan Africa where there is a high dependency on external technical support to address nutrition-related problems. We have sought to explore the nature and magnitude of the capacity needs with a particular focus on achieving levels of competency within standardised professional pre-service training which is fit-for-purpose to meet the objectives within the SUN movement in Africa. We review our experience of engaging with stakeholders through workshops, a gap analysis of the extent of the problem to be addressed, and a review of current efforts in Africa to move the agenda forward. We conclude that there are high aspirations but severely limited human resource and capacity for training that is fit-for-purpose at all skill levels in nutrition-related subjects in Africa. There are no structured or collaborative plans within professional groups to address the wide gap between what is currently available, the ongoing needs and the future expectations for meeting local technical and professional capability. Programmatic initiatives encouraged by agencies and other external players, will need to be matched by improved local capabilities to address the serious efforts required to meet the needs for sustained improvements related to SUN in high-burden countries. Importantly, there are pockets of effort which need to be encouraged within a context in which experience can be shared and mutual support provided.
Fundamental organometallic reactions: Applications on the CYBER 205
NASA Technical Reports Server (NTRS)
Rappe, A. K.
1984-01-01
Two of the most challenging problems of Organometallic chemistry (loosely defined) are pollution control with the large space velocities needed and nitrogen fixation, a process so capably done by nature and so relatively poorly done by man (industry). For a computational chemist these problems are on the fringe of what is possible with conventional computers (large models needed and accurate energetics required). A summary of the algorithmic modification needed to address these problems on a vector processor such as the CYBER 205 and a sketch of findings to date on deNOx catalysis and nitrogen fixation are presented.
ERIC Educational Resources Information Center
Bajrektarevic, Anis
2013-01-01
From Rio to Rio with Kyoto, Copenhagen and Durban in between, the conclusion remains the same: we fundamentally disagree on realities of this planet and the ways we can address them. A decisive breakthrough would necessitate both wider contexts and a larger participatory base so as to identify problems, formulate policies, and broaden and…
ERIC Educational Resources Information Center
Crowley, Jocelyn Elise; Roff, Brian H.; Lynch, Jeneve
2007-01-01
Understanding the behaviors and attitudes of at-risk populations is fundamental to controlling the spread of HIV, the virus that causes AIDS. The problem of nonresponse among these populations, however, plagues survey research designed to address these issues. Previous work undertaken to map out the dynamics of nonresponse--both noncontacts and…
Modeling of high speed chemically reacting flow-fields
NASA Technical Reports Server (NTRS)
Drummond, J. P.; Carpenter, Mark H.; Kamath, H.
1989-01-01
The SPARK3D and SPARK3D-PNS computer programs were developed to model 3-D supersonic, chemically reacting flow-fields. The SPARK3D code is a full Navier-Stokes solver, and is suitable for use in scramjet combustors and other regions where recirculation may be present. The SPARK3D-PNS is a parabolized Navier-Stokes solver and provides an efficient means of calculating steady-state combustor far-fields and nozzles. Each code has a generalized chemistry package, making modeling of any chemically reacting flow possible. Research activities by the Langley group range from addressing fundamental theoretical issues to simulating problems of practical importance. Algorithmic development includes work on higher order and upwind spatial difference schemes. Direct numerical simulations employ these algorithms to address the fundamental issues of flow stability and transition, and the chemical reaction of supersonic mixing layers and jets. It is believed that this work will lend greater insight into phenomenological model development for simulating supersonic chemically reacting flows in practical combustors. Currently, the SPARK3D and SPARK3D-PNS codes are used to study problems of engineering interest, including various injector designs and 3-D combustor-nozzle configurations. Examples, which demonstrate the capabilities of each code are presented.
Integrative endeavor for renaissance in Ayurveda
Raut, Ashwinikumar A.
2011-01-01
Currently western medicine has assumed the central position in mainstream global healthcare. Openness to learn from contemporary disciplines of basic sciences, application of modern technology and further adoption of the evidence-based approach has helped western medicine gain its currently acknowledged position as mainstream modern medicine. Modern medicine has further developed forms of integrative medicine by developing interfaces with other systems of medicine, including traditional, complementary and alternative medicine. However, these developments do not seem to address all the problems facing global health care caused by overemphasis on pharmaco-therapeutic drug developments. On the other hand, Ayurveda which is founded on genuine fundamentals, has the longest uninterrupted tradition of healthcare practice, and its holistic approach to healthcare management emphasizes disease prevention and health promotion; if it opens up to incorporate emerging new knowledge into mainstream Ayurveda, and maintains fidelity to Ayurveda fundamentals, it will certainly provide a broad-based opportunity to address the majority of the problems that have emerged from global healthcare requirements. To bring these solutions to bear, however, it will be necessary to progress from the present “utilitarian ethos” to a “unifying ethos” for realization of medical integration. PMID:21731380
Control of Flexible Structures (COFS) Flight Experiment Background and Description
NASA Technical Reports Server (NTRS)
Hanks, B. R.
1985-01-01
A fundamental problem in designing and delivering large space structures to orbit is to provide sufficient structural stiffness and static configuration precision to meet performance requirements. These requirements are directly related to control requirements and the degree of control system sophistication available to supplement the as-built structure. Background and rationale are presented for a research study in structures, structural dynamics, and controls using a relatively large, flexible beam as a focus. This experiment would address fundamental problems applicable to large, flexible space structures in general and would involve a combination of ground tests, flight behavior prediction, and instrumented orbital tests. Intended to be multidisciplinary but basic within each discipline, the experiment should provide improved understanding and confidence in making design trades between structural conservatism and control system sophistication for meeting static shape and dynamic response/stability requirements. Quantitative results should be obtained for use in improving the validity of ground tests for verifying flight performance analyses.
Addressing the missing matter problem in galaxies through a new fundamental gravitational radius
DOE Office of Scientific and Technical Information (OSTI.GOV)
Capozziello, S.; Jovanović, P.; Jovanović, V. Borka
We demonstrate that the existence of a Noether symmetry in f ( R ) theories of gravity gives rise to a further gravitational radius, besides the standard Schwarzschild one, determining the dynamics at galactic scales. By this feature, it is possible to explain the baryonic Tully-Fisher relation and the rotation curve of gas-rich galaxies without the dark matter hypothesis.
An Arduino-based experiment designed to clarify the transition to total internal reflection
NASA Astrophysics Data System (ADS)
Atkin, Keith
2018-03-01
The topic of refraction and reflection of light at the boundary of transparent media is a fundamentally important one. The special case of total internal reflection is however commonly misrepresented in elementary textbooks. This paper addresses the problem and describes an experimental procedure for measuring and displaying reflected and transmitted light intensities using readily available components and the Arduino microcontroller.
3D Data Acquisition Platform for Human Activity Understanding
2016-03-02
3D data. The support for the acquisition of such research instrumentation have significantly facilitated our current and future research and educate ...SECURITY CLASSIFICATION OF: In this project, we incorporated motion capture devices, 3D vision sensors, and EMG sensors to cross validate...multimodality data acquisition, and address fundamental research problems of representation and invariant description of 3D data, human motion modeling and
Prevention of the Post-traumatic Fibrotic Response in Joints
2014-10-01
an experimental model in mice. The American journal of forensic medicine and pathology . 1988; 9(4):310-2. 14 APPENDICES: An abstract...ongoing study addresses the critical clinical problem of posttraumatic joint stiffness, a pathology that reduces the range of motion (ROM) of injured...joints and contributes to the development of osteoarthritis. The fundamental hypothesis that drives the current study is that pathological fibrotic
Value propositions of mHealth projects.
Gorski, Irena; Bram, Joshua T; Sutermaster, Staci; Eckman, Molly; Mehta, Khanjan
While mHealth holds great potential for addressing global health disparities, a majority of the initiatives never proceed beyond the pilot stage. One fundamental concern is that mHealth projects are seldom designed from the customer's perspective to address their specific problems and/or create appreciable value. A customer-centric view, where direct tangible benefits of interventions are identified and communicated effectively, can drive customer engagement and advance projects toward self-sustaining business models. This article reviews the business models of 234 mHealth projects to identify nine distinct value propositions that solve specific problems for customers. Each of these value propositions is discussed with real-world examples, analyses of their design approaches and business strategies, and common enablers as well as hurdles to surviving past the pilot stage. Furthermore, a deeper analysis of 42 mHealth ventures that have achieved self-sustainability through project revenue provides a host of practical and poignant insights into the design of systems that can fulfil mHealth's promise to address healthcare challenges in the long term.
NASA flight cell and battery issues
NASA Technical Reports Server (NTRS)
Schulze, N. R.
1989-01-01
The author presents the important battery and cell problems, encompassing both test failures and accidents, which were encountered during the past year. Practical issues facing programs, which have to be considered in the development of a battery program strategy, are addressed. The problems of one program, the GRO (Gamma Ray Observatory), during the past year are focused on to illustrate the fundamental types of battery problems that occur. Problems encountered by other programs are briefly mentioned to complete the accounting. Two major categories of issues are defined, namely, whose which are quality and design related, i.e., problems having inherent manufacturing-process-related aspects with an impact on cell reliability, and these which are accident triggered or man induced, i.e., those operational issues having an impact on battery and cell reliability.
Varieties of second modernity: the cosmopolitan turn in social and political theory and research.
Beck, Ulrich; Grande, Edgar
2010-09-01
The theme of this special issue is the necessity of a cosmopolitan turn in social and political theory. The question at the heart of this introductory chapter takes the challenge of 'methodological cosmopolitanism', already addressed in a Special Issue on Cosmopolitan Sociology in this journal (Beck and Sznaider 2006), an important step further: How can social and political theory be opened up, theoretically as well as methodologically and normatively, to a historically new, entangled Modernity which threatens its own foundations? How can it account for the fundamental fragility, the mutability of societal dynamics (of unintended side effects, domination and power), shaped by the globalization of capital and risks at the beginning of the twenty-first century? What theoretical and methodological problems arise and how can they be addressed in empirical research? In the following, we will develop this 'cosmopolitan turn' in four steps: firstly, we present the major conceptual tools for a theory of cosmopolitan modernities; secondly, we de-construct Western modernity by using examples taken from research on individualization and risk; thirdly, we address the key problem of methodological cosmopolitanism, namely the problem of defining the appropriate unit of analysis; and finally,we discuss normative questions, perspectives, and dilemmas of a theory of cosmopolitan modernities, in particular problems of political agency and prospects of political realization.
Can compactifications solve the cosmological constant problem?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hertzberg, Mark P.; Center for Theoretical Physics, Department of Physics,Massachusetts Institute of Technology,77 Massachusetts Ave, Cambridge, MA 02139; Masoumi, Ali
2016-06-30
Recently, there have been claims in the literature that the cosmological constant problem can be dynamically solved by specific compactifications of gravity from higher-dimensional toy models. These models have the novel feature that in the four-dimensional theory, the cosmological constant Λ is much smaller than the Planck density and in fact accumulates at Λ=0. Here we show that while these are very interesting models, they do not properly address the real cosmological constant problem. As we explain, the real problem is not simply to obtain Λ that is small in Planck units in a toy model, but to explain whymore » Λ is much smaller than other mass scales (and combinations of scales) in the theory. Instead, in these toy models, all other particle mass scales have been either removed or sent to zero, thus ignoring the real problem. To this end, we provide a general argument that the included moduli masses are generically of order Hubble, so sending them to zero trivially sends the cosmological constant to zero. We also show that the fundamental Planck mass is being sent to zero, and so the central problem is trivially avoided by removing high energy physics altogether. On the other hand, by including various large mass scales from particle physics with a high fundamental Planck mass, one is faced with a real problem, whose only known solution involves accidental cancellations in a landscape.« less
Military Operations in Built-Up Areas (MOBA).
1979-01-01
jhe interim report dated 18 April 1977) "the most fundamental problem with Army C is the lack of an enforced, Systems Architecture/ Systems Engineering...materiel developer and the combat developer, data has not been collected or evaluated to adequately address this spectrum of system performance. Testing...within the MOB.A environment should be institutionalized for all systems as a standard, routine requirement. , Training for MOBA has been cursory at best
ERIC Educational Resources Information Center
Osman, Abdulaziz
2016-01-01
The purpose of this research study was to examine the unknown fears of embracing cloud computing which stretches across measurements like fear of change from leaders and the complexity of the technology in 9-1-1 dispatch centers in USA. The problem that was addressed in the study was that many 9-1-1 dispatch centers in USA are still using old…
Human factors for a sustainable future.
Thatcher, Andrew; Yeow, Paul H P
2016-11-01
Current human activities are seriously eroding the ability of natural and social systems to cope. Clearly we cannot continue along our current path without seriously damaging our own ability to survive as a species. This problem is usually framed as one of sustainability. As concerned professionals, citizens, and humans there is a strong collective will to address what we see as a failure to protect the natural and social environments that supports us. While acknowledging that we cannot do this alone, human factors and ergonomics needs to apply its relevant skills and knowledge to assist where it can in addressing the commonly identified problem areas. These problems include pollution, climate change, renewable energy, land transformation, and social unrest amongst numerous other emerging global problems. The issue of sustainability raises two fundamental questions for human factors and ergonomics: which system requires sustaining and what length of time is considered sustainable? In this paper we apply Wilson (2014) parent-sibling-child model to understanding what is required of an HFE sustainability response. This model is used to frame the papers that appear in this Special Issue. Copyright © 2016 Elsevier Ltd. All rights reserved.
Myths And Misconceptions About U.S. Health Insurance
Baicker, Katherine; Chandra, Amitabh
2009-01-01
Several myths about health insurance interfere with the diagnosis of problems in the current system and impede the development of productive reforms. Although many are built on a kernel of truth, complicated issues are often simplified to the point of being false or misleading. Several stem from the conflation of health, health care, and health insurance, while others attempt to use economic arguments to justify normative preferences. We apply a combination of economic principles and lessons from empirical research to examine the policy problems that underlie the myths and focus attention on addressing these fundamental challenges. PMID:18940834
Bridge to the future: nontraditional clinical settings, concepts and issues.
Faller, H S; Dowell, M A; Jackson, M A
1995-11-01
Healthcare restructuring in the wake of healthcare reform places greater emphasis on primary healthcare. Clinical education in acute care settings and existing community health agencies are not compatible with teaching basic concepts, principles and skills fundamental to nursing. Problems of clients in acute care settings are too complex and clients in the community are often too dispersed for necessary faculty support and supervision of beginning nursing students. Nontraditional learning settings offer the baccalaureate student the opportunity to practice fundamental skills of care and address professional skills of negotiation, assertiveness, organization, collaboration and leadership. An overview of faculty designed clinical learning experiences in nontraditional sites such as McDonald's restaurants, inner city churches, YWCA's, the campus community and homes are presented. The legal, ethical and academic issues associated with nontraditional learning settings are discussed in relation to individual empowerment, decision making and evaluation. Implications for the future address the role of the students and faculty as they interact with the community in which they live and practice.
Vavken, Patrick; Ganal-Antonio, Anne Kathleen B.; Quidde, Julia; Shen, Francis H.; Chapman, Jens R.; Samartzis, Dino
2015-01-01
Study Design A broad narrative review. Objectives Outcome assessment in spinal disorders is imperative to help monitor the safety and efficacy of the treatment in an effort to change the clinical practice and improve patient outcomes. The following article, part two of a two-part series, discusses the various outcome tools and instruments utilized to address spinal disorders and their management. Methods A thorough review of the peer-reviewed literature was performed, irrespective of language, addressing outcome research, instruments and tools, and applications. Results Numerous articles addressing the development and implementation of health-related quality-of-life, neck and low back pain, overall pain, spinal deformity, and other condition-specific outcome instruments have been reported. Their applications in the context of the clinical trial studies, the economic analyses, and overall evidence-based orthopedics have been noted. Additional issues regarding the problems and potential sources of bias utilizing outcomes scales and the concept of minimally clinically important difference were discussed. Conclusion Continuing research needs to assess the outcome instruments and tools used in the clinical outcome assessment for spinal disorders. Understanding the fundamental principles in spinal outcome assessment may also advance the field of “personalized spine care.” PMID:26225283
Conceptual, Methodological, and Ethical Problems in Communicating Uncertainty in Clinical Evidence
Han, Paul K. J.
2014-01-01
The communication of uncertainty in clinical evidence is an important endeavor that poses difficult conceptual, methodological, and ethical problems. Conceptual problems include logical paradoxes in the meaning of probability and “ambiguity”— second-order uncertainty arising from the lack of reliability, credibility, or adequacy of probability information. Methodological problems include questions about optimal methods for representing fundamental uncertainties and for communicating these uncertainties in clinical practice. Ethical problems include questions about whether communicating uncertainty enhances or diminishes patient autonomy and produces net benefits or harms. This article reviews the limited but growing literature on these problems and efforts to address them and identifies key areas of focus for future research. It is argued that the critical need moving forward is for greater conceptual clarity and consistent representational methods that make the meaning of various uncertainties understandable, and for clinical interventions to support patients in coping with uncertainty in decision making. PMID:23132891
Sanyal, Udishnu; Demirci, Umit B; Jagirdar, Balaji R; Miele, Philippe
2011-12-16
In today's era of energy crisis and global warming, hydrogen has been projected as a sustainable alternative to depleting CO(2)-emitting fossil fuels. However, its deployment as an energy source is impeded by many issues, one of the most important being storage. Chemical hydrogen storage materials, in particular B-N compounds such as ammonia borane, with a potential storage capacity of 19.6 wt % H(2) and 0.145 kg(H2)L(-1), have been intensively studied from the standpoint of addressing the storage issues. Ammonia borane undergoes dehydrogenation through hydrolysis at room temperature in the presence of a catalyst, but its practical implementation is hindered by several problems affecting all of the chemical compounds in the reaction scheme, including ammonia borane, water, borate byproducts, and hydrogen. In this Minireview, we exhaustively survey the state of the art, discuss the fundamental problems, and, where applicable, propose solutions with the prospect of technological applications. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Advances in numerical and applied mathematics
NASA Technical Reports Server (NTRS)
South, J. C., Jr. (Editor); Hussaini, M. Y. (Editor)
1986-01-01
This collection of papers covers some recent developments in numerical analysis and computational fluid dynamics. Some of these studies are of a fundamental nature. They address basic issues such as intermediate boundary conditions for approximate factorization schemes, existence and uniqueness of steady states for time dependent problems, and pitfalls of implicit time stepping. The other studies deal with modern numerical methods such as total variation diminishing schemes, higher order variants of vortex and particle methods, spectral multidomain techniques, and front tracking techniques. There is also a paper on adaptive grids. The fluid dynamics papers treat the classical problems of imcompressible flows in helically coiled pipes, vortex breakdown, and transonic flows.
Molten salt corrosion of SiC and Si3N4
NASA Technical Reports Server (NTRS)
Jacobson, N. S.; Smialek, J. L.; Fox, D. S.
1986-01-01
The most severe type of corrosion encountered in heat engines is corrosion by molten sodium sulfate, formed by the reaction of ingested sodium chloride and sulfur impurities in the fuel. This problem was studied extensively for superalloys, but only recently examined for ceramics. This problem is addressed with laboratory studies to understand the fundamental reaction mechanisms and with burner studies to provide a more realistic simulation of the conditions encountered in a heat engine. In addition the effect of corrosion on the strengths of these materials was assessed. Each of these aspects will be reviewed and some ideas toward possible solutions will be discussed.
Many-Worlds Interpretation of Quantum Theory and Mesoscopic Anthropic Principle
NASA Astrophysics Data System (ADS)
Kamenshchik, A. Yu.; Teryaev, O. V.
2008-10-01
We suggest to combine the Anthropic Principle with Many-Worlds Interpretation of Quantum Theory. Realizing the multiplicity of worlds it provides an opportunity of explanation of some important events which are assumed to be extremely improbable. The Mesoscopic Anthropic Principle suggested here is aimed to explain appearance of such events which are necessary for emergence of Life and Mind. It is complementary to Cosmological Anthropic Principle explaining the fine tuning of fundamental constants. We briefly discuss various possible applications of Mesoscopic Anthropic Principle including the Solar Eclipses and assembling of complex molecules. Besides, we address the problem of Time's Arrow in the framework of Many-World Interpretation. We suggest the recipe for disentangling of quantities defined by fundamental physical laws and by an anthropic selection.
2001-12-01
addition, the Defense Nuclear Facilities Safety Board warned in 1997 that, given likely future reductions in DOE’s budget, the department needed to...future leaders of the acquisition workforce. The Defense Nuclear Facilities Safety Board’s 2000 report credited DOE with taking steps to improve the...technical capabilities of personnel at its defense nuclear facilities , but pointed out the need for DOE’s leadership to pay increased attention to this
Computation of repetitions and regularities of biologically weighted sequences.
Christodoulakis, M; Iliopoulos, C; Mouchard, L; Perdikuri, K; Tsakalidis, A; Tsichlas, K
2006-01-01
Biological weighted sequences are used extensively in molecular biology as profiles for protein families, in the representation of binding sites and often for the representation of sequences produced by a shotgun sequencing strategy. In this paper, we address three fundamental problems in the area of biologically weighted sequences: (i) computation of repetitions, (ii) pattern matching, and (iii) computation of regularities. Our algorithms can be used as basic building blocks for more sophisticated algorithms applied on weighted sequences.
Inverse problems in complex material design: Applications to non-crystalline solids
NASA Astrophysics Data System (ADS)
Biswas, Parthapratim; Drabold, David; Elliott, Stephen
The design of complex amorphous materials is one of the fundamental problems in disordered condensed-matter science. While impressive developments of ab-initio simulation methods during the past several decades have brought tremendous success in understanding materials property from micro- to mesoscopic length scales, a major drawback is that they fail to incorporate existing knowledge of the materials in simulation methodologies. Since an essential feature of materials design is the synergy between experiment and theory, a properly developed approach to design materials should be able to exploit all available knowledge of the materials from measured experimental data. In this talk, we will address the design of complex disordered materials as an inverse problem involving experimental data and available empirical information. We show that the problem can be posed as a multi-objective non-convex optimization program, which can be addressed using a number of recently-developed bio-inspired global optimization techniques. In particular, we will discuss how a population-based stochastic search procedure can be used to determine the structure of non-crystalline solids (e.g. a-SiH, a-SiO2, amorphous graphene, and Fe and Ni clusters). The work is partially supported by NSF under Grant Nos. DMR 1507166 and 1507670.
NASA Astrophysics Data System (ADS)
Rodrigues, João Fabrício Mota; Coelho, Marco Túlio Pacheco; Ribeiro, Bruno R.
2018-04-01
Species distribution models (SDM) have been broadly used in ecology to address theoretical and practical problems. Currently, there are two main approaches to generate SDMs: (i) correlative, which is based on species occurrences and environmental predictor layers and (ii) process-based models, which are constructed based on species' functional traits and physiological tolerances. The distributions estimated by each approach are based on different components of species niche. Predictions of correlative models approach species realized niches, while predictions of process-based are more akin to species fundamental niche. Here, we integrated the predictions of fundamental and realized distributions of the freshwater turtle Trachemys dorbigni. Fundamental distribution was estimated using data of T. dorbigni's egg incubation temperature, and realized distribution was estimated using species occurrence records. Both types of distributions were estimated using the same regression approaches (logistic regression and support vector machines), both considering macroclimatic and microclimatic temperatures. The realized distribution of T. dorbigni was generally nested in its fundamental distribution reinforcing theoretical assumptions that the species' realized niche is a subset of its fundamental niche. Both modelling algorithms produced similar results but microtemperature generated better results than macrotemperature for the incubation model. Finally, our results reinforce the conclusion that species realized distributions are constrained by other factors other than just thermal tolerances.
Liebert, Wolfgang J
2013-12-01
In order to raise awareness of the ambiguous nature of scientific-technological progress, and of the challenging problems it raises, problems which are not easily addressed by courses in a single discipline and cannot be projected onto disciplinary curricula, Technical University of Darmstadt has established three interdisciplinary study concentrations: "Technology and International Development", "Environmental Sciences", and "Sustainable Shaping of Technology and Science". These three programmes seek to overcome the limitations of strictly disciplinary research and teaching by developing an integrated, problem-oriented approach. For example, one course considers fundamental nuclear dilemmas and uses role-playing techniques to address a controversy in the area of nuclear security. At the same time, incorporating interdisciplinary teaching into a university that is organized around mono- or multi-disciplinary faculties also poses a number of challenges. Recognition in disciplinary curricula, and appropriate organizational support and funding are examples of those challenges. It is expected that science and engineering students, empowered by such interdisciplinary study programmes, will be better prepared to act responsibly with regard to scientific and technological challenges.
Resource Allocation Algorithms for the Next Generation Cellular Networks
NASA Astrophysics Data System (ADS)
Amzallag, David; Raz, Danny
This chapter describes recent results addressing resource allocation problems in the context of current and future cellular technologies. We present models that capture several fundamental aspects of planning and operating these networks, and develop new approximation algorithms providing provable good solutions for the corresponding optimization problems. We mainly focus on two families of problems: cell planning and cell selection. Cell planning deals with choosing a network of base stations that can provide the required coverage of the service area with respect to the traffic requirements, available capacities, interference, and the desired QoS. Cell selection is the process of determining the cell(s) that provide service to each mobile station. Optimizing these processes is an important step towards maximizing the utilization of current and future cellular networks.
Cosmic strings - A problem or a solution?
NASA Technical Reports Server (NTRS)
Bennett, David P.; Bouchet, Francois R.
1988-01-01
The most fundamental issue in the theory of cosmic strings is addressed by means of Numerical Simulations: the existence of a scaling solution. The resolution of this question will determine whether cosmic strings can form the basis of an attractive theory of galaxy formation or prove to be a cosmological disaster like magnetic monopoles or domain walls. After a brief discussion of our numerical technique, results are presented which, though still preliminary, offer the best support to date of this scaling hypothesis.
Ultrafast and nanoscale diodes
NASA Astrophysics Data System (ADS)
Zhang, Peng; Lau, Y. Y.
2016-10-01
Charge carrier transport across interfaces of dissimilar materials (including vacuum) is the essence of all electronic devices. Ultrafast charge transport across a nanometre length scale is of fundamental importance in the miniaturization of vacuum and plasma electronics. With the combination of recent advances in electronics, photonics and nanotechnology, these miniature devices may integrate with solid-state platforms, achieving superior performance. This paper reviews recent modelling efforts on quantum tunnelling, ultrafast electron emission and transport, and electrical contact resistance. Unsolved problems and challenges in these areas are addressed.
The Hyperloop as a Source of Interesting Estimation Questions
NASA Astrophysics Data System (ADS)
Allain, Rhett
2014-03-01
The Hyperloop is a conceptual high speed transportation system proposed by Elon Musk. The basic idea uses passenger capsules inside a reduced pressure tube. Even though the actual physics of dynamic air flow in a confined space can be complicated, there are a multitude estimation problems that can be addressed. These back-of-the-envelope questions can be approximated by physicists of all levels as well as the general public and serve as a great example of the fundamental aspects of physics.
Cryptography and the Internet: lessons and challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCurley, K.S.
1996-12-31
The popularization of the Internet has brought fundamental changes to the world, because it allows a universal method of communication between computers. This carries enormous benefits with it, but also raises many security considerations. Cryptography is a fundamental technology used to provide security of computer networks, and there is currently a widespread engineering effort to incorporate cryptography into various aspects of the Internet. The system-level engineering required to provide security services for the Internet carries some important lessons for researchers whose study is focused on narrowly defined problems. It also offers challenges to the cryptographic research community by raising newmore » questions not adequately addressed by the existing body of knowledge. This paper attempts to summarize some of these lessons and challenges for the cryptographic research community.« less
A history of development in rotordynamics: A manufacturer's perspective
NASA Technical Reports Server (NTRS)
Shemeld, David E.
1987-01-01
The subject of rotordynamics and instability problems in high performance turbomachinery has been a topic of considerable industry discussion and debate over the last 15 or so years. This paper reviews an original equipment manufacturer's history of development of concepts and equipment as applicable to multistage centrifugal compressors. The variety of industry user compression requirements and resultant problematical situations tends to confound many of the theories and analytical techniques set forth. The experiences and examples described herein support the conclusion that the successful addressing of potential rotordynamics problems is best served by a fundamental knowledge of the specific equipment. This in addition to having the appropriate analytical tools. Also, that the final proof is in the doing.
Science objectives for ground- and space-based optical/IR interferometry
NASA Technical Reports Server (NTRS)
Ridgway, Stephen T.
1992-01-01
Ground-based interferometry will make spectacular strides in the next decade. However, it will always be limited by the turbulence of the terrestrial atmosphere. Some of the most exciting and subtle problems may only be addressed from a stable platform above the atmosphere. The lunar surface offers such a platform, nearly ideal in many respects. Once built, such a telescope array will not only resolve key fundamental problems, but will revolutionize virtually every topic in observational astronomy. Estimates of the possible performance of lunar and ground-based interferometers of the 21st century shows that the lunar interferometer reaches the faintest sources of all wavelengths, but has the most significant advantage in the infrared.
Promoting patient-centred fundamental care in acute healthcare systems.
Feo, Rebecca; Kitson, Alison
2016-05-01
Meeting patients' fundamental care needs is essential for optimal safety and recovery and positive experiences within any healthcare setting. There is growing international evidence, however, that these fundamentals are often poorly executed in acute care settings, resulting in patient safety threats, poorer and costly care outcomes, and dehumanising experiences for patients and families. Whilst care standards and policy initiatives are attempting to address these issues, their impact has been limited. This discussion paper explores, through a series of propositions, why fundamental care can be overlooked in sophisticated, high technology acute care settings. We argue that the central problem lies in the invisibility and subsequent devaluing of fundamental care. Such care is perceived to involve simple tasks that require little skill to execute and have minimal impact on patient outcomes. The propositions explore the potential origins of this prevailing perception, focusing upon the impact of the biomedical model, the consequences of managerial approaches that drive healthcare cultures, and the devaluing of fundamental care by nurses themselves. These multiple sources of invisibility and devaluing surrounding fundamental care have rendered the concept underdeveloped and misunderstood both conceptually and theoretically. Likewise, there remains minimal role clarification around who should be responsible for and deliver such care, and a dearth of empirical evidence and evidence-based metrics. In explicating these propositions, we argue that key to transforming the delivery of acute healthcare is a substantial shift in the conceptualisation of fundamental care. The propositions present a cogent argument that counters the prevailing perception that fundamental care is basic and does not require systematic investigation. We conclude by calling for the explicit valuing and embedding of fundamental care in healthcare education, research, practice and policy. Without this re-conceptualisation and subsequent action, poor quality, depersonalised fundamental care will prevail. Copyright © 2016 Elsevier Ltd. All rights reserved.
Nanoinformatics: a new area of research in nanomedicine
Maojo, Victor; Fritts, Martin; de la Iglesia, Diana; Cachau, Raul E; Garcia-Remesal, Miguel; Mitchell, Joyce A; Kulikowski, Casimir
2012-01-01
Over a decade ago, nanotechnologists began research on applications of nanomaterials for medicine. This research has revealed a wide range of different challenges, as well as many opportunities. Some of these challenges are strongly related to informatics issues, dealing, for instance, with the management and integration of heterogeneous information, defining nomenclatures, taxonomies and classifications for various types of nanomaterials, and research on new modeling and simulation techniques for nanoparticles. Nanoinformatics has recently emerged in the USA and Europe to address these issues. In this paper, we present a review of nanoinformatics, describing its origins, the problems it addresses, areas of interest, and examples of current research initiatives and informatics resources. We suggest that nanoinformatics could accelerate research and development in nanomedicine, as has occurred in the past in other fields. For instance, biomedical informatics served as a fundamental catalyst for the Human Genome Project, and other genomic and –omics projects, as well as the translational efforts that link resulting molecular-level research to clinical problems and findings. PMID:22866003
Nanoinformatics: a new area of research in nanomedicine.
Maojo, Victor; Fritts, Martin; de la Iglesia, Diana; Cachau, Raul E; Garcia-Remesal, Miguel; Mitchell, Joyce A; Kulikowski, Casimir
2012-01-01
Over a decade ago, nanotechnologists began research on applications of nanomaterials for medicine. This research has revealed a wide range of different challenges, as well as many opportunities. Some of these challenges are strongly related to informatics issues, dealing, for instance, with the management and integration of heterogeneous information, defining nomenclatures, taxonomies and classifications for various types of nanomaterials, and research on new modeling and simulation techniques for nanoparticles. Nanoinformatics has recently emerged in the USA and Europe to address these issues. In this paper, we present a review of nanoinformatics, describing its origins, the problems it addresses, areas of interest, and examples of current research initiatives and informatics resources. We suggest that nanoinformatics could accelerate research and development in nanomedicine, as has occurred in the past in other fields. For instance, biomedical informatics served as a fundamental catalyst for the Human Genome Project, and other genomic and -omics projects, as well as the translational efforts that link resulting molecular-level research to clinical problems and findings.
End Effects and Load Diffusion in Composite Structures
NASA Technical Reports Server (NTRS)
Horgan, Cornelius O.; Ambur, D. (Technical Monitor); Nemeth, M. P. (Technical Monitor)
2002-01-01
The research carried out here builds on our previous NASA supported research on the general topic of edge effects and load diffusion in composite structures. Further fundamental solid mechanics studies were carried out to provide a basis for assessing the complicated modeling necessary for large scale structures used by NASA. An understanding of the fundamental mechanisms of load diffusion in composite subcomponents is essential in developing primary composite structures. Specific problems recently considered were focussed on end effects in sandwich structures and for functionally graded materials. Both linear and nonlinear (geometric and material) problems have been addressed. Our goal is the development of readily applicable design formulas for the decay lengths in terms of non-dimensional material and geometric parameters. Analytical models of load diffusion behavior are extremely valuable in building an intuitive base for developing refined modeling strategies and assessing results from finite element analyses. The decay behavior of stresses and other field quantities provides a significant aid towards this process. The analysis is also amenable to parameter study with a large parameter space and should be useful in structural tailoring studies.
A Perspective on Coupled Multiscale Simulation and Validation in Nuclear Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. P. Short; D. Gaston; C. R. Stanek
2014-01-01
The field of nuclear materials encompasses numerous opportunities to address and ultimately solve longstanding industrial problems by improving the fundamental understanding of materials through the integration of experiments with multiscale modeling and high-performance simulation. A particularly noteworthy example is an ongoing study of axial power distortions in a nuclear reactor induced by corrosion deposits, known as CRUD (Chalk River unidentified deposits). We describe how progress is being made toward achieving scientific advances and technological solutions on two fronts. Specifically, the study of thermal conductivity of CRUD phases has augmented missing data as well as revealed new mechanisms. Additionally, the developmentmore » of a multiscale simulation framework shows potential for the validation of a new capability to predict the power distribution of a reactor, in effect direct evidence of technological impact. The material- and system-level challenges identified in the study of CRUD are similar to other well-known vexing problems in nuclear materials, such as irradiation accelerated corrosion, stress corrosion cracking, and void swelling; they all involve connecting materials science fundamentals at the atomistic- and mesoscales to technology challenges at the macroscale.« less
Cavopulmonary assist: (Em)powering the univentricular Fontan circulation
Rodefeld, Mark D; Frankel, Steven H; Giridharan, Guruprasad A
2011-01-01
Since the Fontan/Kreutzer procedure was introduced, evolutionary clinical advances via a staged surgical reconstructive approach have markedly improved outcomes for patients with functional single ventricle. However, significant challenges remain. Early stage mortality risk seems impenetrable. Serious morbidities - construed as immutable consequences of palliation - have hardly been addressed. Late functional status is increasingly linked to pathophysiologic consequences of prior staged procedures. As more single ventricle patients survive into adulthood, Fontan failure is emerging as an intractable problem for which there is no targeted therapy. Incremental solutions to address these ongoing problems have not had a measurable impact. Therefore, a fundamental reconsideration of the overall approach is reasonable and warranted. The ability to provide a modest pressure boost (2-6 mmHg) to existing blood flow at the total cavopulmonary connection can effectively restore more stable biventricular status. This would impact not only treatment of late Fontan failure, but also facilitate early surgical repair. A realistic means to provide such a pressure boost has never been apparent. Recent advances are beginning to unravel the unique challenges which must be addressed to realize this goal, with promise to open single ventricle palliation to new therapeutic vistas. PMID:21444049
Reconciling Scientific Curiosity and Policy Needs in Atmospheric Chemistry Research
NASA Astrophysics Data System (ADS)
Jacob, D. J.
2002-05-01
Young people generally choose a career in atmospheric chemistry because they care about the environment and want to make a difference. However, in the course of graduate training this initial motivation often becomes replaced by the more standard motivation of academic scientists: to understand the world (and get credit for it). We are taught during our Ph.D. that the more fundamental the research the better to earn the respect of our peers. And yet, in environmental research where funding is dominated by societal and policy demands, most of us have no choice but to follow this funding trail. This is not simple venality. Fortunately, most atmospheric chemists want to be societally relevant, we thrive on the spotlight thrown by society on atmospheric chemistry issues, and we are thankful that societal concerns are allowing our science to grow at a fast pace. It appears that the atmospheric chemistry community resolves its conflict between policy-driven vs. fundamental research by posting policy relevance as the canon for successful research, as the endpoint of useful work. The greatest glory then comes from picking up some fundamental knowledge along the way that provides bridges to other problems, and from uncovering new environmental problems that will require attention from policymakers. Sometimes we are frustrated, as when policymakers decide that research on our favorite problem is not needed anymore because there is now policy to address it. But of course we have to remember what got our research funded in the first place, lobby as we can, and move on. I will present, rather pretentiously, a few examples from my own research.
Fundamental solution of the problem of linear programming and method of its determination
NASA Technical Reports Server (NTRS)
Petrunin, S. V.
1978-01-01
The idea of a fundamental solution to a problem in linear programming is introduced. A method of determining the fundamental solution and of applying this method to the solution of a problem in linear programming is proposed. Numerical examples are cited.
Late-time cosmic acceleration: ABCD of dark energy and modified theories of gravity
NASA Astrophysics Data System (ADS)
Sami, M.; Myrzakulov, R.
2016-10-01
We briefly review the problems and prospects of the standard lore of dark energy. We have shown that scalar fields, in principle, cannot address the cosmological constant problem. Indeed, a fundamental scalar field is faced with a similar problem dubbed naturalness. In order to keep the discussion pedagogical, aimed at a wider audience, we have avoided technical complications in several places and resorted to heuristic arguments based on physical perceptions. We presented underlying ideas of modified theories based upon chameleon mechanism and Vainshtein screening. We have given a lucid illustration of recently investigated ghost-free nonlinear massive gravity. Again, we have sacrificed rigor and confined to the basic ideas that led to the formulation of the theory. The review ends with a brief discussion on the difficulties of the theory applied to cosmology.
An approximation algorithm for the Noah's Ark problem with random feature loss.
Hickey, Glenn; Blanchette, Mathieu; Carmi, Paz; Maheshwari, Anil; Zeh, Norbert
2011-01-01
The phylogenetic diversity (PD) of a set of species is a measure of their evolutionary distinctness based on a phylogenetic tree. PD is increasingly being adopted as an index of biodiversity in ecological conservation projects. The Noah's Ark Problem (NAP) is an NP-Hard optimization problem that abstracts a fundamental conservation challenge in asking to maximize the expected PD of a set of taxa given a fixed budget, where each taxon is associated with a cost of conservation and a probability of extinction. Only simplified instances of the problem, where one or more parameters are fixed as constants, have as of yet been addressed in the literature. Furthermore, it has been argued that PD is not an appropriate metric for models that allow information to be lost along paths in the tree. We therefore generalize the NAP to incorporate a proposed model of feature loss according to an exponential distribution and term this problem NAP with Loss (NAPL). In this paper, we present a pseudopolynomial time approximation scheme for NAPL.
Valverde, María Eugenia Rojas
2010-01-01
This article shows the significance of the problems of political harassment and violence against women in positions of political responsibility in Bolivia. This phenomenon is seen in both rural and urban areas and transcends borders. It has been shown that these attacks constitute a violation of women's civil and political rights and a threat to the physical and mental health of women leaders in Bolivia. Furthermore, there is no punishment of guilty parties, reparation, or moral or material compensation for the women who are affected. In Bolivia, gender-based harassment and violence is a fundamental barrier to women's political participation. However, this phenomenon is still not addressed by government programs and is not part of the public discourse and debate. In spite of the measures taken to promote women's political participation, several different administrations have been unable to guarantee women the capacity to occupy positions of responsibility without being threatened or harassed. The results of our research led to a bill addressing this problem. Subsequently, Ecuador took this bill as an example and replicated it in a legislative initiative. These results show the importance of research by organizations that represent women in preventing unjust situations and health problems.
Construction of a single atom trap for quantum information protocols
NASA Astrophysics Data System (ADS)
Shea, Margaret E.; Baker, Paul M.; Gauthier, Daniel J.; Duke Physics Department Team
2016-05-01
The field of quantum information science addresses outstanding problems such as achieving fundamentally secure communication and solving computationally hard problems. Great progress has been made in the field, particularly using photons coupled to ions and super conducting qubits. Neutral atoms are also interesting for these applications and though the technology for control of neutrals lags behind that of trapped ions, they offer some key advantages: primarily coupling to optical frequencies closer to the telecom band than trapped ions or superconducting qubits. Here we report progress on constructing a single atom trap for 87 Rb. This system is a promising platform for studying the technical problems facing neutral atom quantum computing. For example, most protocols destroy the trap when reading out the neutral atom's state; we will investigate an alternative non-destructive state detection scheme. We detail the experimental systems involved and the challenges addressed in trapping a single atom. All of our hardware components are off the shelf and relatively inexpensive. Unlike many other systems, we place a high numerical aperture lens inside our vacuum system to increase photon collection efficiency. We gratefully acknowledge the financial support of the ARO through Grant # W911NF1520047.
Job Management Requirements for NAS Parallel Systems and Clusters
NASA Technical Reports Server (NTRS)
Saphir, William; Tanner, Leigh Ann; Traversat, Bernard
1995-01-01
A job management system is a critical component of a production supercomputing environment, permitting oversubscribed resources to be shared fairly and efficiently. Job management systems that were originally designed for traditional vector supercomputers are not appropriate for the distributed-memory parallel supercomputers that are becoming increasingly important in the high performance computing industry. Newer job management systems offer new functionality but do not solve fundamental problems. We address some of the main issues in resource allocation and job scheduling we have encountered on two parallel computers - a 160-node IBM SP2 and a cluster of 20 high performance workstations located at the Numerical Aerodynamic Simulation facility. We describe the requirements for resource allocation and job management that are necessary to provide a production supercomputing environment on these machines, prioritizing according to difficulty and importance, and advocating a return to fundamental issues.
Nonlinear dynamics and quantum entanglement in optomechanical systems.
Wang, Guanglei; Huang, Liang; Lai, Ying-Cheng; Grebogi, Celso
2014-03-21
To search for and exploit quantum manifestations of classical nonlinear dynamics is one of the most fundamental problems in physics. Using optomechanical systems as a paradigm, we address this problem from the perspective of quantum entanglement. We uncover strong fingerprints in the quantum entanglement of two common types of classical nonlinear dynamical behaviors: periodic oscillations and quasiperiodic motion. There is a transition from the former to the latter as an experimentally adjustable parameter is changed through a critical value. Accompanying this process, except for a small region about the critical value, the degree of quantum entanglement shows a trend of continuous increase. The time evolution of the entanglement measure, e.g., logarithmic negativity, exhibits a strong dependence on the nature of classical nonlinear dynamics, constituting its signature.
Laser Spot Detection Based on Reaction Diffusion.
Vázquez-Otero, Alejandro; Khikhlukha, Danila; Solano-Altamirano, J M; Dormido, Raquel; Duro, Natividad
2016-03-01
Center-location of a laser spot is a problem of interest when the laser is used for processing and performing measurements. Measurement quality depends on correctly determining the location of the laser spot. Hence, improving and proposing algorithms for the correct location of the spots are fundamental issues in laser-based measurements. In this paper we introduce a Reaction Diffusion (RD) system as the main computational framework for robustly finding laser spot centers. The method presented is compared with a conventional approach for locating laser spots, and the experimental results indicate that RD-based computation generates reliable and precise solutions. These results confirm the flexibility of the new computational paradigm based on RD systems for addressing problems that can be reduced to a set of geometric operations.
Multimedia Analysis plus Visual Analytics = Multimedia Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chinchor, Nancy; Thomas, James J.; Wong, Pak C.
2010-10-01
Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.
Interlaboratory studies and initiatives developing standards for proteomics
Ivanov, Alexander R.; Colangelo, Christopher M.; Dufresne, Craig P.; Friedman, David B.; Lilley, Kathryn S.; Mechtler, Karl; Phinney, Brett S.; Rose, Kristie L.; Rudnick, Paul A.; Searle, Brian C.; Shaffer, Scott A.; Weintraub, Susan T.
2013-01-01
Proteomics is a rapidly transforming interdisciplinary field of research that embraces a diverse set of analytical approaches to tackle problems in fundamental and applied biology. This view-point article highlights the benefits of interlaboratory studies and standardization initiatives to enable investigators to address many of the challenges found in proteomics research. Among these initiatives, we discuss our efforts on a comprehensive performance standard for characterizing PTMs by MS that was recently developed by the Association of Biomolecular Resource Facilities (ABRF) Proteomics Standards Research Group (sPRG). PMID:23319436
Distributed computer taxonomy based on O/S structure
NASA Technical Reports Server (NTRS)
Foudriat, Edwin C.
1985-01-01
The taxonomy considers the resource structure at the operating system level. It compares a communication based taxonomy with the new taxonomy to illustrate how the latter does a better job when related to the client's view of the distributed computer. The results illustrate the fundamental features and what is required to construct fully distributed processing systems. The problem of using network computers on the space station is addressed. A detailed discussion of the taxonomy is not given here. Information is given in the form of charts and diagrams that were used to illustrate a talk.
NASA Astrophysics Data System (ADS)
Budaev, Bair V.; Bogy, David B.
2018-06-01
We extend the statistical analysis of equilibrium systems to systems with a constant heat flux. This extension leads to natural generalizations of Maxwell-Boltzmann's and Planck's equilibrium energy distributions to energy distributions of systems with a net heat flux. This development provides a long needed foundation for addressing problems of nanoscale heat transport by a systematic method based on a few fundamental principles. As an example, we consider the computation of the radiative heat flux between narrowly spaced half-spaces maintained at different temperatures.
A Real Options Approach to Quantity and Cost Optimization for Lifetime and Bridge Buys of Parts
2015-05-01
demand • Asymmetric over- and under-buy penalties • Non-negligible inventory costs • Cost of money (non-zero WACC ) • Uncertain end of support date...ROA for WACC of 12%. The optimum life- cycle buy size distribution for the example case from DES and ROA for WACC of 3%. ~1 part difference...of a variable end-of-support (EOS) date – A fundamental problem that needs to be addressed is how to set the discount factor ( WACC ) for the
Sensory-Motor Adaptation to Space Flight: Human Balance Control and Artificial Gravity
NASA Technical Reports Server (NTRS)
Paloski, William H.
2004-01-01
Gravity, which is sensed directly by the otolith organs and indirectly by proprioceptors and exteroceptors, provides the CNS a fundamental reference for estimating spatial orientation and coordinating movements in the terrestrial environment. The sustained absence of gravity during orbital space flight creates a unique environment that cannot be reproduced on Earth. Loss of this fundamental CNS reference upon insertion into orbit triggers neuro-adaptive processes that optimize performance for the microgravity environment, while its reintroduction upon return to Earth triggers neuro-adaptive processes that return performance to terrestrial norms. Five pioneering symposia on The Role of the Vestibular Organs in the Exploration of Space were convened between 1965 and 1970. These innovative meetings brought together the top physicians, physiologists, and engineers in the vestibular field to discuss and debate the challenges associated with human vestibular system adaptation to the then novel environment of space flight. These highly successful symposia addressed the perplexing problem of how to understand and ameliorate the adverse physiological effects on humans resulting from the reduction of gravitational stimulation of the vestibular receptors in space. The series resumed in 2002 with the Sixth Symposium, which focused on the microgravity environment as an essential tool for the study of fundamental vestibular functions. The three day meeting included presentations on historical perspectives, vestibular neurobiology, neurophysiology, neuroanatomy, neurotransmitter systems, theoretical considerations, spatial orientation, psychophysics, motor integration, adaptation, autonomic function, space motion sickness, clinical issues, countermeasures, and rehabilitation. Scientists and clinicians entered into lively exchanges on how to design and perform mutually productive research and countermeasure development projects in the future. The problems posed by long duration missions dominated these discussions and were driven by the paucity of data available. These issues along with more specific recommendations arising from the above discussions will be addressed an upcoming issue of the Journal of Vestibular Research.
Presser, Theresa S.; Jenni, Karen E.; Nieman, Timothy; Coleman, James
2010-01-01
Constraints on drainage management in the western San Joaquin Valley and implications of proposed approaches to management were recently evaluated by the U.S. Geological Survey (USGS). The USGS found that a significant amount of data for relevant technical issues was available and that a structured, analytical decision support tool could help optimize combinations of specific in-valley drainage management strategies, address uncertainties, and document underlying data analysis for future use. To follow-up on USGS's technical analysis and to help define a scientific basis for decisionmaking in implementing in-valley drainage management strategies, this report describes the first step (that is, a framing study) in a Decision Analysis process. In general, a Decision Analysis process includes four steps: (1) problem framing to establish the scope of the decision problem(s) and a set of fundamental objectives to evaluate potential solutions, (2) generation of strategies to address identified decision problem(s), (3) identification of uncertainties and their relationships, and (4) construction of a decision support model. Participation in such a systematic approach can help to promote consensus and to build a record of qualified supporting data for planning and implementation. In December 2008, a Decision Analysis framing study was initiated with a series of meetings designed to obtain preliminary input from key stakeholder groups on the scope of decisions relevant to drainage management that were of interest to them, and on the fundamental objectives each group considered relevant to those decisions. Two key findings of this framing study are: (1) participating stakeholders have many drainage management objectives in common; and (2) understanding the links between drainage management and water management is necessary both for sound science-based decisionmaking and for resolving stakeholder differences about the value of proposed drainage management solutions. Citing ongoing legal processes associated with drainage management in the western San Joaquin Valley, the U.S. Bureau of Reclamation (USBR) withdrew from the Decision Analysis process early in the proceedings. Without the involvement of the USBR, the USGS discontinued further development of this study.
Identifying Organic Molecules in Space: The AstroBiology Explorer (ABE) Mission Concept
NASA Technical Reports Server (NTRS)
Ennico, Kimberly; Sandford, S.; Allamandola, L.; Bregman, J.; Cohen, M.; Cruikshank, D.; Dumas, C.; Greene, T.; Hudgins, D.; Kwok, S.
2004-01-01
The AstroBiology Explorer (ABE) mission concept consists of a modest dedicated space observatory having a 60 cm class primary mirror cooled to T less than 50 K equipped with medium resolution cross-dispersed spectrometers having cooled large format near- and mid-infrared detector arrays. Such a system would be capable of addressing outstanding problems in Astrochemistry and Astrophysics that are particularly relevant to Astrobiology and addressable via astronomical observation. The mission's observaticxiai program woiild make fundamental scieztific: prngress in establishing the nature, distribution, formation and evolution of organic and other molecular materials in the following extra-terrestrial environments: 1) The Outflow of Dying Stars; 2) The Diffuse Interstellar Medium (DISM); 3) Dense Molecular Clouds, Star Formation Regions, and Young Stellar/Planetary Systems; 4) Planets, Satellites, and Small Bodies within the Solar System; and 5) The Interstellar Media of Other Galaxies ABE could make fundamental progress in all of these area by conducting a 1 to 2 year mission to obtain a coordinated set of infrared spectroscopic observations over the 2.5 - 20 micron spectral range at a spectral resolution of R greater than 2500 of about 1500 galaxies, stars, planetary nebulae, young stellar objects, and solar system objects.
NASA Technical Reports Server (NTRS)
Gjerleov, J. W.; Slavin, J. A.
2001-01-01
Of the three Mercury passes made by Mariner 10, the first and third went through the Mercury magnetosphere. The third encounter which occurred during northward IMF (interplanetary magnetic field) showed quiet time magnetic fields. In contrast the third encounter observed clear substorm signatures including dipolarization, field-aligned currents (FACs) and injection of energetic electrons at geosynchronous orbit. However, the determined cross-tail potential drop and the assumed height integrated conductance indicate that the FAC should be 2-50 times weaker than observed. We address this inconsistency and the fundamental problem of FAC closure whether this takes place in the regolith or in the exosphere. The current state of knowledge of the magnetosphere-exosphere/regolith coupling is addressed and similarities and differences to the Earth magnetosphere-ionosphere coupling are discussed.
NASA Astrophysics Data System (ADS)
Isobe, Masaharu
Hard sphere/disk systems are among the simplest models and have been used to address numerous fundamental problems in the field of statistical physics. The pioneering numerical works on the solid-fluid phase transition based on Monte Carlo (MC) and molecular dynamics (MD) methods published in 1957 represent historical milestones, which have had a significant influence on the development of computer algorithms and novel tools to obtain physical insights. This chapter addresses the works of Alder's breakthrough regarding hard sphere/disk simulation: (i) event-driven molecular dynamics, (ii) long-time tail, (iii) molasses tail, and (iv) two-dimensional melting/crystallization. From a numerical viewpoint, there are serious issues that must be overcome for further breakthrough. Here, we present a brief review of recent progress in this area.
The Aerial Regional-scale Environmental Survey (ARES) Mission to Mars
NASA Technical Reports Server (NTRS)
Levine, J. S.
2005-01-01
ARES is an exploration mission concept for an Aerial Regional-scale Environmental Survey of Mars designed to fly an instrumented platform over the surface of Mars at very low altitudes (1-3 km) for distances of hundreds to thousands of kilometers to obtain scientific data to address fundamental problems in Mars science. ARES helps to fill a gap in the scale and perspective of the Mars Exploration Program and addresses many key COMPLEX/MEPAG questions (e.g., nature and origin of crustal magnetic anomalies) not readily pursued in other parts of the exploration program. ARES supports the human exploration program through key environmental measurements and high-resolution contiguous data essential to reference mission design. Here we describe the major types of scientific goals, candidate instruments, and reference mission profiles.
Kim, Yoon Jae; Kim, Yoon Young
2010-10-01
This paper presents a numerical method for the optimization of the sequencing of solid panels, perforated panels and air gaps and their respective thickness for maximizing sound transmission loss and/or absorption. For the optimization, a method based on the topology optimization formulation is proposed. It is difficult to employ only the commonly-used material interpolation technique because the involved layers exhibit fundamentally different acoustic behavior. Thus, an optimization method formulation using a so-called unified transfer matrix is newly proposed. The key idea is to form elements of the transfer matrix such that interpolated elements by the layer design variables can be those of air, perforated and solid panel layers. The problem related to the interpolation is addressed and bench mark-type problems such as sound transmission or absorption maximization problems are solved to check the efficiency of the developed method.
Review of analytical models to stream depletion induced by pumping: Guide to model selection
NASA Astrophysics Data System (ADS)
Huang, Ching-Sheng; Yang, Tao; Yeh, Hund-Der
2018-06-01
Stream depletion due to groundwater extraction by wells may cause impact on aquatic ecosystem in streams, conflict over water rights, and contamination of water from irrigation wells near polluted streams. A variety of studies have been devoted to addressing the issue of stream depletion, but a fundamental framework for analytical modeling developed from aquifer viewpoint has not yet been found. This review shows key differences in existing models regarding the stream depletion problem and provides some guidelines for choosing a proper analytical model in solving the problem of concern. We introduce commonly used models composed of flow equations, boundary conditions, well representations and stream treatments for confined, unconfined, and leaky aquifers. They are briefly evaluated and classified according to six categories of aquifer type, flow dimension, aquifer domain, stream representation, stream channel geometry, and well type. Finally, we recommend promising analytical approaches that can solve stream depletion problem in reality with aquifer heterogeneity and irregular geometry of stream channel. Several unsolved stream depletion problems are also recommended.
The role of failure/problems in engineering: A commentary of failures experienced - lessons learned
NASA Technical Reports Server (NTRS)
Ryan, R. S.
1992-01-01
The written version of a series of seminars given to several aerospace companies and three NASA centers are presented. The results are lessons learned through a study of the problems experienced in 35 years of engineering. The basic conclusion is that the primary cause of problems has not been mission technologies, as important as technology is, but the neglect of basic principles. Undergirding this is the lack of a systems focus from determining requirements through design, verification, and operations phases. Many of the concepts discussed are fundamental to total quality management (TQM) and can be used to augment this product enhanced philosophy. Fourteen principles are addressed with problems experienced and are used as examples. Included is a discussion of the implication of constraints, poorly defined requirements, and schedules. Design guidelines, lessons learned, and future tasks are listed. Two additional sections are included that deal with personal lessons learned and thoughts on future thrusts (TQM).
The sound of moving bodies. Ph.D. Thesis - Cambridge Univ.
NASA Technical Reports Server (NTRS)
Brentner, Kenneth Steven
1990-01-01
The importance of the quadrupole source term in the Ffowcs, Williams, and Hawkings (FWH) equation was addressed. The quadrupole source contains fundamental components of the complete fluid mechanics problem, which are ignored only at the risk of error. The results made it clear that any application of the acoustic analogy should begin with all of the source terms in the FWH theory. The direct calculation of the acoustic field as part of the complete unsteady fluid mechanics problem using CFD is considered. It was shown that aeroelastic calculation can indeed be made with CFD codes. The results indicate that the acoustic field is the most susceptible component of the computation to numerical error. Therefore, the ability to measure the damping of acoustic waves is absolutely essential both to develop acoustic computations. Essential groundwork for a new approach to the problem of sound generation by moving bodies is presented. This new computational acoustic approach holds the promise of solving many problems hitherto pushed aside.
Structuring policy problems for plastics, the environment and human health: reflections from the UK
Shaxson, Louise
2009-01-01
How can we strengthen the science–policy interface for plastics, the environment and human health? In a complex policy area with multiple stakeholders, it is important to clarify the nature of the particular plastics-related issue before trying to understand how to reconcile the supply and demand for evidence in policy. This article proposes a simple problem typology to assess the fundamental characteristics of a policy issue and thus identify appropriate processes for science–policy interactions. This is illustrated with two case studies from one UK Government Department, showing how policy and science meet over the environmental problems of plastics waste in the marine environment and on land. A problem-structuring methodology helps us understand why some policy issues can be addressed through relatively linear flows of science from experts to policymakers but why others demand a more reflexive approach to brokering the knowledge between science and policy. Suggestions are given at the end of the article for practical actions that can be taken on both sides. PMID:19528061
Structuring policy problems for plastics, the environment and human health: reflections from the UK.
Shaxson, Louise
2009-07-27
How can we strengthen the science-policy interface for plastics, the environment and human health? In a complex policy area with multiple stakeholders, it is important to clarify the nature of the particular plastics-related issue before trying to understand how to reconcile the supply and demand for evidence in policy. This article proposes a simple problem typology to assess the fundamental characteristics of a policy issue and thus identify appropriate processes for science-policy interactions. This is illustrated with two case studies from one UK Government Department, showing how policy and science meet over the environmental problems of plastics waste in the marine environment and on land. A problem-structuring methodology helps us understand why some policy issues can be addressed through relatively linear flows of science from experts to policymakers but why others demand a more reflexive approach to brokering the knowledge between science and policy. Suggestions are given at the end of the article for practical actions that can be taken on both sides.
MEVTV Workshop on Early Tectonic and Volcanic Evolution of Mars
NASA Technical Reports Server (NTRS)
Frey, H. (Editor)
1988-01-01
Although not ignored, the problems of the early tectonic and volcanic evolution of Mars have generally received less attention than those later in the evolution of the planet. Specifically, much attention was devoted to the evolution of the Tharsis region of Mars and to the planet itself at the time following the establishment of this major tectonic and volcanic province. By contrast, little attention was directed at fundamental questions, such as the conditions that led to the development of Tharsis and the cause of the basic fundamental dichotomy of the Martian crust. It was to address these and related questions of the earliest evolution of Mars that a workshop was organized under the auspices of the Mars: Evolution of Volcanism, Tectonism, and Volatiles (MEVTV) Program. Four sessions were held: crustal dichotomy; crustal differentiation/volcanism; Tharsis, Elysium, and Valles Marineris; and ridges and fault tectonics.
Thermodynamics and Diffusion Coupling in Alloys—Application-Driven Science
NASA Astrophysics Data System (ADS)
Ågren, John
2012-10-01
As emphasized by Stokes (1997), the common assumption of a linear progression from basic research (science), via applied research, to technological innovations (engineering) should be questioned. In fact, society would gain much by supporting long-term research that stems from practical problems and has usefulness as a key word. Such research may be fundamental, and often, it cannot be distinguished from "basic" research if it were not for its different motivation. The development of the Calphad method and the more recent development of accompanying kinetic approaches for diffusion serve as excellent examples and are the themes of this symposium. The drivers are, e.g., the development of new materials, processes, and lifetime predictions. Many challenges of the utmost practical importance require long-term fundamental research. This presentation will address some of them, e.g., the effect of various ordering phenomena on activation barriers, and the strength and practical importance of correlation effects.
Setting objectives for managing Key deer
Diefenbach, Duane R.; Wagner, Tyler; Stauffer, Glenn E.
2014-01-01
The U.S. Fish and Wildlife Service (FWS) is responsible for the protection and management of Key deer (Odocoileus virginianus clavium) because the species is listed as Endangered under the Endangered Species Act (ESA). The purpose of the ESA is to protect and recover imperiled species and the ecosystems upon which they depend. There are a host of actions that could possibly be undertaken to recover the Key deer population, but without a clearly defined problem and stated objectives it can be difficult to compare and evaluate alternative actions. In addition, management goals and the acceptability of alternative management actions are inherently linked to stakeholders, who should be engaged throughout the process of developing a decision framework. The purpose of this project was to engage a representative group of stakeholders to develop a problem statement that captured the management problem the FWS must address with Key deer and identify objectives that, if met, would help solve the problem. In addition, the objectives were organized in a hierarchical manner (i.e., an objectives network) to show how they are linked, and measurable attributes were identified for each objective. We organized a group of people who represented stakeholders interested in and potentially affected by the management of Key deer. These stakeholders included individuals who represented local, state, and federal governments, non-governmental organizations, the general public, and local businesses. This stakeholder group met five full days over the course of an eight-week period to identify objectives that would address the following problem:“As recovery and removal from the Endangered Species list is the purpose of the Endangered Species Act, the U.S. Fish and Wildlife Service needs a management approach that will ensure a sustainable, viable, and healthy Key deer population. Urbanization has affected the behavior and population dynamics of the Key deer and the amount and characteristics of available habitat. The identified management approach must balance relevant social and economic concerns, Federal (e.g., Endangered Species Act, Wilderness Act, Refuge Act) and state regulations, and the conservation of biodiversity (e.g., Endangered/Threatened species, native habitat) in the Lower Keys.”The stakeholder group identified four fundamental objectives that are essential to addressing the problem: 1) Maximize a sustainable, viable, and healthy Key deer population, 2) Maximize value of Key deer to the People, 3) Minimize deer-related negative impacts to biodiversity, and 4) Minimize costs. In addition, the group identified 25 additional objectives that, if met, would help meet the fundamental objectives. The objectives network and measurable attributes identified by the stakeholder group can be used in the future to develop and evaluate potential management alternatives.
Predicting the evolution of spreading on complex networks
Chen, Duan-Bing; Xiao, Rui; Zeng, An
2014-01-01
Due to the wide applications, spreading processes on complex networks have been intensively studied. However, one of the most fundamental problems has not yet been well addressed: predicting the evolution of spreading based on a given snapshot of the propagation on networks. With this problem solved, one can accelerate or slow down the spreading in advance if the predicted propagation result is narrower or wider than expected. In this paper, we propose an iterative algorithm to estimate the infection probability of the spreading process and then apply it to a mean-field approach to predict the spreading coverage. The validation of the method is performed in both artificial and real networks. The results show that our method is accurate in both infection probability estimation and spreading coverage prediction. PMID:25130862
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gruenbacher, Don
2015-12-31
This project addresses both fundamental and applied research problems that will help with problems defined by the DOE “20% Wind by 2030 Report”. In particular, this work focuses on increasing the capacity of small or community wind generation capabilities that would be operated in a distributed generation approach. A consortium (KWEC – Kansas Wind Energy Consortium) of researchers from Kansas State University and Wichita State University aims to dramatically increase the penetration of wind energy via distributed wind power generation. We believe distributed generation through wind power will play a critical role in the ability to reach and extend themore » renewable energy production targets set by the Department of Energy. KWEC aims to find technical and economic solutions to enable widespread implementation of distributed renewable energy resources that would apply to wind.« less
Tulchinsky, T H; Varavikova, E A
1996-01-01
OBJECTIVES. This paper reviews Russia's health crisis, financing, and organization and public health reform needs. METHODS. The structure, policy, supply of services, and health status indicators of Russia's health system are examined. RESULTS. Longevity is declining; mortality rates from cardiovascular diseases and trauma are high and rising; maternal and infant mortality are high. Vaccine-preventable diseases have reappeared in epidemic form. Nutrition status is problematic. CONCLUSIONS. The crisis relates to Russia's economic transition, but it also goes deep into the former Soviet health system. The epidemiologic transition from a predominance of infectious to noninfectious diseases was addressed by increasing the quantity of services. The health system lacked mechanisms for epidemiologic or economic analysis and accountability to the public. Policy and funding favored hospitals over ambulatory care and individual routine checkups over community-oriented preventive approaches. Reform since 1991 has centered on national health insurance and decentralized management of services. A national health strategy to address fundamental public health problems is recommended. PMID:8604754
A Linear Kernel for Co-Path/Cycle Packing
NASA Astrophysics Data System (ADS)
Chen, Zhi-Zhong; Fellows, Michael; Fu, Bin; Jiang, Haitao; Liu, Yang; Wang, Lusheng; Zhu, Binhai
Bounded-Degree Vertex Deletion is a fundamental problem in graph theory that has new applications in computational biology. In this paper, we address a special case of Bounded-Degree Vertex Deletion, the Co-Path/Cycle Packing problem, which asks to delete as few vertices as possible such that the graph of the remaining (residual) vertices is composed of disjoint paths and simple cycles. The problem falls into the well-known class of 'node-deletion problems with hereditary properties', is hence NP-complete and unlikely to admit a polynomial time approximation algorithm with approximation factor smaller than 2. In the framework of parameterized complexity, we present a kernelization algorithm that produces a kernel with at most 37k vertices, improving on the super-linear kernel of Fellows et al.'s general theorem for Bounded-Degree Vertex Deletion. Using this kernel,and the method of bounded search trees, we devise an FPT algorithm that runs in time O *(3.24 k ). On the negative side, we show that the problem is APX-hard and unlikely to have a kernel smaller than 2k by a reduction from Vertex Cover.
NASA Astrophysics Data System (ADS)
Melious, J. O.
2012-12-01
In the northwestern corner of Washington state, a large landslide on Sumas Mountain deposits more than 100,000 cubic yards of soil containing asbestos fibers and heavy metals into Swift Creek every year. Engineers predict that asbestos-laden soils will slide into Swift Creek for at least the next 400 years. Swift Creek joins the Sumas River, which crosses the border into Canada, serving as an international delivery system for asbestos-laden soils. When the rivers flood, as happens regularly, they deliver asbestos into field, yards, and basements. The tools available to address the Swift Creek situation are at odds with the scope and nature of the problem. Asbestos regulation primarily addresses occupational settings, where exposures can be estimated. Hazardous waste regulation primarily addresses liability for abandoned waste products from human activities. Health and environmental issues relating to naturally occurring asbestos (NOA) are fundamentally different from either regulatory scheme. Liability is not a logical lever for a naturally occurring substance, the existence of which is nobody's fault, and exposures to NOA in the environment do not necessarily resemble occupational exposures. The gaps and flaws in the legal regime exacerbate the uncertainties created by uncertainties in the science. Once it is assumed that no level of exposure is safe, legal requirements adopted in very different contexts foreclose the options for addressing the Swift Creek problem. This presentation will outline the applicable laws and how they intersect with issues of risk perception, uncertainty and politics in efforts to address the Swift Creek NOA site.
Practical Approaches for Detecting Selection in Microbial Genomes.
Hedge, Jessica; Wilson, Daniel J
2016-02-01
Microbial genome evolution is shaped by a variety of selective pressures. Understanding how these processes occur can help to address important problems in microbiology by explaining observed differences in phenotypes, including virulence and resistance to antibiotics. Greater access to whole-genome sequencing provides microbiologists with the opportunity to perform large-scale analyses of selection in novel settings, such as within individual hosts. This tutorial aims to guide researchers through the fundamentals underpinning popular methods for measuring selection in pathogens. These methods are transferable to a wide variety of organisms, and the exercises provided are designed for researchers with any level of programming experience.
Implications of new petrographic analysis for the Olmec "mother culture" model.
Flannery, Kent V; Balkansky, Andrew K; Feinman, Gary M; Grove, David C; Marcus, Joyce; Redmond, Elsa M; Reynolds, Robert G; Sharer, Robert J; Spencer, Charles S; Yaeger, Jason
2005-08-09
Petrographic analysis of Formative Mexican ceramics by J. B. Stoltman et al. (see the companion piece in this issue of PNAS) refutes a recent model of Olmec "one-way" trade. In this paper, we address the model's more fundamental problems of sampling bias, anthropological implausibility, and logical non sequiturs. No bridging argument exists to link motifs on pottery to the social, political, and religious institutions of the Olmec. In addition, the model of unreciprocated exchange is implausible, given everything that the anthropological and ethnohistoric records tell us about non-Western societies of that general sociopolitical level.
On the theory of coronal heating mechanisms
NASA Technical Reports Server (NTRS)
Kuperus, M.; Ionson, J. A.; Spicer, D. S.
1980-01-01
Theoretical models describing solar coronal heating mechanisms are reviewed in some detail. The requirements of chromospheric and coronal heating are discussed in the context of the fundamental constraints encountered in modelling the outer solar atmosphere. Heating by acoustic processes in the 'nonmagnetic' parts of the atmosphere is examined with particular emphasis on the shock wave theory. Also discussed are theories of heating by electrodynamic processes in the magnetic regions of the corona, either magnetohydrodynamic waves or current heating in the regions with large electric current densities (flare type heating). Problems associated with each of the models are addressed.
Information technology challenges of biodiversity and ecosystems informatics
Schnase, J.L.; Cushing, J.; Frame, M.; Frondorf, A.; Landis, E.; Maier, D.; Silberschatz, A.
2003-01-01
Computer scientists, biologists, and natural resource managers recently met to examine the prospects for advancing computer science and information technology research by focusing on the complex and often-unique challenges found in the biodiversity and ecosystem domain. The workshop and its final report reveal that the biodiversity and ecosystem sciences are fundamentally information sciences and often address problems having distinctive attributes of scale and socio-technical complexity. The paper provides an overview of the emerging field of biodiversity and ecosystem informatics and demonstrates how the demands of biodiversity and ecosystem research can advance our understanding and use of information technologies.
European research efforts in medical knowledge-based systems.
Stefanelli, M
1993-04-01
This article describes the major projects going on in Europe in the field of Artificial Intelligence in Medicine. The important role of the Commission of the European Communities in providing the needed resources is stressed throughout the paper. Particular attention is given to the methodological and technological issues addressed by the European research teams, since the results which these teams accomplish are fundamental for a more extensive diffusion of knowledge-based systems in real medical settings. The variety of medical problems tackled shows that there is no field of medicine where the potential of advanced informatics technologies has not yet been assessed.
Introduction to autonomous mobile robotics using Lego Mindstorms NXT
NASA Astrophysics Data System (ADS)
Akın, H. Levent; Meriçli, Çetin; Meriçli, Tekin
2013-12-01
Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the Lego Mindstorms NXT kits are used as the robot platform. The aims, scope and contents of the course are presented, and the design of the laboratory sessions as well as the term projects, which address several core problems of robotics and artificial intelligence simultaneously, are explained in detail.
Modelling Cosmic-Ray Effects in the Protosolar Disk
NASA Technical Reports Server (NTRS)
Wilson, Thomas L.
2010-01-01
The role that Galactic cosmic rays (GCRs) and solar energetic particles (SEPs) play in the dynamic evolution of protosolar disks and the origin of our Solar System is a fundamental one. The GCRs are an important component of the interstellar medium (ISM), and even play a role in correcting the age determinations of some irons versus CAIs (calcium-aluminum inclusions) in meteoroids . Because CRs also are one of the energy transport mechanisms in a planetary nebula, the question of modelling their effect upon this broad subject is a serious topic for planetary science. The problem is addressed here.
Neural learning of constrained nonlinear transformations
NASA Technical Reports Server (NTRS)
Barhen, Jacob; Gulati, Sandeep; Zak, Michail
1989-01-01
Two issues that are fundamental to developing autonomous intelligent robots, namely, rudimentary learning capability and dexterous manipulation, are examined. A powerful neural learning formalism is introduced for addressing a large class of nonlinear mapping problems, including redundant manipulator inverse kinematics, commonly encountered during the design of real-time adaptive control mechanisms. Artificial neural networks with terminal attractor dynamics are used. The rapid network convergence resulting from the infinite local stability of these attractors allows the development of fast neural learning algorithms. Approaches to manipulator inverse kinematics are reviewed, the neurodynamics model is discussed, and the neural learning algorithm is presented.
Discrete Inverse and State Estimation Problems
NASA Astrophysics Data System (ADS)
Wunsch, Carl
2006-06-01
The problems of making inferences about the natural world from noisy observations and imperfect theories occur in almost all scientific disciplines. This book addresses these problems using examples taken from geophysical fluid dynamics. It focuses on discrete formulations, both static and time-varying, known variously as inverse, state estimation or data assimilation problems. Starting with fundamental algebraic and statistical ideas, the book guides the reader through a range of inference tools including the singular value decomposition, Gauss-Markov and minimum variance estimates, Kalman filters and related smoothers, and adjoint (Lagrange multiplier) methods. The final chapters discuss a variety of practical applications to geophysical flow problems. Discrete Inverse and State Estimation Problems is an ideal introduction to the topic for graduate students and researchers in oceanography, meteorology, climate dynamics, and geophysical fluid dynamics. It is also accessible to a wider scientific audience; the only prerequisite is an understanding of linear algebra. Provides a comprehensive introduction to discrete methods of inference from incomplete information Based upon 25 years of practical experience using real data and models Develops sequential and whole-domain analysis methods from simple least-squares Contains many examples and problems, and web-based support through MIT opencourseware
ERIC Educational Resources Information Center
Moore, John A.
1983-01-01
Discusses why there are creationists, fundamentalists, and evolutionists. Topics addressed include: modern, primitive and creationist thought; myths; appeal of occult; experiments in naturalistic thought; early evolution of American fundamentalism; militant fundamentalism; fundamentalist activities; Islamic fundamentalism; and others. Suggestions…
Case-based medical informatics
Pantazi, Stefan V; Arocha, José F; Moehr, Jochen R
2004-01-01
Background The "applied" nature distinguishes applied sciences from theoretical sciences. To emphasize this distinction, we begin with a general, meta-level overview of the scientific endeavor. We introduce the knowledge spectrum and four interconnected modalities of knowledge. In addition to the traditional differentiation between implicit and explicit knowledge we outline the concepts of general and individual knowledge. We connect general knowledge with the "frame problem," a fundamental issue of artificial intelligence, and individual knowledge with another important paradigm of artificial intelligence, case-based reasoning, a method of individual knowledge processing that aims at solving new problems based on the solutions to similar past problems. We outline the fundamental differences between Medical Informatics and theoretical sciences and propose that Medical Informatics research should advance individual knowledge processing (case-based reasoning) and that natural language processing research is an important step towards this goal that may have ethical implications for patient-centered health medicine. Discussion We focus on fundamental aspects of decision-making, which connect human expertise with individual knowledge processing. We continue with a knowledge spectrum perspective on biomedical knowledge and conclude that case-based reasoning is the paradigm that can advance towards personalized healthcare and that can enable the education of patients and providers. We center the discussion on formal methods of knowledge representation around the frame problem. We propose a context-dependent view on the notion of "meaning" and advocate the need for case-based reasoning research and natural language processing. In the context of memory based knowledge processing, pattern recognition, comparison and analogy-making, we conclude that while humans seem to naturally support the case-based reasoning paradigm (memory of past experiences of problem-solving and powerful case matching mechanisms), technical solutions are challenging. Finally, we discuss the major challenges for a technical solution: case record comprehensiveness, organization of information on similarity principles, development of pattern recognition and solving ethical issues. Summary Medical Informatics is an applied science that should be committed to advancing patient-centered medicine through individual knowledge processing. Case-based reasoning is the technical solution that enables a continuous individual knowledge processing and could be applied providing that challenges and ethical issues arising are addressed appropriately. PMID:15533257
Identifying Organic Molecules in Space: The AstroBiology Explorer (ABE) Mission Concept
NASA Technical Reports Server (NTRS)
Ennico, K. A.; Sandford, S. A.; Allamandola, L.; Bregman, J.; Cohen, M.; Cruikshank, D.; Dumas, C.; Greene, T.; Hudgins, D.; Kwok, S.
2004-01-01
The AstroBiology Explorer (ABE) mission concept consists of a dedicated space observatory having a 60 cm class primary mirror cooled to T < 50 K equipped with medium resolution cross-dispersed spectrometers having cooled large format near- and mid-infrared detector arrays. Such a system would be capable of addressing outstanding problems in Astrochemistry and Astrophysics that are particularly relevant to Astrobiology and addressable via astronomical observation. The mission s observational program would make fundamental scientific progress in establishing the nature, distribution, formation and evolution of organic and other molecular materials in the following extra-terrestrial environments: 1) The Outflow of Dying Stars, 2) The Diffuse Interstellar Medium, 3) Dense Molecular Clouds, Star Formation Regions, and Young StellarPlanetary Systems, 4) Planets, Satellites, and Small Bodies within the Solar System, and 5 ) The Interstellar Media of Other Galaxies. ABE could make fundamental progress in all of these areas by conducting a 1 to 2 year mission to obtain a coordinated set of infrared spectroscopic observations over the 2.5-20 micron spectral range at a spectral resolution of R > 2000 of about 1500 objects including galaxies, stars, planetary nebulae, young stellar objects, and solar system objects. Keywords: Astrobiology, infrared, Explorers, interstellar organics, telescope, spectrometer, space, infrared detectors
Spatial and Temporal Scaling of Thermal Infrared Remote Sensing Data
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Goel, Narendra S.
1995-01-01
Although remote sensing has a central role to play in the acquisition of synoptic data obtained at multiple spatial and temporal scales to facilitate our understanding of local and regional processes as they influence the global climate, the use of thermal infrared (TIR) remote sensing data in this capacity has received only minimal attention. This results from some fundamental challenges that are associated with employing TIR data collected at different space and time scales, either with the same or different sensing systems, and also from other problems that arise in applying a multiple scaled approach to the measurement of surface temperatures. In this paper, we describe some of the more important problems associated with using TIR remote sensing data obtained at different spatial and temporal scales, examine why these problems appear as impediments to using multiple scaled TIR data, and provide some suggestions for future research activities that may address these problems. We elucidate the fundamental concept of scale as it relates to remote sensing and explore how space and time relationships affect TIR data from a problem-dependency perspective. We also describe how linearity and non-linearity observation versus parameter relationships affect the quantitative analysis of TIR data. Some insight is given on how the atmosphere between target and sensor influences the accurate measurement of surface temperatures and how these effects will be compounded in analyzing multiple scaled TIR data. Last, we describe some of the challenges in modeling TIR data obtained at different space and time scales and discuss how multiple scaled TIR data can be used to provide new and important information for measuring and modeling land-atmosphere energy balance processes.
Analysis of Naval NETWAR FORCEnet Enterprise: Implications for Capabilities Based Budgeting
2006-12-01
of this background information and projecting how ADNS is likely to succeed in the NNFE framework , two fundamental research questions were addressed...background information and projecting how ADNS is likely to succeed in the NNFE framework , two fundamental research questions were addressed. The...Business Approach ......................................................26 Figure 8. Critical Assumption for Common Analytical Framework
Pacific Northwest Laboratory annual report for 1992 to the DOE Office of Energy Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grove, L.K.
1993-03-01
The 1992 Annual Report from Pacific Northwest Laboratory (PNL) to the US Department of Energy (DOE) describes research in environment and health conducted during fiscal year 1992. This report consists of four volumes oriented to particular segments of the PNL program, describing research performed for the DOE Office of Health and Environmental Research in the Office of Energy Research. The parts of the 1992 Annual Report are: Biomedical Sciences; Environmental Sciences; Atmospheric Sciences; and Physical Sciences. This Report is Part II: Environmental Sciences. Included in this report are developments in Subsurface Science, Terrestrial Science, Laboratory-Directed Research and Development, Interactions withmore » Educational Institutions, Technology Transfer, Publications, and Presentations. The research is directed toward developing a fundamental understanding of subsurface and terrestrial systems as a basis for both managing these critical resources and addressing environmental problems such as environmental restoration and global change. The Technology Transfer section of this report describes a number of examples in which fundamental research is laying the groundwork for the technology needed to resolve important environmental problems. The Interactions with Educational Institutions section of the report illustrates the results of a long-term, proactive program to make PNL facilities available for university and preuniversity education and to involve educational institutions in research programs. The areas under investigation include the effect of geochemical and physical phenomena on the diversity and function of microorganisms in deep subsurface environments, ways to address subsurface heterogeneity, and ways to determine the key biochemical and physiological pathways (and DNA markers) that control nutrient, water, and energy dynamics in arid ecosystems and the response of these systems to disturbance and climatic change.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grove, L.K.; Wildung, R.E.
1993-03-01
The 1992 Annual Report from Pacific Northwest Laboratory (PNL) to the US Department of Energy (DOE) describes research in environment and health conducted during fiscal year 1992. This report consists of four volumes oriented to particular segments of the PNL program, describing research performed for the DOE Office of Health and Environmental Research in the Office of Energy Research. The parts of the 1992 Annual Report are: Biomedical Sciences; Environmental Sciences; Atmospheric Sciences; and Physical Sciences. This Report is Part 2: Environmental Sciences. Included in this report are developments in Subsurface Science, Terrestrial Science, Laboratory-Directed Research and Development, Interactions withmore » Educational Institutions, Technology Transfer, Publications, and Presentations. The research is directed toward developing a fundamental understanding of subsurface and terrestrial systems as a basis for both managing these critical resources and addressing environmental problems such as environmental restoration and global change. The Technology Transfer section of this report describes a number of examples in which fundamental research is laying the groundwork for the technology needed to resolve important environmental problems. The Interactions with Educational Institutions section of the report illustrates the results of a long-term, proactive program to make PNL facilities available for university and preuniversity education and to involve educational institutions in research programs. The areas under investigation include the effect of geochemical and physical phenomena on the diversity and function of microorganisms in deep subsurface environments, ways to address subsurface heterogeneity, and ways to determine the key biochemical and physiological pathways (and DNA markers) that control nutrient, water, and energy dynamics in arid ecosystems and the response of these systems to disturbance and climatic change.« less
NASA Astrophysics Data System (ADS)
Baxter, C.; Rowan, J. S.; McKenzie, B. M.; Neilson, R.
2013-04-01
Soil is a key asset of natural capital, providing a myriad of goods and ecosystem services that sustain life through regulating, supporting and provisioning roles, delivered by chemical, physical and biological processes. One of the greatest threats to soil is accelerated erosion, which raises a natural process to unsustainable levels, and has downstream consequences (e.g. economic, environmental and social). Global intensification of agroecosystems is a major cause of soil erosion which, in light of predicted population growth and increased demand for food security, will continue or increase. Elevated erosion and transport is common in agroecosystems and presents a multi-disciplinary problem with direct physical impacts (e.g. soil loss), other less tangible impacts (e.g. loss of ecosystem productivity), and indirect downstream effects that necessitate an integrated approach to effectively address the problem. Climate is also likely to increase susceptibility of soil to erosion. Beyond physical response, the consequences of erosion on soil biota have hitherto been ignored, yet biota play a fundamental role in ecosystem service provision. To our knowledge few studies have addressed the gap between erosion and consequent impacts on soil biota. Transport and redistribution of soil biota by erosion is poorly understood, as is the concomitant impact on biodiversity and ability of soil to deliver the necessary range of ecosystem services to maintain function. To investigate impacts of erosion on soil biota a two-fold research approach is suggested. Physical processes involved in redistribution should be characterised and rates of transport and redistribution quantified. Similarly, cumulative and long-term impacts of biota erosion should be considered. Understanding these fundamental aspects will provide a basis upon which mitigation strategies can be considered.
Addressing the unmet need for visualizing conditional random fields in biological data
2014-01-01
Background The biological world is replete with phenomena that appear to be ideally modeled and analyzed by one archetypal statistical framework - the Graphical Probabilistic Model (GPM). The structure of GPMs is a uniquely good match for biological problems that range from aligning sequences to modeling the genome-to-phenome relationship. The fundamental questions that GPMs address involve making decisions based on a complex web of interacting factors. Unfortunately, while GPMs ideally fit many questions in biology, they are not an easy solution to apply. Building a GPM is not a simple task for an end user. Moreover, applying GPMs is also impeded by the insidious fact that the “complex web of interacting factors” inherent to a problem might be easy to define and also intractable to compute upon. Discussion We propose that the visualization sciences can contribute to many domains of the bio-sciences, by developing tools to address archetypal representation and user interaction issues in GPMs, and in particular a variety of GPM called a Conditional Random Field(CRF). CRFs bring additional power, and additional complexity, because the CRF dependency network can be conditioned on the query data. Conclusions In this manuscript we examine the shared features of several biological problems that are amenable to modeling with CRFs, highlight the challenges that existing visualization and visual analytics paradigms induce for these data, and document an experimental solution called StickWRLD which, while leaving room for improvement, has been successfully applied in several biological research projects. Software and tutorials are available at http://www.stickwrld.org/ PMID:25000815
... of the research on RS focuses on answering fundamental questions about the disorder such as how problems ... of the research on RS focuses on answering fundamental questions about the disorder such as how problems ...
Ultrasound beam transmission using a discretely orthogonal Gaussian aperture basis
NASA Astrophysics Data System (ADS)
Roberts, R. A.
2018-04-01
Work is reported on development of a computational model for ultrasound beam transmission at an arbitrary geometry transmission interface for generally anisotropic materials. The work addresses problems encountered when the fundamental assumptions of ray theory do not hold, thereby introducing errors into ray-theory-based transmission models. Specifically, problems occur when the asymptotic integral analysis underlying ray theory encounters multiple stationary phase points in close proximity, due to focusing caused by concavity on either the entry surface or a material slowness surface. The approach presented here projects integrands over both the transducer aperture and the entry surface beam footprint onto a Gaussian-derived basis set, thereby distributing the integral over a summation of second-order phase integrals which are amenable to single stationary phase point analysis. Significantly, convergence is assured provided a sufficiently fine distribution of basis functions is used.
Gyrodampers for large space structures
NASA Technical Reports Server (NTRS)
Aubrun, J. N.; Margulies, G.
1979-01-01
The problem of controlling the vibrations of a large space structures by the use of actively augmented damping devices distributed throughout the structure is addressed. The gyrodamper which consists of a set of single gimbal control moment gyros which are actively controlled to extract the structural vibratory energy through the local rotational deformations of the structure, is described and analyzed. Various linear and nonlinear dynamic simulations of gyrodamped beams are shown, including results on self-induced vibrations due to sensor noise and rotor imbalance. The complete nonlinear dynamic equations are included. The problem of designing and sizing a system of gyrodampers for a given structure, or extrapolating results for one gyrodamped structure to another is solved in terms of scaling laws. Novel scaling laws for gyro systems are derived, based upon fundamental physical principles, and various examples are given.
Toward blind removal of unwanted sound from orchestrated music
NASA Astrophysics Data System (ADS)
Chang, Soo-Young; Chun, Joohwan
2000-11-01
The problem addressed in this paper is to removing unwanted sounds from music sound. The sound to be removed could be disturbance such as cough. We shall present some preliminary results on this problem using statistical properties of signals. Our approach consists of three steps. We first estimate the fundamental frequencies and partials given noise-corrupted music sound. This gives us the autoregressive (AR) model of the music sound. Then we filter the noise-corrupted sound using the AR parameters. The filtered signal is then subtracted from the original noise-corrupted signal to get the disturbance. Finally, the obtained disturbance is used a reference signal to eliminate the disturbance from the noise- corrupted music signal. Above three steps are carried out in a recursive manner using a sliding window or an infinitely growing window with an appropriate forgetting factor.
Violence in Mexico: A social or public health problem?
Casas Patiño, Donovan; Rodríguez Torres, Alejandra; Salazar Morales, Mario Rodolfo
2016-03-08
This article seeks to explain the importance of violence as a social phenomenon and public health, trying to envision this issue not only from a curative approach to health, but from the social determinants of health, such as economics, politics and the administration of justice. Here, the younger population lacks real opportunities with an absent State that fails to provide structure. These frameworks play a fundamental role in the manifestation of violence. Thus, the debate for addressing and resolving violence opens the way to new perspectives regarding social factors as part of a public health, which cannot be oblivious to the state of the collective. Thus, the analysis of this situation shows that we cannot keep overlooking the whole picture of the real problem in the social health of our world instead of focusing on its discordant parts.
NASA Astrophysics Data System (ADS)
Ibrahim, Raouf A.
2005-06-01
The problem of liquid sloshing in moving or stationary containers remains of great concern to aerospace, civil, and nuclear engineers; physicists; designers of road tankers and ship tankers; and mathematicians. Beginning with the fundamentals of liquid sloshing theory, this book takes the reader systematically from basic theory to advanced analytical and experimental results in a self-contained and coherent format. The book is divided into four sections. Part I deals with the theory of linear liquid sloshing dynamics; Part II addresses the nonlinear theory of liquid sloshing dynamics, Faraday waves, and sloshing impacts; Part III presents the problem of linear and nonlinear interaction of liquid sloshing dynamics with elastic containers and supported structures; and Part IV considers the fluid dynamics in spinning containers and microgravity sloshing. This book will be invaluable to researchers and graduate students in mechanical and aeronautical engineering, designers of liquid containers, and applied mathematicians.
Environmental urban runoff monitoring
NASA Astrophysics Data System (ADS)
Yu, Byunggu; Behera, Pradeep K.; Kim, Seon Ho; Ramirez Rochac, Juan F.; Branham, Travis
2010-04-01
Urban stormwater runoff has been a critical and chronic problem in the quantity and quality of receiving waters, resulting in a major environmental concern. To address this problem engineers and professionals have developed a number of solutions which include various monitoring and modeling techniques. The most fundamental issue in these solutions is accurate monitoring of the quantity and quality of the runoff from both combined and separated sewer systems. This study proposes a new water quantity monitoring system, based on recent developments in sensor technology. Rather than using a single independent sensor, we harness an intelligent sensor platform that integrates various sensors, a wireless communication module, data storage, a battery, and processing power such that more comprehensive, efficient, and scalable data acquisition becomes possible. Our experimental results show the feasibility and applicability of such a sensor platform in the laboratory test setting.
Nonequilibrium statistical mechanics Brussels-Austin style
NASA Astrophysics Data System (ADS)
Bishop, Robert C.
The fundamental problem on which Ilya Prigogine and the Brussels-Austin Group have focused can be stated briefly as follows. Our observations indicate that there is an arrow of time in our experience of the world (e.g., decay of unstable radioactive atoms like uranium, or the mixing of cream in coffee). Most of the fundamental equations of physics are time reversible, however, presenting an apparent conflict between our theoretical descriptions and experimental observations. Many have thought that the observed arrow of time was either an artifact of our observations or due to very special initial conditions. An alternative approach, followed by the Brussels-Austin Group, is to consider the observed direction of time to be a basic physical phenomenon due to the dynamics of physical systems. This essay focuses mainly on recent developments in the Brussels-Austin Group after the mid-1980s. The fundamental concerns are the same as in their earlier approaches (subdynamics, similarity transformations), but the contemporary approach utilizes rigged Hilbert space (whereas the older approaches used Hilbert space). While the emphasis on nonequilibrium statistical mechanics remains the same, their more recent approach addresses the physical features of large Poincaré systems, nonlinear dynamics and the mathematical tools necessary to analyze them.
Henriksen, Ingvild Oxås; Ranøyen, Ingunn; Indredavik, Marit Sæbø; Stenseng, Frode
2017-01-01
Self-esteem is fundamentally linked to mental health, but its' role in trajectories of psychiatric problems is unclear. In particular, few studies have addressed the role of self-esteem in the development of attention problems. Hence, we examined the role of global self-esteem in the development of symptoms of anxiety/depression and attention problems, simultaneously, in a clinical sample of adolescents while accounting for gender, therapy, and medication. Longitudinal data were obtained from a sample of 201 adolescents-aged 13-18-referred to the Department of Child and Adolescent Psychiatry in Trondheim, Norway. In the baseline study, self-esteem, and symptoms of anxiety/depression and attention problems were measured by means of self-report. Participants were reassessed 3 years later, with a participation rate of 77% in the clinical sample. Analyses showed that high self-esteem at baseline predicted fewer symptoms of both anxiety/depression and attention problems 3 years later after controlling for prior symptom levels, gender, therapy (or not), and medication. Results highlight the relevance of global self-esteem in the clinical practice, not only with regard to emotional problems, but also to attention problems. Implications for clinicians, parents, and others are discussed.
Redesigning Introductory Science Courses to Teach Sustainability: Introducing the L(SC)2 Paradigm
NASA Astrophysics Data System (ADS)
Myers, J. D.; Campbell-Stone, E.; Massey, G.
2008-12-01
Modern societies consume vast quantities of Earth resources at unsustainable levels; at the same time, resource extraction, processing, production, use and disposal have resulted in environmental damage severe enough to threaten the life-support systems of our planet. These threats are produced by multiple, integrative and cumulative environmental stresses, i.e. syndromes, which result from human physical, ecological and social interactions with the environment in specific geographic places. In recent decades, recognition of this growing threat has lead to the concept of sustainability. The science needed to provide the knowledge and know-how for a successful sustainability transition differs markedly from the science that built our modern world. Sustainability science must balanced basic and applied research, promote integrative research focused on specific problems and devise a means of merging fundamental, general scientific principles with understanding of specific places. At the same time, it must use a variety of knowledge areas, i.e. biological systems, Earth systems, technological systems and social systems, to devise solutions to the many complex and difficult problems humankind faces. Clearly, sustainability science is far removed from the discipline-based science taught in most U.S. colleges. Many introductory science courses focus on content, lack context and do not integrate scientific disciplines. To prepare the citizens who will confront future sustainability issues as well as the scientists needed to devise future sustainability strategies, educators and scientists must redesign the typical college science course. A new course paradigm, Literacies and Scientific Content in Social Context (L(SC)2), is ideally suited to teach sustainability science. It offers an alternative approach to liberal science education by redefining and expanding the concept of the interdisciplinary course and merging it with the integrated science course. In addition to promoting scientific literacy, L(SC)2 courses explicitly promote mastery of fundamental quantitative and qualitative skills critical to science and commonly a barrier to student success in science. Scientific content addresses the principles and disciplines necessary to tackle the multifaceted problems that must be solved in any sustainability transition and illustrates the limitations on what can be accomplished. Finally, social context adds the place-based component that is critical to sustainability science while revealing how science impacts students' everyday lives. Experience in addressing realistic, real-life problems fosters the habits of mind necessary to address these problems and instills a sense of social and political efficacy and responsibility. The L(SC)2 course paradigm employs a variety of educational tools (active problem-based learning, collaborative work, peer instruction, interdisciplinarity, and global context-based instruction) that improve lasting comprehension by creating a more effective learning environment. In this paradigm, STEM students learn that although there may be a technically or scientifically optimal solution to a problem, it must be responsive to a society's social, legal, cultural and religious parameters. Conversely, students in non-STEM fields learn that solutions to societal problems must be scientifically valid and technologically feasible. The interaction of STEM and non-STEM students in L(SC)2 courses builds bridges between the natural and social sciences that are critical for a successful sustainability transition and lacking in most traditional science courses.
The colloquial approach: An active learning technique
NASA Astrophysics Data System (ADS)
Arce, Pedro
1994-09-01
This paper addresses the very important problem of the effectiveness of teaching methodologies in fundamental engineering courses such as transport phenomena. An active learning strategy, termed the colloquial approach, is proposed in order to increase student involvement in the learning process. This methodology is a considerable departure from traditional methods that use solo lecturing. It is based on guided discussions, and it promotes student understanding of new concepts by directing the student to construct new ideas by building upon the current knowledge and by focusing on key cases that capture the essential aspects of new concepts. The colloquial approach motivates the student to participate in discussions, to develop detailed notes, and to design (or construct) his or her own explanation for a given problem. This paper discusses the main features of the colloquial approach within the framework of other current and previous techniques. Problem-solving strategies and the need for new textbooks and for future investigations based on the colloquial approach are also outlined.
NASA Technical Reports Server (NTRS)
Stewart, E. C.; Brown, P. W.; Yenni, K. R.
1986-01-01
A simulation study was conducted to investigate the piloting problems associated with failure of an engine on a generic light twin-engine airplane. A primary piloting problem for a light twin-engine airplane after an engine failure is maintaining precise control of the airplane in the presence of large steady control forces. To address this problem, a simulated automatic trim system which drives the trim tabs as an open-loop function of propeller slipstream measurements was developed. The simulated automatic trim system was found to greatly increase the controllability in asymmetric powered flight without having to resort to complex control laws or an irreversible control system. However, the trim-tab control rates needed to produce the dramatic increase in controllability may require special design consideration for automatic trim system failures. Limited measurements obtained in full-scale flight tests confirmed the fundamental validity of the proposed control law.
Introduction to the special issue Hermann Weyl and the philosophy of the 'New Physics'
NASA Astrophysics Data System (ADS)
De Bianchi, Silvia; Catren, Gabriel
2018-02-01
This Special Issue Hermann Weyl and the Philosophy of the 'New Physics' has two main objectives: first, to shed fresh light on the relevance of Weyl's work for modern physics and, second, to evaluate the importance of Weyl's work and ideas for contemporary philosophy of physics. Regarding the first objective, this Special Issue emphasizes aspects of Weyl's work (e.g. his work on spinors in n dimensions) whose importance has recently been emerging in research fields across both mathematical and experimental physics, as well as in the history and philosophy of physics. Regarding the second objective, this Special Issue addresses the relevance of Weyl's ideas regarding important open problems in the philosophy of physics, such as the problem of characterizing scientific objectivity and the problem of providing a satisfactory interpretation of fundamental symmetries in gauge theories and quantum mechanics. In this Introduction, we sketch the state of the art in Weyl studies and we summarize the content of the contributions to the present volume.
NASA Astrophysics Data System (ADS)
Dell, Elizabeth M.; Verhoeven, Yen; Christman, Jeanne W.; Garrick, Robert D.
2018-05-01
Diverse perspectives are required to address the technological problems facing our world. Although women perform as well as their male counterparts in math and science prior to entering college, the numbers of women students entering and completing engineering programmes are far below their representation in the workforce. This paper reports on a qualitative, multiyear study of the experiences of women students in an Engineering Technology programme. The project addressed some of the unique, fundamental challenges that female students face within their programmes, and the authors describe a programmatic framework based on Self-Determination Theory as an intervention for the recruitment and retention of female engineering students. Data from focus groups and interviews show how students were supported in their undergraduate experiences and how inclusive learning environments are needed to further improve outcomes. Conceptual issues and methodological considerations of our outcomes are presented.
Atomic electron tomography: 3D structures without crystals
Miao, Jianwei; Ercius, Peter; Billinge, S. J. L.
2016-09-23
Crystallography has been fundamental to the development of many fields of science over the last century. However, much of our modern science and technology relies on materials with defects and disorders, and their three-dimensional (3D) atomic structures are not accessible to crystallography. One method capable of addressing this major challenge is atomic electron tomography. By combining advanced electron microscopes and detectors with powerful data analysis and tomographic reconstruction algorithms, it is now possible to determine the 3D atomic structure of crystal defects such as grain boundaries, stacking faults, dislocations, and point defects, as well as to precisely localize the 3Dmore » coordinates of individual atoms in materials without assuming crystallinity. In this work, we review the recent advances and the interdisciplinary science enabled by this methodology. We also outline further research needed for atomic electron tomography to address long-standing unresolved problems in the physical sciences.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miao, Jianwei; Ercius, Peter; Billinge, S. J. L.
Crystallography has been fundamental to the development of many fields of science over the last century. However, much of our modern science and technology relies on materials with defects and disorders, and their three-dimensional (3D) atomic structures are not accessible to crystallography. One method capable of addressing this major challenge is atomic electron tomography. By combining advanced electron microscopes and detectors with powerful data analysis and tomographic reconstruction algorithms, it is now possible to determine the 3D atomic structure of crystal defects such as grain boundaries, stacking faults, dislocations, and point defects, as well as to precisely localize the 3Dmore » coordinates of individual atoms in materials without assuming crystallinity. In this work, we review the recent advances and the interdisciplinary science enabled by this methodology. We also outline further research needed for atomic electron tomography to address long-standing unresolved problems in the physical sciences.« less
CODATA Fundamental Physical Constants
National Institute of Standards and Technology Data Gateway
SRD 121 NIST CODATA Fundamental Physical Constants (Web, free access) This site, developed in the Physics Laboratory at NIST, addresses three topics: fundamental physical constants, the International System of Units (SI), which is the modern metric system, and expressing the uncertainty of measurement results.
Ethical principles in health research and review process.
Tangwa, Godfrey B
2009-11-01
In this paper I want to reflect on the fundamental ethical principles and their application in different particular contexts, especially in health research and the ethics review process. Four fundamental ethical principles have been identified and widely discussed in bioethical literature. These principles namely are: autonomy or respect for others, beneficence, non-maleficence and justice. These principles have cross-cultural validity, relevance and applicability. Every real-life situation and every concrete particular case in which ethical decision-making is called-for is unique and different from all others; but the same fundamental ethical principles are relevant and used in addressing all such cases and situations. Very often ethical problems will present themselves in the form of dilemmas and it is then necessary to use the same fundamental principles to analyze the situations, to argue persuasively and cogently with competence for the best options or choices in such situations. The issues I will be dealing with in this paper are necessarily more abstract and theoretical, but we will be discussing them from a very practical viewpoint and impulse, with a view to application in concrete real-life situations. The paper ends with some sample practical examples of cases that the reader can use to test his/her grasp of the principles, how to apply them, how to balance them in differing situations and contexts and how to adjudicate between them when they seem to be in conflict.
Coordinating Multi-Rover Systems: Evaluation Functions for Dynamic and Noisy Environments
NASA Technical Reports Server (NTRS)
Turner, Kagan; Agogino, Adrian
2005-01-01
This paper addresses the evolution of control strategies for a collective: a set of entities that collectively strives to maximize a global evaluation function that rates the performance of the full system. Directly addressing such problems by having a population of collectives and applying the evolutionary algorithm to that population is appealing, but the search space is prohibitively large in most cases. Instead, we focus on evolving control policies for each member of the collective. The fundamental issue in this approach is how to create an evaluation function for each member of the collective that is both aligned with the global evaluation function and is sensitive to the fitness changes of the member, while relatively insensitive to the fitness changes of other members. We show how to construct evaluation functions in dynamic, noisy and communication-limited collective environments. On a rover coordination problem, a control policy evolved using aligned and member-sensitive evaluations outperfoms global evaluation methods by up to 400%. More notably, in the presence of a larger number of rovers or rovers with noisy and communication limited sensors, the proposed method outperforms global evaluation by a higher percentage than in noise-free conditions with a small number of rovers.
Combustion Fundamentals Research
NASA Technical Reports Server (NTRS)
1983-01-01
Increased emphasis is placed on fundamental and generic research at Lewis Research Center with less systems development efforts. This is especially true in combustion research, where the study of combustion fundamentals has grown significantly in order to better address the perceived long term technical needs of the aerospace industry. The main thrusts for this combustion fundamentals program area are as follows: analytical models of combustion processes, model verification experiments, fundamental combustion experiments, and advanced numeric techniques.
NASA Astrophysics Data System (ADS)
Thompson, S. E.; Sivapalan, M.; Harman, C. J.; Srinivasan, V.; Hipsey, M. R.; Reed, P.; Montanari, A.; Blöschl, G.
2013-06-01
Globally, many different kinds of water resources management issues call for policy and infrastructure based responses. Yet responsible decision making about water resources management raises a fundamental challenge for hydrologists: making predictions about water resources on decadal-to-century long timescales. Obtaining insight into hydrologic futures over 100 yr timescales forces researchers to address internal and exogenous changes in the properties of hydrologic systems. To do this, new hydrologic research must identify, describe and model feedbacks between water and other changing, coupled environmental subsystems. These models must be constrained to yield useful insights, despite the many likely sources of uncertainty in their predictions. Chief among these uncertainties are the impacts of the increasing role of human intervention in the global water cycle - a defining challenge for hydrology in the Anthropocene. Here we present a research agenda that proposes a suite of strategies to address these challenges. The research agenda focuses on the development of co-evolutionary hydrologic modeling to explore coupling across systems, and to address the implications of this coupling on the long-time behavior of the coupled systems. Three research directions support the development of these models: hydrologic reconstruction, comparative hydrology and model-data learning. These strategies focus on understanding hydrologic processes and feedbacks over long timescales, across many locations, and through strategic coupling of observational and model data in specific systems. We highlight the value of use-inspired and team-based science that is motivated by real-world hydrologic problems but targets improvements in fundamental understanding to support decision-making and management.
Equatorial oceanography. [review of research
NASA Technical Reports Server (NTRS)
Cane, M. A.; Sarachik, E. S.
1983-01-01
United States progress in equatorial oceanography is reviewed, focusing on the low frequency response of upper equatorial oceans to forcing by the wind. Variations of thermocline depth, midocean currents, and boundary currents are discussed. The factors which determine sea surface temperature (SST) variability in equatorial oceans are reviewed, and the status of understanding of the most spectacular manifestation of SST variability, the El Nino-Southern Oscillation phenomenon, is discussed. The problem of observing surface winds, regarded as a fundamental factor limiting understanding of the equatorial oceans, is addressed. Finally, an attempt is made to identify those current trends which are expected to bear fruit in the near and distant future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-04-01
This 1993 Annual Report from Pacific Northwest Laboratory (PNL) to the US DOE describes research in environment and health conducted during fiscal year (FY) 1993. The report is divided into four parts, each in a separate volume. This part, Volume 2, covers Environmental Sciences. The research is directed toward developing a fundamental understanding of subsurface and terrestrial systems as a basis for both managing these critical resources and addressing environmental problems such as environmental restoration and global change. There are sections on Subsurface Science, Terrestrial Science, Technology Transfer, Interactions with Educational Institutions, and Laboratory Directed Research and Development.
Generation method of synthetic training data for mobile OCR system
NASA Astrophysics Data System (ADS)
Chernyshova, Yulia S.; Gayer, Alexander V.; Sheshkus, Alexander V.
2018-04-01
This paper addresses one of the fundamental problems of machine learning - training data acquiring. Obtaining enough natural training data is rather difficult and expensive. In last years usage of synthetic images has become more beneficial as it allows to save human time and also to provide a huge number of images which otherwise would be difficult to obtain. However, for successful learning on artificial dataset one should try to reduce the gap between natural and synthetic data distributions. In this paper we describe an algorithm which allows to create artificial training datasets for OCR systems using russian passport as a case study.
A strategy for Earth science from space in the 1980s. Part 1: Solid earth and oceans
NASA Technical Reports Server (NTRS)
1982-01-01
The report develops a ten-year science strategy for investigating the solid earth and dynamics of world oceans from Earth orbit. The strategy begins from the premise that earth studies have proceeded to the point where further advances in understanding Earth processes must be based on a global perspective and that the U.S. is technically ready to begin a global study approach from Earth orbit. The major areas of study and their fundamental problems are identified. The strategy defines the primary science objectives to be addressed and the essential measurements and precision to achieve them.
How Should Energy Be Defined Throughout Schooling?
NASA Astrophysics Data System (ADS)
Bächtold, Manuel
2017-02-01
The question of how to teach energy has been renewed by recent studies focusing on the learning and teaching progressions for this concept. In this context, one question has been, for the most part, overlooked: how should energy be defined throughout schooling? This paper addresses this question in three steps. We first identify and discuss two main approaches in physics concerning the definition of energy, one claiming there is no satisfactory definition and taking conservation as a fundamental property, and the other based on Rankine's definition of energy as the capacity of a system to produce changes. We then present a study concerning how energy is actually defined throughout schooling in the case of France by analyzing national programs, physics textbooks, and the answers of teachers to a questionnaire. This study brings to light a consistency problem in the way energy is defined across school years: in primary school, an adapted version of Rankine's definition is introduced and conservation is ignored; in high school, conservation is introduced and Rankine's definition is ignored. Finally, we address this consistency problem by discussing possible teaching progressions. We argue in favor of the use of Rankine's definition throughout schooling: at primary school, it is a possible substitute to students' erroneous conceptions; at secondary school, it might help students become aware of the unifying role of energy and thereby overcome the compartmentalization problem.
Tackling racism as a "wicked" public health problem: Enabling allies in anti-racism praxis.
Came, Heather; Griffith, Derek
2018-02-01
Racism is a "wicked" public health problem that fuels systemic health inequities between population groups in New Zealand, the United States and elsewhere. While literature has examined racism and its effects on health, the work describing how to intervene to address racism in public health is less developed. While the notion of raising awareness of racism through socio-political education is not new, given the way racism has morphed into new narratives in health institutional settings, it has become critical to support allies to make informing efforts to address racism as a fundamental cause of health inequities. In this paper, we make the case for anti-racism praxis as a tool to address inequities in public health, and focus on describing an anti-racism praxis framework to inform the training and support of allies. The limited work on anti-racism rarely articulates the unique challenges or needs of allies or targets of racism, but we seek to help fill that gap. Our anti-racism praxis for allies includes five core elements: reflexive relational praxis, structural power analysis, socio-political education, monitoring and evaluation and systems change approaches. We recognize that racism is a modifiable determinant of health and racial inequities can be eliminated with the necessary political will and a planned system change approach. Anti-racism praxis provides the tools to examine the interconnection and interdependence of cultural and institutional factors as a foundation for examining where and how to intervene to address racism. Copyright © 2017 Elsevier Ltd. All rights reserved.
Addressing population health and health inequalities: the role of fundamental causes.
Cerdá, Magdalena; Tracy, Melissa; Ahern, Jennifer; Galea, Sandro
2014-09-01
As a case study of the impact of universal versus targeted interventions on population health and health inequalities, we used simulations to examine (1) whether universal or targeted manipulations of collective efficacy better reduced population-level rates and racial/ethnic inequalities in violent victimization; and (2) whether experiments reduced disparities without addressing fundamental causes. We applied agent-based simulation techniques to the specific example of an intervention on neighborhood collective efficacy to reduce population-level rates and racial/ethnic inequalities in violent victimization. The agent population consisted of 4000 individuals aged 18 years and older with sociodemographic characteristics assigned to match distributions of the adult population in New York City according to the 2000 U.S. Census. Universal experiments reduced rates of victimization more than targeted experiments. However, neither experiment reduced inequalities. To reduce inequalities, it was necessary to eliminate racial/ethnic residential segregation. These simulations support the use of universal intervention but suggest that it is not possible to address inequalities in health without first addressing fundamental causes.
Fundamental care guided by the Careful Nursing Philosophy and Professional Practice Model©.
Meehan, Therese Connell; Timmins, Fiona; Burke, Jacqueline
2018-02-05
To propose the Careful Nursing Philosophy and Professional Practice Model © as a conceptual and practice solution to current fundamental nursing care erosion and deficits. There is growing awareness of the crucial importance of fundamental care. Efforts are underway to heighten nurses' awareness of values that motivate fundamental care and thereby increase their attention to effective provision of fundamental care. However, there remains a need for nursing frameworks which motivate nurses to bring fundamental care values to life in their practice and strengthen their commitment to provide fundamental care. This descriptive position paper builds on the Careful Nursing Philosophy and Professional Practice Model © (Careful Nursing). Careful Nursing elaborates explicit nursing values and addresses both relational and pragmatic aspects of nursing practice, offering an ideal guide to provision of fundamental nursing care. A comparative alignment approach is used to review the capacity of Careful Nursing to address fundamentals of nursing care. Careful Nursing provides a value-based comprehensive and practical framework which can strengthen clinical nurses' ability to articulate and control their practice and, thereby, more effectively fulfil their responsibility to provide fundamental care and measure its effectiveness. This explicitly value-based nursing philosophy and professional practice model offers nurses a comprehensive, pragmatic and engaging framework designed to strengthen their control over their practice and ability to provide high-quality fundamental nursing care. © 2018 John Wiley & Sons Ltd.
Medical sociology as a vocation.
Bosk, Charles L
2014-12-01
This article extends Weber's discussion of science as a vocation by applying it to medical sociology. Having used qualitative methods for nearly 40 years to interpret problems of meaning as they arise in the context of health care, I describe how ethnography, in particular, and qualitative inquiry, more generally, may be used as a tool for understanding fundamental questions close to the heart but far from the mind of medical sociology. Such questions overlap with major policy questions such as how do we achieve a higher standard for quality of care and assure the safety of patients. Using my own research, I show how this engagement takes the form of showing how simple narratives of policy change fail to address the complexities of the problems that they are designed to remedy. I also attempt to explain how I balance objectivity with a commitment to creating a more equitable framework for health care. © American Sociological Association 2014.
Oliveri, Paolo
2017-08-22
Qualitative data modelling is a fundamental branch of pattern recognition, with many applications in analytical chemistry, and embraces two main families: discriminant and class-modelling methods. The first strategy is appropriate when at least two classes are meaningfully defined in the problem under study, while the second strategy is the right choice when the focus is on a single class. For this reason, class-modelling methods are also referred to as one-class classifiers. Although, in the food analytical field, most of the issues would be properly addressed by class-modelling strategies, the use of such techniques is rather limited and, in many cases, discriminant methods are forcedly used for one-class problems, introducing a bias in the outcomes. Key aspects related to the development, optimisation and validation of suitable class models for the characterisation of food products are critically analysed and discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
Loop Quantum Gravity and Asymptotically Flat Spaces
NASA Astrophysics Data System (ADS)
Arnsdorf, Matthias
2002-12-01
Remarkable progress has been made in the field of non-perturbative (loop) quantum gravity in the last decade or so and it is now a rigorously defined kinematical theory (c.f. [5] for a review and references). We are now at the stage where physical applications of loop quantum gravity can be studied and used to provide checks for the consistency of the quantisation programme. Equally, old fundamental problems of canonical quantum gravity such as the problem of time or the interpretation of quantum cosmology need to be reevaluated seriously. These issues can be addressed most profitably in the asymptotically flat sector of quantum gravity. Indeed, it is likely that we should obtain a quantum theory for this special case even if it is not possible to quantise full general relativity. The purpose of this summary is to advertise the extension of loop quantum gravity to this sector that was developed in [1]...
A Science Superior to Music: Joseph Sauveur and the Estrangement between Music and Acoustics
NASA Astrophysics Data System (ADS)
Fix, Adam
2015-09-01
The scientific revolution saw a shift from the natural philosophy of music to the science of acoustics. Joseph Sauveur (1653-1716), an early pioneer in acoustics, determined that science as understood in the eighteenth century could not address the fundamental problems of music, particularly the problem of consonance. Building on Descartes, Mersenne, and Huygens especially, Sauveur drew a sharp divide between sound and music, recognizing the former as a physical phenomenon obeying mechanical and mathematical principles and the latter as an inescapably subjective and unquantifiable perception. While acoustics grew prominent in the Académie des sciences, music largely fell out of the scientific discourse, becoming primarily practiced art rather than natural philosophy. This study illuminates what was considered proper science at the dawn of the Enlightenment and why one particular branch of natural philosophy—music—did not make the cut.
A pervasive parallel framework for visualization: final report for FWP 10-014707
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth D.
2014-01-01
We are on the threshold of a transformative change in the basic architecture of highperformance computing. The use of accelerator processors, characterized by large core counts, shared but asymmetrical memory, and heavy thread loading, is quickly becoming the norm in high performance computing. These accelerators represent significant challenges in updating our existing base of software. An intrinsic problem with this transition is a fundamental programming shift from message passing processes to much more fine thread scheduling with memory sharing. Another problem is the lack of stability in accelerator implementation; processor and compiler technology is currently changing rapidly. This report documentsmore » the results of our three-year ASCR project to address these challenges. Our project includes the development of the Dax toolkit, which contains the beginnings of new algorithms for a new generation of computers and the underlying infrastructure to rapidly prototype and build further algorithms as necessary.« less
Learning-Based Adaptive Optimal Tracking Control of Strict-Feedback Nonlinear Systems.
Gao, Weinan; Jiang, Zhong-Ping; Weinan Gao; Zhong-Ping Jiang; Gao, Weinan; Jiang, Zhong-Ping
2018-06-01
This paper proposes a novel data-driven control approach to address the problem of adaptive optimal tracking for a class of nonlinear systems taking the strict-feedback form. Adaptive dynamic programming (ADP) and nonlinear output regulation theories are integrated for the first time to compute an adaptive near-optimal tracker without any a priori knowledge of the system dynamics. Fundamentally different from adaptive optimal stabilization problems, the solution to a Hamilton-Jacobi-Bellman (HJB) equation, not necessarily a positive definite function, cannot be approximated through the existing iterative methods. This paper proposes a novel policy iteration technique for solving positive semidefinite HJB equations with rigorous convergence analysis. A two-phase data-driven learning method is developed and implemented online by ADP. The efficacy of the proposed adaptive optimal tracking control methodology is demonstrated via a Van der Pol oscillator with time-varying exogenous signals.
Thermodynamics fundamentals of energy conversion
NASA Astrophysics Data System (ADS)
Dan, Nicolae
The work reported in the chapters 1-5 focuses on the fundamentals of heat transfer, fluid dynamics, thermodynamics and electrical phenomena related to the conversion of one form of energy to another. Chapter 6 is a re-examination of the fundamental heat transfer problem of how to connect a finite-size heat generating volume to a concentrated sink. Chapter 1 extends to electrical machines the combined thermodynamics and heat transfer optimization approach that has been developed for heat engines. The conversion efficiency at maximum power is 1/2. When, as in specific applications, the operating temperature of windings must not exceed a specified level, the power output is lower and efficiency higher. Chapter 2 addresses the fundamental problem of determining the optimal history (regime of operation) of a battery so that the work output is maximum. Chapters 3 and 4 report the energy conversion aspects of an expanding mixture of hot particles, steam and liquid water. At the elemental level, steam annuli develop around the spherical drops as time increases. At the mixture level, the density decreases while the pressure and velocity increases. Chapter 4 describes numerically, based on the finite element method, the time evolution of the expanding mixture of hot spherical particles, steam and water. The fluid particles are moved in time in a Lagrangian manner to simulate the change of the domain configuration. Chapter 5 describes the process of thermal interaction between the molten material and water. In the second part of the chapter the model accounts for the irreversibility due to the flow of the mixture through the cracks of the mixing vessel. The approach presented in this chapter is based on exergy analysis and represents a departure from the line of inquiry that was followed in chapters 3-4. Chapter 6 shows that the geometry of the heat flow path between a volume and one point can be optimized in two fundamentally different ways. In the "growth" method the structure is optimized starting from the smallest volume element of fixed size. In "design" method the overall volume is fixed, and the designer works "inward" by increasing the internal complexity of the paths for heat flow.
An Explorer-Class Astrobiology Mission
NASA Technical Reports Server (NTRS)
Sandford, Scott; Greene, Thomas; Allamandola, Louis; Arno, Roger; Bregman, Jesse; Cox, Sylvia; Davis, Paul K.; Gonzales, Andrew; Haas, Michael; Hanel, Robert;
2000-01-01
In this paper we describe a potential new Explorer-class space mission, the AstroBiology Explorer (ABE), consisting of a relatively modest dedicated space observatory having a 50 cm aperture primary mirror which is passively cooled to T less than 65 K, resides in a low-background orbit (heliocentric orbit at 1 AU, Earth drift-away), and is equipped with a suite of three moderate order (m approx. 10) dispersive spectrographs equipped with first-order cross-dispersers in an "echellette" configuration and large format (1024xl024 pixel) near- and mid-IR detector arrays cooled by a modest amount of cryogen. Such a system would be capable of addressing outstanding problems in Astrochemistry and Astrophysics that are particularly relevant to Astrobiology and addressable via astronomical observation. The observational program of this mission would make fundamental scientific progress in each of the key areas of the cosmic history of molecular carbon, the distribution and chemistry of organic compounds in the diffuse and dense interstellar media, and the evolution of ices and organic matter in young planetary systems. ABE could make fundamental progress in all of these areas by conducting an approximately one year mission to obtain a coordinated set of infrared spectroscopic observations over the 2.5-20 micrometers spectral range at spectral resolutions of R greater than or equal to 1000 of approximately 1000 galaxies, stars, planetary nebulae, and young star planetary systems.
Unesco in Asia and the Pacific: 40 years on.
1986-11-01
The United Nations Education. Scientific. and Cultural Organization (UNESCO) has for more that 40 years helped build schools, train teachers, produce educational materials, print textbooks, develop curricula, formulate educational policies, plan short and long educational strategies in Asia and the Pacific areas. It has restored and preserved cultural monuments, rare manuscripts, forms of music and plays, and has translated works from national to international languages. It has brought scientists from around the world to address problems such as environment, vegetation, water and marine life to discover common solutions. It has brought social scientists together to address human rights, fundamental freedoms and nation building issues. It assisted in building communications infrastructures, training and works to provide a better flow of information between countries and regions. This bulletin provides information on UNESCO's activities in Asia and the Pacific. Educational activities include universal primary education, eradication of illiteracy, higher education, science, teacher education, population education, cultural activities and social and human sciences. Other activities include, educational reform in India, Japan, Malaysia, Pakistan, Australia, and New Zealand. Recent developments in upgrading science education and future challenges to educational reform is being pursued. UNESCO's fundamental purpose is to break down the barriers of prejudice and ignorance and improve the knowledge of other cultures. To develop a lasting peace, a people to people relationship must be developed that will generate a world wide intellectual and moral solidarity that will prevent tendencies toward confrontation.
Focus on the emerging new fields of network physiology and network medicine
NASA Astrophysics Data System (ADS)
Ivanov, Plamen Ch; Liu, Kang K. L.; Bartsch, Ronny P.
2016-10-01
Despite the vast progress and achievements in systems biology and integrative physiology in the last decades, there is still a significant gap in understanding the mechanisms through which (i) genomic, proteomic and metabolic factors and signaling pathways impact vertical processes across cells, tissues and organs leading to the expression of different disease phenotypes and influence the functional and clinical associations between diseases, and (ii) how diverse physiological systems and organs coordinate their functions over a broad range of space and time scales and horizontally integrate to generate distinct physiologic states at the organism level. Two emerging fields, network medicine and network physiology, aim to address these fundamental questions. Novel concepts and approaches derived from recent advances in network theory, coupled dynamical systems, statistical and computational physics show promise to provide new insights into the complexity of physiological structure and function in health and disease, bridging the genetic and sub-cellular level with inter-cellular interactions and communications among integrated organ systems and sub-systems. These advances form first building blocks in the methodological formalism and theoretical framework necessary to address fundamental problems and challenges in physiology and medicine. This ‘focus on’ issue contains 26 articles representing state-of-the-art contributions covering diverse systems from the sub-cellular to the organism level where physicists have key role in laying the foundations of these new fields.
Biomedical engineering strategies in system design space.
Savageau, Michael A
2011-04-01
Modern systems biology and synthetic bioengineering face two major challenges in relating properties of the genetic components of a natural or engineered system to its integrated behavior. The first is the fundamental unsolved problem of relating the digital representation of the genotype to the analog representation of the parameters for the molecular components. For example, knowing the DNA sequence does not allow one to determine the kinetic parameters of an enzyme. The second is the fundamental unsolved problem of relating the parameters of the components and the environment to the phenotype of the global system. For example, knowing the parameters does not tell one how many qualitatively distinct phenotypes are in the organism's repertoire or the relative fitness of the phenotypes in different environments. These also are challenges for biomedical engineers as they attempt to develop therapeutic strategies to treat pathology or to redirect normal cellular functions for biotechnological purposes. In this article, the second of these fundamental challenges will be addressed, and the notion of a "system design space" for relating the parameter space of components to the phenotype space of bioengineering systems will be focused upon. First, the concept of a system design space will be motivated by introducing one of its key components from an intuitive perspective. Second, a simple linear example will be used to illustrate a generic method for constructing the design space in which qualitatively distinct phenotypes can be identified and counted, their fitness analyzed and compared, and their tolerance to change measured. Third, two examples of nonlinear systems from different areas of biomedical engineering will be presented. Finally, after giving reference to a few other applications that have made use of the system design space approach to reveal important design principles, some concluding remarks concerning challenges and opportunities for further development will be made.
NASA Astrophysics Data System (ADS)
Perdigão, R. A. P.
2017-12-01
Predictability assessments are traditionally made on a case-by-case basis, often by running the particular model of interest with randomly perturbed initial/boundary conditions and parameters, producing computationally expensive ensembles. These approaches provide a lumped statistical view of uncertainty evolution, without eliciting the fundamental processes and interactions at play in the uncertainty dynamics. In order to address these limitations, we introduce a systematic dynamical framework for predictability assessment and forecast, by analytically deriving governing equations of predictability in terms of the fundamental architecture of dynamical systems, independent of any particular problem under consideration. The framework further relates multiple uncertainty sources along with their coevolutionary interplay, enabling a comprehensive and explicit treatment of uncertainty dynamics along time, without requiring the actual model to be run. In doing so, computational resources are freed and a quick and effective a-priori systematic dynamic evaluation is made of predictability evolution and its challenges, including aspects in the model architecture and intervening variables that may require optimization ahead of initiating any model runs. It further brings out universal dynamic features in the error dynamics elusive to any case specific treatment, ultimately shedding fundamental light on the challenging issue of predictability. The formulated approach, framed with broad mathematical physics generality in mind, is then implemented in dynamic models of nonlinear geophysical systems with various degrees of complexity, in order to evaluate their limitations and provide informed assistance on how to optimize their design and improve their predictability in fundamental dynamical terms.
Smith, David K
2018-05-08
This feature article provides a personal insight into the research from my group over the past 10 years. In particular, the article explains how, inspired in 2005 by meeting my now-husband, Sam, who had cystic fibrosis, and who in 2011 went on to have a double lung transplant, I took an active decision to follow a more applied approach to some of our research, attempting to use fundamental supramolecular chemistry to address problems of medical interest. In particular, our strategy uses self-assembly to fabricate biologically-active nanosystems from simple low-molecular-weight building blocks. These systems can bind biological polyanions in highly competitive conditions, allowing us to approach applications in gene delivery and coagulation control. In the process, however, we have also developed new fundamental principles such as self-assembled multivalency (SAMul), temporary 'on-off' multivalency, and adaptive/shape-persistent multivalent binding. By targeting materials with applications in drug formulation and tissue engineering, we have discovered novel self-assembling low-molecular-weight hydrogelators based on the industrially-relevant dibenzylidenesorbitol framework and developed innovative approaches to spatially-resolved gels and functional multicomponent hybrid hydrogels. In this way, taking an application-led approach to research has also delivered significant academic value and conceptual advances. Furthermore, beginning to translate fundamental supramolecular chemistry into real-world applications, starts to demonstrate the power of this approach, and its potential to transform the world around us for the better.
Optimal Electrical Energy Slewing for Reaction Wheel Spacecraft
NASA Astrophysics Data System (ADS)
Marsh, Harleigh Christian
The results contained in this dissertation contribute to a deeper level of understanding to the energy required to slew a spacecraft using reaction wheels. This work addresses the fundamental manner in which spacecrafts are slewed (eigenaxis maneuvering), and demonstrates that this conventional maneuver can be dramatically improved upon in regards to reduction of energy, dissipative losses, as well as peak power. Energy is a fundamental resource that effects every asset, system, and subsystem upon a spacecraft, from the attitude control system which orients the spacecraft, to the communication subsystem to link with ground stations, to the payloads which collect scientific data. For a reaction wheel spacecraft, the attitude control system is a particularly heavy load on the power and energy resources on a spacecraft. The central focus of this dissertation is reducing the burden which the attitude control system places upon the spacecraft in regards to electrical energy, which is shown in this dissertation to be a challenging problem to computationally solve and analyze. Reducing power and energy demands can have a multitude of benefits, spanning from the initial design phase, to in-flight operations, to potentially extending the mission life of the spacecraft. This goal is approached from a practical standpoint apropos to an industry-flight setting. Metrics to measure electrical energy and power are developed which are in-line with the cost associated to operating reaction wheel based attitude control systems. These metrics are incorporated into multiple families of practical high-dimensional constrained nonlinear optimal control problems to reduce the electrical energy, as well as the instantaneous power burdens imposed by the attitude control system upon the spacecraft. Minimizing electrical energy is shown to be a problem in L1 optimal control which is nonsmooth in regards to state variables as well as the control. To overcome the challenge of nonsmoothness, a method is adopted in this dissertation to transform the nonsmooth minimum electrical energy problem into an equivalent smooth formulation, which then allows standard techniques in optimal control to solve and analyze the problem. Through numerically solving families of optimal control problems, the relationship between electrical energy and transfer time is identified and explored for both off-and on-eigenaxis maneuvering, under minimum dissipative losses as well as under minimum electrical energy. A trade space between on-and off-eigenaxis maneuvering is identified, from which is shown that agile near time optimal maneuvers exist within the energy budget associated with conventional eigenaxis maneuvering. Moreover, even for conventional eigenaxis maneuvering, energy requirements can be dramatically reduced by maneuvering off-eigenaxis. These results address one of the fundamental assumptions in the field of optimal path design verses conventional maneuver design. Two practical flight situations are addressed in this dissertation in regards to reducing energy and power: The case when the attitude of the spacecraft is predetermined, and the case where reaction wheels can not be directly controlled. For the setting where the attitude of spacecraft is on a predefined trajectory, it is demonstrated that reduced energy maneuvers are only attainable though the application of null-motions, which requires control of the reaction wheels. A computationally light formulation is developed minimizing the dissipative losses through the application of null motions. In the situation where the reaction wheels can not be directly controlled, it is demonstrated that energy consumption, dissipative losses, and peak-power loads, of the reaction-wheel array can each be reduced substantially by controlling the input to the attitude control system through attitude steering. It is demonstrated that the open loop trajectories correctly predict the closed loop response when tracked by an attitude control system which does not allow direct command of the reaction wheels.
Probing Mechanism of Evolution of Simple Genomes
NASA Technical Reports Server (NTRS)
Pohorille, Andrew; Ditzler, Mark; Popovic, Milena; Wei, Chenyu
2016-01-01
Our overarching goal is to discover how the structure of the genotypic space of RNA polymers affects their ability to evolve. Specifically, we will address several fundamental questions that, so far, have remained largely unanswered. Was the genotypic space explored globally or only locally? Was the outcome of early evolution predictable or was it, instead, govern by chance? What was the role of neutral mutations in the evolution of increasing complex systems? As the first step, we study the problem in the example of RNA ligases. We obtain the complete, empirical fitness landscapes for short ligases and examine possible evolutionary paths for RNA molecules that are sufficiently long to preclude exhaustive search of the genotypic space.
Human Injury From Atomic Particles and Photon Exposure: Fears, Myths, Risks, and Mortality
2010-01-01
Energy absorbtion from particles and photons moving at relativistic speeds has been a fundamental part of life on earth and wherever else life might exist. Heat and visible light have deeply influenced the course of human evolution, affecting habitat and nutrition. The photons of ionizing radiation that over time can possibly affect evolution, contribute to the more immediate problem of morbidity and mortality of cancer. This review addresses our radiative energy absorbtion, from both natural and manmade sources, and its relationship with disease and death. Educational Public Health efforts to offset the dangers of solar ultraviolet overexposure are presented, together with data on the significant mortality of metastatic melanoma. PMID:20481234
Workshop discusses community models for coastal sediment transport
NASA Astrophysics Data System (ADS)
Sherwood, Christopher R.; Signell, Richard P.; Harris, Courtney K.; Butman, Bradford
Numerical models of coastal sediment transport are increasingly used to address problems ranging from remediation of contaminated sediments, to siting of sewage outfalls and disposal sites, to evaluating impacts of coastal development. They are also used as a test bed for sediment-transport algorithms, to provide realistic settings for biological and geochemical models, and for a variety of other research, both fundamental and applied. However, there are few full-featured, publicly available coastal sediment-transport models, and fewer still that are well tested and have been widely applied.This was the motivation for a workshop in Woods Hole, Massachusetts, on June 22-23, 2000, that explored the establishment of community models for coastal sediment-transport processes.
NASA Technical Reports Server (NTRS)
Venable, D. D.
1983-01-01
A semi-analytic Monte Carlo simulation methodology (SALMON) was discussed. This simulation technique is particularly well suited for addressing fundamental radiative transfer problems in oceanographic LIDAR (optical radar), and also provides a framework for investigating the effects of environmental factors on LIDAR system performance. The simulation model was extended for airborne laser fluorosensors to allow for inhomogeneities in the vertical distribution of constituents in clear sea water. Results of the simulations for linearly varying step concentrations of chlorophyll are presented. The SALMON technique was also employed to determine how the LIDAR signals from an inhomogeneous media differ from those from homogeneous media.
NASA Astrophysics Data System (ADS)
Thompson, S. E.; Sivapalan, M.; Harman, C. J.; Srinivasan, V.; Hipsey, M. R.; Reed, P.; Montanari, A.; Blöschl, G.
2013-12-01
Globally, many different kinds of water resources management issues call for policy- and infrastructure-based responses. Yet responsible decision-making about water resources management raises a fundamental challenge for hydrologists: making predictions about water resources on decadal- to century-long timescales. Obtaining insight into hydrologic futures over 100 yr timescales forces researchers to address internal and exogenous changes in the properties of hydrologic systems. To do this, new hydrologic research must identify, describe and model feedbacks between water and other changing, coupled environmental subsystems. These models must be constrained to yield useful insights, despite the many likely sources of uncertainty in their predictions. Chief among these uncertainties are the impacts of the increasing role of human intervention in the global water cycle - a defining challenge for hydrology in the Anthropocene. Here we present a research agenda that proposes a suite of strategies to address these challenges from the perspectives of hydrologic science research. The research agenda focuses on the development of co-evolutionary hydrologic modeling to explore coupling across systems, and to address the implications of this coupling on the long-time behavior of the coupled systems. Three research directions support the development of these models: hydrologic reconstruction, comparative hydrology and model-data learning. These strategies focus on understanding hydrologic processes and feedbacks over long timescales, across many locations, and through strategic coupling of observational and model data in specific systems. We highlight the value of use-inspired and team-based science that is motivated by real-world hydrologic problems but targets improvements in fundamental understanding to support decision-making and management. Fully realizing the potential of this approach will ultimately require detailed integration of social science and physical science understanding of water systems, and is a priority for the developing field of sociohydrology.
The next generation in optical transport semiconductors: IC solutions at the system level
NASA Astrophysics Data System (ADS)
Gomatam, Badri N.
2005-02-01
In this tutorial overview, we survey some of the challenging problems facing Optical Transport and their solutions using new semiconductor-based technologies. Advances in 0.13um CMOS, SiGe/HBT and InP/HBT IC process technologies and mixed-signal design strategies are the fundamental breakthroughs that have made these solutions possible. In combination with innovative packaging and transponder/transceiver architectures IC approaches have clearly demonstrated enhanced optical link budgets with simultaneously lower (perhaps the lowest to date) cost and manufacturability tradeoffs. This paper will describe: *Electronic Dispersion Compensation broadly viewed as the overcoming of dispersion based limits to OC-192 links and extending link budgets, *Error Control/Coding also known as Forward Error Correction (FEC), *Adaptive Receivers for signal quality monitoring for real-time estimation of Q/OSNR, eye-pattern, signal BER and related temporal statistics (such as jitter). We will discuss the theoretical underpinnings of these receiver and transmitter architectures, provide examples of system performance and conclude with general market trends. These Physical layer IC solutions represent a fundamental new toolbox of options for equipment designers in addressing systems level problems. With unmatched cost and yield/performance tradeoffs, it is expected that IC approaches will provide significant flexibility in turn, for carriers and service providers who must ultimately manage the network and assure acceptable quality of service under stringent cost constraints.
Are you sitting comfortably? The political economy of the body.
Wilkin, Peter
2009-01-01
The aim of this paper is to examine the relationship between the mass production of furniture in modern industrial societies and lower back pain (LBP). The latter has proven to be a major cost to health services and private industry throughout the industrialised world and now represents a global health issue as recent WHO reports on obesity and LBP reveal. Thus far there have been few co-ordinated attempts to deal with the causes of the problem through public policy. Drawing upon a range of sources in anthropology, health studies, politics and economics, the paper argues that this a modern social problem rooted in the contingent conjuncture of natural and social causal mechanisms. The key question it raises is: what are the appropriate mechanisms for addressing this problem? This paper develops an analysis rooted in libertarian social theory and argues that both the state and the capitalist market are flawed mechanisms for resolving this problem. There remains a fundamental dilemma for libertarians, however. Whilst the state and the market may well be flawed mechanisms, they are the dominant ones shaping global political economy. To what extent can libertarians work within these structures and remain committed to libertarian goals?
Critical materialism: science, technology, and environmental sustainability.
York, Richard; Clark, Brett
2010-01-01
There are widely divergent views on how science and technology are connected to environmental problems. A view commonly held among natural scientists and policy makers is that environmental problems are primarily technical problems that can be solved via the development and implementation of technological innovations. This technologically optimistic view tends to ignore power relationships in society and the political-economic order that drives environmental degradation. An opposed view, common among postmodernist and poststructuralist scholars, is that the emergence of the scientific worldview is one of the fundamental causes of human oppression. This postmodernist view rejects scientific epistemology and often is associated with an anti-realist stance, which ultimately serves to deny the reality of environmental problems, thus (unintentionally) abetting right-wing efforts to scuttle environmental protection. We argue that both the technologically optimistic and the postmodernist views are misguided, and both undermine our ability to address environmental crises. We advocate the adoption of a critical materialist stance, which recognizes the importance of natural science for helping us to understand the world while also recognizing the social embeddedness of the scientific establishment and the need to challenge the manipulation of science by the elite.
Pan, Xiaochuan; Sidky, Emil Y; Vannier, Michael
2010-01-01
Despite major advances in x-ray sources, detector arrays, gantry mechanical design and especially computer performance, one component of computed tomography (CT) scanners has remained virtually constant for the past 25 years—the reconstruction algorithm. Fundamental advances have been made in the solution of inverse problems, especially tomographic reconstruction, but these works have not been translated into clinical and related practice. The reasons are not obvious and seldom discussed. This review seeks to examine the reasons for this discrepancy and provides recommendations on how it can be resolved. We take the example of field of compressive sensing (CS), summarizing this new area of research from the eyes of practical medical physicists and explaining the disconnection between theoretical and application-oriented research. Using a few issues specific to CT, which engineers have addressed in very specific ways, we try to distill the mathematical problem underlying each of these issues with the hope of demonstrating that there are interesting mathematical problems of general importance that can result from in depth analysis of specific issues. We then sketch some unconventional CT-imaging designs that have the potential to impact on CT applications, if the link between applied mathematicians and engineers/physicists were stronger. Finally, we close with some observations on how the link could be strengthened. There is, we believe, an important opportunity to rapidly improve the performance of CT and related tomographic imaging techniques by addressing these issues. PMID:20376330
NASA Astrophysics Data System (ADS)
Pan, Xiaochuan; Sidky, Emil Y.; Vannier, Michael
2009-12-01
Despite major advances in x-ray sources, detector arrays, gantry mechanical design and especially computer performance, one component of computed tomography (CT) scanners has remained virtually constant for the past 25 years—the reconstruction algorithm. Fundamental advances have been made in the solution of inverse problems, especially tomographic reconstruction, but these works have not been translated into clinical and related practice. The reasons are not obvious and seldom discussed. This review seeks to examine the reasons for this discrepancy and provides recommendations on how it can be resolved. We take the example of field of compressive sensing (CS), summarizing this new area of research from the eyes of practical medical physicists and explaining the disconnection between theoretical and application-oriented research. Using a few issues specific to CT, which engineers have addressed in very specific ways, we try to distill the mathematical problem underlying each of these issues with the hope of demonstrating that there are interesting mathematical problems of general importance that can result from in depth analysis of specific issues. We then sketch some unconventional CT-imaging designs that have the potential to impact on CT applications, if the link between applied mathematicians and engineers/physicists were stronger. Finally, we close with some observations on how the link could be strengthened. There is, we believe, an important opportunity to rapidly improve the performance of CT and related tomographic imaging techniques by addressing these issues.
Obesity and Associated Comorbidities in People and Companion Animals: A One Health Perspective.
Chandler, M; Cunningham, S; Lund, E M; Khanna, C; Naramore, R; Patel, A; Day, M J
2017-05-01
This article reviews the biology, prevalence and risks for obesity in people and companion dogs and cats, and explores the links between obesity and diabetes mellitus and cancer across these species. Obesity is a major healthcare problem in both human and veterinary medicine and there is an increasing prevalence of obesity in people and pets. In people and animals, obesity is a complex disorder involving diet, level of physical activity, behavioural factors, socioeconomic factors, environment exposures, genetics, metabolism and the microbiome. Pets and people share a number of obesity-related comorbidities. Obesity is a major risk factor for type 2 diabetes mellitus in people and in cats, but this association is not recognized in dogs. Obesity is a recognized risk factor for a number of human cancers, but there are fewer data available describing this association with canine neoplastic disease. One approach to addressing the problem of obesity is by taking a 'One Health' perspective. Comparative clinical research examining shared lifestyle and environmental risk factors and the reasons underlying species differences should provide new perspectives on the fundamental biology of obesity. One Health programmes involving human healthcare professionals and veterinarians could help address obesity with simple interventions at the community level. Copyright © 2017 The Author(s). Published by Elsevier Ltd.. All rights reserved.
Braun, M Miles
2013-10-01
Study of complementary and alternative medicine's mind and body interventions (CAM-MABI) is hindered not only by the inability to mask participants and their teachers to the study intervention but also by the major practical hurdles of long-term study of practices that can be lifelong. Two other important methodological problems are that study of newly trained practitioners cannot directly address long-term practice, and that long-term practitioners likely self-select in ways that make finding appropriate controls (or a comparison group) challenging. The temporary practice pause then resumption study design (TPPR) introduced here is a new tool that extends the withdrawal study design, established in the field of drug evaluation, to the field of CAM-MABI. With the exception of the inability to mask, TPPR can address the other methodological problems noted above. Of great interest to investigators will likely be measures in practitioners of CAM-MABI that change with temporary pausing of CAM-MABI practice, followed by return of the measures to pre-pause levels with resumption of practice; this would suggest a link of the practice to measured changes. Such findings using this tool may enhance our insight into fundamental biological processes, leading to beneficial practical applications.
Stereo using monocular cues within the tensor voting framework.
Mordohai, Philippos; Medioni, Gérard
2006-06-01
We address the fundamental problem of matching in two static images. The remaining challenges are related to occlusion and lack of texture. Our approach addresses these difficulties within a perceptual organization framework, considering both binocular and monocular cues. Initially, matching candidates for all pixels are generated by a combination of matching techniques. The matching candidates are then embedded in disparity space, where perceptual organization takes place in 3D neighborhoods and, thus, does not suffer from problems associated with scanline or image neighborhoods. The assumption is that correct matches produce salient, coherent surfaces, while wrong ones do not. Matching candidates that are consistent with the surfaces are kept and grouped into smooth layers. Thus, we achieve surface segmentation based on geometric and not photometric properties. Surface overextensions, which are due to occlusion, can be corrected by removing matches whose projections are not consistent in color with their neighbors of the same surface in both images. Finally, the projections of the refined surfaces on both images are used to obtain disparity hypotheses for unmatched pixels. The final disparities are selected after a second tensor voting stage, during which information is propagated from more reliable pixels to less reliable ones. We present results on widely used benchmark stereo pairs.
Interdisciplinary Matchmaking: Choosing Collaborators by Skill, Acquaintance and Trust
NASA Astrophysics Data System (ADS)
Hupa, Albert; Rzadca, Krzysztof; Wierzbicki, Adam; Datta, Anwitaman
Social networks are commonly used to enhance recommender systems. Most of such systems recommend a single resource or a person. However, complex problems or projects usually require a team of experts that must work together on a solution. Team recommendation is much more challenging, mostly because of the complex interpersonal relations between members. This chapter presents fundamental concepts on how to score a team based on members' social context and their suitability for a particular project. We represent the social context of an individual as a three-dimensional social network (3DSN) composed of a knowledge dimension expressing skills, a trust dimension and an acquaintance dimension. Dimensions of a 3DSN are used to mathematically formalize the criteria for prediction of the team's performance. We use these criteria to formulate the team recommendation problem as a multi-criteria optimization problem. We demonstrate our approach on empirical data crawled from two web2.0 sites:
The Terra Data Fusion Project: An Update
NASA Astrophysics Data System (ADS)
Di Girolamo, L.; Bansal, S.; Butler, M.; Fu, D.; Gao, Y.; Lee, H. J.; Liu, Y.; Lo, Y. L.; Raila, D.; Turner, K.; Towns, J.; Wang, S. W.; Yang, K.; Zhao, G.
2017-12-01
Terra is the flagship of NASA's Earth Observing System. Launched in 1999, Terra's five instruments continue to gather data that enable scientists to address fundamental Earth science questions. By design, the strength of the Terra mission has always been rooted in its five instruments and the ability to fuse the instrument data together for obtaining greater quality of information for Earth Science compared to individual instruments alone. As the data volume grows and the central Earth Science questions move towards problems requiring decadal-scale data records, the need for data fusion and the ability for scientists to perform large-scale analytics with long records have never been greater. The challenge is particularly acute for Terra, given its growing volume of data (> 1 petabyte), the storage of different instrument data at different archive centers, the different file formats and projection systems employed for different instrument data, and the inadequate cyberinfrastructure for scientists to access and process whole-mission fusion data (including Level 1 data). Sharing newly derived Terra products with the rest of the world also poses challenges. As such, the Terra Data Fusion Project aims to resolve two long-standing problems: 1) How do we efficiently generate and deliver Terra data fusion products? 2) How do we facilitate the use of Terra data fusion products by the community in generating new products and knowledge through national computing facilities, and disseminate these new products and knowledge through national data sharing services? Here, we will provide an update on significant progress made in addressing these problems by working with NASA and leveraging national facilities managed by the National Center for Supercomputing Applications (NCSA). The problems that we faced in deriving and delivering Terra L1B2 basic, reprojected and cloud-element fusion products, such as data transfer, data fusion, processing on different computer architectures, science, and sharing, will be presented with quantitative specifics. Results from several science-specific drivers for Terra fusion products will also be presented. We demonstrate that the Terra Data Fusion Project itself provides an excellent use-case for the community addressing Big Data and cyberinfrastructure problems.
Theory of wide-angle photometry from standard stars
NASA Technical Reports Server (NTRS)
Usher, Peter D.
1989-01-01
Wide angle celestial structures, such as bright comet tails and nearby galaxies and clusters of galaxies, rely on photographic methods for quantified morphology and photometry, primarily because electronic devices with comparable resolution and sky coverage are beyond current technological capability. The problem of the photometry of extended structures and of how this problem may be overcome through calibration by photometric standard stars is examined. The perfect properties of the ideal field of view are stated in the guise of a radiometric paraxial approximation, in the hope that fields of view of actual telescopes will conform. Fundamental radiometric concepts are worked through before the issue of atmospheric attenuation is addressed. The independence of observed atmospheric extinction and surface brightness leads off the quest for formal solutions to the problem of surface photometry. Methods and problems of solution are discussed. The spectre is confronted in the spirit of standard stars and shown to be chimerical in that light, provided certain rituals are adopted. After a brief discussion of Baker-Sampson polynomials and the vexing issue of saturation, a pursuit is made of actual numbers to be expected in real cases. While the numbers crunched are gathered ex nihilo, they demonstrate the feasibility of Newton's method in the solution of this overdetermined, nonlinear, least square, multiparametric, photometric problem.
Optimal dynamic remapping of parallel computations
NASA Technical Reports Server (NTRS)
Nicol, David M.; Reynolds, Paul F., Jr.
1987-01-01
A large class of computations are characterized by a sequence of phases, with phase changes occurring unpredictably. The decision problem was considered regarding the remapping of workload to processors in a parallel computation when the utility of remapping and the future behavior of the workload is uncertain, and phases exhibit stable execution requirements during a given phase, but requirements may change radically between phases. For these problems a workload assignment generated for one phase may hinder performance during the next phase. This problem is treated formally for a probabilistic model of computation with at most two phases. The fundamental problem of balancing the expected remapping performance gain against the delay cost was addressed. Stochastic dynamic programming is used to show that the remapping decision policy minimizing the expected running time of the computation has an extremely simple structure. Because the gain may not be predictable, the performance of a heuristic policy that does not require estimnation of the gain is examined. The heuristic method's feasibility is demonstrated by its use on an adaptive fluid dynamics code on a multiprocessor. The results suggest that except in extreme cases, the remapping decision problem is essentially that of dynamically determining whether gain can be achieved by remapping after a phase change. The results also suggest that this heuristic is applicable to computations with more than two phases.
Pharmacovigilance in Crisis: Drug Safety at a Crossroads.
Price, John
2018-05-01
Pharmacovigilance (PV) is under unprecedented stress from fundamental changes in a booming pharmaceutical industry, from the challenges of creating and maintaining an increasingly complex PV system in a globally diverse regulatory environment, and from unpredicted consequences of historical PV cost-reduction strategies. At the same time, talent availability lags demand, and many PV professionals may no longer be finding personal fulfillment in their careers. The situation creates risks for companies. Advantages and disadvantages of potential strategies to address this increasing problem at a corporate and industry level and in collaboration with regulatory agencies are discussed, as well as opportunities to adopt new technologies, including artificial intelligence and machine-learning to automate pharmacovigilance operations. These approaches would address burdensome and wasteful effort assuring regulatory compliance and free up resources to support the original mission of PV as an important public health activity and to reinvest in the development of new drugs. Copyright © 2018 Elsevier HS Journals, Inc. All rights reserved.
Instilling new habits: addressing implicit bias in healthcare professionals.
Byrne, Aidan; Tanesini, Alessandra
2015-12-01
There appears to be a fundamental inconsistency between research which shows that some minority groups consistently receive lower quality healthcare and the literature indicating that healthcare workers appear to hold equality as a core personal value. Recent evidence using Implicit Association Tests suggests that these disparities in outcome may in part be due to social biases that are primarily unconscious. In some individuals the activation of these biases may be also facilitated by the high levels of cognitive load associated with clinical practice. However, a range of measures, such as counter-stereotypical stimuli and targeted experience with minority groups, have been identified as possible solutions in other fields and may be adapted for use within healthcare settings. We suggest that social bias should not be seen exclusively as a problem of conscious attitudes which need to be addressed through increased awareness. Instead the delivery of bias free healthcare should become a habit, developed through a continuous process of practice, feedback and reflection.
(How) Can We Write about Our Patients?
Ackerman, Sarah
2018-02-01
The ethical underpinnings of writing about patients are explored, the question of how best to undertake the writing of case reports being subordinated to a more general question about the ethics of choosing how or whether to write. An unsolvable paradox is encountered here: that we need to write or speak about our clinical work in order to conceptualize and understand the work we are doing, but that in the very gesture of doing so, we are breaking a fundamental bond with the patient. This conundrum is viewed from a number of vantage points. The controversy about how best to go about writing clinical accounts is first addressed, after which the literature is reviewed to draw out the ethical conflicts that writing about patients engenders in the patient. Next attention is given to undercurrents in the analyst's motivation to write, again drawing on current literature. Finally, a consideration is provided of how, based on what we might learn from this review, these problems can be addressed.
Fracture healing: A review of clinical, imaging and laboratory diagnostic options.
Cunningham, Brian P; Brazina, Sloane; Morshed, Saam; Miclau, Theodore
2017-06-01
A fundamental issue in clinical orthopaedics is the determination of when a fracture is united. However, there are no established "gold standards," nor standardized methods for assessing union, which has resulted in significant disagreement among orthopaedic surgeons in both clinical practice and research. A great deal of investigative work has been directed to addressing this problem, with a number of exciting new techniques described. This review provides a brief summary of the burden of nonunion fractures and addresses some of the challenges related to the assessment of fracture healing. The tools currently available to determine union are discussed, including various imaging modalities, biomechanical testing methods, and laboratory and clinical assessments. The evaluation of fracture healing in the setting of both patient care and clinical research is integral to the orthopaedic practice. Weighted integration of several available metrics must be considered to create a composite outcome measure of patient prognosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
GOBLET: The Global Organisation for Bioinformatics Learning, Education and Training
Atwood, Teresa K.; Bongcam-Rudloff, Erik; Brazas, Michelle E.; Corpas, Manuel; Gaudet, Pascale; Lewitter, Fran; Mulder, Nicola; Palagi, Patricia M.; Schneider, Maria Victoria; van Gelder, Celia W. G.
2015-01-01
In recent years, high-throughput technologies have brought big data to the life sciences. The march of progress has been rapid, leaving in its wake a demand for courses in data analysis, data stewardship, computing fundamentals, etc., a need that universities have not yet been able to satisfy—paradoxically, many are actually closing “niche” bioinformatics courses at a time of critical need. The impact of this is being felt across continents, as many students and early-stage researchers are being left without appropriate skills to manage, analyse, and interpret their data with confidence. This situation has galvanised a group of scientists to address the problems on an international scale. For the first time, bioinformatics educators and trainers across the globe have come together to address common needs, rising above institutional and international boundaries to cooperate in sharing bioinformatics training expertise, experience, and resources, aiming to put ad hoc training practices on a more professional footing for the benefit of all. PMID:25856076
Jones, Susan M
2017-04-01
Australian lizards exhibit a broad array of different reproductive strategies and provide an extraordinary diversity and range of models with which to address fundamental problems in reproductive biology. Studies on lizards have frequently led to new insights into hormonal regulatory pathways or mechanisms of control, but we have detailed knowledge of the reproductive cycle in only a small percentage of known species. This review provides an overview and synthesis of current knowledge of the hormonal control of reproductive cycles in Australian lizards. Agamid lizards have provided useful models with which to test hypotheses about the hormonal regulation of the expression of reproductive behaviors, while research on viviparous skinks is providing insights into the evolution of the endocrine control of gestation. However, in order to better understand the potential risks that environmental factors such as climate change and endocrine disrupting chemicals pose to our fauna, better knowledge is required of the fundamental characteristics of the reproductive cycle in a broader range of lizard species. Copyright © 2015 Elsevier Inc. All rights reserved.
Amyloid and membrane complexity: The toxic interplay revealed by AFM.
Canale, Claudio; Oropesa-Nuñez, Reinier; Diaspro, Alberto; Dante, Silvia
2018-01-01
Lipid membranes play a fundamental role in the pathological development of protein misfolding diseases. Several pieces of evidence suggest that the lipid membrane could act as a catalytic surface for protein aggregation. Furthermore, a leading theory indicates the interaction between the cell membrane and misfolded oligomer species as the responsible for cytotoxicity, hence, for neurodegeneration in disorders such as Alzheimer's and Parkinson's disease. The definition of the mechanisms that drive the interaction between pathological protein aggregates and plasma membrane is fundamental for the development of effective therapies for a large class of diseases. Atomic force microscopy (AFM) has been employed to study how amyloid aggregates affect the cell physiological properties. Considerable efforts were spent to characterize the interaction with model systems, i.e., planar supported lipid bilayers, but some works also addressed the problem directly on living cells. Here, an overview of the main works involving the use of the AFM on both model system and living cells will be provided. Different kind of approaches will be presented, as well as the main results derived from the AFM analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
Evolutionary principles and their practical application
Hendry, Andrew P; Kinnison, Michael T; Heino, Mikko; Day, Troy; Smith, Thomas B; Fitt, Gary; Bergstrom, Carl T; Oakeshott, John; Jørgensen, Peter S; Zalucki, Myron P; Gilchrist, George; Southerton, Simon; Sih, Andrew; Strauss, Sharon; Denison, Robert F; Carroll, Scott P
2011-01-01
Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently in these different fields, even though the underlying fundamental concepts are the same. We explore these fundamental concepts under four main themes: variation, selection, connectivity, and eco-evolutionary dynamics. Within each theme, we present several key evolutionary principles and illustrate their use in addressing applied problems. We hope that the resulting primer of evolutionary concepts and their practical utility helps to advance a unified multidisciplinary field of applied evolutionary biology. PMID:25567966
Evolutionary principles and their practical application.
Hendry, Andrew P; Kinnison, Michael T; Heino, Mikko; Day, Troy; Smith, Thomas B; Fitt, Gary; Bergstrom, Carl T; Oakeshott, John; Jørgensen, Peter S; Zalucki, Myron P; Gilchrist, George; Southerton, Simon; Sih, Andrew; Strauss, Sharon; Denison, Robert F; Carroll, Scott P
2011-03-01
Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently in these different fields, even though the underlying fundamental concepts are the same. We explore these fundamental concepts under four main themes: variation, selection, connectivity, and eco-evolutionary dynamics. Within each theme, we present several key evolutionary principles and illustrate their use in addressing applied problems. We hope that the resulting primer of evolutionary concepts and their practical utility helps to advance a unified multidisciplinary field of applied evolutionary biology.
2005-01-01
Students are most motivated and learn best when they are immersed in an environment that causes them to realize why they should learn. Perhaps nowhere is this truer than when teaching the biological sciences to engineers. Transitioning from a traditionally mathematics-based to a traditionally knowledge-based pedagogical style can challenge student learning and engagement. To address this, human pathologies were used as a problem-based context for teaching knowledge-based cell biological mechanisms. Lectures were divided into four modules. First, a disease was presented from clinical, economic, and etiological standpoints. Second, fundamental concepts of cell and molecular biology were taught that were directly relevant to that disease. Finally, we discussed the cellular and molecular basis of the disease based on these fundamental concepts, together with current clinical approaches to the disease. The basic science is thus presented within a “shrink wrap” of disease application. Evaluation of this contextual technique suggests that it is very useful in improving undergraduate student focus and motivation, and offers many advantages to the instructor as well. PMID:15917872
A pedestrian approach to the measurement problem in quantum mechanics
NASA Astrophysics Data System (ADS)
Boughn, Stephen; Reginatto, Marcel
2013-09-01
The quantum theory of measurement has been a matter of debate for over eighty years. Most of the discussion has focused on theoretical issues with the consequence that other aspects (such as the operational prescriptions that are an integral part of experimental physics) have been largely ignored. This has undoubtedly exacerbated attempts to find a solution to the "measurement problem". How the measurement problem is defined depends to some extent on how the theoretical concepts introduced by the theory are interpreted. In this paper, we fully embrace the minimalist statistical (ensemble) interpretation of quantum mechanics espoused by Einstein, Ballentine, and others. According to this interpretation, the quantum state description applies only to a statistical ensemble of similarly prepared systems rather than representing an individual system. Thus, the statistical interpretation obviates the need to entertain reduction of the state vector, one of the primary dilemmas of the measurement problem. The other major aspect of the measurement problem, the necessity of describing measurements in terms of classical concepts that lay outside of quantum theory, remains. A consistent formalism for interacting quantum and classical systems, like the one based on ensembles on configuration space that we refer to in this paper, might seem to eliminate this facet of the measurement problem; however, we argue that the ultimate interface with experiments is described by operational prescriptions and not in terms of the concepts of classical theory. There is no doubt that attempts to address the measurement problem have yielded important advances in fundamental physics; however, it is also very clear that the measurement problem is still far from being resolved. The pedestrian approach presented here suggests that this state of affairs is in part the result of searching for a theoretical/mathematical solution to what is fundamentally an experimental/observational question. It suggests also that the measurement problem is, in some sense, ill-posed and might never be resolved. This point of view is tenable so long as one is willing to view physical theories as providing models of nature rather than complete descriptions of reality. Among other things, these considerations lead us to suggest that the Copenhagen interpretation's insistence on the classicality of the measurement apparatus should be replaced by the requirement that a measurement, which is specified operationally, should simply be of sufficient precision.
Fundamental problems in provable security and cryptography.
Dent, Alexander W
2006-12-15
This paper examines methods for formally proving the security of cryptographic schemes. We show that, despite many years of active research and dozens of significant results, there are fundamental problems which have yet to be solved. We also present a new approach to one of the more controversial aspects of provable security, the random oracle model.
Direct Determination of Nonmetals in Solution with Atomic Spectrometry.
ERIC Educational Resources Information Center
McGregor, David A.; And Others
1988-01-01
Addresses solution nonmetal determinations on a fundamental level. Characterizes research in this area of chemical instrumentation. Discusses the fundamental limitations of nonmetal atomic spectrometry, the status of nonmetals and atomic spectroscopic techniques, and current directions in solution nonmetal determinations. (CW)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weisbin, C.R.; Hamel, W.R.; Barhen, J.
1986-02-01
The Oak Ridge National Laboratory has established the Center for Engineering Systems Advanced Research (CESAR) for the purpose of addressing fundamental problems of intelligent machine technologies. The purpose of this document is to establish a framework and guidelines for research and development within ORNL's CESAR program in areas pertaining to intelligent machines. The specific objective is to present a CESAR Research and Development Plan for such work with a planning horizon of five to ten years, i.e., FY 1985 to FY 1990 and beyond. As much as possible, the plan is based on anticipated DOE needs in the area ofmore » productivity increase and safety to the end of this century.« less
NASA Astrophysics Data System (ADS)
Dhamala, Mukeshwar; Lai, Ying-Cheng
1999-02-01
Transient chaos is a common phenomenon in nonlinear dynamics of many physical, biological, and engineering systems. In applications it is often desirable to maintain sustained chaos even in parameter regimes of transient chaos. We address how to sustain transient chaos in deterministic flows. We utilize a simple and practical method, based on extracting the fundamental dynamics from time series, to maintain chaos. The method can result in control of trajectories from almost all initial conditions in the original basin of the chaotic attractor from which transient chaos is created. We apply our method to three problems: (1) voltage collapse in electrical power systems, (2) species preservation in ecology, and (3) elimination of undesirable bursting behavior in a chemical reaction system.
100 years of sedimentation study by the USGS
Glysson, G. Douglas
1989-01-01
On January 15, 1889, the U.S. Geological Survey began collecting sediment data on the Rio Grande at Embudo, New Mexico. During the past 100 years the U.S. Geological Survey's Water Resources Division (WRD) has collected daily sediment data at more than 1,200 sites. Projects have addressed the problems associated with reservoir construction, agricultural irrigation projects, energy production, and transport and deposition of pollutants sorbed to sediments. The Survey has been active as a charter member of the Federal Interagency Sediment Project and currently has three full-time hydrologists working on the project. The WRD's sediment-research projects have covered a wide variety of subjects from the fundamental theories of resistance to flow and sediment transport in alluvial channels to lunar erosion mechanisms.
Breaking down barriers in cooperative fault management: Temporal and functional information displays
NASA Technical Reports Server (NTRS)
Potter, Scott S.; Woods, David D.
1994-01-01
At the highest level, the fundamental question addressed by this research is how to aid human operators engaged in dynamic fault management. In dynamic fault management there is some underlying dynamic process (an engineered or physiological process referred to as the monitored process - MP) whose state changes over time and whose behavior must be monitored and controlled. In these types of applications (dynamic, real-time systems), a vast array of sensor data is available to provide information on the state of the MP. Faults disturb the MP and diagnosis must be performed in parallel with responses to maintain process integrity and to correct the underlying problem. These situations frequently involve time pressure, multiple interacting goals, high consequences of failure, and multiple interleaved tasks.
Chaffee, Benjamin W.; Couch, Elizabeth T.; Ryder, Mark I.
2016-01-01
Although the prevalence of tobacco use has declined in some parts of the world, tobacco use remains a persistent and, in some cases, growing problem that will continue to be a fundamental challenge facing dental practitioners in the decades ahead. The dental practitioner has a unique opportunity and professional obligation to be a positive influence in reducing the economic and social burden inflicted by tobacco use on dental and general health. In this article, the current non-invasive, evidence-based approaches are presented for the dental practitioner to help patients avoid tobacco initiation, to encourage and assist patients’ in tobacco cessation, and to address tobacco-induced damage to periodontal supporting tissues. PMID:27045430
Assessment of the effectiveness of supply-side cost-containment measures
Garrison, Louis P.
1992-01-01
This article assesses the arguments and evidence concerning the likely effectiveness of four supply-side cost-containment measures. The health planning efforts of the 1970s, particularly certificate-of-need regulations, had very limited success in containing costs. The new and related tools of technology assessment and practice guidelines hold some promise for refining benefit packages, but they are inadequate for micromanaging complex medical practices. Payment policies, such as hospital ratesetting, have enjoyed some success in limiting hospital cost growth but are less effective at controlling total costs. None of these measures alone is likely to address fully the fundamental issues of equity and efficiency in health care resource allocation that underlie the problem of rising costs. PMID:25372721
Computational fluid dynamics - An introduction for engineers
NASA Astrophysics Data System (ADS)
Abbott, Michael Barry; Basco, David R.
An introduction to the fundamentals of CFD for engineers and physical scientists is presented. The principal definitions, basic ideas, and most common methods used in CFD are presented, and the application of these methods to the description of free surface, unsteady, and turbulent flow is shown. Emphasis is on the numerical treatment of incompressible unsteady fluid flow with primary applications to water problems using the finite difference method. While traditional areas of application like hydrology, hydraulic and coastal engineering and oceanography get the main emphasis, newer areas of application such as medical fluid dynamics, bioengineering, and soil physics and chemistry are also addressed. The possibilities and limitations of CFD are pointed out along with the relations of CFD to other branches of science.
Radio-Frequency Applications for Food Processing and Safety.
Jiao, Yang; Tang, Juming; Wang, Yifen; Koral, Tony L
2018-03-25
Radio-frequency (RF) heating, as a thermal-processing technology, has been extending its applications in the food industry. Although RF has shown some unique advantages over conventional methods in industrial drying and frozen food thawing, more research is needed to make it applicable for food safety applications because of its complex heating mechanism. This review provides comprehensive information regarding RF-heating history, mechanism, fundamentals, and applications that have already been fully developed or are still under research. The application of mathematical modeling as a useful tool in RF food processing is also reviewed in detail. At the end of the review, we summarize the active research groups in the RF food thermal-processing field, and address the current problems that still need to be overcome.
Control-structure interaction in precision pointing servo loops
NASA Technical Reports Server (NTRS)
Spanos, John T.
1989-01-01
The control-structure interaction problem is addressed via stability analysis of a generic linear servo loop model. With the plant described by the rigid body mode and a single elastic mode, structural flexibility is categorized into one of three types: (1) appendage, (2) in-the-loop minimum phase, and (3) in-the-loop nonminimum phase. Closing the loop with proportional-derivative (PD) control action and introducing sensor roll-off dynamics in the feedback path, stability conditions are obtained. Trade studies are conducted with modal frequency, modal participation, modal damping, loop bandwidth, and sensor bandwidth treated as free parameters. Results indicate that appendage modes are most likely to produce instability if they are near the sensor rolloff, whereas in-the-loop modes are most dangerous near the loop bandwidth. The main goal of this paper is to provide a fundamental understanding of the control-structure interaction problem so that it may benefit the design of complex spacecraft and pointing system servo loops. In this framework, the JPL Pathfinder gimbal pointer is considered as an example.
On Cognition, Structured Sequence Processing, and Adaptive Dynamical Systems
NASA Astrophysics Data System (ADS)
Petersson, Karl Magnus
2008-11-01
Cognitive neuroscience approaches the brain as a cognitive system: a system that functionally is conceptualized in terms of information processing. We outline some aspects of this concept and consider a physical system to be an information processing device when a subclass of its physical states can be viewed as representational/cognitive and transitions between these can be conceptualized as a process operating on these states by implementing operations on the corresponding representational structures. We identify a generic and fundamental problem in cognition: sequentially organized structured processing. Structured sequence processing provides the brain, in an essential sense, with its processing logic. In an approach addressing this problem, we illustrate how to integrate levels of analysis within a framework of adaptive dynamical systems. We note that the dynamical system framework lends itself to a description of asynchronous event-driven devices, which is likely to be important in cognition because the brain appears to be an asynchronous processing system. We use the human language faculty and natural language processing as a concrete example through out.
Cross-Identification of Astronomical Catalogs on Multiple GPUs
NASA Astrophysics Data System (ADS)
Lee, M. A.; Budavári, T.
2013-10-01
One of the most fundamental problems in observational astronomy is the cross-identification of sources. Observations are made in different wavelengths, at different times, and from different locations and instruments, resulting in a large set of independent observations. The scientific outcome is often limited by our ability to quickly perform meaningful associations between detections. The matching, however, is difficult scientifically, statistically, as well as computationally. The former two require detailed physical modeling and advanced probabilistic concepts; the latter is due to the large volumes of data and the problem's combinatorial nature. In order to tackle the computational challenge and to prepare for future surveys, whose measurements will be exponentially increasing in size past the scale of feasible CPU-based solutions, we developed a new implementation which addresses the issue by performing the associations on multiple Graphics Processing Units (GPUs). Our implementation utilizes up to 6 GPUs in combination with the Thrust library to achieve an over 40x speed up verses the previous best implementation running on a multi-CPU SQL Server.
Imprints from the global cosmological expansion to the local spacetime dynamics.
Fahr, Hans J; Siewert, Mark
2008-05-01
We study the general relativistic spacetime metrics surrounding massive cosmological objects, such as suns, stars, galaxies or galaxy clusters. The question addressed here is the transition of local, object-related spacetime metrics into the global, cosmological Robertson-Walker metrics. We demonstrate that the answer often quoted for this problem from the literature, the so-called Einstein-Straus vacuole, which connects a static outer Schwarzschild solution with the time-dependent Robertson-Walker universe, is inadequate to describe the local spacetime of a gravitationally bound system. Thus, we derive here an alternative model describing such bound systems by a metrics more closely tied to the fundamental problem of structure formation in the early universe and obtain a multitude of solutions characterising the time-dependence of a local scale parameter. As we can show, a specific solution out of this multitude is able to, as a by-product, surprisingly enough, explain the presently much discussed phenomenon of the PIONEER anomaly.
Perceptions of political leaders.
David Schmitz, J; Murray, Gregg R
2017-01-01
Partisan identification is a fundamental force in individual and mass political behavior around the world. Informed by scholarship on human sociality, coalitional psychology, and group behavior, this research argues that partisan identification, like many other group-based behaviors, is influenced by forces of evolution. If correct, then party identifiers should exhibit adaptive behaviors when making group-related political decisions. The authors test this assertion with citizen assessments of the relative physical formidability of competing leaders, an important adaptive factor in leader evaluations. Using original and novel data collected during the contextually different 2008 and 2012 U.S. presidential elections, as well as two distinct measures obtained during both elections, this article presents evidence that partisans overestimate the physical stature of the presidential candidate of their own party compared with the stature of the candidate of the opposition party. These findings suggest that the power of party identification on political behavior may be attributable to the fact that modern political parties address problems similar to the problems groups faced in human ancestral times.
Enhanced computer vision with Microsoft Kinect sensor: a review.
Han, Jungong; Shao, Ling; Xu, Dong; Shotton, Jamie
2013-10-01
With the invention of the low-cost Microsoft Kinect sensor, high-resolution depth and visual (RGB) sensing has become available for widespread use. The complementary nature of the depth and visual information provided by the Kinect sensor opens up new opportunities to solve fundamental problems in computer vision. This paper presents a comprehensive review of recent Kinect-based computer vision algorithms and applications. The reviewed approaches are classified according to the type of vision problems that can be addressed or enhanced by means of the Kinect sensor. The covered topics include preprocessing, object tracking and recognition, human activity analysis, hand gesture analysis, and indoor 3-D mapping. For each category of methods, we outline their main algorithmic contributions and summarize their advantages/differences compared to their RGB counterparts. Finally, we give an overview of the challenges in this field and future research trends. This paper is expected to serve as a tutorial and source of references for Kinect-based computer vision researchers.
Free boundary problems in shock reflection/diffraction and related transonic flow problems
Chen, Gui-Qiang; Feldman, Mikhail
2015-01-01
Shock waves are steep wavefronts that are fundamental in nature, especially in high-speed fluid flows. When a shock hits an obstacle, or a flying body meets a shock, shock reflection/diffraction phenomena occur. In this paper, we show how several long-standing shock reflection/diffraction problems can be formulated as free boundary problems, discuss some recent progress in developing mathematical ideas, approaches and techniques for solving these problems, and present some further open problems in this direction. In particular, these shock problems include von Neumann's problem for shock reflection–diffraction by two-dimensional wedges with concave corner, Lighthill's problem for shock diffraction by two-dimensional wedges with convex corner, and Prandtl-Meyer's problem for supersonic flow impinging onto solid wedges, which are also fundamental in the mathematical theory of multidimensional conservation laws. PMID:26261363
Moisture Content and Migration Dynamics in Unsaturated Porous Media
NASA Technical Reports Server (NTRS)
Homsy, G. M.
1993-01-01
Fundamental studies of fluid mechanics and transport in partially saturated soils are presented. Solution of transient diffusion problems in support of the development of probes for the in-situ measurement of moisture content is given. Numerical and analytical methods are used to study the fundamental problem of meniscus and saturation front propagation in geometric models of porous media.
The Use of Therapeutic Techniques in Actor Training.
ERIC Educational Resources Information Center
Gross, Roger
Since a fundamental problem of acting--fear--is a fundamental human problem, the basic job of acting teachers is to help their students become the kind of people who can act. Acting teachers need to help their students cast off their fears, free their bodies and their imaginations, and learn all the skills of self-knowledge, self-control, and…
Characterizing the Fundamental Intellectual Steps Required in the Solution of Conceptual Problems
NASA Astrophysics Data System (ADS)
Stewart, John
2010-02-01
At some level, the performance of a science class must depend on what is taught, the information content of the materials and assignments of the course. The introductory calculus-based electricity and magnetism class at the University of Arkansas is examined using a catalog of the basic reasoning steps involved in the solution of problems assigned in the class. This catalog was developed by sampling popular physics textbooks for conceptual problems. The solution to each conceptual problem was decomposed into its fundamental reasoning steps. These fundamental steps are, then, used to quantify the distribution of conceptual content within the course. Using this characterization technique, an exceptionally detailed picture of the information flow and structure of the class can be produced. The intellectual structure of published conceptual inventories is compared with the information presented in the class and the dependence of conceptual performance on the details of coverage extracted. )
2016-01-01
The kinetics of proteins at interfaces plays an important role in biological functions and inspires solutions to fundamental problems in biomedical sciences and engineering. Nonetheless, due to the lack of surface-specific and structural-sensitive biophysical techniques, it still remains challenging to probe protein kinetics in situ and in real time without the use of spectroscopic labels at interfaces. Broad-bandwidth chiral sum frequency generation (SFG) spectroscopy has been recently developed for protein kinetic studies at interfaces by tracking the chiral vibrational signals of proteins. In this article, we review our recent progress in kinetic studies of proteins at interfaces using broad-bandwidth chiral SFG spectroscopy. We illustrate the use of chiral SFG signals of protein side chains in the C–H stretch region to monitor self-assembly processes of proteins at interfaces. We also present the use of chiral SFG signals from the protein backbone in the N–H stretch region to probe the real-time kinetics of proton exchange between protein and water at interfaces. In addition, we demonstrate the applications of spectral features of chiral SFG that are typical of protein secondary structures in both the amide I and the N–H stretch regions for monitoring the kinetics of aggregation of amyloid proteins at membrane surfaces. These studies exhibit the power of broad-bandwidth chiral SFG to study protein kinetics at interfaces and the promise of this technique in research areas of surface science to address fundamental problems in biomedical and material sciences. PMID:26196215
Sakai, Tetsuro; Karausky, Patricia L; Valenti, Shannon L; Sandusky, Susan L; Hirsch, Sandra C; Xu, Yan
2013-09-01
To present a new research problem-based learning discussion (PBLD) conference and to evaluate its effect on residents. Retrospective observational study of resident education before and after implementation of a research PBLD. Large U.S. academic anesthesiology department. 93 anesthesiology residents with research PBLD exposure in the academic year (AY) 2010 and AY 2011, and 85 residents without research PBLD exposure in AY 2008 and AY 2009. Since AY 2010, a PBLD format has been used to teach residents clinical research fundamentals. The annual 90-minute PBLD addressed residents' perceived barriers to research and introduced research resources available via the Clinical and Translational Science Institute (CTSI). Data recorded were: 1) number of residents who made CTSI consultation solicitations as a new investigator, and 2) number of new research projects proposed by the residents and designed with CTSI consultation. Each outcome was compared between the prePBLD group (AY 2008 [n=43] and AY 2009 [n=42]) and the postPBLD group (AY 2010 [n=43] and AY 2011 [n=50]). The number of residents who consulted the CTSI as new investigators increased from 4 of 85 residents (4.7%) in the prePBLD group to 13 of 93 residents (14.0%) in the postPBLD group (P = 0.042). The number of new research projects for which the residents consulted CTSI increased from 10 to 20 (100% increase). A PBLD format for research education of anesthesiology residents is effective. © 2013 Elsevier Inc. All rights reserved.
Compact scheme for systems of equations applied to fundamental problems of mechanics of continua
NASA Technical Reports Server (NTRS)
Klimkowski, Jerzy Z.
1990-01-01
Compact scheme formulation was used in the treatment of boundary conditions for a system of coupled diffusion and Poisson equations. Models and practical solutions of specific engineering problems arising in solid mechanics, chemical engineering, heat transfer and fuid mechanics are described and analyzed for efficiency and accuracy. Only 2-D cases are discussed and a new method of numerical treatment of boundary conditions common in the fundamental problems of mechanics of continua is presented.
Managing the Budget: Stock-Flow Reasoning and the CO2 Accumulation Problem.
Newell, Ben R; Kary, Arthur; Moore, Chris; Gonzalez, Cleotilde
2016-01-01
The majority of people show persistent poor performance in reasoning about "stock-flow problems" in the laboratory. An important example is the failure to understand the relationship between the "stock" of CO2 in the atmosphere, the "inflow" via anthropogenic CO2 emissions, and the "outflow" via natural CO2 absorption. This study addresses potential causes of reasoning failures in the CO2 accumulation problem and reports two experiments involving a simple re-framing of the task as managing an analogous financial (rather than CO2 ) budget. In Experiment 1 a financial version of the task that required participants to think in terms of controlling debt demonstrated significant improvements compared to a standard CO2 accumulation problem. Experiment 2, in which participants were invited to think about managing savings, suggested that this improvement was fortuitous and coincidental rather than due to a fundamental change in understanding the stock-flow relationships. The role of graphical information in aiding or abetting stock-flow reasoning was also explored in both experiments, with the results suggesting that graphs do not always assist understanding. The potential for leveraging the kind of reasoning exhibited in such tasks in an effort to change people's willingness to reduce CO2 emissions is briefly discussed. Copyright © 2015 Cognitive Science Society, Inc.
Online Reinforcement Learning Using a Probability Density Estimation.
Agostini, Alejandro; Celaya, Enric
2017-01-01
Function approximation in online, incremental, reinforcement learning needs to deal with two fundamental problems: biased sampling and nonstationarity. In this kind of task, biased sampling occurs because samples are obtained from specific trajectories dictated by the dynamics of the environment and are usually concentrated in particular convergence regions, which in the long term tend to dominate the approximation in the less sampled regions. The nonstationarity comes from the recursive nature of the estimations typical of temporal difference methods. This nonstationarity has a local profile, varying not only along the learning process but also along different regions of the state space. We propose to deal with these problems using an estimation of the probability density of samples represented with a gaussian mixture model. To deal with the nonstationarity problem, we use the common approach of introducing a forgetting factor in the updating formula. However, instead of using the same forgetting factor for the whole domain, we make it dependent on the local density of samples, which we use to estimate the nonstationarity of the function at any given input point. To address the biased sampling problem, the forgetting factor applied to each mixture component is modulated according to the new information provided in the updating, rather than forgetting depending only on time, thus avoiding undesired distortions of the approximation in less sampled regions.
Conceptual Problems in the Foundations of Mechanics
NASA Astrophysics Data System (ADS)
Coelho, Ricardo Lopes
2012-09-01
There has been much research on principles and fundamental concepts of mechanics. Problems concerning the law of inertia, the concepts of force, fictitious force, weight, mass and the distinction between inertial and gravitational mass are addressed in the first part of the present paper. It is argued in the second that the law of inertia is the source of these problems. Consequences drawn from the law explain the metaphysical concept of force, the problematic concept of fictitious force, the nominal definition of weight and the difficulty with defining mass operationally. The core of this connection between the law and these consequences lies in the fact that acceleration is a sufficient condition for force. The experimental basis of the law in the course of its history shows, however, that the law presupposes acceleration necessarily whereas acceleration does not presuppose the law. Therefore, there is no inconvenience in taking acceleration independently of the law. This is enough to bypass those problems. Taking into account how force is measured by force meters and how mass is basically determined, by comparison with the standard mass, a minimal meaning for both concepts of force and mass is established. All this converges with several solutions proposed in the course of history and increases the communicability of mechanics, as outlined in the final part of this paper.
NASA Astrophysics Data System (ADS)
Liu, Benjamin M.; Abebe, Yitayew; McHugh, Oloro V.; Collick, Amy S.; Gebrekidan, Brhane; Steenhuis, Tammo S.
This study highlights two highly degraded watersheds in the semi-arid Amhara region of Ethiopia where integrated water resource management activities were carried out to decrease dependence on food aid through improved management of ‘green’ water. While top-down approaches require precise and centrally available knowledge to deal with the uncertainty in engineering design of watershed management projects, bottom-up approaches can succeed without such information by making extensive use of stakeholder knowledge. This approach works best in conjunction with the development of leadership confidence within local communities. These communities typically face a number of problems, most notably poverty, that prevent them from fully investing in the protection of their natural resources, so an integrated management system is needed to suitably address the interrelated problems. Many different implementing agencies were brought together in the two study watersheds to address water scarcity, crop production, and soil erosion, but the cornerstone was enabling local potential through the creation and strengthening of community watershed management organizations. Leadership training and the reinforcement of stakeholder feedback as a fundamental activity led to increased ownership and willingness to take on new responsibilities. A series of small short term successes ranging from micro-enterprise cooperatives to gully rehabilitation have resulted in the pilot communities becoming confident of their own capabilities and proud to share their successes and knowledge with other communities struggling with natural resource degradation.
eHealth Literacy: Essential Skills for Consumer Health in a Networked World.
Norman, Cameron D; Skinner, Harvey A
2006-06-16
Electronic health tools provide little value if the intended users lack the skills to effectively engage them. With nearly half the adult population in the United States and Canada having literacy levels below what is needed to fully engage in an information-rich society, the implications for using information technology to promote health and aid in health care, or for eHealth, are considerable. Engaging with eHealth requires a skill set, or literacy, of its own. The concept of eHealth literacy is introduced and defined as the ability to seek, find, understand, and appraise health information from electronic sources and apply the knowledge gained to addressing or solving a health problem. In this paper, a model of eHealth literacy is introduced, comprised of multiple literacy types, including an outline of a set of fundamental skills consumers require to derive direct benefits from eHealth. A profile of each literacy type with examples of the problems patient-clients might present is provided along with a resource list to aid health practitioners in supporting literacy improvement with their patient-clients across each domain. Facets of the model are illustrated through a set of clinical cases to demonstrate how health practitioners can address eHealth literacy issues in clinical or public health practice. Potential future applications of the model are discussed.
Finite Dimensional Approximations for Continuum Multiscale Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berlyand, Leonid
2017-01-24
The completed research project concerns the development of novel computational techniques for modeling nonlinear multiscale physical and biological phenomena. Specifically, it addresses the theoretical development and applications of the homogenization theory (coarse graining) approach to calculation of the effective properties of highly heterogenous biological and bio-inspired materials with many spatial scales and nonlinear behavior. This theory studies properties of strongly heterogeneous media in problems arising in materials science, geoscience, biology, etc. Modeling of such media raises fundamental mathematical questions, primarily in partial differential equations (PDEs) and calculus of variations, the subject of the PI’s research. The focus of completed researchmore » was on mathematical models of biological and bio-inspired materials with the common theme of multiscale analysis and coarse grain computational techniques. Biological and bio-inspired materials offer the unique ability to create environmentally clean functional materials used for energy conversion and storage. These materials are intrinsically complex, with hierarchical organization occurring on many nested length and time scales. The potential to rationally design and tailor the properties of these materials for broad energy applications has been hampered by the lack of computational techniques, which are able to bridge from the molecular to the macroscopic scale. The project addressed the challenge of computational treatments of such complex materials by the development of a synergistic approach that combines innovative multiscale modeling/analysis techniques with high performance computing.« less
Butler, Richard J; Brusatte, Stephen L; Andres, Brian; Benson, Roger B J
2012-01-01
A fundamental contribution of paleobiology to macroevolutionary theory has been the illumination of deep time patterns of diversification. However, recent work has suggested that taxonomic diversity counts taken from the fossil record may be strongly biased by uneven spatiotemporal sampling. Although morphological diversity (disparity) is also frequently used to examine evolutionary radiations, no empirical work has yet addressed how disparity might be affected by uneven fossil record sampling. Here, we use pterosaurs (Mesozoic flying reptiles) as an exemplar group to address this problem. We calculate multiple disparity metrics based upon a comprehensive anatomical dataset including a novel phylogenetic correction for missing data, statistically compare these metrics to four geological sampling proxies, and use multiple regression modeling to assess the importance of uneven sampling and exceptional fossil deposits (Lagerstätten). We find that range-based disparity metrics are strongly affected by uneven fossil record sampling, and should therefore be interpreted cautiously. The robustness of variance-based metrics to sample size and geological sampling suggests that they can be more confidently interpreted as reflecting true biological signals. In addition, our results highlight the problem of high levels of missing data for disparity analyses, indicating a pressing need for more theoretical and empirical work. © 2011 The Author(s). Evolution © 2011 The Society for the Study of Evolution.
Soft systems thinking and social learning for adaptive management.
Cundill, G; Cumming, G S; Biggs, D; Fabricius, C
2012-02-01
The success of adaptive management in conservation has been questioned and the objective-based management paradigm on which it is based has been heavily criticized. Soft systems thinking and social-learning theory expose errors in the assumption that complex systems can be dispassionately managed by objective observers and highlight the fact that conservation is a social process in which objectives are contested and learning is context dependent. We used these insights to rethink adaptive management in a way that focuses on the social processes involved in management and decision making. Our approach to adaptive management is based on the following assumptions: action toward a common goal is an emergent property of complex social relationships; the introduction of new knowledge, alternative values, and new ways of understanding the world can become a stimulating force for learning, creativity, and change; learning is contextual and is fundamentally about practice; and defining the goal to be addressed is continuous and in principle never ends. We believe five key activities are crucial to defining the goal that is to be addressed in an adaptive-management context and to determining the objectives that are desirable and feasible to the participants: situate the problem in its social and ecological context; raise awareness about alternative views of a problem and encourage enquiry and deconstruction of frames of reference; undertake collaborative actions; and reflect on learning. ©2011 Society for Conservation Biology.
Fundamental aspects of the phase retrieval problem
NASA Astrophysics Data System (ADS)
Ferwerda, H. A.
1980-12-01
A review is given of the fundamental aspects of the phase retrieval problem in optical imaging for one dimension. The phase problem is treated using the fact that the wavefunction in the image-plane is a band-limited entire function of order 1. The ambiguity of the phase reconstruction is formulated in terms of the complex zeros of entire functions. Procedures are given how the relevant zeros might be determined. When the zeros are known one can derive dispersion relations which relate the phase of the wavefunction to the intensity distribution. The phase problem of coherence theory is similar to the previously discussed problem and is briefly touched upon. The extension of the phase problem to two dimensions is not straight-forward and still remains to be solved.
NASA Astrophysics Data System (ADS)
Zurbuchen, Thomas H.
2007-04-01
There is a need for a motivated and innovative work force for the U.S. aerospace industry. The education of such engineers and scientists typically revolves around a fundamental knowledge of basic important technologies, such as the mechanics relevant to orbit-design, structures, avionics, and many others. A few years ago, the University of Michigan developed a Masters of Engineering program that provides students with skills that are not taught as part of a typical engineering curriculum. This program is focused on open problem solving, space systems, and space policy, as well as other classes that further their understanding of the connections between technologies and the nontechnical aspects of managing a space mission. The value of such an education is substantially increased through a direct connection to industry. An innovative problem-oriented approach has been developed that enables direct connections between industry and classroom teaching. The class works as a system study group and addresses problems of interest to and defined by a company with a specific application. We discuss such an application, a near-space lidar wind measurement system to enhance weather predictions, as well as the approach taken to link educational rationales.
Stability analysis of multiple-robot control systems
NASA Technical Reports Server (NTRS)
Wen, John T.; Kreutz, Kenneth
1989-01-01
In a space telerobotic service scenario, cooperative motion and force control of multiple robot arms are of fundamental importance. Three paradigms to study this problem are proposed. They are distinguished by the set of variables used for control design. They are joint torques, arm tip force vectors, and an accelerated generalized coordinate set. Control issues related to each case are discussed. The latter two choices require complete model information, which presents practical modeling, computational, and robustness problems. Therefore, focus is on the joint torque control case to develop relatively model independent motion and internal force control laws. The rigid body assumption allows the motion and force control problems to be independently addressed. By using an energy motivated Lyapunov function, a simple proportional derivative plus gravity compensation type of motion control law is always shown to be stabilizing. The asymptotic convergence of the tracing error to zero requires the use of a generalized coordinate with the contact constraints taken into account. If a non-generalized coordinate is used, only convergence to a steady state manifold can be concluded. For the force control, both feedforward and feedback schemes are analyzed. The feedback control, if proper care has been taken, exhibits better robustness and transient performance.
Guo, Wenzhong; Hong, Wei; Zhang, Bin; Chen, Yuzhong; Xiong, Naixue
2014-01-01
Mobile security is one of the most fundamental problems in Wireless Sensor Networks (WSNs). The data transmission path will be compromised for some disabled nodes. To construct a secure and reliable network, designing an adaptive route strategy which optimizes energy consumption and network lifetime of the aggregation cost is of great importance. In this paper, we address the reliable data aggregation route problem for WSNs. Firstly, to ensure nodes work properly, we propose a data aggregation route algorithm which improves the energy efficiency in the WSN. The construction process achieved through discrete particle swarm optimization (DPSO) saves node energy costs. Then, to balance the network load and establish a reliable network, an adaptive route algorithm with the minimal energy and the maximum lifetime is proposed. Since it is a non-linear constrained multi-objective optimization problem, in this paper we propose a DPSO with the multi-objective fitness function combined with the phenotype sharing function and penalty function to find available routes. Experimental results show that compared with other tree routing algorithms our algorithm can effectively reduce energy consumption and trade off energy consumption and network lifetime. PMID:25215944
Fundamentals of Diesel Engines.
ERIC Educational Resources Information Center
Marine Corps Inst., Washington, DC.
This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the fundamentals of diesel engine mechanics. Addressed in the three individual units of the course are the following topics: basic principles of diesel mechanics; principles, mechanics, and…
Fundamentals of the Slide Library.
ERIC Educational Resources Information Center
Boerner, Susan Zee
This paper is an introduction to the fundamentals of the art (including architecture) slide library, with some emphasis on basic procedures of the science slide library. Information in this paper is particularly relevant to the college, university, and museum slide library. Topics addressed include: (1) history of the slide library; (2) duties of…
Fundamentals of Physics, 6th Edition Enhanced Problems Version
NASA Astrophysics Data System (ADS)
Halliday, David; Resnick, Robert; Walker, Jearl
2002-04-01
No other text on the market today can match the success of Halliday, Resnick and Walker's Fundamentals of Physics. This text continues to outperform the competition year after year, and the new edition will be no exception. Intended for Calculus-based Physics courses, the 6th edition of this extraordinary text is a major redesign of the best-selling 5th edition, which still maintains many of the elements that led to its enormous success. Jearl Walker adds his unique style to this edition with the addition of new problems designed to capture, and keep, students' attention. Nearly all changes are based on suggestions from instructors and students using the 5th edition, from reviewer comments, and from research done on the process of learning. The primary goal of this text is to provide students with a solid understanding of fundamental physics concepts, and to help them apply this conceptual understanding to quantitative problem solving. The principal goal of Halliday-Resnick-Walker is to provide instructors with a tool by which they can teach students how to effectively read scientific material and successfully reason through scientific questions. To sharpen this tool, the Enhanced Problems Version of the sixth edition of Fundamentals of Physics contains over 1000 new, high-quality problems that require thought and reasoning rather than simplistic plugging of data into formulas.
Investigating the Conceptual Variation of Major Physics Textbooks
NASA Astrophysics Data System (ADS)
Stewart, John; Campbell, Richard; Clanton, Jessica
2008-04-01
The conceptual problem content of the electricity and magnetism chapters of seven major physics textbooks was investigated. The textbooks presented a total of 1600 conceptual electricity and magnetism problems. The solution to each problem was decomposed into its fundamental reasoning steps. These fundamental steps are, then, used to quantify the distribution of conceptual content among the set of topics common to the texts. The variation of the distribution of conceptual coverage within each text is studied. The variation between the major groupings of the textbooks (conceptual, algebra-based, and calculus-based) is also studied. A measure of the conceptual complexity of the problems in each text is presented.
Ergonomics and sustainability: towards an embrace of complexity and emergence.
Dekker, Sidney W A; Hancock, Peter A; Wilkin, Peter
2013-01-01
Technology offers a promising route to a sustainable future, and ergonomics can serve a vital role. The argument of this article is that the lasting success of sustainability initiatives in ergonomics hinges on an examination of ergonomics' own epistemology and ethics. The epistemology of ergonomics is fundamentally empiricist and positivist. This places practical constraints on its ability to address important issues such as sustainability, emergence and complexity. The implicit ethical position of ergonomics is one of neutrality, and its positivist epistemology generally puts value-laden questions outside the parameters of what it sees as scientific practice. We argue, by contrast, that a discipline that deals with both technology and human beings cannot avoid engaging with questions of complexity and emergence and seeking innovative ways of addressing these issues. Ergonomics has largely modelled its research on a reductive science, studying parts and problems to fix. In sustainability efforts, this can lead to mere local adaptations with a negative effect on global sustainability. Ergonomics must consider quality of life globally, appreciating complexity and emergent effects of local relationships.
NASA Astrophysics Data System (ADS)
Reiter, D. T.; Rodi, W. L.
2015-12-01
Constructing 3D Earth models through the joint inversion of large geophysical data sets presents numerous theoretical and practical challenges, especially when diverse types of data and model parameters are involved. Among the challenges are the computational complexity associated with large data and model vectors and the need to unify differing model parameterizations, forward modeling methods and regularization schemes within a common inversion framework. The challenges can be addressed in part by decomposing the inverse problem into smaller, simpler inverse problems that can be solved separately, providing one knows how to merge the separate inversion results into an optimal solution of the full problem. We have formulated an approach to the decomposition of large inverse problems based on the augmented Lagrangian technique from optimization theory. As commonly done, we define a solution to the full inverse problem as the Earth model minimizing an objective function motivated, for example, by a Bayesian inference formulation. Our decomposition approach recasts the minimization problem equivalently as the minimization of component objective functions, corresponding to specified data subsets, subject to the constraints that the minimizing models be equal. A standard optimization algorithm solves the resulting constrained minimization problems by alternating between the separate solution of the component problems and the updating of Lagrange multipliers that serve to steer the individual solution models toward a common model solving the full problem. We are applying our inversion method to the reconstruction of the·crust and upper-mantle seismic velocity structure across Eurasia.· Data for the inversion comprise a large set of P and S body-wave travel times·and fundamental and first-higher mode Rayleigh-wave group velocities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Acquesta, Erin C.S.; Valicka, Christopher G.; Hinga, Mark B.
As a tool developed to translate geospatial data into geometrical descriptors, Tracktable offers a highly efficient means to detect anomalous flight and maritime behavior. Following the success of using geometrical descriptors for detecting anomalous trajectory behavior, the question of whether Tracktable could be used to detect satellite maneuvers arose. In answering this question, this re- port will introduce a brief description of how Tracktable has been used in the past, along with an introduction to the fundamental properties of astrodynamics for satellite trajectories. This will then allow us to compare the two problem spaces, addressing how easily the methods usedmore » by Tracktable will translate to orbital mechanics. Based on these results, we will then be able to out- line the current limitations as well as possible path forward for using Tracktable to detect satellite maneuvers.« less
Chaffee, Benjamin W; Couch, Elizabeth T; Ryder, Mark I
2016-06-01
Although the prevalence of tobacco use has declined in some parts of the world, tobacco use remains a persistent and, in some cases, growing problem that will continue to be a fundamental challenge facing dental practitioners in the decades ahead. Dental practitioners have a unique opportunity and professional obligation to be a positive influence in reducing the economic and social burden inflicted by tobacco use on dental and general health. In this article, the current noninvasive, evidence-based approaches are presented for dental practitioners to help patients avoid initiating tobacco use, to encourage and assist patients in ceasing tobacco use and to address tobacco-induced damage to periodontal supporting tissues. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
User-Centered Design for Psychosocial Intervention Development and Implementation
Lyon, Aaron R.; Koerner, Kelly
2018-01-01
The current paper articulates how common difficulties encountered when attempting to implement or scale-up evidence-based treatments are exacerbated by fundamental design problems, which may be addressed by a set of principles and methods drawn from the contemporary field of user-centered design. User-centered design is an approach to product development that grounds the process in information collected about the individuals and settings where products will ultimately be used. To demonstrate the utility of this perspective, we present four design concepts and methods: (a) clear identification of end users and their needs, (b) prototyping/rapid iteration, (c) simplifying existing intervention parameters/procedures, and (d) exploiting natural constraints. We conclude with a brief design-focused research agenda for the developers and implementers of evidence-based treatments. PMID:29456295
Teaching Electrostatics and Entropy in Introductory Physics
NASA Astrophysics Data System (ADS)
Reeves, Mark
Entropy changes underlie the physics that dominates biological interactions. Indeed, introductory biology courses often begin with an exploration of the qualities of water that are important to living systems. However, one idea that is not explicitly addressed in most introductory physics or biology courses is important contribution of the entropy in driving fundamental biological processes towards equilibrium. I will present material developed to teach electrostatic screening in solutions and the function of nerve cells where entropic effects act to counterbalance electrostatic attraction. These ideas are taught in an introductory, calculus-based physics course to biomedical engineers using SCALEUP pedagogy. Results of student mastering of complex problems that cross disciplinary boundaries between biology and physics, as well as the challenges that they face in learning this material will be presented.
The generation, destination, and astrophysical applications of magnetohydrodynamic turbulence
NASA Astrophysics Data System (ADS)
Xu, Siyao; Lazarian, Alex; Zhang, Bing
2017-01-01
The ubiquitous turbulence in the interstellar medium (ISM) participates in astrophysical processes over a huge dynamic range of scales. Understanding the turbulence properties in the multiphase, magnetized, partially ionized, and compressible ISM is the fundamental step prior to the studies of the ISM physics and other fields of astrophysics. I feel that a triad of analytical, numerical and observational efforts provides a winning combination to understand this complex system and solve long-standing puzzles. I have intensively studied the fundamental physics of magnetohydrodynamic (MHD) turbulence, and focused on two primary domains, dynamo and dissipation, which concern the origin of strong magnetic fields and the destination of turbulence, respectively. I further applied my theoretical studies in interpreting numerical results and observational data in various astrophysical contexts. The advanced analyses of MHD turbulence enable me to address a number of challenging astrophysical problems, e.g. the importance of magnetic fields for star formation in the early and present-day universe, new methods of measuring magnetic fields, the density distribution in the Galaxy and the host galaxy of a fast radio burst, the diffusion and acceleration of cosmic rays in partially ionized ISM phases.
Virtual laboratories: new opportunities for collaborative water science
NASA Astrophysics Data System (ADS)
Ceola, Serena; Arheimer, Berit; Bloeschl, Guenter; Baratti, Emanuele; Capell, Rene; Castellarin, Attilio; Freer, Jim; Han, Dawei; Hrachowitz, Markus; Hundecha, Yeshewatesfa; Hutton, Christopher; Lindström, Goran; Montanari, Alberto; Nijzink, Remko; Parajka, Juraj; Toth, Elena; Viglione, Alberto; Wagener, Thorsten
2015-04-01
Reproducibility and repeatability of experiments are the fundamental prerequisites that allow researchers to validate results and share hydrological knowledge, experience and expertise in the light of global water management problems. Virtual laboratories offer new opportunities to enable these prerequisites since they allow experimenters to share data, tools and pre-defined experimental procedures (i.e. protocols). Here we present the outcomes of a first collaborative numerical experiment undertaken by five different international research groups in a virtual laboratory to address the key issues of reproducibility and repeatability. Moving from the definition of accurate and detailed experimental protocols, a rainfall-runoff model was independently applied to 15 European catchments by the research groups and model results were collectively examined through a web-based discussion. We found that a detailed modelling protocol was crucial to ensure the comparability and reproducibility of the proposed experiment across groups. Our results suggest that sharing comprehensive and precise protocols and running the experiments within a controlled environment (e.g. virtual laboratory) is as fundamental as sharing data and tools for ensuring experiment repeatability and reproducibility across the broad scientific community and thus advancing hydrology in a more coherent way.
Fundamental study of subharmonic vibration of order 1/2 in automatic transmissions for cars
NASA Astrophysics Data System (ADS)
Ryu, T.; Nakae, T.; Matsuzaki, K.; Nanba, A.; Takikawa, Y.; Ooi, Y.; Sueoka, A.
2016-09-01
A torque converter is an element that transfers torque from the engine to the gear train in the automatic transmission of an automobile. The damper spring of the lock-up clutch in the torque converter is used to effectively absorb the torsional vibration caused by engine combustion. A damper with low stiffness reduces fluctuations in rotational speed but is difficult to use because of space limitations. In order to address this problem, the damper is designed using a piecewise-linear spring with three stiffness stages. However, the damper causes a nonlinear vibration referred to as a subharmonic vibration of order 1/2. In the subharmonic vibration, the frequency is half that of the vibrations from the engine. In order to clarify the mechanism of the subharmonic vibration, in the present study, experiments are conducted using the fundamental experimental apparatus of a single-degree-of-freedom system with two stiffness stages. In the experiments, countermeasures to reduce the subharmonic vibration by varying the conditions of the experiments are also performed. The results of the experiments are evaluated through numerical analysis using the shooting method. The experimental and analytical results were found to be in close agreement.
Systemic Risk Analysis on Reconstructed Economic and Financial Networks
Cimini, Giulio; Squartini, Tiziano; Garlaschelli, Diego; Gabrielli, Andrea
2015-01-01
We address a fundamental problem that is systematically encountered when modeling real-world complex systems of societal relevance: the limitedness of the information available. In the case of economic and financial networks, privacy issues severely limit the information that can be accessed and, as a consequence, the possibility of correctly estimating the resilience of these systems to events such as financial shocks, crises and cascade failures. Here we present an innovative method to reconstruct the structure of such partially-accessible systems, based on the knowledge of intrinsic node-specific properties and of the number of connections of only a limited subset of nodes. This information is used to calibrate an inference procedure based on fundamental concepts derived from statistical physics, which allows to generate ensembles of directed weighted networks intended to represent the real system—so that the real network properties can be estimated as their average values within the ensemble. We test the method both on synthetic and empirical networks, focusing on the properties that are commonly used to measure systemic risk. Indeed, the method shows a remarkable robustness with respect to the limitedness of the information available, thus representing a valuable tool for gaining insights on privacy-protected economic and financial systems. PMID:26507849
Systemic Risk Analysis on Reconstructed Economic and Financial Networks
NASA Astrophysics Data System (ADS)
Cimini, Giulio; Squartini, Tiziano; Garlaschelli, Diego; Gabrielli, Andrea
2015-10-01
We address a fundamental problem that is systematically encountered when modeling real-world complex systems of societal relevance: the limitedness of the information available. In the case of economic and financial networks, privacy issues severely limit the information that can be accessed and, as a consequence, the possibility of correctly estimating the resilience of these systems to events such as financial shocks, crises and cascade failures. Here we present an innovative method to reconstruct the structure of such partially-accessible systems, based on the knowledge of intrinsic node-specific properties and of the number of connections of only a limited subset of nodes. This information is used to calibrate an inference procedure based on fundamental concepts derived from statistical physics, which allows to generate ensembles of directed weighted networks intended to represent the real system—so that the real network properties can be estimated as their average values within the ensemble. We test the method both on synthetic and empirical networks, focusing on the properties that are commonly used to measure systemic risk. Indeed, the method shows a remarkable robustness with respect to the limitedness of the information available, thus representing a valuable tool for gaining insights on privacy-protected economic and financial systems.
Closed-loop transfer recovery with observer-based controllers. I - Analysis. II - Design
NASA Technical Reports Server (NTRS)
Chen, Ben M.; Saberi, Ali; Ly, Uy-Loi
1992-01-01
A detailed study is presented of three fundamental issues related to the problem of closed-loop transfer (CLT) recovery. The first issues concerns what can and cannot be achieved for a given system and for an arbitrary target CLT function (TCLTF). The second issue involves developing necessary and/or sufficient conditions for a TCLTF to be recoverable either exactly or approximately. The third issue involves the necessary and/or sufficient conditions on a given system such that it has at least one recoverable TCLTF. The results of the analysis identify some fundamental limitations of the given system as a consequence of its structural properties which enables designers to appreciate at the outset different design limitations incurred in the synthesis of output-feedback controllers. Then, the actual design of full-order or reduced-order observer-based controllers is addressed which will achieve as close as possibly the desired TCLTF. Three design methods are considered: (1) the ATEA method, (2) a method that minimizes the H2-norm of a recovery matrix, and (3) a method that minimizes the respective H(infinity) norm. The relative merits of the methods are discussed.
A self-learning camera for the validation of highly variable and pseudorandom patterns
NASA Astrophysics Data System (ADS)
Kelley, Michael
2004-05-01
Reliable and productive manufacturing operations have depended on people to quickly detect and solve problems whenever they appear. Over the last 20 years, more and more manufacturing operations have embraced machine vision systems to increase productivity, reliability and cost-effectiveness, including reducing the number of human operators required. Although machine vision technology has long been capable of solving simple problems, it has still not been broadly implemented. The reason is that until now, no machine vision system has been designed to meet the unique demands of complicated pattern recognition. The ZiCAM family was specifically developed to be the first practical hardware to meet these needs. To be able to address non-traditional applications, the machine vision industry must include smart camera technology that meets its users" demands for lower costs, better performance and the ability to address applications of irregular lighting, patterns and color. The next-generation smart cameras will need to evolve as a fundamentally different kind of sensor, with new technology that behaves like a human but performs like a computer. Neural network based systems, coupled with self-taught, n-space, non-linear modeling, promises to be the enabler of the next generation of machine vision equipment. Image processing technology is now available that enables a system to match an operator"s subjectivity. A Zero-Instruction-Set-Computer (ZISC) powered smart camera allows high-speed fuzzy-logic processing, without the need for computer programming. This can address applications of validating highly variable and pseudo-random patterns. A hardware-based implementation of a neural network, Zero-Instruction-Set-Computer, enables a vision system to "think" and "inspect" like a human, with the speed and reliability of a machine.
Beyond Consultation: First Nations and the Governance of Shale Gas in British Columbia
NASA Astrophysics Data System (ADS)
Garvie, Kathryn Henderson
As the province of British Columbia seeks to rapidly develop an extensive natural gas industry, it faces a number of challenges. One of these is that of ensuring that development does not disproportionately impact some of the province's most marginalized communities: the First Nations on whose land extraction will take place. This is particularly crucial given that environmental problems are often caused by unjust and inequitable social conditions that must be rectified before sustainable development can be advanced. This research investigates how the BC Oil and Gas Commission's consultation process addresses, and could be improved to better address Treaty 8 First Nations' concerns regarding shale gas development within their traditional territories. Interviews were conducted with four Treaty 8 First Nations, the Treaty 8 Tribal Association, and provincial government and industry staff. Additionally, participant observation was conducted with the Fort Nelson First Nation Lands and Resources Department. Findings indicate that like many other resource consultation processes in British Columbia, the oil and gas consultation process is unable to meaningfully address First Nations' concerns and values due to fundamental procedural problems, including the permit-by-permit approach and the exclusion of First Nations from the point of decision-making. Considering the government's failure to regulate the shale gas industry in a way that protects ecological, social and cultural resilience, we argue that new governance mechanisms are needed that reallocate authority to First Nations and incorporate proposals for early engagement, long-term planning and cumulative impact assessment and monitoring. Additionally, considering the exceptional power differential between government, industry and First Nations, we argue that challenging industry's social license to operate is an important strategy for First Nations working to gain greater influence over development within their territories, and to ensure a more sustainable shale gas industry.
Dark Energy: A Crisis for Fundamental Physics
Stubbs, Christopher [Harvard University, Cambridge, Massachusetts, USA
2017-12-09
Astrophysical observations provide robust evidence that our current picture of fundamental physics is incomplete. The discovery in 1998 that the expansion of the Universe is accelerating (apparently due to gravitational repulsion between regions of empty space!) presents us with a profound challenge, at the interface between gravity and quantum mechanics. This "Dark Energy" problem is arguably the most pressing open question in modern fundamental physics. The first talk will describe why the Dark Energy problem constitutes a crisis, with wide-reaching ramifications. One consequence is that we should probe our understanding of gravity at all accessible scales, and the second talk will present experiments and observations that are exploring this issue.
The First R: Fundamentals of Initial Reading Instruction. Developments in Classroom Instruction.
ERIC Educational Resources Information Center
Shuman, R. Baird
Addressing subjects ranging from reading readiness to phonics, this book examines several fundamental elements of beginning reading instruction. Divided into 12 chapters, the book begins with a chapter providing a general overview of reading instruction, including the debate between the perception of reading as decoding or comprehension, and other…
ERIC Educational Resources Information Center
Faria, Carlos; Vale, Carolina; Machado, Toni; Erlhagen, Wolfram; Rito, Manuel; Monteiro, Sérgio; Bicho, Estela
2016-01-01
Robotics has been playing an important role in modern surgery, especially in procedures that require extreme precision, such as neurosurgery. This paper addresses the challenge of teaching robotics to undergraduate engineering students, through an experiential learning project of robotics fundamentals based on a case study of robot-assisted…
2016-03-01
CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR( S ) 5d. PROJECT NUMBER 5e. TASK NUMBER 5f. WORK UNIT...NUMBER 7. PERFORMING ORGANIZATION NAME( S ) AND ADDRESS(ES) 8. PERFORMING ORGANIZATION REPORT NUMBER 9...SPONSORING / MONITORING AGENCY NAME( S ) AND ADDRESS(ES) 10. SPONSOR/MONITOR’S ACRONYM( S ) 11. SPONSOR/MONITOR’S REPORT NUMBER( S ) 12
HARNESSING THE CHEMISTRY OF CO{sub 2}
DOE Office of Scientific and Technical Information (OSTI.GOV)
Louie, Janis
2012-11-30
Our research presents several strategies for addressing the challenges of activating CO2. In addition, our cycloaddition chemistry addresses several fundamental issues pertaining to catalysis as it applies to energy conservation. Topics addressed include: DEVELOPMENT OF A CYCLOADDITION CATALYST; INCREASING THE UTILITY OF THE NI CYCLOADDITION CATALYST; UNDERSTANDING THE MECHANISM OF NI-CATALYZED CYCLOADDITION; and METAL-FREE CO{sub 2} ACTIVATION.
Global and Domestic Spheres: Impact on The Traditional Settlement of Penglipuran in Bali
NASA Astrophysics Data System (ADS)
Suartika, G. A. M.
2018-02-01
The issue addressed is how to situate the ‘life-world’ of a traditional Balinese village in the context of global tourism which is now worth USD 7.61 trillion, This problem has been brought sharply into focus, indeed amplified, by the recently adopted policy of local government in Bali to designate specific villages as tourist sites. Such action appears to be taking place in the absence of any prior studies of the social, environmental and other impacts on local communities. The following paper represents an initial attempt to sketch out the breadth of the problem from the global to the local. The paper first proceeds by examining the conceptual framework of globalisation - fundamentally an economic process, - from which global cultural identities merge. It proceeds to give an overview of policy at a global level starting with the idea of designated United Nations World Heritage sites and the inevitable commodification of ‘place.’ Then the issues and problems that flow from these first sections are discussed. Next, the situation in Indonesia as a whole is brought into sharp relief, focussing on policy and legislation that affect the internationally celebrated island of Bali. Finally, a case study of Penglipuran, a traditional Bali Aga village is discussed taking into account the preceding context. In conclusion, suggestions are made as to the significant research and policy issues that need to be addressed concurrently with the development of the settlement. The study documented in this paper was conducted using phenomenological approaches. It was grounded by researcher's experiences, gained from her deep involvement in community life as well as physical and site observation activities during data collection period.
Atmospheric science and power production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Randerson, D.
1984-07-01
This is the third in a series of scientific publications sponsored by the US Atomic Energy Commission and the two later organizations, the US Energy Research and Development Adminstration, and the US Department of Energy. The first book, Meteorology and Atomic Energy, was published in 1955; the second, in 1968. The present volume is designed to update and to expand upon many of the important concepts presented previously. However, the present edition draws heavily on recent contributions made by atmospheric science to the analysis of air quality and on results originating from research conducted and completed in the 1970s. Specialmore » emphasis is placed on how atmospheric science can contribute to solving problems relating to the fate of combustion products released into the atmosphere. The framework of this book is built around the concept of air-quality modeling. Fundamentals are addressed first to equip the reader with basic background information and to focus on available meteorological instrumentation and to emphasize the importance of data management procedures. Atmospheric physics and field experiments are described in detail to provide an overview of atmospheric boundary layer processes, of how air flows around obstacles, and of the mechanism of plume rise. Atmospheric chemistry and removal processes are also detailed to provide fundamental knowledge on how gases and particulate matter can be transformed while in the atmosphere and how they can be removed from the atmosphere. The book closes with a review of how air-quality models are being applied to solve a wide variety of problems. Separate analytics have been prepared for each chapter.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, G
2004-02-05
Synchrotron-based techniques are fundamental to research in ''Molecular Environmental Science'' (MES), an emerging field that involves molecular-level studies of chemical and biological processes affecting the speciation, properties, and behavior of contaminants, pollutants, and nutrients in the ecosphere. These techniques enable the study of aqueous solute complexes, poorly crystalline materials, solid-liquid interfaces, mineral-aqueous solution interactions, microbial biofilm-heavy metal interactions, heavy metal-plant interactions, complex material microstructures, and nanomaterials, all of which are important components or processes in the environment. Basic understanding of environmental materials and processes at the molecular scale is essential for risk assessment and management, and reduction of environmental pollutantsmore » at field, landscape, and global scales. One of the main purposes of this report is to illustrate the role of synchrotron radiation (SR)-based studies in environmental science and related fields and their impact on environmental problems of importance to society. A major driving force for MES research is the need to characterize, treat, and/or dispose of vast quantities of contaminated materials, including groundwater, sediments, and soils, and to process wastes, at an estimated cost exceeding 150 billion dollars through 2070. A major component of this problem derives from high-level nuclear waste. Other significant components come from mining and industrial wastes, atmospheric pollutants derived from fossil fuel consumption, agricultural pesticides and fertilizers, and the pollution problems associated with animal waste run-off, all of which have major impacts on human health and welfare. Addressing these problems requires the development of new characterization and processing technologies--efforts that require information on the chemical speciation of heavy metals, radionuclides, and xenobiotic organic compounds and their reactions with environmental materials. To achieve this goal, both fundamental and targeted studies of complex environmental systems at a molecular level are needed, and examples of both types of studies are presented herein. These examples illustrate the fact that MES SR studies have led to a revolution in our understanding of the fundamental physical and chemical aspects of natural systems. The MES SR user community has continued to experience strong growth at U.S. SR laboratories, with MES researchers comprising up to 15% of the total user base. Further growth and development of the MES community is being hindered by insufficient resources, including support personnel, materials preparation facilities, and available beam time at U.S. SR laboratories. ''EnviroSync'' recommends the following actions, in cooperation with U.S. SR laboratory directors, to meet the MES community's needs.« less
Addressing problems of employee performance.
McConnell, Charles R
2011-01-01
Employee performance problems are essentially of 2 kinds: those that are motivational in origin and those resulting from skill deficiencies. Both kinds of problems are the province of the department manager. Performance problems differ from problems of conduct in that traditional disciplinary processes ordinarily do not apply. Rather, performance problems are addressed through educational and remedial processes. The manager has a basic responsibility in ensuring that everything reasonable is done to help each employee succeed. There are a number of steps the manager can take to address employee performance problems.
Toward information management in corporations (2)
NASA Astrophysics Data System (ADS)
Shibata, Mitsuru
If construction of inhouse information management systems in an advanced information society should be positioned along with the social information management, its base making begins with reviewing current paper filing systems. Since the problems which inhere in inhouse information management systems utilizing OA equipments also inhere in paper filing systems, the first step toward full scale inhouse information management should be to grasp and solve the fundamental problems in current filing systems. This paper describes analysis of fundamental problems in filing systems, making new type of offices and analysis of improvement needs in filing systems, and some points in improving filing systems.
NASA Astrophysics Data System (ADS)
Chen, Wen; Wang, Fajie
Based on the implicit calculus equation modeling approach, this paper proposes a speculative concept of the potential and wave operators on negative dimensionality. Unlike the standard partial differential equation (PDE) modeling, the implicit calculus modeling approach does not require the explicit expression of the PDE governing equation. Instead the fundamental solution of physical problem is used to implicitly define the differential operator and to implement simulation in conjunction with the appropriate boundary conditions. In this study, we conjecture an extension of the fundamental solution of the standard Laplace and Helmholtz equations to negative dimensionality. And then by using the singular boundary method, a recent boundary discretization technique, we investigate the potential and wave problems using the fundamental solution on negative dimensionality. Numerical experiments reveal that the physics behaviors on negative dimensionality may differ on positive dimensionality. This speculative study might open an unexplored territory in research.
State analysis requirements database for engineering complex embedded systems
NASA Technical Reports Server (NTRS)
Bennett, Matthew B.; Rasmussen, Robert D.; Ingham, Michel D.
2004-01-01
It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering tool called the State Analysis Database, which provides a tool for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using the State Analysis Database.
A Framework for Sharing and Integrating Remote Sensing and GIS Models Based on Web Service
Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin
2014-01-01
Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a “black box” and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users. PMID:24901016
Multiscale methods for computational RNA enzymology
Panteva, Maria T.; Dissanayake, Thakshila; Chen, Haoyuan; Radak, Brian K.; Kuechler, Erich R.; Giambaşu, George M.; Lee, Tai-Sung; York, Darrin M.
2016-01-01
RNA catalysis is of fundamental importance to biology and yet remains ill-understood due to its complex nature. The multi-dimensional “problem space” of RNA catalysis includes both local and global conformational rearrangements, changes in the ion atmosphere around nucleic acids and metal ion binding, dependence on potentially correlated protonation states of key residues and bond breaking/forming in the chemical steps of the reaction. The goal of this article is to summarize and apply multiscale modeling methods in an effort to target the different parts of the RNA catalysis problem space while also addressing the limitations and pitfalls of these methods. Classical molecular dynamics (MD) simulations, reference interaction site model (RISM) calculations, constant pH molecular dynamics (CpHMD) simulations, Hamiltonian replica exchange molecular dynamics (HREMD) and quantum mechanical/molecular mechanical (QM/MM) simulations will be discussed in the context of the study of RNA backbone cleavage transesterification. This reaction is catalyzed by both RNA and protein enzymes, and here we examine the different mechanistic strategies taken by the hepatitis delta virus ribozyme (HDVr) and RNase A. PMID:25726472
NASA Astrophysics Data System (ADS)
He, Xingyu; Tong, Ningning; Hu, Xiaowei
2018-01-01
Compressive sensing has been successfully applied to inverse synthetic aperture radar (ISAR) imaging of moving targets. By exploiting the block sparse structure of the target image, sparse solution for multiple measurement vectors (MMV) can be applied in ISAR imaging and a substantial performance improvement can be achieved. As an effective sparse recovery method, sparse Bayesian learning (SBL) for MMV involves a matrix inverse at each iteration. Its associated computational complexity grows significantly with the problem size. To address this problem, we develop a fast inverse-free (IF) SBL method for MMV. A relaxed evidence lower bound (ELBO), which is computationally more amiable than the traditional ELBO used by SBL, is obtained by invoking fundamental property for smooth functions. A variational expectation-maximization scheme is then employed to maximize the relaxed ELBO, and a computationally efficient IF-MSBL algorithm is proposed. Numerical results based on simulated and real data show that the proposed method can reconstruct row sparse signal accurately and obtain clear superresolution ISAR images. Moreover, the running time and computational complexity are reduced to a great extent compared with traditional SBL methods.
A framework for sharing and integrating remote sensing and GIS models based on Web service.
Chen, Zeqiang; Lin, Hui; Chen, Min; Liu, Deer; Bao, Ying; Ding, Yulin
2014-01-01
Sharing and integrating Remote Sensing (RS) and Geographic Information System/Science (GIS) models are critical for developing practical application systems. Facilitating model sharing and model integration is a problem for model publishers and model users, respectively. To address this problem, a framework based on a Web service for sharing and integrating RS and GIS models is proposed in this paper. The fundamental idea of the framework is to publish heterogeneous RS and GIS models into standard Web services for sharing and interoperation and then to integrate the RS and GIS models using Web services. For the former, a "black box" and a visual method are employed to facilitate the publishing of the models as Web services. For the latter, model integration based on the geospatial workflow and semantic supported marching method is introduced. Under this framework, model sharing and integration is applied for developing the Pearl River Delta water environment monitoring system. The results show that the framework can facilitate model sharing and model integration for model publishers and model users.
Physiology, behavior, and conservation.
Cooke, Steven J; Blumstein, Daniel T; Buchholz, Richard; Caro, Tim; Fernández-Juricic, Esteban; Franklin, Craig E; Metcalfe, Julian; O'Connor, Constance M; St Clair, Colleen Cassady; Sutherland, William J; Wikelski, Martin
2014-01-01
Many animal populations are in decline as a result of human activity. Conservation practitioners are attempting to prevent further declines and loss of biodiversity as well as to facilitate recovery of endangered species, and they often rely on interdisciplinary approaches to generate conservation solutions. Two recent interfaces in conservation science involve animal behavior (i.e., conservation behavior) and physiology (i.e., conservation physiology). To date, these interfaces have been considered separate entities, but from both pragmatic and biological perspectives, there is merit in better integrating behavior and physiology to address applied conservation problems and to inform resource management. Although there are some institutional, conceptual, methodological, and communication-oriented challenges to integrating behavior and physiology to inform conservation actions, most of these barriers can be overcome. Through outlining several successful examples that integrate these disciplines, we conclude that physiology and behavior can together generate meaningful data to support animal conservation and management actions. Tangentially, applied conservation and management problems can, in turn, also help advance and reinvigorate the fundamental disciplines of animal physiology and behavior by providing advanced natural experiments that challenge traditional frameworks.
Genetic testing and private insurance--a case of "selling one's body"?
Hübner, D
2006-01-01
Arguments against the possible use of genetic test results in private health and life insurance predominantly refer to the problem of certain gene carriers failing to obtain affordable insurance cover. However, some moral intuitions speaking against this practice seem to be more fundamental than mere concerns about adverse distributional effects. In their perspective, the central ethical problem is not that some people might fail to get insurance cover because of their 'bad genes', but rather that some people would manage to get insurance cover because of their 'good genes'. This paper tries to highlight the ethical background of these intuitions. Their guiding idea appears to be that, by pointing to his favourable test results, a customer might make an attempt to 'sell his body'. The rationale of this concept is developed and its applicability to the case at issue is critically investigated. The aim is to clarify an essential objection against the use of genetic information in private insurance which has not yet been openly addressed in the academic debate of the topic.
Color appearance for photorealistic image synthesis
NASA Astrophysics Data System (ADS)
Marini, Daniele; Rizzi, Alessandro; Rossi, Maurizio
2000-12-01
Photorealistic Image Synthesis is a relevant research and application field in computer graphics, whose aim is to produce synthetic images that are undistinguishable from real ones. Photorealism is based upon accurate computational models of light material interaction, that allow us to compute the spectral intensity light field of a geometrically described scene. The fundamental methods are ray tracing and radiosity. While radiosity allows us to compute the diffuse component of the emitted and reflected light, applying ray tracing in a two pass solution we can also cope with non diffuse properties of the model surfaces. Both methods can be implemented to generate an accurate photometric distribution of light of the simulated environment. A still open problem is the visualization phase, whose purpose is to display the final result of the simulated mode on a monitor screen or on a printed paper. The tone reproduction problem consists of finding the best solution to compress the extended dynamic range of the computed light field into the limited range of the displayable colors. Recently some scholars have addressed this problem considering the perception stage of image formation, so including a model of the human visual system in the visualization process. In this paper we present a working hypothesis to solve the tone reproduction problem of synthetic image generation, integrating Retinex perception model into the photo realistic image synthesis context.
Fundamental Issues Concerning the Sustainment and Scaling Up of Professional Development Programs
ERIC Educational Resources Information Center
Tirosh, Dina; Tsamir, Pessia; Levenson, Esther
2015-01-01
The issue of sustaining and scaling up professional development for mathematics teachers raises several fundamental issues for researchers. This commentary addresses various definitions for sustainability and scaling up and how these definitions may affect the design of programs as well as the design of research. We consider four of the papers in…
ERIC Educational Resources Information Center
Sherrard, J. H., Ed.
Papers are presented identifying fundamental research needs in water and wastewater treatment by industrial users of technology, industrial users of research, a municipal water department, a consulting engineer, Congress, and the EPA. Areas of research needs addressed include: (1) microbial, viral, and organic contaminants; (2) biological…
Aldous, Leigh; Bendova, Magdalena; Gonzalez-Miquel, Maria; Swadźba-Kwaśny, Małgorzata
2018-05-22
For the third time, a Faraday Discussion addressed ionic liquids. Encompassing the wealth of research in this field, the contributions ranged from fundamental insights to the diverse applications of ionic liquids. Lively discussions initiated in the lecture hall and during poster sessions then seamlessly continued during the social program.
Couples' Reports of Relationship Problems in a Naturalistic Therapy Setting
ERIC Educational Resources Information Center
Boisvert, Marie-Michele; Wright, John; Tremblay, Nadine; McDuff, Pierre
2011-01-01
Understanding couples' relationship problems is fundamental to couple therapy. Although research has documented common relationship problems, no study has used open-ended questions to explore problems in couples seeking therapy in naturalistic settings. The present study used a reliable coding system to explore the relationship problems reported…
An introduction to structural health monitoring.
Farrar, Charles R; Worden, Keith
2007-02-15
The process of implementing a damage identification strategy for aerospace, civil and mechanical engineering infrastructure is referred to as structural health monitoring (SHM). Here, damage is defined as changes to the material and/or geometric properties of these systems, including changes to the boundary conditions and system connectivity, which adversely affect the system's performance. A wide variety of highly effective local non-destructive evaluation tools are available for such monitoring. However, the majority of SHM research conducted over the last 30 years has attempted to identify damage in structures on a more global basis. The past 10 years have seen a rapid increase in the amount of research related to SHM as quantified by the significant escalation in papers published on this subject. The increased interest in SHM and its associated potential for significant life-safety and economic benefits has motivated the need for this theme issue. This introduction begins with a brief history of SHM technology development. Recent research has begun to recognize that the SHM problem is fundamentally one of the statistical pattern recognition (SPR) and a paradigm to address such a problem is described in detail herein as it forms the basis for organization of this theme issue. In the process of providing the historical overview and summarizing the SPR paradigm, the subsequent articles in this theme issue are cited in an effort to show how they fit into this overview of SHM. In conclusion, technical challenges that must be addressed if SHM is to gain wider application are discussed in a general manner.
Dark Energy: A Crisis for Fundamental Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stubbs, Christopher
2010-04-12
Astrophysical observations provide robust evidence that our current picture of fundamental physics is incomplete. The discovery in 1998 that the expansion of the Universe is accelerating (apparently due to gravitational repulsion between regions of empty space!) presents us with a profound challenge, at the interface between gravity and quantum mechanics. This "Dark Energy" problem is arguably the most pressing open question in modern fundamental physics. The first talk will describe why the Dark Energy problem constitutes a crisis, with wide-reaching ramifications. One consequence is that we should probe our understanding of gravity at all accessible scales, and the second talkmore » will present experiments and observations that are exploring this issue.« less
Using Invention to Change How Students Tackle Problems
Smith, Karen M.; van Stolk, Adrian P.; Spiegelman, George B.
2010-01-01
Invention activities challenge students to tackle problems that superficially appear unrelated to the course material but illustrate underlying fundamental concepts that are fundamental to material that will be presented. During our invention activities in a first-year biology class, students were presented with problems that are parallel to those that living cells must solve, in weekly sessions over a 13-wk term. We compared students who participated in the invention activities sessions with students who participated in sessions of structured problem solving and with students who did not participate in either activity. When faced with developing a solution to a challenging and unfamiliar biology problem, invention activity students were much quicker to engage with the problem and routinely provided multiple reasonable hypotheses. In contrast the other students were significantly slower in beginning to work on the problem and routinely produced relatively few ideas. We suggest that the invention activities develop a highly valuable skill that operates at the initial stages of problem solving. PMID:21123697
An Assessment of the State-of-the-Art in Multidisciplinary Aeromechanical Analyses
2008-01-01
monolithic formulations. In summary, for aerospace structures, partitioned formulations provide fundamental advantages over fully coupled ones, in addition...important frequencies of local analysis directly to global analysis using detailed modeling. Performed ju- diciously, based on a fundamental understanding of...in 2000 has com- prehensively described the problem, and reviewed the status of fundamental understanding, experimental data, and analytical
ERIC Educational Resources Information Center
Kotlyarov, I. V.; Kostyukevich, S. V.; Yakovleva, N. I.
2015-01-01
In this article we investigate the problem of the balance of fundamental and applied training in technical colleges through the lens of a historical analysis of the development of the Soviet school of engineering. We demonstrate that the Soviet school of engineering became overreliant on fundamental education due to historical features of its…
ERIC Educational Resources Information Center
Rajagopalan, Kanavillil
2006-01-01
The objective of this response article is to think through some of what I see as the far-reaching implications of a recent paper by Eric Hauser (2005) entitled "Coding 'corrective recasts': the maintenance of meaning and more fundamental problems". Hauser makes a compelling, empirically-backed case for his contention that, contrary to widespread…
Inorganic Photovoltaics Materials and Devices: Past, Present, and Future
NASA Technical Reports Server (NTRS)
Hepp, Aloysius F.; Bailey, Sheila G.; Rafaelle, Ryne P.
2005-01-01
This report describes recent aspects of advanced inorganic materials for photovoltaics or solar cell applications. Specific materials examined will be high-efficiency silicon, gallium arsenide and related materials, and thin-film materials, particularly amorphous silicon and (polycrystalline) copper indium selenide. Some of the advanced concepts discussed include multi-junction III-V (and thin-film) devices, utilization of nanotechnology, specifically quantum dots, low-temperature chemical processing, polymer substrates for lightweight and low-cost solar arrays, concentrator cells, and integrated power devices. While many of these technologies will eventually be used for utility and consumer applications, their genesis can be traced back to challenging problems related to power generation for aerospace and defense. Because this overview of inorganic materials is included in a monogram focused on organic photovoltaics, fundamental issues and metrics common to all solar cell devices (and arrays) will be addressed.
The Peer Social Networks of Young Children with Down Syndrome in Classroom Programmes
Guralnick, Michael J.; Connor, Robert T.; Johnson, L. Clark
2010-01-01
Background The nature and characteristics of the peer social networks of young children with Down syndrome in classroom settings were examined within a developmental framework. Method Comparisons were made with younger typically developing children matched on mental age and typically developing children matched on chronological age. Results Similar patterns were found for all three groups for most peer social network measures. However, group differences were obtained for measures of teacher assistance and peer interactions in unstructured situations. Conclusions Positive patterns appeared to be related to the social orientation of children with Down syndrome and the special efforts of teachers to support children’s peer social networks. Findings also suggested that fundamental peer competence problems for children with Down syndrome remain and may best be addressed within the framework of contemporary models of peer-related social competence. PMID:21765644
LSST Astroinformatics And Astrostatistics: Data-oriented Astronomical Research
NASA Astrophysics Data System (ADS)
Borne, Kirk D.; Stassun, K.; Brunner, R. J.; Djorgovski, S. G.; Graham, M.; Hakkila, J.; Mahabal, A.; Paegert, M.; Pesenson, M.; Ptak, A.; Scargle, J.; Informatics, LSST; Statistics Team
2011-01-01
The LSST Informatics and Statistics Science Collaboration (ISSC) focuses on research and scientific discovery challenges posed by the very large and complex data collection that LSST will generate. Application areas include astroinformatics, machine learning, data mining, astrostatistics, visualization, scientific data semantics, time series analysis, and advanced signal processing. Research problems to be addressed with these methodologies include transient event characterization and classification, rare class discovery, correlation mining, outlier/anomaly/surprise detection, improved estimators (e.g., for photometric redshift or early onset supernova classification), exploration of highly dimensional (multivariate) data catalogs, and more. We present sample science results from these data-oriented approaches to large-data astronomical research. We present results from LSST ISSC team members, including the EB (Eclipsing Binary) Factory, the environmental variations in the fundamental plane of elliptical galaxies, and outlier detection in multivariate catalogs.
Unsupervised learning of natural languages
Solan, Zach; Horn, David; Ruppin, Eytan; Edelman, Shimon
2005-01-01
We address the problem, fundamental to linguistics, bioinformatics, and certain other disciplines, of using corpora of raw symbolic sequential data to infer underlying rules that govern their production. Given a corpus of strings (such as text, transcribed speech, chromosome or protein sequence data, sheet music, etc.), our unsupervised algorithm recursively distills from it hierarchically structured patterns. The adios (automatic distillation of structure) algorithm relies on a statistical method for pattern extraction and on structured generalization, two processes that have been implicated in language acquisition. It has been evaluated on artificial context-free grammars with thousands of rules, on natural languages as diverse as English and Chinese, and on protein data correlating sequence with function. This unsupervised algorithm is capable of learning complex syntax, generating grammatical novel sentences, and proving useful in other fields that call for structure discovery from raw data, such as bioinformatics. PMID:16087885
Unsupervised learning of natural languages.
Solan, Zach; Horn, David; Ruppin, Eytan; Edelman, Shimon
2005-08-16
We address the problem, fundamental to linguistics, bioinformatics, and certain other disciplines, of using corpora of raw symbolic sequential data to infer underlying rules that govern their production. Given a corpus of strings (such as text, transcribed speech, chromosome or protein sequence data, sheet music, etc.), our unsupervised algorithm recursively distills from it hierarchically structured patterns. The adios (automatic distillation of structure) algorithm relies on a statistical method for pattern extraction and on structured generalization, two processes that have been implicated in language acquisition. It has been evaluated on artificial context-free grammars with thousands of rules, on natural languages as diverse as English and Chinese, and on protein data correlating sequence with function. This unsupervised algorithm is capable of learning complex syntax, generating grammatical novel sentences, and proving useful in other fields that call for structure discovery from raw data, such as bioinformatics.
Developments in capture-γ libraries for nonproliferation applications
NASA Astrophysics Data System (ADS)
Hurst, A. M.; Firestone, R. B.; Sleaford, B. W.; Bleuel, D. L.; Basunia, M. S.; Bečvář, F.; Belgya, T.; Bernstein, L. A.; Carroll, J. J.; Detwiler, B.; Escher, J. E.; Genreith, C.; Goldblum, B. L.; Krtička, M.; Lerch, A. G.; Matters, D. A.; McClory, J. W.; McHale, S. R.; Révay, Zs.; Szentmiklosi, L.; Turkoglu, D.; Ureche, A.; Vujic, J.
2017-09-01
The neutron-capture reaction is fundamental for identifying and analyzing the γ-ray spectrum from an unknown assembly because it provides unambiguous information on the neutron-absorbing isotopes. Nondestructive-assay applications may exploit this phenomenon passively, for example, in the presence of spontaneous-fission neutrons, or actively where an external neutron source is used as a probe. There are known gaps in the Evaluated Nuclear Data File libraries corresponding to neutron-capture γ-ray data that otherwise limit transport-modeling applications. In this work, we describe how new thermal neutron-capture data are being used to improve information in the neutron-data libraries for isotopes relevant to nonproliferation applications. We address this problem by providing new experimentally-deduced partial and total neutron-capture reaction cross sections and then evaluate these data by comparison with statistical-model calculations.
Slow dynamics in translation-invariant quantum lattice models
NASA Astrophysics Data System (ADS)
Michailidis, Alexios A.; Žnidarič, Marko; Medvedyeva, Mariya; Abanin, Dmitry A.; Prosen, Tomaž; Papić, Z.
2018-03-01
Many-body quantum systems typically display fast dynamics and ballistic spreading of information. Here we address the open problem of how slow the dynamics can be after a generic breaking of integrability by local interactions. We develop a method based on degenerate perturbation theory that reveals slow dynamical regimes and delocalization processes in general translation invariant models, along with accurate estimates of their delocalization time scales. Our results shed light on the fundamental questions of the robustness of quantum integrable systems and the possibility of many-body localization without disorder. As an example, we construct a large class of one-dimensional lattice models where, despite the absence of asymptotic localization, the transient dynamics is exceptionally slow, i.e., the dynamics is indistinguishable from that of many-body localized systems for the system sizes and time scales accessible in experiments and numerical simulations.
Settling into the midstream? Lessons for governance from the decade of nanotechnology
NASA Astrophysics Data System (ADS)
Bosso, Christopher
2016-06-01
This paper analyzes scholarly papers published from 2003 through 2013 on the general theme of nanotechnology and governance. It considers three general points: (1) the "problem" of nanotechnology; (2) general lessons for governance obtained; and (3) prospects for aligning the US regulatory system to the next generation of complex engineered nano-materials. It argues that engineered nano-materials and products are coming to market within an already mature regulatory framework of decade-old statutes, long-standing bureaucratic rules and routines, narrowly directive judicial decisions, and embedded institutional norms. That extant regulatory regime shapes how policymakers perceive, define, and address the relative benefits and risks of both proximate and yet-to-be idealized nano-materials and applications. The paper concludes that fundamental reforms in the extant regime are unlikely short of a perceived crisis.
Dimension-dependent stimulated radiative interaction of a single electron quantum wavepacket
NASA Astrophysics Data System (ADS)
Gover, Avraham; Pan, Yiming
2018-06-01
In the foundation of quantum mechanics, the spatial dimensions of electron wavepacket are understood only in terms of an expectation value - the probability distribution of the particle location. One can still inquire how the quantum electron wavepacket size affects a physical process. Here we address the fundamental physics problem of particle-wave duality and the measurability of a free electron quantum wavepacket. Our analysis of stimulated radiative interaction of an electron wavepacket, accompanied by numerical computations, reveals two limits. In the quantum regime of long wavepacket size relative to radiation wavelength, one obtains only quantum-recoil multiphoton sidebands in the electron energy spectrum. In the opposite regime, the wavepacket interaction approaches the limit of classical point-particle acceleration. The wavepacket features can be revealed in experiments carried out in the intermediate regime of wavepacket size commensurate with the radiation wavelength.
The coming fiscal crisis: nephrology in the line of fire.
Andersen, Martin J; Friedman, Allon N
2013-07-01
Nephrologists in the United States face a very uncertain economic future. The astronomical federal debt and unfunded liability burden of Medicare combined with the aging population will place unprecedented strain on the health care sector. To address these fundamental problems, it is conceivable that the federal government will ultimately institute rationing and other budget-cutting measures to rein in costs of ESRD care, which is generously funded relative to other chronic illnesses. Therefore, nephrologists should expect implementation of cost-cutting measures, such age-based rationing, mandated delayed dialysis and home therapies, compensated organ donation, and a shift in research priorities from the dialysis to the predialysis patient population. Nephrologists also need to recognize that these changes, which are geared toward the population level, may make it more difficult to advocate effectively for the needs of individual patients.
Spacelab Life Sciences 1 and 2 scientific research objectives
NASA Technical Reports Server (NTRS)
Leach, Carolyn S.; Schneider, Howard J.
1987-01-01
The pressurized Spacelab module was designed and built to allow investigators to conduct research in space in an environment approximating that of a ground-based laboratory. It is configured to allow multiple investigations employing both human and nonhuman subjects. This flexability is exemplified by the SLS-1, SLS-2, and SLS-3 experiment complement. A total of 21 experiments are scheduled for these missions; the areas to be investigated are renal/endocrine function, cardiovascular/cardiopulmonary function, hematology, immunology, metabolic activity of muscle, Ca metabolism, the vestibular system, and general biology. A plan for integration of measurements will allow each investigator to use data from other experiments. The experiments make up a scientifically balanced payload that addresses fundamental biomedical problems associated with space flight and provides the first opportunity to study the acute effects of weightlessness in a comprehensive, interrelated fashion.
Men in Groups: Anthropology and Aggression, 1965-84.
Milam, Erika Lorraine
2015-01-01
By the late 1950s, Harry Frank Guggenheim was concerned with understanding why some charismatic leaders fought for freedom, while others sought power and domination. He believed that best-selling books on ethological approaches to animal and human behavior, especially those by playwright and screenwriter Robert Ardrey, promised a key to this dilemma, and he created a foundation that would fund research addressing problems of violence, aggression, and dominance. Under the directorship of Rutgers University professors Robin Fox and Lionel Tiger, the Harry Frank Guggenheim Foundation fostered scientific investigations into the biological basis of human nature. This essay analyzes their discussions of aggression as fundamental to the behavior of men in groups in order to elucidate the private and professional dimensions of masculine networks of US philanthropic and academic authority in the late 1960s and 1970s.
Abusive behavior is barrier to high-reliability health care systems, culture of patient safety.
Cassirer, C; Anderson, D; Hanson, S; Fraser, H
2000-11-01
Addressing abusive behavior in the medical workplace presents an important opportunity to deliver on the national commitment to improve patient safety. Fundamentally, the issue of patient safety and the issue of abusive behavior in the workplace are both about harm. Undiagnosed and untreated, abusive behavior is a barrier to creating high reliability service delivery systems that ensure patient safety. Health care managers and clinicians need to improve their awareness, knowledge, and understanding of the issue of workplace abuse. The available research suggests there is a high prevalence of workplace abuse in medicine. Both administrators at the blunt end and clinicians at the sharp end should consider learning new approaches to defining and treating the problem of workplace abuse. Eliminating abusive behavior has positive implications for preventing and controlling medical injury and improving organizational performance.
NASA Technical Reports Server (NTRS)
Milman, Mark H.
1987-01-01
The fundamental control synthesis issue of establishing a priori convergence rates of approximation schemes for feedback controllers for a class of distributed parameter systems is addressed within the context of hereditary systems. Specifically, a factorization approach is presented for deriving approximations to the optimal feedback gains for the linear regulator-quadratic cost problem associated with time-varying functional differential equations with control delays. The approach is based on a discretization of the state penalty which leads to a simple structure for the feedback control law. General properties of the Volterra factors of Hilbert-Schmidt operators are then used to obtain convergence results for the controls, trajectories and feedback kernels. Two algorithms are derived from the basic approximation scheme, including a fast algorithm, in the time-invariant case. A numerical example is also considered.
Cardiovascular Bio-Engineering: Current State of the Art.
Simon-Yarza, Teresa; Bataille, Isabelle; Letourneur, Didier
2017-04-01
Despite the introduction of new drugs and innovative devices contributing in the last years to improve patients' quality of life, morbidity and mortality from cardiovascular diseases remain high. There is an urgent need for addressing the underlying problem of the loss of cardiac or vascular tissues and therefore developing new therapies. Autologous vascular transplants are often limited by poor quality of donor sites and heart organ transplantation by donor shortage. Vascular and cardiac tissue engineering, whose aim is to repair or replace cardiovascular tissues by the use of cells, engineering and materials, as well as biochemical and physicochemical factors, appears in this scenario as a promising tool to repair the damaged hearts and vessels. We will present a general overview on the fundamentals in the area of cardiac and vascular tissue engineering as well as on the latest progresses and challenges.
The Use of NMR to Study Transient Carbohydrate-Protein Interactions.
Nieto, Pedro M
2018-01-01
Carbohydrates are biologically ubiquitous and are essential to the existence of all known living organisms. Although they are better known for their role as energy sources (glucose/glycogen or starch) or structural elements (chitin or cellulose), carbohydrates also participate in the recognition events of molecular recognition processes. Such interactions with other biomolecules (nucleic acids, proteins, and lipids) are fundamental to life and disease. This review focuses on the application of NMR methods to understand at the atomic level the mechanisms by which sugar molecules can be recognized by proteins to form complexes, creating new entities with different properties to those of the individual component molecules. These processes have recently gained attention as new techniques have been developed, while at the same time old techniques have been reinvented and adapted to address newer emerging problems.
NASA Technical Reports Server (NTRS)
Milman, Mark H.
1988-01-01
The fundamental control synthesis issue of establishing a priori convergence rates of approximation schemes for feedback controllers for a class of distributed parameter systems is addressed within the context of hereditary schemes. Specifically, a factorization approach is presented for deriving approximations to the optimal feedback gains for the linear regulator-quadratic cost problem associated with time-varying functional differential equations with control delays. The approach is based on a discretization of the state penalty which leads to a simple structure for the feedback control law. General properties of the Volterra factors of Hilbert-Schmidt operators are then used to obtain convergence results for the controls, trajectories and feedback kernels. Two algorithms are derived from the basic approximation scheme, including a fast algorithm, in the time-invariant case. A numerical example is also considered.
Pacific Northwest National Laboratory institutional plan FY 1997--2002
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-10-01
Pacific Northwest National Laboratory`s core mission is to deliver environmental science and technology in the service of the nation and humanity. Through basic research fundamental knowledge is created of natural, engineered, and social systems that is the basis for both effective environmental technology and sound public policy. Legacy environmental problems are solved by delivering technologies that remedy existing environmental hazards, today`s environmental needs are addressed with technologies that prevent pollution and minimize waste, and the technical foundation is being laid for tomorrow`s inherently clean energy and industrial processes. Pacific Northwest National Laboratory also applies its capabilities to meet selected nationalmore » security, energy, and human health needs; strengthen the US economy; and support the education of future scientists and engineers. Brief summaries are given of the various tasks being carried out under these broad categories.« less
Toxicity reduction in industrial effluents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1990-01-01
Wastewater treatment technology is undergoing a profound transformation as a result of the fundamental changes in regulations and permit requirements. Established design procedures and criteria which have served the industry well for decades are no longer useful. Toxicity reduction requirements have forced reconsideration of design standards and caused practicing environmental engineers to seek additional training in the biological sciences. Formal academic programs have not traditionally provided the cross-training between biologists and engineers which is necessary to address these issues. This book describes not only the process of identifying the toxicity problem, but also the treatment technologies which are applicable tomore » reduction or elimination of toxicity. The information provided in this book is a compilation of the experience of ECK-ENFELDER INC. in serving the environmental needs of major industry, and the experience of the individual contributors in research and consultations.« less
Space Based Ornithology: On the Wings of Migration and Biophysics
NASA Technical Reports Server (NTRS)
Smith, James A.
2005-01-01
The study of bird migration on a global scale is one of the compelling and challenging problems of modern biology with major implications for human health and conservation biology. Migration and conservation efforts cross national boundaries and are subject to numerous international agreements and treaties. Space based technology offers new opportunities to shed understanding on the distribution and migration of organisms on the planet and their sensitivity to human disturbances and environmental changes. Migration is an incredibly diverse and complex behavior. A broad outline of space based research must address three fundamental questions: (1) where could birds be, i.e. what is their fundamental niche constrained by their biophysical limits? (2) where do we actually find birds, i.e. what is their realizable niche as modified by local or regional abiotic and biotic factors, and (3) how do they get there (and how do we know?), that is what are their migration patterns and associated mechanisms? Our working hypothesis is that individual organism biophysical models of energy and water balance, driven by satellite measurements of spatio-temporal gradients in climate and habitat, will help us to explain the variability in avian species richness and distribution. Dynamic state variable modeling provides one tool for studying bird migration across multiple scales and can be linked to mechanistic models describing the time and energy budget states of migrating birds. Such models yield an understanding of how a migratory flyway and its component habitats function as a whole and link stop-over ecology with biological conservation and management. Further these models provide an ecological forecasting tool for science and application users to address what are the possible consequences of loss of wetlands, flooding, drought or other natural disasters such as hurricanes on avian biodiversity and bird migration.
Strategies for parenting by mothers and fathers with a mental illness.
van der Ende, P C; van Busschbach, J T; Nicholson, J; Korevaar, E L; van Weeghel, J
2016-03-01
WHAT IS KNOWN ON THE SUBJECT?: The combination of coping with their mental health problems and caring for children makes parents vulnerable. Family-centred practice can help to maintain and strengthen important family relationships, and to identify and enhance the strengths of a parent with a mental illness, all contributing to the recovery of the person with the mental illness. WHAT THIS PAPER ADDS TO THE EXISTING KNOWLEDGE?: Taking the strength and the opportunities formulated by parents themselves as a starting point is fairly new. Parents with severe mental illness find strength for parenting in several ways. They feel responsible, and this helps them to stay alert while parenting, whereas parenthood also offers a basis for social participation through school contacts and the child's friendships. Dedication to the parent role provides a focus; parents develop strengths and skills as they find a balance between attending to their own lives and caring for their children; and parenting prompts them to find adequate sources of social support. In this study these strategies were found to be the fundamentals of recovery related to parenting. WHAT ARE THE IMPLICATIONS FOR PRACTICE?: Nurses can support and coach patients who are identified as parents, and self-chosen parenting related goals are set and addressed. A family-focused approach by nurses can be used to prevent problems for children and their families, identify their strengths as well as vulnerabilities, and address the challenges to build resilience. Understanding of the problems of parents with mental illness is growing. Gaining insight into strategies for parenting, while taking the opportunities formulated by these parents themselves as a starting point is fairly new. What are the strategies of parents with a mental illness to be successful? Experiences of 19 mothers and eight fathers with a mental illness were explored with in-depth interviews. Data were content analysed, using qualitative methods. Next to feelings of inadequacy, interviewees also describe how children enrich and structure their lives and are not only a burden but serve as distraction from problems. Developing activities that interest both child and parent provides avenues for emerging strength. Mental illness constrains fathers, but also gives opportunities to develop a meaningful relation with their children. Strategies like being fully dedicated to the parental role, finding a balance between attention for one's own life and parenting and finding adequate sources of support are found to be fundamental for recovery in the parent role. Implications for practice Peer groups can be of valuable help and mental health workers can support parents to set self-chosen parenting related goals. © 2016 John Wiley & Sons Ltd.
On the numerical modeling of sliding beams: A comparison of different approaches
NASA Astrophysics Data System (ADS)
Steinbrecher, Ivo; Humer, Alexander; Vu-Quoc, Loc
2017-11-01
The transient analysis of sliding beams represents a challenging problem of structural mechanics. Typically, the sliding motion superimposed by large flexible deformation requires numerical methods as, e.g., finite elements, to obtain approximate solutions. By means of the classical sliding spaghetti problem, the present paper provides a guideline to the numerical modeling with conventional finite element codes. For this purpose, two approaches, one using solid elements and one using beam elements, respectively, are employed in the analysis, and the characteristics of each approach are addressed. The contact formulation realizing the interaction of the beam with its support demands particular attention in the context of sliding structures. Additionally, the paper employs the sliding-beam formulation as a third approach, which avoids the numerical difficulties caused by the large sliding motion through a suitable coordinate transformation. The present paper briefly outlines the theoretical fundamentals of the respective approaches for the modeling of sliding structures and gives a detailed comparison by means of the sliding spaghetti serving as a representative example. The specific advantages and limitations of the different approaches with regard to accuracy and computational efficiency are discussed in detail. Through the comparison, the sliding-beam formulation, which proves as an effective approach for the modeling, can be validated for the general problem of a sliding structure subjected to large deformation.
BIOZON: a system for unification, management and analysis of heterogeneous biological data.
Birkland, Aaron; Yona, Golan
2006-02-15
Integration of heterogeneous data types is a challenging problem, especially in biology, where the number of databases and data types increase rapidly. Amongst the problems that one has to face are integrity, consistency, redundancy, connectivity, expressiveness and updatability. Here we present a system (Biozon) that addresses these problems, and offers biologists a new knowledge resource to navigate through and explore. Biozon unifies multiple biological databases consisting of a variety of data types (such as DNA sequences, proteins, interactions and cellular pathways). It is fundamentally different from previous efforts as it uses a single extensive and tightly connected graph schema wrapped with hierarchical ontology of documents and relations. Beyond warehousing existing data, Biozon computes and stores novel derived data, such as similarity relationships and functional predictions. The integration of similarity data allows propagation of knowledge through inference and fuzzy searches. Sophisticated methods of query that span multiple data types were implemented and first-of-a-kind biological ranking systems were explored and integrated. The Biozon system is an extensive knowledge resource of heterogeneous biological data. Currently, it holds more than 100 million biological documents and 6.5 billion relations between them. The database is accessible through an advanced web interface that supports complex queries, "fuzzy" searches, data materialization and more, online at http://biozon.org.
Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng
2015-01-09
The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for one to gain deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient to identify numerous interactomes despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized-recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data into an interactome weight matrix, where the feature-vectors of involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among involved proteins, for taking the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly.
Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng
2015-01-01
The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for one to gain deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient to identify numerous interactomes despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized-recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data into an interactome weight matrix, where the feature-vectors of involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among involved proteins, for taking the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly. PMID:25572661
NASA Astrophysics Data System (ADS)
Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng
2015-01-01
The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for one to gain deep insights into both fundamental cell biology processes and the pathology of diseases. Finely-set small-scale experiments are not only very expensive but also inefficient to identify numerous interactomes despite their high accuracy. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to the problem of personalized-recommendation on large, sparse rating matrices, this work aims at implementing a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for it. Under this framework, we model the given data into an interactome weight matrix, where the feature-vectors of involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among involved proteins, for taking the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach outperforms several sophisticated topology-based approaches significantly.
ERIC Educational Resources Information Center
Contreras, Jose
2007-01-01
In this article, I model how a problem-posing framework can be used to enhance our abilities to systematically generate mathematical problems by modifying the attributes of a given problem. The problem-posing model calls for the application of the following fundamental mathematical processes: proving, reversing, specializing, generalizing, and…
The twilight of the training analysis system.
Kernberg, Otto F
2014-04-01
This paper briefly reviews challenges to psychoanalysis at this time, including those derived from both external, societal origins and internal psychoanalytic problems. It focuses attention on serious conflicts around psychoanalytic education, and refers to the training analysis system as a central problem determining fundamental constraints on present-day psychoanalytic education. These constraints are examined in some detail, and the general advantages and disadvantages of the training analysis system are outlined. The effects of all these dynamics on the administrative organization of the American Psychoanalytic Association are explored, and a proposal for a fundamental reorganization of our educational system to resolve the correspondent problems is outlined.
Let us keep observing and play in sand boxes (Henry Darcy Medal Lecture)
NASA Astrophysics Data System (ADS)
Illangasekare, T. H.
2012-04-01
Henry Darcy was a civil engineer recognized for a number of technical achievements and scientific discoveries. The sand column experiments for which he is known revealed the linear relationship that exists between fluid motion and driving forces at low velocities. Freeze and Back (1983) stated, ''The experiments carried out by Darcy with the help of his assistant, Ritter, in Dijon, France in 1855 and 1856 represent the beginning of groundwater hydrology as a quantitative science." Because of the prominence given to this experiment, two important facts behind Darcy's contributions to subsurface hydrology have not received much attention. First, Darcy was not only a good engineer, but he was also a highly respected scientist whose knowledge of both the fundamentals of fluid mechanics and the natural world of geology led to better conceptualizing and quantifying of groundwater processes at relevant scales to solve practical problems. The experiments for which he is known may have already been conceived, based on his theoretical understanding, and the results were anticipated (Brown 2002). Second, Darcy, through his contributions with Dupuit, showed that they understood hydrologeology at a regional scale and developed methods for quantification at the scale of geologic stratum (Ritz and Bobek, 2008). The primary thesis of this talk is that scientific contributions such as the one Darcy made require appreciation and a thorough understanding of fundamental theory coupled with observation and recording of phenomena both in nature and in the laboratory. Along with all of the significant theoretical, mathematical modeling, and computational advances we have made in the last several decades, laboratory experiments designed to observe phenomena and processes for better insight, accurate data generation, and hypothesis development are critically important to make scientific and engineering advances to address some of the emerging and societally important problems in hydrology and water resources engineering. Kleinhans et al. (2010) convincingly argued the same point, noting, "Many major issues of hydrology are open to experimental investigation." Current and emerging problems with water supply and their hydrologic implications are associated with sustainability of water as a resource for global food production, clean water for potable use, protection of human health, and impacts and implications of global warming and climate change on water resources. This talk will address the subsurface hydrologic science issues that are central to these problems and the role laboratory experimentation can play in helping to advance the basic knowledge. Improved understanding of fundamental flow, transport, reactive, and biological processes that occur at the pore-scale and their manifestation at different modeling and observational scales will continue to advance the subsurface science. Challenges also come from the need to integrate porous media systems with bio-geochemical and atmospheric systems, requiring observing and quantifying complex phenomena across interfaces (e.g., fluid/fluid in pores to land/atmospheric in the field). This talk will discuss how carefully designed and theory driven experiments at various test scales can play a central role in providing answers to critical scientific questions and how they will help to fill knowledge gaps. It will also be shown that careful observations will lead to the refinement of existing theories or the development of new ones. Focusing on the subsurface, the need to keep observing through controlled laboratory experimentation in various test scales from small cells to large sand boxes will be emphasized. How the insights obtained from such experiments will complement modeling and field investigations are highlighted through examples.
NASA Astrophysics Data System (ADS)
Karasik, Valeriy; Ryzhii, Viktor; Yurchenko, Stanislav
2014-03-01
The 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies' (RJUS TeraTech - 2013) Bauman Moscow State Technical University Moscow, Russia, 3-6 June, 2013 The 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies' (RJUS TeraTech - 2013) was held in Bauman Moscow State Technical University on 3-6 June 2013 and was devoted to modern problems of terahertz optical technologies. RJUS TeraTech 2013 was organized by Bauman Moscow State Technical University in cooperation with Tohoku University (Sendai, Japan) and University of Buffalo (The State University of New York, USA). The Symposium was supported by Bauman Moscow State Technical University (Moscow, Russia) and Russian Foundation for Basic Research (grant number 13-08-06100-g). RJUS TeraTech - 2013 became a foundation for sharing and discussing modern and promising achievements in fundamental and applied problems of terahertz optical technologies, devices based on grapheme and grapheme strictures, condensed matter of different nature. Among participants of RJUS TeraTech - 2013, there were more than 100 researchers and students from different countries. This volume contains proceedings of the 2nd Russia-Japan-USA Symposium 'The Fundamental & Applied Problems of Terahertz Devices & Technologies'. Valeriy Karasik, Viktor Ryzhii and Stanislav Yurchenko Bauman Moscow State Technical University Symposium chair Anatoliy A Aleksandrov, Rector of BMSTU Symposium co-chair Valeriy E Karasik, Head of the Research and Educational Center 'PHOTONICS AND INFRARED TECHNOLOGY' (Russia) Invited Speakers Taiichi Otsuji, Research Institute of Electrical Communication, Tohoku University, Sendai, Japan Akira Satou, Research Institute of Electrical Communication, Tohoku University, Sendai, Japan Michael Shur, Electrical, Computer and System Engineering and Physics, Applied Physics, and Astronomy, Rensselaer Polytechnic Institute, NY, USA Natasha Kirova, University Paris-Sud, France Andrei Sergeev, Department of Electrical Engineering, The University of Buffalo, The State University of New Your, Buffalo, NY, USA Magnus Willander, Linkoping University (LIU), Department of Science and Technology, Linkopings, Sweden Dmitry R Khohlov, Physical Faculty, Lomonosov Moscow State University, Russia Vladimir L Vaks, Institute for Physics of Microstructures of Russian Academy of Sciences, Russia
Toward Paradoxical Inconsistency in Electrostatics of Metallic Conductors
Naturally, when dealing with fundamental problems, the V and V effort should include careful exploration and, if necessary, revision of the fundamentals...Current developments show a clear trend toward more serious efforts in validation and verification (V and V) of physical and engineering models...underlying the physics. With this understanding in mind, we review some fundamentals of the models of crystalline electric conductors and find a
Programs for Fundamentals of Chemistry.
ERIC Educational Resources Information Center
Gallardo, Julio; Delgado, Steven
This document provides computer programs, written in BASIC PLUS, for presenting fundamental or remedial college chemistry students with chemical problems in a computer assisted instructional program. Programs include instructions, a sample run, and 14 separate practice sessions covering: mathematical operations, using decimals, solving…
Open problems in artificial life.
Bedau, M A; McCaskill, J S; Packard, N H; Rasmussen, S; Adami, C; Green, D G; Ikegami, T; Kaneko, K; Ray, T S
2000-01-01
This article lists fourteen open problems in artificial life, each of which is a grand challenge requiring a major advance on a fundamental issue for its solution. Each problem is briefly explained, and, where deemed helpful, some promising paths to its solution are indicated.
NASA Astrophysics Data System (ADS)
Soukiassian, Patrick G.; Ramachandra Rao, M. S.
2010-09-01
'Without carbon, life cannot exist', the saying goes, and not only life. For technological development, carbon was the ultimate material of the 19th century. It allowed the beginnings of the industrial revolution, enabling the rise of the steel and chemical industries, it made the railways run, and it played a major role in the development of naval transportation. Silicon, another very interesting material which makes up a quarter of the earth's crust, became the material of the 20th century in its turn. It gave us the development of high performance electronics and photovoltaics with large fields of applications and played a pivotal role in the evolution of computer technology. The increased device performance of information and data processing systems is changing our lives on a daily basis, producing scientific innovations for a new industrial era. However, success breeds its own problems, and there is ever more data to be handled—which requires a nanoscience approach. This cluster aims to address various aspects, prospects and challenges in this area of great interest for all our futures. Carbon exists in various allotropic forms that are intensively investigated for their unusual and fascinating properties, from both fundamental and applied points of view. Among them, the sp2 (fullerenes, nanotubes and graphene) and sp3 (diamond) bonding configurations are of special interest since they have outstanding and, in some cases, unsurpassed properties compared to other materials. These properties include very high mechanical resistance, very high hardness, high resistance to radiation damage, high thermal conductivity, biocompatibility and superconductivity. Graphene, for example, possesses very uncommon electronic structure and a high carrier mobility, with charge carriers of zero mass moving at constant velocity, just like photons. All these characteristics have put carbon and carbon-related nanomaterials in the spotlight of science and technology research. The main challenges for future understanding include i) material growth, ii) fundamental properties, and iii) developing advanced applications. The reviews in this Cluster Issue of Journal of Physics D: Applied Physics cover carbon nanoparticles and nanotubes, graphene, nano-diamond and films. They address the most current aspects and issues related to their fundamental and outstanding properties, and describe various classes of high-tech applications based on these promising materials. Future prospects, difficulties and challenges are addressed. Important issues include growth, morphology, atomic and electronic structure, transport properties, superconductivity, doping, nanochemistry using hydrogen, chemical and bio-sensors, and bio-imaging, allowing readers to evaluate this very interesting topic and draw perspectives for the future.
Massive Signal Analysis with Hadoop (Invited)
NASA Astrophysics Data System (ADS)
Addair, T.
2013-12-01
The Geophysical Monitoring Program (GMP) at Lawrence Livermore National Laboratory is in the process of transitioning from a primarily human-driven analysis pipeline to a more automated and exploratory system. Waveform correlation represents a significant part of this effort, and the results that come out of this processing could lead to the development of more sophisticated event detection and analysis systems that require less human interaction, and address fundamental shortcomings in existing systems. Furthermore, use of distributed IO systems fundamentally addresses a scalability concern for the GMP as our data holdings continue to grow rapidly. As the data volume increases, it becomes less reasonable to rely upon human analysts to sift through all the information. Not only is more automation essential to keeping up with the ingestion rate, but so too do we require faster and more sophisticated tools for visualizing and interacting with the data. These issues of scalability are not unique to GMP or the seismic domain. All across the lab, and throughout industry, we hear about the promise of 'big data' to address the need of quickly analyzing vast amounts of data in fundamentally new ways. Our waveform correlation system finds and correlates nearby seismic events across the entire Earth. In our original implementation of the system, we processed some 50 TB of data on an in-house traditional HPC cluster (44 cores, 1 filesystem) over the span of 42 days. Having determined the primary bottleneck in the performance to be reading waveforms off a single BlueArc file server, we began investigating distributed IO solutions like Hadoop. As a test case, we took a 1 TB subset of our data and ported it to Livermore Computing's development Hadoop cluster. Through a pilot project sponsored by Livermore Computing (LC), the GMP successfully implemented the waveform correlation system in the Hadoop distributed MapReduce computing framework. Hadoop is an open source implementation of the MapReduce distributed programming framework. We used the Hadoop scripting framework known as Pig for putting together the multi-job MapReduce pipeline used to extract as much parallelism as possible from the algorithms. We also made use the Sqoop data ingestion tool to pull metadata tables from our Oracle database into HDFS (the Hadoop Distributed Filesystem). Running on our in-house HPC cluster, processing this test dataset took 58 hours to complete. In contrast, running our Hadoop implementation on LC's 10 node (160 core) cluster, we were able to cross-correlate the 1 TB of nearby seismic events in just under 3 hours, over a factor of 19 improvement from our existing implementation. This project is one of the first major data mining and analysis tasks performed at the lab or anywhere else correlating the entire Earth's seismicity. Through the success of this project, we believe we've shown that a MapReduce solution can be appropriate for many large-scale Earth science data analysis and exploration problems. Given Hadoop's position as the dominant data analytics solution in industry, we believe Hadoop can be applied to many previously intractable Earth science problems.
The Empowerment of Plasma Modeling by Fundamental Electron Scattering Data
NASA Astrophysics Data System (ADS)
Kushner, Mark J.
2015-09-01
Modeling of low temperature plasmas addresses at least 3 goals - investigation of fundamental processes, analysis and optimization of current technologies, and prediction of performance of as yet unbuilt systems for new applications. The former modeling may be performed on somewhat idealized systems in simple gases, while the latter will likely address geometrically and electromagnetically intricate systems with complex gas mixtures, and now gases in contact with liquids. The variety of fundamental electron and ion scattering data (FSD) required for these activities increases from the former to the latter, while the accuracy required of that data probably decreases. In each case, the fidelity, depth and impact of the modeling depends on the availability of FSD. Modeling is, in fact, empowered by the availability and robustness of FSD. In this talk, examples of the impact of and requirements for FSD in plasma modeling will be discussed from each of these three perspectives using results from multidimensional and global models. The fundamental studies will focus on modeling of inductively coupled plasmas sustained in Ar/Cl2 where the electron scattering from feed gases and their fragments ultimately determine gas temperatures. Examples of the optimization of current technologies will focus on modeling of remote plasma etching of Si and Si3N4 in Ar/NF3/N2/O2 mixtures. Modeling of systems as yet unbuilt will address the interaction of atmospheric pressure plasmas with liquids Work was supported by the US Dept. of Energy (DE-SC0001939), National Science Foundation (CHE-124752), and the Semiconductor Research Corp.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castro, Ricardo
The report describes in details the achievements of the project addressing the performance of nanomaterials in radioactive environments. The project addresses the fundamentals of the role of interface features on the defect dynamics during irradiation and present models to predict behavior based on thermodynamic properties. Papers and products, including formation of students in this strategic area, are presented in details as well.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-28
... Effectiveness of the Second Amendment to the National Market System Plan to Address Extraordinary Market...-631) (Order Approving, on a Pilot Basis, the National Market System Plan To Address Extraordinary... accommodate more fundamental price moves (as opposed to erroneous trades or momentary gaps in liquidity). All...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-28
... Amendment to the National Market System Plan to Address Extraordinary Market Volatility, File No. 4-631... Market System Plan To Address Extraordinary Market Volatility). \\7\\ See Securities Exchange Act Release... requirements of the Plan are coupled with Trading Pauses to accommodate more fundamental price moves (as...
Lessons learned studying design issues for lunar and Mars settlements
NASA Technical Reports Server (NTRS)
Litton, C. E.
1997-01-01
In a study of lunar and Mars settlement concepts, an analysis was made of fundamental design assumptions in five technical areas against a model list of occupational and environmental health concerns. The technical areas included the proposed science projects to be supported, habitat and construction issues, closed ecosystem issues, the "MMM" issues (mining, material processing, and manufacturing), and the human elements of physiology, behavior, and mission approach. Four major lessons were learned. First it is possible to relate public health concerns to complex technological development in a proactive design mode, which has the potential for long-term cost savings. Second, it became very apparent that prior to committing any nation or international group to spending the billions to start and complete a lunar settlement, over the next century, that a significantly different approach must be taken from those previously proposed, to solve the closed ecosystem and "MMM" problems. Third, it also appears that the health concerns and technology issues to be addressed for human exploration into space are fundamentally those to be solved for human habitation of the Earth (as a closed ecosystem) in the 21st century. Finally, it is proposed that ecosystem design modeling must develop new tools, based on probabilistic models as a step up from closed circuit models.
Lessons learned studying design issues for lunar and Mars settlements.
Litton, C E
1997-01-01
In a study of lunar and Mars settlement concepts, an analysis was made of fundamental design assumptions in five technical areas against a model list of occupational and environmental health concerns. The technical areas included the proposed science projects to be supported, habitat and construction issues, closed ecosystem issues, the "MMM" issues (mining, material processing, and manufacturing), and the human elements of physiology, behavior, and mission approach. Four major lessons were learned. First it is possible to relate public health concerns to complex technological development in a proactive design mode, which has the potential for long-term cost savings. Second, it became very apparent that prior to committing any nation or international group to spending the billions to start and complete a lunar settlement, over the next century, that a significantly different approach must be taken from those previously proposed, to solve the closed ecosystem and "MMM" problems. Third, it also appears that the health concerns and technology issues to be addressed for human exploration into space are fundamentally those to be solved for human habitation of the Earth (as a closed ecosystem) in the 21st century. Finally, it is proposed that ecosystem design modeling must develop new tools, based on probabilistic models as a step up from closed circuit models.
Giusti, Chad; Ghrist, Robert; Bassett, Danielle S
2016-08-01
The language of graph theory, or network science, has proven to be an exceptional tool for addressing myriad problems in neuroscience. Yet, the use of networks is predicated on a critical simplifying assumption: that the quintessential unit of interest in a brain is a dyad - two nodes (neurons or brain regions) connected by an edge. While rarely mentioned, this fundamental assumption inherently limits the types of neural structure and function that graphs can be used to model. Here, we describe a generalization of graphs that overcomes these limitations, thereby offering a broad range of new possibilities in terms of modeling and measuring neural phenomena. Specifically, we explore the use of simplicial complexes: a structure developed in the field of mathematics known as algebraic topology, of increasing applicability to real data due to a rapidly growing computational toolset. We review the underlying mathematical formalism as well as the budding literature applying simplicial complexes to neural data, from electrophysiological recordings in animal models to hemodynamic fluctuations in humans. Based on the exceptional flexibility of the tools and recent ground-breaking insights into neural function, we posit that this framework has the potential to eclipse graph theory in unraveling the fundamental mysteries of cognition.
The Marshall Grazing Incidence X-ray Spectrometer (MaGIXS)
NASA Astrophysics Data System (ADS)
Winebarger, A. R.; Savage, S. L.; Kobayashi, K.; Champey, P. R.; McKenzie, D. E.; Golub, L.; Testa, P.; Reeves, K.; Cheimets, P.; Cirtain, J. W.; Walsh, R. W.; Bradshaw, S. J.; Warren, H.; Mason, H. E.; Del Zanna, G.
2017-12-01
For over four decades, X-ray, EUV, and UV spectral observations have been used to measure physical properties of the solar atmosphere. At wavelengths below 10 nm, however, observations of the solar corona with simultaneous spatial and spectral resolution are limited, and not since the late 1970's have spatially resolved solar X-ray spectra been measured. Because the soft X-ray regime is dominated by emission lines formed at high temperatures, X-ray spectroscopic techniques yield insights to fundamental physical processes that are not accessible by any other means. Using a novel implementation of corrective optics, the Marshall Grazing Incidence X-ray Spectrometer (MaGIXS) will measure, for the first time, the solar spectrum from 0.6- 2.4 nm with a 6 arcsec resolution over an 8 arcmin slit. The MaGIXS mission will address on of the fundamental problems of coronal physics: the nature of coronal heating. There are several observables in the MaGIXS wavelength range that will constrain the heating frequency and hence discriminate between competing coronal heating theories. In this presentation, we will present the MaGIXS scientific motivation and provide an update on instrument development. MaGIXS will be launched from White Sands Missile Range in the summer of 2019.
Cervera, Miguel; Tesei, Claudia
2017-01-01
In this paper, an energy-equivalent orthotropic d+/d− damage model for cohesive-frictional materials is formulated. Two essential mechanical features are addressed, the damage-induced anisotropy and the microcrack closure-reopening (MCR) effects, in order to provide an enhancement of the original d+/d− model proposed by Faria et al. 1998, while keeping its high algorithmic efficiency unaltered. First, in order to ensure the symmetry and positive definiteness of the secant operator, the new formulation is developed in an energy-equivalence framework. This proves thermodynamic consistency and allows one to describe a fundamental feature of the orthotropic damage models, i.e., the reduction of the Poisson’s ratio throughout the damage process. Secondly, a “multidirectional” damage procedure is presented to extend the MCR capabilities of the original model. The fundamental aspects of this approach, devised for generic cyclic conditions, lie in maintaining only two scalar damage variables in the constitutive law, while preserving memory of the degradation directionality. The enhanced unilateral capabilities are explored with reference to the problem of a panel subjected to in-plane cyclic shear, with or without vertical pre-compression; depending on the ratio between shear and pre-compression, an absent, a partial or a complete stiffness recovery is simulated with the new multidirectional procedure. PMID:28772793
Davison, Kirsten K; Lawson, Hal A; Coatsworth, J Douglas
2012-07-01
Parents play a fundamental role in shaping children's development, including their dietary and physical activity behaviors. Yet family-centered interventions are rarely used in obesity prevention research. Less than half of childhood obesity prevention programs include parents, and those that do include parents or a family component seldom focus on sustainable change at the level of the family. The general absence of a family-centered approach may be explained by persistent challenges in engaging parents and families and the absence of an intervention framework explicitly designed to foster family-centered programs. The Family-centered Action Model of Intervention Layout and Implementation, or FAMILI, was developed to address these needs. FAMILI draws on theories of family development to frame research and intervention design, uses a mixed-methods approach to conduct ecologically valid research, and positions family members as active participants in the development, implementation, and evaluation of family-centered obesity prevention programs. FAMILI is intended to facilitate the development of culturally responsive and sustainable prevention programs with the potential to improve outcomes. Although childhood obesity was used to illustrate the application of FAMILI, this model can be used to address a range of child health problems.
Focal theoretical problems in modulated and martensitic transformations in alloys and perovskites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krumhansl, J.A.
Fundamental understanding of the microscopic physic of displacive transformations requires insight into the most remarkable and fascinating feature common to so many of the transformations; the formation of local distortive structures, modulations and more general patterns at the mesoscopic scale, far larger than atomic spacings, much smaller than typical specimen size. These have been extensively studied by metallurgists for some time; but also, they are are manifest in ferroelectrics, in such phenomena as the blue phases'' in chloesteric liquid crystals, and in turbulence. This commonality in such a wide range of materials challenges us to achieve a basic understanding ofmore » the physics of why such local, persistent mesostructures appear. In order to address some of the bigger questions -- microscopics of nucleation and growth, mesoscopic and transitional (precursor) structures, and properties of transformed materials -- we began addressing the limitations of traditional methods for describing the thermodynamics and (elastic) distortions of displacive transformations. Conventional phonon descriptions and linear elasticity (and their contribution of the free energy) are obviously limited to very small distortions and are intrinsically incapable of describing the larger, topology changing displacements that are of essence here.« less
Control and accountability in the NHS market: a practical proposition or logical impossibility?
Glynn, J J; Perkins, D
1998-01-01
Before the imposition of the NHS internal market, systems of accountability and control were far from adequate and could be criticized on a number of grounds. The market was offered as a panacea to address these inadequacies. However, in practice there have only been partial improvements which could have been achieved without the imposition of the market. The market also creates new problems and a number of crises and scandals seem to be addressed at the political level by pleas to utilize resources more effectively. These pleas mean that more and more the focus is turning back to central planning in the provision of care and further away from so-called market mechanisms. The NHS "managed" market has been imperfect and will continue to be so. Argues that there is no alternative but to return to the planned provision of health care in order to improve on accountability and control in the NHS. Hopefully the adverse impact of the market on clinicians and others will force a more rational reappraisal of the fundamental raison d'être of the NHS and the need for those involved in the delivery of services, at all levels, to be more openly accountable.
Flight control systems properties and problems, volume 1
NASA Technical Reports Server (NTRS)
Mcruer, D. T.; Johnston, D. E.
1975-01-01
This volume contains a delineation of fundamental and mechanization-specific flight control characteristics and problems gleaned from many sources and spanning a period of over two decades. It is organized to present and discuss first some fundamental, generic problems of closed-loop flight control systems involving numerator characteristics (quadratic dipoles, non-minimum phase roots, and intentionally introduced zeros). Next the principal elements of the largely mechanical primary flight control system are reviewed with particular emphasis on the influence of nonlinearities. The characteristics and problems of augmentation (damping, stability, and feel) system mechanizations are then dealt with. The particular idiosyncracies of automatic control actuation and command augmentation schemes are stressed, because they constitute the major interfaces with the primary flight control system and an often highly variable vehicle response.
Fouling of evaporators in maize processing developing a fundamental understanding
USDA-ARS?s Scientific Manuscript database
Evaporator fouling is a common, chronic problem during maize starch and ethanol production. To compensate for the consequences of fouling, capital costs are increased, operating costs are incurred, productivity is reduced and environmental impact is increased. Despite these issues, fundamental cause...
The Physics Workbook: A Needed Instructional Device.
ERIC Educational Resources Information Center
Brekke, Stewart E.
2003-01-01
Points out the importance of problem solving as a fundamental skill and how students struggle with problem solving in physics courses. Describes a workbook developed as a solution to students' struggles that features simple exercises and advanced problem solving. (Contains 12 references.) (Author/YDS)
Problem Solving in Genetics: Conceptual and Procedural Difficulties
ERIC Educational Resources Information Center
Karagoz, Meryem; Cakir, Mustafa
2011-01-01
The purpose of this study was to explore prospective biology teachers' understandings of fundamental genetics concepts and the association between misconceptions and genetics problem solving abilities. Specifically, the study describes conceptual and procedural difficulties which influence prospective biology teachers' genetics problem solving…
A Course on Surface Phenomena.
ERIC Educational Resources Information Center
Woods, Donald R.
1983-01-01
Describes a graduate or senior elective course combining fundamentals of surface phenomena with practical problem-solving structured around a series of case problems. Discusses topics covered and their development through acquiring new knowledge applied to the case problem, practical calculations of solutions, and applications to additional…
Quantum Mechanics - Fundamentals and Applications to Technology
NASA Astrophysics Data System (ADS)
Singh, Jasprit
1996-10-01
Explore the relationship between quantum mechanics and information-age applications This volume takes an altogether unique approach to quantum mechanics. Providing an in-depth exposition of quantum mechanics fundamentals, it shows how these concepts are applied to most of today's information technologies, whether they are electronic devices or materials. No other text makes this critical, essential leap from theory to real-world applications. The book's lively discussion of the mathematics involved fits right in with contemporary multidisciplinary trends in education: Once the basic formulation has been derived in a given chapter, the connection to important technological problems is summarily described. The many helpful features include * Twenty-eight application-oriented sections that focus on lasers, transistors, magnetic memories, superconductors, nuclear magnetic resonance (NMR), and other important technology-driving materials and devices * One hundred solved examples, with an emphasis on numerical results and the connection between the physics and its applications * End-of-chapter problems that ground the student in both fundamental and applied concepts * Numerous figures and tables to clarify the various topics and provide a global view of the problems under discussion * Over two hundred illustrations to highlight problems and text A book for the information age, Quantum Mechanics: Fundamentals and Applications to Technology promises to become a standard in departments of electrical engineering, applied physics, and materials science, as well as physics. It is an excellent text for senior undergraduate and graduate students, and a helpful reference for practicing scientists, engineers, and chemists in the semiconductor and electronic industries.
Features and Characteristics of Problem Based Learning
ERIC Educational Resources Information Center
Ceker, Eser; Ozdamli, Fezile
2016-01-01
Throughout the years, there appears to be an increase in Problem Based Learning applications in education; and Problem Based Learning related research areas. The main aim of this research is to underline the fundamentals (basic elements) of Problem Based Learning, investigate the dimensions of research approached to PBL oriented areas (with a look…
Acceleration and focusing of plasma flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griswold, Martin Elias
The acceleration of flowing plasmas is a fundamental problem that is useful in a wide variety of technological applications. We consider the problem from the perspective of plasma propulsion. Gridded ion thrusters and Hall thrusters are the most commonly used devices to create flowing plasma for space propulsion, but both suffer from fundamental limitations. Gridded ion sources create good quality beams in terms of energy spread and spatial divergence, but the Child-Langmuir law in the non-neutral acceleration region limits the maximum achievable current density. Hall thrusters avoid this limitation by accelerating ions in quasi-neutral plasma but, as a result, producemore » plumes with high spatial divergence and large energy spread. In addition the more complicated magnetized plasma in the Hall Thruster produces oscillations that can reduce the efficiency of the thruster by increasing electron transport to the anode. We present investigations of three techniques to address the fundamental limitations on the performance of each thruster. First, we propose a method to increase the time-averaged current density (and thus thrust density) produced by a gridded ion source above the Child-Langmuir limit by introducing time-varying boundary conditions. Next, we use an electrostatic plasma lens to focus the Hall thruster plume, and finally we develop a technique to suppress a prominent oscillation that degrades the performance of Hall thrusters. The technique to loosen the constraints on current density from gridded ion thrusters actually applies much more broadly to any space charge limited flow. We investigate the technique with a numerical simulation and by proving a theoretical upper bound. While we ultimately conclude that the approach is not suitable for space propulsion, our results proved useful in another area, providing a benchmark for research into the spontaneously time-dependent current that arises in microdiodes. Next, we experimentally demonstrate a novel approach to reducing plume divergence by using a PL located in the plume of the thruster to focus ions after they were ionized and accelerated. Finally we further improve thruster operation by suppressing a prominent low frequency oscillation in the thruster known as the rotating spoke. The suppression leads to decreased electron transport and more control over the operating conditions in the thruster.« less
ERIC Educational Resources Information Center
Cummings, Lynda; Winston, Michael
1998-01-01
Describes the Solutions model used at Shelley High School in Idaho which gives students the opportunity to gain practical experience while tackling community problems. This approach is built on the three fundamentals of an integrated curriculum, a problem-solving focus, and service-based learning. Sample problems include increasing certain trout…
Bekiroğlu, Sultan; Özdemir, Mehmet; Özyürek, Ercan; Arslan, Avni
2016-10-01
Model forests are nongovernmental organizations at local, regional and international level which are mainly focused on reconciling the conflicts between the stakeholders. This is an innovative approach to organization, which has been receiving more and more attraction from increasing number of countries, which gradually increased the number of model forests for the last 25 years. If these organizations reach desired levels of structure, medium, impacts and assets their contribution in sustainable forest resources management will increase ipso facto. The very first model forest of Turkey was created in Yalova Province in 2010. Yalova Province has certain fundamental problems including but not limited to; population growth and unplanned urbanization caused by industrialization, uncontrolled increase in demand for fire wood and non-wooden products of forestry resources, questionable resource management decisions adopted in the past and low-income levels of the people especially those in the rural areas. The main objective of present case study is to analyze Yalova Model Forest (YMF) so as to determine the possible problems that may occur during implementation of sustainable management for forestry resources through a planning approach with multiple stakeholders. As a result of research, it has been revealed that YMF has certain significant structural, environmental and impact-related problems. In order to ensure continuity of YMF's contribution to sustainable forestry resources management, these problems need to be addressed promptly. Copyright © 2016 Elsevier Ltd. All rights reserved.
Shamsi Meymandi, Manzumeh; Safizadeh, Hossein; Divsalar, Kouros; Rastegariyanzadeh, Ramin; Heravi, Gioia; Mahmoodi, Majid; Kheradmand, Ali
2011-01-01
Background Addiction is one of the complicated problems in Iranian young population. The social and cultural dimensions of this social disease are less considered. So considering socio-cultural and environmental resources, this study investigated the substructures of addiction according to the viewpoints of high-school students of Kerman, Iran in 2007-2008. Methods This qualitative study accomplished in ten high schools through a one-day problem finding workshop and continued until data saturation. The resulted terms and phrases were analyzed by content analysis. To assure about the validity and reliability, the outputs reviewed by workshops participants, and classification and codification of the data were executed separately by two experts. Findings A total of 212 students, 45.3% girls and 54.7% boys, participated in the study. The students introduced the followings as the addiction substantial fundaments: lack of knowledge, positive attitude and interpretation of addiction as a value, family or friends' habit, economy status, psycho-personality problems and availability. Rules infirmity or non-implementation of the current rules enforcement, geographical status and addiction as a conspiracy were also observed in students’ statements. Conclusion The positive attitudes and historical roots of addiction along with the process of changing the values caused the growth of drug addiction in young population which could neutralize the security measures, legislations policy and even the knowledge. Therefore, intensification of personal protective factors and culturalization addressed for improving inner layers of values are recommended. PMID:24494110
Yang, Rui; Wang, Wen-Xu; Lai, Ying-Cheng; Grebogi, Celso
2010-06-01
Evolutionary-game based models of nonhierarchical, cyclically competing populations have become paradigmatic for addressing the fundamental problem of species coexistence in spatially extended ecosystems. We study the role of intraspecific competition in the coexistence and find that the competition can strongly promote the coexistence for high individual mobility in the sense that stable coexistence can arise in parameter regime where extinction would occur without the competition. The critical value of the competition rate beyond which the coexistence is induced is found to be independent of the mobility. We derive a theoretical model based on nonlinear partial differential equations to predict the critical competition rate and the boundaries between the coexistence and extinction regions in a relevant parameter space. We also investigate pattern formation and well-mixed spatiotemporal population dynamics to gain further insights into our findings. (c) 2010 American Institute of Physics.
Hathaway, Cynthia
2012-01-01
Historically, women have been systematically excluded from or underrepresented in human clinical trials of new drugs. Due to fundamental physiological differences between women and men with regard to how drugs work in the human body, testing of drugs in men alone can both deny women the full benefit of a drug and cause them to suffer from increased adverse side effects. Attempts to reform drug development law and agency practices to resolve this problem have met with only partial success. Proposed herein is a patent term extension and for studies in women, modeled upon the pediatric patent term extension, but with several key differences intended to reduce the cost to the public and fund auxiliary programs to address off-patent medicines as well. Such an extension would incentivize this research and provide meaningful guidance to women and their physicians.
Hypothermic temperature effects on organ survival and restoration
Ishikawa, Jun; Oshima, Masamitsu; Iwasaki, Fumitaka; Suzuki, Ryoji; Park, Joonhong; Nakao, Kazuhisa; Matsuzawa-Adachi, Yuki; Mizutsuki, Taro; Kobayashi, Ayaka; Abe, Yuta; Kobayashi, Eiji; Tezuka, Katsunari; Tsuji, Takashi
2015-01-01
A three-dimensional multicellular organism maintains the biological functions of life support by using the blood circulation to transport oxygen and nutrients and to regulate body temperature for intracellular enzymatic reactions. Donor organ transplantation using low-temperature storage is used as the fundamental treatment for dysfunctional organs. However, this approach has a serious problem in that donor organs maintain healthy conditions only during short-term storage. In this study, we developed a novel liver perfusion culture system based on biological metabolism that can maintain physiological functions, including albumin synthesis, bile secretion and urea production. This system also allows for the resurrection of a severely ischaemic liver. This study represents a significant advance for the development of an ex vivo organ perfusion system based on biological metabolism. It can be used not only to address donor organ shortages but also as the basis of future regenerative organ replacement therapy. PMID:25900715
Kim, E.; Safavi-Naini, A.; Hite, D. A.; ...
2017-03-01
The decoherence of trapped-ion quantum bits due to heating of their motional modes is a fundamental science and engineering problem. This heating is attributed to electric-field noise arising from processes on the trap-electrode surfaces. In this work, we address the source of this noise by focusing on the diffusion of carbon-containing adsorbates on the surface of Au(110). We show by detailed scanned probe microscopy and density functional theory how the carbon adatom diffusion on the gold surface changes the energy landscape, and how the adatom dipole moment varies with the diffusive motion. Lastly, a simple model for the diffusion noise,more » which varies quadratically with the variation of the dipole moment, qualitatively reproduces the measured noise spectrum, and the estimate of the noise spectral density is in accord with measured values.« less
Landmark matching based retinal image alignment by enforcing sparsity in correspondence matrix.
Zheng, Yuanjie; Daniel, Ebenezer; Hunter, Allan A; Xiao, Rui; Gao, Jianbin; Li, Hongsheng; Maguire, Maureen G; Brainard, David H; Gee, James C
2014-08-01
Retinal image alignment is fundamental to many applications in diagnosis of eye diseases. In this paper, we address the problem of landmark matching based retinal image alignment. We propose a novel landmark matching formulation by enforcing sparsity in the correspondence matrix and offer its solutions based on linear programming. The proposed formulation not only enables a joint estimation of the landmark correspondences and a predefined transformation model but also combines the benefits of the softassign strategy (Chui and Rangarajan, 2003) and the combinatorial optimization of linear programming. We also introduced a set of reinforced self-similarities descriptors which can better characterize local photometric and geometric properties of the retinal image. Theoretical analysis and experimental results with both fundus color images and angiogram images show the superior performances of our algorithms to several state-of-the-art techniques. Copyright © 2013 Elsevier B.V. All rights reserved.
Flick, Johannes; Ruggenthaler, Michael; Appel, Heiko
2017-01-01
In this work, we provide an overview of how well-established concepts in the fields of quantum chemistry and material sciences have to be adapted when the quantum nature of light becomes important in correlated matter–photon problems. We analyze model systems in optical cavities, where the matter–photon interaction is considered from the weak- to the strong-coupling limit and for individual photon modes as well as for the multimode case. We identify fundamental changes in Born–Oppenheimer surfaces, spectroscopic quantities, conical intersections, and efficiency for quantum control. We conclude by applying our recently developed quantum-electrodynamical density-functional theory to spontaneous emission and show how a straightforward approximation accurately describes the correlated electron–photon dynamics. This work paves the way to describe matter–photon interactions from first principles and addresses the emergence of new states of matter in chemistry and material science. PMID:28275094
Development and experimental characterization of a new non contact sensor for blade tip timing
NASA Astrophysics Data System (ADS)
Brouckaert, Jean-Francois; Marsili, Roberto; Rossi, Gianluca; Tomassini, Roberto
2012-06-01
Performances of blade tip timing measurement systems (BTT), recently used for non contact turbine blade vibration measurements, in terms of uncertainty and resolution are strongly affected by sensor characteristics. The sensors used for BTT generate pulses, to be used also for precise measurements of turbine blades time of arrival. All the literature on this measurement techniques do not address this problem in a clear way, defining the relevant dynamic and static sensor characteristics, fundamental for this application. Till now proximity sensors used are based on optical, capacitive, eddy current and microwave measuring principle. Also pressure sensors has been used. In this paper a new sensing principle is proposed. A proximity sensor based on magnetoresistive sensing element has been assembled end tested. A simple and portable test bench with variable speed, blade tip width, variable clearance was built and used in order to characterize the main sensor performances.
Nuclear Test-Experimental Science: Annual report, fiscal year 1988
DOE Office of Scientific and Technical Information (OSTI.GOV)
Struble, G.L.; Donohue, M.L.; Bucciarelli, G.
1988-01-01
Fiscal year 1988 has been a significant, rewarding, and exciting period for Lawrence Livermore National Laboratory's nuclear testing program. It was significant in that the Laboratory's new director chose to focus strongly on the program's activities and to commit to a revitalized emphasis on testing and the experimental science that underlies it. It was rewarding in that revolutionary new measurement techniques were fielded on recent important and highly complicated underground nuclear tests with truly incredible results. And it was exciting in that the sophisticated and fundamental problems of weapons science that are now being addressed experimentally are yielding new challengesmore » and understanding in ways that stimulate and reward the brightest and best of scientists. During FY88 the program was reorganized to emphasize our commitment to experimental science. The name of the program was changed to reflect this commitment, becoming the Nuclear Test-Experimental Science (NTES) Program.« less
MEMS-Based Communications Systems for Space-Based Applications
NASA Technical Reports Server (NTRS)
DeLosSantos, Hector J.; Brunner, Robert A.; Lam, Juan F.; Hackett, Le Roy H.; Lohr, Ross F., Jr.; Larson, Lawrence E.; Loo, Robert Y.; Matloubian, Mehran; Tangonan, Gregory L.
1995-01-01
As user demand for higher capacity and flexibility in communications satellites increases, new ways to cope with the inherent limitations posed by the prohibitive mass and power consumption, needed to satisfy those requirements, are under investigation. Recent studies suggest that while new satellite architectures are necessary to enable multi-user, multi-data rate, multi-location satellite links, these new architectures will inevitably increase power consumption, and in turn, spacecraft mass, to such an extent that their successful implementation will demand novel lightweight/low power hardware approaches. In this paper, following a brief introduction to the fundamentals of communications satellites, we address the impact of micro-electro-mechanical systems (MEMS) technology, in particular micro-electro-mechanical (MEM) switches to mitigate the above mentioned problems and show that low-loss/wide bandwidth MEM switches will go a long way towards enabling higher capacity and flexibility space-based communications systems.
Seismology and space-based geodesy
NASA Technical Reports Server (NTRS)
Tralli, David M.; Tajima, Fumiko
1993-01-01
The potential of space-based geodetic measurement of crustal deformation in the context of seismology is explored. The achievements of seismological source theory and data analyses, mechanical modeling of fault zone behavior, and advances in space-based geodesy are reviewed, with emphasis on realizable contributions of space-based geodetic measurements specifically to seismology. The fundamental relationships between crustal deformation associated with an earthquake and the geodetically observable data are summarized. The response and spatial and temporal resolution of the geodetic data necessary to understand deformation at various phases of the earthquake cycle is stressed. The use of VLBI, SLR, and GPS measurements for studying global geodynamics properties that can be investigated to some extent with seismic data is discussed. The potential contributions of continuously operating strain monitoring networks and globally distributed geodetic observatories to existing worldwide modern digital seismographic networks are evaluated in reference to mutually addressable problems in seismology, geophysics, and tectonics.
The role of counterfactual theory in causal reasoning.
Maldonado, George
2016-10-01
In this commentary I review the fundamentals of counterfactual theory and its role in causal reasoning in epidemiology. I consider if counterfactual theory dictates that causal questions must be framed in terms of well-defined interventions. I conclude that it does not. I hypothesize that the interventionist approach to causal inference in epidemiology stems from elevating the randomized trial design to the gold standard for thinking about causal inference. I suggest that instead the gold standard we should use for thinking about causal inference in epidemiology is the thought experiment that, for example, compares an actual disease frequency under one exposure level with a counterfactual disease frequency under a different exposure level (as discussed in Greenland and Robins (1986) and Maldonado and Greenland (2002)). I also remind us that no method should be termed "causal" unless it addresses the effect of other biases in addition to the problem of confounding. Copyright © 2016 Elsevier Inc. All rights reserved.
An ethical framework for the management of pain in the emergency department.
Venkat, Arvind; Fromm, Christian; Isaacs, Eric; Ibarra, Jordan
2013-07-01
Pain is a ubiquitous problem, affecting more than 100 million individuals in the United States chronically and many more in the acute setting. Up to three-quarters of patients presenting to the emergency department (ED) report pain as a key component of their reasons for requiring acute care. While pain management is a fundamental component of emergency medicine (EM), there are numerous attitudinal and structural barriers that have been identified to effectively providing pain control in the ED. Coupled with public demands and administrative mandates, concerns surrounding ED pain management have reached a crisis level that should be considered an ethical issue in the profession of EM. In this article, the authors propose an ethical framework based on a combination of virtue, narrative, and relationship theories that can be used to address the clinical dilemmas that arise in managing pain in ED patients. © 2013 by the Society for Academic Emergency Medicine.
Denker, Elsa; Jiang, Di
2012-05-01
Biological tubes are a prevalent structural design across living organisms. They provide essential functions during the development and adult life of an organism. Increasing progress has been made recently in delineating the cellular and molecular mechanisms underlying tubulogenesis. This review aims to introduce ascidian notochord morphogenesis as an interesting model system to study the cell biology of tube formation, to a wider cell and developmental biology community. We present fundamental morphological and cellular events involved in notochord morphogenesis, compare and contrast them with other more established tubulogenesis model systems, and point out some unique features, including bipolarity of the notochord cells, and using cell shape changes and cell rearrangement to connect lumens. We highlight some initial findings in the molecular mechanisms of notochord morphogenesis. Based on these findings, we present intriguing problems and put forth hypotheses that can be addressed in future studies. Copyright © 2012 Elsevier Ltd. All rights reserved.
Commentary: muddy diagnostic waters in the SVP courtroom.
Prentky, Robert A; Coward, Anna I; Gabriel, Adeena M
2008-01-01
In this brief commentary, we address several of the points raised by Drs. First and Halon on the abuses of DSM diagnoses (APA, 2000) in civil commitment hearings of sex offenders. We discuss each of the elements in the three-step process proposed by First and Halon for reforming the diagnosis of paraphilias in SVP proceedings, paying particular attention to the role of volitional impairment. Both in spirit and in substance, we fundamentally agree with First and Halon, concluding that the misuse of science, inclusive of the misuse of the DSM, in the SVP courtroom is a variation of pretextuality. We commend First and Halon for drawing attention to a serious problem, one that undermines the integrity of the legal system in general and the SVP adjudicatory process in particular. We conclude with a warning that without firmer control from the courts, expert opinions will remain opaque and of questionable probative value.
Modeling perceptual grouping and figure-ground segregation by means of active reentrant connections.
Sporns, O; Tononi, G; Edelman, G M
1991-01-01
The segmentation of visual scenes is a fundamental process of early vision, but the underlying neural mechanisms are still largely unknown. Theoretical considerations as well as neurophysiological findings point to the importance in such processes of temporal correlations in neuronal activity. In a previous model, we showed that reentrant signaling among rhythmically active neuronal groups can correlate responses along spatially extended contours. We now have modified and extended this model to address the problems of perceptual grouping and figure-ground segregation in vision. A novel feature is that the efficacy of the connections is allowed to change on a fast time scale. This results in active reentrant connections that amplify the correlations among neuronal groups. The responses of the model are able to link the elements corresponding to a coherent figure and to segregate them from the background or from another figure in a way that is consistent with the so-called Gestalt laws.
Modeling Perceptual Grouping and Figure-Ground Segregation by Means of Active Reentrant Connections
NASA Astrophysics Data System (ADS)
Sporns, Olaf; Tononi, Giulio; Edelman, Gerald M.
1991-01-01
The segmentation of visual scenes is a fundamental process of early vision, but the underlying neural mechanisms are still largely unknown. Theoretical considerations as well as neurophysiological findings point to the importance in such processes of temporal correlations in neuronal activity. In a previous model, we showed that reentrant signaling among rhythmically active neuronal groups can correlate responses along spatially extended contours. We now have modified and extended this model to address the problems of perceptual grouping and figure-ground segregation in vision. A novel feature is that the efficacy of the connections is allowed to change on a fast time scale. This results in active reentrant connections that amplify the correlations among neuronal groups. The responses of the model are able to link the elements corresponding to a coherent figure and to segregate them from the background or from another figure in a way that is consistent with the so-called Gestalt laws.
Foods, obesity, and diabetes-are all calories created equal?
Mozaffarian, Dariush
2017-01-01
Diet has become one of the top risk factors for poor health. The incidence of cardiometabolic disease in the United Sates, in Mexico, and in most countries is driven fundamentally by changes in diet quality. Weight gain has been typically framed as a problem of excess caloric intake, but, as reviewed in this paper, subtle changes in the quality of diet are associated with long-term weight gain. In order to successfully address obesity and diabetes, researchers and policy makers have to better understand how weight gain in the long term is modulated and to change the focus of research and public policy from one based on counting calories to one based on diet quality and its determinants at various levels. © The Author(s) 2016. Published by Oxford University Press on behalf of the International Life Sciences Institute. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
A Study of Gaugeon Formalism for QED in Lorentz Violating Background
NASA Astrophysics Data System (ADS)
Shah, Mushtaq B.; Ganai, Prince A.
2018-02-01
At the energy regimes close to Planck scales, the usual structure of Lorentz symmetry fails to address certain fundamental issues and eventually breaks down, thus paving the way for an alternative road map. It is thus argued that some subgroup of proper Lorentz group could stand consistent and might possibly help us to circumvent this problem. It is this subgroup that goes by the name of Very Special Relativity (VSR). Apart from violating rotational symmetry, VSR is believed to preserve the very tenets of special relativity. The gaugeon formalism due to type-I Yokoyama and type-II Izawa are found to be invariant under BRST symmetry. In this paper, we analyze the scope of this invariance in the scheme of VSR. Furthermore, we will obtain VSR modified Lagrangian density using path integral derivation. We will explore the consistency of VSR with regard to these theories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, E.; Safavi-Naini, A.; Hite, D. A.
The decoherence of trapped-ion quantum bits due to heating of their motional modes is a fundamental science and engineering problem. This heating is attributed to electric-field noise arising from processes on the trap-electrode surfaces. In this work, we address the source of this noise by focusing on the diffusion of carbon-containing adsorbates on the surface of Au(110). We show by detailed scanned probe microscopy and density functional theory how the carbon adatom diffusion on the gold surface changes the energy landscape, and how the adatom dipole moment varies with the diffusive motion. Lastly, a simple model for the diffusion noise,more » which varies quadratically with the variation of the dipole moment, qualitatively reproduces the measured noise spectrum, and the estimate of the noise spectral density is in accord with measured values.« less
Active noise control: A tutorial for HVAC designers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gelin, L.J.
1997-08-01
This article will identify the capabilities and limitations of ANC in its application to HVAC noise control. ANC can be used in ducted HVAC systems to cancel ductborne, low-frequency fan noise by injecting sound waves of equal amplitude and opposite phase into an air duct, as close as possible to the source of the unwanted noise. Destructive interference of the fan noise and injected noise results in sound cancellation. The noise problems that it solves are typically described as rumble, roar or throb, all of which are difficult to address using traditional noise control methods. This article will also contrastmore » the use of active against passive noise control techniques. The main differences between the two noise control measures are acoustic performance, energy consumption, and design flexibility. The article will first present the fundamentals and basic physics of ANC. The application to real HVAC systems will follow.« less
Spatial Indexing for Data Searching in Mobile Sensing Environments.
Zhou, Yuchao; De, Suparna; Wang, Wei; Moessner, Klaus; Palaniswami, Marimuthu S
2017-06-18
Data searching and retrieval is one of the fundamental functionalities in many Web of Things applications, which need to collect, process and analyze huge amounts of sensor stream data. The problem in fact has been well studied for data generated by sensors that are installed at fixed locations; however, challenges emerge along with the popularity of opportunistic sensing applications in which mobile sensors keep reporting observation and measurement data at variable intervals and changing geographical locations. To address these challenges, we develop the Geohash-Grid Tree, a spatial indexing technique specially designed for searching data integrated from heterogeneous sources in a mobile sensing environment. Results of the experiments on a real-world dataset collected from the SmartSantander smart city testbed show that the index structure allows efficient search based on spatial distance, range and time windows in a large time series database.
Chen, Xiao-Wei; Chen, Ya-Jun; Wang, Jin-Mei; Guo, Jian; Yin, Shou-Wei; Yang, Xiao-Quan
2017-04-15
A current challenge in the area of food emulsion is the design of microstructure that provides controlled release of volatile compounds during storage and consumption. Here, a new strategy addressed this problem at the fundamental level by describing the design of organogel-based emulsion from the self-assembly of β-sitosterol and γ-oryzanol that are capable of tuning volatile release. The results showed that the release rate (v 0 ), maximum headspace concentrations (C max ) and partition coefficients (k a / e ) above structured emulsions were significantly lower than unstructured emulsions and controlled release doing undergo tunable though the self-assembled interface and core fine microstructure from internal phase under dynamic and static condition. This result provides an understanding of how emulsions can behave as delivery system to better design novel food products with enhanced sensorial and nutritional attributes. Copyright © 2016 Elsevier Ltd. All rights reserved.
Spatial Indexing for Data Searching in Mobile Sensing Environments
Zhou, Yuchao; De, Suparna; Wang, Wei; Moessner, Klaus; Palaniswami, Marimuthu S.
2017-01-01
Data searching and retrieval is one of the fundamental functionalities in many Web of Things applications, which need to collect, process and analyze huge amounts of sensor stream data. The problem in fact has been well studied for data generated by sensors that are installed at fixed locations; however, challenges emerge along with the popularity of opportunistic sensing applications in which mobile sensors keep reporting observation and measurement data at variable intervals and changing geographical locations. To address these challenges, we develop the Geohash-Grid Tree, a spatial indexing technique specially designed for searching data integrated from heterogeneous sources in a mobile sensing environment. Results of the experiments on a real-world dataset collected from the SmartSantander smart city testbed show that the index structure allows efficient search based on spatial distance, range and time windows in a large time series database. PMID:28629156
Metaorganisms as the new frontier
Bosch, Thomas C.G.; McFall-Ngai, Margaret J.
2014-01-01
Summary Because it appears that almost all organisms are part of an interdependent meta-organism, an understanding of the underlying host-microbe species associations, and of its evolution and molecular underpinnings, has become the new frontier in zoology. The availability of novel highthroughput sequencing methods together with the conceptual understanding that advances mostly originate at the intersection of traditional disciplinary boundaries, enable biologists to dissect the mechanisms that control the interdependent associations of species. In this perspective article, we outline some of the issues in interspecies interactions, present two case studies illuminating the necessity of interfacial research when addressing complex and fundamental zoological problems, and show that an interdisciplinary approach that seeks to understand co-evolved multi-species relationships will connect genomes, phenotypes, ecosystems and the evolutionary forces that have shaped them. We hope that this article inspires other collaborations of a similar nature on the diverse landscape commonly referred to as “zoology”. PMID:21737250
Metaorganisms as the new frontier.
Bosch, Thomas C G; McFall-Ngai, Margaret J
2011-09-01
Because it appears that almost all organisms are part of an interdependent metaorganism, an understanding of the underlying host-microbe species associations, and of evolution and molecular underpinnings, has become the new frontier in zoology. The availability of novel high-throughput sequencing methods, together with the conceptual understanding that advances mostly originate at the intersection of traditional disciplinary boundaries, enable biologists to dissect the mechanisms that control the interdependent associations of species. In this review article, we outline some of the issues in inter-species interactions, present two case studies illuminating the necessity of interfacial research when addressing complex and fundamental zoological problems, and show that an interdisciplinary approach that seeks to understand co-evolved multi-species relationships will connect genomes, phenotypes, ecosystems and the evolutionary forces that have shaped them. We hope that this article inspires other collaborations of a similar nature on the diverse landscape commonly referred to as "zoology". Copyright © 2011 Elsevier GmbH. All rights reserved.
A new biology for a new century.
Woese, Carl R
2004-06-01
Biology today is at a crossroads. The molecular paradigm, which so successfully guided the discipline throughout most of the 20th century, is no longer a reliable guide. Its vision of biology now realized, the molecular paradigm has run its course. Biology, therefore, has a choice to make, between the comfortable path of continuing to follow molecular biology's lead or the more invigorating one of seeking a new and inspiring vision of the living world, one that addresses the major problems in biology that 20th century biology, molecular biology, could not handle and, so, avoided. The former course, though highly productive, is certain to turn biology into an engineering discipline. The latter holds the promise of making biology an even more fundamental science, one that, along with physics, probes and defines the nature of reality. This is a choice between a biology that solely does society's bidding and a biology that is society's teacher.
Mechanisms of Evolution in High-Consequence Drug Resistance Plasmids
He, Susu; Chandler, Michael; Varani, Alessandro M.; Hickman, Alison B.; Dekker, John P.
2016-01-01
ABSTRACT The dissemination of resistance among bacteria has been facilitated by the fact that resistance genes are usually located on a diverse and evolving set of transmissible plasmids. However, the mechanisms generating diversity and enabling adaptation within highly successful resistance plasmids have remained obscure, despite their profound clinical significance. To understand these mechanisms, we have performed a detailed analysis of the mobilome (the entire mobile genetic element content) of a set of previously sequenced carbapenemase-producing Enterobacteriaceae (CPE) from the National Institutes of Health Clinical Center. This analysis revealed that plasmid reorganizations occurring in the natural context of colonization of human hosts were overwhelmingly driven by genetic rearrangements carried out by replicative transposons working in concert with the process of homologous recombination. A more complete understanding of the molecular mechanisms and evolutionary forces driving rearrangements in resistance plasmids may lead to fundamentally new strategies to address the problem of antibiotic resistance. PMID:27923922
NASA Astrophysics Data System (ADS)
Buckingham, A. C.; Hawke, R. S.
1982-09-01
Experimental and theoretical research was conducted jointly at the Livermore and Los Alamos National laboratories on dc electromagnetic railgun Lorentz accelerators. Pellets weighing a few grams to tens of grams were launched at velocities up to better than 11 km/s. The research is addressed to attaining repeated launches of samples at hypervelocity in target impact experiments. In these experiments, shock-induced pressure in the tens of megabars range are obtained for high pressure equations of state research. Primary energy sources of the order of several hundred kJ to a MJ and induction currents of the order of 1 or more MA are necessary for these launches. Erosion and deformation of the conductor rails and the accelerated sample material are continuing problems. The beating, stress, and erosion resulting from simultaneous imposition of rail induction current, dense plasma (armature) interaction, current distribution, magnetic field stresses and projectile/rail contact friction are examined.
The Role of Health Education in Addressing Uncertainty about Health and Cell Phone Use--A Commentary
ERIC Educational Resources Information Center
Ratnapradipa, Dhitinut; Dundulis, William P., Jr.; Ritzel, Dale O.; Haseeb, Abdul
2012-01-01
Although the fundamental principles of health education remain unchanged, the practice of health education continues to evolve in response to the rapidly changing lifestyles and technological advances. Emerging health risks are often associated with these lifestyle changes. The purpose of this article is to address the role of health educators…
ERIC Educational Resources Information Center
Dotts, Brian
2013-01-01
This article addresses the unique role performed by social foundations programs in colleges of education and in addressing broader issues facing education today, which fundamentally include the development of interpretive, normative, and critical perspectives in academia. All three perspectives serve to create a scholarly framework within which…
Strategies for Detecting and Correcting Errors in Accounting Problems.
ERIC Educational Resources Information Center
James, Marianne L.
2003-01-01
Reviews common errors in accounting tests that students commit resulting from deficiencies in fundamental prior knowledge, ineffective test taking, and inattention to detail and provides solutions to the problems. (JOW)
The Challenge of Educational Technology in Underdeveloped Countries
ERIC Educational Resources Information Center
Davis, Griffith J.
1970-01-01
A USAID officer outlines the fundamental educational problems faced by the African nations and describes the educational broadcasting program the Congo has developed to deal with some of these problems. (LS)
Symmetries of relativistic world lines
NASA Astrophysics Data System (ADS)
Koch, Benjamin; Muñoz, Enrique; Reyes, Ignacio A.
2017-10-01
Symmetries are essential for a consistent formulation of many quantum systems. In this paper we discuss a fundamental symmetry, which is present for any Lagrangian term that involves x˙2. As a basic model that incorporates the fundamental symmetries of quantum gravity and string theory, we consider the Lagrangian action of the relativistic point particle. A path integral quantization for this seemingly simple system has long presented notorious problems. Here we show that those problems are overcome by taking into account the additional symmetry, leading directly to the exact Klein-Gordon propagator.
The One-on-One Stochastic Duel. Parts I and II
1979-04-15
called the fundamental marksman problem (FM). From this we may solve the duel problem by considering each marksman to be firing independ- ently of the...other. The first to hit his passive target wins. This is entirely equivalent to the fundamental duel, a the model described above in no way links the...fire another round at t it muot fail, in order fo’" the system to be in the state of having just fired th a round which vas the n one, and no hits
Observing copepods through a genomic lens
2011-01-01
Background Copepods outnumber every other multicellular animal group. They are critical components of the world's freshwater and marine ecosystems, sensitive indicators of local and global climate change, key ecosystem service providers, parasites and predators of economically important aquatic animals and potential vectors of waterborne disease. Copepods sustain the world fisheries that nourish and support human populations. Although genomic tools have transformed many areas of biological and biomedical research, their power to elucidate aspects of the biology, behavior and ecology of copepods has only recently begun to be exploited. Discussion The extraordinary biological and ecological diversity of the subclass Copepoda provides both unique advantages for addressing key problems in aquatic systems and formidable challenges for developing a focused genomics strategy. This article provides an overview of genomic studies of copepods and discusses strategies for using genomics tools to address key questions at levels extending from individuals to ecosystems. Genomics can, for instance, help to decipher patterns of genome evolution such as those that occur during transitions from free living to symbiotic and parasitic lifestyles and can assist in the identification of genetic mechanisms and accompanying physiological changes associated with adaptation to new or physiologically challenging environments. The adaptive significance of the diversity in genome size and unique mechanisms of genome reorganization during development could similarly be explored. Genome-wide and EST studies of parasitic copepods of salmon and large EST studies of selected free-living copepods have demonstrated the potential utility of modern genomics approaches for the study of copepods and have generated resources such as EST libraries, shotgun genome sequences, BAC libraries, genome maps and inbred lines that will be invaluable in assisting further efforts to provide genomics tools for copepods. Summary Genomics research on copepods is needed to extend our exploration and characterization of their fundamental biological traits, so that we can better understand how copepods function and interact in diverse environments. Availability of large scale genomics resources will also open doors to a wide range of systems biology type studies that view the organism as the fundamental system in which to address key questions in ecology and evolution. PMID:21933388
Observation of quantum criticality with ultracold atoms in optical lattices
NASA Astrophysics Data System (ADS)
Zhang, Xibo
As biological problems are becoming more complex and data growing at a rate much faster than that of computer hardware, new and faster algorithms are required. This dissertation investigates computational problems arising in two of the fields: comparative genomics and epigenomics, and employs a variety of computational techniques to address the problems. One fundamental question in the studies of chromosome evolution is whether the rearrangement breakpoints are happening at random positions or along certain hotspots. We investigate the breakpoint reuse phenomenon, and show the analyses that support the more recently proposed fragile breakage model as opposed to the conventional random breakage models for chromosome evolution. The identification of syntenic regions between chromosomes forms the basis for studies of genome architectures, comparative genomics, and evolutionary genomics. The previous synteny block reconstruction algorithms could not be scaled to a large number of mammalian genomes being sequenced; neither did they address the issue of generating non-overlapping synteny blocks suitable for analyzing rearrangements and evolutionary history of large-scale duplications prevalent in plant genomes. We present a new unified synteny block generation algorithm based on A-Bruijn graph framework that overcomes these shortcomings. In the epigenome sequencing, a sample may contain a mixture of epigenomes and there is a need to resolve the distinct methylation patterns from the mixture. Many sequencing applications, such as haplotype inference for diploid or polyploid genomes, and metagenomic sequencing, share the similar objective: to infer a set of distinct assemblies from reads that are sequenced from a heterogeneous sample and subsequently aligned to a reference genome. We model the problem from both a combinatorial and a statistical angles. First, we describe a theoretical framework. A linear-time algorithm is then given to resolve a minimum number of assemblies that are consistent with all reads, substantially improving on previous algorithms. An efficient algorithm is also described to determine a set of assemblies that is consistent with a maximum subset of the reads, a previously untreated problem. We then prove that allowing nested reads or permitting mismatches between reads and their assemblies renders these problems NP-hard. Second, we describe a mixture model-based approach, and applied the model for the detection of allele-specific methylations.
Monitoring Distributed Systems: A Relational Approach.
1982-12-01
relationship, and time. The first two have been are modeled directly in the relational model. The third is perhaps the most fundamental , for without the system ...of another, newly created file. The approach adopted here applies to object-based operatin systems , and will support capability addressing at the...in certainties. -- Francis Bacon, in The Advancement of Learning The thesis of this research is that monitoring distributed systems is fundamentally a
DOT National Transportation Integrated Search
2002-10-01
Rutting has long been a problem in hot mix asphalt (HMA) pavement. Through the years, researchers have used different kinds of fundamental and simulative test methods to estimate the rutting performance of HMA. It has been recognized that most fundam...
Environmental Law: Fundamentals for Schools.
ERIC Educational Resources Information Center
Day, David R.
This booklet outlines the environmental problems most likely to arise in schools. An overview provides a fundamental analysis of environmental issues rather than comprehensive analysis and advice. The text examines the concerns that surround superfund cleanups, focusing on the legal framework, and furnishes some practical pointers, such as what to…
Fundamental Problems of Lunar Research, Technical Solutions, and Priority Lunar Regions for Research
NASA Astrophysics Data System (ADS)
Ivanov, M. A.; Basilevsky, A. T.; Bricheva, S. S.; Guseva, E. N.; Demidov, N. E.; Zakharova, M.; Krasil'nikov, S. S.
2017-11-01
In this article, we discuss four fundamental scientific problems of lunar research: (1) lunar chronology, (2) the internal structure of the Moon, (3) the lunar polar regions, and (4) lunar volcanism. After formulating the scientific problems and their components, we proceed to outlining a list of technical solutions and priority lunar regions for research. Solving the listed problems requires investigations on the lunar surface using lunar rovers, which can deliver a set of analytical equipment to places where geological conditions are known from a detailed analysis of orbital information. The most critical research methods, which can answer some of the key questions, are analysis of local geological conditions from panoramic photographs, determination of the chemical, isotopic, and mineral composition of the soil, and deep seismic sounding. A preliminary list is given of lunar regions with high scientific priority.
Metacognition: Student Reflections on Problem Solving
ERIC Educational Resources Information Center
Wismath, Shelly; Orr, Doug; Good, Brandon
2014-01-01
Twenty-first century teaching and learning focus on the fundamental skills of critical thinking and problem solving, creativity and innovation, and collaboration and communication. Metacognition is a crucial aspect of both problem solving and critical thinking, but it is often difficult to get students to engage in authentic metacognitive…
NASA Astrophysics Data System (ADS)
Perconti, Philip; Bedair, Sarah S.; Bajaj, Jagmohan; Schuster, Jonathan; Reed, Meredith
2016-09-01
To increase Soldier readiness and enhance situational understanding in ever-changing and complex environments, there is a need for rapid development and deployment of Army technologies utilizing sensors, photonics, and electronics. Fundamental aspects of these technologies include the research and development of semiconductor materials and devices which are ubiquitous in numerous applications. Since many Army technologies are considered niche, there is a lack of significant industry investment in the fundamental research and understanding of semiconductor technologies relevant to the Army. To address this issue, the US Army Research Laboratory is establishing a Center for Semiconductor Materials and Device Modeling and seeks to leverage expertise and resources across academia, government and industry. Several key research areas—highlighted and addressed in this paper—have been identified by ARL and external partners and will be pursued in a collaborative fashion by this Center. This paper will also address the mechanisms by which the Center is being established and will operate.
Algorithms for Data Sharing, Coordination, and Communication in Dynamic Network Settings
2007-12-03
problems in dynamic networks, focusing on mobile networks with wireless communication. Problems studied include data management, time synchronization ...The discovery of a fundamental limitation in capabilities for time synchronization in large networks. (2) The identification and development of the...Problems studied include data management, time synchronization , communication problems (broadcast, geocast, and point-to-point routing), distributed
Engineering Complex Embedded Systems with State Analysis and the Mission Data System
NASA Technical Reports Server (NTRS)
Ingham, Michel D.; Rasmussen, Robert D.; Bennett, Matthew B.; Moncada, Alex C.
2004-01-01
It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer s intent, potentially leading to software errors. This problem is addressed by a systems engineering methodology called State Analysis, which provides a process for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using State Analysis and how these requirements inform the design of the system software, using representative spacecraft examples.
Constrained State Estimation for Individual Localization in Wireless Body Sensor Networks
Feng, Xiaoxue; Snoussi, Hichem; Liang, Yan; Jiao, Lianmeng
2014-01-01
Wireless body sensor networks based on ultra-wideband radio have recently received much research attention due to its wide applications in health-care, security, sports and entertainment. Accurate localization is a fundamental problem to realize the development of effective location-aware applications above. In this paper the problem of constrained state estimation for individual localization in wireless body sensor networks is addressed. Priori knowledge about geometry among the on-body nodes as additional constraint is incorporated into the traditional filtering system. The analytical expression of state estimation with linear constraint to exploit the additional information is derived. Furthermore, for nonlinear constraint, first-order and second-order linearizations via Taylor series expansion are proposed to transform the nonlinear constraint to the linear case. Examples between the first-order and second-order nonlinear constrained filters based on interacting multiple model extended kalman filter (IMM-EKF) show that the second-order solution for higher order nonlinearity as present in this paper outperforms the first-order solution, and constrained IMM-EKF obtains superior estimation than IMM-EKF without constraint. Another brownian motion individual localization example also illustrates the effectiveness of constrained nonlinear iterative least square (NILS), which gets better filtering performance than NILS without constraint. PMID:25390408
International Symposium on Clusters and Nanostructures (Energy, Environment, and Health)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jena, Puru
The international Symposium on Clusters and Nanostructures was held in Richmond, Virginia during November 7-10, 2011. The symposium focused on the roles clusters and nanostructures play in solving outstanding problems in clean and sustainable energy, environment, and health; three of the most important issues facing science and society. Many of the materials issues in renewable energies, environmental impacts of energy technologies as well as beneficial and toxicity issues of nanoparticles in health are intertwined. Realizing that both fundamental and applied materials issues require a multidisciplinary approach the symposium provided a forum by bringing researchers from physics, chemistry, materials science, andmore » engineering fields to share their ideas and results, identify outstanding problems, and develop new collaborations. Clean and sustainable energy sessions addressed challenges in production, storage, conversion, and efficiency of renewable energies such as solar, wind, bio, thermo-electric, and hydrogen. Environmental issues dealt with air- and water-pollution and conservation, environmental remediation and hydrocarbon processing. Topics in health included therapeutic and diagnostic methods as well as health hazards attributed to nanoparticles. Cross-cutting topics such as reactions, catalysis, electronic, optical, and magnetic properties were also covered.« less
Improving strand pairing prediction through exploring folding cooperativity
Jeong, Jieun; Berman, Piotr; Przytycka, Teresa M.
2008-01-01
The topology of β-sheets is defined by the pattern of hydrogen-bonded strand pairing. Therefore, predicting hydrogen bonded strand partners is a fundamental step towards predicting β-sheet topology. At the same time, finding the correct partners is very difficult due to long range interactions involved in strand pairing. Additionally, patterns of aminoacids observed in β-sheet formations are very general and therefore difficult to use for computational recognition of specific contacts between strands. In this work, we report a new strand pairing algorithm. To address above mentioned difficulties, our algorithm attempts to mimic elements of the folding process. Namely, in addition to ensuring that the predicted hydrogen bonded strand pairs satisfy basic global consistency constraints, it takes into account hypothetical folding pathways. Consistently with this view, introducing hydrogen bonds between a pair of strands changes the probabilities of forming hydrogen bonds between other pairs of strand. We demonstrate that this approach provides an improvement over previously proposed algorithms. We also compare the performance of this method to that of a global optimization algorithm that poses the problem as integer linear programming optimization problem and solves it using ILOG CPLEX™ package. PMID:18989036
Zhang, Jinpeng; Zhang, Lichi; Xiang, Lei; Shao, Yeqin; Wu, Guorong; Zhou, Xiaodong; Shen, Dinggang; Wang, Qian
2017-01-01
It is fundamentally important to fuse the brain atlas from magnetic resonance (MR) images for many imaging-based studies. Most existing works focus on fusing the atlases from high-quality MR images. However, for low-quality diagnostic images (i.e., with high inter-slice thickness), the problem of atlas fusion has not been addressed yet. In this paper, we intend to fuse the brain atlas from the high-thickness diagnostic MR images that are prevalent for clinical routines. The main idea of our works is to extend the conventional groupwise registration by incorporating a novel super-resolution strategy. The contribution of the proposed super-resolution framework is two-fold. First, each high-thickness subject image is reconstructed to be isotropic by the patch-based sparsity learning. Then, the reconstructed isotropic image is enhanced for better quality through the random-forest-based regression model. In this way, the images obtained by the super-resolution strategy can be fused together by applying the groupwise registration method to construct the required atlas. Our experiments have shown that the proposed framework can effectively solve the problem of atlas fusion from the low-quality brain MR images. PMID:29062159
Vygotsky's Crisis: Argument, context, relevance.
Hyman, Ludmila
2012-06-01
Vygotsky's The Historical Significance of the Crisis in Psychology (1926-1927) is an important text in the history and philosophy of psychology that has only become available to scholars in 1982 in Russian, and in 1997 in English. The goal of this paper is to introduce Vygotsky's conception of psychology to a wider audience. I argue that Vygotsky's argument about the "crisis" in psychology and its resolution can be fully understood only in the context of his social and political thinking. Vygotsky shared the enthusiasm, widespread among Russian leftist intelligentsia in the 1920s, that Soviet society had launched an unprecedented social experiment: The socialist revolution opened the way for establishing social conditions that would let the individual flourish. For Vygotsky, this meant that "a new man" of the future would become "the first and only species in biology that would create itself." He envisioned psychology as a science that would serve this humanist teleology. I propose that The Crisis is relevant today insofar as it helps us define a fundamental problem: How can we systematically account for the development of knowledge in psychology? I evaluate how Vygotsky addresses this problem as a historian of the crisis. Copyright © 2011 Elsevier Ltd. All rights reserved.
Using Knowledge Space Theory To Assess Student Understanding of Stoichiometry
NASA Astrophysics Data System (ADS)
Arasasingham, Ramesh D.; Taagepera, Mare; Potter, Frank; Lonjers, Stacy
2004-10-01
Using the concept of stoichiometry we examined the ability of beginning college chemistry students to make connections among the molecular, symbolic, and graphical representations of chemical phenomena, as well as to conceptualize, visualize, and solve numerical problems. Students took a test designed to follow conceptual development; we then analyzed student responses and the connectivities of their responses, or the cognitive organization of the material or thinking patterns, applying knowledge space theory (KST). The results reveal that the students' logical frameworks of conceptual understanding were very weak and lacked an integrated understanding of some of the fundamental aspects of chemical reactivity. Analysis of response states indicates that the overall thinking patterns began with symbolic representations, moved to numerical problem solving, and then lastly to visualization: the acquisition of visualization skills comes later in the knowledge structure. The results strongly suggest the need for teaching approaches that help students integrate their knowledge by emphasizing the relationships between the different representations and presenting them concurrently during instruction. Also, the results indicate that KST is a useful tool for revealing various aspects of students' cognitive structure in chemistry and can be used as an assessment tool or as a pedagogical tool to address a number of student-learning issues.
Aspects of skeletal muscle modelling.
Epstein, Marcelo; Herzog, Walter
2003-09-29
The modelling of skeletal muscle raises a number of philosophical questions, particularly in the realm of the relationship between different possible levels of representation and explanation. After a brief incursion into this area, a list of desiderata is proposed as a guiding principle for the construction of a viable model, including: comprehensiveness, soundness, experimental consistency, predictive ability and refinability. Each of these principles is illustrated by means of simple examples. The presence of internal constraints, such as incompressibility, may lead to counterintuitive results. A one-panel example is exploited to advocate the use of the principle of virtual work as the ideal tool to deal with these situations. The question of stability in the descending limb of the force-length relation is addressed and a purely mechanical analogue is suggested. New experimental results confirm the assumption that fibre stiffness is positive even in the descending limb. The indeterminacy of the force-sharing problem is traditionally resolved by optimizing a, presumably, physically meaningful target function. After presenting some new results in this area, based on a separation theorem, it is suggested that a more fundamental approach to the problem is the abandoning of optimization criteria in favour of an explicit implementation of activation criteria.
Zhang, Jinpeng; Zhang, Lichi; Xiang, Lei; Shao, Yeqin; Wu, Guorong; Zhou, Xiaodong; Shen, Dinggang; Wang, Qian
2017-03-01
It is fundamentally important to fuse the brain atlas from magnetic resonance (MR) images for many imaging-based studies. Most existing works focus on fusing the atlases from high-quality MR images. However, for low-quality diagnostic images (i.e., with high inter-slice thickness), the problem of atlas fusion has not been addressed yet. In this paper, we intend to fuse the brain atlas from the high-thickness diagnostic MR images that are prevalent for clinical routines. The main idea of our works is to extend the conventional groupwise registration by incorporating a novel super-resolution strategy. The contribution of the proposed super-resolution framework is two-fold. First, each high-thickness subject image is reconstructed to be isotropic by the patch-based sparsity learning. Then, the reconstructed isotropic image is enhanced for better quality through the random-forest-based regression model. In this way, the images obtained by the super-resolution strategy can be fused together by applying the groupwise registration method to construct the required atlas. Our experiments have shown that the proposed framework can effectively solve the problem of atlas fusion from the low-quality brain MR images.
Jangland, Eva; Teodorsson, Therese; Molander, Karin; Muntlin Athlin, Åsa
2018-06-01
To explore the delivery of care from the perspective of patients with acute abdominal pain focusing on the contextual factors at system level using the Fundamentals of Care framework. The Fundamentals of Care framework describes several contextual and systemic factors that can impact the delivery of care. To deliver high-quality, person-centred care, it is important to understand how these factors affect patients' experiences and care needs. A focused ethnographic approach. A total of 20 observations were performed on two surgical wards at a Swedish university hospital. Data were collected using participant observation and informal interviews and analysed using deductive content analysis. The findings, presented in four categories, reflect the value patients place on the caring relationship and a friendly atmosphere on the ward. Patients had concerns about the environment, particularly the high-tempo culture on the ward and its impact on their integrity, rest and sleep, access to information and planning, and need for support in addressing their existential thoughts. The observers also noted that missed nursing care had serious consequences for patient safety. Patients with acute abdominal pain were cared for in the high-tempo culture of a surgical ward with limited resources, unclear leadership and challenges to patients' safety. The findings highlight the crucial importance of prioritising and valuing the patients' fundamental care needs for recovery. Nursing leaders and nurses need to take the lead to reconceptualise the value of fundamental care in the acute care setting. To improve clinical practice, the value of fundamentals of care must be addressed regardless of patient's clinical condition. Providing a caring relationship is paramount to ensure a positive impact on patient's well-being and recovery. © 2017 John Wiley & Sons Ltd.
The Basic Principle of Calculus?
ERIC Educational Resources Information Center
Hardy, Michael
2011-01-01
A simple partial version of the Fundamental Theorem of Calculus can be presented on the first day of the first-year calculus course, and then relied upon repeatedly in assigned problems throughout the course. With that experience behind them, students can use the partial version to understand the full-fledged Fundamental Theorem, with further…
A Note for Graphing Calculators in the Fundamental Finance Course
ERIC Educational Resources Information Center
Chen, Jeng-Hong
2011-01-01
The financial calculator is incorporated in finance education. In class, the instructor shows students how to use the financial calculator's function keys to solve time value of money (TVM) related problems efficiently. The fundamental finance course is required for all majors in the business school. Some students, especially…
Education Finance Reform. Voices for Illinois Children Special Report.
ERIC Educational Resources Information Center
Nagle, Ami; Kim, Robert
This special report reviews problems in Illinois' education funding system and discusses potential solutions to these problems. The report notes that the fundamental problem with the current education finance system is an over-reliance on local property taxes. Although property taxes are a relatively stable and lucrative revenue source,…
New Testing Methods to Assess Technical Problem-Solving Ability.
ERIC Educational Resources Information Center
Hambleton, Ronald K.; And Others
Tests to assess problem-solving ability being provided for the Air Force are described, and some details on the development and validation of these computer-administered diagnostic achievement tests are discussed. Three measurement approaches were employed: (1) sequential problem solving; (2) context-free assessment of fundamental skills and…
Reliable Multi Method Assessment of Metacognition Use in Chemistry Problem Solving
ERIC Educational Resources Information Center
Cooper, Melanie M.; Sandi-Urena, Santiago; Stevens, Ron
2008-01-01
Metacognition is fundamental in achieving understanding of chemistry and developing of problem solving skills. This paper describes an across-method-and-time instrument designed to assess the use of metacognition in chemistry problem solving. This multi method instrument combines a self report, namely the Metacognitive Activities Inventory…
Fostering Problem-Solving in a Virtual Environment
ERIC Educational Resources Information Center
Morin, Danielle; Thomas, Jennifer D. E.; Saadé, Raafat George
2015-01-01
This article investigates students' perceptions of the relationship between Problem-Solving and the activities and resources used in a Web-based course on the fundamentals of Information Technology at a university in Montreal, Canada. We assess for the different learning components of the course, the extent of perceived problem-solving skills…
Students' Epistemological Framing in Quantum Mechanics Problem Solving
ERIC Educational Resources Information Center
Modir, Bahar; Thompson, John D.; Sayre, Eleanor C.
2017-01-01
Students' difficulties in quantum mechanics may be the result of unproductive framing and not a fundamental inability to solve the problems or misconceptions about physics content. We observed groups of students solving quantum mechanics problems in an upper-division physics course. Using the lens of epistemological framing, we investigated four…
An Ethnomethodological Perspective on How Middle School Students Addressed a Water Quality Problem
ERIC Educational Resources Information Center
Belland, Brian R.; Gu, Jiangyue; Kim, Nam Ju; Turner, David J.
2016-01-01
Science educators increasingly call for students to address authentic scientific problems in science class. One form of authentic science problem--socioscientific issue--requires that students engage in complex reasoning by considering both scientific and social implications of problems. Computer-based scaffolding can support this process by…
Graph Laplacian Regularization for Image Denoising: Analysis in the Continuous Domain.
Pang, Jiahao; Cheung, Gene
2017-04-01
Inverse imaging problems are inherently underdetermined, and hence, it is important to employ appropriate image priors for regularization. One recent popular prior-the graph Laplacian regularizer-assumes that the target pixel patch is smooth with respect to an appropriately chosen graph. However, the mechanisms and implications of imposing the graph Laplacian regularizer on the original inverse problem are not well understood. To address this problem, in this paper, we interpret neighborhood graphs of pixel patches as discrete counterparts of Riemannian manifolds and perform analysis in the continuous domain, providing insights into several fundamental aspects of graph Laplacian regularization for image denoising. Specifically, we first show the convergence of the graph Laplacian regularizer to a continuous-domain functional, integrating a norm measured in a locally adaptive metric space. Focusing on image denoising, we derive an optimal metric space assuming non-local self-similarity of pixel patches, leading to an optimal graph Laplacian regularizer for denoising in the discrete domain. We then interpret graph Laplacian regularization as an anisotropic diffusion scheme to explain its behavior during iterations, e.g., its tendency to promote piecewise smooth signals under certain settings. To verify our analysis, an iterative image denoising algorithm is developed. Experimental results show that our algorithm performs competitively with state-of-the-art denoising methods, such as BM3D for natural images, and outperforms them significantly for piecewise smooth images.
Establishing a Conceptual Foundation for Addressing Challenges Facing Food-Energy-Water Management
NASA Astrophysics Data System (ADS)
Goldsby, M.; Padowski, J.; Katz, S.; Brady, M.; Hampton, S. E.
2017-12-01
Ensuring the security of food, energy and water in the face of a changing environment is a top societal priority. In order to make sound policy decisions aimed at meeting those needs, policy-makers need decision-relevant information. As such, considerable effort and resources have recently been devoted to investigating the Food-Energy-Water (FEW) Nexus in order to better provide that information. However, despite the increased research activity into FEW systems and FEW problems, little attention has been devoted to the fundamental conceptual issues underlying contemporary FEW systems. Consequently, this inattention has led to conceptual confusion about what is and what is not a FEW system. This project aims to fill that lacuna in order to better facilitate the FEW research agenda. Toward that end, we identify three features that distinguish FEW problems from other resource management problems: (1) the production and management of the resources in each sector of a FEW system is specialized to its own sector; (2) interdependencies exist between sectors such that overproduction in one sector, for example, may have impacts on other sectors; and (3) there are real limits to FEW resource availability as well as limits on the ability to transact across sector boundaries. We contend that once armed with this distinction, one can model the stocks and flows of FEW capital in a conceptually rigorous way that may lead to operational innovations of FEW management.
Semantic Segmentation of Forest Stands of Pure Species as a Global Optimization Problem
NASA Astrophysics Data System (ADS)
Dechesne, C.; Mallet, C.; Le Bris, A.; Gouet-Brunet, V.
2017-05-01
Forest stand delineation is a fundamental task for forest management purposes, that is still mainly manually performed through visual inspection of geospatial (very) high spatial resolution images. Stand detection has been barely addressed in the literature which has mainly focused, in forested environments, on individual tree extraction and tree species classification. From a methodological point of view, stand detection can be considered as a semantic segmentation problem. It offers two advantages. First, one can retrieve the dominant tree species per segment. Secondly, one can benefit from existing low-level tree species label maps from the literature as a basis for high-level object extraction. Thus, the semantic segmentation issue becomes a regularization issue in a weakly structured environment and can be formulated in an energetical framework. This papers aims at investigating which regularization strategies of the literature are the most adapted to delineate and classify forest stands of pure species. Both airborne lidar point clouds and multispectral very high spatial resolution images are integrated for that purpose. The local methods (such as filtering and probabilistic relaxation) are not adapted for such problem since the increase of the classification accuracy is below 5%. The global methods, based on an energy model, tend to be more efficient with an accuracy gain up to 15%. The segmentation results using such models have an accuracy ranging from 96% to 99%.
Why do children and adolescents bully their peers? A critical review of key theoretical frameworks.
Thomas, Hannah J; Connor, Jason P; Scott, James G
2018-05-01
Bullying is a significant public health problem for children and adolescents worldwide. Evidence suggests that both being bullied (bullying victimisation) and bullying others (bullying perpetration) are associated with concurrent and future mental health problems. The onset and course of bullying perpetration are influenced by individual as well as systemic factors. Identifying effective solutions to address bullying requires a fundamental understanding of why it occurs. Drawing from multi-disciplinary domains, this review provides a summary and synthesis of the key theoretical frameworks applied to understanding and intervening on the issue of bullying. A number of explanatory models have been used to elucidate the dynamics of bullying, and broadly these correspond with either system (e.g., social-ecological, family systems, peer-group socialisation) or individual-level (e.g., developmental psychopathology, genetic, resource control, social-cognitive) frameworks. Each theory adds a unique perspective; however, no single framework comprehensively explains why bullying occurs. This review demonstrates that the integration of theoretical perspectives achieves a more nuanced understanding of bullying which is necessary for strengthening evidence-based interventions. Future progress requires researchers to integrate both the systems and individual-level theoretical frameworks to further improve current interventions. More effective intervention across different systems as well as tailoring interventions to the specific needs of the individuals directly involved in bullying will reduce exposure to a key risk factor for mental health problems.
Luo, Yuan; Szolovits, Peter
2016-01-01
In natural language processing, stand-off annotation uses the starting and ending positions of an annotation to anchor it to the text and stores the annotation content separately from the text. We address the fundamental problem of efficiently storing stand-off annotations when applying natural language processing on narrative clinical notes in electronic medical records (EMRs) and efficiently retrieving such annotations that satisfy position constraints. Efficient storage and retrieval of stand-off annotations can facilitate tasks such as mapping unstructured text to electronic medical record ontologies. We first formulate this problem into the interval query problem, for which optimal query/update time is in general logarithm. We next perform a tight time complexity analysis on the basic interval tree query algorithm and show its nonoptimality when being applied to a collection of 13 query types from Allen's interval algebra. We then study two closely related state-of-the-art interval query algorithms, proposed query reformulations, and augmentations to the second algorithm. Our proposed algorithm achieves logarithmic time stabbing-max query time complexity and solves the stabbing-interval query tasks on all of Allen's relations in logarithmic time, attaining the theoretic lower bound. Updating time is kept logarithmic and the space requirement is kept linear at the same time. We also discuss interval management in external memory models and higher dimensions.
Luo, Yuan; Szolovits, Peter
2016-01-01
In natural language processing, stand-off annotation uses the starting and ending positions of an annotation to anchor it to the text and stores the annotation content separately from the text. We address the fundamental problem of efficiently storing stand-off annotations when applying natural language processing on narrative clinical notes in electronic medical records (EMRs) and efficiently retrieving such annotations that satisfy position constraints. Efficient storage and retrieval of stand-off annotations can facilitate tasks such as mapping unstructured text to electronic medical record ontologies. We first formulate this problem into the interval query problem, for which optimal query/update time is in general logarithm. We next perform a tight time complexity analysis on the basic interval tree query algorithm and show its nonoptimality when being applied to a collection of 13 query types from Allen’s interval algebra. We then study two closely related state-of-the-art interval query algorithms, proposed query reformulations, and augmentations to the second algorithm. Our proposed algorithm achieves logarithmic time stabbing-max query time complexity and solves the stabbing-interval query tasks on all of Allen’s relations in logarithmic time, attaining the theoretic lower bound. Updating time is kept logarithmic and the space requirement is kept linear at the same time. We also discuss interval management in external memory models and higher dimensions. PMID:27478379
Inverse kinematic-based robot control
NASA Technical Reports Server (NTRS)
Wolovich, W. A.; Flueckiger, K. F.
1987-01-01
A fundamental problem which must be resolved in virtually all non-trivial robotic operations is the well-known inverse kinematic question. More specifically, most of the tasks which robots are called upon to perform are specified in Cartesian (x,y,z) space, such as simple tracking along one or more straight line paths or following a specified surfacer with compliant force sensors and/or visual feedback. In all cases, control is actually implemented through coordinated motion of the various links which comprise the manipulator; i.e., in link space. As a consequence, the control computer of every sophisticated anthropomorphic robot must contain provisions for solving the inverse kinematic problem which, in the case of simple, non-redundant position control, involves the determination of the first three link angles, theta sub 1, theta sub 2, and theta sub 3, which produce a desired wrist origin position P sub xw, P sub yw, and P sub zw at the end of link 3 relative to some fixed base frame. Researchers outline a new inverse kinematic solution and demonstrate its potential via some recent computer simulations. They also compare it to current inverse kinematic methods and outline some of the remaining problems which will be addressed in order to render it fully operational. Also discussed are a number of practical consequences of this technique beyond its obvious use in solving the inverse kinematic question.
NASA Astrophysics Data System (ADS)
Tuominen, Mark
2013-03-01
Attitude, Skills, Knowledge (ASK) - In this order, these are fundamental characteristics of scientific innovators. Through first-hand practice in using science to unpack and solve complex real-world problems, students can become self-motivated scientific leaders. This presentation describes the pedagogy of a recently developed interdisciplinary undergraduate science education program at the University of Massachusetts Amherst focused on addressing global challenges with scientific solutions. Integrated Concentration in Science (iCons) is an overarching concentration program that supplements the curricula provided within each student's chosen major. iCons is a platform for students to perform student-led research in interdisciplinary collaborative teams. With a schedule of one course per year over four years, the cohort of students move through case studies, analysis of real-world problems, development of potential solutions, integrative communication, laboratory practice, and capstone research projects. In this presentation, a track emphasizing renewable energy science is used to illustrate the iCons pedagogical methods. This includes discussion of a third-year laboratory course in renewable energy that is educationally scaffolded: beginning with a boot camp in laboratory techniques and culminating with student-designed research projects. Among other objectives, this course emphasizes the practice of using reflection and redesign, as a means of generating better solutions and embedding learning for the long term. This work is supported in part by NSF grant DUE-1140805.
SEMINAR PUBLICATION: MANAGING ENVIRONMENTAL PROBLEMS AT INACTIVE AND ABANDONED METALS MINE SITES
Environmental problems associated with abandoned and inactive mines are addressed along with some approaches to resolving those problems, including case studies demonstrating technologies that have worked. New technologies being investigated are addressed also.
Air Pollution Control and Waste Management
This special issue addresses air pollution control and waste management, two environmental problems that are usually considered separately. Indeed, one of the challenges of environmental protection is that problems are addressed in 'media-specific' ways. In reality, these problem...
2008-05-22
of the primary issues that AFRICOM will have to address in North Africa such as governmental structure, finance reform, the disputed Western Sahara...will have to address in North Africa such as governmental structure, finance reform, the disputed Western Sahara region, the availability of equal...30 Finance Reform ......................................................................................................................... 32
ERIC Educational Resources Information Center
Lee, Carol D.
2017-01-01
This chapter addresses how fundamental principles regarding how people learn in the last decade open up possibilities for conceptualizing a broad ecological culturally rooted framework for the design of robust learning environments in a variety of settings, especially schools. These cross-disciplinary principles emerging from across relevant…
Halley, Meghan C; May, Suepattra G; Rendle, Katharine A S; Frosch, Dominick L; Kurian, Allison W
2014-01-01
Sexual health concerns represent one of the most frequently experienced and longest-lasting effects of breast cancer treatment, but research suggests that service providers rarely discuss sexual health with their patients. Existing research examining barriers to addressing patients' sexual health concerns has focused on discrete characteristics of the provider-patient interaction without considering the broader context in which these interactions occur. Drawing on the experiences of 21 breast cancer survivors, this paper explores three ways in which fundamental cultural and structural characteristics of the cancer care system in the USA may prevent breast cancer survivors from addressing their sexual health concerns, including: (1) when patients discussed sexual health with their providers, their providers approached sexuality as primarily physical, while participants experienced complex, multidimensional sexual health concerns; (2) specialisation within cancer care services made it difficult for patients to identify the appropriate provider to address their concerns; and (3) the structure of cancer care literally disconnects patients from the healthcare system at the time when sexual side effects commonly emerged. These data suggest that addressing breast cancer survivors' sexual health concerns requires a multifaceted approach to health systems change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zaera, Francisco; Bardeen, Christopher J.; Yin, Yadong
The overall goal of this project has been to develop new a new and novel class of well-characterized nanostructured Metal@TiO 2 core-shell and yolk-shell photocatalysts to address two fundamental issues presently limiting this field: (1) the fast recombination of electron-hole pairs once generated by light absorption, and (2) the recombination of H 2 and O 2 on the metal surface once produced. These model samples are also used to study the fundamentals of the photocatalytic processes.
... provider needs to confirm the individual has a fundamental understanding of the information exchange, i.e. more ... the patient improve his or her own self management. Full Text Health care systems can also address ...
Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Joseph Daniel; Anderson, Robert Stephen
Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary’s capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator’s knowledge. These threats can by-pass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective where neither equipment nor human operationmore » can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.« less
Application of the maximal covering location problem to habitat reserve site selection: a review
Stephanie A. Snyder; Robert G. Haight
2016-01-01
The Maximal Covering Location Problem (MCLP) is a classic model from the location science literature which has found wide application. One important application is to a fundamental problem in conservation biology, the Maximum Covering Species Problem (MCSP), which identifies land parcels to protect to maximize the number of species represented in the selected sites. We...
Green's functions for dislocations in bonded strips and related crack problems
NASA Technical Reports Server (NTRS)
Ballarini, R.; Luo, H. A.
1990-01-01
Green's functions are derived for the plane elastostatics problem of a dislocation in a bimaterial strip. Using these fundamental solutions as kernels, various problems involving cracks in a bimaterial strip are analyzed using singular integral equations. For each problem considered, stress intensity factors are calculated for several combinations of the parameters which describe loading, geometry and material mismatch.
The Visual Geophysical Exploration Environment: A Multi-dimensional Scientific Visualization
NASA Astrophysics Data System (ADS)
Pandya, R. E.; Domenico, B.; Murray, D.; Marlino, M. R.
2003-12-01
The Visual Geophysical Exploration Environment (VGEE) is an online learning environment designed to help undergraduate students understand fundamental Earth system science concepts. The guiding principle of the VGEE is the importance of hands-on interaction with scientific visualization and data. The VGEE consists of four elements: 1) an online, inquiry-based curriculum for guiding student exploration; 2) a suite of El Nino-related data sets adapted for student use; 3) a learner-centered interface to a scientific visualization tool; and 4) a set of concept models (interactive tools that help students understand fundamental scientific concepts). There are two key innovations featured in this interactive poster session. One is the integration of concept models and the visualization tool. Concept models are simple, interactive, Java-based illustrations of fundamental physical principles. We developed eight concept models and integrated them into the visualization tool to enable students to probe data. The ability to probe data using a concept model addresses the common problem of transfer: the difficulty students have in applying theoretical knowledge to everyday phenomenon. The other innovation is a visualization environment and data that are discoverable in digital libraries, and installed, configured, and used for investigations over the web. By collaborating with the Integrated Data Viewer developers, we were able to embed a web-launchable visualization tool and access to distributed data sets into the online curricula. The Thematic Real-time Environmental Data Distributed Services (THREDDS) project is working to provide catalogs of datasets that can be used in new VGEE curricula under development. By cataloging this curricula in the Digital Library for Earth System Education (DLESE), learners and educators can discover the data and visualization tool within a framework that guides their use.
Collective trauma in northern Sri Lanka: a qualitative psychosocial-ecological study.
Somasundaram, Daya
2007-10-04
Complex situations that follow war and natural disasters have a psychosocial impact on not only the individual but also on the family, community and society. Just as the mental health effects on the individual psyche can result in non pathological distress as well as a variety of psychiatric disorders; massive and widespread trauma and loss can impact on family and social processes causing changes at the family, community and societal levels. This qualitative, ecological study is a naturalistic, psychosocial ethnography in Northern Sri Lanka, while actively involved in psychosocial and community mental health programmes among the Tamil community. Participatory observation, key informant interviews and focus group discussion with community level relief and rehabilitation workers and government and non-governmental officials were used to gather data. The effects on the community of the chronic, man-made disaster, war, in Northern Sri Lanka were compared with the contexts found before the war and after the tsunami. Fundamental changes in the functioning of the family and the community were observed. While the changes after the tsunami were not so prominent, the chronic war situation caused more fundamental social transformations. At the family level, the dynamics of single parent families, lack of trust among members, and changes in significant relationships, and child rearing practices were seen. Communities tended to be more dependent, passive, silent, without leadership, mistrustful, and suspicious. Additional adverse effects included the breakdown in traditional structures, institutions and familiar ways of life, and deterioration in social norms and ethics. A variety of community level interventions were tried. Exposure to conflict, war and disaster situations impact on fundamental family and community dynamics resulting in changes at a collective level. Relief, rehabilitation and development programmes to be effective will need to address the problem of collective trauma, particularly using integrated multi-level approaches.
Radiation Belt Storm Probes: Resolving Fundamental Physics with Practical Consequences
NASA Technical Reports Server (NTRS)
Ukhorskiy, Aleksandr Y.; Mauk, Barry H.; Fox, Nicola J.; Sibeck, David G.; Grebowsky, Joseph M.
2011-01-01
The fundamental processes that energize, transport, and cause the loss of charged particles operate throughout the universe at locations as diverse as magnetized planets, the solar wind, our Sun, and other stars. The same processes operate within our immediate environment, the Earth's radiation belts. The Radiation Belt Storm Probes (RBSP) mission will provide coordinated two-spacecraft observations to obtain understanding of these fundamental processes controlling the dynamic variability of the near-Earth radiation environment. In this paper we discuss some of the profound mysteries of the radiation belt physics that will be addressed by RBSP and briefly describe the mission and its goals.
The Variety of Fluid Dynamics.
ERIC Educational Resources Information Center
Barnes, Francis; And Others
1980-01-01
Discusses three research topics which are concerned with eminently practical problems and deal at the same time with fundamental fluid dynamical problems. These research topics come from the general areas of chemical and biological engineering, geophysics, and pure mathematics. (HM)
NASA Technical Reports Server (NTRS)
Weaver, David
2008-01-01
Effectively communicate qualitative and quantitative information orally and in writing. Explain the application of fundamental physical principles to various physical phenomena. Apply appropriate problem-solving techniques to practical and meaningful problems using graphical, mathematical, and written modeling tools. Work effectively in collaborative groups.
Development of indirect EFBEM for radiating noise analysis including underwater problems
NASA Astrophysics Data System (ADS)
Kwon, Hyun-Wung; Hong, Suk-Yoon; Song, Jee-Hun
2013-09-01
For the analysis of radiating noise problems in medium-to-high frequency ranges, the Energy Flow Boundary Element Method (EFBEM) was developed. EFBEM is the analysis technique that applies the Boundary Element Method (BEM) to Energy Flow Analysis (EFA). The fundamental solutions representing spherical wave property for radiating noise problems in open field and considering the free surface effect in underwater are developed. Also the directivity factor is developed to express wave's directivity patterns in medium-to-high frequency ranges. Indirect EFBEM by using fundamental solutions and fictitious source was applied to open field and underwater noise problems successfully. Through numerical applications, the acoustic energy density distributions due to vibration of a simple plate model and a sphere model were compared with those of commercial code, and the comparison showed good agreement in the level and pattern of the energy density distributions.
The principle of cooperation and life's origin and evolution
NASA Technical Reports Server (NTRS)
Oro, J.; Armangue, G.; Mar, A.
1986-01-01
In simple terms a living entity is a negentropic system that replicates, mutates and evoluves. A number of suggestions have been made, such as directed panspermia, atmospheric photosynthesis, genetic overtaking from inorganic processes, etc., as alternative models to the accepted Oparin-Haldane-Urey model of the origin of life on Earth. This has probably occurred because in spite of tremendous advances in the prebiotic synthesis of biochemical compounds, the fundamental problem of the appearance of the first life--a primordial replicating cell-ancestral to all other forms of extant life, has remained elusive. This is indeed a reflection on the different fundamental nature of the problem involved. Regardless of which were the fundamental processes which occurred on the primitive Earth, it has to end up with the fundamental characteristics of an ancestral protocell. The problem of the emergence of the first ancestral cell was one of synergistic macromolecular cooperation, as it has been discussed by authors recently (COSPAR XXV Plenary Meeting). An analogous situation must have occurred at the time of the appearance of the first eucaryotic organism. Procaryotic life appeared probably during the first 600 million years of Earth history when the Earth was sufficiently cool and continually bombarded (in the late accretion period) by comets and minor bodies of the solar system, when the sea had not yet acquired its present form.
34 CFR 200.52 - LEA improvement.
Code of Federal Regulations, 2011 CFR
2011-07-01
... through 200.20; (v) Address— (A) The fundamental teaching and learning needs in the schools of the LEA... effective methods and instructional strategies grounded in scientifically based research; and (ii) Address...
34 CFR 200.52 - LEA improvement.
Code of Federal Regulations, 2013 CFR
2013-07-01
... through 200.20; (v) Address— (A) The fundamental teaching and learning needs in the schools of the LEA... effective methods and instructional strategies grounded in scientifically based research; and (ii) Address...
34 CFR 200.52 - LEA improvement.
Code of Federal Regulations, 2014 CFR
2014-07-01
... through 200.20; (v) Address— (A) The fundamental teaching and learning needs in the schools of the LEA... effective methods and instructional strategies grounded in scientifically based research; and (ii) Address...
34 CFR 200.52 - LEA improvement.
Code of Federal Regulations, 2012 CFR
2012-07-01
... through 200.20; (v) Address— (A) The fundamental teaching and learning needs in the schools of the LEA... effective methods and instructional strategies grounded in scientifically based research; and (ii) Address...
A review of promoting access to medicines in China - problems and recommendations.
Sun, Jing; Hu, Cecile Jia; Stuntz, Mark; Hogerzeil, Hans; Liu, Yuanli
2018-02-20
Despite recent reforms, distorting funding mechanisms and over-prescribing still maintain severe financial barriers to medicines access in China. Complicated and interrelated problems in the pharmaceutical sector require a common framework to be resolved as fragmented solutions do not work. We present a preliminary assessment of the impact of the national healthcare reforms on access to medicines, and propose policy recommendations for promoting universal access to medicines in China. Drawing on multiple sources of information, including a review of published literatures and official national data, field investigations in six provinces and interviews with key opinion leaders, this paper presents a preliminary assessment of the impact of the national healthcare reforms on access to medicines, and proposes policy recommendations for promoting universal access to medicines in China. Public expenditure on medicines has been strictly controlled since the national healthcare reforms of 2009. Yet total pharmaceutical expenditure (TPE) and total health expenditure growth rates continuously outpaced the growth of gross domestic product (GDP). With 2.4% of GDP, TPE now exceeds that of most high income countries. The distorted provider and consumer incentives in the Chinese health system have not fundamentally changed. Price-setting and reimbursement mechanisms do not promote cost-effective use of medicines. Inappropriate price controls and perverse financial incentives are the un-resolved root causes of preference of originator brands for some major diseases and shortages of low-cost and low-consumption medicines. In addition, access to expensive life-saving medicines is yet systematically addressed. The complicated and interdependent problems interact in a way that leads to significant system problems in China, which create dual challenges that both the developing country and the developed countries are facing. To further promote access to medicines, China should speed up the re-assessment of the quality and efficacy of domestically produced generic medicines; coordinate various reforms of price determination, insurance payments, and procurement policies; address medicine shortages through comprehensive policies and legislation; establish specific mechanisms to achieve sustainable equitable access to expensive essential medicines with health technology assessment as a tool to ensure that policy and priority setting are created in a coherent and evidence-based way.
The exact fundamental solution for the Benes tracking problem
NASA Astrophysics Data System (ADS)
Balaji, Bhashyam
2009-05-01
The universal continuous-discrete tracking problem requires the solution of a Fokker-Planck-Kolmogorov forward equation (FPKfe) for an arbitrary initial condition. Using results from quantum mechanics, the exact fundamental solution for the FPKfe is derived for the state model of arbitrary dimension with Benes drift that requires only the computation of elementary transcendental functions and standard linear algebra techniques- no ordinary or partial differential equations need to be solved. The measurement process may be an arbitrary, discrete-time nonlinear stochastic process, and the time step size can be arbitrary. Numerical examples are included, demonstrating its utility in practical implementation.
On the need for a theory of wildland fire spread
Mark A. Finney; Jack D. Cohen; Sara S. McAllister; W. Matt Jolly
2012-01-01
We explore the basis of understanding wildland fire behaviour with the intention of stimulating curiosity and promoting fundamental investigations of fire spread problems that persist even in the presence of tremendous modelling advances. Internationally, many fire models have been developed based on a variety of assumptions and expressions for the fundamental heat...
The Chinese Idea of Universities and the Beida Reform
ERIC Educational Resources Information Center
Yang, Gan
2004-01-01
"As far as all the universities in today?s Chinese societies are concerned, the fundamental problem of universities operated by Chinese is that basically there can be no mention of cultural self-confidence or cultural consciousness; or in other words, they have far from established a Chinese idea of university." The fundamental mission…
Challenge Based Innovation: Translating Fundamental Research into Societal Applications
ERIC Educational Resources Information Center
Kurikka, Joona; Utriainen, Tuuli; Repokari, Lauri
2016-01-01
This paper is based on work done at IdeaSquare, a new innovation experiment at CERN, the European Organization for Nuclear Research. The paper explores the translation of fundamental research into societal applications with the help of multidisciplinary student teams, project- and problem-based learning and design thinking methods. The theme is…
ERIC Educational Resources Information Center
Fang, Ning
2012-01-01
A concept pair is a pair of concepts that are fundamentally different but closely related. To develop a solid conceptual understanding in dynamics (a foundational engineering science course) and physics, students must understand the fundamental difference and relationship between two concepts that are included in each concept pair. However, all…
ERIC Educational Resources Information Center
Pardue, Harry L.; Woo, Jannie
1984-01-01
Proposes an approach to teaching analytical chemistry and chemical analysis in which a problem to be resolved is the focus of a course. Indicates that this problem-oriented approach is intended to complement detailed discussions of fundamental and applied aspects of chemical determinations and not replace such discussions. (JN)
The representation of multiplication and division facts in memory.
De Brauwer, Jolien; Fias, Wim
2011-01-01
Recently, using a training paradigm, Campbell and Agnew (2009) observed cross-operation response time savings with nonidentical elements (e.g., practice 3 + 2, test 5 - 2) for addition and subtraction, showing that a single memory representation underlies addition and subtraction performance. Evidence for cross-operation savings between multiplication and division have been described frequently (e.g., Campbell, Fuchs-Lacelle, & Phenix, 2006) but they have always been attributed to a mediation strategy (reformulating a division problem as a multiplication problem, e.g., Campbell et al., 2006). Campbell and Agnew (2009) therefore concluded that there exists a fundamental difference between addition and subtraction on the one hand and multiplication and division on the other hand. However, our results suggest that retrieval savings between inverse multiplication and division problems can be observed. Even for small problems (solved by direct retrieval) practicing a division problem facilitated the corresponding multiplication problem and vice versa. These findings indicate that shared memory representations underlie multiplication and division retrieval. Hence, memory and learning processes do not seem to differ fundamentally between addition-subtraction and multiplication-division.
A Neural Dynamic Model Generates Descriptions of Object-Oriented Actions.
Richter, Mathis; Lins, Jonas; Schöner, Gregor
2017-01-01
Describing actions entails that relations between objects are discovered. A pervasively neural account of this process requires that fundamental problems are solved: the neural pointer problem, the binding problem, and the problem of generating discrete processing steps from time-continuous neural processes. We present a prototypical solution to these problems in a neural dynamic model that comprises dynamic neural fields holding representations close to sensorimotor surfaces as well as dynamic neural nodes holding discrete, language-like representations. Making the connection between these two types of representations enables the model to describe actions as well as to perceptually ground movement phrases-all based on real visual input. We demonstrate how the dynamic neural processes autonomously generate the processing steps required to describe or ground object-oriented actions. By solving the fundamental problems of neural pointing, binding, and emergent discrete processing, the model may be a first but critical step toward a systematic neural processing account of higher cognition. Copyright © 2017 The Authors. Topics in Cognitive Science published by Wiley Periodicals, Inc. on behalf of Cognitive Science Society.
Mean intensity of the fundamental Bessel-Gaussian beam in turbulent atmosphere
NASA Astrophysics Data System (ADS)
Lukin, Igor P.
2017-11-01
In the given article mean intensity of a fundamental Bessel-Gaussian optical beam in turbulent atmosphere is studied. The problem analysis is based on the solution of the equation for the transverse second-order mutual coherence function of a fundamental Bessel-Gaussian beam of optical radiation. Distributions of mean intensity of a fundamental Bessel- Gaussian beam optical beam in longitudinal and transverse to a direction of propagation of optical radiation are investigated in detail. Influence of atmospheric turbulence on change of radius of the central part of a Bessel optical beam is estimated. Values of parameters at which it is possible to generate in turbulent atmosphere a nondiffracting pseudo-Bessel optical beam by means of a fundamental Bessel-Gaussian optical beam are established.
Krska, Shane W; DiRocco, Daniel A; Dreher, Spencer D; Shevlin, Michael
2017-12-19
The structural complexity of pharmaceuticals presents a significant challenge to modern catalysis. Many published methods that work well on simple substrates often fail when attempts are made to apply them to complex drug intermediates. The use of high-throughput experimentation (HTE) techniques offers a means to overcome this fundamental challenge by facilitating the rational exploration of large arrays of catalysts and reaction conditions in a time- and material-efficient manner. Initial forays into the use of HTE in our laboratories for solving chemistry problems centered around screening of chiral precious-metal catalysts for homogeneous asymmetric hydrogenation. The success of these early efforts in developing efficient catalytic steps for late-stage development programs motivated the desire to increase the scope of this approach to encompass other high-value catalytic chemistries. Doing so, however, required significant advances in reactor and workflow design and automation to enable the effective assembly and agitation of arrays of heterogeneous reaction mixtures and retention of volatile solvents under a wide range of temperatures. Associated innovations in high-throughput analytical chemistry techniques greatly increased the efficiency and reliability of these methods. These evolved HTE techniques have been utilized extensively to develop highly innovative catalysis solutions to the most challenging problems in large-scale pharmaceutical synthesis. Starting with Pd- and Cu-catalyzed cross-coupling chemistry, subsequent efforts expanded to other valuable modern synthetic transformations such as chiral phase-transfer catalysis, photoredox catalysis, and C-H functionalization. As our experience and confidence in HTE techniques matured, we envisioned their application beyond problems in process chemistry to address the needs of medicinal chemists. Here the problem of reaction generality is felt most acutely, and HTE approaches should prove broadly enabling. However, the quantities of both time and starting materials available for chemistry troubleshooting in this space generally are severely limited. Adapting to these needs led us to invest in smaller predefined arrays of transformation-specific screening "kits" and push the boundaries of miniaturization in chemistry screening, culminating in the development of "nanoscale" reaction screening carried out in 1536-well plates. Grappling with the problem of generality also inspired the exploration of cheminformatics-driven HTE approaches such as the Chemistry Informer Libraries. These next-generation HTE methods promise to empower chemists to run orders of magnitude more experiments and enable "big data" informatics approaches to reaction design and troubleshooting. With these advances, HTE is poised to revolutionize how chemists across both industry and academia discover new synthetic methods, develop them into tools of broad utility, and apply them to problems of practical significance.
What works and what doesn't work well in the US healthcare system.
Luft, Harold S
2006-12-01
Most observers agree that the US healthcare system is expensive, provides variable quality and leaves many without coverage. The policy challenge is that there is little consensus on how to approach reform. Many proposals assume that systems appearing to work in one nation can be transferred in toto to another or, alternatively, that only minor tweaking of an existing system is possible. The former approach ignores fundamental social, political and legal realities, and the latter ignores the potential for increased benefits. Additionally, many proposals are ideologically driven, focusing on how to finance expanded coverage. Broadening the discussion to examine other components of the system that do not work well may identify sufficient benefits for various stakeholders to engage them in finding more comprehensive solutions that address a range of problems. This paper examines areas in which the US healthcare system performs worse than one would like and areas in which it appears to work well. In the first category is the high proportion of people without coverage, the inefficient and inequitable incentives for the purchase and provision of insurance, the problems in deciding what should be covered, the ineffective payment incentives, administrative costs and complexities, the variable quality and lack of responsiveness to patient preferences, the less than optimal safety, under-valued primary care, provider de-professionalisation, and the costs that appear to be on auto-pilot. In the second category is the rapid and wide-reaching technological innovation, the ready access to care for the insured, and clinical and patient autonomy. Among the things taken as given is our constitutional (rather than parliamentary) political system and underlying public values about the roles of individuals and government. Current players will be active in any debate about reform, so their interests must be addressed. Likewise, certain underlying economic and social drivers of behaviour will continue and should be considered in any reform proposal. Potentially changeable, however, are the roles and functions that the current players may take on in a new system. Likewise, health system-specific legislation should be as malleable as are financing approaches. A more expansive view of the health system's problems makes potential solutions more complex. By addressing problems faced by people currently with coverage and by providers and other stakeholders within the system, however, the benefits may be sufficiently widespread to create the political consensus that has so far eluded reformers in the US.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Becker, Kurt H.; McCurdy, C. William; Orlando, Thomas M.
2000-09-01
This report is based largely on presentations and discussions at two workshops and contributions from workshop participants. The workshop on Fundamental Challenges in Electron-Driven Chemistry was held in Berkeley, October 9-10, 1998, and addressed questions regarding theory, computation, and simulation. The workshop on Electron-Driven Processes: Scientific Challenges and Technological Opportunities was held at Stevens Institute of Technology, March 16-17, 2000, and focused largely on experiments. Electron-molecule and electron-atom collisions initiate and drive almost all the relevant chemical processes associated with radiation chemistry, environmental chemistry, stability of waste repositories, plasma-enhanced chemical vapor deposition, plasma processing of materials for microelectronic devices andmore » other applications, and novel light sources for research purposes (e.g. excimer lamps in the extreme ultraviolet) and in everyday lighting applications. The life sciences are a rapidly advancing field where the important role of electron-driven processes is only now beginning to be recognized. Many of the applications of electron-initiated chemical processes require results in the near term. A large-scale, multidisciplinary and collaborative effort should be mounted to solve these problems in a timely way so that their solution will have the needed impact on the urgent questions of understanding the physico-chemical processes initiated and driven by electron interactions.« less
Electrochemical metallization memories--fundamentals, applications, prospects.
Valov, Ilia; Waser, Rainer; Jameson, John R; Kozicki, Michael N
2011-06-24
This review focuses on electrochemical metallization memory cells (ECM), highlighting their advantages as the next generation memories. In a brief introduction, the basic switching mechanism of ECM cells is described and the historical development is sketched. In a second part, the full spectra of materials and material combinations used for memory device prototypes and for dedicated studies are presented. In a third part, the specific thermodynamics and kinetics of nanosized electrochemical cells are described. The overlapping of the space charge layers is found to be most relevant for the cell properties at rest. The major factors determining the functionality of the ECM cells are the electrode reaction and the transport kinetics. Depending on electrode and/or electrolyte material electron transfer, electro-crystallization or slow diffusion under strong electric fields can be rate determining. In the fourth part, the major device characteristics of ECM cells are explained. Emphasis is placed on switching speed, forming and SET/RESET voltage, R(ON) to R(OFF) ratio, endurance and retention, and scaling potentials. In the last part, circuit design aspects of ECM arrays are discussed, including the pros and cons of active and passive arrays. In the case of passive arrays, the fundamental sneak path problem is described and as well as a possible solution by two anti-serial (complementary) interconnected resistive switches per cell. Furthermore, the prospects of ECM with regard to further scalability and the ability for multi-bit data storage are addressed.
Electrochemical metallization memories—fundamentals, applications, prospects
NASA Astrophysics Data System (ADS)
Valov, Ilia; Waser, Rainer; Jameson, John R.; Kozicki, Michael N.
2011-06-01
This review focuses on electrochemical metallization memory cells (ECM), highlighting their advantages as the next generation memories. In a brief introduction, the basic switching mechanism of ECM cells is described and the historical development is sketched. In a second part, the full spectra of materials and material combinations used for memory device prototypes and for dedicated studies are presented. In a third part, the specific thermodynamics and kinetics of nanosized electrochemical cells are described. The overlapping of the space charge layers is found to be most relevant for the cell properties at rest. The major factors determining the functionality of the ECM cells are the electrode reaction and the transport kinetics. Depending on electrode and/or electrolyte material electron transfer, electro-crystallization or slow diffusion under strong electric fields can be rate determining. In the fourth part, the major device characteristics of ECM cells are explained. Emphasis is placed on switching speed, forming and SET/RESET voltage, RON to ROFF ratio, endurance and retention, and scaling potentials. In the last part, circuit design aspects of ECM arrays are discussed, including the pros and cons of active and passive arrays. In the case of passive arrays, the fundamental sneak path problem is described and as well as a possible solution by two anti-serial (complementary) interconnected resistive switches per cell. Furthermore, the prospects of ECM with regard to further scalability and the ability for multi-bit data storage are addressed.
Federated data storage and management infrastructure
NASA Astrophysics Data System (ADS)
Zarochentsev, A.; Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Hristov, P.
2016-10-01
The Large Hadron Collider (LHC)’ operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. Computing models for the High Luminosity LHC era anticipate a growth of storage needs of at least orders of magnitude; it will require new approaches in data storage organization and data handling. In our project we address the fundamental problem of designing of architecture to integrate a distributed heterogeneous disk resources for LHC experiments and other data- intensive science applications and to provide access to data from heterogeneous computing facilities. We have prototyped a federated storage for Russian T1 and T2 centers located in Moscow, St.-Petersburg and Gatchina, as well as Russian / CERN federation. We have conducted extensive tests of underlying network infrastructure and storage endpoints with synthetic performance measurement tools as well as with HENP-specific workloads, including the ones running on supercomputing platform, cloud computing and Grid for ALICE and ATLAS experiments. We will present our current accomplishments with running LHC data analysis remotely and locally to demonstrate our ability to efficiently use federated data storage experiment wide within National Academic facilities for High Energy and Nuclear Physics as well as for other data-intensive science applications, such as bio-informatics.
Requirements for a next generation global flood inundation models
NASA Astrophysics Data System (ADS)
Bates, P. D.; Neal, J. C.; Smith, A.; Sampson, C. C.
2016-12-01
In this paper we review the current status of global hydrodynamic models for flood inundation prediction and highlight recent successes and current limitations. Building on this analysis we then go on to consider what is required to develop the next generation of such schemes and show that to achieve this a number of fundamental science problems will need to be overcome. New data sets and new types of analysis will be required, and we show that these will only partially be met by currently planned satellite missions and data collection initiatives. A particular example is the quality of available global Digital Elevation data. The current best data set for flood modelling, SRTM, is only available at a relatively modest 30m resolution, contains pixel-to-pixel noise of 6m and is corrupted by surface artefacts. Creative processing techniques have sought to address these issues with some success, but fundamentally the quality of the available global terrain data limits flood modelling and needs to be overcome. Similar arguments can be made for many other elements of global hydrodynamic models including their bathymetry data, boundary conditions, flood defence information and model validation data. We therefore systematically review each component of global flood models and document whether planned new technology will solve current limitations and, if not, what exactly will be required to do so.
NASA Astrophysics Data System (ADS)
Mathai, Pramod P.
This thesis focuses on applying and augmenting 'Reduced Order Modeling' (ROM) techniques to large scale problems. ROM refers to the set of mathematical techniques that are used to reduce the computational expense of conventional modeling techniques, like finite element and finite difference methods, while minimizing the loss of accuracy that typically accompanies such a reduction. The first problem that we address pertains to the prediction of the level of heat dissipation in electronic and MEMS devices. With the ever decreasing feature sizes in electronic devices, and the accompanied rise in Joule heating, the electronics industry has, since the 1990s, identified a clear need for computationally cheap heat transfer modeling techniques that can be incorporated along with the electronic design process. We demonstrate how one can create reduced order models for simulating heat conduction in individual components that constitute an idealized electronic device. The reduced order models are created using Krylov Subspace Techniques (KST). We introduce a novel 'plug and play' approach, based on the small gain theorem in control theory, to interconnect these component reduced order models (according to the device architecture) to reliably and cheaply replicate whole device behavior. The final aim is to have this technique available commercially as a computationally cheap and reliable option that enables a designer to optimize for heat dissipation among competing VLSI architectures. Another place where model reduction is crucial to better design is Isoelectric Focusing (IEF) - the second problem in this thesis - which is a popular technique that is used to separate minute amounts of proteins from the other constituents that are present in a typical biological tissue sample. Fundamental questions about how to design IEF experiments still remain because of the high dimensional and highly nonlinear nature of the differential equations that describe the IEF process as well as the uncertainty in the parameters of the differential equations. There is a clear need to design better experiments for IEF without the current overhead of expensive chemicals and labor. We show how with a simpler modeling of the underlying chemistry, we can still achieve the accuracy that has been achieved in existing literature for modeling small ranges of pH (hydrogen ion concentration) in IEF, but with far less computational time. We investigate a further reduction of time by modeling the IEF problem using the Proper Orthogonal Decomposition (POD) technique and show why POD may not be sufficient due to the underlying constraints. The final problem that we address in this thesis addresses a certain class of dynamics with high stiffness - in particular, differential algebraic equations. With the help of simple examples, we show how the traditional POD procedure will fail to model certain high stiffness problems due to a particular behavior of the vector field which we will denote as twist. We further show how a novel augmentation to the traditional POD algorithm can model-reduce problems with twist in a computationally cheap manner without any additional data requirements.
Munga, Michael A; Mwangu, Mughwira A
2013-04-01
Although the Human Resources for Health (HRH) crisis is apparently not new in the public health agenda of many countries, not many low and middle income countries are using Primary Health Care (PHC) as a tool for planning and addressing the crisis in a comprehensive manner. The aim of this paper is to appraise the inadequacies of the existing planning approaches in addressing the growing HRH crisis in resource limited settings. A descriptive literature review of selected case studies in middle and low income countries reinforced with the evidence from Tanzania was used. Consultations with experts in the field were also made. In this review, we propose a conceptual framework that describes planning may only be effective if it is structured to embrace the fundamental principles of PHC. We place the core principles of PHC at the centre of HRH planning as we acknowledge its major perspective that the effectiveness of any public health policy depends on the degree to which it envisages to address public health problems multi-dimensionally and comprehensively. The proponents of PHC approach in planning have identified inter-sectoral action and collaboration and comprehensive approach as the two basic principles that policies and plans should accentuate in order to make them effective in realizing their pre-determined goals. Two conclusions are made: Firstly, comprehensive health workforce planning is not widely known and thus not frequently used in HRH planning or analysis of health workforce issues; Secondly, comprehensiveness in HRH planning is important but not sufficient in ensuring that all the ingredients of HRH crisis are eliminated. In order to be effective and sustainable, the approach need to evoke three basic values namely effectiveness, efficiency and equity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierce, Eric M.; Freshley, Mark D.; Hubbard, Susan S.
In this report, we start by examining previous efforts at linking science and DOE EM research with cleanup activities. Many of these efforts were initiated by creating science and technology roadmaps. A recurring feature of successfully implementing these roadmaps into EM applied research efforts and successful cleanup is the focus on integration. Such integration takes many forms, ranging from combining information generated by various scientific disciplines, to providing technical expertise to facilitate successful application of novel technology, to bringing the resources and creativity of many to address the common goal of moving EM cleanup forward. Successful projects identify and focusmore » research efforts on addressing the problems and challenges that are causing “failure” in actual cleanup activities. In this way, basic and applied science resources are used strategically to address the particular unknowns that are barriers to cleanup. The brief descriptions of the Office of Science basic (Environmental Remediation Science Program [ERSP]) and EM’s applied (Groundwater and Soil Remediation Program) research programs in subsurface science provide context to the five “crosscutting” themes that have been developed in this strategic planning effort. To address these challenges and opportunities, a tiered systematic approach is proposed that leverages basic science investments with new applied research investments from the DOE Office of Engineering and Technology within the framework of the identified basic science and applied research crosscutting themes. These themes are evident in the initial portfolio of initiatives in the EM groundwater and soil cleanup multi-year program plan. As stated in a companion document for tank waste processing (Bredt et al. 2008), in addition to achieving its mission, DOE EM is experiencing a fundamental shift in philosophy from driving to closure to enabling the long-term needs of DOE and the nation.« less
Fundamental Investigations of the Tribological Properties of Biological Interfaces
2007-11-28
D Spencer 5e. WORK UNIT NUMBER 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) ETH Zurich Wolfgang - Pauli -Strasse 10 Zürich CH-8093...Chiara Perrino, Seunghwan Lee and Nicholas D. Spencer Laboratory for Surface Science and Technology, Department of Materials, ETH Zurich, Wolfgang ... Pauli -Strasse 10, CH-8093, Switzerland Abstract: Comb-like graft copolymers with carbohydrate side chains have been developed as aqueous
2007-05-01
business processes and services. 4. Security operations management addresses the day-to-day activities that the organization performs to protect the...Management TM – Technology Management Security Operations Management SOM – Security Operations Management 5.7.2 Important Operations Competency...deals with the provision of access rights to informa- tion and technical assets SOM – Security Operations Management , which addresses the fundamental
ERIC Educational Resources Information Center
Lee, Liangshiu
2010-01-01
The basis sets for symmetry operations of d[superscript 1] to d[superscript 9] complexes in an octahedral field and the resulting terms are derived for the ground states and spin-allowed excited states. The basis sets are of fundamental importance in group theory. This work addresses such a fundamental issue, and the results are pedagogically…
REIMR: A Process for Utilizing Propulsion-Oriented 'Lessons-Learned' to Mitigate Development Risk
NASA Technical Reports Server (NTRS)
Ballard, Richard O.; Brown, Kendall K.
2005-01-01
This paper is a summary overview of a study conducted a t the NASA Marshall Space Flight Center (MSFC) during the initial phases of the Space Launch Initiative (SLI) program to evaluate a large number of technical problems associated with the design, development, test, evaluation and operation of several major liquid propellant rocket engine systems (i.e., SSME, Fastrac, J-2, F-1). The results of this study was the identification of the "Fundamental Root Causes" that enabled the technical problems to manifest, and practices that can be implemented to prevent them from recurring in future engine development efforts. This paper will discus the Fundamental Root Causes, cite some examples of how the technical problems arose from them, and provide a discussion of how they can be mitigated or avoided.
Garland, Eric L
2016-06-01
Though valuation processes are fundamental to survival of the human species, hedonic dysregulation is at the root of an array of maladies, including addiction, stress, and chronic pain, as evidenced by the allostatic shift in the relative salience of natural reward to drug reward observed among persons with severe substance use disorders. To address this crucial problem, novel interventions are needed to restore hedonic regulatory processes gone awry in persons exhibiting addictive behaviors. This article describes a theoretical rationale and empirical evidence for the effects of one such new intervention, Mindfulness-Oriented Recovery Enhancement (MORE), on top-down and bottom-up mechanisms implicated in cognitive control and hedonic regulation. MORE is innovative and distinct from extant mindfulness-based interventions in that it unites traditional mindfulness meditation with reappraisal and savoring strategies designed to reverse the downward shift in salience of natural reward relative to drug reward, representing a crucial tipping point to disrupt the progression of addiction-a mechanistic target that no other behavioral intervention has been designed to address. Though additional studies are needed, clinical and biobehavioral data from several completed and ongoing trials suggest that MORE may exert salutary effects on addictive behaviors and the neurobiological processes that underpin them. © 2016 New York Academy of Sciences.
Disturbed soil characterization workshop: post-meeting summary
NASA Astrophysics Data System (ADS)
Cathcart, J. Michael
2010-04-01
Disturbance of ground surfaces can arise from a variety of processes, both manmade and natural. Burying landmines, vehicle movement, and walking are representative examples of processes that disturb ground surfaces. The nature of the specific disturbance process can lead to the observables that can aid the detection and identification of that process. While much research has been conducted in this area, fundamental questions related to the remote detection and characterization of disturbed soil surfaces remain unanswered. Under the sponsorship of the Army Research Office (ARO), the Night Vision and Electronic Sensors Directorate (NVESD), and the U.S. Army Corps of Engineers (USACE) Engineering Research and Development Center (ERDC), Georgia Tech hosted a workshop to address Remote Sensing Methods for Disturbed Soil Characterization. The workshop was held January 15-17, 2008 in Atlanta. The primary objective of this workshop was to take a new look at the disturbed soil problem in general as well as its relation to buried explosive detection and other manmade disturbances. In particular, the participants sought to outline the basic science and technology questions that need to be addressed across the full spectrum of military applications to fully exploit this phenomenon. This presentation will outline the approach taken during the workshop and provide a summary of the conclusions.
ERIC Educational Resources Information Center
Matthews, Paul G.; Atkinson, Richard C.
This paper reports an experiment designed to test theoretical relations among fast problem solving, more complex and slower problem solving, and research concerning fundamental memory processes. Using a cathode ray tube, subjects were presented with propositions of the form "Y is in list X" which they memorized. In later testing they were asked to…
The Kadison–Singer Problem in mathematics and engineering
Casazza, Peter G.; Tremain, Janet Crandell
2006-01-01
We will see that the famous intractible 1959 Kadison–Singer Problem in C*-algebras is equivalent to fundamental open problems in a dozen different areas of research in mathematics and engineering. This work gives all these areas common ground on which to interact as well as explaining why each area has volumes of literature on their respective problems without a satisfactory resolution. PMID:16461465
NASA Astrophysics Data System (ADS)
Brebbia, C. A.; Futagami, T.; Tanaka, M.
The boundary-element method (BEM) in computational fluid and solid mechanics is examined in reviews and reports of theoretical studies and practical applications. Topics presented include the fundamental mathematical principles of BEMs, potential problems, EM-field problems, heat transfer, potential-wave problems, fluid flow, elasticity problems, fracture mechanics, plates and shells, inelastic problems, geomechanics, dynamics, industrial applications of BEMs, optimization methods based on the BEM, numerical techniques, and coupling.
Health Effects of Climate Change
... in their leisure time. Deeply embedded in this fundamental relationship between climate and human life are the ... and emergency services to address disaster planning and management. Research to understand the benefits of alternative fuels, ...
The ROE is divided into 5 themes: Air, Water, Land, Human Exposure and Health and Ecological Condition. From these themes, the report indicators address fundamental questions that the ROE attempts to answer. For Land there are 5 questions.
Medicare Financing of Graduate Medical Education
Rich, Eugene C; Liebow, Mark; Srinivasan, Malathi; Parish, David; Wolliscroft, James O; Fein, Oliver; Blaser, Robert
2002-01-01
The past decade has seen ongoing debate regarding federal support of graduate medical education, with numerous proposals for reform. Several critical problems with the current mechanism are evident on reviewing graduate medical education (GME) funding issues from the perspectives of key stakeholders. These problems include the following: substantial interinstitutional and interspecialty variations in per-resident payment amounts; teaching costs that have not been recalibrated since 1983; no consistent control by physician educators over direct medical education (DME) funds; and institutional DME payments unrelated to actual expenditures for resident education or to program outcomes. None of the current GME reform proposals adequately address all of these issues. Accordingly, we recommend several fundamental changes in Medicare GME support. We propose a re-analysis of the true direct costs of resident training (with appropriate adjustment for local market factors) to rectify the myriad problems with per-resident payments. We propose that Medicare DME funds go to the physician organization providing resident instruction, keeping DME payments separate from the operating revenues of teaching hospitals. To ensure financial accountability, we propose that institutions must maintain budgets and report expenditures for each GME program. To establish educational accountability, Residency Review Committees should establish objective, annually measurable standards for GME program performance; programs that consistently fail to meet these minimum standards should lose discretion over GME funds. These reforms will solve several long-standing, vexing problems in Medicare GME funding, but will also uncover the extent of undersupport of GME by most other health care payers. Ultimately, successful reform of GME financing will require “all-payer” support. PMID:11972725
Editorial: 2nd Special Issue on behavior change, health, and health disparities.
Higgins, Stephen T
2015-11-01
This Special Issue of Preventive Medicine (PM) is the 2nd that we have organized on behavior change, health, and health disparities. This is a topic of fundamental importance to improving population health in the U.S. and other industrialized countries that are trying to more effectively manage chronic health conditions. There is broad scientific consensus that personal behavior patterns such as cigarette smoking, other substance abuse, and physical inactivity/obesity are among the most important modifiable causes of chronic disease and its adverse impacts on population health. As such behavior change needs to be a key component of improving population health. There is also broad agreement that while these problems extend across socioeconomic strata, they are overrepresented among more economically disadvantaged populations and contribute directly to the growing problem of health disparities. Hence, behavior change represents an essential step in curtailing that unsettling problem as well. In this 2nd Special Issue, we devote considerable space to the current U.S. prescription opioid addiction epidemic, a crisis that was not addressed in the prior Special Issue. We also continue to devote attention to the two largest contributors to preventable disease and premature death, cigarette smoking and physical inactivity/obesity as well as risks of co-occurrence of these unhealthy behavior patterns. Across each of these topics we included contributions from highly accomplished policy makers and scientists to acquaint readers with recent accomplishments as well as remaining knowledge gaps and challenges to effectively managing these important chronic health problems. Copyright © 2015 Elsevier Inc. All rights reserved.
The great opportunity: Evolutionary applications to medicine and public health.
Nesse, Randolph M; Stearns, Stephen C
2008-02-01
Evolutionary biology is an essential basic science for medicine, but few doctors and medical researchers are familiar with its most relevant principles. Most medical schools have geneticists who understand evolution, but few have even one evolutionary biologist to suggest other possible applications. The canyon between evolutionary biology and medicine is wide. The question is whether they offer each other enough to make bridge building worthwhile. What benefits could be expected if evolution were brought fully to bear on the problems of medicine? How would studying medical problems advance evolutionary research? Do doctors need to learn evolution, or is it valuable mainly for researchers? What practical steps will promote the application of evolutionary biology in the areas of medicine where it offers the most? To address these questions, we review current and potential applications of evolutionary biology to medicine and public health. Some evolutionary technologies, such as population genetics, serial transfer production of live vaccines, and phylogenetic analysis, have been widely applied. Other areas, such as infectious disease and aging research, illustrate the dramatic recent progress made possible by evolutionary insights. In still other areas, such as epidemiology, psychiatry, and understanding the regulation of bodily defenses, applying evolutionary principles remains an open opportunity. In addition to the utility of specific applications, an evolutionary perspective fundamentally challenges the prevalent but fundamentally incorrect metaphor of the body as a machine designed by an engineer. Bodies are vulnerable to disease - and remarkably resilient - precisely because they are not machines built from a plan. They are, instead, bundles of compromises shaped by natural selection in small increments to maximize reproduction, not health. Understanding the body as a product of natural selection, not design, offers new research questions and a framework for making medical education more coherent. We conclude with recommendations for actions that would better connect evolutionary biology and medicine in ways that will benefit public health. It is our hope that faculty and students will send this article to their undergraduate and medical school Deans, and that this will initiate discussions about the gap, the great opportunity, and action plans to bring the full power of evolutionary biology to bear on human health problems.
The Unrecognized Crisis: Library Reference Service at the Crossroads.
ERIC Educational Resources Information Center
Hernon, Peter
1986-01-01
Briefly describes fundamental problems with library reference service based on the findings of 20 separate studies using unobtrusive tests. An emphasis on quality of service, realistic goals, and effective managerial strategies are identified as possible resolutions of these problems. (CLB)
The ROE is divided into 5 themes: Air, Water, Land, Human Exposure and Health and Ecological Condition. From these themes, the report indicators address fundamental questions that the ROE attempts to answer. For ecological condition there are 5 questions.
Wang, Luda; Boutilier, Michael S H; Kidambi, Piran R; Jang, Doojoon; Hadjiconstantinou, Nicolas G; Karnik, Rohit
2017-06-06
Graphene and other two-dimensional materials offer a new approach to controlling mass transport at the nanoscale. These materials can sustain nanoscale pores in their rigid lattices and due to their minimum possible material thickness, high mechanical strength and chemical robustness, they could be used to address persistent challenges in membrane separations. Here we discuss theoretical and experimental developments in the emerging field of nanoporous atomically thin membranes, focusing on the fundamental mechanisms of gas- and liquid-phase transport, membrane fabrication techniques and advances towards practical application. We highlight potential functional characteristics of the membranes and discuss applications where they are expected to offer advantages. Finally, we outline the major scientific questions and technological challenges that need to be addressed to bridge the gap from theoretical simulations and proof-of-concept experiments to real-world applications.
The Right Network for the Right Problem
ERIC Educational Resources Information Center
Gomez, Louis M.; Russell, Jennifer L.; Bryk, Anthony S.; LeMahieu, Paul G.; Mejia, Eva M.
2016-01-01
Educators are realizing that individuals working in isolation can't adequately address the teaching and learning problems that face us today. Collective action networks are needed. Sharing networks use collective energy to support individual action and agency, whereas execution networks typically address complex problems that require sustained…
Collaborative research in medical education: a discussion of theory and practice.
O'Sullivan, Patricia S; Stoddard, Hugh A; Kalishman, Summers
2010-12-01
Medical education researchers are inherently collaborators. This paper presents a discussion of theoretical frameworks, issues and challenges around collaborative research to prepare medical education researchers to enter into successful collaborations. It gives emphasis to the conceptual issues associated with collaborative research and applies these to medical education research. Although not a systematic literature review, the paper provides a rich discussion of issues which medical education researchers might consider when undertaking collaborative studies. Building on the work of others, we have classified collaborative research in three dimensions according to: the number of administrative units represented; the number of academic fields present, and the manner in which knowledge is created. Although some literature on collaboration focuses on the more traditional positivist perspective and emphasises outcomes, other literature comes from the constructivist framework, in which research is not driven by hypotheses and the approaches emphasised, but by the interaction between investigator and subject. Collaborations are more effective when participants overtly clarify their motivations, values, definitions of appropriate data and accepted methodologies. These should be agreed upon prior to commencing a study. The way we currently educate researchers should be restructured if we want them to be able to undertake interdisciplinary research. Despite calls for researchers to be educated differently, most training programmes for developing researchers have demonstrated a limited, if not contrary, response to these calls. Collaborative research in medical education should be driven by the problem being investigated, by the new knowledge gained and by the interpersonal interactions that may be achieved. Success rests on recognising that many of the research problems we, as medical educators, address are fundamentally interdisciplinary in nature. This represents a transition to bridge the dichotomy often presented in medical education between theory building and addressing practical needs. © Blackwell Publishing Ltd 2010.
Jeffrey J. Brooks; Alexander N. Bujak; Joseph G. Champ; Daniel R. Williams
2006-01-01
We reviewed, annotated, and organized recent social science research and developed a framework for addressing the wildland fire social problem. We annotated articles related to three topic areas or factors, which are critical for understanding collective action, particularly in the wildland-urban interface. These factors are collaborative capacity, problem framing, and...
ERIC Educational Resources Information Center
Canadian Association of University Teachers, 2017
2017-01-01
Canadian Association of University Teachers (CAUT) welcomes the report of the Advisory Panel on Federal Support for Fundamental Science "the Panel". It is a thoughtful and comprehensive study that correctly diagnoses problems that have plagued basic science for over a decade. The Panel's recommendations, if implemented, will chart a…
ERIC Educational Resources Information Center
Kim, Chung-Il; Lee, Kang-Yi
2016-01-01
Early childhood obesity is a serious worldwide problem, and fundamental movement skills (FMS) are very important factors in human movement. Thus, several advanced studies have examined the associations between FMS and body mass index (BMI). The purpose of this study was to investigate BMI and FMS (locomotion and object control skills) in Korean…
Modelling of RR Lyrae instability strips
NASA Astrophysics Data System (ADS)
Szabo, Robert; Csubry, Zoltan
2001-02-01
Recent studies indicates that the slope of the empirical blue edge of the RR Lyrae fundamental mode instability strip is irreconcilable with the theoretical blue edges. Nonlinear hydrodynamical pulsational code involving turbulent convection was used to follow fundamental/first overtone mode selection mechanism. This method combined with the results of horizontal branch evolutionary computations was applied to rethink the problem.
ERIC Educational Resources Information Center
Epstein, Baila
2016-01-01
Background: Clinical problem-solving is fundamental to the role of the speech-language pathologist in both the diagnostic and treatment processes. The problem-solving often involves collaboration with clients and their families, supervisors, and other professionals. Considering the importance of cooperative problem-solving in the profession,…
Is Student Knowledge of Anatomy Affected by a Problem-Based Learning Approach? A Review
ERIC Educational Resources Information Center
Williams, Jonathan M.
2014-01-01
A fundamental understanding of anatomy is critical for students on many health science courses. It has been suggested that a problem-based approach to learning anatomy may result in deficits in foundation knowledge. The aim of this review is to compare traditional didactic methods with problem-based learning methods for obtaining anatomy…
A Dynamic Process Model for Optimizing the Hospital Environment Cash-Flow
NASA Astrophysics Data System (ADS)
Pater, Flavius; Rosu, Serban
2011-09-01
In this article is presented a new approach to some fundamental techniques of solving dynamic programming problems with the use of functional equations. We will analyze the problem of minimizing the cost of treatment in a hospital environment. Mathematical modeling of this process leads to an optimal control problem with a finite horizon.
Learning Analysis of K-12 Students' Online Problem Solving: A Three-Stage Assessment Approach
ERIC Educational Resources Information Center
Hu, Yiling; Wu, Bian; Gu, Xiaoqing
2017-01-01
Problem solving is considered a fundamental human skill. However, large-scale assessment of problem solving in K-12 education remains a challenging task. Researchers have argued for the development of an enhanced assessment approach through joint effort from multiple disciplines. In this study, a three-stage approach based on an evidence-centered…
Rezk, Amgad R; Ramesan, Shwathy; Yeo, Leslie Y
2018-01-30
The microarray titre plate remains a fundamental workhorse in genomic, proteomic and cellomic analyses that underpin the drug discovery process. Nevertheless, liquid handling technologies for sample dispensing, processing and transfer have not progressed significantly beyond conventional robotic micropipetting techniques, which are not only at their fundamental sample size limit, but are also prone to mechanical failure and contamination. This is because alternative technologies to date suffer from a number of constraints, mainly their limitation to carry out only a single liquid operation such as dispensing or mixing at a given time, and their inability to address individual wells, particularly at high throughput. Here, we demonstrate the possibility for true sequential or simultaneous single- and multi-well addressability in a 96-well plate using a reconfigurable modular platform from which MHz-order hybrid surface and bulk acoustic waves can be coupled to drive a variety of microfluidic modes including mixing, sample preconcentration and droplet jetting/ejection in individual or multiple wells on demand, thus constituting a highly versatile yet simple setup capable of improving the functionality of existing laboratory protocols and processes.
False discovery rates in spectral identification.
Jeong, Kyowon; Kim, Sangtae; Bandeira, Nuno
2012-01-01
Automated database search engines are one of the fundamental engines of high-throughput proteomics enabling daily identifications of hundreds of thousands of peptides and proteins from tandem mass (MS/MS) spectrometry data. Nevertheless, this automation also makes it humanly impossible to manually validate the vast lists of resulting identifications from such high-throughput searches. This challenge is usually addressed by using a Target-Decoy Approach (TDA) to impose an empirical False Discovery Rate (FDR) at a pre-determined threshold x% with the expectation that at most x% of the returned identifications would be false positives. But despite the fundamental importance of FDR estimates in ensuring the utility of large lists of identifications, there is surprisingly little consensus on exactly how TDA should be applied to minimize the chances of biased FDR estimates. In fact, since less rigorous TDA/FDR estimates tend to result in more identifications (at higher 'true' FDR), there is often little incentive to enforce strict TDA/FDR procedures in studies where the major metric of success is the size of the list of identifications and there are no follow up studies imposing hard cost constraints on the number of reported false positives. Here we address the problem of the accuracy of TDA estimates of empirical FDR. Using MS/MS spectra from samples where we were able to define a factual FDR estimator of 'true' FDR we evaluate several popular variants of the TDA procedure in a variety of database search contexts. We show that the fraction of false identifications can sometimes be over 10× higher than reported and may be unavoidably high for certain types of searches. In addition, we further report that the two-pass search strategy seems the most promising database search strategy. While unavoidably constrained by the particulars of any specific evaluation dataset, our observations support a series of recommendations towards maximizing the number of resulting identifications while controlling database searches with robust and reproducible TDA estimation of empirical FDR.
Coherence degree of the fundamental Bessel-Gaussian beam in turbulent atmosphere
NASA Astrophysics Data System (ADS)
Lukin, Igor P.
2017-11-01
In this article the coherence of a fundamental Bessel-Gaussian optical beam in turbulent atmosphere is analyzed. The problem analysis is based on the solution of the equation for the transverse second-order mutual coherence function of a fundamental Bessel-Gaussian optical beam of optical radiation. The behavior of a coherence degree of a fundamental Bessel-Gaussian optical beam depending on parameters of an optical beam and characteristics of turbulent atmosphere is examined. It was revealed that at low levels of fluctuations in turbulent atmosphere the coherence degree of a fundamental Bessel-Gaussian optical beam has the characteristic oscillating appearance. At high levels of fluctuations in turbulent atmosphere the coherence degree of a fundamental Bessel-Gaussian optical beam is described by an one-scale decreasing curve which in process of increase of level of fluctuations on a line of formation of a laser beam becomes closer to the same characteristic of a spherical optical wave.
NASA Astrophysics Data System (ADS)
Imai, Emiko; Katagiri, Yoshitada; Seki, Keiko; Kawamata, Toshio
2011-06-01
We present a neural model of the production of modulated speech streams in the brain, referred to as prosody, which indicates the limbic structure essential for producing prosody both linguistically and emotionally. This model suggests that activating the fundamental brain including monoamine neurons at the basal ganglia will potentially contribute to helping patients with prosodic disorders coming from functional defects of the fundamental brain to overcome their speech problem. To establish effective clinical treatment for such prosodic disorders, we examine how sounds affect the fundamental activity by using electroencephalographic measurements. Throughout examinations with various melodious sounds, we found that some melodies with lilting rhythms successfully give rise to the fast alpha rhythms at the electroencephalogram which reflect the fundamental brain activity without any negative feelings.
ERIC Educational Resources Information Center
Nelson, Tenneisha; Squires, Vicki
2017-01-01
Organizations are faced with solving increasingly complex problems. Addressing these issues requires effective leadership that can facilitate a collaborative problem solving approach where multiple perspectives are leveraged. In this conceptual paper, we critique the effectiveness of earlier leadership models in tackling complex organizational…
State of the art and future needs in S.I. engine combustion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maly, R.R.
1994-12-31
The paper reviews, in short, the state-of-the-art in SI engine combustion by addressing its main features: mixture formation, ignition, homogeneous combustion, pollutant formation, knock, and engine modeling. Necessary links between fundamental and practical work are clarified and discussed along with advanced diagnostics and simulation tools. The needs for further work are identified, the most important one being integration of all fundamental and practical resources to meet R and D requirements for future engines.
Zhdanko, I M; Pisarev, A A; Vorona, A A; Lapa, V V; Khomenko, M N
2015-01-01
The article discloses postulates of theoretical concepts that make the methodological basis for addressing the real-world aviation medicine challenges of humanizing aviator's environment, labor content and means, and health and performance maintenance. Under consideration are focal fundamental and practical issues arising with the technological progress in aviation and dealt with at the AF CRI Research Test Center of Aerospace Medicine and Military Ergonomics.
Visualizing the Fundamental Physics of Rapid Earth Penetration Using Transparent Soils
2015-03-01
L R E P O R T DTRA-TR-14-80 Visualizing the Fundamental Physics of Rapid Earth Penetration Using Transparent Soils Approved for public... ELEMENT NUMBER 6. AUTHOR(S) 5d. PROJECT NUMBER 5e. TASK NUMBER 5f. WORK UNIT NUMBER 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS...dose absorbed) roentgen shake slug torr (mm Hg, 0 C) *The bacquerel (Bq) is the SI unit of radioactivity ; 1 Bq = 1 event/s. **The Gray (GY) is