NASA Astrophysics Data System (ADS)
Konopleva, Nelly
2017-03-01
The axiomatics of a fundamental physical theory is closely connected with the methods of experimental measurement. The difference between theories using global and local symmetries is explained. It is shown that localization of the symmetry group leads not only to a change of the relativity principle, but also to a fundamental modification of the experimental programs that test the theory's predictions. It is noted that any fundamental physical theory must be consistent with the measurement procedures employed for its testing. These ideas are illustrated by events of my biography connected with the transformation of Yang-Mills theory from an ordinary phenomenological model into a fundamental physical theory based on local symmetry principles, like Einstein's General Relativity. Baldin's position in this situation is described.
The Role of Fisher Information Theory in the Development of Fundamental Laws in Physical Chemistry
ERIC Educational Resources Information Center
Honig, J. M.
2009-01-01
The unifying principle that involves rendering the Fisher information measure an extremum is reviewed. It is shown that with this principle, in conjunction with appropriate constraints, a large number of fundamental laws can be derived from a common source in a unified manner. The resulting economy of thought pertaining to fundamental principles…
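For readers unfamiliar with the extremum principle referred to above, the following is an illustrative sketch of how it is commonly written (e.g. in the "extreme physical information" tradition); the notation and the choice of constraint functional J are assumptions here, not taken from the cited article.

```latex
% Fisher information of a translation-family density p(x|\theta) = p(x - \theta):
I[p] \;=\; \int \frac{1}{p(x)}\left(\frac{dp}{dx}\right)^{2} dx
      \;=\; 4\int \left(\frac{dq}{dx}\right)^{2} dx , \qquad p = q^{2}.
% Extremum principle with a constraint ("bound information") functional J[p]:
\delta\bigl(I[p] - J[p]\bigr) \;=\; 0 .
% For suitable choices of J, the Euler-Lagrange equation for q(x) reproduces
% familiar differential equations of physics (e.g. a Schrodinger-type equation).
```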
Quantum correlations are tightly bound by the exclusivity principle.
Yan, Bin
2013-06-28
It is a fundamental problem in physics which principle limits the correlations predicted by our current description of nature, based on quantum mechanics. One possible explanation is the "global exclusivity" principle recently discussed in Phys. Rev. Lett. 110, 060402 (2013). In this work we show that this principle actually places a much stronger restriction on the probability distributions. We provide a tight constraint inequality imposed by this principle and prove that it singles out quantum correlations in scenarios represented by any graph. Our result implies that the exclusivity principle might be one of the fundamental principles of nature.
Detection principle of gravitational wave detectors
NASA Astrophysics Data System (ADS)
Congedo, Giuseppe
With the first two detections in late 2015, astrophysics has officially entered the new era of gravitational wave (GW) observations. Since then, much work in the field has focused on the observations and their implications for astrophysics and for tests of general relativity in the strong regime. However, much less is understood about how gravitational wave detectors really work at their most fundamental level. For decades, the response to incoming signals has customarily been calculated using the very same physical principle, which has proved so successful in the first detections. In this paper, we review the physical principle behind such a detection at the very fundamental level, and we highlight the peculiar subtleties that make it so hard in practice. We then describe how detectors are built starting from this fundamental measurement element.
ERIC Educational Resources Information Center
Wiener, Gerfried J.; Schmeling, Sascha M.; Hopf, Martin
2015-01-01
This study introduces a teaching concept based on the Standard Model of particle physics. It comprises two consecutive chapters--elementary particles and fundamental interactions. The rationale of this concept is that the fundamental principles of particle physics can run as the golden thread through the whole physics curriculum. The design…
USDA-ARS?s Scientific Manuscript database
In this chapter, definitions of dielectric properties, or permittivity, of materials and a brief discussion of the fundamental principles governing their behavior with respect to influencing factors are presented. The basic physics of the influence of frequency of the electric fields and temperatur...
Nomura, Yasunori; Salzetta, Nico
2016-08-04
The firewall paradox for black holes is often viewed as indicating a conflict between unitarity and the equivalence principle. We elucidate how the paradox manifests as a limitation of semiclassical theory, rather than as a conflict between fundamental principles. Two principal features of the fundamental and semiclassical theories address two versions of the paradox: the entanglement and typicality arguments. First, the physical Hilbert space describing excitations on a fixed black hole background in the semiclassical theory is exponentially smaller than the number of physical states in the fundamental theory of quantum gravity. Second, in addition to the Hilbert space for physical excitations, the semiclassical theory possesses an unphysically large Fock space built by creation and annihilation operators on the fixed black hole background. Understanding these features not only eliminates the necessity of firewalls but also leads to a new picture of Hawking emission, contrasting with pair creation at the horizon.
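A rough counting argument, in Planck units, indicates what "exponentially smaller" means here; this is an illustrative sketch of the standard entropy estimates, not a derivation from the cited paper.

```latex
% Bekenstein-Hawking entropy of a black hole with horizon area A (Planck units):
S_{\rm BH} \;=\; \frac{A}{4}\,, \qquad N_{\rm fund} \;\sim\; e^{S_{\rm BH}} = e^{A/4},
% whereas the semiclassical Hilbert space of field excitations on the fixed
% background carries only a subleading entropy S_{\rm sc} \ll A/4, so
\dim \mathcal{H}_{\rm sc} \;\sim\; e^{S_{\rm sc}} \;\ll\; e^{A/4}.
```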
Mueller, Michael J; Maluf, Katrina S
2002-04-01
The purpose of this perspective is to present a general theory--the Physical Stress Theory (PST). The basic premise of the PST is that changes in the relative level of physical stress cause a predictable adaptive response in all biological tissue. Specific thresholds define the upper and lower stress levels for each characteristic tissue response. Qualitatively, the 5 tissue responses to physical stress are decreased stress tolerance (eg, atrophy), maintenance, increased stress tolerance (eg, hypertrophy), injury, and death. Fundamental principles of tissue adaptation to physical stress are described that, in the authors' opinion, can be used to help guide physical therapy practice, education, and research. The description of fundamental principles is followed by a review of selected literature describing adaptation to physical stress for each of the 4 main organ systems described in the Guide to Physical Therapist Practice (ie, cardiovascular/pulmonary, integumentary, musculoskeletal, neuromuscular). Limitations and implications of the PST for practice, research, and education are presented.
NASA Technical Reports Server (NTRS)
Weaver, David
2008-01-01
Effectively communicate qualitative and quantitative information orally and in writing. Explain the application of fundamental physical principles to various physical phenomena. Apply appropriate problem-solving techniques to practical and meaningful problems using graphical, mathematical, and written modeling tools. Work effectively in collaborative groups.
Microscopic Description of Le Chatelier's Principle
ERIC Educational Resources Information Center
Novak, Igor
2005-01-01
A simple approach that "demystifies" Le Chatelier's principle (LCP) and stimulates students to think about the fundamental physical background behind the well-known principles is presented. The approach uses microscopic descriptors of matter like energy levels and populations and does not require any assumption about the fixed amount of substance being…
Water Balance Covers For Waste Containment: Principles and Practice
Water Balance Covers for Waste Containment: Principles and Practices introduces water balance covers and compares them with conventional approaches to waste containment. The authors provide a detailed analysis of the fundamentals of soil physics and design issues, introduce appl...
Looking forward, not back: Supporting structuralism in the present.
McKenzie, Kerry
2016-10-01
The view that the fundamental kind properties are intrinsic properties enjoys reflexive endorsement by most metaphysicians of science. But ontic structural realists deny that there are any fundamental intrinsic properties at all. Given that structuralists distrust intuition as a guide to truth, and given that we currently lack a fundamental physical theory that we could consult instead in order to settle the issue, it might seem as if there is simply nowhere for this debate to go at present. However, I will argue that there exists an as-yet untapped resource for arguing for ontic structuralism - namely, the way that fundamentality is conceptualized in our most fundamental physical frameworks. By arguing that physical objects must be subject to the 'Goldilocks principle' if they are to count as fundamental at all, I argue that we can no longer view the majority of properties defining them as intrinsic. As such, ontic structural realism can be regarded as the most promising metaphysics for fundamental physics, and this is so even though we do not yet claim to know precisely what that fundamental physics is. Copyright © 2016 Elsevier Ltd. All rights reserved.
Many-Worlds Interpretation of Quantum Theory and Mesoscopic Anthropic Principle
NASA Astrophysics Data System (ADS)
Kamenshchik, A. Yu.; Teryaev, O. V.
2008-10-01
We suggest combining the Anthropic Principle with the Many-Worlds Interpretation of Quantum Theory. By realizing the multiplicity of worlds, it provides an opportunity to explain some important events which are assumed to be extremely improbable. The Mesoscopic Anthropic Principle suggested here aims to explain the appearance of such events which are necessary for the emergence of Life and Mind. It is complementary to the Cosmological Anthropic Principle, which explains the fine tuning of fundamental constants. We briefly discuss various possible applications of the Mesoscopic Anthropic Principle, including solar eclipses and the assembly of complex molecules. Besides, we address the problem of Time's Arrow in the framework of the Many-Worlds Interpretation. We suggest a recipe for disentangling quantities defined by fundamental physical laws from those determined by anthropic selection.
Lessons from the GP-B Experience for Future Fundamental Physics Missions in Space
NASA Technical Reports Server (NTRS)
Kolodziejczak, Jeffery
2006-01-01
Gravity Probe B launched in April 2004 and completed its science data collection in September 2005, with the objective of sub-milliarcsec measurement of two General Relativistic effects on the spin axis orientation of orbiting gyroscopes. Much of the technology required by GP-B has potential application in future missions intended to make precision measurements. The philosophical approach and experiment design principles developed for GP-B are equally adaptable to these mission concepts. This talk will discuss GP-B's experimental approach and the technological and philosophical lessons learned that apply to future experiments in fundamental physics. Measurement of fundamental constants to high precision, probes of short-range forces, searches for equivalence principle violations, and detection of gravitational waves are examples of concepts and missions that will benefit from GP-B's experience.
NASA Astrophysics Data System (ADS)
Overduin, James; Everitt, Francis; Worden, Paul; Mester, John
2012-09-01
The Satellite Test of the Equivalence Principle (STEP) will advance experimental limits on violations of Einstein's equivalence principle from their present sensitivity of two parts in 10^13 to one part in 10^18 through multiple comparison of the motions of four pairs of test masses of different compositions in a drag-free earth-orbiting satellite. We describe the experiment, its current status and its potential implications for fundamental physics. Equivalence is at the heart of general relativity, our governing theory of gravity, and violations are expected in most attempts to unify this theory with the other fundamental interactions of physics, as well as in many theoretical explanations for the phenomenon of dark energy in cosmology. Detection of such a violation would be equivalent to the discovery of a new force of nature. A null result would be almost as profound, pushing upper limits on any coupling between standard-model fields and the new light degrees of freedom generically predicted by these theories down to unnaturally small levels.
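The quoted sensitivities are conventionally expressed through the Eötvös ratio; the definition below is standard background, with the numbers simply restating those in the abstract.

```latex
% Eotvos ratio comparing the free-fall accelerations of two test bodies A, B:
\eta(A,B) \;=\; 2\,\frac{|a_{A} - a_{B}|}{|a_{A} + a_{B}|}.
% Present limits quoted above: \eta \lesssim 2\times 10^{-13};
% STEP target: \eta \sim 10^{-18}.
```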
``From Fundamental Motives to Rational Expectation Equilibrium [REE, henceforth] of Indeterminacy''
NASA Astrophysics Data System (ADS)
Maksoed, Ssi, Wh-
For the ``Principle of Indeterminacy'', Heisenberg states that ``one of the fundamental cornerstones of quantum mechanics is the Heisenberg uncertainty principle'', whereas canonically conjugate quantities can be determined simultaneously only with a characteristic indeterminacy [M. Arevalo Aguilar, et al.]. Accompanying Alfred North Whitehead's conclusion in ``The Aims of Education'' that mathematical symbols are artificial before new meanings are given, two kinds of fundamental motives, (i) expectation-expectation and (ii) expectation-certainty, inherently occur with the determinacy properties of rational expectation equilibrium (REE, henceforth) - Guido Ascari & Tiziano Ropele: ``Trend inflation, Taylor principle & Indeterminacy'', Kiel Institute, June 2007. Further, relative price expressions can be compared in their α and (1 - α) configurations in the expression of possible activity. Acknowledgment to Prof. Dr. Bobby Eka Gunara for ``made a rank through physics'' denotes...
ERIC Educational Resources Information Center
Concannon, Tom
2008-01-01
Doing physics "magic shows" for the general public or for local area schools is usually an integral part of any physics department's outreach program. These demonstration shows should not only teach fundamental physics principles with "standard" demonstrations (like the rocket cart) but should also include the "wow!" types of demonstrations for…
Productive Nanosystems: The Physics of Molecular Fabrication
ERIC Educational Resources Information Center
Drexler, K. Eric
2005-01-01
Fabrication techniques are the foundation of physical technology, and are thus of fundamental interest. Physical principles indicate that nanoscale systems will be able to fabricate a wide range of structures, operating with high productivity and precise molecular control. Advanced systems of this kind will require intermediate generations of…
NASA Astrophysics Data System (ADS)
Crouch, Catherine H.; Heller, Kenneth
2014-05-01
We describe restructuring the introductory physics for life science students (IPLS) course to better support these students in using physics to understand their chosen fields. Our courses teach physics using biologically rich contexts. Specifically, we use examples in which fundamental physics contributes significantly to understanding a biological system to make explicit the value of physics to the life sciences. This requires selecting the course content to reflect the topics most relevant to biology while maintaining the fundamental disciplinary structure of physics. In addition to stressing the importance of the fundamental principles of physics, an important goal is developing students' quantitative and problem solving skills. Our guiding pedagogical framework is the cognitive apprenticeship model, in which learning occurs most effectively when students can articulate why what they are learning matters to them. In this article, we describe our courses, summarize initial assessment data, and identify needs for future research.
[The anthropic principle in biology and radiobiology].
Akif'ev, A P; Degtiarev, S V
1999-01-01
In accordance with the anthropic principle of the Universe, the physical constants of the fundamental particles of matter and the laws of their interaction are such that the appearance of man and mind becomes possible and necessary. It is suggested that some biological constants be added to the set of fundamental constants. With the repair of DNA as an example, it was shown how a cell adjusts some parameters of the Watson-Crick double helix. It was pointed out that the concept of the anthropic principle of the Universe in its full form, including biological constants, is a key to developing a unified theory of the evolution of the Universe within the limits of scientific creationism.
The 4th Thermodynamic Principle?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montero Garcia, Jose de la Luz; Novoa Blanco, Jesus Francisco
2007-04-28
It should be emphasized that the 4th Principle formulated above is a thermodynamic principle and, at the same time, is quantum-mechanical and relativistic, as it inevitably should be; its absence has been one of the main theoretical limitations of physical theory until today. We show that from the theoretical discovery of the Dimensional Primitive Octet of Matter, the 4th Thermodynamic Principle, the Quantum Hexet of Matter, the Global Hexagonal Subsystem of Fundamental Constants of Energy and the Measurement or Connected Global Scale or Universal Existential Interval of Matter, it is possible to arrive at a global formulation of the four 'forces' or fundamental interactions of nature. Einstein's golden dream is possible.
How Physics is Used in Video Games
ERIC Educational Resources Information Center
Bourg, David M.
2004-01-01
Modern video games use physics to achieve realistic behaviour and special effects. Everything from billiard balls, to flying debris, to tactical fighter jets is simulated in games using fundamental principles of dynamics. This article explores several examples of how physics is used in games. Further, this article describes some of the more…
NASA Astrophysics Data System (ADS)
Le Bellac, Michel
2006-03-01
Quantum physics allows us to understand the nature of the physical phenomena which govern the behavior of solids, semi-conductors, lasers, atoms, nuclei, subnuclear particles and light. In Quantum Physics, Le Bellac provides a thoroughly modern approach to this fundamental theory. Throughout the book, Le Bellac teaches the fundamentals of quantum physics using an original approach which relies primarily on an algebraic treatment and on the systematic use of symmetry principles. In addition to the standard topics such as one-dimensional potentials, angular momentum and scattering theory, the reader is introduced to more recent developments at an early stage. These include a detailed account of entangled states and their applications, the optical Bloch equations, the theory of laser cooling and of magneto-optical traps, vacuum Rabi oscillations, and an introduction to open quantum systems. This is a textbook for a modern course on quantum physics, written for advanced undergraduate and graduate students. Completely original and contemporary approach, using algebra and symmetry principles. Introduces recent developments at an early stage, including many topics that cannot be found in standard textbooks. Contains 130 physically relevant exercises.
An Absolute Phase Space for the Physicality of Matter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valentine, John S.
2010-12-22
We define an abstract and absolute phase space ("APS") for sub-quantum intrinsic wave states, in three axes, each mapping directly to a duality having a fundamental ontological basis. Many aspects of quantum physics emerge from the interaction algebra and a model deduced from principles of 'unique solvability' and 'identifiable entity', and we reconstruct previously abstract fundamental principles and phenomena from these new foundations. The physical model defines bosons as virtual continuous wave pairs in the APS, and fermions as real self-quantizing snapshots of those waves when simple conditions are met. The abstraction and physical model define a template for the constitution of all fermions, a template for all the standard fundamental bosons and their local interactions, in a common framework and compactified phase space for all forms of real matter and virtual vacuum energy, and a distinct algebra for observables and unobservables. To illustrate our scheme's potential, we provide examples of slit experiment variations (where the model finds a theoretical basis for interference only occurring between two final sources), QCD (where we may model most attributes known to QCD, and a new view on entanglement), and we suggest approaches for other varied applications. We believe this is a viable candidate for further exploration as a foundational proposition for physics.
Principles of Guided Missiles and Nuclear Weapons.
ERIC Educational Resources Information Center
Naval Personnel Program Support Activity, Washington, DC.
Fundamentals of missile and nuclear weapons systems are presented in this book which is primarily prepared as the second text of a three-volume series for students of the Navy Reserve Officers' Training Corps and the Officer Candidate School. Following an introduction to guided missiles and nuclear physics, basic principles and theories are…
Photoelectroconversion by Semiconductors: A Physical Chemistry Experiment.
ERIC Educational Resources Information Center
Fan, Qinbai; And Others
1995-01-01
Presents an experiment designed to give students some experience with photochemistry, electrochemistry, and basic theories about semiconductors. Uses a liquid-junction solar cell and illustrates some fundamental physical and chemical principles related to light and electricity interconversion as well as the properties of semiconductors. (JRH)
Learning Physics in a Water Park
ERIC Educational Resources Information Center
Cabeza, Cecilia; Rubido, Nicolás; Martí, Arturo C.
2014-01-01
Entertaining and educational experiments that can be conducted in a water park, illustrating physics concepts, principles and fundamental laws, are described. These experiments are suitable for students ranging from senior secondary school to junior university level. Newton's laws of motion, Bernoulli's equation, based on the conservation of…
Dynamic principle for ensemble control tools.
Samoletov, A; Vasiev, B
2017-11-28
Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
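As background for the kind of scheme being generalized, here is a minimal sketch of a conventional Langevin thermostat (not the principle proposed in the paper): underdamped Langevin dynamics for a 1D harmonic oscillator, whose stationary distribution is the canonical measure at temperature T. All parameter values are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(0)
kT, m, k_spring, gamma, dt = 1.0, 1.0, 1.0, 0.5, 1e-3   # illustrative units
x, v = 0.0, 0.0
v2 = []
for step in range(200_000):
    force = -k_spring * x
    # Euler-Maruyama step: conservative force + friction + thermal noise,
    # with noise strength fixed by the fluctuation-dissipation relation.
    v += (force / m - gamma * v) * dt + np.sqrt(2 * gamma * kT / m * dt) * rng.normal()
    x += v * dt
    if step > 20_000:            # discard equilibration transient
        v2.append(v * v)

print("measured <v^2> =", np.mean(v2), " expected kT/m =", kT / m)
```

The sampled kinetic energy should approach the equipartition value kT/m, which is the invariance-of-the-canonical-measure property that more elaborate thermostat schemes are designed to guarantee.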
NASA Astrophysics Data System (ADS)
Griffiths, Robert B.
2001-11-01
Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. Comprehensive account, written by one of the main figures in the field. Paperback edition of a successful work on the philosophy of quantum mechanics.
Sensors, Volume 1, Fundamentals and General Aspects
NASA Astrophysics Data System (ADS)
Grandke, Thomas; Ko, Wen H.
1996-12-01
'Sensors' is the first self-contained series to deal with the whole area of sensors. It describes general aspects, technical and physical fundamentals, construction, function, applications and developments of the various types of sensors. This volume deals with the fundamentals and common principles of sensors and covers the wide areas of principles, technologies, signal processing, and applications. Contents include: Sensor Fundamentals, e.g. Sensor Parameters, Modeling, Design and Packaging; Basic Sensor Technologies, e.g. Thin and Thick Films, Integrated Magnetic Sensors, Optical Fibres and Integrated Optics, Ceramics and Oxides; Sensor Interfaces, e.g. Signal Processing, Multisensor Signal Processing, Smart Sensors, Interface Systems; Sensor Applications, e.g. Automotive: On-board Sensors, Traffic Surveillance and Control, Home Appliances, Environmental Monitoring, etc. This volume is an indispensable reference work and textbook for both specialists and newcomers, researchers and developers.
ERIC Educational Resources Information Center
Lin, Shih-Yin; Singh, Chandralekha
2011-01-01
Learning physics requires understanding the applicability of fundamental principles in a variety of contexts that share deep features. One way to help students learn physics is via analogical reasoning. Students can be taught to make an analogy between situations that are more familiar or easier to understand and another situation where the same…
Solar-System Bodies as Teaching Tools in Fundamental Physics
NASA Astrophysics Data System (ADS)
Genus, Amelia; Overduin, James
2018-01-01
We show how asteroids can be used as teaching tools in fundamental physics. Current gravitational theory assumes that all bodies fall with the same acceleration in the same gravitational field. But this assumption, known as the Equivalence Principle, is violated to some degree in nearly all theories that attempt to unify gravitation with the other fundamental forces of nature. In such theories, bodies with different compositions can fall at different rates, producing small non-Keplerian distortions in their orbits. We focus on the unique all-metal asteroid 16 Psyche as a test case. Using Kepler’s laws of planetary motion together with recent observational data on the orbital motions of Psyche and its neighbors, students are able to derive new constraints on current theories in fundamental physics. These constraints take on particular interest since NASA has just announced plans to visit Psyche in 2026.
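A minimal sketch of the kind of Keplerian estimate students can make is given below; the orbital elements and the size of the hypothetical violation are illustrative assumptions, and this is not the specific analysis of the cited work. An equivalence-principle violation is modeled crudely as a fractional change delta in the effective gravitational parameter felt by the asteroid.

```python
import math

AU = 1.495978707e11          # m
GM_SUN = 1.32712440018e20    # m^3 s^-2, solar gravitational parameter
YEAR = 365.25 * 86400.0      # s

def orbital_period(a_m, delta=0.0):
    """Keplerian period for semi-major axis a_m; delta is a hypothetical
    fractional equivalence-principle violation (delta = 0 is standard gravity)."""
    return 2.0 * math.pi * math.sqrt(a_m**3 / (GM_SUN * (1.0 + delta)))

a_psyche = 2.92 * AU                         # approximate semi-major axis of 16 Psyche
T0 = orbital_period(a_psyche)
T1 = orbital_period(a_psyche, delta=1e-6)    # hypothetical composition-dependent violation

print(f"Keplerian period of 16 Psyche: {T0 / YEAR:.2f} yr")
print(f"Fractional period shift for delta = 1e-6: {(T1 - T0) / T0:.2e}")
```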
Other ways of measuring `Big G'
NASA Astrophysics Data System (ADS)
Rothleitner, Christian
2016-03-01
In 1798, the British scientist Henry Cavendish performed the first laboratory experiment to determine the gravitational force between two massive bodies. From his result, Newton's gravitational constant, G, was calculated. Cavendish's measurement principle was the torsion balance invented by John Michell some 15 years before. During the following two centuries, more than 300 new measurements followed. Although technology - and physics - developed rapidly during this time, surprisingly, most experiments were still based on the same principle. In fact, the most accurate determination of G to date is a measurement based on the torsion balance principle. Despite the fact that G was one of the first fundamental physical constants ever measured, and despite the huge number of experiments performed on it to this day, its CODATA recommended value still has the highest standard measurement uncertainty when compared to other fundamental physical constants. Even more serious is the fact that even measurements based on the same principle often do not overlap within their attributed standard uncertainties. It must be assumed that various experiments are subject to one or more unknown biases. In this talk I will present some alternative experimental setups to the torsion balance which have been performed or proposed to measure G. Although their estimated uncertainties are often higher than those of most torsion balance experiments, revisiting such ideas is worthwhile. Advances in technology could offer solutions to problems which were previously insurmountable; these solutions could result in lower measurement uncertainties. New measurement principles could also help to uncover hidden systematic effects.
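For orientation, the sketch below shows how G follows from the basic torsion-balance observables (oscillation period, equilibrium deflection, geometry). The numbers are made up but representative of a teaching-lab Cavendish apparatus, and the simple torque-balance formula neglects the few-percent correction from each large sphere attracting the far small ball.

```python
import math

d     = 0.050      # m, distance of each small ball from the rotation axis
r     = 0.0465     # m, center-to-center separation of large and small balls
M     = 1.5        # kg, mass of each large ball
T     = 240.0      # s, free-oscillation period of the torsion pendulum
theta = 1.35e-3    # rad, equilibrium deflection when the large masses are moved

# Torque balance (kappa * theta = 2 * G * m * M * d / r^2) with kappa = 4 pi^2 I / T^2
# and I = 2 m d^2 gives, after the small-ball mass m cancels:
G = 4 * math.pi**2 * d * theta * r**2 / (M * T**2)
print(f"G ~ {G:.3e} m^3 kg^-1 s^-2")   # ~6.7e-11 for these illustrative inputs
```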
Teaching symmetry in the introductory physics curriculum
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, C. T.; Lederman, L. M.
Modern physics is largely defined by fundamental symmetry principles and Noether's Theorem. Yet these are not taught, or are rarely mentioned, to beginning students, thus missing an opportunity to reveal that the subject of physics is as lively and contemporary as molecular biology, and as beautiful as the arts. We prescribe a week-long symmetry module to insert into the curriculum.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeng, Li; Jacobsen, Stein B., E-mail: astrozeng@gmail.com, E-mail: jacobsen@neodymium.harvard.edu
In the past few years, the number of confirmed planets has grown above 2000. It is clear that they represent a diversity of structures not seen in our own solar system. In addition to very detailed interior modeling, it is valuable to have a simple analytical framework for describing planetary structures. The variational principle is a fundamental principle in physics, entailing that a physical system follows the trajectory which minimizes its action. It is an alternative to the differential equation formulation of a physical system. Applying the variational principle to the planetary interior can beautifully summarize the set of differential equations into one, which provides us some insight into the problem. From this principle, a universal mass–radius relation, an estimate of the error propagation from the equation of state to the mass–radius relation, and a form of the virial theorem applicable to planetary interiors are derived.
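For context, the differential equations being repackaged are the standard interior-structure equations; the toy constant-density limit below is an illustrative check, not the universal relation derived in the cited paper.

```latex
% Hydrostatic equilibrium and mass continuity for a spherical planet:
\frac{dP}{dr} \;=\; -\,\frac{G\,m(r)\,\rho(r)}{r^{2}}, \qquad
\frac{dm}{dr} \;=\; 4\pi r^{2}\rho(r),
% closed by an equation of state P = P(\rho). In the crude constant-density
% limit, M = \tfrac{4}{3}\pi\rho R^{3}, i.e. R \propto M^{1/3}, the simplest
% form of a mass-radius relation.
```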
Computer-Based Tools for Inquiry in Undergraduate Classrooms: Results from the VGEE
NASA Astrophysics Data System (ADS)
Pandya, R. E.; Bramer, D. J.; Elliott, D.; Hay, K. E.; Mallaiahgari, L.; Marlino, M. R.; Middleton, D.; Ramamurhty, M. K.; Scheitlin, T.; Weingroff, M.; Wilhelmson, R.; Yoder, J.
2002-05-01
The Visual Geophysical Exploration Environment (VGEE) is a suite of computer-based tools designed to help learners connect observable, large-scale geophysical phenomena to underlying physical principles. Technologically, this connection is mediated by java-based interactive tools: a multi-dimensional visualization environment, authentic scientific data-sets, concept models that illustrate fundamental physical principles, and an interactive web-based work management system for archiving and evaluating learners' progress. Our preliminary investigations showed, however, that the tools alone are not sufficient to empower undergraduate learners; learners have trouble organizing inquiry and using the visualization tools effectively. To address these issues, the VGEE includes an inquiry strategy and scaffolding activities that are similar to strategies used successfully in K-12 classrooms. The strategy is organized around the steps: identify, relate, explain, and integrate. In the first step, students construct visualizations from data to try to identify salient features of a particular phenomenon. They compare their previous conceptions of a phenomenon to the data to examine their current knowledge and motivate investigation. Next, students use the multivariable functionality of the visualization environment to relate the different features they identified. Explain moves the learner temporarily outside the visualization to the concept models, where they explore fundamental physical principles. Finally, in integrate, learners use these fundamental principles within the visualization environment by literally placing the concept model within the visualization environment as a probe and watching it respond to larger-scale patterns. This capability, unique to the VGEE, addresses the disconnect that novice learners often experience between fundamental physics and observable phenomena. It also allows learners the opportunity to reflect on and refine their knowledge as well as anchor it within a context for long-term retention. We are implementing the VGEE in one of two otherwise identical entry-level atmospheric courses. In addition to comparing student learning and attitudes in the two courses, we are analyzing student participation with the VGEE to evaluate the effectiveness and usability of the VGEE. In particular, we seek to identify the scaffolding students need to construct physically meaningful multi-dimensional visualizations, and evaluate the effectiveness of the visualization-embedded concept-models in addressing inert knowledge. We will also examine the utility of the inquiry strategy in developing content knowledge, process-of-science knowledge, and discipline-specific investigatory skills. Our presentation will include video examples of student use to illustrate our findings.
39 Questionable Assumptions in Modern Physics
NASA Astrophysics Data System (ADS)
Volk, Greg
2009-03-01
The growing body of anomalies in new energy, low energy nuclear reactions, astrophysics, atomic physics, and entanglement, combined with the failure of the Standard Model and string theory to predict many of the most basic fundamental phenomena, all point to a need for major new paradigms. Not Band-Aids, but revolutionary new ways of conceptualizing physics, in the spirit of Thomas Kuhn's The Structure of Scientific Revolutions. This paper identifies a number of long-held, but unproven assumptions currently being challenged by an increasing number of alternative scientists. Two common themes, both with venerable histories, keep recurring in the many alternative theories being proposed: (1) Mach's Principle, and (2) toroidal, vortex particles. Matter-based Mach's Principle differs from both space-based universal frames and observer-based Einsteinian relativity. Toroidal particles, in addition to explaining electron spin and the fundamental constants, satisfy the basic requirement of Gauss's misunderstood B Law, that motion itself circulates. Though a comprehensive theory is beyond the scope of this paper, it will suggest alternatives to the long list of assumptions in context.
Evolution in students' understanding of thermal physics with increasing complexity
NASA Astrophysics Data System (ADS)
Langbeheim, Elon; Safran, Samuel A.; Livne, Shelly; Yerushalmi, Edit
2013-12-01
We analyze the development in students’ understanding of fundamental principles in the context of learning a current interdisciplinary research topic—soft matter—that was adapted to the level of high school students. The topic was introduced in a program for interested 11th grade high school students majoring in chemistry and/or physics, in an off-school setting. Soft matter was presented in a gradual increase in the degree of complexity of the phenomena as well as in the level of the quantitative analysis. We describe the evolution in students’ use of fundamental thermodynamics principles to reason about phase separation—a phenomenon that is ubiquitous in soft matter. In particular, we examine the impact of the use of free energy analysis, a common approach in soft matter, on the understanding of the fundamental principles of thermodynamics. The study used diagnostic questions and classroom observations to gauge the student’s learning. In order to gain insight on the aspects that shape the understanding of the basic principles, we focus on the responses and explanations of two case-study students who represent two trends of evolution in conceptual understanding in the group. We analyze changes in the two case studies’ management of conceptual resources used in their analysis of phase separation, and suggest how their prior knowledge and epistemological framing (a combination of their personal tendencies and their prior exposure to different learning styles) affect their conceptual evolution. Finally, we propose strategies to improve the instruction of these concepts.
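The free-energy analysis of phase separation mentioned above can be illustrated with a textbook regular-solution (Flory-Huggins-type) model; this is standard background, not necessarily the model used in the program described.

```latex
% Mixing free energy per site for composition \phi and interaction parameter \chi:
\frac{F(\phi)}{k_{B}T} \;=\; \phi\ln\phi + (1-\phi)\ln(1-\phi) + \chi\,\phi(1-\phi).
% The mixture is locally unstable (spinodal) where \partial^{2}F/\partial\phi^{2} < 0;
% for this model the miscibility gap first opens at \chi = 2, \phi = 1/2.
```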
Quantum Mechanics predicts evolutionary biology.
Torday, J S
2018-07-01
Nowhere are the shortcomings of conventional descriptive biology more evident than in the literature on Quantum Biology. In the on-going effort to apply Quantum Mechanics to evolutionary biology, merging Quantum Mechanics with the fundamentals of evolution as the First Principles of Physiology-namely negentropy, chemiosmosis and homeostasis-offers an authentic opportunity to understand how and why physics constitutes the basic principles of biology. Negentropy and chemiosmosis confer determinism on the unicell, whereas homeostasis constitutes Free Will because it offers a probabilistic range of physiologic set points. Similarly, on this basis several principles of Quantum Mechanics also apply directly to biology. The Pauli Exclusion Principle is both deterministic and probabilistic, whereas non-localization and the Heisenberg Uncertainty Principle are both probabilistic, providing the long-sought after ontologic and causal continuum from physics to biology and evolution as the holistic integration recognized as consciousness for the first time. Copyright © 2018 Elsevier Ltd. All rights reserved.
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-01-01
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-04-05
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.
Algorithms and Array Design Criteria for Robust Imaging in Interferometry
2016-04-01
Chapter 1 Fundamentals of Optical Interferometry 1.1 Chapter Overview In this chapter, we introduce the physics-based principles of optical...particular physical structure (i.e. the existence of a certain type of loop in the interferometric graph), and provide a simple algorithm for... physical condition on aperture placement is more intuitive when considering the raw phase measurements as opposed to their closures. For this reason
Introduction to the Theory of Atmospheric Radiative Transfer
NASA Technical Reports Server (NTRS)
Buglia, J. J.
1986-01-01
The fundamental physical and mathematical principles governing the transmission of radiation through the atmosphere are presented, with emphasis on the scattering of visible and near-IR radiation. The classical two-stream, thin-atmosphere, and Eddington approximations, along with some of their offspring, are developed in detail, along with the discrete ordinates method of Chandrasekhar. The adding and doubling methods are discussed from basic principles, and references for further reading are suggested.
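The approximations named above all start from the plane-parallel radiative transfer equation; the form below is the standard one, stated here only as orientation.

```latex
% Azimuthally averaged, plane-parallel transfer equation (\tau = optical depth,
% \mu = \cos\theta, \omega_0 = single-scattering albedo, B = thermal source):
\mu\,\frac{dI(\tau,\mu)}{d\tau} \;=\; I(\tau,\mu)
  \;-\; \frac{\omega_{0}}{2}\int_{-1}^{1} P(\mu,\mu')\,I(\tau,\mu')\,d\mu'
  \;-\; (1-\omega_{0})\,B(\tau).
% Two-stream and Eddington approximations replace the angular dependence by two
% discrete directions or by the linear expansion I \approx I_{0} + \mu I_{1}.
```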
Equivalence of the Kelvin-Planck statement of the second law and the principle of entropy increase
NASA Astrophysics Data System (ADS)
Sarasua, L. G.; Abal, G.
2016-09-01
We present a demonstration of the equivalence between the Kelvin-Planck statement of the second law and the principle of entropy increase. Despite the fundamental importance of these two statements, a rigorous treatment to establish their equivalence is missing in standard physics textbooks. The argument is valid under very general conditions, but is simple and suited to an undergraduate course.
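One direction of the equivalence is easy to sketch (the converse needs a separate argument); the lines below are illustrative and follow the usual textbook reasoning rather than reproducing the demonstration in the paper.

```latex
% Suppose a cyclic device violated the Kelvin-Planck statement: in one cycle it
% extracts heat Q > 0 from a single reservoir at temperature T and delivers work W = Q.
% The device returns to its initial state, so \Delta S_{\rm device} = 0 and
\Delta S_{\rm tot} \;=\; \Delta S_{\rm reservoir} \;=\; -\,\frac{Q}{T} \;<\; 0,
% contradicting the principle of entropy increase.
```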
Radiological Protection in Medicine; OCHRONA RADIOLOGICZNA W MEDYCYNIE
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1961-01-01
A handbook is presented for the application of nuclear phenomena and techniques to medical diagnostics and treatment. A large portion is devoted to fundamental nuclear chemistry and physics, paying special attention to tracer techniques. In addition to principles of dosimetry, the government regulations applicable to medical exposures are described, together with a survey of the field of health physics. (TTT)
Nondestructive Electromagnetic Characterization of Uniaxial Materials
2014-09-18
architecture and the chemical composition.” It is also well-understood that metamaterials are man-made materials which possess physical characteristics...Negative-refraction metamaterials: fundamental principles and applications. Wiley-IEEE Press, 2005. [38] Engen, Glenn F and Cletus A Hoer. “Thru-reflect...
Soft matter food physics--the physics of food and cooking.
Vilgis, Thomas A
2015-12-01
This review discusses the (soft matter) physics of food. Although food is generally not considered as a typical model system for fundamental (soft matter) physics, a number of basic principles can be found in the interplay between the basic components of foods, water, oil/fat, proteins and carbohydrates. The review starts with the introduction and behavior of food-relevant molecules and discusses food-relevant properties and applications from their fundamental (multiscale) behavior. Typical food aspects from 'hard matter systems', such as chocolates or crystalline fats, to 'soft matter' in emulsions, dough, pasta and meat are covered and can be explained on a molecular basis. An important conclusion is the point that the macroscopic properties and the perception are defined by the molecular interplay on all length and time scales.
NASA Astrophysics Data System (ADS)
Connolly, Joseph W.
The bicycle is a common, yet unique mechanical contraption in our world. In spite of this, the bike's physical and mechanical principles are understood by a select few. You do not have to be a genius to join this small group of people who understand the physics of cycling. This is your guide to fundamental principles (such as Newton's laws) and the book provides intuitive, basic explanations for the bicycle's behaviour. Each concept is introduced and illustrated with simple, everyday examples. Although cycling is viewed by most as a fun activity, and almost everyone acquires the basic skills at a young age, few understand the laws of nature that give magic to the ride. This is a closer look at some of these fun, exhilarating, and magical aspects of cycling. In the reading, you will also understand other physical principles such as motion, force, energy, power, heat, and temperature.
High sensitivity test of the Pauli Exclusion Principle for electrons with X-ray spectroscopy (VIP2)
NASA Astrophysics Data System (ADS)
Marton, Johann; VIP2 Collaboration
2015-10-01
The Pauli Exclusion Principle (PEP) is one of the most fundamental rules in physics and it has various important consequences ranging from atomic and subatomic systems to the stability of matter and stellar objects like neutron stars. Many observations imply that this rule must be valid to an extremely high degree, and consequently no violations have been found so far. On the other hand, a simple explanation of the PEP is still missing. Many experimental investigations based on different assumptions were performed to search for a tiny PEP violation in various systems. The experiment VIP2 at the Gran Sasso underground laboratory (LNGS of INFN) is designed to test the PEP for electrons with high sensitivity by searching for forbidden X-ray transitions in copper atoms. This experiment aims to improve the PEP violation limit obtained with our preceding experiment VIP by orders of magnitude. The experimental method, a comparison of the VIP result with different PEP searches, and the present status of the VIP2 experiment will be presented. We acknowledge support from the HadronPhysics FP6 (506078), HadronPhysics2 FP7 (227431), and HadronPhysics3 (283286) projects, the EU COST Action 1006 (Fundamental Problems in Quantum Physics) and the Austrian Science Fund (FWF).
Relevance of physics to the pharmacy major.
McCall, Richard P
2007-08-15
To offer a physics course that is relevant to pharmacy students, yet still contains many of the fundamental principles of physics. The course was modified over a period of several years to include activities and examples that were related to other courses in the curriculum. Course evaluations were given to assess student attitudes about the importance of physics in the pharmacy curriculum. Students' attitudes have changed over time to appreciate the role that physics plays in their studies. Students gained confidence in their ability to learn in other courses.
Exact symmetries in the velocity fluctuations of a hot Brownian swimmer
NASA Astrophysics Data System (ADS)
Falasco, Gianmaria; Pfaller, Richard; Bregulla, Andreas P.; Cichos, Frank; Kroy, Klaus
2016-09-01
Symmetries constrain dynamics. We test this fundamental physical principle, experimentally and by molecular dynamics simulations, for a hot Janus swimmer operating far from thermal equilibrium. Our results establish scalar and vectorial steady-state fluctuation theorems and a thermodynamic uncertainty relation that link the fluctuating particle current to its entropy production at an effective temperature. A Markovian minimal model elucidates the underlying nonequilibrium physics.
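For readers unfamiliar with the terminology, a steady-state fluctuation theorem has the generic large-time symmetry written below; this is an illustrative statement of the symmetry class being tested, not the specific scalar and vectorial relations established in the paper.

```latex
% Entropy production \Sigma_t (in units of k_B, evaluated at the relevant
% effective temperature) accumulated over time t:
\lim_{t\to\infty}\frac{1}{t}\,
\ln\frac{P(\Sigma_{t}/t = +\sigma)}{P(\Sigma_{t}/t = -\sigma)} \;=\; \sigma .
```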
Microgravity: A Teacher's Guide with Activities in Science, Mathematics, and Technology
NASA Technical Reports Server (NTRS)
Rogers, Melissa J.B.; Vogt, Gregory L.; Wargo, Michael J.
1997-01-01
Microgravity is the subject of this teacher's guide. This publication identifies the underlying mathematics, physics, and technology principles that apply to microgravity. The topics included in this publication are: 1) Microgravity Science Primer; 2) The Microgravity Environment of Orbiting Spacecraft; 3) Biotechnology; 4) Combustion Science; 5) Fluid Physics; 6) Fundamental Physics; 7) Materials Science; 8) Microgravity Research and Exploration; and 9) Microgravity Science Space Flights. This publication also contains a glossary of selected terms.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-22
... Part IV The President Executive Order 13559--Fundamental Principles and Policymaking Criteria for... Fundamental Principles and Policymaking Criteria for Partnerships With Faith-Based and Other Neighborhood... the following: ``Sec. 2. Fundamental Principles. In formulating and implementing policies that have...
A Curriculum Guide for Electricity/Electronics.
ERIC Educational Resources Information Center
Rouse, Bill, Comp.
This curriculum guide is designed to upgrade the secondary electrical trades program in Mississippi by broadening its scope to incorporate basic electronic principles. Covered in the individual chapters of the guide are the following courses: basic electricity (occupational information, basic physics, circuit fundamentals, resistance and Ohm's…
ERIC Educational Resources Information Center
Trauth-Nare, Amy; Pavilonis, Amy; Paganucci, Julia; Ciabattoni, Gemma; Buckley, Jenni
2016-01-01
"Mechanics" is a branch of engineering and physics that deals with forces and motion, and its fundamental principles apply to all objects, whether a bouncing ball, flowing stream, bicycle, or the human body. The field of "biomechanics" applies mechanics concepts specifically to the bodies of humans (and other animals).…
Statistical Physics Approaches to Microbial Ecology
NASA Astrophysics Data System (ADS)
Mehta, Pankaj
The unprecedented ability to quantitatively measure and probe complex microbial communities has renewed interest in identifying the fundamental ecological principles governing community ecology in microbial ecosystems. Here, we present work from our group and others showing how ideas from statistical physics can help us uncover these ecological principles. Two major lessons emerge from this work. First, large ecosystems with many species often display new, emergent ecological behaviors that are absent in small ecosystems with just a few species. To paraphrase Nobel laureate Phil Anderson, ''More is Different'', especially in community ecology. Second, the lack of trophic layer separation in microbial ecology fundamentally distinguishes microbial ecology from classical paradigms of community ecology and leads to qualitatively different rules for community assembly in microbes. I illustrate these ideas using both theoretical modeling and novel experiments on large microbial ecosystems performed by our collaborators (Joshua Goldford and Alvaro Sanchez). Work supported by Simons Investigator in MMLS and NIH R35 GM119461.
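A minimal example of the kind of many-species model often treated with statistical-physics tools in this setting is a MacArthur-style consumer-resource system; the equations below are illustrative background, not a specific model from the work described.

```latex
% Species abundances N_i consuming resources R_a with uptake rates c_{ia}:
\frac{dN_{i}}{dt} \;=\; N_{i}\Bigl(\sum_{a} c_{ia} R_{a} \;-\; m_{i}\Bigr), \qquad
\frac{dR_{a}}{dt} \;=\; K_{a} \;-\; R_{a}\sum_{i} c_{ia} N_{i}.
% With many species and random c_{ia}, community-level observables become
% self-averaging and show emergent behavior absent in few-species systems.
```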
Perlovsky, Leonid I
2016-01-01
Is it possible to turn psychology into "hard science"? Physics of the mind follows the fundamental methodology of physics in all areas where physics has been developed. What is common among Newtonian mechanics, statistical physics, quantum physics, thermodynamics, theory of relativity, astrophysics… and a theory of superstrings? What is common among all areas of physics is the methodology of physics discussed in the first few lines of the paper. Is physics of the mind possible? Is it possible to describe the mind based on a few first principles, as physics does? The mind with its variabilities and uncertainties, the mind from perception and elementary cognition to emotions and abstract ideas, to high cognition. Is it possible to turn psychology and neuroscience into "hard" sciences? The paper discusses established first principles of the mind, their mathematical formulations, and a mathematical model of the mind derived from these first principles, covering mechanisms of concepts, emotions, instincts, behavior, language, cognition, intuitions, the conscious and unconscious, abilities for symbols, and the functions of the beautiful and musical emotions in cognition and evolution. Some of the theoretical predictions have been experimentally confirmed. This research won national and international awards. In addition to summarizing existing results, the paper describes new theoretical and experimental developments. The paper discusses unsolved theoretical problems as well as experimental challenges for future research.
Perlovsky, Leonid I.
2016-01-01
Is it possible to turn psychology into “hard science”? Physics of the mind follows the fundamental methodology of physics in all areas where physics has been developed. What is common among Newtonian mechanics, statistical physics, quantum physics, thermodynamics, theory of relativity, astrophysics… and a theory of superstrings? What is common among all areas of physics is the methodology of physics discussed in the first few lines of the paper. Is physics of the mind possible? Is it possible to describe the mind based on a few first principles, as physics does? The mind with its variabilities and uncertainties, the mind from perception and elementary cognition to emotions and abstract ideas, to high cognition. Is it possible to turn psychology and neuroscience into “hard” sciences? The paper discusses established first principles of the mind, their mathematical formulations, and a mathematical model of the mind derived from these first principles, covering mechanisms of concepts, emotions, instincts, behavior, language, cognition, intuitions, the conscious and unconscious, abilities for symbols, and the functions of the beautiful and musical emotions in cognition and evolution. Some of the theoretical predictions have been experimentally confirmed. This research won national and international awards. In addition to summarizing existing results, the paper describes new theoretical and experimental developments. The paper discusses unsolved theoretical problems as well as experimental challenges for future research. PMID:27895558
NASA Astrophysics Data System (ADS)
Lowrie, William
1997-10-01
This unique textbook presents a comprehensive overview of the fundamental principles of geophysics. Unlike most geophysics textbooks, it combines both the applied and theoretical aspects to the subject. The author explains complex geophysical concepts using abundant diagrams, a simplified mathematical treatment, and easy-to-follow equations. After placing the Earth in the context of the solar system, he describes each major branch of geophysics: gravitation, seismology, dating, thermal and electrical properties, geomagnetism, paleomagnetism and geodynamics. Each chapter begins with a summary of the basic physical principles, and a brief account of each topic's historical evolution. The book will satisfy the needs of intermediate-level earth science students from a variety of backgrounds, while at the same time preparing geophysics majors for continued study at a higher level.
NASA Astrophysics Data System (ADS)
Fracassini, Massimo; Pasinetti Fracassini, Laura E.; Pasinetti, Antonio L.
1988-07-01
The Anthropic Principle, a new trend in modern cosmology, claims that the origin of life and the development of intelligent beings on the Earth is the result of highly selective biological processes, strictly tuned to the fundamental physical characteristics of the Universe. This principle could account for the failure of some programs of the search for extraterrestrial intelligence (SETI) and suggests the search for strict solar analogs as a primary target for SETI strategies. In this connection, the authors have selected 22 solar analogs and discussed their choice.
Exercise Physiology. Basic Stuff Series I. I.
ERIC Educational Resources Information Center
Svoboda, Milan; And Others
The fundamentals of exercise physiology (the study of the physiological effects of bodily exertion) form the basis for this booklet designed for teachers of physical education. The scientific principles underlying the building of muscular strength and flexibility are described and illustrated. Topics covered include: (1) muscular strength,…
NASA Technical Reports Server (NTRS)
Nilsson, Per-Olof (Editor); Nordgren, Joseph (Editor)
1987-01-01
The interactions of VUV radiation with solids are explored in reviews and reports of recent theoretical and experimental investigations from the fields of atomic and molecular physics, solid-state physics, and VUV instrumentation. Topics examined include photoabsorption and photoionization, multiphoton processes, plasma physics, VUV lasers, time-resolved spectroscopy, synchrotron radiation centers, solid-state spectroscopy, and dynamical processes involving localized levels. Consideration is given to the fundamental principles of photoemission, spin-polarized photoemission, inverse photoemission, semiconductors, organic materials, and adsorbates.
Quantum enhanced feedback cooling of a mechanical oscillator using nonclassical light.
Schäfermeier, Clemens; Kerdoncuff, Hugo; Hoff, Ulrich B; Fu, Hao; Huck, Alexander; Bilek, Jan; Harris, Glen I; Bowen, Warwick P; Gehring, Tobias; Andersen, Ulrik L
2016-11-29
Laser cooling is a fundamental technique used in primary atomic frequency standards, quantum computers, quantum condensed matter physics and tests of fundamental physics, among other areas. It has been known since the early 1990s that laser cooling can, in principle, be improved by using squeezed light as an electromagnetic reservoir, while quantum feedback control using a squeezed light probe is also predicted to allow improved cooling. Here we show the implementation of quantum feedback control of a micro-mechanical oscillator using squeezed probe light. This allows quantum-enhanced feedback cooling with a measurement rate greater than is possible with classical light, and a consequent reduction in the final oscillator temperature. Our results have significance for future applications in areas ranging from quantum information networks to quantum-enhanced force and displacement measurements and fundamental tests of macroscopic quantum mechanics.
Soft matter food physics—the physics of food and cooking
NASA Astrophysics Data System (ADS)
Vilgis, Thomas A.
2015-12-01
This review discusses the (soft matter) physics of food. Although food is generally not considered as a typical model system for fundamental (soft matter) physics, a number of basic principles can be found in the interplay between the basic components of foods, water, oil/fat, proteins and carbohydrates. The review starts with the introduction and behavior of food-relevant molecules and discusses food-relevant properties and applications from their fundamental (multiscale) behavior. Typical food aspects from ‘hard matter systems’, such as chocolates or crystalline fats, to ‘soft matter’ in emulsions, dough, pasta and meat are covered and can be explained on a molecular basis. An important conclusion is the point that the macroscopic properties and the perception are defined by the molecular interplay on all length and time scales.
Atom Interferometry with Ultracold Quantum Gases in a Microgravity Environment
NASA Astrophysics Data System (ADS)
Williams, Jason; D'Incao, Jose; Chiow, Sheng-Wey; Yu, Nan
2015-05-01
Precision atom interferometers (AI) in space promise exciting technical capabilities for fundamental physics research, with proposals including unprecedented tests of the weak equivalence principle, precision measurements of the fine structure and gravitational constants, and detection of gravity waves and dark energy. Consequently, multiple AI-based missions have been proposed to NASA, including a dual-atomic-species interferometer that is to be integrated into the Cold Atom Laboratory (CAL) onboard the International Space Station. In this talk, I will discuss our plans and preparation at JPL for the proposed flight experiments to use the CAL facility to study the leading-order systematics expected to corrupt future high-precision measurements of fundamental physics with AIs in microgravity. The project centers on the physics of pairwise interactions and molecular dynamics in these quantum systems as a means to overcome uncontrolled shifts associated with the gravity gradient and few-particle collisions. We will further utilize the CAL AI for proof-of-principle tests of systematic mitigation and phase-readout techniques for use in the next-generation of precision metrology experiments based on AIs in microgravity. This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.
Fast Pitch Softball, Physical Education: 5551.10.
ERIC Educational Resources Information Center
Wilkie, Betty
This course outline is a guide for teaching the principles and fundamentals of softball in grades 7-12. The course format includes lectures, skills practice, films, game situations, class tournaments, and tests that focus on mastery of skills, understanding rules, development of techniques using team strategy and tactics, and class competition.…
Fluid Power, Rate Training Manual.
ERIC Educational Resources Information Center
Bureau of Naval Personnel, Washington, DC.
Fundamentals of hydraulics and pneumatics are presented in this manual, prepared for regular navy and naval reserve personnel who are seeking advancement to Petty Officer Third Class. The history of applications of compressed fluids is described in connection with physical principles. Selection of types of liquids and gases is discussed with a…
Thermodynamical Arguments against Evolution
ERIC Educational Resources Information Center
Rosenhouse, Jason
2017-01-01
The argument that the second law of thermodynamics contradicts the theory of evolution has recently been revived by anti-evolutionists. In its basic form, the argument asserts that whereas evolution implies that there has been an increase in biological complexity over time, the second law, a fundamental principle of physics, shows this to be…
Pandey, Shilpa; Hakky, Michael; Kwak, Ellie; Jara, Hernan; Geyer, Carl A; Erbay, Sami H
2013-05-01
Neurovascular imaging studies are routinely used for the assessment of headaches and changes in mental status, stroke workup, and evaluation of the arteriovenous structures of the head and neck. These imaging studies are being performed with greater frequency as the aging population continues to increase. Magnetic resonance (MR) angiographic imaging techniques are helpful in this setting. However, mastering these techniques requires an in-depth understanding of the basic principles of physics, complex flow patterns, and the correlation of MR angiographic findings with conventional MR imaging findings. More than one imaging technique may be used to solve difficult cases, with each technique contributing unique information. Unfortunately, incorporating findings obtained with multiple imaging modalities may add to the diagnostic challenge. To ensure diagnostic accuracy, it is essential that the radiologist carefully evaluate the details provided by these modalities in light of basic physics principles, the fundamentals of various imaging techniques, and common neurovascular imaging pitfalls. ©RSNA, 2013.
The Role of Geophysics/Geology in the Environmental Discourse
NASA Astrophysics Data System (ADS)
Pfannkuch, H. O.
2013-12-01
Environmental problems are created by interaction between the Anthroposphere and the Geosphere. The principles and laws governing their behavior and interaction have to be fully understood to properly address environmental problems. A particular problem arises from inadequate communication between practitioners and/or decision makers in each sphere. A perfect analysis or solution in the Geosphere based solely on geophysical and geochemical principles will go nowhere if institutional and socio-economic principles are ignored; conversely, measures in the Anthroposphere, no matter how well grounded in socio-economic relations, will fail if they violate basic laws of physics. Two conceptual representations of the environmental system are used. The first is the Noösphere, with three interacting domains: Physical, Institutional, and Symbolic. It is where environmental problems arise and where decisions are made and implemented. The Physical Domain comprises physical, chemical, biological, and geosphere realities; its problems are treated by the scientific method. The Institutional Domain, with economic, sociological, administrative, and political institutions, solves problems by negotiation (vote, ballot). The elements of the Symbolic Domain (spiritual, moral, religious, and esthetic principles) are revealed. All domains are intimately connected and interdependent; activity in one affects the state of the others. A particularly strong and effective interactive relation exists between the Physical and the Institutional domains with regard to environmental problem definition, analysis, and resolution. The second representation is the hierarchic structure of the interaction pyramid. Geosphere, Biosphere, and Anthroposphere are open systems organized in successive levels forming a pyramid, with the Geosphere at the bottom and the Anthroposphere at the top. One fundamental attribute is that level (n) is limited by the restrictions obtaining in level (n-1), and conversely, level (n) represents the restrictions and limitations for level (n+1). In the environmental discourse this structural aspect is often overlooked, which leads to two major sets of fallacies: 1. Discourse takes place across hierarchic levels with the unstated assumption that, from the viewpoint of level (n), the same conditions, rules, equations, and models hold in level (n-1) as on level (n) and are similarly compatible and follow the same rules. This leads to misunderstanding or misrepresentation of what analysis, modeling, and solution methods would be appropriate at that level. 2. The fact that the bottom Geosphere level is the base onto which all other levels, including the topmost Anthroposphere, are stacked is ignored; each of the successive layers is restricted by the limitations of the Geosphere layer. Institutional and physical-scientific actors both have to realize that solutions or redress are not solely within their domain. No matter what the economic or socio-political preferences might be, they cannot be implemented by violating fundamental physical, geological, and geo-ecological principles; nor can the physical world ignore currently acceptable principles of the institutional and symbolic realities. The role of geophysics/geology in the environmental discourse is to clearly state and apply the physical and thermodynamic principles to the Geosphere and Noösphere.
47 CFR 36.2 - Fundamental principles underlying procedures.
Code of Federal Regulations, 2010 CFR
2010-10-01
(a) The following general principles underlie the procedures... operating forces on a unit basis (e.g., conversation-minute-kilometers per message, weighted standard work...).
Scaling laws for ignition at the National Ignition Facility from first principles.
Cheng, Baolian; Kwan, Thomas J T; Wang, Yi-Ming; Batha, Steven H
2013-10-01
We have developed an analytical physics model from fundamental physics principles and used the reduced one-dimensional model to derive a thermonuclear ignition criterion and implosion energy scaling laws applicable to inertial confinement fusion capsules. The scaling laws relate the fuel pressure and the minimum implosion energy required for ignition to the peak implosion velocity and the equation of state of the pusher and the hot fuel. When a specific low-entropy adiabat path is used for the cold fuel, our scaling laws recover the ignition threshold factor dependence on the implosion velocity, but when a high-entropy adiabat path is chosen, the model agrees with recent measurements.
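For orientation only (the exponents below are widely quoted Lindl/Herrmann-type hydrodynamic estimates, not the specific result of the abstract above), ignition scaling laws of this kind typically express the minimum implosion energy as a steep function of peak implosion velocity at fixed fuel adiabat α:

$$E_{\min} \;\propto\; \alpha^{\,\approx 1.8}\; v_{\mathrm{imp}}^{\,\approx -6},$$

so that modest shortfalls in implosion velocity translate into large increases in the energy required for ignition.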
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freitag, Mark A.
2001-12-31
The major title of this dissertation, 'From first principles,' is a phrase often heard in the study of thermodynamics and quantum mechanics. These words embody a powerful idea in the physical sciences; namely, that it is possible to distill the complexities of nature into a set of simple, well-defined mathematical laws from which specific relations can then be derived. In thermodynamics, these fundamental laws are immediately familiar to the physical scientist by their numerical order: the First, Second and Third Laws. However, the subject of the present volume is quantum mechanics, specifically non-relativistic quantum mechanics, which is appropriate for most systems of chemical interest.
Hulteen, Ryan M; Morgan, Philip J; Barnett, Lisa M; Stodden, David F; Lubans, David R
2018-03-09
Evidence supports a positive association between competence in fundamental movement skills (e.g., kicking, jumping) and physical activity in young people. Whilst important, fundamental movement skills do not reflect the broad diversity of skills utilized in physical activity pursuits across the lifespan. Debate surrounds the question of what are the most salient skills to be learned which facilitate physical activity participation across the lifespan. In this paper, it is proposed that the term 'fundamental movement skills' be replaced with 'foundational movement skills'. The term 'foundational movement skills' better reflects the broad range of movement forms that increase in complexity and specificity and can be applied in a variety of settings. Thus, 'foundational movement skills' includes both traditionally conceptualized 'fundamental' movement skills and other skills (e.g., bodyweight squat, cycling, swimming strokes) that support physical activity engagement across the lifespan. A proposed conceptual model outlines how foundational movement skill competency can provide a direct or indirect pathway, via specialized movement skills, to a lifetime of physical activity. Foundational movement skill development is hypothesized to vary according to culture and/or geographical location. Further, skill development may be hindered or enhanced by physical (i.e., fitness, weight status) and psychological (i.e., perceived competence, self-efficacy) attributes. This conceptual model may advance the application of motor development principles within the public health domain. Additionally, it promotes the continued development of human movement in the context of how it leads to skillful performance and how movement skill development supports and maintains a lifetime of physical activity engagement.
Adaptation of Acoustic Model Experiments of STM via Smartphones and Tablets
ERIC Educational Resources Information Center
Thees, Michael; Hochberg, Katrin; Kuhn, Jochen; Aeschlimann, Martin
2017-01-01
The importance of Scanning Tunneling Microscopy (STM) in today's research and industry leads to the question of how to include such a key technology in physics education. Manfred Euler has developed an acoustic model experiment to illustrate the fundamental measuring principles based on an analogy between quantum mechanics and acoustics. Based on…
Quantum information aspects of noncommutative quantum mechanics
NASA Astrophysics Data System (ADS)
Bertolami, Orfeu; Bernardini, Alex E.; Leal, Pedro
2018-01-01
Some fundamental aspects related with the construction of Robertson-Schrödinger-like uncertainty-principle inequalities are reported in order to provide an overall description of quantumness, separability and nonlocality of quantum systems in the noncommutative phase-space. Some consequences of the deformed noncommutative algebra are also considered in physical systems of interest.
We Better Not Vote on It: Public Hostility toward Freedom of Expression.
ERIC Educational Resources Information Center
Kane, Peter E.
Articles of The Bill of Rights, although comprising the fundamental principles of American society, are often opposed by many people on varying grounds. For example, many people support physical abuses by law enforcement officials, even though they might violate constitutional rights. The First Amendment, simple in original wording, has resulted…
Modeling the Compact Disc Read System in Lab
ERIC Educational Resources Information Center
Hinaus, Brad; Veum, Mick
2009-01-01
One of the great, engaging aspects of physics is its application to everyday technology. The compact disc player is an example of one such technology that applies fundamental principles from optics in order to efficiently store and quickly retrieve information. We have created a lab in which students use simple optical components to assemble a…
Incoherent scatter radar observations of the ionosphere
NASA Technical Reports Server (NTRS)
Hagfors, Tor
1989-01-01
Incoherent scatter radar (ISR) has become the most powerful means of studying the ionosphere from the ground. Many of the ideas and methods underlying the troposphere and stratosphere (ST) radars have been taken over from ISR. Whereas the theory of refractive index fluctuations in the lower atmosphere, depending as it does on turbulence, is poorly understood, the theory of the refractivity fluctuations in the ionosphere, which depend on thermal fluctuations, is known in great detail. The underlying theory is one of the most successful theories in plasma physics, and allows for many detailed investigations of a number of parameters such as electron density, electron temperature, ion temperature, electron mean velocity, and ion mean velocity, as well as parameters pertaining to composition, neutral density and others. Here, the author reviews the fundamental processes involved in the scattering from a plasma undergoing thermal or near-thermal fluctuations in density, relating the fundamental scattering properties of the plasma to the physical parameters characterizing it from first principles. He does not discuss the observation process itself, as the observational principles are quite similar whether they are applied to a neutral gas or a fluctuating plasma.
Levitated Optomechanics for Fundamental Physics
NASA Astrophysics Data System (ADS)
Rashid, Muddassar; Bateman, James; Vovrosh, Jamie; Hempston, David; Ulbricht, Hendrik
2015-05-01
Optomechanics with levitated nano- and microparticles is believed to form a platform for testing fundamental principles of quantum physics, as well as to find applications in sensing. We will report on a new scheme to trap nanoparticles, which is based on a parabolic mirror with a numerical aperture of 1. Combined with achromatic focussing, the setup is a cheap and straightforward solution for trapping nanoparticles for further study. Here, we report on the latest progress made in experimentation with levitated nanoparticles; this includes the trapping of 100 nm nanodiamonds (with NV-centres) down to 1 mbar as well as the trapping of 50 nm silica spheres down to 10⁻⁴ mbar without any form of feedback cooling. We will also report on progress towards implementing feedback stabilisation of the centre-of-mass motion of the trapped particle using digital electronics. Finally, we argue that such a stabilised particle trap can be the particle source for a nanoparticle matterwave interferometer. We will present our Talbot interferometer scheme, which holds promise to test the quantum superposition principle in the new mass range of 10⁶ amu. EPSRC, John Templeton Foundation.
Ancient Cosmology, superfine structure of the Universe and Anthropological Principle
NASA Astrophysics Data System (ADS)
Arakelyan, Hrant; Vardanyan, Susan
2015-07-01
Modern cosmology, in its spirit and in its conception of the Big Bang, is closer to ancient cosmology than to the cosmological paradigm of the XIX century. Repeating the speculations of the ancients, but using at the same time subtle mathematical methods and relying on the steadily accumulating empirical material, the modern theory tends towards a quantitative description of nature, in which an increasing role is played by the numerical ratios between the physical constants. Detailed analysis of the influence of the numerical values of physical quantities on the physical state of the universe revealed amazing relations called fine and hyperfine tuning. In order to explain why the observable universe corresponds to a certain set of interrelated fundamental parameters, a speculative anthropic principle was proposed, which focuses on the fact of the existence of sentient beings.
OBPR Product Lines, Human Research Initiative, and Physics Roadmap for Exploration
NASA Technical Reports Server (NTRS)
Israelsson, Ulf
2004-01-01
The pace of change has increased at NASA. OBPR's focus is now on the human interface as it relates to the new Exploration vision. The fundamental physics community must demonstrate how we can contribute. Many opportunities exist for physicists to participate in addressing NASA's cross-disciplinary exploration challenges: a) physicists can contribute to elucidating basic operating principles for complex biological systems; b) physics technologies can contribute to developing miniature sensors and systems required for manned missions to Mars. NASA Codes other than OBPR may be viable sources of funding for physics research.
Exploring the Invisible Universe: From Black Holes to Superstrings
NASA Astrophysics Data System (ADS)
Baaquie, Belal E.; Willeboordse, Frederick H.
2015-03-01
The book is written for a broad scientific audience with an interest in the leading theories about the Universe. The focus is on the physical Universe, and the laws of Physics are taken to be the guiding light in all our analysis. Starting from first principles and using self-evident reasoning, all the fundamental ideas that are employed in exploring the hidden and invisible realms of the Universe are shown to arise quite naturally, once one adopts the outlook that has come to light with the advances in Physics...
NASA Astrophysics Data System (ADS)
Faber, T. E.
1995-08-01
This textbook provides an accessible and comprehensive account of fluid dynamics that emphasizes fundamental physical principles and stresses connections with other branches of physics. Beginning with a basic introduction, the book goes on to cover many topics not typically treated in texts, such as compressible flow and shock waves, sound attenuation and bulk viscosity, solitary waves and ship waves, thermal convection, instabilities, turbulence, and the behavior of anisotropic, non-Newtonian and quantum fluids. Undergraduate or graduate students in physics or engineering who are taking courses in fluid dynamics will find this book invaluable.
Statistical Analysis of Physiological Signals
NASA Astrophysics Data System (ADS)
Ruiz, María G.; Pérez, Leticia
2003-07-01
In spite of two hundred years of clinical practice, Homeopathy still lacks a scientific basis. Its fundamental laws, the similia principle and the activity of the so-called ultra-high dilutions, are controversial issues that fit neither into mainstream medicine nor into the current physical-chemistry field. Aside from its clinical efficacy, the identification of physical-chemistry parameters as markers of the homeopathic effect would allow the construction of mathematical models [1], which in turn could provide clues regarding the involved mechanism.
A New Type of Atom Interferometry for Testing Fundamental Physics
NASA Astrophysics Data System (ADS)
Lorek, Dennis; Lämmerzahl, Claus; Wicht, Andreas
We present a new type of atom interferometer (AI) that provides a tool for ultra-high precision tests of fundamental physics. As an example we present how an AI based on highly charged hydrogen-like atoms is affected by gravitational waves (GW). A qualitative description of the quantum interferometric measurement principle is given, the modifications in the atomic Hamiltonian caused by the GW are presented, and the size of the resulting frequency shifts in hydrogen-like atoms is estimated. For a GW amplitude of h = 10-23 the frequency shift is of the order of 110μHz for an AI based on a 91-fold charged uranium ion. A frequency difference of this size can be resolved by current AIs in 1s.
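As a rough dimensional check of the quoted number (not the derivation used in the paper), a gravitational wave of strain h perturbs atomic energy levels fractionally at order h, so the induced frequency shift scales like

$$\delta\nu \;\sim\; h\,\nu_0 \;\sim\; 10^{-23}\times\left(\text{a few}\times 10^{19}\,\mathrm{Hz}\right) \;\sim\; 10^{-4}\,\mathrm{Hz},$$

where ν₀ is the scale of a 1s-2p transition in hydrogen-like uranium (binding energies of order 10⁵ eV, i.e. a few times 10¹⁹ Hz). This is consistent in order of magnitude with the 110 μHz quoted above; the exact prefactor depends on the interferometer geometry and the details of the modified Hamiltonian.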
Nearfield acoustic holography. I - Theory of generalized holography and the development of NAH
NASA Technical Reports Server (NTRS)
Maynard, J. D.; Williams, E. G.; Lee, Y.
1985-01-01
Because its underlying principles are so fundamental, holography has been studied and applied in many areas of science. Recently, a technique has been developed which takes the maximum advantage of the fundamental principles and extracts much more information from a hologram than is customarily associated with such a measurement. In this paper the fundamental principles of holography are reviewed, and a sound radiation measurement system, called nearfield acoustic holography (NAH), which fully exploits the fundamental principles, is described.
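A compact way to state the NAH reconstruction described above (the standard angular-spectrum form; the notation here is generic rather than taken from the paper): the pressure field measured on a hologram plane z_h is back-propagated to a source plane z_s by

$$p(x,y,z_s)=\mathcal{F}^{-1}\!\left\{\mathcal{F}\{p(x,y,z_h)\}\,e^{-i k_z (z_h-z_s)}\right\},\qquad k_z=\sqrt{k^{2}-k_x^{2}-k_y^{2}},$$

where the square root becomes imaginary for evanescent wavenumber components. Retaining and amplifying these components is what requires the measurement to be made in the nearfield and is the source of NAH's better-than-wavelength resolution.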
The potential of legislation on organ donation to increase the supply of donor organs.
Coppen, Remco; Friele, Roland D; van der Zee, Jouke; Gevers, Sjef K
2010-12-01
The aim of this paper is to assess the possibilities to adapt the 1998 Dutch Organ Donation Act, taking account of fundamental principles such as the right to physical integrity, equitable access to and equal availability of care, and the non-commerciality principle, with a view to increasing the organ supply. In 2008 the Dutch Taskforce on Organ Donation presented several proposals to amend the Act and to increase the supply of organs. This paper describes the proposals to amend the Act and evaluates them by assessing their intrinsic adherence to basic principles and the available evidence that these proposals will indeed increase the organ supply. Several proposals could constitute an infringement of fundamental principles of the Act. Moreover, evidence for their impact on the organ supply is lacking. Changing the consent system is possible, as this would not incur legal objections. There are diverging views regarding the impact of consent systems on the organ supply. The scope for changing the Act and its impact on organ procurement is at best limited. Relying on legislation alone will possibly not bring much relief, whereas additional policy measures may be more successful. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Bartels, A.; Bartel, T.; Canadija, M.; Mosler, J.
2015-09-01
This paper deals with the thermomechanical coupling in dissipative materials. The focus lies on finite strain plasticity theory and the temperature increase resulting from plastic deformation. For this type of problem, two fundamentally different modeling approaches can be found in the literature: (a) models based on thermodynamical considerations and (b) models based on the so-called Taylor-Quinney factor. While a naive straightforward implementation of thermodynamically consistent approaches usually leads to an over-prediction of the temperature increase due to plastic deformation, models relying on the Taylor-Quinney factor often violate fundamental physical principles such as the first and the second law of thermodynamics. In this paper, a thermodynamically consistent framework is elaborated which indeed allows the realistic prediction of the temperature evolution. In contrast to previously proposed frameworks, it is based on a fully three-dimensional, finite strain setting and it naturally covers coupled isotropic and kinematic hardening - also based on non-associative evolution equations. Considering a variationally consistent description based on incremental energy minimization, it is shown that the aforementioned problem (thermodynamical consistency and a realistic temperature prediction) is essentially equivalent to correctly defining the decomposition of the total energy into stored and dissipative parts. Interestingly, this decomposition shows strong analogies to the Taylor-Quinney factor. In this respect, the Taylor-Quinney factor can be well motivated from a physical point of view. Furthermore, certain intervals for this factor can be derived in order to guarantee that fundamental physical principles are fulfilled a priori. Representative examples demonstrate the predictive capabilities of the final constitutive modeling framework.
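For readers unfamiliar with the Taylor-Quinney factor mentioned above, a minimal small-strain sketch of the usual heat-source model (not the finite-strain variational formulation developed in the paper) is

$$\rho c\,\dot T \;=\; \nabla\!\cdot\!\left(k\,\nabla T\right) \;+\; \beta_{\mathrm{TQ}}\;\boldsymbol{\sigma}:\dot{\boldsymbol{\varepsilon}}^{p},$$

where β_TQ (often taken around 0.8-0.9 for metals) prescribes the fraction of plastic work converted into heat. The paper's point is that treating β_TQ as a free constant can violate thermodynamic consistency, whereas deriving the stored/dissipated energy split fixes admissible values a priori.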
Trends in Practical Work in German Science Education
ERIC Educational Resources Information Center
di Fuccia, David; Witteck, Torsten; Markic, Silvija; Eilks, Ingo
2012-01-01
By the 1970s a fundamental shift had taken place in German science education. This was a shift away from the learning of more-or-less isolated facts and facets in Biology, Chemistry, and Physics towards a restructuring of science teaching along the general principles of the respective science domains. The changes included also the addition of…
Solar variability, weather, and climate
NASA Technical Reports Server (NTRS)
1982-01-01
Advances in the understanding of possible effects of solar variations on weather and climate are most likely to emerge by addressing the subject in terms of fundamental physical principles of atmospheric sciences and solar-terrestrial physics. The limits of variability of solar inputs to the atmosphere and the depth in the atmosphere to which these variations have significant effects are determined.
Introducing Filters and Amplifiers Using a Two-Channel Light Organ
ERIC Educational Resources Information Center
Zavrel, Erik; Sharpsteen, Eric
2015-01-01
In an era when many students carry iPods, iPhones, and iPads, physics teachers are realizing that in order to continue to inspire and convey the amazing things made possible by a few fundamental principles, they must expand laboratory coverage of electricity and circuits beyond the conventional staples of constructing series and parallel…
ERIC Educational Resources Information Center
Rowland, D. R.
2007-01-01
The physical analysis of a uniformly accelerating point charge provides a rich problem to explore in advanced courses in electrodynamics and relativity since it brings together fundamental concepts in relation to electromagnetic radiation, Einstein's equivalence principle and the inertial mass of field energy in ways that reveal subtleties in each…
INSTRUCTIONAL TELEVISION FOR THE LOWER PRIMARY. A TEACHER GUIDE, SEMESTER II.
ERIC Educational Resources Information Center
PELIKAN, ALFRED; AND OTHERS
PROGRAMS FOR THE LOWER PRIMARY GROUP WERE IN ART, MUSIC AND PHYSICAL EDUCATION. A PREVIEW OF THE CONTENT OF EACH TELECAST WAS GIVEN WITH DETAILED INFORMATION FOR FOLLOWUP ACTIVITIES. THE STRUCTURE OF THE ART PROGRAM INCLUDED THE FUNDAMENTAL PRINCIPLES APPLICABLE TO SUCH BASIC AREAS AS LINE DRAWING, PICTURE MAKING, DESIGN AND CONSTRUCTION WITH THE…
Beginning Skin and Scuba Diving, Physical Education: 5551.69.
ERIC Educational Resources Information Center
Roberts, Millie
This course outline is a guide for teaching the principles and basic fundamentals of beginning skin and scuba diving in grades 7-12. The course format includes lectures, skills practice, films, and tests that focus on mastery of skills and understanding correct usage of skin and scuba equipment. Course content includes the following: (a) history,…
Wang, Yimeng; Bargh, John A
2016-01-01
Consistent with neural reuse theory, empirical tests of the related "scaffolding" principle of abstract concept development show that higher-level concepts "reuse" and are built upon fundamental motives such as survival, safety, and consumption. This produces mutual influence between the two levels, with far-ranging impacts from consumer behavior to political attitudes.
NASA Astrophysics Data System (ADS)
Brutsaert, Wilfried
2005-08-01
Water in its different forms has always been a source of wonder, curiosity and practical concern for humans everywhere. Hydrology - An Introduction presents a coherent introduction to the fundamental principles of hydrology, based on the course that Wilfried Brutsaert has taught at Cornell University for the last thirty years. Hydrologic phenomena are dealt with at spatial and temporal scales at which they occur in nature. The physics and mathematics necessary to describe these phenomena are introduced and developed, and readers will require a working knowledge of calculus and basic fluid mechanics. The book will be invaluable as a textbook for entry-level courses in hydrology directed at advanced seniors and graduate students in physical science and engineering. In addition, the book will be more broadly of interest to professional scientists and engineers in hydrology, environmental science, meteorology, agronomy, geology, climatology, oceanology, glaciology and other earth sciences. Emphasis is placed on fundamentals, on clarification of the underlying physical processes, and on applications of fluid mechanics in the natural environment.
Synthetic Biology: Engineering Living Systems from Biophysical Principles.
Bartley, Bryan A; Kim, Kyung; Medley, J Kyle; Sauro, Herbert M
2017-03-28
Synthetic biology was founded as a biophysical discipline that sought explanations for the origins of life from chemical and physical first principles. Modern synthetic biology has been reinvented as an engineering discipline to design new organisms as well as to better understand fundamental biological mechanisms. However, success is still largely limited to the laboratory and transformative applications of synthetic biology are still in their infancy. Here, we review six principles of living systems and how they compare and contrast with engineered systems. We cite specific examples from the synthetic biology literature that illustrate these principles and speculate on their implications for further study. To fully realize the promise of synthetic biology, we must be aware of life's unique properties. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Principle of Spacetime and Black Hole Equivalence
NASA Astrophysics Data System (ADS)
Zhang, Tianxi
2016-06-01
Modelling the universe without relying on a set of hypothetical entities (HEs) to explain observations and overcome problems and difficulties is essential to developing a physical cosmology. The well-known big bang cosmology, widely accepted as the standard model, stands on two fundamentals: Einstein's general relativity (GR), which describes the effect of matter on spacetime, and the cosmological principle (CP) of spacetime isotropy and homogeneity. The field equation of GR, along with the Friedmann-Lemaitre-Robertson-Walker (FLRW) metric of spacetime derived from CP, generates the Friedmann equation (FE) that governs the development and dynamics of the universe. The big bang theory has achieved impressive successes in explaining the universe, but it still has problems whose solutions rely on an increasing number of HEs such as inflation, dark matter, dark energy, and so on. Recently, the author has developed a new cosmological model called the black hole universe, which, instead of making many such hypotheses, adds only a single new postulate (or new principle) to cosmology - the Principle of Spacetime and Black Hole Equivalence (SBHEP) - to explain all the existing observations of the universe and overcome all the existing problems in conventional cosmologies. This study thoroughly demonstrates how this newly developed black hole universe model, which therefore stands on three fundamentals (GR, CP, and SBHEP), can fully explain the universe and overcome the existing difficulties using well-developed physics, thus neither needing any other hypotheses nor leaving any unsolved difficulties. This work was supported by NSF/REU (Grant #: PHY-1263253) at Alabama A & M University.
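For reference, the Friedmann equation (FE) invoked above follows from inserting the FLRW metric into the field equation of GR; in standard notation (scale factor a(t), curvature index k, cosmological constant Λ),

$$\left(\frac{\dot a}{a}\right)^{2}=\frac{8\pi G}{3}\,\rho-\frac{k c^{2}}{a^{2}}+\frac{\Lambda c^{2}}{3}.$$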
Educational activities with a tandem accelerator
NASA Astrophysics Data System (ADS)
Casolaro, P.; Campajola, L.; Balzano, E.; D'Ambrosio, E.; Figari, R.; Vardaci, E.; La Rana, G.
2018-05-01
Selected experiments in fundamental physics have been proposed for many years at the Tandem Accelerator of the Department of Physics of the University of Napoli ‘Federico II’ as part of a one-semester laboratory course for graduate students. The aim of this paper is to highlight the educational value of the experimental realization of the nuclear reaction ¹⁹F(p,α)¹⁶O. With the purpose of verifying the mass-energy equivalence principle, different aspects of both classical and modern physics can be investigated, e.g. conservation laws, atomic models, nuclear physics applications to compositional analysis, nuclear cross-section, Q-value and nuclear spectroscopic analysis.
Principle of Maximum Fisher Information from Hardy’s Axioms Applied to Statistical Systems
Frieden, B. Roy; Gatenby, Robert A.
2014-01-01
Consider a finite-sized, multidimensional system in a parameter state a. The system is in either a state of equilibrium or general non-equilibrium, and may obey either classical or quantum physics. L. Hardy’s mathematical axioms provide a basis for the physics obeyed by any such system. One axiom is that the number N of distinguishable states a in the system obeys N = max. This assumes that N is known as deterministic prior knowledge. However, most observed systems suffer statistical fluctuations, for which N is therefore only known approximately. Then what happens if the scope of the axiom N = max is extended to include such observed systems? It is found that the state a of the system must obey a principle of maximum Fisher information, I = Imax. This is important because many physical laws have been derived, assuming as a working hypothesis that I = Imax. These derivations include uses of the principle of Extreme physical information (EPI). Examples of such derivations were of the De Broglie wave hypothesis, quantum wave equations, Maxwell’s equations, new laws of biology (e.g. of Coulomb force-directed cell development, and of in situ cancer growth), and new laws of economic fluctuation and investment. That the principle I = Imax itself derives, from suitably extended Hardy axioms, thereby eliminates its need to be assumed in these derivations. Thus, uses of I = Imax and EPI express physics at its most fundamental level – its axiomatic basis in math. PMID:24229152
Principle of maximum Fisher information from Hardy's axioms applied to statistical systems.
Frieden, B Roy; Gatenby, Robert A
2013-10-01
Consider a finite-sized, multidimensional system in parameter state a. The system is either at statistical equilibrium or general nonequilibrium, and may obey either classical or quantum physics. L. Hardy's mathematical axioms provide a basis for the physics obeyed by any such system. One axiom is that the number N of distinguishable states a in the system obeys N=max. This assumes that N is known as deterministic prior knowledge. However, most observed systems suffer statistical fluctuations, for which N is therefore only known approximately. Then what happens if the scope of the axiom N=max is extended to include such observed systems? It is found that the state a of the system must obey a principle of maximum Fisher information, I=I(max). This is important because many physical laws have been derived, assuming as a working hypothesis that I=I(max). These derivations include uses of the principle of extreme physical information (EPI). Examples of such derivations were of the De Broglie wave hypothesis, quantum wave equations, Maxwell's equations, new laws of biology (e.g., of Coulomb force-directed cell development and of in situ cancer growth), and new laws of economic fluctuation and investment. That the principle I=I(max) itself derives from suitably extended Hardy axioms thereby eliminates its need to be assumed in these derivations. Thus, uses of I=I(max) and EPI express physics at its most fundamental level, its axiomatic basis in math.
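For readers unfamiliar with the quantity being extremized in the two records above, the Fisher information that data x carry about a parameter θ with likelihood p(x|θ) is

$$I(\theta)=\int dx\; p(x|\theta)\left(\frac{\partial \ln p(x|\theta)}{\partial\theta}\right)^{2},$$

which for a shift-invariant family p(x|θ) = p(x−θ) reduces to I = ∫ dx (p′)²/p = 4∫ dx (q′)² with p = q². It is this functional, subject to physical constraints, that the principle I = I_max (and EPI) renders an extremum.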
An Overview of the State of the Art in Atomistic and Multiscale Simulation of Fracture
NASA Technical Reports Server (NTRS)
Saether, Erik; Yamakov, Vesselin; Phillips, Dawn R.; Glaessgen, Edward H.
2009-01-01
The emerging field of nanomechanics is providing a new focus in the study of the mechanics of materials, particularly in simulating fundamental atomic mechanisms involved in the initiation and evolution of damage. Simulating fundamental material processes using first principles in physics strongly motivates the formulation of computational multiscale methods to link macroscopic failure to the underlying atomic processes from which all material behavior originates. This report gives an overview of the state of the art in applying concurrent and sequential multiscale methods to analyze damage and failure mechanisms across length scales.
The Foundations of Einstein's Theory of Gravitation
NASA Astrophysics Data System (ADS)
Freundlich, Erwin; Brose, Translated by Henry L.; Einstein, Preface by Albert; Turner, Introduction by H. H.
2011-06-01
Introduction; 1. The special theory of relativity as a stepping-stone to the general theory of relativity; 2. Two fundamental postulates in the mathematical formulation of physical laws; 3. Concerning the fulfilment of the two postulates; 4. The difficulties in the principles of classical mechanics; 5. Einstein's theory of gravitation; 6. The verification of the new theory by actual experience; Appendix; Index.
Why threefold-replication of families?
NASA Astrophysics Data System (ADS)
Fitzpatrick, Gerald L.
1998-04-01
In spite of the many successes of the standard model of particle physics, the observed proliferation of matter-fields, in the form of "replicated" generations or families, is a major unsolved problem. In this paper, I explore some of the algebraic, geometric and physical consequences of a new organizing principle for fundamental fermions (quarks and leptons) (Gerald L. Fitzpatrick, The Family Problem--New Internal Algebraic and Geometric Regularities, Nova Scientific Press, Issaquah, Washington, 1997. Read more about this book (ISBN 0-9655695-0-0) and its subject matter at: http://www.tp.umu.se/TIPTOP and/or http://www.amazon.com). The essence of the new organizing principle is the idea that the standard-model concept of scalar fermion numbers f can be generalized. In particular, a "generalized fermion number," which consists of a 2×2 matrix F that "acts" on an internal 2-space, instead of spacetime, is taken to describe certain internal properties of fundamental fermions. This generalization automatically introduces internal degrees of freedom that "explain," among other things, family replication and the number (three) of families observed in nature.
Bi-centenary of successes of Fourier theorem: its power and limitations in optical system designs
NASA Astrophysics Data System (ADS)
Roychoudhuri, Chandrasekhar
2007-09-01
We celebrate the two hundred years of successful use of the Fourier theorem in optics. However, there is a great enigma associated with the Fourier transform integral. It is one of the most pervasively productive and useful tools of physics and optics because its foundation is based on the superposition of harmonic functions, and yet we have never declared it a principle of physics, for valid reasons. And yet there are a good number of situations where we pretend it to be equivalent to the superposition principle of physics, creating epistemological problems of enormous magnitude. The purpose of the paper is to elucidate the problems while underscoring the successes and the elegance of the Fourier theorem, which are not explicitly discussed in the literature. We make our point by taking six major engineering fields of optics and showing in each case why it works and under what restricted conditions, by bringing in the relevant physics principles. The fields are (i) optical signal processing, (ii) Fourier transform spectrometry, (iii) classical spectrometry of pulsed light, (iv) coherence theory, (v) laser mode locking and (vi) pulse broadening. We underscore that mathematical Fourier frequencies, not being physical frequencies, cannot generate real physical effects on our detectors. Appreciation of this fundamental issue will open up ways to be innovative in many new optical instrument designs. We underscore the importance of always validating our design platforms based on valid physics principles (actual processes occurring in nature) captured by an appropriate hypothesis based on diverse observations. This paper is a comprehensive view of the power and limitations of the Fourier transform, summarizing a series of SPIE conference papers presented during 2003-2007.
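The "superposition of harmonic functions" at issue above is simply the Fourier integral pair

$$f(t)=\int_{-\infty}^{\infty}\tilde f(\nu)\,e^{2\pi i\nu t}\,d\nu,\qquad \tilde f(\nu)=\int_{-\infty}^{\infty}f(t)\,e^{-2\pi i\nu t}\,dt,$$

and the paper's caution is that the frequencies ν appearing in this mathematical decomposition are not automatically the physical frequencies to which a detector responds.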
Understanding the Physical Optics Phenomena by Using a Digital Application for Light Propagation
NASA Astrophysics Data System (ADS)
Sierra-Sosa, Daniel-Esteban; Ángel-Toro, Luciano
2011-01-01
Understanding light propagation on the basis of the Huygens-Fresnel principle is a fundamental factor for deeper comprehension of different physical-optics phenomena such as diffraction, self-imaging, image formation, Fourier analysis and spatial filtering. This constitutes the physical approach of Fourier optics, whose principles and applications have been developed since the 1950s. For both analytical and digital application purposes, light propagation can be formulated in terms of the Fresnel integral transform. In this work, a digital optics application based on the implementation of the Discrete Fresnel Transform (DFT), intended to serve as a tool for didactics of optics, is presented. At a basic and intermediate learning level, this tool allows exercises in identifying basic phenomena and observing changes associated with modifications of physical parameters, through a friendly graphic user interface (GUI). It also assists users in developing their capacity to abstract and to predict the characteristics of more complicated phenomena. At an upper level of learning, the application can be used to favor a deeper comprehension of the physics and models involved, and to experiment with new models and configurations. To achieve this, two characteristics of the didactic tool were taken into account in its design. First, all physical operations, ranging from simple diffraction experiments to digital holography and interferometry, were developed on the basis of the more fundamental concept of light propagation. Second, the algorithm was conceived to be easily upgradable due to its modular architecture based on the MATLAB® software environment. Typical results are presented and briefly discussed in connection with didactics of optics.
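As an illustration of the kind of computation such a tool performs (this is a generic transfer-function implementation of Fresnel propagation, not the MATLAB® application described in the abstract; all names and parameter values below are hypothetical), a minimal sketch in Python is:

```python
import numpy as np

def fresnel_propagate(u0, wavelength, z, dx):
    """Propagate a sampled complex field u0 over a distance z using the
    Fresnel transfer-function (angular-spectrum) method."""
    ny, nx = u0.shape
    fx = np.fft.fftfreq(nx, d=dx)          # spatial frequencies [1/m]
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    k = 2 * np.pi / wavelength
    # Fresnel transfer function: H = exp(ikz) * exp(-i*pi*lambda*z*(fx^2 + fy^2))
    H = np.exp(1j * k * z) * np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(u0) * H)

# Example: Fresnel diffraction of a 1 mm square aperture in HeNe light after 5 cm
N, dx, lam, z = 512, 10e-6, 633e-9, 0.05
x = (np.arange(N) - N // 2) * dx
X, Y = np.meshgrid(x, x)
aperture = ((np.abs(X) < 0.5e-3) & (np.abs(Y) < 0.5e-3)).astype(complex)
intensity = np.abs(fresnel_propagate(aperture, lam, z, dx))**2
```

Changing z, the wavelength, or the aperture in such a sketch reproduces the qualitative behavior (near-field self-imaging, far-field Fourier patterns) that the didactic application lets students explore interactively.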
First-principles definition and measurement of planetary electromagnetic-energy budget.
Mishchenko, Michael I; Lock, James A; Lacis, Andrew A; Travis, Larry D; Cairns, Brian
2016-06-01
The imperative to quantify the Earth's electromagnetic-energy budget with an extremely high accuracy has been widely recognized but has never been formulated in the framework of fundamental physics. In this paper we give a first-principles definition of the planetary electromagnetic-energy budget using the Poynting-vector formalism and discuss how it can, in principle, be measured. Our derivation is based on an absolute minimum of theoretical assumptions, is free of outdated notions of phenomenological radiometry, and naturally leads to the conceptual formulation of an instrument called the double hemispherical cavity radiometer (DHCR). The practical measurement of the planetary energy budget would require flying a constellation of several dozen planet-orbiting satellites hosting identical well-calibrated DHCRs.
First-principles definition and measurement of planetary electromagnetic-energy budget
NASA Astrophysics Data System (ADS)
Mishchenko, M. I.; James, L.; Lacis, A. A.; Travis, L. D.; Cairns, B.
2016-12-01
The imperative to quantify the Earth's electromagnetic-energy budget with an extremely high accuracy has been widely recognized but has never been formulated in the framework of fundamental physics. In this talk we give a first-principles definition of the planetary electromagnetic-energy budget using the Poynting-vector formalism and discuss how it can, in principle, be measured. Our derivation is based on an absolute minimum of theoretical assumptions, is free of outdated concepts of phenomenological radiometry, and naturally leads to the conceptual formulation of an instrument called the double hemispherical cavity radiometer (DHCR). The practical measurement of the planetary energy budget would require flying a constellation of several dozen planet-orbiting satellites hosting identical well-calibrated DHCRs.
First-Principles Definition and Measurement of Planetary Electromagnetic-Energy Budget
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Lock, James A.; Lacis, Andrew A.; Travis, Larry D.; Cairns, Brian
2016-01-01
The imperative to quantify the Earth's electromagnetic-energy budget with an extremely high accuracy has been widely recognized but has never been formulated in the framework of fundamental physics. In this paper we give a first-principles definition of the planetary electromagnetic-energy budget using the Poynting-vector formalism and discuss how it can, in principle, be measured. Our derivation is based on an absolute minimum of theoretical assumptions, is free of outdated notions of phenomenological radiometry, and naturally leads to the conceptual formulation of an instrument called the double hemispherical cavity radiometer (DHCR). The practical measurement of the planetary energy budget would require flying a constellation of several dozen planet-orbiting satellites hosting identical well-calibrated DHCRs.
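In the Poynting-vector formalism referred to in the three records above, the instantaneous electromagnetic energy flux density is S = E × H, and the planetary energy budget is, in essence, the time-averaged net flux through a closed surface A enclosing the planet:

$$\Phi=\oint_{A}\langle\mathbf S\rangle\cdot\hat{\mathbf n}\,dA,\qquad \mathbf S=\mathbf E\times\mathbf H,$$

with Φ > 0 indicating net radiative loss and Φ < 0 net gain. Defining and measuring the budget this way rests only on Maxwellian electromagnetics rather than on phenomenological radiometry.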
Theoretical principles for biology: Variation.
Montévil, Maël; Mossio, Matteo; Pocheville, Arnaud; Longo, Giuseppe
2016-10-01
Darwin introduced the concept that random variation generates new living forms. In this paper, we elaborate on Darwin's notion of random variation to propose that biological variation should be given the status of a fundamental theoretical principle in biology. We state that biological objects such as organisms are specific objects. Specific objects are special in that they are qualitatively different from each other. They can undergo unpredictable qualitative changes, some of which are not defined before they happen. We express the principle of variation in terms of symmetry changes, where symmetries underlie the theoretical determination of the object. We contrast the biological situation with the physical situation, where objects are generic (that is, different objects can be assumed to be identical) and evolve in well-defined state spaces. We derive several implications of the principle of variation, in particular, biological objects show randomness, historicity and contextuality. We elaborate on the articulation between this principle and the two other principles proposed in this special issue: the principle of default state and the principle of organization. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Bogdanov, Alexander; Khramushin, Vasily
2016-02-01
The architecture of a digital computing system determines the technical foundation of a unified mathematical language for exact arithmetic-logical description of phenomena and laws of continuum mechanics for applications in fluid mechanics and theoretical physics. The deep parallelization of the computing processes results in functional programming at a new technological level, providing traceability of the computing processes with automatic application of multiscale hybrid circuits and adaptive mathematical models for the true reproduction of the fundamental laws of physics and continuum mechanics.
NASA Astrophysics Data System (ADS)
Plotnitsky, Arkady
2017-06-01
The history of mathematical modeling outside physics has been dominated by the use of classical mathematical models, C-models, primarily those of a probabilistic or statistical nature. More recently, however, quantum mathematical models, Q-models, based in the mathematical formalism of quantum theory have become more prominent in psychology, economics, and decision science. The use of Q-models in these fields remains controversial, in part because it is not entirely clear whether Q-models are necessary for dealing with the phenomena in question or whether C-models would still suffice. My aim, however, is not to assess the necessity of Q-models in these fields, but instead to reflect on what the possible applicability of Q-models may tell us about the corresponding phenomena there, vis-à-vis quantum phenomena in physics. In order to do so, I shall first discuss the key reasons for the use of Q-models in physics. In particular, I shall examine the fundamental principles that led to the development of quantum mechanics. Then I shall consider a possible role of similar principles in using Q-models outside physics. Psychology, economics, and decision science borrow already available Q-models from quantum theory, rather than derive them from their own internal principles, while quantum mechanics was derived from such principles, because there was no readily available mathematical model to handle quantum phenomena, although the mathematics ultimately used in quantum mechanics did in fact exist then. I shall argue, however, that the principle perspective on mathematical modeling outside physics might help us to understand better the role of Q-models in these fields and possibly to envision new models, conceptually analogous to but mathematically different from those of quantum theory, helpful or even necessary there or in physics itself. I shall suggest one possible type of such models, singularized probabilistic, SP, models, some of which are time-dependent, TDSP-models. The necessity of using such models may change the nature of mathematical modeling in science and, thus, the nature of science, as it happened in the case of Q-models, which not only led to a revolutionary transformation of physics but also opened new possibilities for scientific thinking and mathematical modeling beyond physics.
Underground atom gradiometer array for mass distribution monitoring and advanced geodesy
NASA Astrophysics Data System (ADS)
Canuel, B.
2015-12-01
After more than 20 years of fundamental research, atom interferometers have reached sensitivity and accuracy levels competing with or beating inertial sensors based on different technologies. Atom interferometers offer interesting applications in geophysics (gravimetry, gradiometry, Earth rotation rate measurements), inertial sensing (submarine or aircraft autonomous positioning), metrology (new definition of the kilogram) and fundamental physics (tests of the standard model, tests of general relativity). Atom interferometers have already contributed significantly to fundamental physics by, for example, providing stringent constraints on quantum electrodynamics through measurements of the fine structure constant, testing the Equivalence Principle with cold atoms, or providing new measurements of the Newtonian gravitational constant. Cold atom sensors have moreover been established as key instruments in metrology for the new definition of the kilogram and through international comparisons of gravimeters. The field of atom interferometry (AI) is now entering a new phase where very high sensitivity levels must be demonstrated, in order to enlarge the potential applications outside atomic physics laboratories. These applications range from gravitational wave (GW) detection in the [0.1-10 Hz] frequency band to next-generation ground- and space-based Earth gravity field studies to precision gyroscopes and accelerometers. The Matter-wave laser Interferometric Gravitation Antenna (MIGA) presented here is a large-scale matter-wave sensor which will open new applications in geoscience and fundamental physics. The MIGA consortium gathers 18 expert French laboratories and companies in atomic physics, metrology, optics, geosciences and gravitational physics, with the aim of building a large-scale underground atom-interferometer instrument by 2018 and operating it till at least 2023. In this paper, we present the main objectives of the project, the status of the construction of the instrument and the motivation for the applications of MIGA in geosciences.
The "Fundamental Pedogagical Principle" in Second Language Teaching.
ERIC Educational Resources Information Center
Krashen, Stephen D.
1981-01-01
A fundamental principle of second language acquisition is stated and applied to language teaching. The principle states that learners acquire a second language when they receive comprehensible input in situations where their affective filters are sufficiently low. The theoretical background of this principle consists of five hypotheses: the…
Unsteady transonic flows - Introduction, current trends, applications
NASA Technical Reports Server (NTRS)
Yates, E. C., Jr.
1985-01-01
The computational treatment of unsteady transonic flows is discussed, reviewing the historical development and current techniques. The fundamental physical principles are outlined; the governing equations are introduced; three-dimensional linearized and two-dimensional linear-perturbation theories in frequency domain are described in detail; and consideration is given to frequency-domain FEMs and time-domain finite-difference and integral-equation methods. Extensive graphs and diagrams are included.
Generalized uncertainty principle and quantum gravity phenomenology
NASA Astrophysics Data System (ADS)
Bosso, Pasquale
The fundamental physical description of Nature is based on two mutually incompatible theories: Quantum Mechanics and General Relativity. Their unification in a theory of Quantum Gravity (QG) remains one of the main challenges of theoretical physics. Quantum Gravity Phenomenology (QGP) studies QG effects in low-energy systems. The basis of one such phenomenological model is the Generalized Uncertainty Principle (GUP), which is a modified Heisenberg uncertainty relation and predicts a deformed canonical commutator. In this thesis, we compute Planck-scale corrections to angular momentum eigenvalues, the hydrogen atom spectrum, the Stern-Gerlach experiment, and the Clebsch-Gordan coefficients. We then rigorously analyze the GUP-perturbed harmonic oscillator and study new coherent and squeezed states. Furthermore, we introduce a scheme for increasing the sensitivity of optomechanical experiments for testing QG effects. Finally, we suggest future projects that may potentially test QG effects in the laboratory.
Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2005-01-01
A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.
How can laboratory plasma experiments contribute to space and astrophysics?
NASA Astrophysics Data System (ADS)
Yamada, M.
Plasma physics plays a key role in a wide range of phenomena in the universe, from laboratory plasmas to the magnetosphere, the solar corona, and the tenuous interstellar and intergalactic gas. Despite the huge difference in physical scales, there are striking similarities in the plasma behavior of laboratory and space plasmas. Similar plasma physics problems have been investigated independently by both laboratory plasma physicists and astrophysicists. Since 1991, cross-fertilization has increased among laboratory plasma physicists and space physicists through meetings such as the IPELS (Interrelationship between Plasma Experiments in the Laboratory and Space) meeting. The advances in laboratory plasma physics, along with the recent surge of astronomical data from satellites, make this moment ripe for research collaboration to further advance plasma physics and to obtain new understanding of key space and astrophysical phenomena. The recent NRC review of astronomy and astrophysics notes the benefit that can accrue from a stronger connection to plasma physics. The present talk discusses how laboratory plasma studies can contribute to the fundamental understanding of space and astrophysical phenomena by covering common key physics topics such as magnetic reconnection, dynamos, angular momentum transport, ion heating, and magnetic self-organization. In particular, it has recently been recognized that "physics-issue-dedicated" laboratory experiments can contribute significantly to the understanding of the fundamental physics of space-astrophysical phenomena, since they can create fundamental physics processes in a controlled manner and provide well-correlated plasma parameters at multiple plasma locations simultaneously. Such dedicated experiments not only can bring about a better understanding of the fundamental physics processes but also can lead to findings of new physics principles as well as new ideas for fusion plasma confinement. Several dedicated experiments have provided the fundamental physics data for magnetic reconnection [1]. Linear plasma devices have been utilized to investigate whistler waves and Alfvén wave phenomena [2,3]. A rotating gallium disk experiment has been initiated to study the magneto-rotational instability [4]. This talk also presents the most recent progress of this dedicated laboratory plasma research. 1. M. Yamada et al., Phys. Plasmas 4, 1936 (1997); 2. R. Stenzel, Phys. Rev. Lett. 65, 3001 (1991); 3. W. Gekelman et al., Plasma Phys. Control. Fusion 42, B15-B26 (2000); 4. H. Ji, J. Goodman, and A. Kageyama, Mon. Not. R. Astron. Soc. 325, L1 (2001).
Quantifying the Effect of Soil Water Repellency on Infiltration Parameters Using a Dry Sand
NASA Astrophysics Data System (ADS)
Shillito, R.; Berli, M.; Ghezzehei, T. A.; Kaminski, E.
2017-12-01
Water infiltration into less than perfectly wettable soils has usually been considered an exceptional case; in fact, it may be the rule. Infiltration into soils exhibiting some degree of water repellency has important implications in agricultural irrigation, post-fire runoff, golf course and landscape management, and spill and contaminant mitigation. Beginning from fundamental principles, we developed a physically-based model to quantify the effect of water repellency on infiltration parameters. Experimentally, we used a dry silica sand and treated it to achieve various known degrees of water repellency. The model was verified using data gathered from multiple upward infiltration (wicking) experiments using the treated sand. The model also allowed us to explore the effect of initial soil moisture conditions on infiltration into water-repellent soils, and the physical interpretation of the simple water drop penetration time test. These results provide a fundamental step in the physically-based understanding of how water infiltrates into a less than perfectly wettable porous medium.
The metaphysics of quantum mechanics: Modal interpretations
NASA Astrophysics Data System (ADS)
Gluck, Stuart Murray
2004-11-01
This dissertation begins with the argument that a preferred way of doing metaphysics is through philosophy of physics. An understanding of quantum physics is vital to answering questions such as: What counts as an individual object in physical ontology? Is the universe fundamentally indeterministic? Are indiscernibles identical? This study explores how the various modal interpretations of quantum mechanics answer these sorts of questions; modal accounts are one of the two classes of interpretations along with so-called collapse accounts. This study suggests a new alternative within the class of modal views that yields a more plausible ontology, one in which the Principle of the Identity of Indiscernibles is necessarily true. Next, it shows that modal interpretations can consistently deny that the universe must be fundamentally indeterministic so long as they accept certain other metaphysical commitments: either a perfect initial distribution of states in the universe or some form of primitive dispositional properties. Finally, the study sketches out a future research project for modal interpretations based on developing quantified quantum logic.
Energy geotechnics: Advances in subsurface energy recovery, storage, exchange, and waste management
McCartney, John S.; Sanchez, Marcelo; Tomac, Ingrid
2016-02-17
Energy geotechnics involves the use of geotechnical principles to understand and engineer the coupled thermo-hydro-chemo-mechanical processes encountered in collecting, exchanging, storing, and protecting energy resources in the subsurface. In addition to research on these fundamental coupled processes and characterization of relevant material properties, applied research is being performed to develop analytical tools for the design and analysis of different geo-energy applications. In conclusion, the aims of this paper are to discuss the fundamental physics and constitutive models that are common to these different applications, and to summarize recent advances in the development of relevant analytical tools.
NASA Astrophysics Data System (ADS)
Allain, Rhett
2016-05-01
We currently live in a world filled with videos. There are videos on YouTube, feature movies and even videos recorded with our own cameras and smartphones. These videos present an excellent opportunity to not only explore physical concepts, but also inspire others to investigate physics ideas. With video analysis, we can explore the fantasy world in science-fiction films. We can also look at online videos to determine if they are genuine or fake. Video analysis can be used in the introductory physics lab and it can even be used to explore the make-believe physics embedded in video games. This book covers the basic ideas behind video analysis along with the fundamental physics principles used in video analysis. The book also includes several examples of the unique situations in which video analysis can be used.
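One of the uses mentioned above, deciding whether an online video is genuine or fake, often comes down to extracting positions frame by frame and checking them against a fundamental principle such as constant-acceleration free fall. The sketch below is a generic illustration of that workflow; the frame rate and position data are invented for the example and are not taken from the book.

```python
import numpy as np

# Hypothetical frame-by-frame heights of a dropped object read off a video
# at 30 frames per second (values invented for illustration).
fps = 30.0
t = np.arange(10) / fps                          # time of each frame, s
y = 2.0 - 0.5 * 9.8 * t**2                       # "measured" heights, m
y += np.random.normal(0.0, 0.002, size=t.size)   # a little measurement noise

# Fit y(t) = c2*t^2 + c1*t + c0; for genuine free fall c2 should be close to -g/2.
c2, c1, c0 = np.polyfit(t, y, 2)
g_est = -2.0 * c2
print(f"estimated g = {g_est:.2f} m/s^2")
print("plausible free fall" if abs(g_est - 9.8) < 0.5 else "motion inconsistent with free fall")
```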
Theoretical aspects of the equivalence principle
NASA Astrophysics Data System (ADS)
Damour, Thibault
2012-09-01
We review several theoretical aspects of the equivalence principle (EP). We emphasize the unsatisfactory fact that the EP maintains the absolute character of the coupling constants of physics, while general relativity and its generalizations (Kaluza-Klein, …, string theory) suggest that all absolute structures should be replaced by dynamical entities. We discuss the EP-violation phenomenology of dilaton-like models, which is likely to be dominated by the linear superposition of two effects: a signal proportional to the nuclear Coulomb energy, related to the variation of the fine-structure constant, and a signal proportional to the surface nuclear binding energy, related to the variation of the light quark masses. We recall various theoretical arguments (including a recently proposed anthropic argument) suggesting that the EP be violated at a small, but not unmeasurably small level. This motivates the need for improved tests of the EP. These tests are probing new territories in physics that are related to deep, and mysterious, issues in fundamental physics.
Tuning topological phases in the XMnSb2 system via chemical substitution from first principles
NASA Astrophysics Data System (ADS)
Griffin, Sinead M.; Neaton, Jeffrey B.
New Dirac materials are sought for their interesting fundamental physics and for their potential technological applications. Protected symmetries offer a route to potential zero-mass Dirac and Weyl fermions, and can lead to unique transport properties and spectroscopic signatures. In this work, we use first-principles calculations to study the XMnSb2 family of materials and show how varying X changes the nature of bulk protected topological features in their electronic structure. We further discuss new design rules for predicting new topological materials suggested by our calculations. SG is supported by the Early Postdoc Mobility Fellowship of the SNF.
Lorenz, Gödel and Penrose: new perspectives on determinism and causality in fundamental physics
NASA Astrophysics Data System (ADS)
Palmer, T. N.
2014-07-01
Although meteorologist Ed Lorenz is best known for his pioneering work on chaotic unpredictability, the key discovery at the core of his work is the link between space-time calculus and state-space fractal geometry. Indeed, properties of Lorenz's fractal invariant set relate space-time calculus to deep areas of mathematics such as Gödel's Incompleteness Theorem. Could such properties also provide new perspectives on deep unsolved issues in fundamental physics? Recent developments in cosmology motivate what is referred to as the 'cosmological invariant set postulate': that the universe can be considered a deterministic dynamical system evolving on a causal measure-zero fractal invariant set in its state space. Symbolic representations of this invariant set are constructed explicitly based on permutation representations of quaternions. The resulting 'invariant set theory' provides some new perspectives on determinism and causality in fundamental physics. For example, while the cosmological invariant set appears to have a rich enough structure to allow a description of (quantum) probability, its measure-zero character ensures it is sparse enough to prevent invariant set theory being constrained by the Bell inequality (consistent with a partial violation of the so-called measurement independence postulate). The primacy of geometry as embodied in the proposed theory extends the principles underpinning general relativity. As a result, the physical basis for contemporary programmes which apply standard field quantisation to some putative gravitational lagrangian is questioned. Consistent with Penrose's suggestion of a deterministic but non-computable theory of fundamental physics, an alternative 'gravitational theory of the quantum' is proposed based on the geometry of the invariant set, with new perspectives on the problem of black-hole information loss and potential observational consequences for the dark universe.
NASA Astrophysics Data System (ADS)
Carlowicz, Michael
If you have a computer and a grasp of algebra, you can learn physics. That is one of the messages behind the release of Physics—The Root Science, a new full-text version of a physics textbook available at no cost on the World Wide Web. The interactive textbook is the work of the International Institute of Theoretical and Applied Physics (IITAP) at Iowa State University, which was established in 1993 as a partnership with the United Nations Educational, Scientific and Cultural Organization (UNESCO). With subject matter equivalent to that of a 400-page volume, the text is designed to be completed in one school year. The textbook will also eventually include video clips of experiments and interactive learning modules, as well as links to appropriate cross-references about fundamental principles of physics.
Development of Thermodynamic Conceptual Evaluation
NASA Astrophysics Data System (ADS)
Talaeb, P.; Wattanakasiwich, P.
2010-07-01
This research aims to develop a test for assessing student understanding of fundamental principles in thermodynamics. Misconceptions found in previous physics education research were used to develop the test. Its topics include heat and temperature, the zeroth and the first law of thermodynamics, and thermodynamic processes. The content validity was analyzed by three physics experts. Then the test was administered to freshmen, sophomores and juniors majoring in physics in order to determine the item difficulties and item discrimination of the test. A few items were eliminated from the test. Finally, the test will be administered to students taking the Physics I course in order to evaluate the effectiveness of Interactive Lecture Demonstrations that will be used for the first time at Chiang Mai University.
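For readers unfamiliar with the two item statistics mentioned above, a minimal computation is sketched below: item difficulty as the fraction of correct answers, and discrimination as the difference in performance between upper- and lower-scoring groups. The response matrix is invented for illustration and is not data from this study.

```python
import numpy as np

# Rows = students, columns = test items; 1 = correct, 0 = incorrect (invented data).
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
])

totals = responses.sum(axis=1)
order = np.argsort(totals)
n_group = max(1, len(totals) // 3)           # upper/lower ~third is a common choice
lower, upper = responses[order[:n_group]], responses[order[-n_group:]]

difficulty = responses.mean(axis=0)          # proportion answering each item correctly
discrimination = upper.mean(axis=0) - lower.mean(axis=0)

for i, (p, d) in enumerate(zip(difficulty, discrimination), start=1):
    print(f"item {i}: difficulty = {p:.2f}, discrimination = {d:.2f}")
```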
Learning physics in a water park
NASA Astrophysics Data System (ADS)
Cabeza, Cecilia; Rubido, Nicolás; Martí, Arturo C.
2014-03-01
Entertaining and educational experiments that can be conducted in a water park, illustrating physics concepts, principles and fundamental laws, are described. These experiments are suitable for students ranging from senior secondary school to junior university level. Newton’s laws of motion, Bernoulli’s equation, based on the conservation of energy, buoyancy, linear and non-linear wave propagation, turbulence, thermodynamics, optics and cosmology are among the topics that can be discussed. Commonly available devices like smartphones, digital cameras, laptop computers and tablets, can be used conveniently to enable accurate calculation and a greater degree of engagement on the part of students.
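To make one of the listed topics concrete: the exit speed of a rider on an idealized frictionless water slide follows from energy conservation, the same balance that underlies Bernoulli's equation for an ideal fluid. The drop height below is an arbitrary example value, not a measurement from the described activities.

```python
import math

g = 9.81   # m/s^2
h = 8.0    # m, assumed drop height of the slide (example value)

# Energy conservation for a frictionless slide: m*g*h = (1/2)*m*v^2,
# i.e. the same term balance that appears in Bernoulli's equation.
v_exit = math.sqrt(2 * g * h)
print(f"ideal exit speed ~ {v_exit:.1f} m/s ({v_exit * 3.6:.0f} km/h)")
```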
Transcranial magnetic stimulation: physics, electrophysiology, and applications.
Fatemi-Ardekani, Ali
2008-01-01
Transcranial magnetic stimulation (TMS) is a noninvasive technique used to stimulate the brain. This review will examine the fundamental principles of physics upon which magnetic stimulation is based, the design considerations of the TMS device, and hypotheses about its electrophysiological effects resulting in neuromodulation. TMS is valuable in neurophysiology research and has significant therapeutic potential in clinical neurology and psychiatry. While TMS can modify neuronal currents in the brain, its underlying mechanism remains unknown. Salient applications are included and some suggestions are outlined for future development of magnetic stimulators that could lead to more effective neuronal stimulation and therefore better therapeutic and diagnostic applications.
Observability, Visualizability and the Question of Metaphysical Neutrality
NASA Astrophysics Data System (ADS)
Wolff, Johanna
2015-09-01
Theories in fundamental physics are unlikely to be ontologically neutral, yet they may nonetheless fail to offer decisive empirical support for or against particular metaphysical positions. I illustrate this point by close examination of a particular objection raised by Wolfgang Pauli against Hermann Weyl. The exchange reveals that both parties to the dispute appeal to broader epistemological principles to defend their preferred metaphysical starting points. I suggest that this should make us hesitant to assume that in deriving metaphysical conclusions from physical theories we place our metaphysical theories on a purely empirical foundation. The metaphysics within a particular physical theory may well be the result of a priori assumptions in the background, not particular empirical findings.
NASA Astrophysics Data System (ADS)
Ogoh, Kazutoshi
"Basic Natural Science" for freshmen at Miyazaki Prefectural Nursing University has a component including physics. Here students learn three principles of thermal transfer; conduction, radiation, and convection through a series of experiments. The purpose of these experiments is to understand the structure of a method for the caring of breathing and temperature of patients as written in "Ventilation and Warming", the first chapter of F. Nightingale's Notes on Nursing. Students can then apply this structure to retain fresh air in today's hospital rooms, and can then appreciate studying real physics incorporated into fundamental knowledge for nursing practice.
NASA Astrophysics Data System (ADS)
Ercan, İlke; Suyabatmaz, Enes
2018-06-01
The saturation in the efficiency and performance scaling of conventional electronic technologies brings about the development of novel computational paradigms. Brownian circuits are among the promising alternatives that can exploit fluctuations to increase the efficiency of information processing in nanocomputing. A Brownian cellular automaton, where signals propagate randomly and are driven by local transition rules, can be made computationally universal by embedding arbitrary asynchronous circuits on it. One of the potential realizations of such circuits is via single electron tunneling (SET) devices, since SET technology enables the simulation of noise and fluctuations in a fashion similar to Brownian search. In this paper, we perform a physical-information-theoretic analysis of the efficiency limitations of Brownian NAND and half-adder circuits implemented using SET technology. The method we employ here establishes solid ground for studying computational and physical features of this emerging technology on an equal footing, and yields fundamental lower bounds that provide valuable insights into how far its efficiency can be improved in principle. In order to provide a basis for comparison, we also analyze a NAND gate and half-adder circuit implemented in complementary metal-oxide-semiconductor (CMOS) technology to show how the fundamental bound of the Brownian circuit compares against a conventional paradigm.
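The style of fundamental lower bound referred to above can be illustrated with the textbook Landauer-type estimate for a logically irreversible NAND gate: with independent, equiprobable inputs, the information lost per operation is the drop in Shannon entropy from inputs to output, and the corresponding minimum dissipation is that many bits times k_B T ln 2. This is a generic sketch of that kind of analysis under the stated assumptions, not the authors' SET-specific calculation.

```python
import math

k_B = 1.380649e-23   # J/K
T = 300.0            # K, assumed operating temperature

def shannon_entropy_bits(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# NAND with independent, equiprobable inputs: four input states (2 bits of entropy);
# the output is 0 with probability 1/4 and 1 with probability 3/4.
H_in = shannon_entropy_bits([0.25, 0.25, 0.25, 0.25])
H_out = shannon_entropy_bits([0.25, 0.75])
bits_lost = H_in - H_out

E_min = bits_lost * k_B * T * math.log(2)    # Landauer-style minimum dissipation
print(f"information lost per NAND operation: {bits_lost:.3f} bits")
print(f"minimum dissipation at {T:.0f} K: {E_min:.2e} J")
```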
Quantum physics in neuroscience and psychology: a neurophysical model of mind–brain interaction
Schwartz, Jeffrey M; Stapp, Henry P; Beauregard, Mario
2005-01-01
Neuropsychological research on the neural basis of behaviour generally posits that brain mechanisms will ultimately suffice to explain all psychologically described phenomena. This assumption stems from the idea that the brain is made up entirely of material particles and fields, and that all causal mechanisms relevant to neuroscience can therefore be formulated solely in terms of properties of these elements. Thus, terms having intrinsic mentalistic and/or experiential content (e.g. ‘feeling’, ‘knowing’ and ‘effort’) are not included as primary causal factors. This theoretical restriction is motivated primarily by ideas about the natural world that have been known to be fundamentally incorrect for more than three-quarters of a century. Contemporary basic physical theory differs profoundly from classic physics on the important matter of how the consciousness of human agents enters into the structure of empirical phenomena. The new principles contradict the older idea that local mechanical processes alone can account for the structure of all observed empirical data. Contemporary physical theory brings directly and irreducibly into the overall causal structure certain psychologically described choices made by human agents about how they will act. This key development in basic physical theory is applicable to neuroscience, and it provides neuroscientists and psychologists with an alternative conceptual framework for describing neural processes. Indeed, owing to certain structural features of ion channels critical to synaptic function, contemporary physical theory must in principle be used when analysing human brain dynamics. The new framework, unlike its classic-physics-based predecessor, is erected directly upon, and is compatible with, the prevailing principles of physics. It is able to represent more adequately than classic concepts the neuroplastic mechanisms relevant to the growing number of empirical studies of the capacity of directed attention and mental effort to systematically alter brain function. PMID:16147524
Fundamentals of Structural Geology
NASA Astrophysics Data System (ADS)
Pollard, David D.; Fletcher, Raymond C.
2005-09-01
Fundamentals of Structural Geology provides a new framework for the investigation of geological structures by integrating field mapping and mechanical analysis. Assuming a basic knowledge of physical geology, introductory calculus and physics, it emphasizes the observational data, modern mapping technology, principles of continuum mechanics, and the mathematical and computational skills necessary to quantitatively map, describe, model, and explain deformation in Earth's lithosphere. By starting from the fundamental conservation laws of mass and momentum, the constitutive laws of material behavior, and the kinematic relationships for strain and rate of deformation, the authors demonstrate the relevance of solid and fluid mechanics to structural geology. This book offers a modern quantitative approach to structural geology for advanced students and researchers in structural geology and tectonics. It is supported by a website hosting images from the book, additional colour images, student exercises and MATLAB scripts. Solutions to the exercises are available to instructors. The book integrates field mapping using modern technology with the analysis of structures based on a complete mechanics. MATLAB is used to visualize physical fields and analytical results, and MATLAB scripts can be downloaded from the website to recreate textbook graphics and enable students to explore their choice of parameters and boundary conditions. The supplementary website hosts color images of outcrop photographs used in the text, supplementary color images, and images of textbook figures for classroom presentations. The textbook website also includes student exercises designed to instill the fundamental relationships and to encourage the visualization of the evolution of geological structures; solutions are available to instructors.
Risk-based containment and air monitoring criteria for work with dispersible radioactive materials.
Veluri, Venkateswara Rao; Justus, Alan L
2013-04-01
This paper presents readily understood, technically defensible, risk-based containment and air monitoring criteria, which are developed from fundamental physical principles. The key for the development of each criterion was the use of a calculational de minimis level, in this case chosen to be 100 mrem (or 40 DAC-h). Examples are provided that demonstrate the effective use of each criterion. Comparison to other often used criteria is provided.
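As context for the 100 mrem / 40 DAC-h de minimis level used above: by the usual convention, 1 DAC-h of intake corresponds to about 2.5 mrem of committed dose (2,000 DAC-h per 5,000 mrem annual limit), which is consistent with the paper's pairing of 40 DAC-h with 100 mrem. The sketch below is a simple screening comparison; the airborne concentration and stay time are invented example inputs, not values from the paper.

```python
# Screening sketch: compare a projected intake with the 40 DAC-h (100 mrem)
# calculational de minimis level discussed above.  Inputs are invented examples.

MREM_PER_DAC_H = 2.5          # 2,000 DAC-h <-> 5,000 mrem (standard convention)
DE_MINIMIS_DAC_H = 40.0       # de minimis level used in the paper

air_concentration_dac = 0.5   # assumed airborne concentration, in multiples of the DAC
exposure_time_h = 16.0        # assumed hours of work in that atmosphere

exposure_dac_h = air_concentration_dac * exposure_time_h
dose_mrem = exposure_dac_h * MREM_PER_DAC_H

print(f"projected exposure: {exposure_dac_h:.1f} DAC-h (~{dose_mrem:.0f} mrem)")
if exposure_dac_h >= DE_MINIMIS_DAC_H:
    print("at or above the de minimis level: containment/air monitoring criteria apply")
else:
    print("below the de minimis level")
```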
Energy Literacy : Essential Principles and Fundamental Concepts for Energy Education
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Energy Literacy: Essential Principles and Fundamental Concepts for Energy Education presents energy concepts that, if understood and applied, will help individuals and communities make informed energy decisions.
Career opportunities in clinical engineering.
Morse, W A
1992-01-01
The varied career opportunities open to clinical engineers are described in this paper. Many of these opportunities are within the medical device industry in research, development, manufacturing design, regulatory activities, production, operations, sales, marketing, service, and management. Additional opportunities are available in hospitals, with the Veterans Administration, or working as an entrepreneur or a consultant. Each of these careers requires specific training and skills, and they all require a fundamental scientific knowledge of physical principles and mathematics. Research and management, however, require different educational preparation. The research emphasis should be on theoretical principles and creativity; the management emphasis should be on financial and labor problems. In all clinical engineering careers, the individual is a problem solver.
Nanodopant-Induced Band Modulation in AgPbmSbTe2+m-Type Nanocomposites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yi; Ke, Xuezhi; Chen, Changfeng
2011-01-01
We elucidate the fundamental physics of nanoscale dopants in narrow band-gap thermoelectric nanocomposites XPbmYTe2+m (X=Ag,Na; Y=Sb,Bi) using first-principles calculations. Our results unveil distinct band-structure modulations, most notably a sizable band-gap widening driven by nanodopant-induced lattice strain and a band split-off at the conduction band minimum caused by the spin-orbit interaction of the dopant Sb or Bi atoms. Boltzmann transport calculations demonstrate that these band modulations have significant but competing effects on high-temperature electron transport behavior. These results offer insights for understanding recent experimental findings and suggest principles for optimizing thermoelectric properties of narrow band-gap semiconductors.
Do the Modified Uncertainty Principle and Polymer Quantization predict same physics?
NASA Astrophysics Data System (ADS)
Majumder, Barun; Sen, Sourav
2012-10-01
In this Letter we study the effects of the Modified Uncertainty Principle as proposed in Ali et al. (2009) [5] in simple quantum mechanical systems and examine their thermodynamic properties. We have assumed that the quantum particles follow Maxwell-Boltzmann statistics with no spin. We compare our results with the results found in the GUP and polymer quantum mechanical frameworks. Interestingly, we find that the corrected thermodynamic entities are exactly the same as the polymer results, but the length scale considered has a theoretically different origin. Hence we note the need for further study to investigate whether these two approaches are conceptually connected at a fundamental level.
Weak Galilean invariance as a selection principle for coarse-grained diffusive models.
Cairoli, Andrea; Klages, Rainer; Baule, Adrian
2018-05-29
How does the mathematical description of a system change in different reference frames? Galilei first addressed this fundamental question by formulating the famous principle of Galilean invariance. It prescribes that the equations of motion of closed systems remain the same in different inertial frames related by Galilean transformations, thus imposing strong constraints on the dynamical rules. However, real world systems are often described by coarse-grained models integrating complex internal and external interactions indistinguishably as friction and stochastic forces. Since Galilean invariance is then violated, there is seemingly no alternative principle to assess a priori the physical consistency of a given stochastic model in different inertial frames. Here, starting from the Kac-Zwanzig Hamiltonian model generating Brownian motion, we show how Galilean invariance is broken during the coarse-graining procedure when deriving stochastic equations. Our analysis leads to a set of rules characterizing systems in different inertial frames that have to be satisfied by general stochastic models, which we call "weak Galilean invariance." Several well-known stochastic processes are invariant in these terms, except the continuous-time random walk for which we derive the correct invariant description. Our results are particularly relevant for the modeling of biological systems, as they provide a theoretical principle to select physically consistent stochastic models before a validation against experimental data.
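A very simple numerical illustration of the issue at stake (not the authors' derivation, nor their precise invariance conditions): simulate overdamped Brownian motion with drift in one frame, apply a Galilean boost x' = x - u t, and check that the increment statistics change only through the drift while the diffusive part is unchanged. All parameter values are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

dt, n_steps = 1e-3, 100_000
drift, diffusion = 1.0, 0.5          # assumed drift velocity and diffusion constant
u_boost = 2.0                        # relative velocity of the second inertial frame

# Overdamped Langevin dynamics: dx = drift*dt + sqrt(2*D*dt)*xi
increments = drift * dt + np.sqrt(2 * diffusion * dt) * rng.standard_normal(n_steps)
t = dt * np.arange(1, n_steps + 1)
x = np.cumsum(increments)

# Galilean boost of the trajectory: x' = x - u*t
x_boosted = x - u_boost * t
inc_boosted = np.diff(np.concatenate(([0.0], x_boosted)))

print(f"lab frame:     drift = {increments.mean()/dt:+.3f}, D_est = {increments.var()/(2*dt):.3f}")
print(f"boosted frame: drift = {inc_boosted.mean()/dt:+.3f}, D_est = {inc_boosted.var()/(2*dt):.3f}")
# The drift shifts by -u_boost while the estimated diffusion constant is unchanged,
# which is the kind of frame-consistency a coarse-grained stochastic model must respect.
```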
NASA Astrophysics Data System (ADS)
Lanzalaco, Felix; Pissanetzky, Sergio
2013-12-01
A recent theory of physical information based on the fundamental principles of causality and thermodynamics has proposed that a large number of observable life and intelligence signals can be described in terms of Causal Mathematical Logic (CML), which is proposed to encode the natural principles of intelligence across any physical domain and substrate. We attempt to expound the current definition of CML, the "action functional", as a theory in terms of its ability to provide superior explanatory power for the current neuroscientific data used to measure the mammalian brain's "intelligence" processes at their most general biophysical level. Brain simulation projects define their success partly in terms of the emergence of "non-explicitly programmed" complex biophysical signals such as self-oscillation and spreading cortical waves. Here we propose to extend the causal theory to predict and guide the understanding of these more complex emergent "intelligence signals". To achieve this we review whether causal logic is consistent with, can explain and can predict the function of complete perceptual processes associated with intelligence. Primarily those are defined as the range of event-related potentials (ERP), which include their primary subcomponents: event-related desynchronization (ERD) and event-related synchronization (ERS). This approach aims at a universal and predictive logic for neurosimulation and AGI. The result of this investigation has produced a general "Information Engine" model from translation of the ERD and ERS. The CML algorithm, run in terms of action cost, predicts ERP signal contents and is consistent with the fundamental laws of thermodynamics. A working substrate-independent natural information logic would be a major asset. An information theory consistent with fundamental physics can be an AGI. It can also operate within genetic information space and provides a roadmap to understand the live biophysical operation of the phenotype.
Transmission of chirality through space and across length scales
NASA Astrophysics Data System (ADS)
Morrow, Sarah M.; Bissette, Andrew J.; Fletcher, Stephen P.
2017-05-01
Chirality is a fundamental property and vital to chemistry, biology, physics and materials science. The ability to use asymmetry to operate molecular-level machines or macroscopically functional devices, or to give novel properties to materials, may address key challenges at the heart of the physical sciences. However, how chirality at one length scale can be translated to asymmetry at a different scale remains poorly understood. In this Review, we discuss systems where chiral information is translated across length scales and through space. A variety of synthetic systems involve the transmission of chiral information between the molecular-, meso- and macroscales. We show how fundamental stereochemical principles may be used to design and understand nanoscale chiral phenomena and highlight important recent advances relevant to nanotechnology. The survey reveals that while the study of stereochemistry on the nanoscale is a rich and dynamic area, our understanding of how to control and harness it and dial up specific properties is still in its infancy. The long-term goal of controlling nanoscale chirality promises to be an exciting journey, revealing insight into biological mechanisms and providing new technologies based on dynamic physical properties.
End-Directedness and Context in Nonliving Dissipative Systems
NASA Astrophysics Data System (ADS)
Dixon, James A.; Kay, Bruce A.; Davis, Tehran J.; Kondepudi, Dilip
Biological organisms are distinguished from non-living systems, in part, by their ability to choose and strive towards particular ends. This end-directed behavior is seen across all five biological kingdoms, from single-celled organisms to the most advanced primates. The ubiquitous nature of end-directedness, across such a wide variety of biological entities, suggests that a deeper principle may be at work. We propose that end-directedness, rather than being a special ability of living systems, is actually a fundamental property of a larger class of physical systems, called dissipative structures, which are formed and maintained by the flow of energy and matter. Our work shows that dissipative structures "behave so as to persist", seeking states that increase their rate of entropy production, and thus facilitate their own persistence. In addition, we suggest that biological entities create their exquisite sensitivity to context by interweaving this fundamental end-directedness with the contextual and physical constraints of their environments. The result is a repertoire of complex behavior. We provide an example of such complex behavior emerging from contextual and physical constraints coupled with end-directedness.
Plasma inverse transition acceleration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Ming
It can be proved fundamentally from the reciprocity theorem with which electromagnetism is endowed that corresponding to each spontaneous process of radiation by a charged particle there is an inverse process which defines a unique acceleration mechanism, from Cherenkov radiation to inverse Cherenkov acceleration (ICA) [1], from Smith-Purcell radiation to inverse Smith-Purcell acceleration (ISPA) [2], and from undulator radiation to inverse undulator acceleration (IUA) [3]. There is no exception. Yet, for nearly 30 years after each of the aforementioned inverse processes has been clarified for laser acceleration, inverse transition acceleration (ITA), despite speculation [4], has remained the least understood, and above all, no practical implementation of ITA has been found, until now. Unlike all its counterparts, in which phase synchronism is established one way or the other such that a particle can continuously gain energy from an acceleration wave, the ITA to be discussed here, termed plasma inverse transition acceleration (PITA), operates under a fundamentally different principle. As a result, the discovery of PITA has been delayed for decades, waiting for a conceptual breakthrough in accelerator physics: the principle of alternating gradient acceleration [5, 6, 7, 8, 9, 10]. In fact, PITA was invented [7, 8] as one of several realizations of the new principle.
Biological life-support systems for Mars mission.
Gitelson, J I
1992-01-01
A Mars mission, like the Lunar base, is the first venture to maintain human life beyond the Earth's biosphere. So far, all manned space missions, including the longest ones, have used stocked reserves and cannot be considered an egress from the biosphere. The conventional path proposed by technology for a Martian mission LSS is to use physical-chemical approaches proven by the experience of astronautics. But the problem of humans living beyond the limits of the Earth's biosphere can be fundamentally solved by creating a closed ecosystem for them. The optimum choice for a Mars mission LSS can be substantiated by comparing the merits and demerits of physical-chemical and biological principles, without ruling out a possible compromise between them. This work gives a comparative analysis of ecological and physical-chemical principles for LSS. Taking into consideration the universal significance of ecological problems, with artificial LSS as a particular case of their solution, and the complexity and high cost of large-scale experiments with manned LSS, it would be expedient for this work to have the status of an International Program open for others to join. A program for making artificial biospheres, based on preceding experience and an analysis of the current situation, is proposed.
Quantum speed limits: from Heisenberg’s uncertainty principle to optimal quantum control
NASA Astrophysics Data System (ADS)
Deffner, Sebastian; Campbell, Steve
2017-11-01
One of the most widely known building blocks of modern physics is Heisenberg’s indeterminacy principle. Among the different statements of this fundamental property of the full quantum mechanical nature of physical reality, the uncertainty relation for energy and time has a special place. Its interpretation and its consequences have inspired continued research efforts for almost a century. In its modern formulation, the uncertainty relation is understood as setting a fundamental bound on how fast any quantum system can evolve. In this topical review we describe important milestones, such as the Mandelstam-Tamm and the Margolus-Levitin bounds on the quantum speed limit, and summarise recent applications in a variety of current research fields—including quantum information theory, quantum computing, and quantum thermodynamics amongst several others. To bring order and to provide an access point into the many different notions and concepts, we have grouped the various approaches into the minimal time approach and the geometric approach, where the former relies on quantum control theory, and the latter arises from measuring the distinguishability of quantum states. Due to the volume of the literature, this topical review can only present a snapshot of the current state-of-the-art and can never be fully comprehensive. Therefore, we highlight but a few works hoping that our selection can serve as a representative starting point for the interested reader.
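For concreteness, the two bounds named above can be evaluated directly: the Mandelstam-Tamm time pi*hbar/(2*DeltaE) and the Margolus-Levitin time pi*hbar/(2*E), with the mean energy E measured from the ground state; the unified quantum speed limit is their maximum. The energy values in the sketch below are arbitrary example inputs.

```python
import math

hbar = 1.054571817e-34      # J*s

def quantum_speed_limit(delta_E, mean_E):
    """Minimal time (s) to evolve to an orthogonal state, given the energy spread
    delta_E and the mean energy above the ground state mean_E (both in joules)."""
    t_mt = math.pi * hbar / (2 * delta_E)   # Mandelstam-Tamm bound
    t_ml = math.pi * hbar / (2 * mean_E)    # Margolus-Levitin bound
    return max(t_mt, t_ml), t_mt, t_ml

# Example values (assumed): energies of order 1 eV
eV = 1.602176634e-19
t_qsl, t_mt, t_ml = quantum_speed_limit(delta_E=0.5 * eV, mean_E=1.0 * eV)
print(f"Mandelstam-Tamm: {t_mt:.2e} s, Margolus-Levitin: {t_ml:.2e} s, QSL: {t_qsl:.2e} s")
```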
Introducing Filters and Amplifiers Using a Two-Channel Light Organ
NASA Astrophysics Data System (ADS)
Zavrel, Erik; Sharpsteen, Eric
2015-11-01
In an era when many students carry iPods, iPhones, and iPads, physics teachers are realizing that in order to continue to inspire and convey the amazing things made possible by a few fundamental principles, they must expand laboratory coverage of electricity and circuits beyond the conventional staples of constructing series and parallel arrangements of light bulbs and confirming Kirchhoff's laws. Indeed, physics teachers are already incorporating smartphones into their laboratory activities in an effort to convey concepts in a more contemporary and relatable manner. As part of Cornell's Learning Initiative in Medicine and Bioengineering (CLIMB), we set out to design and implement an engaging curriculum to introduce high school physics students to filters and amplifiers.
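A common way to build the two channels of such a light organ is with first-order RC filters: a low-pass channel that responds to bass and a high-pass channel that responds to treble, each with cutoff f_c = 1/(2*pi*R*C). The component values below are illustrative assumptions, not the values used in the CLIMB activity.

```python
import math

def rc_cutoff_hz(R_ohm, C_farad):
    """Cutoff (corner) frequency of a first-order RC filter."""
    return 1.0 / (2 * math.pi * R_ohm * C_farad)

def first_order_gain(f_hz, f_c, kind):
    """Magnitude response of an ideal first-order filter at frequency f_hz."""
    ratio = f_hz / f_c
    if kind == "lowpass":
        return 1.0 / math.sqrt(1.0 + ratio**2)
    return ratio / math.sqrt(1.0 + ratio**2)   # highpass

# Assumed component values for the two channels
f_bass = rc_cutoff_hz(R_ohm=10e3, C_farad=100e-9)    # ~159 Hz low-pass channel
f_treble = rc_cutoff_hz(R_ohm=3.3e3, C_farad=22e-9)  # ~2.2 kHz high-pass channel

for f in (60, 250, 1000, 4000):
    print(f"{f:5d} Hz: bass gain = {first_order_gain(f, f_bass, 'lowpass'):.2f}, "
          f"treble gain = {first_order_gain(f, f_treble, 'highpass'):.2f}")
```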
Effective Lagrangian in de Sitter spacetime
NASA Astrophysics Data System (ADS)
Kitamoto, Hiroyuki; Kitazawa, Yoshihisa
2017-01-01
Scale-invariant fluctuations of the metric are a universal feature of quantum gravity in de Sitter spacetime. We construct an effective Lagrangian which summarizes their implications for local physics by integrating superhorizon metric fluctuations. It shows that infrared quantum effects are local and render the fundamental couplings time dependent. We impose Lorentz invariance on the effective Lagrangian, as required by the principle of general covariance. We show that such a requirement leads to unique physical predictions by fixing the quantization ambiguities. We explain how the gauge parameter dependence of observables is canceled. In particular, the relative evolution speeds of the couplings are shown to be gauge invariant.
Nuclear Physics Made Very, Very Easy
NASA Technical Reports Server (NTRS)
Hanlen, D. F.; Morse, W. J.
1968-01-01
The fundamental approach to nuclear physics was prepared to introduce basic reactor principles to various groups of non-nuclear technical personnel associated with NERVA Test Operations. NERVA Test Operations functions as the field test group for the Nuclear Rocket Engine Program. The Nuclear Engine for Rocket Vehicle Application (NERVA) program is the combined effort of Aerojet-General Corporation as prime contractor, and Westinghouse Astronuclear Laboratory as the major subcontractor, for the assembly and testing of nuclear rocket engines. Development of the NERVA Program is under the direction of the Space Nuclear Propulsion Office, a joint agency of the U.S. Atomic Energy Commission and the National Aeronautics and Space Administration.
NASA Astrophysics Data System (ADS)
Decremps, F.; Belliard, L.; Couzinet, B.; Vincent, S.; Munsch, P.; Le Marchand, G.; Perrin, B.
2009-07-01
Recent improvements in measuring the ultrasonic sound velocities of liquids under extreme conditions are described. The principle and feasibility of picosecond acoustics for liquids embedded in a diamond anvil cell are given. To illustrate the capability of these advances in the sound velocity measurement technique, original high pressure and high temperature results on the sound velocity of liquid mercury up to 5 GPa and 575 K are presented. This high pressure technique will certainly be useful in several fundamental and applied problems in physics and many other fields such as geophysics, nonlinear acoustics, underwater sound, petrology or physical acoustics.
Student Use of Physics to Make Sense of Incomplete but Functional VPython Programs in a Lab Setting
NASA Astrophysics Data System (ADS)
Weatherford, Shawn A.
2011-12-01
Computational activities in Matter & Interactions, an introductory calculus-based physics course, have the instructional goal of providing students with the experience of applying the same small set of fundamental principles to model a wide range of physical systems. However, there are significant instructional challenges for students to build computer programs under limited time constraints, especially for students who are unfamiliar with programming languages and concepts. Prior attempts at designing effective computational activities were successful at having students ultimately build working VPython programs under the tutelage of experienced teaching assistants in a studio lab setting. A pilot study revealed that students who completed these computational activities had significant difficulty repeating the exact same tasks and, further, had difficulty predicting the animation that would be produced by the example program after interpreting the program code. This study explores the interpretation and prediction tasks as part of an instructional sequence where students are asked to read and comprehend a functional, but incomplete, program. Rather than asking students to begin their computational tasks by modifying program code, we explicitly ask students to interpret an existing program that is missing key lines of code. The missing lines of code correspond to the algebraic form of fundamental physics principles or the calculation of forces which would exist between analogous physical objects in the natural world. Students are then asked to draw a prediction of what they would see in the simulation produced by the VPython program and ultimately run the program to evaluate their prediction. This study specifically looks at how the participants use physics while interpreting the program code and creating a whiteboard prediction. This study also examines how students evaluate their understanding of the program and modification goals at the beginning of the modification task. While working in groups over the course of a semester, study participants were recorded while they completed three activities using these incomplete programs. Analysis of the video data showed that study participants had little difficulty interpreting physics quantities, generating a prediction, or determining how to modify the incomplete program. Participants did not base their prediction solely on the information in the incomplete program. When participants tried to predict the motion of the objects in the simulation, many turned to their knowledge of how the system would evolve if it represented an analogous real-world physical system. For example, participants attributed the real-world behavior of springs to helix objects even though the program did not include calculations for the spring to exert a force when stretched. Participants rarely interpreted lines of code in the computational loop during the first computational activity, but this changed during later computational activities, with most participants using their physics knowledge to interpret the computational loop. Computational activities in the Matter & Interactions curriculum were revised in light of these findings to include an instructional sequence of tasks to build a comprehension of the example program. The modified activities also ask students to create an additional whiteboard prediction for the time-evolution of the real-world phenomena which the example program will eventually model.
This thesis shows how comprehension tasks identified by Palincsar and Brown (1984) as effective in improving reading comprehension are also effective in helping students apply their physics knowledge to interpret a computer program which attempts to model a real-world phenomenon and identify errors in their understanding of the use, or omission, of fundamental physics principles in a computational model.
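To give a feel for the kind of "incomplete but functional" program the study describes, the sketch below mimics a Matter & Interactions-style iterative loop for a mass on a spring in plain Python (no VPython dependency, so it stays self-contained); the line applying the momentum principle is the sort of line the activities leave for students to supply. The specific values and variable names are illustrative, not taken from the study's materials.

```python
# Plain-Python sketch of a Matter & Interactions style iterative model (1D mass on a spring).
# In the activities described above, a line such as the momentum-principle update would be
# omitted and students asked to predict, then supply, it.

m = 0.5          # kg, assumed mass
k_s = 4.0        # N/m, assumed spring stiffness
L0 = 0.3         # m, relaxed length of the spring
x = 0.45         # m, initial position of the mass (spring stretched)
p = 0.0          # kg*m/s, initial momentum

dt = 0.01        # s, time step
for step in range(300):
    stretch = x - L0
    F = -k_s * stretch              # spring force (the kind of line students must add)
    p = p + F * dt                  # momentum principle: dp = F_net * dt
    x = x + (p / m) * dt            # position update from velocity p/m
    if step % 60 == 0:
        print(f"t = {step*dt:4.2f} s, x = {x:.3f} m, p = {p:+.3f} kg m/s")
```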
Do Racial and Gender Disparities Exist in Newer Glaucoma Treatments?
Recent Advances and Future Prospects in Fundamental Symmetries
NASA Astrophysics Data System (ADS)
Plaster, Brad
2017-09-01
A broad program of initiatives in fundamental symmetries seeks answers to several of the most pressing open questions in nuclear physics, ranging from the scale of the neutrino mass, to the particle-antiparticle nature of the neutrino, to the origin of the matter-antimatter asymmetry, to the limits of Standard Model interactions. Although the experimental program is quite broad, with efforts ranging from precision measurements of neutrino properties; to searches for electric dipole moments; to precision measurements of magnetic dipole moments; and to precision measurements of couplings, particle properties, and decays; all of these seemingly disparate initiatives are unified by several common threads. These include the use and exploitation of symmetry principles, novel cross-disciplinary experimental work at the forefront of the precision frontier, and the need for accompanying breakthroughs in development of the theory necessary for an interpretation of the anticipated results from these experiments. This talk will highlight recent accomplishments and advances in fundamental symmetries and point to the extraordinary level of ongoing activity aimed at realizing the development and interpretation of next-generation experiments. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics, under Award Number DE-SC-0014622.
Leibniz on the metaphysical foundation of physics
NASA Astrophysics Data System (ADS)
Temple, Daniel R.
This thesis examines how and why Leibniz felt that physics must be grounded in metaphysics. I argue that one of the strongest motivations Leibniz had for attempting to ground physics in metaphysics was his concern over the problem of induction. Even in his early writings, Leibniz was well aware of the problem of induction and how this problem threatened the very possibility of physics. Both his early and later theories of truth are geared towards solving this deep problem in the philosophy of science. In his early theory of truth, all truths are ultimately grounded in (but not necessarily reducible to) an identity. Hence, all truths are ultimately based in logic. Consequently, the problem of induction is seemingly solved since everything that happens, happens with the force of logical necessity. Unfortunately, this theory is incompatible with Leibniz's theory of possible worlds and hence jeopardizes the liberty of God. In Leibniz's later theory of truth, Leibniz tries to overcome this weakness by acknowledging truths that are grounded in the free but moral necessity of God's actions. Since God's benevolence is responsible for the actualization of this world, this world must possess rational laws. Furthermore, since God's rationality ensures that everything obeys the principle of sufficient reason, we can use this principle to determine the fundamental laws of the universe. Leibniz himself attempts to derive these laws using this principle. Kant attempted to continue this work of securing the possibility of science, and the problems he encountered helped to shape his critical philosophy. I conclude with a comparative analysis of Leibniz and Kant on the foundations of physics.
Quasiparticle band structures and interface physics of SnS and GeS
NASA Astrophysics Data System (ADS)
Malone, Brad; Kaxiras, Efthimios
2013-03-01
Orthorhombic SnS and GeS are layered materials made of earth-abundant elements which have the potential to play a useful role in the massive scale-up of renewable power necessary by 2050 to avoid unmanageable levels of climate change. We report on first-principles calculations of the quasiparticle spectra of these two materials, predicting the type and magnitude of the fundamental band gap, a quantity which shows a strong degree of scatter in the experimental literature. Additionally, in order to evaluate the possible role of GeS as an electron-blocking layer in a SnS-based photovoltaic device, we investigate the band offsets of the interfaces between these materials along the three principal crystallographic directions. We find that while the valence-band offsets are similar along the three principal directions, the conduction-band offsets display a substantial amount of anisotropy.
Ethical principles in health research and review process.
Tangwa, Godfrey B
2009-11-01
In this paper I want to reflect on the fundamental ethical principles and their application in different particular contexts, especially in health research and the ethics review process. Four fundamental ethical principles have been identified and widely discussed in the bioethical literature. These principles are: autonomy or respect for others, beneficence, non-maleficence and justice. These principles have cross-cultural validity, relevance and applicability. Every real-life situation and every concrete particular case in which ethical decision-making is called for is unique and different from all others; but the same fundamental ethical principles are relevant and used in addressing all such cases and situations. Very often ethical problems will present themselves in the form of dilemmas, and it is then necessary to use the same fundamental principles to analyze the situations and to argue persuasively, cogently and with competence for the best options or choices in such situations. The issues I will be dealing with in this paper are necessarily more abstract and theoretical, but we will be discussing them from a very practical viewpoint and impulse, with a view to application in concrete real-life situations. The paper ends with some practical sample cases that the reader can use to test his/her grasp of the principles, how to apply them, how to balance them in differing situations and contexts, and how to adjudicate between them when they seem to be in conflict.
Fischer, Andreas
2016-11-01
Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are either based on a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear if one principle allows lower uncertainties to be achieved or if both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality either with Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
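To make the stated scaling explicit: the abstract reports that the fundamental velocity uncertainty grows as |v|^(3/2) and falls as the square root of the scattered light power. The sketch below simply encodes that proportionality with an arbitrary prefactor (a hypothetical placeholder, not the physical constant from the paper), so only the relative scaling between cases is meaningful.

```python
def relative_velocity_uncertainty(v, scattered_power, prefactor=1.0):
    """Relative (arbitrary-unit) uncertainty limit following the scaling stated in the
    abstract: proportional to |v|**1.5 and to 1/sqrt(scattered light power).
    The prefactor is a placeholder, NOT the constant derived in the paper."""
    return prefactor * abs(v) ** 1.5 / scattered_power ** 0.5

base = relative_velocity_uncertainty(v=10.0, scattered_power=1.0)
for v, P in [(10.0, 1.0), (20.0, 1.0), (10.0, 4.0)]:
    u = relative_velocity_uncertainty(v, P)
    print(f"v = {v:5.1f}, P = {P:3.1f}: limit / baseline = {u / base:.2f}")
```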
Experimentally probing topological order and its breakdown through modular matrices
NASA Astrophysics Data System (ADS)
Luo, Zhihuang; Li, Jun; Li, Zhaokai; Hung, Ling-Yan; Wan, Yidun; Peng, Xinhua; Du, Jiangfeng
2018-02-01
The modern concept of phases of matter has undergone tremendous developments since the first observation of topologically ordered states in fractional quantum Hall systems in the 1980s. In this paper, we explore the following question: in principle, how much detail of the physics of topological orders can be observed using state of the art technologies? We find that using surprisingly little data, namely the toric code Hamiltonian in the presence of generic disorders and detuning from its exactly solvable point, the modular matrices--characterizing anyonic statistics that are some of the most fundamental fingerprints of topological orders--can be reconstructed with very good accuracy solely by experimental means. This is an experimental realization of these fundamental signatures of a topological order, a test of their robustness against perturbations, and a proof of principle--that current technologies have attained the precision to identify phases of matter and, as such, probe an extended region of phase space around the soluble point before its breakdown. Given the special role of anyonic statistics in quantum computation, our work promises myriad applications both in probing and realistically harnessing these exotic phases of matter.
Some Fundamental Molecular Mechanisms of Contractility in Fibrous Macromolecules
Mandelkern, L.
1967-01-01
The fundamental molecular mechanisms of contractility and tension development in fibrous macromolecules are developed from the point of view of the principles of polymer physical chemistry. The problem is treated in a general manner to encompass the behavior of all macromolecular systems irrespective of their detailed chemical structure and particular function, if any. Primary attention is given to the contractile process which accompanies the crystal-liquid transition in axially oriented macromolecular systems. The theoretical nature of the process is discussed, and many experimental examples are given from the literature which demonstrate the expected behavior. Experimental attention is focused on the contraction of fibrous proteins, and the same underlying molecular mechanism is shown to be operative for a variety of different systems. PMID:6050598
An experimental approach to the fundamental principles of hemodynamics.
Pontiga, Francisco; Gaytán, Susana P
2005-09-01
An experimental model has been developed to give students hands-on experience with the fundamental laws of hemodynamics. The proposed experimental setup is of simple construction but permits precise measurement of the physical variables involved. The model consists of a series of experiments in which different basic phenomena are quantitatively investigated, such as the pressure drop in a long straight vessel and in an obstructed vessel, the transition from laminar to turbulent flow, the association of vessels in vascular networks, and the generation of a critical stenosis. Through these experiments, students acquire a direct appreciation of the importance of the parameters involved in the relationship between pressure and flow rate, thus facilitating the comprehension of more complex problems in hemodynamics.
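A minimal numerical sketch of two of the phenomena listed above (the Poiseuille pressure drop in a long straight vessel and the laminar-turbulent transition via the Reynolds number); the dimensions and flow rate below are illustrative values, not those of the authors' apparatus:

```python
import math

# Illustrative parameters (not the authors' setup)
rho = 1050.0      # fluid density, kg/m^3 (blood-like)
mu = 3.5e-3       # dynamic viscosity, Pa*s
L = 0.20          # vessel length, m
r = 2.0e-3        # vessel radius, m
Q = 5.0e-6        # volumetric flow rate, m^3/s

# Hagen-Poiseuille law: pressure drop for laminar flow in a straight tube
dP = 8.0 * mu * L * Q / (math.pi * r**4)

# Mean velocity and Reynolds number (transition typically near Re ~ 2000-2300)
A = math.pi * r**2
v = Q / A
Re = rho * v * (2.0 * r) / mu

print(f"Pressure drop: {dP:.1f} Pa")
print(f"Mean velocity: {v:.3f} m/s, Reynolds number: {Re:.0f}")
print("Flow regime:", "laminar" if Re < 2000 else "transitional/turbulent")
```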
Limits on fundamental limits to computation.
Markov, Igor L
2014-08-14
An indispensable part of our personal and working lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the past fifty years. Such Moore scaling now requires ever-increasing efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and increase our understanding of integrated-circuit scaling, here I review fundamental limits to computation in the areas of manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, I recapitulate how some limits were circumvented, and compare loose and tight limits. Engineering difficulties encountered by emerging technologies may indicate yet unknown limits.
Applications of the principle of maximum entropy: from physics to ecology.
Banavar, Jayanth R; Maritan, Amos; Volkov, Igor
2010-02-17
There are numerous situations in physics and other disciplines which can be described at different levels of detail in terms of probability distributions. Such descriptions arise either intrinsically as in quantum mechanics, or because of the vast amount of details necessary for a complete description as, for example, in Brownian motion and in many-body systems. We show that an application of the principle of maximum entropy for estimating the underlying probability distribution can depend on the variables used for describing the system. The choice of characterization of the system carries with it implicit assumptions about fundamental attributes such as whether the system is classical or quantum mechanical or equivalently whether the individuals are distinguishable or indistinguishable. We show that the correct procedure entails the maximization of the relative entropy subject to known constraints and, additionally, requires knowledge of the behavior of the system in the absence of these constraints. We present an application of the principle of maximum entropy to understanding species diversity in ecology and introduce a new statistical ensemble corresponding to the distribution of a variable population of individuals into a set of species not defined a priori.
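A sketch of the procedure described above, written in the standard maximum-relative-entropy form (the notation is generic, not the authors'): given a prior q_i describing the system in the absence of constraints, one maximizes the relative entropy subject to normalization and to the known expectation values,

$$ S[p\,\|\,q] = -\sum_i p_i \ln\frac{p_i}{q_i}, \qquad \sum_i p_i = 1, \qquad \sum_i p_i f_i = \langle f \rangle, $$

which yields p_i = q_i exp(-lambda f_i)/Z(lambda) with Z(lambda) = sum_i q_i exp(-lambda f_i); the choice of the prior q_i is where assumptions such as distinguishable versus indistinguishable individuals enter.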
The highly intelligent virtual agents for modeling financial markets
NASA Astrophysics Data System (ADS)
Yang, G.; Chen, Y.; Huang, J. P.
2016-02-01
Researchers have borrowed many theories from statistical physics, like ensembles and the Ising model, to study complex adaptive systems through agent-based modeling. However, one fundamental difference between entities (such as spins) in physics and micro-units in complex adaptive systems is that the latter usually possess high intelligence, such as investors in financial markets. Although highly intelligent virtual agents are essential for agent-based modeling to play a full role in the study of complex adaptive systems, how to create such agents is still an open question. Hence, we propose three principles for designing high artificial intelligence in financial markets and then build a specific class of agents called iAgents based on these three principles. Finally, we evaluate the intelligence of iAgents through virtual index trading in two different stock markets. For comparison, we also include three other types of agents in this contest, namely, random traders, agents from the wealth game (modified from the famous minority game), and agents from an upgraded wealth game. As a result, iAgents perform the best, which gives strong support for the three principles. This work offers a general framework for the further development of agent-based modeling for various kinds of complex adaptive systems.
Consequences of Irreversibility in Fundamental Models of Transcription
NASA Astrophysics Data System (ADS)
Sevier, Stuart; Levine, Herbert
2015-03-01
The ability to watch biochemical events play out at the single-molecule level has led to the discovery that transcription occurs in a noisy, ``bursty'' manner. Recently, as the single-molecule lens is placed over a larger number of organisms and genes, relationships between mean expression and noise beyond the ``bursty'' paradigm have emerged. Through a master-equation formulation of transcription we have found that many powerful physical principles relating to irreversibility seem to play a central role in the newly uncovered trends. Specifically, the relationships between mean expression and noise appear to be a direct consequence of network currents. We discuss how emphasizing the underlying principles in the models can explain recent experimental data and lead to a generalized view of transcription.
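For orientation, a minimal sketch of the ``bursty'' baseline the abstract refers to (the standard two-state telegraph model, not the authors' specific irreversible network): mRNA is produced at rate k_m while the promoter is ON, the promoter switches ON/OFF at rates k_on and k_off, and transcripts degrade at rate gamma. In the bursty limit (k_off >> k_on, mean burst size b = k_m/k_off) the steady-state mean and Fano factor are approximately

$$ \langle m \rangle \approx \frac{k_m}{\gamma}\,\frac{k_{\mathrm{on}}}{k_{\mathrm{on}}+k_{\mathrm{off}}}, \qquad \frac{\sigma_m^2}{\langle m \rangle} \approx 1 + b. $$

Deviations from this mean-noise relationship are the kind of trend that the master-equation treatment with irreversible network currents aims to explain.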
Chemical evolution and the origin of life
NASA Technical Reports Server (NTRS)
Oro, J.
1983-01-01
A review is presented of recent advances made in the understanding of the formation of carbon compounds in the universe and the occurrence of processes of chemical evolution. Topics discussed include the principle of evolutionary continuity, evolution as a fundamental principle of the physical universe, the nuclear synthesis of biogenic elements, organic cosmochemistry and interstellar molecules, the solar nebula and the solar system in chemical evolution, the giant planets and Titan in chemical evolution, and comets and their interaction with the earth. Also examined are carbonaceous chondrites, environment of the primitive earth, energy sources available on the primitive earth, the synthesis of biochemical monomers and oligomers, the abiotic transcription of nucleotides, unified prebiotic and enzymatic mechanisms, phospholipids and membranes, and protobiological evolution.
NASA Astrophysics Data System (ADS)
Stapp, Henry P.
2011-11-01
The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level, in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were to be biased in this way then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearances of backward-time-effects of the kind that have been reported in the scientific literature.
Testing Our Fundamental Assumptions
NASA Astrophysics Data System (ADS)
Kohler, Susanna
2016-06-01
Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for such tests.
[Artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones]
Explaining different arrival times. Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics:
1. Intrinsic delay: the photons may simply have been emitted at two different times by the astrophysical source.
2. Delay due to Lorentz invariance violation: perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect.
3. Special-relativistic delay: maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong. This, too, would cause photon velocities to be energy-dependent.
4. Delay due to gravitational potential: perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies. This would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect.
If we now turn this problem around, then by measuring the arrival-time delay between photons of different energies from various astrophysical sources (the further away, the better), we can place constraints on these fundamental assumptions. A recent focus set in the Astrophysical Journal Letters, titled Focus on Exploring Fundamental Physics with Extragalactic Transients, consists of multiple published studies doing just that.
Testing general relativity. Several of the articles focus on the fourth point above. By assuming that the delay in photon arrival times is due only to the gravitational potential of the Milky Way, these studies set constraints on the deviation of our galaxy's gravitational potential from what GR would predict. The study by He Gao et al. uses the different photon arrival times from gamma-ray bursts to set constraints at eV-GeV energies, and the study by Jun-Jie Wei et al. complements this by setting constraints at keV-TeV energies using photons from high-energy blazar emission.
[Photons or neutrinos from different extragalactic transients each set different upper limits on delta gamma, the post-Newtonian parameter, vs. particle energy or frequency. This is a test of Einstein's equivalence principle: if the principle is correct, delta gamma would be exactly zero, meaning that photons of different energies move at the same velocity through a vacuum. Tingay & Kaplan 2016]
S. J. Tingay and D. L. Kaplan make the case that measuring the time delay of photons from fast radio bursts (FRBs; transient radio pulses that last only a few milliseconds) will provide even tighter constraints, if we are able to accurately determine distances to these FRBs. And Adi Nusser argues that the large-scale structure of the universe plays an even greater role than the Milky Way's gravitational potential, allowing for even stricter testing of Einstein's equivalence principle. The ever-narrower constraints from these studies all support GR as a correct set of rules through which to interpret our universe.
Other tests of fundamental physics. In addition to the above tests, Xue-Feng Wu et al. show that FRBs can be used to provide severe constraints on the rest mass of the photon, and S. Croft et al. even touch on what we might learn from transients using multi-messenger astrophysics (astrophysics involving observations of particles besides photons, such as neutrinos or gravitational waves). In general, extragalactic transients provide a rich prospect for better understanding the laws that govern the universe. Check out the entire focus set below to learn more about the tests of fundamental physics that can be done with observations of extragalactic transients.
Citation. Focus Set: Focus on Exploring Fundamental Physics With Extragalactic Transients.
He Gao et al. 2015 ApJ 810 121. doi:10.1088/0004-637X/810/2/121
Jun-Jie Wei et al. 2016 ApJ 818 L2. doi:10.3847/2041-8205/818/1/L2
S. Croft et al. 2016 ApJ 820 L24. doi:10.3847/2041-8205/820/2/L24
S. J. Tingay and D. L. Kaplan 2016 ApJ 820 L31. doi:10.3847/2041-8205/820/2/L31
Adi Nusser 2016 ApJ 821 L2. doi:10.3847/2041-8205/821/1/L2
Xue-Feng Wu et al. 2016 ApJ 822 L15. doi:10.3847/2041-8205/822/1/L15
NASA Technical Reports Server (NTRS)
Zuk, J.
1976-01-01
The fundamental principles governing dynamic sealing operation are discussed. Different seals are described in terms of these principles. Despite the large variety of detailed constructions, there appear to be some basic principles, or combinations of basic principles, by which all seals function; these are presented and discussed. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin-film seals, which enable leakage calculations to be made, are also presented. The concept of flow functions, the application of the Reynolds lubrication equation, flow outside the lubrication-equation regime, friction and wear, and seal lubrication regimes are explained.
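For reference, a hedged sketch of the kind of thin-film relations such a report builds on (standard lubrication-theory forms, not a transcription of the report's own equations): the incompressible Reynolds lubrication equation for film thickness h(x,z,t), viscosity mu, pressure p, and sliding speed U reads

$$ \frac{\partial}{\partial x}\!\left(\frac{h^3}{\mu}\frac{\partial p}{\partial x}\right) + \frac{\partial}{\partial z}\!\left(\frac{h^3}{\mu}\frac{\partial p}{\partial z}\right) = 6U\,\frac{\partial h}{\partial x} + 12\,\frac{\partial h}{\partial t}, $$

and for a uniform film of width w, length L and gap h, the laminar liquid leakage driven by a pressure difference Delta p scales as Q ~ w h^3 Delta p / (12 mu L), which is why calculated leakage is so sensitive to film thickness.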
NASA Astrophysics Data System (ADS)
Asano, Masanari; Basieva, Irina; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro
2015-10-01
We discuss foundational issues of quantum information biology (QIB)—one of the most successful applications of the quantum formalism outside of physics. QIB provides a multi-scale model of information processing in bio-systems: from proteins and cells to cognitive and social systems. This theory has to be sharply distinguished from "traditional quantum biophysics". The latter is about quantum bio-physical processes, e.g., in cells or brains. QIB models the dynamics of information states of bio-systems. We argue that the information interpretation of quantum mechanics (its various forms were elaborated by Zeilinger and Brukner, Fuchs and Mermin, and D'Ariano) is the most natural interpretation of QIB. Biologically QIB is based on two principles: (a) adaptivity; (b) openness (bio-systems are fundamentally open). These principles are mathematically represented in the framework of a novel formalism, quantum adaptive dynamics, which, in particular, contains the standard theory of open quantum systems.
Optimized Materials From First Principles Simulations: Are We There Yet?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galli, G; Gygi, F
2005-07-26
In the past thirty years, the use of scientific computing has become pervasive in all disciplines: collection and interpretation of most experimental data is carried out using computers, and physical models in computable form, with various degrees of complexity and sophistication, are utilized in all fields of science. However, full prediction of physical and chemical phenomena based on the basic laws of Nature, using computer simulations, is a revolution still in the making, and it involves some formidable theoretical and computational challenges. We illustrate the progress and successes obtained in recent years in predicting fundamental properties of materials in condensed phases and at the nanoscale, using ab-initio, quantum simulations. We also discuss open issues related to the validation of the approximate, first principles theories used in large scale simulations, and the resulting complex interplay between computation and experiment. Finally, we describe some applications, with focus on nanostructures and liquids, both at ambient and under extreme conditions.
Cardiac imaging: does radiation matter?
Einstein, Andrew J.; Knuuti, Juhani
2012-01-01
The use of ionizing radiation in cardiovascular imaging has generated considerable discussion. Radiation should not be considered in isolation, but rather in the context of a careful examination of the benefits, risks, and costs of cardiovascular imaging. Such consideration requires an understanding of some fundamental aspects of the biology, physics, epidemiology, and terminology germane to radiation, as well as principles of radiological protection. This paper offers a concise, contemporary perspective on these areas by addressing pertinent questions relating to radiation and its application to cardiac imaging. PMID:21828062
NASA Technical Reports Server (NTRS)
Reese, T. G.; Baracat, W. A.; Butner, C. L.
1986-01-01
The handbook provides a list and description of ongoing tether programs. This includes the joint U.S.-Italy demonstration project, and individual U.S. and Italian studies and demonstration programs. An overview of the current activity level and areas of emphasis in this emerging field is provided. The fundamental physical principles behind the proposed tether applications are addressed. Four basic concepts of gravity gradient, rotation, momentum exchange, and electrodynamics are discussed. Information extracted from literature, which supplements and enhances the tether applications is also presented. A bibliography is appended.
Principles of signal conditioning.
Finkel, A; Bookman, R
2001-05-01
It is rare for biological, physiological, chemical, electrical, or physical signals to be measured in the appropriate format for recording and interpretation. Usually, a signal must be conditioned to optimize it for both of these functions. This overview describes the fundamentals of signal filtering, how to prepare signals for A/D conversion, signal averaging to increase the signal-to-noise ratio, line frequency pickup (hum), peak-to-peak and rms noise measurements, blanking, audio monitoring, testing of electrodes and the common-mode rejection ratio.
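A minimal sketch of the signal-averaging idea mentioned above (the 1/sqrt(N) noise reduction); the waveform and noise level are arbitrary illustrations, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# A repeatable 'true' signal (e.g., an evoked response) sampled at 1 kHz
t = np.arange(0.0, 0.5, 1e-3)
signal = 1.0 * np.exp(-((t - 0.1) / 0.02) ** 2)   # arbitrary pulse, amplitude 1

noise_sd = 0.5           # white noise per sweep (arbitrary)
for n_sweeps in (1, 16, 256):
    sweeps = signal + rng.normal(0.0, noise_sd, size=(n_sweeps, t.size))
    avg = sweeps.mean(axis=0)
    residual_sd = (avg - signal).std()
    # Residual noise should fall roughly as noise_sd / sqrt(n_sweeps)
    print(f"N={n_sweeps:4d}: residual noise ~ {residual_sd:.3f} "
          f"(expected ~ {noise_sd / np.sqrt(n_sweeps):.3f})")
```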
Regularized Reconstruction of Dynamic Contrast-Enhanced MR Images for Evaluation of Breast Lesions
2010-09-01
We focus specifically on dynamic contrast-enhanced (DCE) magnetic resonance imaging of breast cancer patients. The fundamental challenge in dynamic MRI is ...
10 Tips to Reduce Your Chance of Losing Vision from the Most Common Cause of Blindness
2018-2019 Basic and Clinical Science Course, Section 02: Fundamentals and Principles of Ophthalmology.
Novel approaches to the study of particle dark matter in astrophysics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Argüelles, C. R., E-mail: carlos.arguelles@icranet.org; Ruffini, R., E-mail: ruffini@icra.it; Rueda, J. A., E-mail: jorge.rueda@icra.it
A deep understanding of the role of dark matter in the different astrophysical scenarios of the local Universe, such as galaxies, represents a crucial step toward describing in a more consistent way the role of dark matter in cosmology. This kind of study requires the interconnection between particle physics within and beyond the Standard Model and fundamental physics such as thermodynamics and statistics, within a fully relativistic treatment of gravity. After giving a comprehensive summary of the different types of dark matter and their role in astrophysics, we discuss the recent efforts in describing the distribution of dark matter in the center and halo of galaxies from first principles such as gravitational interactions, quantum statistics and particle physics, and its implications for the observations.
Hall thruster with grooved walls
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li Hong; Ning Zhongxi; Yu Daren
2013-02-28
Axially oriented and azimuthally distributed grooves form on the channel walls of a Hall thruster after the engine undergoes long-term operation. Existing studies have demonstrated the relation between the grooves and near-wall physics, such as the sheath and electron near-wall transport. The idea of optimizing thruster performance with such grooves has also been proposed. Therefore, this paper is devoted to exploring the effects of wall grooves on the discharge characteristics of a Hall thruster. With experimental measurements, the variations in electron conductivity, ionization distribution, and integrated performance are obtained. The involved physical mechanisms are then analyzed and discussed. The findings help not only to better understand the working principle of Hall thruster discharge but also to establish a physical foundation for subsequent optimization with artificial grooves.
Fundamental Tactical Principles of Soccer: A Comparison of Different Age Groups
Guilherme, José; Rechenchosky, Leandro; da Costa, Luciane Cristina Arantes; Rinadi, Wilson
2017-01-01
Abstract The fundamental tactical principles of the game of soccer represent a set of action rules that guide behaviours related to the management of game space. The aim of this study was to compare the performance of fundamental offensive and defensive tactical principles among youth soccer players from 12 to 17 years old. The sample consisted of 3689 tactical actions performed by 48 soccer players in three age categories: under 13 (U-13), under 15 (U-15), and under 17 (U-17). Tactical performance was measured using the System of Tactical Assessment in Soccer (FUT-SAT). The Kruskal Wallis, Mann-Whitney U, Friedman, Wilcoxon, and Cohen’s Kappa tests were used in the study analysis. The results showed that the principles of “offensive coverage” (p = 0.01) and “concentration” (p = 0.04) were performed more frequently by the U-17 players than the U-13 players. The tactical principles “width and length” (p < 0.05) and “defensive unit” (p < 0.05) were executed more frequently by younger soccer players. It can be concluded that the frequency with which fundamental tactical principles are performed varies between the gaming categories, which implies that there is valuation of defensive security and a progressive increase in “offensive coverage” caused by increased confidence and security in offensive actions. PMID:28828091
Fundamental Tactical Principles of Soccer: A Comparison of Different Age Groups.
Borges, Paulo Henrique; Guilherme, José; Rechenchosky, Leandro; da Costa, Luciane Cristina Arantes; Rinadi, Wilson
2017-09-01
The fundamental tactical principles of the game of soccer represent a set of action rules that guide behaviours related to the management of game space. The aim of this study was to compare the performance of fundamental offensive and defensive tactical principles among youth soccer players from 12 to 17 years old. The sample consisted of 3689 tactical actions performed by 48 soccer players in three age categories: under 13 (U-13), under 15 (U-15), and under 17 (U-17). Tactical performance was measured using the System of Tactical Assessment in Soccer (FUT-SAT). The Kruskal Wallis, Mann-Whitney U, Friedman, Wilcoxon, and Cohen's Kappa tests were used in the study analysis. The results showed that the principles of "offensive coverage" (p = 0.01) and "concentration" (p = 0.04) were performed more frequently by the U-17 players than the U-13 players. The tactical principles "width and length" (p < 0.05) and "defensive unit" (p < 0.05) were executed more frequently by younger soccer players. It can be concluded that the frequency with which fundamental tactical principles are performed varies between the gaming categories, which implies that there is valuation of defensive security and a progressive increase in "offensive coverage" caused by increased confidence and security in offensive actions.
Physics Matters: An Introduction to Conceptual Physics
NASA Astrophysics Data System (ADS)
Trefil, James; Hazen, Robert M.
2003-12-01
From amusement park rides to critical environmental issues such as energy generation, physics affects almost every aspect of our world. In PHYSICS MATTERS, James Trefil and Robert Hazen examine the fundamental physics principles at work behind the many practical applications that fuel our society and individual lives. Their goal is to promote a deeper understanding of how the great ideas of physics connect to form a much larger understanding of the universe in which we live. Highlights: Helps readers build a general knowledge of key ideas in physics and their connection to technology and other areas of science. Promotes an appreciation of what science is, how scientific knowledge is developed, and how it differs from other intellectual activities. Examines modern technologies, including GPS, the Internet, and information technologies, as well as medical technologies, such as MRI, PET scans, CAT scans, and radioisotope tracers. Explores key issues facing the world today, such as global warming, nuclear waste, and government funding for research.
Physics Matters: An Introduction to Conceptual Physics, Activity Book
NASA Astrophysics Data System (ADS)
Trefil, James; Hazen, Robert M.
2004-02-01
From amusement park rides to critical environmental issues such as energy generation, physics affects almost every aspect of our world. In PHYSICS MATTERS, James Trefil and Robert Hazen examine the fundamental physics principles at work behind the many practical applications that fuel our society and individual lives. Their goal is to promote a deeper understanding of how the great ideas of physics connect to form a much larger understanding of the universe in which we live. Highlights: Helps readers build a general knowledge of key ideas in physics and their connection to technology and other areas of science. Promotes an appreciation of what science is, how scientific knowledge is developed, and how it differs from other intellectual activities. Examines modern technologies, including GPS, the Internet, and information technologies, as well as medical technologies, such as MRI, PET scans, CAT scans, and radioisotope tracers. Explores key issues facing the world today, such as global warming, nuclear waste, and government funding for research.
Fundamental Physics with Antihydrogen
NASA Astrophysics Data System (ADS)
Hangst, J. S.
Antihydrogen—the antimatter equivalent of the hydrogen atom—is of fundamental interest as a test bed for universal symmetries—such as CPT and the Weak Equivalence Principle for gravitation. Invariance under CPT requires that hydrogen and antihydrogen have the same spectrum. Antimatter is of course intriguing because of the observed baryon asymmetry in the universe—currently unexplained by the Standard Model. At the CERN Antiproton Decelerator (AD) [
Tomographic phase microscopy: principles and applications in bioimaging [Invited
Jin, Di; Zhou, Renjie; Yaqoob, Zahid; So, Peter T. C.
2017-01-01
Tomographic phase microscopy (TPM) is an emerging optical microscopic technique for bioimaging. TPM uses digital holographic measurements of complex scattered fields to reconstruct three-dimensional refractive index (RI) maps of cells with diffraction-limited resolution by solving inverse scattering problems. In this paper, we review the developments of TPM from the fundamental physics to its applications in bioimaging. We first provide a comprehensive description of the tomographic reconstruction physical models used in TPM. The RI map reconstruction algorithms and various regularization methods are discussed. Selected TPM applications for cellular imaging, particularly in hematology, are reviewed. Finally, we examine the limitations of current TPM systems, propose future solutions, and envision promising directions in biomedical research. PMID:29386746
The Ampere and Electrical Standards
Elmquist, Randolph E.; Cage, Marvin E.; Tang, Yi-hua; Jeffery, Anne-Marie; Kinard, Joseph R.; Dziuba, Ronald F.; Oldham, Nile M.; Williams, Edwin R.
2001-01-01
This paper describes some of the major contributions to metrology and physics made by the NIST Electricity Division, which has existed since 1901. It was one of the six original divisions of the National Bureau of Standards. The Electricity Division provides dc and low-frequency calibrations for industrial, scientific, and research organizations, and conducts research on topics related to electrical metrology and fundamental constants. The early work of the Electricity Division staff included the development of precision standards, such as Rosa and Thomas standard resistors and the ac-dc thermal converter. Research contributions helped define the early international system of measurement units and bring about the transition to absolute units based on fundamental principles and physical and dimensional measurements. NIST research has helped to develop and refine electrical standards using the quantum Hall effect and the Josephson effect, which are both based on quantum physics. Four projects covering a number of voltage and impedance measurements are described in detail. Several other areas of current research at NIST are described, including the use of the Internet for international compatibility in metrology, determination of the fine-structure and Planck constants, and construction of the electronic kilogram. PMID:27500018
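The two quantum electrical standards mentioned above rest on simple relations tying resistance and voltage to fundamental constants (standard textbook forms, quoted here for orientation): the quantized Hall resistance and the Josephson voltage are

$$ R_H = \frac{R_K}{i} = \frac{h}{i e^2}, \quad R_K \approx 25\,812.807\ \Omega, \qquad V = \frac{n f}{K_J}, \quad K_J = \frac{2e}{h} \approx 483\,597.8\ \mathrm{GHz/V}, $$

with i and n integers and f the microwave frequency applied to the Josephson junction array; because both constants involve only h and e, the derived standards are reproducible independent of any particular artifact.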
Tests of the Weak Equivalence Principle Below Fifty Microns
NASA Astrophysics Data System (ADS)
Leopardi, Holly; Hoyle, C. D.; Smith, Dave; Cardenas, Crystal; Harter, Andrew Conrad
2014-03-01
Due to the incompatibility of the Standard Model and General Relativity, tests of gravity remain at the forefront of experimental physics research. The Weak Equivalence Principle (WEP), which states that in a uniform gravitational field all objects fall with the same acceleration regardless of composition, total mass, or structure, is fundamentally the result of the equality of inertial mass and gravitational mass. The WEP has been effectively studied since the time of Galileo, and is a central feature of General Relativity; its violation at any length scale would bring into question fundamental aspects of the current model of gravitational physics. A variety of scenarios predict possible mechanisms that could result in a violation of the WEP. The Humboldt State University Gravitational Physics Laboratory is using a torsion pendulum with equal masses of different materials (a ``composition dipole'' configuration) to determine whether the WEP holds below the 50-micron distance scale. The experiment will measure the twist of a torsion pendulum as an attractor mass is oscillated nearby in a parallel-plate configuration, providing a time varying torque on the pendulum. The size and distance dependence of the torque variation will provide means to determine deviations from accepted models of gravity on untested distance scales.
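The standard figure of merit for such a composition-dipole test (a generic definition, not specific to the Humboldt State apparatus) is the Eötvös parameter comparing the accelerations of the two materials A and B toward the attractor:

$$ \eta_{AB} \;=\; 2\,\frac{|a_A - a_B|}{a_A + a_B}, $$

so the WEP holds exactly when eta_AB = 0, and any statistically significant nonzero value at the 50-micron scale would signal a violation.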
Fundamentals of Pharmacogenetics in Personalized, Precision Medicine.
Valdes, Roland; Yin, DeLu Tyler
2016-09-01
This article introduces fundamental principles of pharmacogenetics as applied to personalized and precision medicine. Pharmacogenetics establishes relationships between pharmacology and genetics by connecting phenotypes and genotypes in predicting the response of therapeutics in individual patients. We describe differences between precision and personalized medicine and relate principles of pharmacokinetics and pharmacodynamics to applications in laboratory medicine. We also review basic principles of pharmacogenetics, including its evolution, how it enables the practice of personalized therapeutics, and the role of the clinical laboratory. These fundamentals are a segue for understanding specific clinical applications of pharmacogenetics described in subsequent articles in this issue. Copyright © 2016 Elsevier Inc. All rights reserved.
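A hedged numerical sketch of how the pharmacokinetic principles mentioned above connect to pharmacogenetics: a one-compartment model with first-order elimination, comparing a nominal clearance against a hypothetical reduced-clearance (``poor metabolizer'') phenotype. All numbers are illustrative, not drawn from the article:

```python
import math

def one_compartment(dose_mg, vd_l, cl_l_per_h):
    """Return elimination constant, half-life, and a concentration function C(t)."""
    ke = cl_l_per_h / vd_l                 # first-order elimination rate constant (1/h)
    t_half = math.log(2) / ke              # elimination half-life (h)
    c0 = dose_mg / vd_l                    # initial concentration after an IV bolus (mg/L)
    return ke, t_half, (lambda t: c0 * math.exp(-ke * t))

dose, vd = 100.0, 50.0                     # mg, litres (illustrative)
for label, cl in (("normal metabolizer", 10.0), ("poor metabolizer (hypothetical)", 2.5)):
    ke, t_half, conc = one_compartment(dose, vd, cl)
    print(f"{label}: t1/2 = {t_half:.1f} h, C(12 h) = {conc(12.0):.2f} mg/L")
```

The point of the sketch is only that a genotype-linked reduction in clearance lengthens the half-life and raises drug exposure, which is the kind of phenotype-genotype relationship pharmacogenetics seeks to predict.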
Yegorysheva, I V
2013-01-01
The article considers the participation of the medical community in the formation of the fundamental principles of a unique system of public health, the Zemstvo medicine. This development was reflected in the activities of medical scientific societies and congresses and in the periodical medical press.
Designing quantum information processing via structural physical approximation.
Bae, Joonwoo
2017-10-01
In quantum information processing it may be possible to have efficient computation and secure communication beyond the limitations of classical systems. From a fundamental point of view, however, the evolution of quantum systems by the laws of quantum mechanics is more restrictive than for classical systems, being identified with a specific form of dynamics, that is, unitary transformations and, consequently, positive and completely positive maps on subsystems. This also characterizes classes of disallowed transformations on quantum systems, among which positive but not completely positive maps are of particular interest as they characterize entangled states, a general resource in quantum information processing. Structural physical approximation offers a systematic way of approximating those non-physical maps, the positive but not completely positive maps, with quantum channels. Since it was proposed as a method of detecting entangled states, it has stimulated fundamental problems on the classification of positive maps and the structure of Hermitian operators and quantum states, as well as on quantum measurement, such as quantum design in quantum information theory. It has led to efficient and feasible methods of directly detecting entangled states in practice, for which proof-of-principle experimental demonstrations have also been performed with photonic qubit states. Here, we present a comprehensive review of quantum information processing with structural physical approximations and the related progress. The review mainly focuses on the properties of structural physical approximations and their applications to practical quantum information tasks.
Designing quantum information processing via structural physical approximation
NASA Astrophysics Data System (ADS)
Bae, Joonwoo
2017-10-01
In quantum information processing it may be possible to have efficient computation and secure communication beyond the limitations of classical systems. From a fundamental point of view, however, the evolution of quantum systems by the laws of quantum mechanics is more restrictive than for classical systems, being identified with a specific form of dynamics, that is, unitary transformations and, consequently, positive and completely positive maps on subsystems. This also characterizes classes of disallowed transformations on quantum systems, among which positive but not completely positive maps are of particular interest as they characterize entangled states, a general resource in quantum information processing. Structural physical approximation offers a systematic way of approximating those non-physical maps, the positive but not completely positive maps, with quantum channels. Since it was proposed as a method of detecting entangled states, it has stimulated fundamental problems on the classification of positive maps and the structure of Hermitian operators and quantum states, as well as on quantum measurement, such as quantum design in quantum information theory. It has led to efficient and feasible methods of directly detecting entangled states in practice, for which proof-of-principle experimental demonstrations have also been performed with photonic qubit states. Here, we present a comprehensive review of quantum information processing with structural physical approximations and the related progress. The review mainly focuses on the properties of structural physical approximations and their applications to practical quantum information tasks.
How History Helped Einstein in Special Relativity
NASA Astrophysics Data System (ADS)
Martinez, Alberto
2013-04-01
I will discuss how the German intellectual movement known as ``critical history'' motivated several physicists in the late 1800s to radically analyze the fundamental principles of mechanics, leading eventually to Einstein's special theory of relativity. Eugen Karl Dühring, Johann Bernhard Stallo, Ludwig Lange, and Ernst Mach wrote critical histories of mechanics, some of which emphasized notions of relativity and observation, in opposition to old metaphysical concepts that seemed to infect the foundations of physics. This strand of critical history included the ``genetic method'' of analyzing how concepts develop over time, in our minds, by way of ordinary experiences, which by 1904 was young Albert Einstein's favorite approach for examining fundamental notions. Thus I will discuss how history contributed to Einstein's path to relativity, as well as comment more generally on Einstein's views on history.
Soto, Ana M; Longo, Giuseppe; Montévil, Maël; Sonnenschein, Carlos
2016-10-01
The principle of inertia is central to the modern scientific revolution. By postulating this principle Galileo at once identified a pertinent physical observable (momentum) and a conservation law (momentum conservation). He then could scientifically analyze what modifies inertial movement: gravitation and friction. Inertia, the default state in mechanics, represented a major theoretical commitment: there is no need to explain uniform rectilinear motion, rather, there is a need to explain departures from it. By analogy, we propose a biological default state of proliferation with variation and motility. From this theoretical commitment, what requires explanation is proliferative quiescence, lack of variation, lack of movement. That proliferation is the default state is axiomatic for biologists studying unicellular organisms. Moreover, it is implied in Darwin's "descent with modification". Although a "default state" is a theoretical construct and a limit case that does not need to be instantiated, conditions that closely resemble unrestrained cell proliferation are readily obtained experimentally. We will illustrate theoretical and experimental consequences of applying and of ignoring this principle. Copyright © 2016 Elsevier Ltd. All rights reserved.
SOTO, ANA M.; LONGO, GIUSEPPE; Montévil, Maël; SONNENSCHEIN, CARLOS
2017-01-01
The principle of inertia is central to the modern scientific revolution. By postulating this principle Galileo at once identified a pertinent physical observable (momentum) and a conservation law (momentum conservation). He then could scientifically analyze what modifies inertial movement: gravitation and friction. Inertia, the default state in mechanics, represented a major theoretical commitment: there is no need to explain uniform rectilinear motion, rather, there is a need to explain departures from it. By analogy, we propose a biological default state of proliferation with variation and motility. From this theoretical commitment, what requires explanation is proliferative quiescence, lack of variation, lack of movement. That proliferation is the default state is axiomatic for biologists studying unicellular organisms. Moreover, it is implied in Darwin’s “descent with modification”. Although a “default state” is a theoretical construct and a limit case that does not need to be instantiated, conditions that closely resemble unrestrained cell proliferation are readily obtained experimentally. We will illustrate theoretical and experimental consequences of applying and of ignoring this principle. PMID:27381480
The Microscope Mission and Pre-Flight Performance Verification
NASA Astrophysics Data System (ADS)
Hudson, D.; Touboul, P.; Rodrigues, M.
2006-04-01
Recent developments in fundamental physics have renewed interest in disproving the equivalence principle. The MICROSCOPE mission will be the first test to capitalize on the advantages of space to achieve an accuracy of 10^-15, more than two orders of magnitude better than current ground-based results. It is a joint CNES, ONERA, and Observatoire de la Côte d'Azur mission in the CNES Myriade microsatellite program. The principle of the test is to place two masses of different material on precisely the same orbit and measure any difference in the forces required to maintain the common orbit. The test is performed by a differential electrostatic accelerometer containing two concentric cylindrical test masses. This paper will present both an overview of the mission and a description of the accelerometer development and performance verification.
Principles of thermoacoustic energy harvesting
NASA Astrophysics Data System (ADS)
Avent, A. W.; Bowen, C. R.
2015-11-01
Thermoacoustics exploits a temperature gradient to produce powerful acoustic pressure waves. The technology has a key role to play in energy harvesting systems. A time-line in the development of thermoacoustics is presented, from its earliest recorded example in glass blowing through the development of the Sondhauss and Rijke tubes to Stirling engines and pulse-tube cryo-cooling. The review sets the current literature in context and identifies key publications and promising areas of research. The fundamental principles of thermoacoustic phenomena are explained; design challenges and factors influencing efficiency are explored. Thermoacoustic processes involve complex multi-physical coupling and transient, highly non-linear relationships which are computationally expensive to model; appropriate numerical modelling techniques and options for analyses are presented. Potential methods of harvesting the energy in the acoustic waves are also examined.
Fundamentals of Diesel Engines.
ERIC Educational Resources Information Center
Marine Corps Inst., Washington, DC.
This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the fundamentals of diesel engine mechanics. Addressed in the three individual units of the course are the following topics: basic principles of diesel mechanics; principles, mechanics, and…
The Social Side of School: Why Teachers Need Social Psychology
ERIC Educational Resources Information Center
Gehlbach, Hunter
2010-01-01
Teaching and learning are fundamentally social enterprises. In attempting to understand, explain, and predict social behavior, social psychologists have amassed scores of empirically grounded, fundamental principles. Yet, many such principles have yet to be applied to classrooms despite the social nature of these settings. This article illustrates…
The Elements and Principles of Design: A Baseline Study
ERIC Educational Resources Information Center
Adams, Erin
2013-01-01
Critical to the discipline, both professionally and academically, are the fundamentals of interior design. These fundamentals include the elements and principles of interior design: the commonly accepted tools and vocabulary used to create and communicate successful interior environments. Research indicates a lack of consistency in both the…
Code of Federal Regulations, 2011 CFR
2011-01-01
...; (vi) the Secretary of Housing and Urban Development; (vii) the Secretary of Education; (viii) the... social service programs or that support (including through prime awards or sub-awards) social service... following fundamental principles: (a) Federal financial assistance for social service programs should be...
NASA Astrophysics Data System (ADS)
Knipp, D. J.
2013-12-01
An undergraduate course in solar and geospace (helio) physics should link fundamental principles from introductory physics and astronomy courses to concepts that appear unique, or are uniquely named, in the heliophysics course. This paper discusses short topics and activities, each fitting in an approximately 15-minute class segment, that introduce students to aspects of solar, solar-wind, and geospace storms that are a step beyond, or a special application of, an introductory physics concept. Some of these activities could be assigned as pre- or post-class activities as well. Many of the activities are aligned with images or diagrams in the textbook "Understanding Space Weather and the Physics Behind It," but could be easily adapted to other texts. We also address activities that link to information from space weather forecasting and/or modeling websites.
The unification of physics: the quest for a theory of everything.
Paulson, Steve; Gleiser, Marcelo; Freese, Katherine; Tegmark, Max
2015-12-01
The holy grail of physics has been to merge each of its fundamental branches into a unified "theory of everything" that would explain the functioning and existence of the universe. The last step toward this goal is to reconcile general relativity with the principles of quantum mechanics, a quest that has thus far eluded physicists. Will physics ever be able to develop an all-encompassing theory, or should we simply acknowledge that science will always have inherent limitations as to what can be known? Should new theories be validated solely on the basis of calculations that can never be empirically tested? Can we ever truly grasp the implications of modern physics when the basic laws of nature do not always operate according to our standard paradigms? These and other questions are discussed in this paper. © 2015 New York Academy of Sciences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stapp, Henry P.
2011-05-10
The principle of sufficient reason asserts that anything that happens does so for a reason: no definite state of affairs can come into being unless there is a sufficient reason why that particular thing should happen. This principle is usually attributed to Leibniz, although the first recorded Western philosopher to use it was Anaximander of Miletus. The demand that nature be rational, in the sense that it be compatible with the principle of sufficient reason, conflicts with a basic feature of contemporary orthodox physical theory, namely the notion that nature's response to the probing action of an observer is determined by pure chance, and hence on the basis of absolutely no reason at all. This appeal to pure chance can be deemed to have no rational fundamental place in reason-based Western science. It is argued here, on the basis of the other basic principles of quantum physics, that in a world that conforms to the principle of sufficient reason, the usual quantum statistical rules will naturally emerge at the pragmatic level, in cases where the reason behind nature's choice of response is unknown, but that the usual statistics can become biased in an empirically manifest way when the reason for the choice is empirically identifiable. It is shown here that if the statistical laws of quantum mechanics were to be biased in this way then the basically forward-in-time unfolding of empirical reality described by orthodox quantum mechanics would generate the appearances of backward-time-effects of the kind that have been reported in the scientific literature.
Quantum Interactive Dualism: An Alternative to Materialism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stapp, Henry P
2005-06-01
Materialism rests implicitly upon the general conception of nature promoted by Galileo and Newton during the seventeenth century. It features the causal closure of the physical: the course of physically described events for all time is fixed by laws that refer exclusively to the physically describable features of nature, and to initial conditions on these features. No reference to the subjective thoughts or feelings of human beings enters. That simple conception of nature was found during the first quarter of the twentieth century to be apparently incompatible with the empirical facts. The founders of quantum theory created a new fundamental physical theory, quantum theory, which introduced crucially into the causal structure certain conscious choices made by human agents about how they will act. These conscious human choices are ''free'' in the sense that they are not fixed by the known laws. But they can influence the course of physically described events. Thus the principle of the causal closure of the physical fails. Applications in psycho-neuro-dynamics are described.
Fundamentals of microfluidic cell culture in controlled microenvironments†
Young, Edmond W. K.; Beebe, David J.
2010-01-01
Microfluidics has the potential to revolutionize the way we approach cell biology research. The dimensions of microfluidic channels are well suited to the physical scale of biological cells, and the many advantages of microfluidics make it an attractive platform for new techniques in biology. One of the key benefits of microfluidics for basic biology is the ability to control parameters of the cell microenvironment at relevant length and time scales. Considerable progress has been made in the design and use of novel microfluidic devices for culturing cells and for subsequent treatment and analysis. With the recent pace of scientific discovery, it is becoming increasingly important to evaluate existing tools and techniques, and to synthesize fundamental concepts that would further improve the efficiency of biological research at the microscale. This tutorial review integrates fundamental principles from cell biology and local microenvironments with cell culture techniques and concepts in microfluidics. Culturing cells in microscale environments requires knowledge of multiple disciplines including physics, biochemistry, and engineering. We discuss basic concepts related to the physical and biochemical microenvironments of the cell, physicochemical properties of that microenvironment, cell culture techniques, and practical knowledge of microfluidic device design and operation. We also discuss the most recent advances in microfluidic cell culture and their implications on the future of the field. The goal is to guide new and interested researchers to the important areas and challenges facing the scientific community as we strive toward full integration of microfluidics with biology. PMID:20179823
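A small sketch of the scale argument behind microfluidic cell culture (why flow in such channels is laminar and why solute transport behaves differently than at the macroscale); the channel dimensions and flow speed below are typical illustrative values, not taken from the review:

```python
# Reynolds and Peclet numbers for a rectangular microchannel (illustrative values)
rho = 1000.0          # water density, kg/m^3
mu = 1.0e-3           # water viscosity, Pa*s
w, h = 200e-6, 50e-6  # channel width and height, m
v = 1.0e-3            # mean flow speed, m/s (~1 mm/s)
D = 1.0e-10           # diffusivity of a small solute, m^2/s

Dh = 2 * w * h / (w + h)          # hydraulic diameter of a rectangular duct
Re = rho * v * Dh / mu            # Reynolds number: inertia vs. viscosity
Pe = v * Dh / D                   # Peclet number: advection vs. diffusion

print(f"Hydraulic diameter: {Dh*1e6:.0f} um")
print(f"Re = {Re:.3f} (far below ~2000, so flow is laminar)")
print(f"Pe = {Pe:.0f} (advection dominates solute transport along the channel)")
```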
Physics of mind: Experimental confirmations of theoretical predictions.
Schoeller, Félix; Perlovsky, Leonid; Arseniev, Dmitry
2018-02-02
What is common among Newtonian mechanics, statistical physics, thermodynamics, quantum physics, the theory of relativity, astrophysics and the theory of superstrings? All these areas of physics have in common a methodology, which is discussed in the first few lines of the review. Is a physics of the mind possible? Is it possible to describe how a mind adapts in real time to changes in the physical world through a theory based on a few basic laws? From perception and elementary cognition to emotions and abstract ideas allowing high-level cognition and executive functioning, at nearly all levels of study the mind shows variability and uncertainties. Is it possible to turn psychology and neuroscience into so-called "hard" sciences? This review discusses several established first principles for the description of mind and their mathematical formulations. A mathematical model of mind is derived from these principles. This model includes mechanisms of instincts, emotions, behavior, cognition, concepts, language, intuitions, and imagination. We clarify fundamental notions such as the opposition between the conscious and the unconscious, the knowledge instinct and aesthetic emotions, as well as humans' universal abilities for symbols and meaning. In particular, the review discusses at length the evolutionary and cognitive functions of aesthetic emotions and musical emotions. Several theoretical predictions are derived from the model, some of which have been experimentally confirmed. These empirical results are summarized and we introduce new theoretical developments. Several unsolved theoretical problems are proposed, as well as new experimental challenges for future research. Copyright © 2017. Published by Elsevier B.V.
Several foundational and information theoretic implications of Bell’s theorem
NASA Astrophysics Data System (ADS)
Kar, Guruprasad; Banik, Manik
2016-08-01
In 1935, Albert Einstein and two colleagues, Boris Podolsky and Nathan Rosen (EPR), developed a thought experiment to demonstrate what they felt was a lack of completeness in quantum mechanics (QM). EPR also postulated the existence of a more fundamental theory in which the physical reality of any system would be completely described by the variables/states of that fundamental theory. This variable is commonly called a hidden variable, and the theory is called a hidden variable theory (HVT). In 1964, John Bell proposed an empirically verifiable criterion to test for the existence of these HVTs. He derived an inequality which must be satisfied by any theory that fulfills the conditions of locality and reality. He also showed that QM, as it violates this inequality, is incompatible with any local-realistic theory. Later it was shown that Bell's inequality (BI) can be derived from different sets of assumptions and that it also finds applications in useful information theoretic protocols. In this review, we discuss various foundational as well as information theoretic implications of BI. We also discuss certain restricted features of quantum nonlocality and elaborate on the role of the uncertainty principle and the complementarity principle in explaining these features.
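For concreteness, the best-known form of Bell's criterion, the CHSH inequality (a standard statement, included here as background rather than as the specific derivation reviewed by the authors): for measurement settings a, a' and b, b' on the two sides, any local-realistic HVT obeys

$$ S = \big| E(a,b) + E(a,b') + E(a',b) - E(a',b') \big| \;\le\; 2, $$

whereas QM on a maximally entangled two-qubit state reaches S = 2*sqrt(2) (the Tsirelson bound), which is still short of the algebraic maximum of 4 allowed by no-signalling alone; this gap is part of the restricted nonlocality discussed in the review.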
Inclusive Education: An EFA Strategy For All Children 31195
ERIC Educational Resources Information Center
Peters, Susan J.
2004-01-01
The fundamental principle of Education for All (EFA) is that all children should have the opportunity to learn. The fundamental principle of Inclusive Education (IE) is that all children should have the opportunity to learn together. Diversity is a characteristic that all children and youth have in common: both within each individual child and…
NASA Astrophysics Data System (ADS)
Surendralal, Sudarsan; Todorova, Mira; Finnis, Michael W.; Neugebauer, Jörg
2018-06-01
Combining concepts of semiconductor physics and corrosion science, we develop a novel approach that allows us to perform ab initio calculations under controlled potentiostat conditions for electrochemical systems. The proposed approach can be straightforwardly applied in standard density functional theory codes. To demonstrate the performance and the opportunities opened by this approach, we study the chemical reactions that take place during initial corrosion at the water-Mg interface under anodic polarization. Based on this insight, we derive an atomistic model that explains the origin of the anodic hydrogen evolution.
NASA Technical Reports Server (NTRS)
Fu, L. S. W.
1982-01-01
Developments in fracture mechanics and elastic wave theory enhance the understanding of many physical phenomena in a mathematical context. The available literature on material and fracture characterization by NDT, and on the related mathematical methods in mechanics that provide the fundamental underlying principles for its interpretation and evaluation, is reviewed. Information on the energy release mechanism of defects and the interaction of microstructures within the material is basic to the formulation of the mechanics problems that supply guidance for nondestructive evaluation (NDE).
Mass Defect from Nuclear Physics to Mass Spectral Analysis.
Pourshahian, Soheil
2017-09-01
Mass defect is associated with the binding energy of the nucleus. It is a fundamental property of the nucleus and the principle behind nuclear energy. Mass defect has also entered mass spectrometry terminology with the availability of high-resolution mass spectrometry and has found application in mass spectral analysis. In this application, isobaric masses are differentiated and identified by their mass defect. What is the relationship between the nuclear mass defect and the mass defect used in mass spectral analysis, and are they the same?
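As a hedged illustration of the mass-spectrometric usage (not taken from the article): in that context the mass defect of a species is usually the difference between its exact monoisotopic mass and its nominal integer mass, whereas the nuclear mass defect is the mass deficit of a bound nucleus relative to its free nucleons, Δm = Z m_p + N m_n − M, with binding energy E_B = Δm c². A minimal sketch for a small molecule (glycine is an assumed example):

```python
# Monoisotopic atomic masses in u (values from standard isotope tables, rounded)
MASS = {"C": 12.000000, "H": 1.007825, "N": 14.003074, "O": 15.994915}

def exact_mass(formula):
    """Exact monoisotopic mass of a composition given as {element: count}."""
    return sum(MASS[el] * n for el, n in formula.items())

glycine = {"C": 2, "H": 5, "N": 1, "O": 2}   # C2H5NO2, nominal mass 75
m = exact_mass(glycine)
print(f"exact mass = {m:.5f} u, mass-spectrometric mass defect = {m - 75:+.5f} u")
```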
Entropy inequality and hydrodynamic limits for the Boltzmann equation.
Saint-Raymond, Laure
2013-12-28
Boltzmann made a fundamental contribution to the understanding of the notion of entropy by giving a microscopic formulation of the second principle of thermodynamics. His ingenious idea, motivated by the works of his contemporaries on the atomic nature of matter, consists of describing gases as huge systems of identical and indistinguishable elementary particles. The state of a gas can therefore be described in a statistical way. The evolution, which introduces couplings, loses part of the information, which is expressed by the decay of the so-called mathematical entropy (the opposite of physical entropy!).
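For reference (a textbook gloss, not part of the abstract), Boltzmann's microscopic reading of the second principle is usually summarized by
\[ S = k_B \ln W, \qquad H(t) = \iint f \ln f \,\mathrm{d}x\,\mathrm{d}v, \qquad \frac{\mathrm{d}H}{\mathrm{d}t} \le 0, \]
so that the mathematical entropy H decays along solutions of the Boltzmann equation while the physical entropy, proportional to −H, increases.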
Jacobi's Principle and Hertz' Definition of Time
NASA Astrophysics Data System (ADS)
Treder, Hans-J.; Bleyer, Ulrich; Liebscher, Dierck-E.
This article recalls the interest that D. D. Ivanenko always took in the fundamental questions of Mach's ideas for founding the physics of inertia. Even today, we have no generally accepted idea of how to quantify the general demand for a theory in which the existence, and not only the amount, of inertia of a body is determined by the configuration of the surrounding universe. The current discussion centers around the problem of introducing time into theoretical constructions without time, and this paper is intended as a contribution to this discussion…
The ultra high resolution XUV spectroheliograph: An attached payload for the Space Station Freedom
NASA Technical Reports Server (NTRS)
Walker, Arthur B. C., Jr.; Hoover, Richard B.; Barbee, Troy W., Jr.; Tandberg-Hanssen, Einar; Timothy, J. Gethyn; Lindblom, Joakim F.
1990-01-01
The principal goal of the ultra high resolution XUV spectroheliograph (UHRXS) is to improve the ability to identify and understand the fundamental physical processes that shape the structure and dynamics of the solar chromosphere and corona. The ability of the UHRXS imaging telescope and spectrographs to resolve fine scale structures over a broad wavelength (and hence temperature) range is critical to this mission. The scientific objectives and instrumental capabilities of the UHRXS investigation are reviewed before proceeding to a discussion of the expected performance of the UHRXS observatory.
It's what you do! Reflections on the VERB campaign.
Wong, Faye L; Greenwell, Michael; Gates, Suzanne; Berkowitz, Judy M
2008-06-01
This article shares the first-hand experiences of the CDC's VERB team in planning, executing, and evaluating a campaign that used social marketing principles, which involved paid media advertising, promotions, and national and community partnerships to increase physical activity among children aged 9-13 years (tweens). VERB staff gained valuable experience in applying commercial marketing techniques to a public health issue. This article describes how marketing, partnership, and evaluation activities were implemented to reach a tween audience. In doing so, fundamental differences in marketing between public health and the private sector were revealed.
Secure Multiparty Quantum Computation for Summation and Multiplication.
Shi, Run-hua; Mu, Yi; Zhong, Hong; Cui, Jie; Zhang, Shun
2016-01-21
As a fundamental primitive, Secure Multiparty Summation and Multiplication can be used to build complex secure protocols for other multiparty computations, especially numerical computations. However, there is still a lack of systematic and efficient quantum methods to compute Secure Multiparty Summation and Multiplication. In this paper, we present a novel and efficient quantum approach to securely compute the summation and multiplication of multiparty private inputs, respectively. Compared to classical solutions, our proposed approach can ensure unconditional security and perfect privacy protection based on the physical principles of quantum mechanics.
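For orientation, the following is a minimal classical analogue of the summation primitive (additive secret sharing over a public modulus); it only illustrates what secure multiparty summation computes and is not the quantum protocol of the paper:

```python
import secrets

MOD = 2**31 - 1   # public modulus, assumed larger than any possible sum

def share(value, n_parties):
    """Split a private value into n_parties random additive shares modulo MOD."""
    shares = [secrets.randbelow(MOD) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

inputs = [12, 7, 30]                      # each party's private input (toy values)
n = len(inputs)
all_shares = [share(x, n) for x in inputs]
# Party j publishes only the sum of the shares it received, revealing nothing
# about any individual input; the published sums reconstruct the joint total.
partial_sums = [sum(all_shares[i][j] for i in range(n)) % MOD for j in range(n)]
print("joint sum =", sum(partial_sums) % MOD)   # 49
```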
The ambiguity of simplicity in quantum and classical simulation
NASA Astrophysics Data System (ADS)
Aghamohammadi, Cina; Mahoney, John R.; Crutchfield, James P.
2017-04-01
A system's perceived simplicity depends on whether it is represented classically or quantally. This is not so surprising, as classical and quantum physics are descriptive frameworks built on different assumptions that capture, emphasize, and express different properties and mechanisms. What is surprising is that, as we demonstrate, simplicity is ambiguous: the relative simplicity between two systems can change sign when moving between classical and quantum descriptions. Here, we associate simplicity with small model-memory. We see that the notions of absolute physical simplicity at best form a partial, not a total, order. This suggests that appeals to principles of physical simplicity, via Ockham's Razor or to the "elegance" of competing theories, may be fundamentally subjective. Recent rapid progress in quantum computation and quantum simulation suggests that the ambiguity of simplicity will strongly impact statistical inference and, in particular, model selection.
Conceptual Change from the Framework Theory Side of the Fence
NASA Astrophysics Data System (ADS)
Vosniadou, Stella; Skopeliti, Irini
2014-07-01
We describe the main principles of the framework theory approach to conceptual change and briefly report on the results of a text comprehension study that investigated some of the hypotheses that derive from it. We claim that children construct a naive physics which is based on observation in the context of lay culture and which forms a relatively coherent conceptual system—i.e., a framework theory—that can be used as a basis for explanation and prediction of everyday phenomena. Learning science requires fundamental ontological, epistemological, and representational changes in naive physics. These conceptual changes take a long time to be achieved, giving rise to fragmentation and synthetic conceptions. We also argue that both fragmentation and synthetic conceptions can be explained as resulting from learners' attempts to assimilate scientific information into their existing but incompatible naive physics.
Teaching ``The Physics of Energy'' at MIT
NASA Astrophysics Data System (ADS)
Jaffe, Robert
2009-05-01
New physics courses on energy are popping up at colleges and universities across the country. Many require little or no previous physics background, aiming to introduce a broad audience to this complex and critical problem, often augmenting the scientific message with economic and policy discussions. Others are advanced courses, focussing on highly specialized subjects like solar voltaics, nuclear physics, or thermal fluids, for example. About two years ago Washington Taylor and I undertook to develop a course on the ``Physics of Energy'' open to all MIT students who had taken MIT's common core of university level calculus, physics, and chemistry. By avoiding higher level prerequisites, we aimed to attract and make the subject relevant to students in the life sciences, economics, etc. --- as well as physical scientists and engineers --- who want to approach energy issues in a sophisticated and analytical fashion, exploiting their background in calculus, mechanics, and E & M, but without having to take advanced courses in thermodynamics, quantum mechanics, or nuclear physics beforehand. Our object was to interweave teaching the fundamental physics principles at the foundations of energy science with the applications of those principles to energy systems. We envisioned a course that would present the basics of statistical, quantum, and fluid mechanics at a fairly sophisticated level and apply those concepts to the study of energy sources, conversion, transport, losses, storage, conservation, and end use. In the end we developed almost all of the material for the course from scratch. The course debuted this past fall. I will describe what we learned and what general lessons our experience might have for others who contemplate teaching energy physics broadly to a technically sophisticated audience.
Recent advances in non-LTE stellar atmosphere models
NASA Astrophysics Data System (ADS)
Sander, Andreas A. C.
2017-11-01
In the last decades, stellar atmosphere models have become a key tool in understanding massive stars. Applied for spectroscopic analysis, these models provide quantitative information on stellar wind properties as well as fundamental stellar parameters. The intricate non-LTE conditions in stellar winds dictate the development of adequately sophisticated model atmosphere codes. The increase in both the computational power and our understanding of physical processes in stellar atmospheres led to an increasing complexity in the models. As a result, codes emerged that can tackle a wide range of stellar and wind parameters. After a brief address of the fundamentals of stellar atmosphere modeling, the current stage of clumped and line-blanketed model atmospheres will be discussed. Finally, the path for the next generation of stellar atmosphere models will be outlined. Apart from discussing multi-dimensional approaches, I will emphasize the coupling of hydrodynamics with a sophisticated treatment of the radiative transfer. This next generation of models will be able to predict wind parameters from first principles, which could open new doors for our understanding of the various facets of massive star physics, evolution, and death.
Cutting Off the Head of the Snake: Applying and Assessing Leadership Attack in Military Conflict
2013-04-26
war that will help … Baron Antoine Henri de Jomini makes this argument when he describes what he calls the Fundamental Principle of War: “On the... of Napoleonic Warfare, Clausewitz had but a vague understanding of it. Nevertheless, because of Napoleon’s offensive principle, he foisted on to... of the Enemy is the Aim,” Clausewitz opens the chapter by weaving the idea of Schwerpunkt into his two fundamental principles of war: The first is
Constructing the principles: Method and metaphysics in the progress of theoretical physics
NASA Astrophysics Data System (ADS)
Glass, Lawrence C.
This thesis presents a new framework for the philosophy of physics focused on methodological differences found in the practice of modern theoretical physics. The starting point for this investigation is the longstanding debate over scientific realism. Some philosophers have argued that it is the aim of science to produce an accurate description of the world including explanations for observable phenomena. These scientific realists hold that our best confirmed theories are approximately true and that the entities they propose actually populate the world, whether or not they have been observed. Others have argued that science achieves only frameworks for the prediction and manipulation of observable phenomena. These anti-realists argue that truth is a misleading concept when applied to empirical knowledge. Instead, focus should be on the empirical adequacy of scientific theories. This thesis argues that the fundamental distinction at issue, a division between true scientific theories and ones which are empirically adequate, is best explored in terms of methodological differences. In analogy with the realism debate, there are at least two methodological strategies. Rather than focusing on scientific theories as wholes, this thesis takes as its units of analysis physical principles, which are systematic empirical generalizations. The first possible strategy, the conservative, takes the assumption that the empirical adequacy of a theory in one domain serves as good evidence for such adequacy in other domains. This then motivates the application of the principle to new domains. The second strategy, the innovative, assumes that empirical adequacy in one domain does not justify the expectation of adequacy in other domains. New principles are offered as explanations in the new domain. The final part of the thesis is the application of this framework to two examples. In the first, Lorentz's use of the aether is reconstructed as an application of the conservative strategy with respect to the principles of Galilean relativity. In the second, a comparison is drawn between an application of the conservative strategy and TeVeS as an application of the innovative one.
How low can you go? Physical production mechanism of elephant infrasonic vocalizations.
Herbst, Christian T; Stoeger, Angela S; Frey, Roland; Lohscheller, Jörg; Titze, Ingo R; Gumpenberger, Michaela; Fitch, W Tecumseh
2012-08-03
Elephants can communicate using sounds below the range of human hearing ("infrasounds" below 20 hertz). It is commonly speculated that these vocalizations are produced in the larynx, either by neurally controlled muscle twitching (as in cat purring) or by flow-induced self-sustained vibrations of the vocal folds (as in human speech and song). We used direct high-speed video observations of an excised elephant larynx to demonstrate flow-induced self-sustained vocal fold vibration in the absence of any neural signals, thus excluding the need for any "purring" mechanism. The observed physical principles of voice production apply to a wide variety of mammals, extending across a remarkably large range of fundamental frequencies and body sizes, spanning more than five orders of magnitude.
Sensors, Volume 4, Thermal Sensors
NASA Astrophysics Data System (ADS)
Scholz, Jorg; Ricolfi, Teresio
1996-12-01
'Sensors' is the first self-contained series to deal with the whole area of sensors. It describes general aspects, technical and physical fundamentals, construction, function, applications and developments of the various types of sensors. This volume describes the construction and application aspects of thermal sensors while presenting a rigorous treatment of the underlying physical principles. It provides a unique overview of the various categories of sensors as well as of specific groups, e.g. temperature sensors (resistance thermometers, thermocouples, and radiation thermometers), noise and acoustic thermometers, heat-flow and mass-flow sensors. Specific facets of applications are presented by specialists from different fields including process control, automotive technology and cryogenics. This volume is an indispensable reference work and textbook for both specialists and newcomers, researchers and developers.
Speed of transverse waves in a string revisited
NASA Astrophysics Data System (ADS)
Rizcallah, Joseph A.
2017-11-01
In many introductory-level physics textbooks, the derivation of the formula for the speed of transverse waves in a string is either omitted altogether or presented under physically overly idealized assumptions about the shape of the considered wave pulse and the related velocity and acceleration distributions. In this paper, we derive the named formula by applying Newton’s second law or the work-energy theorem to a finite element of the string, making no assumptions about the shape of the wave. We argue that the suggested method can help the student gain a deeper insight into the nature of waves and the related process of energy transport, as well as provide a new experience with the fundamental principles of mechanics as applied to extended and deformable bodies.
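For reference (a sketch of the standard result, not the paper's full argument), applying Newton's second law to an element of a string with linear mass density μ under tension T gives
\[ \mu\,\frac{\partial^2 y}{\partial t^2} = T\,\frac{\partial^2 y}{\partial x^2}, \qquad v = \sqrt{\frac{T}{\mu}}, \]
valid for small transverse displacements and independent of the pulse shape, which is precisely the point the paper emphasizes.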
WE-G-12A-01: High Intensity Focused Ultrasound Surgery and Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farahani, K; O'Neill, B
More and more emphasis is being placed on alternatives to invasive surgery and to the use of ionizing radiation to treat various diseases, including cancer. Novel screening, diagnosis, treatment, and monitoring of response to treatment are also active areas of research and new clinical technologies. Ultrasound (US) has gained traction in all of the aforementioned areas, especially with recent advances in the use of ultrasound to noninvasively treat various diseases and organ systems. This session covers MR-guided focused ultrasound and its state-of-the-art clinical applications; the second speaker surveys more cutting-edge technologies, e.g. focused ultrasound (FUS) mediated drug delivery, principles of cavitation, and US-guided FUS. Learning objectives: (1) the fundamental physics and physical limitations of US interaction with tissue and nanoparticles; (2) the alteration of tissue transport using focused ultrasound; (3) US control of nanoparticle drug carriers for targeted release; (4) the basic principles of MRI-guided focused ultrasound (MRgFUS) surgery and therapy; (5) the current state-of-the-art clinical applications of MRgFUS; (6) requirements for quality assurance and treatment planning.
NASA Astrophysics Data System (ADS)
Martin, B. R.; Shaw, G.
1998-01-01
Particle Physics, Second Edition is a concise and lucid account of the fundamental constituents of matter. The standard model of particle physics is developed carefully and systematically, without heavy mathematical formalism, to make this stimulating subject accessible to undergraduate students. Throughout, the emphasis is on the interpretation of experimental data in terms of the basic properties of quarks and leptons, and extensive use is made of symmetry principles and Feynman diagrams, which are introduced early in the book. The Second Edition brings the book fully up to date, including the discovery of the top quark and the search for the Higgs boson. A final short chapter is devoted to the continuing search for new physics beyond the standard model. Particle Physics, Second Edition features: * A carefully structured and written text to help students understand this exciting and demanding subject. * Many worked examples and problems to aid student learning. Hints for solving the problems are given in an Appendix. * Optional "starred" sections and appendices, containing more specialised and advanced material for the more ambitious reader.
Understanding Molecular Conduction: Old Wine in a New Bottle?
NASA Astrophysics Data System (ADS)
Ghosh, Avik
2007-03-01
Molecules provide an opportunity to test our understanding of fundamental non-equilibrium transport processes, as well as explore new device possibilities. We have developed a unified approach to nanoscale conduction, coupling bandstructure and electrostatics of the channel and contacts with a quantum kinetic theory of current flow. This allows us to describe molecular conduction at various levels of detail, -- from quantum corrected compact models, to semi-empirical models for quick physical insights, and `first-principles' calculations of current-voltage (I-V) characteristics with no adjustable parameters. Using this suite of tools, we can quantitatively explain various experimental I-Vs, including complex reconstructed silicon substrates. We find that conduction in most molecules is contact dominated, and limited by fundamental electrostatic and thermodynamic restrictions quite analogous to those faced by the silicon industry, barring a few interesting exceptions. The distinction between molecular and silicon electronics must therefore be probed at a more fundamental level. Ultra-short molecules are unique in that they possess large Coulomb energies as well as anomalous vibronic couplings with current flow -- in other words, strong non-equilibrium electron-electron and electron-phonon correlations. These effects yield prominent experimental signatures, but require a completely different modeling approach -- in fact, popular approaches to include correlation typically do not work for non-equilibrium. Molecules exhibit rich physics, including the ability to function both as weakly interacting current conduits (quantum wires) as well as strongly correlated charge storage centers (quantum dots). Theoretical treatment of the intermediate coupling regime is particularly challenging, with a large `fine structure constant' for transport that negates orthodox theories of Coulomb Blockade and phonon-assisted tunneling. It is in this regime that the scientific and technological merits of molecular conductors may need to be explored. For instance, the tunable quantum coupling of current flow in silicon transistors with engineered molecular scatterers could lead to devices that operate on completely novel principles.
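As a rough sketch of the kind of quantum-kinetic (Green's-function/Landauer) calculation referred to above, and not the author's actual multi-scale toolchain, here is a toy coherent transmission calculation for a tight-binding chain coupled to wide-band contacts:

```python
import numpy as np

# Toy tight-binding chain: N sites, on-site energy eps, nearest-neighbour hopping t
N, eps, t = 6, 0.0, -1.0
H = np.diag([eps] * N) + np.diag([t] * (N - 1), 1) + np.diag([t] * (N - 1), -1)

# Wide-band-limit contacts: constant broadening gamma on the two end sites
gamma = 0.5
Sigma_L = np.zeros((N, N), complex); Sigma_L[0, 0] = -0.5j * gamma
Sigma_R = np.zeros((N, N), complex); Sigma_R[-1, -1] = -0.5j * gamma
Gamma_L = 1j * (Sigma_L - Sigma_L.conj().T)
Gamma_R = 1j * (Sigma_R - Sigma_R.conj().T)

def transmission(E):
    """Coherent transmission T(E) = Tr[Gamma_L G Gamma_R G^dagger]."""
    G = np.linalg.inv(E * np.eye(N) - H - Sigma_L - Sigma_R)   # retarded Green's function
    return float(np.trace(Gamma_L @ G @ Gamma_R @ G.conj().T).real)

for E in (-1.0, 0.0, 1.0):
    print(f"T({E:+.1f}) = {transmission(E):.3f}")
```

Feeding T(E) into the Landauer formula I = (2e/h) ∫ T(E) [f_L(E) − f_R(E)] dE then yields the coherent part of an I-V characteristic; the strong non-equilibrium correlation effects discussed above are exactly what such a one-electron picture leaves out.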
Fundamental Principles of Coherent-Feedback Quantum Control
2014-12-08
in metrology (acceleration sensing, vibrometry, gravity wave detection) and in quantum information processing (continuous-variable quantum ...) ... foundations and potential applications of coherent-feedback quantum control. We have focused on potential applications in quantum-enhanced metrology and ... [Final report AFRL-OSR-VA-TR-2015-0009, Hideo Mabuchi, Leland Stanford Junior University, CA]
Elementary Concepts and Fundamental Laws of the Theory of Heat
NASA Astrophysics Data System (ADS)
de Oliveira, Mário J.
2018-06-01
The elementary concepts and fundamental laws concerning the science of heat are examined from the point of view of its development with special attention to its theoretical structure. The development is divided into four periods, each one characterized by the concept that was attributed to heat. The transition from one to the next period was marked by the emergence of new concepts and new laws, and by singular events. We point out that thermodynamics, as it emerged, is founded on the elementary concepts of temperature and adiabatic wall, and on the fundamental laws: Mayer-Joule principle, or law of conservation of energy; Carnot principle, which leads to the definition of entropy; and the Clausius principle, or law of increase in entropy.
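In symbols (one standard way to write the three laws named above, added for reference), these read
\[ \mathrm{d}U = \delta Q - \delta W, \qquad \mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad \Delta S \ge 0 \ \text{for an isolated system}, \]
corresponding respectively to the Mayer-Joule principle, the Carnot principle (through which entropy is defined), and the Clausius principle.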
The Peculiar Status of the Second Law of Thermodynamics and the Quest for its Violation
NASA Astrophysics Data System (ADS)
D'Abramo, Germano
2012-11-01
Even though the second law of thermodynamics holds the supreme position among the laws of nature, as stated by many distinguished scientists, notably Eddington and Einstein, its position appears to be also quite peculiar. Given the atomic nature of matter, whose behaviour is well described by statistical physics, the second law could not hold unconditionally, but only statistically. It is not an absolute law. As a result of this, in the present paper we try to argue that we have not yet any truly cogent argument (known fundamental physical laws) to exclude its possible macroscopic violation. Even Landauer's information-theoretic principle seems to fall short of the initial expectations of being the fundamental `physical' reason of all Maxwell's demons failure. Here we propose a modified Szilard engine which operates without any steps in the process resembling the creation or destruction of information. We argue that the information-based exorcisms must be wrong, or at the very least superfluous, and that the real physical reason why such engines cannot work lies in the ubiquity of thermal fluctuations (and friction). We see in the above peculiar features the main motivation and rationale for pursuing exploratory research to challenge the second law, which is still ongoing and probably richer than ever. A quite thorough (and critical) description of some of these challenges is also given.
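For context (standard statements, not part of the abstract), the quantitative content of the Szilard engine and of Landauer's principle referred to above is
\[ W_{\mathrm{extracted}} \le k_B T \ln 2 \ \text{per cycle}, \qquad W_{\mathrm{erasure}} \ge k_B T \ln 2 \ \text{per bit}, \]
which is exactly the bookkeeping used in the information-theoretic exorcisms of Maxwell's demon that the author argues are superfluous.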
TOPICAL REVIEW: First principles studies of multiferroic materials
NASA Astrophysics Data System (ADS)
Picozzi, Silvia; Ederer, Claude
2009-07-01
Multiferroics, materials where spontaneous long-range magnetic and dipolar orders coexist, represent an attractive class of compounds, which combine rich and fascinating fundamental physics with a technologically appealing potential for applications in the general area of spintronics. Ab initio calculations have significantly contributed to recent progress in this area, by elucidating different mechanisms for multiferroicity and providing essential information on various compounds where these effects are manifestly at play. In particular, here we present examples of density-functional theory investigations for two main classes of materials: (a) multiferroics where ferroelectricity is driven by hybridization or purely structural effects, with BiFeO3 as the prototype material, and (b) multiferroics where ferroelectricity is driven by correlation effects and is strongly linked to electronic degrees of freedom such as spin-, charge-, or orbital-ordering, with rare-earth manganites as prototypes. As for the first class of multiferroics, first principles calculations are shown to provide an accurate qualitative and quantitative description of the physics in BiFeO3, ranging from the prediction of large ferroelectric polarization and weak ferromagnetism, over the effect of epitaxial strain, to the identification of possible scenarios for coupling between ferroelectric and magnetic order. For the second class of multiferroics, ab initio calculations have shown that, in those cases where spin-ordering breaks inversion symmetry (e.g. in antiferromagnetic E-type HoMnO3), the magnetically induced ferroelectric polarization can be as large as a few µC cm-2. The examples presented point the way to several possible avenues for future research: on the technological side, first principles simulations can contribute to a rational materials design, aimed at identifying spintronic materials that exhibit ferromagnetism and ferroelectricity at or above room temperature. On the fundamental side, ab initio approaches can be used to explore new mechanisms for ferroelectricity by exploiting electronic correlations that are at play in transition metal oxides, and by suggesting ways to maximize the strength of these effects as well as the corresponding ordering temperatures.
NASA Astrophysics Data System (ADS)
Ben Jacob, Eshel; Shapira, Yoash; Tauber, Alfred I.
2006-01-01
We reexamine Schrödinger's reflections on the fundamental requirements for life in view of new observations about bacterial self-organization and the emerging understanding of gene-network regulation mechanisms and dynamics. Focusing on the energy, matter and thermodynamic imbalances provided by the environment, Schrödinger proposed his "consumption of negative entropy" requirement for life. We take the criterion further and propose that, besides "negative entropy", organisms extract latent information embedded in the complexity of their environment. By latent information we refer to the non-arbitrary spatio-temporal patterns of regularities and variations that characterize the environmental dynamics. Hence it can be used to generate an internal condensed description (model or usable information) of the environment which guides the organism's functioning. Accordingly, we propose that Schrödinger's criterion of "consumption of negative entropy" is not sufficient and that "consumption of latent information" is an additional fundamental requirement of life. In other words, all organisms, including bacteria, the most primitive (fundamental) ones, must be able to sense the environment and perform internal information processing for thriving on latent information embedded in the complexity of their environment. We then propose that by acting together, bacteria can perform this most elementary cognitive function more efficiently, as can be illustrated by their cooperative behavior (colonial or inter-cellular self-organization). As a member of a complex superorganism (the colony), each unit (bacterium) must possess the ability to sense and communicate with the other units comprising the collective and perform its task within a distribution of tasks. Bacterial communication thus entails collective sensing and cooperativity. The fundamental (primitive) elements of cognition in such systems include interpretation of (chemical) messages, distinction between internal and external information, and some self vs. non-self distinction (peers and cheaters). We outline how intra-cellular self-organization together with genome plasticity and membrane dynamics might, in principle, provide the intra-cellular mechanisms needed for these fundamental cognitive functions. In regard to intra-cellular processes, Schrödinger postulated that new physics is needed to explain the conversion of the genetically stored information into a functioning cell. At present, his ontogenetic dilemma is generally perceived to be solved and is attributed to a lack of knowledge when it was proposed. So it is widely accepted that there is no need for some unknown laws of physics to explain cellular ontogenetic development. We take a different view and, in Schrödinger's footsteps, suggest that yet unknown physics principles of self-organization in open systems are missing for understanding how to assemble the cell's components into an information-based functioning "machine".
NASA Astrophysics Data System (ADS)
Boriev, I. A.
2018-03-01
Astronomical data indicate the presence of dark matter (DM) in space, which is necessary to explain the observed dynamics of galaxies within Newtonian mechanics. DM, despite its very low density (∼10^-26 kg/m^3), constitutes the main part of the matter in the Universe, about 10 times the mass of all visible cosmic bodies. The properties of DM, which fills space, must therefore determine the physical properties of space and the fundamental physical laws. Taking into account the observed properties of the cosmic microwave background radiation (CMBR), whose energy is ∼90% of all cosmic radiation, and understanding this radiation as produced by DM motion, the conservation laws of classical physics and the principles of quantum mechanics receive a materialistic substantiation. Thus, the high homogeneity and isotropy of the CMBR (∼10^-4), and hence of DM (and space), justify the momentum and angular momentum conservation laws, respectively, according to E. Noether's theorems. The CMBR has a black-body spectrum at ∼2.7 K with maximum wavelength ∼1.9·10^-3 m, which allows one to calculate the mechanical action produced by DM thermal motion (∼7·10^-34 J·s). This value corresponds well to Planck's constant, which is also a mechanical action, and thereby gives a materialistic basis for the principles of quantum mechanics. The obtained results directly confirm the reality of DM and show that the CMBR is an observable manifestation of DM thermal motion. Understanding that the known creation of electron-positron pairs, as counter-rotating material vortices (according to their spins), occurs precisely from DM makes it possible to substantiate the positron nature of ball lightning, which for the first time explains all of its observed specific properties.
[Basic principles and results of brachytherapy in gynecological oncology].
Kanaev, S V; Turkevich, V G; Baranov, S B; Savel'eva, V V
2014-01-01
The fundamentals of contact radiation therapy (brachytherapy) for gynecological cancer are presented. During brachytherapy the principles of conformal radiotherapy should be implemented, the aim of which is to deliver the maximum possible dose of radiation to the tumor while decreasing the dose load on adjacent organs and tissues, which allows reducing the frequency of radiation damage in the treatment of primary tumors. This is feasible only at the modern technological level, thanks to precision topometric preparation and optimal computerized dosimetric and radiobiological planning of each session and of the radiotherapy course in general. Successful local and long-term results of contact radiation therapy for cancer of the cervix and endometrium depend on the optimal anatomical and topometric relationship between the tumor localization and the radioactive sources, as well as on the physical and radiobiological laws governing the distribution and effects of ionizing radiation and the rules for accounting for dose load.
Colloquium: Modeling the dynamics of multicellular systems: Application to tissue engineering
NASA Astrophysics Data System (ADS)
Kosztin, Ioan; Vunjak-Novakovic, Gordana; Forgacs, Gabor
2012-10-01
Tissue engineering is a rapidly evolving discipline that aims at building functional tissues to improve or replace damaged ones. To be successful in such an endeavor, ideally, the engineering of tissues should be based on the principles of developmental biology. Recent progress in developmental biology suggests that the formation of tissues from the composing cells is often guided by physical laws. Here a comprehensive computational-theoretical formalism is presented that is based on experimental input and incorporates biomechanical principles of developmental biology. The formalism is described and it is shown that it correctly reproduces and predicts the quantitative characteristics of the fundamental early developmental process of tissue fusion. Based on this finding, the formalism is then used toward the optimization of the fabrication of tubular multicellular constructs, such as a vascular graft, by bioprinting, a novel tissue engineering technology.
Fundamental Principles of Classical Mechanics: A Geometrical Perspective
NASA Astrophysics Data System (ADS)
Lam, Kai S.
2014-07-01
Classical mechanics is the quantitative study of the laws of motion for macroscopic physical systems with mass. The fundamental laws of this subject, known as Newton's Laws of Motion, are expressed in terms of second-order differential equations governing the time evolution of vectors in a so-called configuration space of a system (see Chapter 12). In an elementary setting, these are usually vectors in 3-dimensional Euclidean space, such as position vectors of point particles; but typically they can be vectors in higher dimensional and more abstract spaces. A general knowledge of the mathematical properties of vectors, not only in their most intuitive incarnations as directed arrows in physical space but as elements of abstract linear vector spaces, and those of linear operators (transformations) on vector spaces as well, is then indispensable in laying the groundwork for both the physical and the more advanced mathematical (more precisely, topological and geometrical) concepts that will prove to be vital in our subject. In this beginning chapter we will review these properties, and introduce the all-important related notions of dual spaces and tensor products of vector spaces. The notational convention for vectorial and tensorial indices used for the rest of this book (except when otherwise specified) will also be established...
NASA Astrophysics Data System (ADS)
2017-02-01
The main goal of the conference is to contribute new knowledge in the surface, interface, ultra-thin film and very-thin film science of inorganic and organic materials in the most rapid interactive manner: by direct communication among scientists of the corresponding research fields. The list of topics indicates that the conference interests cover the development of basic theoretical physical and chemical principles, the performance of surface-, thin-film-, and interface-related procedures, and corresponding experimental research on the atomic scale. Topical results are applied in the development of new inventive industrial equipment needed for the investigation of electrical, optical, and structural properties, and other parameters of atomic-scale research objects. From a physical point of view, the conference scope extends from fundamental research done on the sub-atomic and quantum level to the production of devices built on new physical principles. The conference topics also include the presentation of principally new devices in the following fields: solar cells, liquid crystal displays, high-temperature superconductivity, and sensors. During the event, special attention will be given to the evaluation of the scientific and technical quality of works prepared by PhD students, to the deep ecological significance of solar cell energy production, and to exhibitions by companies.
Fundamentals of fluid lubrication
NASA Technical Reports Server (NTRS)
Hamrock, Bernard J.
1991-01-01
The aim is to coordinate the topics of design, engineering dynamics, and fluid dynamics in order to aid researchers in the area of fluid film lubrication. The lubrication principles that are covered can serve as a basis for the engineering design of machine elements. The fundamentals of fluid film lubrication are presented clearly so that students who use the book will have confidence in their ability to apply these principles to a wide range of lubrication situations. Some guidance on applying these fundamentals to the solution of engineering problems is also provided.
ERIC Educational Resources Information Center
Silver, Mark S.
2017-01-01
During the current period of rapid technological change, business students need to emerge from their introductory course in Information Systems (IS) with a set of fundamental principles to help them "think about Information Technology (IT)" in future courses and the workplace. Given the digital revolution, they also need to appreciate…
The problem of the Grand Unification Theory
NASA Astrophysics Data System (ADS)
Treder, H.-J.
The evolution and fundamental questions of physical theories unifying the gravitational, electromagnetic, and quantum-mechanical interactions are explored, taking Pauli's aphorism as a motto: 'Let no man join what God has cast asunder.' The contributions of Faraday and Riemann, Lorentz, Einstein, and others are discussed, and the criterion of Pauli is applied to Grand Unification Theories (GUT) in general and to those seeking to link gravitation and electromagnetism in particular. Formal mathematical symmetry principles must be shown to have real physical relevance by predicting measurable phenomena not explainable without a GUT; these phenomena must be macroscopic because gravitational effects are too weak to be measured on the microscopic level. It is shown that empirical and theoretical studies of 'gravomagnetism', 'gravoelectricity', or possible links between gravoelectricity and the cosmic baryon asymmetry eventually lead back to basic questions which appear philosophical or purely mathematical but actually challenge physics to seek verifiable answers.
Automated sampling assessment for molecular simulations using the effective sample size
Zhang, Xin; Bhatt, Divesh; Zuckerman, Daniel M.
2010-01-01
To quantify the progress in the development of algorithms and forcefields used in molecular simulations, a general method for the assessment of sampling quality is needed. Statistical mechanics principles suggest that the populations of physical states characterize equilibrium sampling in a fundamental way. We therefore develop an approach for analyzing the variances in state populations, which quantifies the degree of sampling in terms of the effective sample size (ESS). The ESS estimates the number of statistically independent configurations contained in a simulated ensemble. The method is applicable both to traditional dynamics simulations and to more modern (e.g., multi-canonical) approaches. Our procedure is tested in a variety of systems, from toy models to atomistic protein simulations. We also introduce a simple automated procedure to obtain approximate physical states from dynamic trajectories: this allows sample-size estimation in systems for which physical states are not known in advance. PMID:21221418
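The following is a crude sketch of a variance-based ESS estimate in the spirit described above, not the authors' exact procedure; the block count and the prior state-assignment step are assumptions:

```python
import numpy as np

def effective_sample_size(state_traj, n_blocks=10):
    """Rough ESS from the variance of state populations across trajectory blocks."""
    blocks = np.array_split(np.asarray(state_traj), n_blocks)
    ess_per_block = []
    for s in np.unique(state_traj):
        p_hat = np.array([(b == s).mean() for b in blocks])   # state population per block
        p, var = p_hat.mean(), p_hat.var(ddof=1)
        if 0.0 < p < 1.0 and var > 0.0:
            # n_eff independent samples per block would give var(p_hat) ~ p(1 - p)/n_eff
            ess_per_block.append(p * (1.0 - p) / var)
    return n_blocks * min(ess_per_block)    # conservative estimate over states

# Synthetic, strongly correlated two-state trajectory: 200 independent draws,
# each repeated 50 times, so the true ESS is about 200 despite 10000 frames.
rng = np.random.default_rng(0)
traj = np.repeat(rng.integers(0, 2, size=200), 50)
print(round(effective_sample_size(traj)))
```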
Physical key-protected one-time pad
Horstmeyer, Roarke; Judkewitz, Benjamin; Vellekoop, Ivo M.; Assawaworrarit, Sid; Yang, Changhuei
2013-01-01
We describe an encrypted communication principle that forms a secure link between two parties without electronically saving either of their keys. Instead, random cryptographic bits are kept safe within the unique mesoscopic randomness of two volumetric scattering materials. We demonstrate how a shared set of patterned optical probes can generate 10 gigabits of statistically verified randomness between a pair of unique 2 mm3 scattering objects. This shared randomness is used to facilitate information-theoretically secure communication following a modified one-time pad protocol. Benefits of volumetric physical storage over electronic memory include the inability to probe, duplicate or selectively reset any bits without fundamentally altering the entire key space. Our ability to securely couple the randomness contained within two unique physical objects can extend to strengthen hardware required by a variety of cryptographic protocols, which is currently a critically weak link in the security pipeline of our increasingly mobile communication culture. PMID:24345925
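For orientation, the underlying one-time-pad operation is plain XOR with a never-reused random pad; the sketch below illustrates only that step, with the key bytes standing in for bits harvested from the scattering media:

```python
import secrets

def otp(message: bytes, pad: bytes) -> bytes:
    """One-time pad: XOR each message byte with a pad byte (the same call decrypts)."""
    assert len(pad) >= len(message), "the pad must be at least as long as the message"
    return bytes(m ^ k for m, k in zip(message, pad))

message = b"meet at dawn"
pad = secrets.token_bytes(len(message))   # stand-in for the shared physical randomness
ciphertext = otp(message, pad)
print(otp(ciphertext, pad))               # b'meet at dawn'
```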
Composition in the Quantum World
NASA Astrophysics Data System (ADS)
Hall, Edward Jonathan
This thesis presents a problem for the foundations of quantum mechanics. It arises from the way that theory describes the composition of larger systems in terms of smaller ones, and renders untenable a wide range of interpretations of quantum mechanics. That quantum mechanics is difficult to interpret is old news, given the well-known Measurement Problem. But the problem I raise is quite different, and in important respects more fundamental. In brief: the physical world exhibits mereological structure: physical objects have parts, which in turn have parts, and so on. A natural way to try to represent this structure is by means of a particle theory, according to which the physical world consists entirely of enduring physical objects which themselves have no proper parts, but aggregates of which are, or compose, all physical objects. Elementary, non-relativistic quantum mechanics can be cast in this mold, at least according to the usual expositions of that theory. But herein lies the problem: the standard attempt to give a systematic particle interpretation to elementary quantum mechanics results in nonsense, thanks to the well-established principle of Permutation Invariance, which constrains the quantum-mechanical description of systems containing identical particles. Specifically, it follows from the most minimal principles of a particle interpretation (much weaker than those needed to generate the Measurement Problem), together with Permutation Invariance, that systems identical in composition must have the same physical state. In other words, systems which merely have the same numbers of the same types of particles are therefore, at all times, perfect physical duplicates. This conclusion is absurd: e.g., it is quite plausible that some of those particles which compose my body make up a system identical in composition to some pepperoni pizza. Yet no part of me is a qualitative physical duplicate of any pepperoni pizza. Perhaps "you are what you eat", but not in this sense! In what follows I develop the principles needed to explore this problem, contrast it with the Measurement Problem, and consider, finally, how it should influence our judgments of the relative merits of the many extant interpretations of quantum mechanics.
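For reference (a textbook statement of the principle invoked above), Permutation Invariance requires that admissible states of identical particles satisfy, for example in the two-particle case,
\[ \psi(x_1, x_2) = \pm\,\psi(x_2, x_1), \]
symmetric for bosons and antisymmetric for fermions, with all observables commuting with the permutation operators; it is this constraint, combined with minimal particle-interpretation assumptions, that drives the argument sketched above.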
NASA Astrophysics Data System (ADS)
Guzzardi, Luca
2014-06-01
This paper discusses Ernst Mach's interpretation of the principle of energy conservation (EC) in the context of the development of energy concepts and ideas about causality in nineteenth-century physics and theory of science. In doing this, it focuses on the close relationship between causality, energy conservation and space in Mach's antireductionist view of science. Mach expounds his thesis about EC in his first historical-epistemological essay, Die Geschichte und die Wurzel des Satzes von der Erhaltung der Arbeit (1872): far from being a new principle, it is used from the early beginnings of mechanics independently from other principles; in fact, EC is a pre-mechanical principle which is generally applied in investigating nature: it is, indeed, nothing but a form of the principle of causality. The paper focuses on the scientific-historical premises and philosophical underpinnings of Mach's thesis, beginning with the classic debate on the validity and limits of the notion of cause by Hume, Kant, and Helmholtz. Such reference also implies a discussion of the relationship between causality on the one hand and space and time on the other. This connection plays a major role for Mach, and in the final paragraphs its importance is argued in order to understand his antireductionist perspective, i.e. the rejection of any attempt to give an ultimate explanation of the world via reduction of nature to one fundamental set of phenomena.
Hadronic and nuclear interactions in QCD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Despite the evidence that QCD - or something close to it - gives a correct description of the structure of hadrons and their interactions, it seems paradoxical that the theory has thus far had very little impact in nuclear physics. One reason for this is that the application of QCD to distances larger than 1 fm involves coherent, non-perturbative dynamics which is beyond present calculational techniques. For example, in QCD the nuclear force can evidently be ascribed to quark interchange and gluon exchange processes. These, however, are as complicated to analyze from a fundamental point of view as is the analogous covalent bond in molecular physics. Since a detailed description of quark-quark interactions and the structure of hadronic wavefunctions are not yet well understood in QCD, it is evident that a quantitative first-principle description of the nuclear force will require a great deal of theoretical effort. Another reason for the limited impact of QCD in nuclear physics has been the conventional assumption that nuclear interactions can for the most part be analyzed in terms of an effective meson-nucleon field theory or potential model in isolation from the details of the short distance quark and gluon structure of hadrons. These lectures argue that this view is untenable: in fact, there is no correspondence principle which yields traditional nuclear physics as a rigorous large-distance or non-relativistic limit of QCD dynamics. On the other hand, the distinctions between standard nuclear physics dynamics and QCD at nuclear dimensions are extremely interesting and illuminating for both particle and nuclear physics.
ERIC Educational Resources Information Center
Johnstone, D. Bruce
As background to the National Dialogue on Student Financial Aid, this essay discusses the fundamental assumptions and aims that underlie the principles and policies of federal financial aid to students. These eight assumptions and aims are explored: (1) higher education is the province of states, and not of the federal government; (2) the costs of…
A quantum annealing architecture with all-to-all connectivity from local interactions.
Lechner, Wolfgang; Hauke, Philipp; Zoller, Peter
2015-10-01
Quantum annealers are physical devices that aim at solving NP-complete optimization problems by exploiting quantum mechanics. The basic principle of quantum annealing is to encode the optimization problem in Ising interactions between quantum bits (qubits). A fundamental challenge in building a fully programmable quantum annealer is the competing requirements of fully controllable all-to-all connectivity and the quasi-locality of the interactions between physical qubits. We present a scalable architecture with full connectivity, which can be implemented with local interactions only. The input of the optimization problem is encoded in local fields acting on an extended set of physical qubits. The output is, in the spirit of topological quantum memories, redundantly encoded in the physical qubits, resulting in an intrinsic fault tolerance. Our model can be understood as a lattice gauge theory, where long-range interactions are mediated by gauge constraints. The architecture can be realized on various platforms with local controllability, including superconducting qubits, NV-centers, quantum dots, and atomic systems.
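To make the Ising encoding concrete, here is a minimal classical simulated-annealing sketch for a random Ising cost function; it stands in for the quantum annealing schedule and is not the architecture proposed in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
J = np.triu(rng.normal(size=(n, n)), 1)
J = J + J.T                        # symmetric random couplings, zero diagonal
h = rng.normal(size=n)             # local fields

def energy(s):
    """Ising cost function E(s) = -1/2 s^T J s - h.s encoding the optimization problem."""
    return -0.5 * s @ J @ s - h @ s

s = rng.choice([-1, 1], size=n)                # random initial spin configuration
for T in np.geomspace(5.0, 0.01, 4000):        # slowly lowered temperature
    i = rng.integers(n)
    dE = 2 * s[i] * (J[i] @ s + h[i])          # energy change if spin i is flipped
    if dE < 0 or rng.random() < np.exp(-dE / T):
        s[i] = -s[i]
print("spins:", s, "energy:", round(float(energy(s)), 3))
```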
Fundamentals of satellite navigation
NASA Astrophysics Data System (ADS)
Stiller, A. H.
The basic operating principles and capabilities of conventional and satellite-based navigation systems for air, sea, and land vehicles are reviewed and illustrated with diagrams. Consideration is given to autonomous onboard systems; systems based on visible or radio beacons; the Transit, Cicada, Navstar-GPS, and Glonass satellite systems; the physical laws and parameters of satellite motion; the definition of time in satellite systems; and the content of the demodulated GPS data signal. The GPS and Glonass data format frames are presented graphically, and tables listing the GPS and Glonass satellites, their technical characteristics, and the (past or scheduled) launch dates are provided.
Cellular automaton supercomputing
NASA Technical Reports Server (NTRS)
Wolfram, Stephen
1987-01-01
Many of the models now used in science and engineering are over a century old. And most of them can be implemented on modern digital computers only with considerable difficulty. Some new basic models are discussed which are much more directly suitable for digital computer simulation. The fundamental principle is that the models considered herein are as suitable as possible for implementation on digital computers. It is then a matter of scientific analysis to determine whether such models can reproduce the behavior seen in physical and other systems. Such analysis was carried out in several cases, and the results are very encouraging.
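A minimal example of such a digitally-native model is an elementary cellular automaton; the sketch below (rule 30 on a small periodic lattice) is illustrative only:

```python
def step(cells, rule=30):
    """One update of an elementary cellular automaton with periodic boundaries."""
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

cells = [0] * 31
cells[15] = 1                       # single seed cell in the middle
for _ in range(15):
    print("".join(".#"[c] for c in cells))
    cells = step(cells, rule=30)
```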
Adaptation of acoustic model experiments of STM via smartphones and tablets
NASA Astrophysics Data System (ADS)
Thees, Michael; Hochberg, Katrin; Kuhn, Jochen; Aeschlimann, Martin
2017-10-01
The importance of Scanning Tunneling Microscopy (STM) in today's research and industry leads to the question of how to include such a key technology in physics education. Manfred Euler has developed an acoustic model experiment to illustrate the fundamental measuring principles, based on an analogy between quantum mechanics and acoustics. Based on earlier work, we applied mobile devices such as smartphones and tablets, instead of a computer, to record and display the experimental data, and thus converted Euler's experimental setup into a low-cost experiment that is easy for students to build and handle themselves.
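For context (a standard relation, not part of the paper), the measuring principle being modeled acoustically is the exponential sensitivity of the tunneling current to the tip-sample distance d,
\[ I \propto V\,e^{-2\kappa d}, \qquad \kappa = \frac{\sqrt{2 m \phi}}{\hbar}, \]
where φ is the effective barrier height; this steep distance dependence is what gives STM its atomic-scale resolution and what the acoustic analogue reproduces.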
On Frequency Combs in Monolithic Resonators
NASA Astrophysics Data System (ADS)
Savchenkov, A. A.; Matsko, A. B.; Maleki, L.
2016-06-01
Optical frequency combs have become indispensable in astronomical measurements, biological fingerprinting, optical metrology, and radio frequency photonic signal generation. Recently demonstrated microring resonator-based Kerr frequency combs point the way towards chip-scale optical frequency comb generators that retain the major properties of lab-scale devices. This technique is promising for integrated miniature radiofrequency and microwave sources, atomic clocks, optical references and femtosecond pulse generators. Here we present Kerr frequency comb development in a historical perspective emphasizing its similarities and differences with other physical phenomena. We elucidate fundamental principles and describe practical implementations of Kerr comb oscillators, highlighting associated solved and unsolved problems.
Scale relativity: from quantum mechanics to chaotic dynamics.
NASA Astrophysics Data System (ADS)
Nottale, L.
Scale relativity is a new approach to the problem of the origin of fundamental scales and of scaling laws in physics, which consists in generalizing Einstein's principle of relativity to the case of scale transformations of resolutions. We recall here how it leads to the concept of fractal space-time and to the introduction of a new complex time-derivative operator, which allows one to recover the Schrödinger equation and then to generalize it. In high-energy quantum physics, it leads to the introduction of a Lorentzian renormalization group, in which the Planck length is reinterpreted as a lowest, impassable scale, invariant under dilatations. These methods are successively applied to two problems: in quantum mechanics, that of the mass spectrum of elementary particles; in chaotic dynamics, that of the distribution of planets in the Solar System.
NASA Astrophysics Data System (ADS)
Lachieze-Rey, Marc
This book delivers a quantitative account of the science of cosmology, designed for a non-specialist audience. The basic principles are outlined using simple maths and physics, while still providing rigorous models of the Universe. It offers an ideal introduction to the key ideas in cosmology, without going into technical details. The approach used is based on the fundamental ideas of general relativity such as the spacetime interval, comoving coordinates, and spacetime curvature. It provides an up-to-date and thoughtful discussion of the big bang, and the crucial questions of structure and galaxy formation. Questions of method and philosophical approaches in cosmology are also briefly discussed. Advanced undergraduates in either physics or mathematics would benefit greatly from its use, either as a course text or as a supplementary guide to cosmology courses.
NASA Astrophysics Data System (ADS)
Chang, Li-Na; Luo, Shun-Long; Sun, Yuan
2017-11-01
The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002, the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182
Minimal Length Scale Scenarios for Quantum Gravity.
Hossenfelder, Sabine
2013-01-01
We review the question of whether the fundamental laws of nature limit our ability to probe arbitrarily short distances. First, we examine what insights can be gained from thought experiments for probes of shortest distances, and summarize what can be learned from different approaches to a theory of quantum gravity. Then we discuss some models that have been developed to implement a minimal length scale in quantum mechanics and quantum field theory. These models have entered the literature as the generalized uncertainty principle or the modified dispersion relation, and have allowed the study of the effects of a minimal length scale in quantum mechanics, quantum electrodynamics, thermodynamics, black-hole physics and cosmology. Finally, we touch upon the question of ways to circumvent the manifestation of a minimal length scale in short-distance physics.
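For orientation, one commonly quoted parametrization of the generalized uncertainty principle mentioned here is the following (an illustrative form; the coefficients and higher-order terms vary between the models reviewed):

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\Bigl[\,1 + \beta\,(\Delta p)^{2}\Bigr],
\qquad \beta \sim \frac{\ell_{\mathrm{min}}^{2}}{\hbar^{2}},
```

which implies a smallest attainable position uncertainty of order \ell_{\mathrm{min}} = \hbar\sqrt{\beta}, i.e. a minimal length scale.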
Quantum Consciousness - The Road to Reality
NASA Astrophysics Data System (ADS)
Goradia, Shantilal
Per Einstein's theory, mass tells space how to curve and space tells mass how to move. How do they "tell"? The question boils down to information created by quantum particles blinking ON and OFF, analogous to 'Yin and Yang', or in some more complex ways that may include dark matter. Consciousness, dark matter, quantum physics, the uncertainty principle, constants of nature like the strong coupling, the fine structure constant, the cosmological constant introduced by Einstein, information, gravitation, etc. are fundamentally consequences of that ONE TOE. Vedic philosophers, who impressed Schrödinger so much, called it ATMA, split into the categories of AnuAtma (particle soul), JivAtma (life soul) and ParamAtma (Omnipresent soul), which we relate to quantum physics, biology and cosmology. There is no separate TOE (Theory of Everything) for any one thing.
A Variational Method in Out-of-Equilibrium Physical Systems
Pinheiro, Mario J.
2013-01-01
We propose a new variational principle for out-of-equilibrium dynamic systems, fundamentally based on the method of Lagrange multipliers applied to the total entropy of an ensemble of particles. However, we use the fundamental equation of thermodynamics in differential form, considering U and S as 0-forms. We obtain a set of two first-order differential equations that reveal the same formal symplectic structure shared by classical mechanics, fluid mechanics and thermodynamics. From this approach a topological torsion current emerges, expressed in terms of the components Aj of the vector potential (gravitational and/or electromagnetic) and the components ωk of the angular velocity ω of the accelerated frame. We derive a special form of the Umov-Poynting theorem for rotating gravito-electromagnetic systems. The variational method is then applied to clarify the working mechanism of particular devices. PMID:24316718
Analytical exploration of the thermodynamic potentials by using symbolic computation software
NASA Astrophysics Data System (ADS)
Hantsaridou, Anastasia P.; Polatoglou, Hariton M.
2005-09-01
Thermodynamics is a very general theory, based on fundamental symmetries. It generalizes classical mechanics and incorporates theoretical concepts such as field and field equations. Although all these ingredients are of the highest importance for a scientist, they are not given the attention they perhaps deserve in most undergraduate courses. Nowadays, powerful computers in conjunction with equally powerful software can ease the exploration of the crucial ideas of thermodynamics. The purpose of the present work is to show how the utilization of symbolic computation software can lead to a complementary understanding of thermodynamics. The method was applied to first and second year physics students in the Aristotle University of Thessaloniki (Greece) during the 2002-2003 academic year. The results indicate that symbolic computation software is appropriate not only for enhancing the teaching of the fundamental principles in thermodynamics and their applications, but also for increasing students' motivation for learning.
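As a small illustration of the kind of symbolic exploration described (an assumed example for this compilation, not the authors' worksheet), a computer algebra system can verify a Maxwell relation directly from a model free energy:

```python
# Verify the Maxwell relation (dP/dT)_V = (dS/dV)_T for a van der Waals gas,
# starting from the volume-dependent part of its Helmholtz free energy
# (temperature-only terms drop out of both derivatives).
import sympy as sp

T, V, n, R, a, b = sp.symbols("T V n R a b", positive=True)
F = -n * R * T * sp.log(V - n * b) - a * n**2 / V   # model Helmholtz free energy

P = -sp.diff(F, V)    # pressure:  P = -(dF/dV)_T  -> van der Waals equation of state
S = -sp.diff(F, T)    # entropy (volume-dependent part): S = -(dF/dT)_V
print(sp.simplify(sp.diff(P, T) - sp.diff(S, V)))   # prints 0
```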
Advances in quantifying air-sea gas exchange and environmental forcing.
Wanninkhof, Rik; Asher, William E; Ho, David T; Sweeney, Colm; McGillis, Wade R
2009-01-01
The past decade has seen a substantial amount of research on air-sea gas exchange and its environmental controls. These studies have significantly advanced the understanding of processes that control gas transfer, led to higher quality field measurements, and improved estimates of the flux of climate-relevant gases between the ocean and atmosphere. This review discusses the fundamental principles of air-sea gas transfer and recent developments in gas transfer theory, parameterizations, and measurement techniques in the context of the exchange of carbon dioxide. However, much of this discussion is applicable to any sparingly soluble, non-reactive gas. We show how the use of global variables of environmental forcing that have recently become available and gas exchange relationships that incorporate the main forcing factors will lead to improved estimates of global and regional air-sea gas fluxes based on better fundamental physical, chemical, and biological foundations.
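To make the quantities concrete, here is a hedged bulk-flux sketch in the spirit of the review; the quadratic wind-speed parameterization is one commonly used form, and the coefficient, solubility and pCO2 values below are illustrative assumptions rather than numbers taken from the paper.

```python
# Bulk air-sea CO2 flux: F = k * K0 * (pCO2_sea - pCO2_air).
SECONDS_PER_YEAR = 365.25 * 24 * 3600.0

def transfer_velocity_cm_per_hr(u10, Sc=660.0, coeff=0.31):
    """Gas transfer velocity k (cm/hr) from the 10-m wind speed u10 (m/s),
    scaled by the Schmidt number Sc (~660 for CO2 in seawater at 20 C)."""
    return coeff * u10 ** 2 * (660.0 / Sc) ** 0.5

u10 = 7.0        # m/s, example wind speed
K0 = 3.2e-5      # mol m^-3 uatm^-1, assumed CO2 solubility
dpCO2 = 60.0     # uatm, assumed sea-air pCO2 difference (positive = outgassing)

k = transfer_velocity_cm_per_hr(u10) / 100.0 / 3600.0    # cm/hr -> m/s
flux = k * K0 * dpCO2                                     # mol m^-2 s^-1
print(f"CO2 flux ~ {flux * SECONDS_PER_YEAR:.2f} mol m^-2 yr^-1")
```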
Complementary Huygens Principle for Geometrical and Nongeometrical Optics
ERIC Educational Resources Information Center
Luis, Alfredo
2007-01-01
We develop a fundamental principle depicting the generalized ray formulation of optics provided by the Wigner function. This principle is formally identical to the Huygens-Fresnel principle but in terms of opposite concepts, rays instead of waves, and incoherent superpositions instead of coherent ones. This ray picture naturally includes…
Limits to magnetic resonance microscopy
NASA Astrophysics Data System (ADS)
Glover, Paul; Mansfield, Peter, Sir
2002-10-01
The last quarter of the twentieth century saw the development of magnetic resonance imaging (MRI) grow from a laboratory demonstration to a multi-billion dollar worldwide industry. There is a clinical body scanner in almost every hospital of the developed nations. The field of magnetic resonance microscopy (MRM), after mostly being abandoned by researchers in the first decade of MRI, has become an established branch of the science. This paper reviews the development of MRM over the last decade with an emphasis on the current state of the art. The fundamental principles of imaging and signal detection are examined to determine the physical principles which limit the available resolution. The limits are discussed with reference to liquid, solid and gas phase microscopy. In each area, the novel approaches employed by researchers to push back the limits of resolution are discussed. Although the limits to resolution are well known, the developments and applications of MRM have not reached their limit.
Nurzaman, Surya G.
2016-01-01
Sensor morphology, the morphology of a sensing mechanism that shapes the desired response to physical stimuli from the surroundings into signals usable as sensory information, is one of the key common aspects of sensing processes. This paper presents a structured review of research on bioinspired sensor morphology implemented in robotic systems and discusses the fundamental design principles. Based on the literature review, we propose two key arguments: first, owing to its synthetic nature, the biologically inspired robotics approach is a unique and powerful methodology for understanding the role of sensor morphology and how it can evolve and adapt to its task and environment. Second, considering an integrative view of perception by looking into multidisciplinary and overarching mechanisms of sensor morphology adaptation across biology and engineering enables us to extract relevant design principles that are important for extending our understanding of the unfinished concepts in sensing and perception. PMID:27499843
Nanoelectronics: Opportunities for future space applications
NASA Technical Reports Server (NTRS)
Frazier, Gary
1995-01-01
Further improvements in the performance of integrated electronics will eventually halt due to practical fundamental limits on our ability to downsize transistors and interconnect wiring. Avoiding these limits requires a revolutionary approach to switching device technology and computing architecture. Nanoelectronics, the technology of exploiting physics on the nanometer scale for computation and communication, attempts to avoid conventional limits by developing new approaches to switching, circuitry, and system integration. This presentation overviews the basic principles that operate on the nanometer scale and how they can be assembled into practical devices and circuits. Quantum resonant tunneling (RT) is used as the center-piece of the overview since RT devices already operate at high temperature (120 degrees C) and can be scaled, in principle, to a few nanometers in semiconductors. Near- and long-term applications of GaAs and silicon quantum devices are suggested for signal and information processing, memory, optoelectronics, and radio frequency (RF) communication.
NASA Astrophysics Data System (ADS)
Wentzcovitch, R. M.; Da Silveira, P. R.; Wu, Z.; Yu, Y.
2013-12-01
Today, first-principles calculations in mineral physics play a fundamental role in our understanding of the Earth. They complement experiments by expanding the pressure and temperature range for which properties can be obtained and provide access to atomic-scale phenomena. Since the wealth of predictive first-principles results can hardly be communicated in printed form, we have developed online applications where published results can be reproduced/verified online and extensive unpublished results can be generated in customized form. So far these applications have included thermodynamic properties of end-member phases and thermal elastic properties of end-member phases and a few solid solutions. Extension of this software infrastructure to include other properties is in principle straightforward. This contribution will review the nature of the results that can be generated (methods, thermodynamic domain, list of minerals, properties, etc.) and the nature of the software infrastructure. These applications are part of a more extensive cyber-infrastructure operating in the XSEDE - the VLab Science Gateway [1]. [1] https://www.xsede.org/web/guest/gateways-listing Research supported by NSF grants ATM-0428744 and EAR-1047629.
Tightening the entropic uncertainty bound in the presence of quantum memory
NASA Astrophysics Data System (ADS)
Adabi, F.; Salimi, S.; Haseli, S.
2016-06-01
The uncertainty principle is a fundamental principle in quantum physics. It implies that the measurement outcomes of two incompatible observables cannot be predicted simultaneously. In quantum information theory, this principle can be expressed in terms of entropic measures. M. Berta et al. [Nat. Phys. 6, 659 (2010), 10.1038/nphys1734] have indicated that the uncertainty bound can be altered by considering a particle as a quantum memory correlating with the primary particle. In this article, we obtain a lower bound for the entropic uncertainty in the presence of a quantum memory by adding an additional term depending on the Holevo quantity and the mutual information. We conclude that our lower bound is tightened with respect to that of Berta et al. when the accessible information about the measurement outcomes is less than the mutual information of the joint state. Some examples have been investigated for which our lower bound is tighter than that of Berta et al. Using our lower bound, a lower bound for the entanglement of formation of bipartite quantum states has been obtained, as well as an upper bound for the regularized distillable common randomness.
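For reference, the Berta et al. bound that this abstract builds on can be written as follows (standard notation, added here for orientation; the paper's result adds Holevo-quantity and mutual-information terms to the right-hand side):

```latex
S(Q\,|\,B) + S(R\,|\,B) \;\ge\; \log_2\frac{1}{c} + S(A\,|\,B),
\qquad c = \max_{i,j}\,\bigl|\langle \psi_i \,|\, \phi_j \rangle\bigr|^{2},
```

where Q and R are the two incompatible observables measured on particle A, with eigenstates |ψ_i⟩ and |φ_j⟩, and B is the quantum memory.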
NASA Astrophysics Data System (ADS)
Xu, M.; Yang, J. Y.; Liu, L. H.
2016-07-01
The macroscopic physical properties of solids are fundamentally determined by the interactions among microscopic electrons, phonons and photons. In this work, the thermal conductivity and infrared-visible-ultraviolet dielectric functions of alkali chlorides and their temperature dependence are fully investigated at the atomic level, seeking to unveil the microscopic quantum interactions beneath the macroscopic properties. The microscopic phonon-phonon interaction dominates the thermal conductivity which can be investigated by the anharmonic lattice dynamics in combination with Peierls-Boltzmann transport equation. The photon-phonon and electron-photon interaction intrinsically induce the infrared and visible-ultraviolet dielectric functions, respectively, and such microscopic processes can be simulated by first-principles molecular dynamics without empirical parameters. The temperature influence on dielectric functions can be effectively included by choosing the thermally equilibrated configurations as the basic input to calculate the total dipole moment and electronic band structure. The overall agreement between first-principles simulations and literature experiments enables us to interpret the macroscopic thermal conductivity and dielectric functions of solids in a comprehensive way.
Opportunities for Maturing Precision Metrology with Ultracold Gas Studies Aboard the ISS
NASA Astrophysics Data System (ADS)
Williams, Jason; D'Incao, Jose
2017-04-01
Precision atom interferometers (AI) in space are expected to become an enabling technology for future fundamental physics research, with proposals including unprecedented tests of the validity of the weak equivalence principle, measurements of the fine structure and gravitational constants, and detection of gravity waves and dark matter/dark energy. We will discuss our preparation at JPL to use NASA's Cold Atom Lab facility (CAL) to mature the technology of precision, space-based, AIs. The focus of our flight project is three-fold: a) study the controlled dynamics of heteronuclear Feshbach molecules, at temperatures of nano-Kelvins or below, as a means to overcome uncontrolled density-profile-dependent shifts in differential AIs, b) demonstrate unprecedented atom-photon coherence times with spatially constrained AIs, c) use the imaging capabilities of CAL to detect and analyze spatial fringe patterns written onto the clouds after AI and thereby measure the rotational noise of the ISS. The impact from this work, and potential for follow-on studies, will also be reviewed in the context of future space-based fundamental physics missions. This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.
Lung evolution as a cipher for physiology
Torday, J. S.; Rehan, V. K.
2009-01-01
In the postgenomic era, we need an algorithm to readily translate genes into physiologic principles. The failure to advance biomedicine is due to the false hope raised in the wake of the Human Genome Project (HGP) by the promise of systems biology as a ready means of reconstructing physiology from genes. Like the atom in physics, the cell, not the gene, is the smallest completely functional unit of biology. Trying to reassemble gene regulatory networks without accounting for this fundamental feature of evolution will result in a genomic atlas, but not an algorithm for functional genomics. For example, the evolution of the lung can be “deconvoluted” by applying cell-cell communication mechanisms to all aspects of lung biology: development, homeostasis, and regeneration/repair. Gene regulatory networks common to these processes predict ontogeny, phylogeny, and the disease-related consequences of failed signaling. This algorithm elucidates characteristics of vertebrate physiology as a cascade of emergent and contingent cellular adaptational responses. By reducing complex physiological traits to gene regulatory networks and arranging them hierarchically in a self-organizing map, like the periodic table of elements in physics, the first principles of physiology will emerge. PMID:19366785
NASA Astrophysics Data System (ADS)
Yao, Yi; Kanai, Yosuke
Our ability to correctly model the association of oppositely charged ions in water is fundamental in physical chemistry and essential to various technological and biological applications of molecular dynamics (MD) simulations. MD simulations using classical force fields often show strong clustering of NaCl in aqueous ionic solutions as a consequence of a deep contact-pair minimum in the potential of mean force (PMF) curve. First-principles molecular dynamics (FPMD) based on density functional theory (DFT) with the popular PBE exchange-correlation approximation, on the other hand, shows a different result, with a shallow contact-pair minimum in the PMF. We employed two of the most promising exchange-correlation approximations, ωB97X-V by Mardirossian and Head-Gordon and SCAN by Sun, Ruzsinszky and Perdew, to examine the PMF using FPMD simulations. ωB97X-V is highly empirical, optimized in the space of range-separated hybrid functionals with a dispersion correction, while SCAN is the most recent meta-GGA functional, constructed by satisfying various known conditions in well-defined physical limits. We will discuss our findings for the PMF, charge transfer, water dipoles, etc.
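As an aside for readers unfamiliar with the quantity (not code from this work), once a radial distribution function g(r) for the Na-Cl pair has been accumulated from a trajectory, one common route to the PMF is W(r) = -kB T ln g(r); the sketch below uses a made-up g(r) purely for illustration.

```python
# Potential of mean force from a (synthetic) ion-pair radial distribution function.
import numpy as np

kB, T = 1.380649e-23, 300.0                          # J/K, K
r = np.linspace(2.0, 8.0, 200)                       # angstrom
g = 1.0 + 1.5 * np.exp(-((r - 2.8) / 0.3) ** 2)      # made-up g(r) with a contact peak
W = -kB * T * np.log(g)                              # potential of mean force (J)
print(f"contact minimum depth ~ {W.min() / (kB * T):.2f} kBT at r = {r[W.argmin()]:.2f} A")
```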
Inerton fields: very new ideas on fundamental physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krasnoholovets, Volodymyr
2010-12-22
Modern theories of everything, or theories of the grand unification of all physical interactions, try to describe the whole world starting from the first principles of quantum theory. However, the first principles operate with undetermined notions, such as the wave ψ-function, particle, lepton and quark, de Broglie and Compton wavelengths, mass, electric charge, spin, electromagnetic field, photon, gravitation, physical vacuum, space, etc. From a logical point of view this means that such a modern approach to the theory of everything is condemned to failure... Thus, what should we suggest to improve the situation? It seems quite reasonable to develop initially a theory of something, which will be able to clarify the major fundamental notions (listed above) that physics operates with every day. What would be a starting point in such an approach? Of course a theory of space as such, because particles and all physical fields emerge just from space. After that, when a particle and fields (and hence the fields' carriers) are well defined and introduced in the well-defined physical space, different kinds of interactions can be proposed and investigated. Moreover, we must also allow for a possible interaction of a created particle with the space that generated the appearance of the particle. The mathematical studies of Michel Bounias and the author have shown what the real physical space is, how the space is constituted, how it is arranged and what its elements are. Having constructed the real physical space we can then derive whatever we wish, in particular, such basic notions as mass, particle and charge. How is the mechanics of such objects (a massive particle, a charged massive particle) organised? The appropriate theory of motion has been called a submicroscopic mechanics of particles, which is developed in the real physical space, not an abstract phase space, as conventional quantum mechanics does. A series of questions arises: Can these two mechanics (submicroscopic and conventional quantum mechanics) be unified? What new can such a unification bring us? Can submicroscopic mechanics be a starting point for the derivation of the phenomenon of gravity? Can this new theory be a unified physical theory? Does the theory allow experimental verification? These major points have been clarified in detail. And, perhaps, the most intriguing aspect of the theory is the derivation of a new physical field associated with the notion of mass (or rather the inertia of a particle), which has been called the inerton field and which represents the real sense of the particle's wave ψ-function. This field emerges by analogy with the electromagnetic field associated with the notion of the electric charge. Yes, the postulated inerton field has been tested in a series of different experiments. Even more, the inerton field might have a number of practical applications...
Cosmology of Universe Particles and Beyond
NASA Astrophysics Data System (ADS)
Xu, Wei
2016-06-01
For the first time in history, all properties of cosmology particles are uncovered and described concisely and systematically, known as the elementary particles in contemporary physics. Aligning with the synthesis of the virtual and physical worlds in a hierarchical taxonomy of the universe, this theory refines the topology framework of cosmology, and presents a new perspective of the Yin Yang natural laws that, through the processes of creation and reproduction, the fundamental elements generate an infinite series of circular objects and a Yin Yang duality of dynamic fields that are sequenced and transformed states of matter between the virtual and physical worlds. Once virtual objects are transformed, they embody various enclaves of energy states, known as dark energy, quarks, leptons, bosons, protons, and neutrons, characterized by their incentive oscillations of timestate variables in a duality of virtual realities: energy and time, spin and charge, mass and space, symmetry and antisymmetry. As a consequence, it derives the fully-scaled quantum properties of physical particles in accordance with numerous historical experiments, and has overcome the limitations of uncertainty principle and the Standard Model, towards concisely exploring physical nature and beyond...
Framework for non-coherent interface models at finite displacement jumps and finite strains
NASA Astrophysics Data System (ADS)
Ottosen, Niels Saabye; Ristinmaa, Matti; Mosler, Jörn
2016-05-01
This paper deals with a novel constitutive framework suitable for non-coherent interfaces, such as cracks, undergoing large deformations in a geometrically exact setting. For this type of interface, the displacement field shows a jump across the interface. Within the engineering community, so-called cohesive zone models are frequently applied in order to describe non-coherent interfaces. However, for existing models to comply with the restrictions imposed by (a) thermodynamical consistency (e.g., the second law of thermodynamics), (b) balance equations (in particular, balance of angular momentum) and (c) material frame indifference, these models are essentially fiber models, i.e. models where the traction vector is collinear with the displacement jump. This constrains the ability to model shear and, in addition, excludes anisotropic effects. A novel, extended constitutive framework which is consistent with the above-mentioned fundamental physical principles is elaborated in this paper. In addition to the classical tractions associated with a cohesive zone model, the main idea is to consider additional tractions related to membrane-like forces and out-of-plane shear forces acting within the interface. For zero displacement jump, i.e. coherent interfaces, this framework degenerates to existing formulations presented in the literature. For hyperelasticity, the Helmholtz energy of the proposed novel framework depends on the displacement jump as well as on the tangent vectors of the interface with respect to the current configuration - or equivalently - the Helmholtz energy depends on the displacement jump and the surface deformation gradient. It turns out that by defining the Helmholtz energy in terms of the invariants of these variables, all the above-mentioned fundamental physical principles are automatically fulfilled. Extensions of the novel framework necessary for material degradation (damage) and plasticity are also covered.
Schenck, Robert C.; Richter, Dustin L.; Wascher, Daniel C.
2014-01-01
Background: Traumatic knee dislocation is becoming more prevalent because of improved recognition and increased exposure to high-energy trauma, but long-term results are lacking. Purpose: To present 2 cases with minimum 20-year follow-up and a review of the literature to illustrate some of the fundamental principles in the management of the dislocated knee. Study Design: Review and case reports. Methods: Two patients with knee dislocations who underwent multiligamentous knee reconstruction were reviewed, with a minimum 20-year follow-up. These patients were brought back for a clinical evaluation using both subjective and objective measures. Subjective measures included the following scales: Lysholm, Tegner activity, visual analog scale (VAS), Short Form–36 (SF-36), International Knee Documentation Committee (IKDC), and a psychosocial questionnaire. Objective measures included ligamentous examination, radiographic evaluation (including Telos stress radiographs), and physical therapy assessment of function and stability. Results: The mean follow-up was 22 years. One patient had a vascular injury requiring repair prior to ligament reconstruction. The average assessment scores were as follows: SF-36 physical health, 52; SF-36 mental health, 59; Lysholm, 92; IKDC, 86.5; VAS involved, 10.5 mm; and VAS uninvolved, 2.5 mm. Both patients had excellent stability and were functioning at high levels of activity for their age (eg, hiking, skydiving). Both patients had radiographic signs of arthritis, which lowered 1 subject’s IKDC score to “C.” Conclusion: Excellent long-term results after knee dislocation are rare, and most intermediate-term studies show fair to good functional results. By following fundamental principles in the management of a dislocated knee, patients can be given the opportunity to function at high levels. Hopefully, continued advances in the evaluation and treatment of knee dislocations will improve the long-term outcomes for these patients in the future. PMID:26535332
An integration of integrated information theory with fundamental physics
Barrett, Adam B.
2014-01-01
To truly eliminate Cartesian ghosts from the science of consciousness, we must describe consciousness as an aspect of the physical. Integrated Information Theory states that consciousness arises from intrinsic information generated by dynamical systems; however, existing formulations of this theory are not applicable to standard models of fundamental physical entities. Modern physics has shown that fields are fundamental entities, and in particular that the electromagnetic field is fundamental. Here I hypothesize that consciousness arises from information intrinsic to fundamental fields. This hypothesis unites fundamental physics with what we know empirically about the neuroscience underlying consciousness, and it bypasses the need to consider quantum effects. PMID:24550877
Criticism of generally accepted fundamentals and methodologies of traffic and transportation theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerner, Boris S.
It is explained why the set of the fundamental empirical features of traffic breakdown (a transition from free flow to congested traffic) should be the empirical basis for any traffic and transportation theory that can be reliably used for control and optimization in traffic networks. It is shown that generally accepted fundamentals and methodologies of traffic and transportation theory are not consistent with the set of the fundamental empirical features of traffic breakdown at a highway bottleneck. To these fundamentals and methodologies of traffic and transportation theory belong (i) Lighthill-Whitham-Richards (LWR) theory, (ii) the General Motors (GM) model class (for example, the Herman, Gazis et al. GM model, Gipps's model, Payne's model, Newell's optimal velocity (OV) model, Wiedemann's model, the Bando et al. OV model, Treiber's IDM, and Krauß's model), (iii) the understanding of highway capacity as a particular stochastic value, and (iv) principles for traffic and transportation network optimization and control (for example, Wardrop's user equilibrium (UE) and system optimum (SO) principles). Alternatively to these generally accepted fundamentals and methodologies of traffic and transportation theory, we discuss three-phase traffic theory as the basis for traffic flow modeling as well as briefly consider the network breakdown minimization (BM) principle for the optimization of traffic and transportation networks with road bottlenecks.
ERIC Educational Resources Information Center
Kalaja, Sami Pekka; Jaakkola, Timo Tapio; Liukkonen, Jarmo Olavi; Digelidis, Nikolaos
2012-01-01
Background: There is evidence showing that fundamental movement skills and physical activity are related with each other. The ability to perform a variety of fundamental movement skills increases the likelihood of children participating in different physical activities throughout their lives. However, no fundamental movement skill interventions…
Pitch glide effect induced by a nonlinear string-barrier interaction
NASA Astrophysics Data System (ADS)
Kartofelev, Dmitri; Stulov, Anatoli; Välimäki, Vesa
2015-10-01
Interactions of a vibrating string with its supports and other spatially distributed barriers play a significant role in the physics of many stringed musical instruments. It is well known that the tone of the string vibrations is determined by the string supports, and that the boundary conditions of the string termination may cause a short-lasting initial fundamental frequency shifting. Generally, this phenomenon is associated with the nonlinear modulation of the stiff string tension. The aim of this paper is to study the initial frequency glide phenomenon that is induced only by the string-barrier interaction, apart from other possible physical causes, and without the interfering effects of dissipation and dispersion. From a numerical simulation perspective, this highly nonlinear problem may present various difficulties, not the least of which is the risk of numerical instability. We propose a numerically stable and a purely kinematic model of the string-barrier interaction, which is based on the travelling wave solution of the ideal string vibration. The model is capable of reproducing the motion of the vibrating string exhibiting the initial fundamental frequency glide, which is caused solely by the complex nonlinear interaction of the string with its termination. The results presented in this paper can expand our knowledge and understanding of the timbre evolution and the physical principles of sound generation of numerous stringed instruments, such as lutes called the tambura, sitar and biwa.
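To make the kinematic idea concrete, here is a minimal sketch written for this compilation (assumed parameters; not the authors' exact algorithm): an ideal string advanced with the exact travelling-wave update, so there is no numerical dissipation or dispersion, and a flat rigid barrier near one termination handled purely kinematically by clamping any penetrating displacement to the barrier profile.

```python
# Ideal string with rigid terminations and a unilateral rigid barrier near x = 0.
import numpy as np

N = 200                                   # interior grid points
x = np.linspace(0.0, 1.0, N + 2)
y_now = 0.01 * np.sin(np.pi * x)          # initial shape: fundamental mode, at rest
y_prev = y_now.copy()

barrier_height = -0.002                   # assumed barrier level (below rest position)
barrier_zone = x < 0.1                    # barrier spans the first 10% of the string

for step in range(2000):
    y_next = np.empty_like(y_now)
    # Courant number = 1: this finite-difference update reproduces the exact
    # d'Alembert travelling-wave solution of the ideal string.
    y_next[1:-1] = y_now[2:] + y_now[:-2] - y_prev[1:-1]
    y_next[0] = y_next[-1] = 0.0          # rigid terminations
    # kinematic collision handling: the string may not penetrate the barrier
    y_next[barrier_zone] = np.maximum(y_next[barrier_zone], barrier_height)
    y_prev, y_now = y_now, y_next

print("midpoint displacement after 2000 steps:", y_now[y_now.size // 2])
```

Tracking the zero crossings of the midpoint displacement over time gives an estimate of the effective fundamental, which is raised while the string interacts with the barrier, in the spirit of the pitch glide discussed above.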
Montévil, Maël; Speroni, Lucia; Sonnenschein, Carlos; Soto, Ana M
2016-10-01
In multicellular organisms, relations among parts and between parts and the whole are contextual and interdependent. These organisms and their cells are ontogenetically linked: an organism starts as a cell that divides producing non-identical cells, which organize in tri-dimensional patterns. These association patterns and cell types change as tissues and organs are formed. This contextuality and circularity makes it difficult to establish detailed cause and effect relationships. Here we propose an approach to overcome these intrinsic difficulties by combining the use of two models: 1) an experimental one that employs 3D culture technology to obtain the structures of the mammary gland, namely, ducts and acini, and 2) a mathematical model based on biological principles. The typical approach for mathematical modeling in biology is to apply mathematical tools and concepts developed originally in physics or computer sciences. Instead, we propose to construct a mathematical model based on proper biological principles. Specifically, we use principles identified as fundamental for the elaboration of a theory of organisms, namely i) the default state of cell proliferation with variation and motility and ii) the principle of organization by closure of constraints. This model has a biological component, the cells, and a physical component, a matrix which contains collagen fibers. Cells display agency and move and proliferate unless constrained; they exert mechanical forces that i) act on collagen fibers and ii) on other cells. As fibers organize, they constrain the cells in their ability to move and to proliferate. The model exhibits a circularity that can be interpreted in terms of closure of constraints. Implementing the mathematical model shows that constraints to the default state are sufficient to explain ductal and acinar formation, and points to a target of future research, namely, inhibitors of cell proliferation and motility generated by the epithelial cells. The success of this model suggests a step-wise approach whereby additional constraints imposed by the tissue and the organism could be examined in silico and rigorously tested by in vitro and in vivo experiments, in accordance with the organicist perspective we embrace. Copyright © 2016. Published by Elsevier Ltd.
Cornejo Moreno, Borys Alberto; Gómez Arteaga, Gress Marissell
2012-12-16
Even though we are now well into the 21st century and notwithstanding all the abuse to individuals involved in clinical studies that has been documented throughout History, fundamental ethical principles continue to be violated in one way or another. Here are some of the main factors that contribute to the abuse of subjects participating in clinical trials: paternalism, improper use of informed consent, lack of strict ethical supervision, pressure exerted by health institutions to increase the production of scientific material, and the absence of legislation regarding ethics in terms of health care and research. Are researchers ready to respect fundamental ethical principles in light of the ample window of information provided by individual genomes, while defending the rights of the subjects participating in clinical studies as a major priority? As one of the possible solutions to this problem, education regarding fundamental ethical principles is suggested for participants in research studies as an initial method of cognitive training in ethics, together with the promotion of ethical behavior in order to encourage the adoption of reasonable policies in the field of values, attitudes and behavior.
NASA Astrophysics Data System (ADS)
Martyushev, Leonid M.
2018-03-01
The paper [1] is certainly very useful and important for understanding living systems (e.g. brain) as adaptive, self-organizing patterns. There is no need to enumerate all advantages of the paper, they are obvious. The purpose of my brief comment is to discuss one issue which, as I see it, was not thought out by the authors well enough. As a consequence, their ideas do not find as wide distribution as they otherwise could have found. This issue is related to the name selected for the principle forming the basis of their approach: free-energy principle (FEP). According to the sec. 2.1 [1]: "It asserts that all biological systems maintain their integrity by actively reducing the disorder or dispersion (i.e., entropy) of their sensory and physiological states by minimizing their variational free energy." Let us note that the authors suggested different names for the principle in their earlier works (an objective function, a function of the ensemble density encoded by the organism's configuration and the sensory data to which it is exposed, etc.), and explicitly and correctly mentioned that the free energy and entropy considered by them had nothing in common with the quantities employed in physics [2,3]. It is also obvious that a purely information-theoretic approach used by the authors with regard to the problems under study allows many other wordings and interpretations. However, in spite of this fact, in their last papers as well as in the present paper, the authors choose specifically FEP. Apparently, it may be explained by the intent to additionally base their approach on the foundation of statistical thermodynamics and therefore to demonstrate the universality of the described method. However, this is exactly what might cause misunderstandings specifically among physicists and consequently in their rejection and ignoring of FEP. The physical analogy employed by the authors has the following fundamental inconsistencies: In physics, free energy is used to describe processes occurring at constant temperatures and volumes. In physics, the minimum free energy corresponds to an equilibrium state to which an isochoric-isothermal system relaxes [4,5]. It is obvious that the biological systems considered by the authors are fundamentally non-equilibrium, do not seek equilibrium, and, in most cases, do not retain their volumes as they develop. For a biological system, the equilibrium means death, decay. Therefore, to base the idea of life on FEP is the same as to state that the pursuit of death is the purpose and meaning of life. In order to consider processes addressed by the authors, one needs functionals employed in non-equilibrium rather than equilibrium thermodynamics [6-8]. Specifically, I would like to draw their attention to the rate of change of the Gibbs energy with time, or entropy production (the maximum entropy production principle can be useful here [7,9-12]).
Dielectric Spectroscopy in Biomaterials: Agrophysics
El Khaled, Dalia; Castellano, Nuria N.; Gázquez, Jose A.; Perea-Moreno, Alberto-Jesus; Manzano-Agugliaro, Francisco
2016-01-01
Being dependent on temperature and frequency, dielectric properties are related to various types of food. Predicting multiple physical characteristics of agri-food products has been the main objective of non-destructive assessment possibilities executed in many studies on horticultural products and food materials. This review covers the basic fundamentals of dielectric properties, along with their concepts and principles. The different factors affecting the behavior of dielectric properties are dissected, and applications on different products aimed at characterizing a diversity of chemical and physical properties are pointed out and referenced with their conclusions. Throughout the review, a detailed description of the various adopted measurement techniques and the most popular equipment is presented. This compiled review serves as an updated reference for dielectric spectroscopy as applied in the agrophysics field. PMID:28773438
NASA Astrophysics Data System (ADS)
Duggan-Haas, D.
2013-12-01
The Next Generation Science Standards (NGSS) and the Frameworks upon which they are built synthesized a wide range of educational research and development that came before them. For the Earth sciences, this importantly includes a series of initiatives to define literacy within oceanography, atmospheric and climate sciences, and geology. Since the publication of the Frameworks, a similarly structured set of principles for energy literacy was also published. Each set of principles includes seven to nine Essential Principles or Big Ideas, all written at the commencement level. Each of these Principles is undergirded by several Fundamental Concepts. This set of idea sets yields 38 Essential Principles and 247 Fundamental Concepts. How do these relate to the content of NGSS? How can teachers, professional development providers and curriculum specialists make sense of this array of ideas and place it into a coherent conceptual framework? This presentation will answer these questions and more. Of course, there is substantial overlap amongst the sets of principles and with the ideas, practices and principles in NGSS. This presentation will provide and describe a framework that identifies these areas of overlap and contextualizes them so they are more manageable for educators and learners. A set of five bigger ideas and a pair of overarching questions, assembled with the Essential Principles and Earth & Space Science Disciplinary Core Ideas in the form of a 'Rainbow Chart' (figure: 'The Rainbow Chart of Earth Science Bigger Ideas'), shows a consistency of thought across Earth science's sub-disciplines and helps educators navigate this somewhat overwhelming landscape of ideas. These questions and ideas are listed below.
Overarching Questions:
- How do we know what we know?
- How does what we know inform our decision making?
Bigger Ideas:
- Earth is a system of systems.
- The flow of energy drives the cycling of matter.
- Life, including human life, influences and is influenced by the environment.
- Physical and chemical principles are unchanging and drive both gradual and rapid changes in the Earth system.
- To understand (deep) space and time, models and maps are necessary.
In the chart, each bigger idea has a unique color, and the overarching questions tie this rainbow of colors together and appear white when ideas or principles from the other idea sets reflect the nature of science that is inherent in the overarching questions. The highlighting indicates that each set of literacy principles addresses all Bigger Ideas and the overarching questions. The presentation will also address the way teachers within our professional development programming have used the framework in their instruction.
About Essence of the Wave Function on Atomic Level and in Superconductors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikulov, A. V.
The wave function was proposed for the description of quantum phenomena on the atomic level. But it is now well known that quantum phenomena are observed not only on the atomic level, and the wave function is used for the description of macroscopic quantum phenomena, such as superconductivity. The essence of the wave function on the level of elementary particles was and is the subject of heated argument among the founders of quantum mechanics and other physicists. This essence seems clearer in a superconductor. But the impossibility of a probabilistic interpretation of the wave function in this case results in an obvious contradiction between quantum principles and some fundamental principles of physics.
Perspectives on Industrial Innovation from Agilent, HP, and Bell Labs
NASA Astrophysics Data System (ADS)
Hollenhorst, James
2014-03-01
Innovation is the life blood of technology companies. I will give perspectives gleaned from a career in research and development at Bell Labs, HP Labs, and Agilent Labs, from the point of view of an individual contributor and a manager. Physicists bring a unique set of skills to the corporate environment, including a desire to understand the fundamentals, a solid foundation in physical principles, expertise in applied mathematics, and most importantly, an attitude: namely, that hard problems can be solved by breaking them into manageable pieces. In my experience, hiring managers in industry seldom explicitly search for physicists, but they want people with those skills.
Gouy Phase Radial Mode Sorter for Light: Concepts and Experiments.
Gu, Xuemei; Krenn, Mario; Erhard, Manuel; Zeilinger, Anton
2018-03-09
We present an in-principle lossless sorter for radial modes of light, using accumulated Gouy phases. The experimental setups have been found by a computer algorithm, and can be intuitively understood in a geometric way. Together with the ability to sort angular-momentum modes, we now have access to the complete two-dimensional transverse plane of light. The device can readily be used in multiplexing classical information. On a quantum level, it is an analog of the Stern-Gerlach experiment, significant for the discussion of fundamental concepts in quantum physics. As such, it can be applied in high-dimensional and multiphotonic quantum experiments.
The new lunar ephemeris INPOP17a and its application to fundamental physics
NASA Astrophysics Data System (ADS)
Viswanathan, V.; Fienga, A.; Minazzoli, O.; Bernus, L.; Laskar, J.; Gastineau, M.
2018-05-01
We present here the new INPOP lunar ephemeris, INPOP17a. This ephemeris is obtained through the numerical integration of the equations of motion and of rotation of the Moon, fitted over 48 yr of lunar laser ranging (LLR) data. We also include the 2 yr of infrared LLR data acquired at the Grasse station between 2015 and 2017. Tests of the universality of free-fall are performed. We find no violation of the principle of equivalence at the (-3.8 ± 7.1) × 10-14 level. A new interpretation in the frame of dilaton theories is also proposed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-03-01
The module provides an overview of general techniques that owners and operators of reporting facilities may use to estimate their toxic chemical releases. It explains the basic release estimation techniques used to determine the chemical quantities reported on the Form R and uses those techniques, along with fundamental chemical or physical principles and properties, to estimate releases of listed toxic chemicals. It also covers conversions among units of mass, volume, and time, states the rules governing significant figures and rounding techniques, and references general and industry-specific estimation documents.
CODATA Fundamental Physical Constants
National Institute of Standards and Technology Data Gateway
SRD 121 NIST CODATA Fundamental Physical Constants (Web, free access) This site, developed in the Physics Laboratory at NIST, addresses three topics: fundamental physical constants, the International System of Units (SI), which is the modern metric system, and expressing the uncertainty of measurement results.
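The NIST site is the authoritative source; as a practical aside, a snapshot of the CODATA adjusted constants also ships with SciPy and can be queried programmatically, for example:

```python
# Look up CODATA constants bundled with SciPy (value, unit, 1-sigma uncertainty).
from scipy.constants import physical_constants, value, precision

val, unit, unc = physical_constants["Planck constant"]
print("Planck constant =", val, unit, "+/-", unc)
print("fine-structure constant =", value("fine-structure constant"),
      "relative uncertainty", precision("fine-structure constant"))
```

(Which CODATA adjustment is bundled depends on the installed SciPy version, so the website remains the reference for current values.)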
Relationship of physical activity to fundamental movement skills among adolescents.
Okely, A D; Booth, M L; Patterson, J W
2001-11-01
To determine the relationship of participation in organized and nonorganized physical activity with fundamental movement skills among adolescents. Male and female children in Grade 8 (mean age, 13.3 yr) and Grade 10 (mean age, 15.3 yr) were assessed on six fundamental movement skills (run, vertical jump, catch, overhand throw, forehand strike, and kick). Physical activity was assessed using a self-report recall measure where students reported the type, duration, and frequency of participation in organized physical activity and nonorganized physical activity during a usual week. Multiple regression analysis indicated that fundamental movement skills significantly predicted time in organized physical activity, although the percentage of variance it could explain was small. This prediction was stronger for girls than for boys. Multiple regression analysis showed no relationship between time in nonorganized physical activity and fundamental movement skills. Fundamental movement skills are significantly associated with adolescents' participation in organized physical activity, but predict only a small portion of it.
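As an illustration of the analysis design (a hedged sketch with synthetic data and made-up effect sizes, not the study's data), the regression of organized activity time on the six skill scores might look like:

```python
# Multiple regression: weekly minutes of organized physical activity on six
# fundamental movement-skill scores (run, jump, catch, throw, strike, kick).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
skills = rng.normal(size=(n, 6))             # standardized skill scores (synthetic)
true_beta = np.array([8, 5, 6, 7, 4, 5])     # assumed small effects, minutes/week
minutes = 120 + skills @ true_beta + rng.normal(scale=60, size=n)

model = sm.OLS(minutes, sm.add_constant(skills)).fit()
print(model.summary())                       # note the significant but modest R^2
```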
What Constitutes Science and Scientific Evidence: Roles of Null Hypothesis Testing
ERIC Educational Resources Information Center
Chang, Mark
2017-01-01
We briefly discuss the philosophical basis of science, causality, and scientific evidence, by introducing the hidden but most fundamental principle of science: the similarity principle. The principle's use in scientific discovery is illustrated with Simpson's paradox and other examples. In discussing the value of null hypothesis statistical…
Integration of Social Studies Principles in the Home Economics Curriculum.
ERIC Educational Resources Information Center
Texas Tech Univ., Lubbock. Home Economics Curriculum Center.
This document is intended to help secondary home economics teachers incorporate social studies principles into their curriculum. After an introduction, the document is divided into three sections. The first section identifies and explains fundamental principles within social studies and covers the history and current state of the social studies…
Biological Physics major as a means to stimulate an undergraduate physics program
NASA Astrophysics Data System (ADS)
Jaeger, Herbert; Eid, Khalid; Yarrison-Rice, Jan
2013-03-01
In an effort to stress the cross-disciplinary nature of modern physics we added a Biological Physics major. Drawing from coursework in physics, biology, chemistry, mathematics, and related disciplines, it combines a broad curriculum with physical and mathematical rigor in preparation for careers in biophysics, medical physics, and biomedical engineering. Biological Physics offers a new path of studies to a large pool of life science students. We hope to grow our physics majors from 70-80 to more than 100 students and boost our graduation rate from the mid-teens to the mid-twenties. The new major brought about a revision of our sophomore curriculum to make room for modern topics without sidelining fundamentals. As a result, we split our 1-semester long Contemporary Physics course (4 cr hrs) into a year-long sequence Contemporary Physics Foundations and Contemporary Physics Frontiers (both 3 cr hrs). Foundations starts with relativity, then focuses on 4 quantum mechanics topics: wells, spin 1/2, oscillators, and hydrogen. Throughout the course applications are woven in whenever the opportunity arises, e.g. magnetism and NMR with spin 1/2. The following semester Frontiers explores scientific principles and technological advances that make quantum science and resulting technologies different from the large scale. Frontiers covers enabling techniques from atomic, molecular, condensed matter, and particle physics, as well as advances in nanotechnology, quantum optics, and biophysics.
Recent results and perspectives on cosmology and fundamental physics from microwave surveys
NASA Astrophysics Data System (ADS)
Burigana, Carlo; Battistelli, Elia Stefano; Benetti, Micol; Cabass, Giovanni; de Bernardis, Paolo; di Serego Alighieri, Sperello; di Valentino, Eleonora; Gerbino, Martina; Giusarma, Elena; Gruppuso, Alessandro; Liguori, Michele; Masi, Silvia; Norgaard-Nielsen, Hans Ulrik; Rosati, Piero; Salvati, Laura; Trombetti, Tiziana; Vielva, Patricio
2016-04-01
Recent cosmic microwave background (CMB) data in temperature and polarization have reached high precision in estimating all the parameters that describe the current so-called standard cosmological model. Recent results about the integrated Sachs-Wolfe (ISW) effect from CMB anisotropies, galaxy surveys, and their cross-correlations are presented. Looking at fine signatures in the CMB, such as the lack of power at low multipoles, the primordial power spectrum (PPS) and the bounds on non-Gaussianities, complemented by galaxy surveys, we discuss inflationary physics and the generation of primordial perturbations in the early universe. Three important topics in particle physics, the bounds on neutrino masses and parameters, on the thermal axion mass and on the neutron lifetime derived from cosmological data, are reviewed, with attention to the comparison with laboratory experiment results. Recent results from cosmic polarization rotation (CPR) analyses aimed at testing the Einstein equivalence principle (EEP) are presented. Finally, we discuss the perspectives of next radio facilities for the improvement of the analysis of future CMB spectral distortion experiments.
Space Weather, Geomagnetic Disturbances and Impact on the High-Voltage Transmission Systems
NASA Technical Reports Server (NTRS)
Pullkkinen, A.
2011-01-01
Geomagnetically induced currents (GIC) affecting the performance of high-voltage power transmission systems are one of the most significant hazards space weather poses to the operability of critical US infrastructure. The severity of the threat was emphasized, for example, in two recent reports: the National Research Council (NRC) report "Severe Space Weather Events--Understanding Societal and Economic Impacts: A Workshop Report" and the North American Electric Reliability Corporation (NERC) report "High-Impact, Low-Frequency Event Risk to the North American Bulk Power System." The NRC and NERC reports demonstrated the important national security dimension of space weather and GIC and called for comprehensive actions to forecast and mitigate the hazard. In this paper we will give a brief overview of space weather storms and the accompanying geomagnetic storm events that lead to GIC. We will also review the fundamental principles of how GIC can impact power transmission systems. Space weather has been the subject of great scientific advances that have transformed the wonder of the past into a quantitative field of physics with true predictive power today. NASA's Solar Shield system, aimed at forecasting GIC in the North American high-voltage power transmission system, can be considered one of the ultimate fruits of those advances. We will review the fundamental principles of the Solar Shield system and provide our view of the way forward in the science of GIC.
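As a hedged back-of-envelope sketch of the physics (not the Solar Shield methodology), the surface geoelectric field driven by a plane-wave geomagnetic variation over a uniformly conducting Earth, and the resulting quasi-DC line current, can be estimated as follows; the conductivity, disturbance amplitude, line length, and resistance are illustrative assumptions.

```python
import math

mu0 = 4e-7 * math.pi       # vacuum permeability [H/m]
sigma = 1e-3               # assumed ground conductivity [S/m]
B = 500e-9                 # assumed geomagnetic disturbance amplitude [T] (500 nT)
period = 600.0             # assumed variation period [s]
omega = 2 * math.pi / period

# Plane-wave estimate of the surface geoelectric field: |E| = B * sqrt(omega / (mu0 * sigma)).
E = B * math.sqrt(omega / (mu0 * sigma))    # [V/m]

line_length = 200e3        # assumed transmission line length [m]
resistance = 3.0           # assumed total circuit resistance [ohm]
gic = E * line_length / resistance          # crude quasi-DC current estimate [A]

print(f"E ~ {E * 1e3:.2f} V/km, GIC ~ {gic:.0f} A")
```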
The Visual Geophysical Exploration Environment: A Multi-dimensional Scientific Visualization
NASA Astrophysics Data System (ADS)
Pandya, R. E.; Domenico, B.; Murray, D.; Marlino, M. R.
2003-12-01
The Visual Geophysical Exploration Environment (VGEE) is an online learning environment designed to help undergraduate students understand fundamental Earth system science concepts. The guiding principle of the VGEE is the importance of hands-on interaction with scientific visualization and data. The VGEE consists of four elements: 1) an online, inquiry-based curriculum for guiding student exploration; 2) a suite of El Nino-related data sets adapted for student use; 3) a learner-centered interface to a scientific visualization tool; and 4) a set of concept models (interactive tools that help students understand fundamental scientific concepts). There are two key innovations featured in this interactive poster session. One is the integration of concept models and the visualization tool. Concept models are simple, interactive, Java-based illustrations of fundamental physical principles. We developed eight concept models and integrated them into the visualization tool to enable students to probe data. The ability to probe data using a concept model addresses the common problem of transfer: the difficulty students have in applying theoretical knowledge to everyday phenomena. The other innovation is a visualization environment and data that are discoverable in digital libraries, and installed, configured, and used for investigations over the web. By collaborating with the Integrated Data Viewer developers, we were able to embed a web-launchable visualization tool and access to distributed data sets into the online curricula. The Thematic Real-time Environmental Data Distributed Services (THREDDS) project is working to provide catalogs of datasets that can be used in new VGEE curricula under development. By cataloging these curricula in the Digital Library for Earth System Education (DLESE), learners and educators can discover the data and visualization tool within a framework that guides their use.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-24
... (Faith-Based Activities) to reflect the amendments made by Executive Order 13559 (Fundamental Principles... several important values and principles of community development. First, the HOME program's flexibility is...
Gabriel Weinreich: The life and style
NASA Astrophysics Data System (ADS)
Hartmann, William M.
2003-10-01
Gabriel Weinreich (Gabi) was born in Vilna, Poland (now the capital of Lithuania) one year prior to the founding of the Acoustical Society of America. When the Second World War began in central Europe, Gabi's family came, in serial fashion, to New York City, Gabi himself arriving in 1941. Gabi studied physics at Columbia, and received a Ph.D. in 1953 for a thesis on atomic physics directed by the legendary I. I. Rabi. He subsequently worked on fundamental properties of semiconductors, first at Bell Labs, then, starting in 1960, at the University of Michigan. In 1977 he turned his attention to the acoustics of musical instruments, mainly the piano and bowed strings. He studied all phases of the physical elements: string excitation, string vibration, coupling, and radiation. Gabi brought his special style to acoustics: a combination of theory and experiment that imaginatively imports ideas and techniques from one area of physics into another, a willingness to attack traditional problems afresh by returning to first principles, and the ability to present ideas with incisive wit and charm so that the presentation is not only informative but also entertaining.
NASA Astrophysics Data System (ADS)
Canning, Andrew
2013-03-01
Inorganic scintillation phosphors (scintillators) are extensively employed as radiation detector materials in many fields of applied and fundamental research such as medical imaging, high energy physics, astrophysics, oil exploration and nuclear materials detection for homeland security and other applications. The ideal scintillator for gamma ray detection must have exceptional performance in terms of stopping power, luminosity, proportionality, speed, and cost. Recently, trivalent lanthanide dopants such as Ce and Eu have received greater attention for fast and bright scintillators as the optical 5d to 4f transition is relatively fast. However, crystal growth and production costs remain challenging for these new materials, so there is still a need for new higher performing scintillators that meet the needs of the different application areas. First principles calculations can provide useful insight into the chemical and electronic properties of such materials and hence can aid in the search for better new scintillators. In the past there has been little first-principles work done on scintillator materials, in part because it means modeling f electrons in lanthanides as well as complex excited state and scattering processes. In this talk I will give an overview of the scintillation process and show how first-principles calculations can be applied to such systems to gain a better understanding of the physics involved. I will also present work on a high-throughput first principles approach to select new scintillator materials for fabrication, as well as more detailed calculations to study trapping processes that can limit their brightness. This work, in collaboration with experimental groups, has led to the discovery of some new bright scintillators. Work supported by the U.S. Department of Homeland Security and carried out under U.S. Department of Energy Contract no. DE-AC02-05CH11231 at Lawrence Berkeley National Laboratory.
A systems approach to theoretical fluid mechanics: Fundamentals
NASA Technical Reports Server (NTRS)
Anyiwo, J. C.
1978-01-01
A preliminary application of the underlying principles of the investigator's general system theory to the description and analyses of the fluid flow system is presented. An attempt is made to establish practical models, or elements, of the general fluid flow system from the point of view of the fundamental principles of the general system theory. The results obtained are applied to a simple experimental fluid flow system as a test case, with particular emphasis on the understanding of fluid flow instability, transition and turbulence.
Kachina, N N
2013-01-01
Discussed in this paper are fundamental legal principles and organizational aspects of the participation of forensic medical experts in the examination of corpses at the place of occurrence. A detailed analysis of the current departmental and sectoral regulations governing the activities of specialists in the field of forensic medicine was performed. The analysis demonstrated their positive and negative aspects. These findings were used to develop concrete recommendations for further improvement of these documents.
Proton-pumping mechanism of cytochrome c oxidase: A kinetic master-equation approach
Kim, Young C.; Hummer, Gerhard
2011-01-01
Cytochrome c oxidase (CcO) is an efficient energy transducer that reduces oxygen to water and converts the released chemical energy into an electrochemical membrane potential. As a true proton pump, CcO translocates protons across the membrane against this potential. Based on a wealth of experiments and calculations, an increasingly detailed picture of the reaction intermediates in the redox cycle has emerged. However, the fundamental mechanism of proton pumping coupled to redox chemistry remains largely unresolved. Here we examine and extend a kinetic master-equation approach to gain insight into redox-coupled proton pumping in CcO. Basic principles of the CcO proton pump emerge from an analysis of the simplest kinetic models that retain essential elements of the experimentally determined structure, energetics, and kinetics, and that satisfy fundamental physical principles. The master-equation models allow us to address the question of how pumping can be achieved in a system in which all reaction steps are reversible. Whereas proton pumping does not require the direct modulation of microscopic reaction barriers, such kinetic gating greatly increases the pumping efficiency. Further efficiency gains can be achieved by partially decoupling the proton uptake pathway from the active-site region. Such a mechanism is consistent with the proposed Glu valve, in which the side chain of a key glutamic acid shuttles between the D channel and the active-site region. We also show that the models predict only small proton leaks even in the absence of turnover. The design principles identified here for CcO provide a blueprint for novel biology-inspired fuel cells, and the master-equation formulation should prove useful also for other molecular machines. PMID:21946020
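A minimal sketch of the generic master-equation machinery (not the authors' CcO model) is shown below: given a matrix of transition rates between discrete states, the steady-state populations and a net inter-state flux follow from linear algebra; the rate values are arbitrary placeholders.

```python
import numpy as np

# Placeholder rates k[i, j]: transition rate from state j to state i (diagonal unused).
k = np.array([
    [0.0, 2.0, 0.5],
    [1.0, 0.0, 3.0],
    [0.5, 1.0, 0.0],
])

# Master-equation generator M for dP/dt = M @ P.
M = k.copy()
M[np.diag_indices_from(M)] = -k.sum(axis=0)   # probability conservation

# Steady state: eigenvector of M with eigenvalue closest to zero, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(M)
p_ss = np.real(eigvecs[:, np.argmin(np.abs(eigvals))])
p_ss /= p_ss.sum()

# Net steady-state flux between states 0 and 1 (nonzero only when detailed balance is broken).
flux_01 = k[1, 0] * p_ss[0] - k[0, 1] * p_ss[1]
print("steady-state populations:", p_ss, "net flux 0->1:", flux_01)
```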
The Fundamentals of an African American Value System.
ERIC Educational Resources Information Center
Alexander, E. Curtis
The Nguzo Saba or "Seven Principles of Blackness" provide the fundamental basis for the development of an African America value system that is based on the cultural and historical particularisms of being Black in an American society that devalues Black efficacy and Black people. The fundamentals of this value system, foundational to the Kwanzaa…
The Subordination of Aesthetic Fundamentals in College Art Instruction
ERIC Educational Resources Information Center
Lavender, Randall
2003-01-01
Opportunities for college students of art and design to study fundamentals of visual aesthetics, integrity of form, and principles of composition are limited today by a number of factors. With the well-documented prominence of postmodern critical theory in the world of contemporary art, the study of aesthetic fundamentals is largely subordinated…
Physical principles for scalable neural recording
Zamft, Bradley M.; Maguire, Yael G.; Shapiro, Mikhail G.; Cybulski, Thaddeus R.; Glaser, Joshua I.; Amodei, Dario; Stranges, P. Benjamin; Kalhor, Reza; Dalrymple, David A.; Seo, Dongjin; Alon, Elad; Maharbiz, Michel M.; Carmena, Jose M.; Rabaey, Jan M.; Boyden, Edward S.; Church, George M.; Kording, Konrad P.
2013-01-01
Simultaneously measuring the activities of all neurons in a mammalian brain at millisecond resolution is a challenge beyond the limits of existing techniques in neuroscience. Entirely new approaches may be required, motivating an analysis of the fundamental physical constraints on the problem. We outline the physical principles governing brain activity mapping using optical, electrical, magnetic resonance, and molecular modalities of neural recording. Focusing on the mouse brain, we analyze the scalability of each method, concentrating on the limitations imposed by spatiotemporal resolution, energy dissipation, and volume displacement. Based on this analysis, all existing approaches require orders of magnitude improvement in key parameters. Electrical recording is limited by the low multiplexing capacity of electrodes and their lack of intrinsic spatial resolution, optical methods are constrained by the scattering of visible light in brain tissue, magnetic resonance is hindered by the diffusion and relaxation timescales of water protons, and the implementation of molecular recording is complicated by the stochastic kinetics of enzymes. Understanding the physical limits of brain activity mapping may provide insight into opportunities for novel solutions. For example, unconventional methods for delivering electrodes may enable unprecedented numbers of recording sites, embedded optical devices could allow optical detectors to be placed within a few scattering lengths of the measured neurons, and new classes of molecularly engineered sensors might obviate cumbersome hardware architectures. We also study the physics of powering and communicating with microscale devices embedded in brain tissue and find that, while radio-frequency electromagnetic data transmission suffers from a severe power–bandwidth tradeoff, communication via infrared light or ultrasound may allow high data rates due to the possibility of spatial multiplexing. The use of embedded local recording and wireless data transmission would only be viable, however, given major improvements to the power efficiency of microelectronic devices. PMID:24187539
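In the spirit of this scaling analysis, a hedged order-of-magnitude sketch is given below; the neuron count, sampling rate, and bit depth are illustrative assumptions rather than the paper's values.

```python
import math

n_neurons = 7.5e7        # assumed neuron count for a mouse brain (~75 million)
sample_rate = 2.0e4      # assumed per-neuron sampling rate [Hz] for ~ms resolution
bits_per_sample = 10     # assumed ADC resolution [bits]

data_rate = n_neurons * sample_rate * bits_per_sample   # raw data rate [bit/s]
print(f"raw data rate ~ {data_rate / 1e12:.1f} Tbit/s")

# Thermodynamic floor for handling that stream: kB * T * ln(2) per bit at body temperature.
kB, T = 1.380649e-23, 310.0
min_power = data_rate * kB * T * math.log(2)            # [W]
print(f"kT*ln(2) floor for the data stream ~ {min_power * 1e9:.1f} nW")
```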
Fundamental movement skills and motivational factors influencing engagement in physical activity.
Kalaja, Sami; Jaakkola, Timo; Liukkonen, Jarmo; Watt, Anthony
2010-08-01
To assess whether subgroups based on children's fundamental movement skills, perceived competence, and self-determined motivation toward physical education vary with current self-reported physical activity, a sample of 316 Finnish Grade 7 students completed fundamental movement skills measures and self-report questionnaires assessing perceived competence, self-determined motivation toward physical education, and current physical activity. Cluster analysis indicated a three-cluster structure: "Low motivation/low skills profile," "High skills/low motivation profile," and "High skills/high motivation profile." Analysis of variance indicated that students in the third cluster engaged in significantly more physical activity than students of clusters one and two. These results provide support for previous claims regarding the importance of the relationship of fundamental movement skills with continuing engagement in physical activity. High fundamental movement skills, however, may represent only one element in maintaining adolescents' engagement in physical activity.
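A minimal sketch of this kind of cluster analysis is shown below; the variable set and the synthetic data are hypothetical, and scikit-learn's k-means is used only as one plausible clustering choice.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 316  # sample size reported in the abstract

# Hypothetical scores: movement skills, perceived competence, self-determined motivation.
X = rng.normal(size=(n, 3))
activity = rng.normal(loc=3.0, scale=1.0, size=n)   # hypothetical self-reported activity

X_std = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_std)

for c in range(3):   # compare mean activity across the three clusters
    print(f"cluster {c}: mean activity = {activity[labels == c].mean():.2f}")
```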
Robinson, Leah E
2011-07-01
The purpose of this investigation was twofold. First, it aimed to discover the relationship between perceived physical competence and fundamental motor skills in preschoolers. Second, it examined the effect of sex on perceived physical competence and fundamental motor skills within the sample. A total of 119 children (mean age 4.00, SD 0.55 years) participated in this study. The Test of Gross Motor Development--2nd Edition was used to assess fundamental motor skills and the Pictorial Scale of Perceived Competence and Social Acceptance was used to assess perceived physical competence. The results show a moderate and significant correlation between perceived physical competence and fundamental motor skills. Sex differences were also found, with boys demonstrating more proficient motor skills and reporting higher perceived physical competence compared with girls. The findings provide relevant information to the child development literature and suggest that a positive relationship exists between preschoolers' self-perceptions of their physical ability and fundamental motor skills. © 2010 Blackwell Publishing Ltd.
A New Principle in Physics: the Principle "Finiteness", and Some Consequences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abraham Sternlieb
2010-06-25
In this paper I propose a new principle in physics: the principle of "finiteness". It stems from the definition of physics as a science that deals (among other things) with measurable dimensional physical quantities. Since measurement results, including their errors, are always finite, the principle of finiteness postulates that the mathematical formulation of "legitimate" laws of physics should prevent exactly zero or infinite solutions. Some consequences of the principle of finiteness are discussed, in general, and then more specifically in the fields of special relativity, quantum mechanics, and quantum gravity. The consequences are derived independently of any other theory or principle in physics. I propose "finiteness" as a postulate (like the constancy of the speed of light in vacuum, "c"), as opposed to a notion whose validity has to be corroborated by, or derived theoretically or experimentally from other facts, theories, or principles.
Living donation and cosmetic surgery: a double standard in medical ethics?
Testa, Giuliano; Carlisle, Erica; Simmerling, Mary; Angelos, Peter
2012-01-01
The commitment of transplant physicians to protect the physical and psychological health of potential donors is fundamental to the process of living donor organ transplantation. It is appropriate that strict regulations to govern an individual's decision to donate have been developed. Some may argue that adherence to such regulations creates a doctor-patient relationship that is rooted in paternalism, which is in drastic contrast with a doctor-patient relationship that is rooted in patients' autonomy, characteristic of most other operative interventions. In this article we analyze the similarities between cosmetic plastic surgery and living donor surgery as examples of surgeries governed by different ethical principles. It is interesting that, while the prevailing ethical approach in living donor surgery is based on paternalism, the ethical principle guiding cosmetic surgery is respect for patients' autonomy. The purpose of this article is not to criticize either practice, but to suggest that, given the similarities between the two procedures, both operative interventions should be guided by the same ethical principle: a respect for patients' autonomy. We further suggest that if living organ donation valued donors' autonomy as much as cosmetic plastic surgery does, we might witness a wider acceptance of and increase in living organ donation.
Peterson, J P S; Sarthour, R S; Souza, A M; Oliveira, I S; Goold, J; Modi, K; Soares-Pinto, D O; Céleri, L C
2016-04-01
Landauer's principle sets fundamental thermodynamic constraints for classical and quantum information processing, thus affecting not only various branches of physics, but also computer science and engineering. Despite its importance, this principle was only recently experimentally considered for classical systems. Here we employ a nuclear magnetic resonance set-up to experimentally address information-to-energy conversion in a quantum system. Specifically, we consider a molecule of three nuclear spins [Formula: see text] (qubits), comprising the system, the reservoir and the ancilla, to measure the heat dissipated during the implementation of a global system-reservoir unitary interaction that changes the information content of the system. By employing an interferometric technique, we were able to reconstruct the heat distribution associated with the unitary interaction. Then, through quantum state tomography, we measured the relative change in the entropy of the system. In this way, we were able to verify that an operation that changes the information content of the system must necessarily generate heat in the reservoir, exactly as predicted by Landauer's principle. The scheme presented here allows for the detailed study of irreversible entropy production in quantum information processors.
Aspects of skeletal muscle modelling.
Epstein, Marcelo; Herzog, Walter
2003-09-29
The modelling of skeletal muscle raises a number of philosophical questions, particularly in the realm of the relationship between different possible levels of representation and explanation. After a brief incursion into this area, a list of desiderata is proposed as a guiding principle for the construction of a viable model, including: comprehensiveness, soundness, experimental consistency, predictive ability and refinability. Each of these principles is illustrated by means of simple examples. The presence of internal constraints, such as incompressibility, may lead to counterintuitive results. A one-panel example is exploited to advocate the use of the principle of virtual work as the ideal tool to deal with these situations. The question of stability in the descending limb of the force-length relation is addressed and a purely mechanical analogue is suggested. New experimental results confirm the assumption that fibre stiffness is positive even in the descending limb. The indeterminacy of the force-sharing problem is traditionally resolved by optimizing a, presumably, physically meaningful target function. After presenting some new results in this area, based on a separation theorem, it is suggested that a more fundamental approach to the problem is the abandoning of optimization criteria in favour of an explicit implementation of activation criteria.
Modulation Doping of Silicon using Aluminium-induced Acceptor States in Silicon Dioxide
König, Dirk; Hiller, Daniel; Gutsch, Sebastian; Zacharias, Margit; Smith, Sean
2017-01-01
All electronic, optoelectronic or photovoltaic applications of silicon depend on controlling majority charge carriers via doping with impurity atoms. Nanoscale silicon is omnipresent in fundamental research (quantum dots, nanowires) but also approached in future technology nodes of the microelectronics industry. In general, silicon nanovolumes, irrespective of their intended purpose, suffer from effects that impede conventional doping due to fundamental physical principles such as out-diffusion, statistics of small numbers, quantum- or dielectric confinement. In analogy to the concept of modulation doping, originally invented for III-V semiconductors, we demonstrate a heterostructure modulation doping method for silicon. Our approach utilizes a specific acceptor state of aluminium atoms in silicon dioxide to generate holes as majority carriers in adjacent silicon. By relocating the dopants from silicon to silicon dioxide, Si nanoscale doping problems are circumvented. In addition, the concept of aluminium-induced acceptor states for passivating hole selective tunnelling contacts as required for high-efficiency photovoltaics is presented and corroborated by first carrier lifetime and tunnelling current measurements. PMID:28425460
ERIC Educational Resources Information Center
Human Engineering Inst., Cleveland, OH.
This module of a 25-module course is designed to develop an understanding of the operating principles of alternating current generators used on diesel-powered equipment. Topics are a review of electrical fundamentals and the operating principles of alternators. The module consists of a self-instructional programed training film "AC Generators…
Teaching about Due Process of Law. ERIC Digest.
ERIC Educational Resources Information Center
Vontz, Thomas S.
Fundamental constitutional and legal principles are central to effective instruction in the K-12 social studies curriculum. To become competent citizens, students need to develop an understanding of the principles on which their society and government are based. Few principles are as important in the social studies curriculum as due process of…
ERIC Educational Resources Information Center
Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.
2002-01-01
Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…
[Fundamental principles of social work--(also) a contribution to public health ethics].
Lob-Hüdepohl, A
2009-05-01
Social work and public health are different but mutually connected. Both are professions with their own ethical foundations. Despite all differences, they have the same goal: to protect and to enhance the well-being of people. This is, in part, why the fundamental ethical principles of social work are salient for developing public health ethics. As a human rights profession, social work respects the personal autonomy of clients, supports solidarity-based relationships in families, groups or communities, and attempts to uphold social justice in society. Social workers need to adopt special professional attitudes: sensibility for the vulnerabilities of clients, care and attentiveness for their resources and strengths, assistance instead of paternalistic care and advocacy in decision making for clients' well-being when clients are not able to decide for themselves. These fundamental ethical principles are the basis for discussion of special topics of social work ethics as public health ethics, for example, in justifying intervention in individual lifestyles by public services without the participation or consent of the affected persons.
Experimental test of Landauer’s principle in single-bit operations on nanomagnetic memory bits
Hong, Jeongmin; Lambson, Brian; Dhuey, Scott; Bokor, Jeffrey
2016-01-01
Minimizing energy dissipation has emerged as the key challenge in continuing to scale the performance of digital computers. The question of whether there exists a fundamental lower limit to the energy required for digital operations is therefore of great interest. A well-known theoretical result put forward by Landauer states that any irreversible single-bit operation on a physical memory element in contact with a heat bath at a temperature T requires at least kBT ln(2) of heat be dissipated from the memory into the environment, where kB is the Boltzmann constant. We report an experimental investigation of the intrinsic energy loss of an adiabatic single-bit reset operation using nanoscale magnetic memory bits, by far the most ubiquitous digital storage technology in use today. Through sensitive, high-precision magnetometry measurements, we observed that the amount of dissipated energy in this process is consistent (within 2 SDs of experimental uncertainty) with the Landauer limit. This result reinforces the connection between “information thermodynamics” and physical systems and also provides a foundation for the development of practical information processing technologies that approach the fundamental limit of energy dissipation. The significance of the result includes insightful direction for future development of information technology. PMID:26998519
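For reference, a minimal computation of the Landauer bound kB*T*ln(2) against which the measured dissipation is compared (room temperature assumed here):

```python
import math

kB = 1.380649e-23   # Boltzmann constant [J/K]
T = 300.0           # assumed temperature [K]

E_min = kB * T * math.log(2)   # minimum heat per irreversible bit erasure [J]
print(f"kB*T*ln(2) at 300 K = {E_min:.3e} J = {E_min / 1.602176634e-19 * 1e3:.1f} meV")
```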
Information physics fundamentals of nanophotonics.
Naruse, Makoto; Tate, Naoya; Aono, Masashi; Ohtsu, Motoichi
2013-05-01
Nanophotonics has been extensively studied with the aim of unveiling and exploiting light-matter interactions that occur at a scale below the diffraction limit of light, and recent progress made in experimental technologies--both in nanomaterial fabrication and characterization--is driving further advancements in the field. From the viewpoint of information, on the other hand, novel architectures, design and analysis principles, and even novel computing paradigms should be considered so that we can fully benefit from the potential of nanophotonics. This paper examines the information physics aspects of nanophotonics. More specifically, we present some fundamental and emergent information properties that stem from optical excitation transfer mediated by optical near-field interactions and the hierarchical properties inherent in optical near-fields. We theoretically and experimentally investigate aspects such as unidirectional signal transfer, energy efficiency and networking effects, among others, and we present their basic theoretical formalisms and describe demonstrations of practical applications. A stochastic analysis of light-assisted material formation is also presented, where an information-based approach provides a deeper understanding of the phenomena involved, such as self-organization. Furthermore, the spatio-temporal dynamics of optical excitation transfer and its inherent stochastic attributes are utilized for solution searching, paving the way to a novel computing paradigm that exploits coherent and dissipative processes in nanophotonics.
Physical Chemistry of Nanomedicine: Understanding the Complex Behaviors of Nanoparticles in Vivo
NASA Astrophysics Data System (ADS)
Lane, Lucas A.; Qian, Ximei; Smith, Andrew M.; Nie, Shuming
2015-04-01
Nanomedicine is an interdisciplinary field of research at the interface of science, engineering, and medicine, with broad clinical applications ranging from molecular imaging to medical diagnostics, targeted therapy, and image-guided surgery. Despite major advances during the past 20 years, there are still major fundamental and technical barriers that need to be understood and overcome. In particular, the complex behaviors of nanoparticles under physiological conditions are poorly understood, and detailed kinetic and thermodynamic principles are still not available to guide the rational design and development of nanoparticle agents. Here we discuss the interactions of nanoparticles with proteins, cells, tissues, and organs from a quantitative physical chemistry point of view. We also discuss insights and strategies on how to minimize nonspecific protein binding, how to design multistage and activatable nanostructures for improved drug delivery, and how to use the enhanced permeability and retention effect to deliver imaging agents for image-guided cancer surgery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosikhin, Ahmad, E-mail: a.rosikhin86@yahoo.co.id; Hidayat, Aulia Fikri; Marimpul, Rinaldo
The preparation of highly crystalline metal thin films, whether as catalyst substrates or as electrodes in electronic devices, is always a consideration in functional-materials research and development. As a catalyst substrate, the metal acts as a guide for material growth in order to obtain a proper surface structure, although at the end it is removed via an etching process. Meanwhile, as an electrode, it drags charges to be collected inside. This brief discussion elaborates the general fundamental principles of the physical vapor deposition (PVD) system for metal thin-film preparation at the micro- to nanometer scale. The influence of thermodynamic parameters and metal characteristics such as melting point and particle size is elucidated. The physical description of the deposition process in the chamber can be simplified by a schematic of the evaporation phenomena, which is supported by experimental measurements such as SEM and XRD.
NASA Astrophysics Data System (ADS)
Gu, Je-An
2014-01-01
Darkessence, the dark source of anti-gravity and that of attractive gravity, serves as the largest testing ground of the interplay between quantum matter and classical gravity. We expect it to shed light on the conflict between quantum physics and gravity, the most important puzzle in fundamental physics in the 21st century. In this paper we attempt to reveal the guidelines hinted by darkessence for clarifying or even resolving the conflict. To this aim, we question (1) the compatibility of the renormalization-group (RG) running with the energy conservation, (2) the effectiveness of an effective action in quantum field theory for describing the gravitation of quantum matter, and (3) the way quantum vacuum energy gravitates. These doubts illustrate the conflict and suggest several guidelines on the resolution: the preservation of the energy conservation and the equivalence principle (or its variant) under RG running, and a natural relief of the vacuum energy catastrophe.
Generalized Knudsen Number for Unsteady Fluid Flow.
Kara, V; Yakhot, V; Ekinci, K L
2017-02-17
We explore the scaling behavior of an unsteady flow that is generated by an oscillating body of finite size in a gas. If the gas is gradually rarefied, the Navier-Stokes equations begin to fail and a kinetic description of the flow becomes more appropriate. The failure of the Navier-Stokes equations can be thought to take place via two different physical mechanisms: either the continuum hypothesis breaks down as a result of a finite size effect or local equilibrium is violated due to the high rate of strain. By independently tuning the relevant linear dimension and the frequency of the oscillating body, we can experimentally observe these two different physical mechanisms. All the experimental data, however, can be collapsed using a single dimensionless scaling parameter that combines the relevant linear dimension and the frequency of the body. This proposed Knudsen number for an unsteady flow is rooted in a fundamental symmetry principle, namely, Galilean invariance.
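The abstract does not spell out the combined parameter; as a hedged sketch, assuming it blends the size-based Knudsen number lambda/L with an unsteady contribution set by the oscillation frequency and the molecular relaxation time, one might estimate:

```python
import math

# Illustrative parameters (assumptions, not the paper's values).
mean_free_path = 70e-9   # lambda for air near atmospheric pressure [m]
L = 5e-6                 # characteristic size of the oscillating body [m]
f = 20e6                 # oscillation frequency [Hz]
tau = 1e-9               # assumed molecular relaxation time [s]

kn_size = mean_free_path / L        # continuum breakdown via finite size
kn_freq = 2 * math.pi * f * tau     # local-equilibrium breakdown via high rate of strain

# Assumed combination: either mechanism alone can push the flow out of the Navier-Stokes regime.
kn_combined = kn_size + kn_freq
print(f"Kn_size = {kn_size:.3f}, Kn_freq = {kn_freq:.3f}, combined ~ {kn_combined:.3f}")
```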
On thermonuclear ignition criterion at the National Ignition Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Baolian; Kwan, Thomas J. T.; Wang, Yi-Ming
2014-10-15
Sustained thermonuclear fusion at the National Ignition Facility remains elusive. Although recent experiments approached or exceeded the anticipated ignition thresholds, the nuclear performance of the laser-driven capsules was well below predictions in terms of energy and neutron production. Such discrepancies between expectations and reality motivate a reassessment of the physics of ignition. We have developed a predictive analytical model from fundamental physics principles. Based on the model, we obtained a general thermonuclear ignition criterion in terms of the areal density and temperature of the hot fuel. This newly derived ignition threshold and its alternative forms explicitly show the minimum requirements of the hot fuel pressure, mass, areal density, and burn fraction for achieving ignition. Comparison of our criterion with existing theories, simulations, and the experimental data shows that our ignition threshold is more stringent than those in the existing literature and that our results are consistent with the experiments.
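As a purely illustrative check (the paper's own, more stringent criterion is not reproduced here), a generic hot-spot test against round-number thresholds on areal density and temperature commonly quoted in the inertial-confinement-fusion literature might look like:

```python
def hot_spot_ignites(rho_r_g_cm2, temperature_kev,
                     rho_r_min=0.3, t_min_kev=4.5):
    """Generic illustrative check; the thresholds are round numbers from the
    ICF literature, not the criterion derived in the paper."""
    return rho_r_g_cm2 >= rho_r_min and temperature_kev >= t_min_kev

print(hot_spot_ignites(0.25, 3.5))   # below both illustrative thresholds -> False
print(hot_spot_ignites(0.35, 5.0))   # above both -> True
```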
Nuclear analytical techniques in medicine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cesareo, R.
1988-01-01
This book acquaints one with the fundamental principles and the instrumentation relevant to analytical techniques based on atomic and nuclear physics, as well as present and future biomedical applications. Besides providing a theoretical description of the physical phenomena, a large part of the book is devoted to applications in the medical and biological field, particularly in hematology, forensic medicine and environmental science. This volume reviews methods such as the possibility of carrying out rapid multi-element analysis of trace elements on biomedical samples, in vitro and in vivo, by XRF-analysis; the ability of the PIXE-microprobe to analyze in detail and to map trace elements in fragments of biomedical samples or inside the cells; the potentiality of in vivo nuclear activation analysis for diagnostic purposes. Finally, techniques are described such as radiation scattering (elastic and inelastic scattering) and attenuation measurements which will undoubtedly see great development in the immediate future.
NASA Astrophysics Data System (ADS)
Georgescu, M.; Chow, W. T. L.; Wang, Z. H.; Brazel, A.; Trapido-Lurie, B.; Roth, M.; Benson-Lira, V.
2015-06-01
Because of a projected surge of several billion urban inhabitants by mid-century, there is a rising urgency to advance local and strategically deployed measures intended to ameliorate negative consequences for urban climate (e.g., heat stress, poor air quality, energy/water availability). Here we highlight the importance of incorporating scale-dependent, built-environment-induced solutions within the broader umbrella of urban sustainability outcomes, thereby accounting for fundamental physical principles. Contemporary and future design of settlements demands cooperative participation between planners, architects, and relevant stakeholders and the urban and global climate community, which recognizes the complexity of the physical systems involved and is ideally suited to quantitatively examine the viability of proposed solutions. Such participatory efforts can aid the development of locally sensible approaches by integrating across the socioeconomic and climatic continuum, thereby providing opportunities for comprehensive solutions that maximize benefits and limit unintended consequences.
NASA Astrophysics Data System (ADS)
Goradia, Shantilal
2015-10-01
We modify Newtonian gravity to probabilistic quantum mechanical gravity to derive strong coupling. If this approach is valid, we should be able to extend it to the physical body (life) as follows. Using the Boltzmann equation, we get the entropy of the universe (137) as if its reciprocal, the fine structure constant (ALPHA), is the hidden candidate representing the negative entropy of the universe, which is indicative of the binary information as its basis (http://www.arXiv.org/pdf/physics0210040v5). Since ALPHA relates to cosmology, it must relate to molecular biology too, with the binary system as the fundamental source of information for the nucleotides of the DNA, as implicit in the book by the author: ``Quantum Consciousness - The Road to Reality.'' We debate claims of the anthropic principle based on the negligible variation of ALPHA and throw light on thermodynamics. We question the constancy of G in multiple ways.
Reinventing the Accelerator for the High Energy Frontier
Rosenzweig, James [UCLA, Los Angeles, California, United States
2017-12-09
The history of discovery in high-energy physics has been intimately connected with progress in methods of accelerating particles for the past 75 years. This remains true today, as the post-LHC era in particle physics will require significant innovation and investment in a superconducting linear collider. The choice of the linear collider as the next-generation discovery machine, and the selection of superconducting technology has rather suddenly thrown promising competing techniques -- such as very large hadron colliders, muon colliders, and high-field, high frequency linear colliders -- into the background. We discuss the state of such conventional options, and the likelihood of their eventual success. We then follow with a much longer view: a survey of a new, burgeoning frontier in high energy accelerators, where intense lasers, charged particle beams, and plasmas are all combined in a cross-disciplinary effort to reinvent the accelerator from its fundamental principles on up.
Benchmarking sheath subgrid boundary conditions for macroscopic-scale simulations
NASA Astrophysics Data System (ADS)
Jenkins, T. G.; Smithe, D. N.
2015-02-01
The formation of sheaths near metallic or dielectric-coated wall materials in contact with a plasma is ubiquitous, often giving rise to physical phenomena (sputtering, secondary electron emission, etc) which influence plasma properties and dynamics both near and far from the material interface. In this paper, we use first-principles PIC simulations of such interfaces to formulate a subgrid sheath boundary condition which encapsulates fundamental aspects of the sheath behavior at the interface. Such a boundary condition, based on the capacitive behavior of the sheath, is shown to be useful in fluid simulations wherein sheath scale lengths are substantially smaller than scale lengths for other relevant physical processes (e.g. radiofrequency wavelengths), in that it enables kinetic processes associated with the presence of the sheath to be numerically modeled without explicit resolution of spatial and temporal sheath scales such as electron Debye length or plasma frequency.
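A minimal sketch of the capacitive picture behind such a subgrid condition (illustrative plasma parameters, not the authors' implementation): treating the sheath as a gap of a few electron Debye lengths gives an effective capacitance per unit area.

```python
import math

eps0 = 8.8541878128e-12   # vacuum permittivity [F/m]
e = 1.602176634e-19       # elementary charge [C]

n_e = 1e17                # assumed electron density [m^-3]
T_e_eV = 3.0              # assumed electron temperature [eV]

# Electron Debye length: lambda_De = sqrt(eps0 * Te[J] / (n_e * e^2)).
lambda_de = math.sqrt(eps0 * T_e_eV * e / (n_e * e**2))

sheath_width = 5.0 * lambda_de        # assumed sheath width of a few Debye lengths
c_per_area = eps0 / sheath_width      # parallel-plate estimate [F/m^2]
print(f"lambda_De = {lambda_de * 1e6:.1f} um, C/A ~ {c_per_area * 1e6:.3f} uF/m^2")
```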
A model for Entropy Production, Entropy Decrease and Action Minimization in Self-Organization
NASA Astrophysics Data System (ADS)
Georgiev, Georgi; Chatterjee, Atanu; Vu, Thanh; Iannacchione, Germano
In self-organization, energy gradients across complex systems lead to changes in the structure of those systems, decreasing their internal entropy to ensure the most efficient energy transport and therefore maximum entropy production in the surroundings. This approach stems from fundamental variational principles in physics, such as the principle of least action. It is coupled to the total energy flowing through a system, which leads to an increase in action efficiency. We compare energy transport through a fluid cell which has random motion of its molecules, and a cell which can form convection cells. We examine the signs of the change of entropy and the action needed for the motion inside those systems. The system in which convective motion occurs reduces the time for energy transmission compared to random motion. For more complex systems, those convection cells form a network of transport channels, for the purpose of obeying the equations of motion in this geometry. Those transport networks are an essential feature of complex systems in biology, ecology, economy and society.
Radiosurgery with photons or protons for benign and malignant tumours of the skull base: a review.
Amichetti, Maurizio; Amelio, Dante; Minniti, Giuseppe
2012-12-14
Stereotactic radiosurgery (SRS) is an important treatment option for intracranial lesions. Many studies have shown the effectiveness of photon-SRS for the treatment of skull base (SB) tumours; however, limited data are available for proton-SRS. Several photon-SRS techniques, including Gamma Knife, modified linear accelerators (Linac) and CyberKnife, have been developed and several studies have compared treatment plan characteristics between protons and photons. The principles of classical radiobiology are similar for protons and photons even though they differ in terms of physical properties and interaction with matter resulting in different dose distributions. Protons have special characteristics that allow normal tissues to be spared better than with the use of photons, although their potential clinical superiority remains to be demonstrated. A critical analysis of the fundamental radiobiological principles, dosimetric characteristics, clinical results, and toxicity of proton- and photon-SRS for SB tumours is provided and discussed with an attempt of defining the advantages and limits of each radiosurgical technique.
A hybrid computational-experimental approach for automated crystal structure solution
NASA Astrophysics Data System (ADS)
Meredig, Bryce; Wolverton, C.
2013-02-01
Crystal structure solution from diffraction experiments is one of the most fundamental tasks in materials science, chemistry, physics and geology. Unfortunately, numerous factors render this process labour intensive and error prone. Experimental conditions, such as high pressure or structural metastability, often complicate characterization. Furthermore, many materials of great modern interest, such as batteries and hydrogen storage media, contain light elements such as Li and H that only weakly scatter X-rays. Finally, structural refinements generally require significant human input and intuition, as they rely on good initial guesses for the target structure. To address these many challenges, we demonstrate a new hybrid approach, first-principles-assisted structure solution (FPASS), which combines experimental diffraction data, statistical symmetry information and first-principles-based algorithmic optimization to automatically solve crystal structures. We demonstrate the broad utility of FPASS to clarify four important crystal structure debates: the hydrogen storage candidates MgNH and NH3BH3; Li2O2, relevant to Li-air batteries; and high-pressure silane, SiH4.
DNA Charge Transport: From Chemical Principles to the Cell
Arnold, Anna R.; Grodick, Michael A.; Barton, Jacqueline K.
2016-01-01
The DNA double helix has captured the imagination of many, bringing it to the forefront of biological research. DNA has unique features that extend our interest into areas of chemistry, physics, material science and engineering. Our laboratory has focused on studies of DNA charge transport (CT), wherein charges can efficiently travel long molecular distances through the DNA helix while maintaining an exquisite sensitivity to base pair π-stacking. Because DNA CT chemistry reports on the integrity of the DNA duplex, this property may be exploited to develop electrochemical devices to detect DNA lesions and DNA-binding proteins. Furthermore, studies now indicate that DNA CT may also be used in the cell by, for example, DNA repair proteins, as a cellular diagnostic, in order to scan the genome to localize efficiently to damage sites. In this review, we describe this evolution of DNA CT chemistry from the discovery of fundamental chemical principles to applications in diagnostic strategies and possible roles in biology. PMID:26933744
On state-of-charge determination for lithium-ion batteries
NASA Astrophysics Data System (ADS)
Li, Zhe; Huang, Jun; Liaw, Bor Yann; Zhang, Jianbo
2017-04-01
Accurate estimation of the state of charge (SOC) of a battery through its life remains challenging in battery research. Although improved precision continues to be reported, almost all estimates are based on empirical regression methods, while accuracy is often not properly addressed. Here, a comprehensive review is set out to address such issues, from the fundamental principles that are supposed to define SOC to methodologies for estimating SOC in practical use. It covers topics from calibration and regression (including modeling methods) to validation in terms of precision and accuracy. At the end, we intend to answer the following questions: 1) Can SOC estimation be self-adaptive without bias? 2) Why is Ah-counting a necessity in almost all battery-model-assisted regression methods? 3) How can a consistent framework of coupling in multi-physics battery models be established? 4) How should statistical methods be employed to analyze the factors that contribute to uncertainty when assessing the accuracy of SOC estimation? We hope that, through this proper discussion of the principles, accurate SOC estimation can be widely achieved.
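Since Ah-counting is singled out as the backbone of almost all model-assisted SOC regression, a minimal coulomb-counting sketch is included here; the cell capacity, coulombic efficiency, and current trace are illustrative assumptions.

```python
def coulomb_count_soc(soc0, currents_a, dt_s, capacity_ah, coulombic_eff=1.0):
    """Integrate current (positive = discharge) to update state of charge.
    soc0 in [0, 1], currents in amperes, dt in seconds, capacity in ampere-hours."""
    soc = soc0
    for i in currents_a:
        soc -= coulombic_eff * i * dt_s / (capacity_ah * 3600.0)
        soc = min(max(soc, 0.0), 1.0)   # clamp to physical bounds
    return soc

# Example: a 2.5 Ah cell discharged at 1 A for one hour, starting from 90% SOC.
print(coulomb_count_soc(0.9, [1.0] * 3600, 1.0, 2.5))   # ~0.5
```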
The structure of tropical forests and sphere packings
Jahn, Markus Wilhelm; Dobner, Hans-Jürgen; Wiegand, Thorsten; Huth, Andreas
2015-01-01
The search for simple principles underlying the complex architecture of ecological communities such as forests still challenges ecological theorists. We use tree diameter distributions—fundamental for deriving other forest attributes—to describe the structure of tropical forests. Here we argue that tree diameter distributions of natural tropical forests can be explained by stochastic packing of tree crowns representing a forest crown packing system: a method usually used in physics or chemistry. We demonstrate that tree diameter distributions emerge accurately from a surprisingly simple set of principles that include site-specific tree allometries, random placement of trees, competition for space, and mortality. The simple static model also successfully predicted the canopy structure, revealing that most trees in our two studied forests grow up to 30–50 m in height and that the highest packing density of about 60% is reached between the 25- and 40-m height layer. Our approach is an important step toward identifying a minimal set of processes responsible for generating the spatial structure of tropical forests. PMID:26598678
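A minimal two-dimensional sketch of the kind of stochastic crown-packing construction described (random placement plus rejection of overlapping crowns); the plot size, diameter range, and crown allometry are illustrative assumptions, not the study's parameterization.

```python
import math
import random

random.seed(0)
plot_size = 100.0    # assumed square plot edge length [m]
n_attempts = 20000   # random placement attempts

trees = []           # accepted trees: (x, y, crown_radius, dbh)
for _ in range(n_attempts):
    x, y = random.uniform(0, plot_size), random.uniform(0, plot_size)
    dbh = random.uniform(0.05, 1.0)      # proposed stem diameter [m]
    crown_r = 5.0 * dbh ** 0.6           # assumed site-specific crown allometry [m]
    # Competition for space: reject a tree whose crown overlaps an already placed crown.
    if all(math.hypot(x - tx, y - ty) > crown_r + tr for tx, ty, tr, _ in trees):
        trees.append((x, y, crown_r, dbh))

# Resulting stem-diameter distribution: many small trees, few large ones.
bins = [0.05, 0.2, 0.4, 0.6, 0.8, 1.0]
counts = [sum(lo <= d < hi for *_, d in trees) for lo, hi in zip(bins, bins[1:])]
print(list(zip(bins[:-1], counts)))
```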
Statistical physics of vehicular traffic and some related systems
NASA Astrophysics Data System (ADS)
Chowdhury, Debashish; Santen, Ludger; Schadschneider, Andreas
2000-05-01
In the so-called “microscopic” models of vehicular traffic, attention is paid explicitly to each individual vehicle, each of which is represented by a “particle”; the nature of the “interactions” among these particles is determined by the way the vehicles influence each other's movement. Therefore, vehicular traffic, modeled as a system of interacting “particles” driven far from equilibrium, offers the possibility to study various fundamental aspects of truly nonequilibrium systems which are of current interest in statistical physics. Analytical as well as numerical techniques of statistical physics are being used to study these models to understand the rich variety of physical phenomena exhibited by vehicular traffic. Some of these phenomena, observed in vehicular traffic under different circumstances, include transitions from one dynamical phase to another, criticality and self-organized criticality, metastability and hysteresis, phase-segregation, etc. In this critical review, written from the perspective of statistical physics, we explain the guiding principles behind all the main theoretical approaches. But we present detailed discussions on the results obtained mainly from the so-called “particle-hopping” models, particularly emphasizing those which have been formulated in recent years using the language of cellular automata.
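As a concrete example of the particle-hopping/cellular-automaton class of models emphasized in the review, a minimal Nagel-Schreckenberg-type sketch is given below; the lattice size, density, and randomization probability are arbitrary choices.

```python
import random

random.seed(42)
L, v_max, p_slow, density = 100, 5, 0.3, 0.2

# Ring road of L cells; state maps occupied cell -> velocity.
cars = {pos: 0 for pos in random.sample(range(L), int(density * L))}

def step(cars):
    new = {}
    occupied = sorted(cars)
    for idx, x in enumerate(occupied):
        gap = (occupied[(idx + 1) % len(occupied)] - x - 1) % L   # empty cells ahead
        v = min(cars[x] + 1, v_max)          # acceleration
        v = min(v, gap)                      # braking to avoid collision
        if v > 0 and random.random() < p_slow:
            v -= 1                           # random slowdown
        new[(x + v) % L] = v                 # parallel update of positions
    return new

for _ in range(100):
    cars = step(cars)
print("mean velocity:", sum(cars.values()) / len(cars))
```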
Bhat, Ajaz Ahmad; Mohan, Vishwanathan; Sandini, Giulio; Morasso, Pietro
2016-07-01
Emerging studies indicate that several species such as corvids, apes and children solve 'The Crow and the Pitcher' task (from Aesop's Fables) in diverse conditions. Hidden beneath this fascinating paradigm is a fundamental question: by cumulatively interacting with different objects, how can an agent abstract the underlying cause-effect relations to predict and creatively exploit potential affordances of novel objects in the context of sought goals? Re-enacting this Aesop's Fable task on a humanoid within an open-ended 'learning-prediction-abstraction' loop, we address this problem and (i) present a brain-guided neural framework that emulates rapid one-shot encoding of ongoing experiences into a long-term memory and (ii) propose four task-agnostic learning rules (elimination, growth, uncertainty and status quo) that correlate predictions from remembered past experiences with the unfolding present situation to gradually abstract the underlying causal relations. Driven by the proposed architecture, the ensuing robot behaviours illustrated causal learning and anticipation similar to natural agents. Results further demonstrate that by cumulatively interacting with few objects, the predictions of the robot in case of novel objects converge close to the physical law, i.e. the Archimedes principle: this being independent of both the objects explored during learning and the order of their cumulative exploration. © 2016 The Author(s).
The School Office: An Overview.
ERIC Educational Resources Information Center
Hudson, Randy
2000-01-01
Presents an overview of the fundamental principles of school office design that remain constant despite changes in building technologies, and technological and spatial flexibility. Principles discussed include the school office and traffic patterns, security, and visitor reception requirements. (GR)
Homeschooling and Religious Fundamentalism
ERIC Educational Resources Information Center
Kunzman, Robert
2010-01-01
This article considers the relationship between homeschooling and religious fundamentalism by focusing on their intersection in the philosophies and practices of conservative Christian homeschoolers in the United States. Homeschooling provides an ideal educational setting to support several core fundamentalist principles: resistance to…
ERIC Educational Resources Information Center
Cheek, Jimmy G.; McGhee, Max B.
An activity was undertaken to develop written criterion-referenced tests for the agricultural mechanics component of the Applied Principles of Agribusiness and Natural Resources. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist of three…
ERIC Educational Resources Information Center
Cheek, Jimmy G.; McGhee, Max B.
An activity was undertaken to develop written criterion-referenced tests for the common core component of Applied Principles of Agribusiness and Natural Resources Occupations. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist of three…
ERIC Educational Resources Information Center
Cheek, Jimmy G.; McGhee, Max B.
An activity was undertaken to develop written criterion-referenced tests for the forestry component of Applied Principles of Agribusiness and Natural Resources. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist of three components, with…
ERIC Educational Resources Information Center
Cheek, Jimmy G.; McGhee, Max B.
An activity was undertaken to develop written criterion-referenced tests for the agricultural resources component of Applied Principles of Agribusiness and Natural Resources. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist of three…
ERIC Educational Resources Information Center
Cheek, Jimmy G.; McGhee, Max B.
An activity was undertaken to develop written criterion-referenced tests for the agricultural production component of Applied Principles of Agribusiness and Natural Resources Occupations. Intended for tenth grade students who have completed Fundamentals of Agribusiness and Natural Resources Occupations, applied principles were designed to consist…
Biological attachment devices: exploring nature's diversity for biomimetics.
Gorb, Stanislav N
2008-05-13
Many species of animals and plants are equipped with diverse attachment devices, whose morphology depends on the species' biology and on the particular function in which the attachment device is involved. Many functional solutions have evolved independently in different lineages of animals and plants. Since the diversity of such biological structures is huge, there is a need for their classification. This paper, based on original and literature data, proposes an ordering of biological attachment systems according to several principles: (i) the fundamental physical mechanism by which the system operates, (ii) the biological function of the attachment device, and (iii) the duration of the contact. Finally, we show the biomimetic potential of studies on biological attachment devices.
NASA Astrophysics Data System (ADS)
Chang, S. S. L.
State of the art technology in circuits, fields, and electronics is discussed. The principles and applications of these technologies to industry, digital processing, microwave semiconductors, and computer-aided design are explained. Important concepts and methodologies in mathematics and physics are reviewed, and basic engineering sciences and associated design methods are dealt with, including: circuit theory and the design of magnetic circuits and active filter synthesis; digital signal processing, including FIR and IIR digital filter design; transmission lines, electromagnetic wave propagation and surface acoustic wave devices. Also considered are: electronics technologies, including power electronics, microwave semiconductors, GaAs devices, and magnetic bubble memories; digital circuits and logic design.
Why anthropic reasoning cannot predict Lambda.
Starkman, Glenn D; Trotta, Roberto
2006-11-17
We revisit anthropic arguments purporting to explain the measured value of the cosmological constant. We argue that different ways of assigning probabilities to candidate universes lead to totally different anthropic predictions. As an explicit example, we show that weighting different universes by the total number of possible observations leads to an extremely small probability for observing a value of Lambda equal to or greater than what we now measure. We conclude that anthropic reasoning within the framework of probability as frequency is ill-defined and that in the absence of a fundamental motivation for selecting one weighting scheme over another the anthropic principle cannot be used to explain the value of Lambda, nor, likely, any other physical parameters.
Physical principles of intracellular organization via active and passive phase transitions
NASA Astrophysics Data System (ADS)
Berry, Joel; Brangwynne, Clifford P.; Haataja, Mikko
2018-04-01
Exciting recent developments suggest that phase transitions represent an important and ubiquitous mechanism underlying intracellular organization. We describe key experimental findings in this area of study, as well as the application of classical theoretical approaches for quantitatively understanding these data. We also discuss the way in which equilibrium thermodynamic driving forces may interface with the fundamentally out-of-equilibrium nature of living cells. In particular, time and/or space-dependent concentration profiles may modulate the phase behavior of biomolecules in living cells. We suggest future directions for both theoretical and experimental work that will shed light on the way in which biological activity modulates the assembly, properties, and function of viscoelastic states of living matter.
Saturation wind power potential and its implications for wind energy.
Jacobson, Mark Z; Archer, Cristina L
2012-09-25
Wind turbines convert kinetic to electrical energy, which returns to the atmosphere as heat to regenerate some potential and kinetic energy. As the number of wind turbines increases over large geographic regions, power extraction first increases linearly, but then converges to a saturation potential not identified previously from physical principles or turbine properties. These saturation potentials are >250 terawatts (TW) at 100 m globally, approximately 80 TW at 100 m over land plus coastal ocean outside Antarctica, and approximately 380 TW at 10 km in the jet streams. Thus, there is no fundamental barrier to obtaining half (approximately 5.75 TW) or several times the world's all-purpose power from wind in a 2030 clean-energy economy.
A sessional blind signature based on quantum cryptography
NASA Astrophysics Data System (ADS)
Khodambashi, Siavash; Zakerolhosseini, Ali
2014-01-01
In this paper, we present a sessional blind signature protocol whose security is guaranteed by fundamental principles of quantum physics. It allows a message owner to get his message signed by an authorized signatory. However, the signatory is not capable of reading the message contents, and everyone can verify the authenticity of the message. For this purpose, our proposed protocol takes advantage of a sessional signature as well as quantum entangled pairs generated with respect to it. We describe the proposed blind signature through an example and briefly discuss its unconditional security. Owing to its feasibility, the protocol can be widely employed for e-payment, e-government, e-business, etc.
The accuracy of the ATLAS muon X-ray tomograph
NASA Astrophysics Data System (ADS)
Avramidou, R.; Berbiers, J.; Boudineau, C.; Dechelette, C.; Drakoulakos, D.; Fabjan, C.; Grau, S.; Gschwendtner, E.; Maugain, J.-M.; Rieder, H.; Rangod, S.; Rohrbach, F.; Sbrissa, E.; Sedykh, E.; Sedykh, I.; Smirnov, Y.; Vertogradov, L.; Vichou, I.
2003-01-01
A gigantic detector, the ATLAS project, is under construction at CERN for particle physics research at the Large Hadron Collider, which is to be ready by 2006. An X-ray tomograph has been designed, developed and constructed at CERN in order to control the mechanical quality of the ATLAS muon chambers. We reached a measurement accuracy of 2 μm systematic and 2 μm statistical uncertainty in the horizontal and vertical directions over a working area of 220 cm (horizontal) × 60 cm (vertical). Here we describe in detail the basic principle chosen to achieve such good accuracy. Key results of measurements performed to cross-check our precision are also presented.
St Clair Gibson, A; Swart, J; Tucker, R
2018-02-01
Either central (brain) or peripheral (body physiological system) control mechanisms, or a combination of these, have been championed over the last few decades in the Exercise Sciences as the way physiological activity and fatigue processes are regulated. In this review, we suggest that the concepts of 'central' and 'peripheral' mechanisms are both artificial constructs that have 'straight-jacketed' research in the field, that competition between psychological and physiological homeostatic drives is central to the regulation of both, and that governing principles, rather than distinct physical processes, underpin all physical system and exercise regulation. As part of the Integrative Governor theory we develop in this review, we suggest that both psychological and physiological drives and requirements are underpinned by homeostatic principles, and that the relative activity of each is regulated by dynamic negative feedback activity, which acts as the fundamental general operational controller. Because of this competitive, dynamic interplay, we propose that the activity in all systems will oscillate, that these oscillations create information, and that comparison of this oscillatory information with either prior information, current activity, or activity templates creates efferent responses that change the activity in the different systems in a similarly dynamic manner. Changes in a particular system are always the result of perturbations occurring outside the system itself, the behavioural causative 'history' of this external activity will be evident in the pattern of the oscillations, and awareness of change occurs as a result of unexpected rather than planned change in physiological activity or psychological state.
This course is aimed at providing an overview of the fundamental guiding principles and general methods used in chemical risk assessment. Chemical risk assessment is a complex and ever-evolving process. These principles and methods have been organized by the National Research Cou...
Jaakkola, T; Yli-Piipari, S; Huotari, P; Watt, A; Liukkonen, J
2016-01-01
The purpose of this study was to examine the extent to which fundamental movement skills and physical fitness scores assessed in early adolescence predict self-reported physical activity assessed 6 years later. The sample comprised 333 (200 girls, 133 boys; M age = 12.41) students. The effects of previous physical activity, sex, and body mass index (BMI) were controlled in the main analyses. Adolescents' fundamental movement skills, physical fitness, self-report physical activity, and BMI were collected at baseline, and their self-report energy expenditure (metabolic equivalents: METs) and intensity of physical activity were collected using the International Physical Activity Questionnaire 6 years later. Results showed that fundamental movement skills predicted METs, light, moderate, and vigorous intensity physical activity levels, whereas fitness predicted METs, moderate, and vigorous physical activity levels. Hierarchical regression analyses also showed that after controlling for previous levels of physical activity, sex, and BMI, the size of the effect of fundamental movement skills and physical fitness on energy expenditure and physical activity intensity was moderate (R(2) change between 0.06 and 0.15), with the effect being stronger for high intensity physical activity. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
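A minimal sketch of the kind of hierarchical regression step reported above, i.e. the incremental R² when movement skills and fitness are added after the control variables; the file and column names are hypothetical placeholders, not the study's data.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("followup.csv")  # hypothetical file holding the variables below

base = smf.ols("mets ~ baseline_pa + sex + bmi", data=df).fit()
full = smf.ols("mets ~ baseline_pa + sex + bmi + movement_skills + fitness", data=df).fit()

r2_change = full.rsquared - base.rsquared  # incremental variance explained by skills and fitness
print(f"R^2 change: {r2_change:.2f}")      # the study reports values between 0.06 and 0.15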
A New "Principal Principle" (#14) of Physical Activity Education Is Emerging
ERIC Educational Resources Information Center
Zeigler, Earle F.
2011-01-01
There is every reason to believe that a new "principal principle" of physical activity education is emerging. In this article, the author talks about the new "principal principle" (#14) of physical education. Revisiting a milestone in the field's history to explain the origin of the term "principal principle," Dr. Arthur H. Steinhaus,…
From Particle Physics to Medical Applications
NASA Astrophysics Data System (ADS)
Dosanjh, Manjit
2017-06-01
CERN is the world's largest particle physics research laboratory. Since it was established in 1954, it has made an outstanding contribution to our understanding of the fundamental particles and their interactions, and also to the technologies needed to analyse their properties and behaviour. The experimental challenges have pushed the performance of particle accelerators and detectors to the limits of our technical capabilities, and these groundbreaking technologies can also have a significant impact in applications beyond particle physics. In particular, the detectors developed for particle physics have led to improved techniques for medical imaging, while accelerator technologies lie at the heart of the irradiation methods that are widely used for treating cancer. Indeed, many important diagnostic and therapeutic techniques used by healthcare professionals are based either on basic physics principles or the technologies developed to carry out physics research. Ever since the discovery of x-rays by Roentgen in 1895, physics has been instrumental in the development of technologies in the biomedical domain, including the use of ionizing radiation for medical imaging and therapy. Some key examples that are explored in detail in this book include scanners based on positron emission tomography, as well as radiation therapy for cancer treatment. Even the collaborative model of particle physics is proving to be effective in catalysing multidisciplinary research for medical applications, ensuring that pioneering physics research is exploited for the benefit of all.
1981-03-01
systems, subsystems, equipment, weapons, tactics, missions, etc. Concepts and Principles - Fundamental truths, ideas, opinions and thoughts formed from...verification, etc. Grasping the meaning of concepts and principles, i.e., understanding the basic principles of infrared and radar detection. Understanding...concepts, principles, procedures, etc.). Analysis - A demonstration of a learned process of breaking down material (i.e., data, other information) into
Fundamental electrode kinetics
NASA Technical Reports Server (NTRS)
Elder, J. P.
1968-01-01
Report presents the fundamentals of electrode kinetics and the methods used in evaluating the characteristic parameters of rapid-charge transfer processes at electrode-electrolyte interfaces. The concept of electrode kinetics is outlined, followed by the principles underlying the experimental techniques for the investigation of electrode kinetics.
The status of varying constants: a review of the physics, searches and implications
NASA Astrophysics Data System (ADS)
Martins, C. J. A. P.
2017-12-01
The observational evidence for the recent acceleration of the universe demonstrates that canonical theories of cosmology and particle physics are incomplete—if not incorrect—and that new physics is out there, waiting to be discovered. A key task for the next generation of laboratory and astrophysical facilities is to search for, identify and ultimately characterize this new physics. Here we highlight recent developments in tests of the stability of nature’s fundamental couplings, which provide a direct handle on new physics: a detection of variations will be revolutionary, but even improved null results provide competitive constraints on a range of cosmological and particle physics paradigms. A joint analysis of all currently available data shows a preference for variations of α and μ at about the two-sigma level, but inconsistencies between different sub-sets (likely due to hidden systematics) suggest that these statistical preferences need to be taken with caution. On the other hand, these measurements strongly constrain Weak Equivalence Principle violations. Plans and forecasts for forthcoming studies with facilities such as ALMA, ESPRESSO and the ELT, which should clarify these issues, are also discussed, and synergies with other probes are briefly highlighted. The goal is to show how a new generation of precision consistency tests of the standard paradigm will soon become possible.
Personality Theories Facilitate Integrating the Five Principles and Deducing Hypotheses for Testing
ERIC Educational Resources Information Center
Maddi, Salvatore R.
2007-01-01
Comments on the original article "A New Big Five: Fundamental Principles for an Integrative Science of Personality," by Dan P. McAdams and Jennifer L. Pals (see record 2006-03947-002). In presenting their view of personality science, McAdams and Pals (April 2006) elaborated the importance of five principles for building an integrated science of…
Austin ISD. Integrated Lesson Plans.
ERIC Educational Resources Information Center
East Texas State Univ., Commerce. Educational Development and Training Center.
This packet contains 14 lesson plans for integrated academic and vocational education courses. Lesson plans for the following courses are included: integrated physics and principles of technology; algebra and principles of technology; principles of technology, language arts, and economics; physics and industrial electronics; physics and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Genghong; Zhu, Jia; Jiang, Gelei
Piezoelectricity is closely related to the performance and application of piezoelectric devices, and understanding its fundamentals in detail is a crucial issue for designing functional devices with more peculiar performances. Based on first-principles simulations, the ZnO piezoelectric tunnel junction is taken as an example to systematically investigate its piezoelectricity (including the piezopotential energy, piezoelectric field, piezoelectric polarization and piezocharge) and to explore their correlation. A comprehensive picture of the piezoelectricity in the ZnO tunnel junction is revealed at the atomic scale, and it is verified to be an intrinsic characteristic of the ZnO barrier, independent of its terminated surface but dependent on its c-axis orientation and the applied strain. In the case of the ZnO c axis pointing from right to left, an in-plane compressive strain will induce piezocharges (and a piezopotential energy drop) with positive and negative signs (negative and positive signs) emerging respectively at the left and right terminated surfaces of the ZnO barrier. Meanwhile, a piezoelectric polarization (and a piezoelectric field) pointing from right to left (from left to right) is also induced throughout the ZnO barrier. All these piezoelectric physical quantities reverse when the applied strain switches from compressive to tensile. This study provides atomic-level insight into the fundamental behavior of the piezoelectricity of the piezoelectric tunnel junction and should provide very useful information for future designs of piezoelectric devices.
TOPICAL REVIEW: Electric current activated/assisted sintering (ECAS): a review of patents 1906-2008
NASA Astrophysics Data System (ADS)
Grasso, Salvatore; Sakka, Yoshio; Maizza, Giovanni
2009-10-01
Electric current activated/assisted sintering (ECAS) is an ever-growing class of versatile techniques for sintering particulate materials. Despite the tremendous advances over the last two decades in ECASed materials and products, there is a lack of comprehensive reviews on ECAS apparatuses and methods. This paper fills the gap by tracing the progress of ECAS technology from 1906 to 2008 and surveys 642 ECAS patents published over more than a century. It is found that ECAS technology was pioneered by Bloxam (1906 GB Patent No. 9020), who developed the first resistive sintering apparatus. The patents were searched by keywords or by cross-links and were retrieved from the Japanese Patent Office (342 patents), the United States Patent and Trademark Office (175 patents), the Chinese State Intellectual Property Office of P.R.C. (69 patents) and the World Intellectual Property Organization (12 patents). A subset of 119 (out of 642) ECAS patents on methods and apparatuses was selected and described in detail with respect to their fundamental concepts, physical principles and importance in either present ECAS apparatuses or future ECAS technologies for enhancing efficiency, reliability, repeatability, controllability and productivity. The paper is divided into two parts: the first deals with the basic concepts, features and definitions of basic ECAS, and the second analyzes the auxiliary devices/peripherals. The basic ECAS techniques are classified with reference to discharge time (fast and ultrafast ECAS). The fundamental principles and definitions of ECAS are outlined in accordance with the scientific and patent literature.
Evolutionary principles and their practical application
Hendry, Andrew P; Kinnison, Michael T; Heino, Mikko; Day, Troy; Smith, Thomas B; Fitt, Gary; Bergstrom, Carl T; Oakeshott, John; Jørgensen, Peter S; Zalucki, Myron P; Gilchrist, George; Southerton, Simon; Sih, Andrew; Strauss, Sharon; Denison, Robert F; Carroll, Scott P
2011-01-01
Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently in these different fields, even though the underlying fundamental concepts are the same. We explore these fundamental concepts under four main themes: variation, selection, connectivity, and eco-evolutionary dynamics. Within each theme, we present several key evolutionary principles and illustrate their use in addressing applied problems. We hope that the resulting primer of evolutionary concepts and their practical utility helps to advance a unified multidisciplinary field of applied evolutionary biology. PMID:25567966
Early Childhood Physical Education. The Essential Elements.
ERIC Educational Resources Information Center
Gabbard, Carl
1988-01-01
Details are presented regarding the essential elements of an effective early childhood physical education curriculum. Components include movement awareness, fundamental locomotor skills, fundamental nonlocomotor skills, fundamental manipulative skills, and health-related fitness. (CB)
ERIC Educational Resources Information Center
Trefil, James
2007-01-01
Prize-winning scientist and bestselling author James Trefil explains why every U.S. citizen needs to be "scientifically literate" and, therefore, why schools must teach the fundamental principles of scientific literacy to every student. He lays out those principles straightforwardly, so that educators--and everyone who is interested in…
Preparation for Careers--Not Jobs
ERIC Educational Resources Information Center
Worthy, James C.
1977-01-01
Sangamon State University's experimental management program has demonstrated the fundamental soundness of the generic approach. Application of common principles to a variety of organizational situations contributes to a better understanding of those principles and helps students understand the differences between organizations and how to adapt to…
Water System Adaptation To Hydrological Changes: Module 7, Adaptation Principles and Considerations
This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...
NASA Astrophysics Data System (ADS)
Cardenas, Crystal; Harter, Andrew; Hoyle, C. D.; Leopardi, Holly; Smith, David
2014-03-01
Gravity was the first force to be described mathematically, yet it is the only fundamental force not well understood. The Standard Model of particle physics describes interactions between the fundamental strong, weak and electromagnetic forces, while Einstein's theory of General Relativity (GR) describes the fundamental force of gravity. There is as yet no theory that resolves the inconsistencies between GR and quantum mechanics. Scenarios of String Theory that predict more than three spatial dimensions also predict physical effects of gravity at sub-millimeter scales that would alter the gravitational inverse-square law. The Weak Equivalence Principle (WEP), a central feature of GR, states that all objects are accelerated at the same rate in a gravitational field, independent of their composition. A violation of the WEP at any length scale would be evidence that current models of gravity are incorrect. At the Humboldt State University Gravitational Research Laboratory, an experiment is being developed to observe gravitational interactions below the 50-micron distance scale. The experiment measures the twist of a parallel-plate torsion pendulum as an attractor mass is oscillated within 50 microns of the pendulum, providing a time-varying gravitational torque on the pendulum. The size and distance dependence of the torque amplitude provide a means to determine deviations from accepted models of gravity on untested distance scales.
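Short-range deviations in experiments of this kind are conventionally parametrized by a Yukawa-type correction to the Newtonian potential; the form below is the standard convention in the sub-millimeter-test literature, quoted as background rather than as this experiment's specific analysis:

V(r) = -\frac{G m_1 m_2}{r}\left(1 + \alpha\, e^{-r/\lambda}\right)

A statistically significant non-zero strength α at a range λ below about 50 microns would signal a breakdown of the gravitational inverse-square law.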
NASA Astrophysics Data System (ADS)
Finkelstein, A. V.; Galzitskaya, O. V.
2004-04-01
Protein physics is grounded in three fundamental experimental facts: a protein, this long heteropolymer, has a well-defined compact three-dimensional structure; this structure can arise spontaneously from the unfolded protein chain in an appropriate environment; and this structure is separated from the unfolded state of the chain by the “all-or-none” phase transition, which ensures the robustness of protein structure and therefore of its action. The aim of this review is to consider the modern understanding of the physical principles of self-organization of protein structures and to overview such important features of this process as the selection of the unique protein structure among zillions of alternatives, the nucleation of the folding process, and metastable folding intermediates. Towards this end we consider the main experimental facts and simple, mostly phenomenological, theoretical models. We concentrate on relatively small (single-domain) water-soluble globular proteins (whose structure and especially folding are much better studied and understood than those of large, membrane or fibrous proteins) and consider kinetic and structural aspects of the transition of initially unfolded protein chains into their final solid (“native”) 3D structures.
NASA Astrophysics Data System (ADS)
Grebenev, Igor V.; Lebedeva, Olga V.; Polushkina, Svetlana V.
2018-07-01
The article proposes a new research object for a general physics course—the vapour Cartesian diver, designed to study the properties of saturated water vapour. Physics education puts great importance on the study of the saturated vapour state, as it is related to many fundamental laws and theories. For example, the temperature dependence of the saturated water vapour pressure allows the teacher to demonstrate the Le Chatelier’s principle: increasing the temperature of a system in a dynamic equilibrium favours the endothermic change. That means that increasing the temperature increases the amount of vapour present, and so increases the saturated vapour pressure. The experimental setup proposed in this paper can be used as an example of an auto-oscillatory system, based on the properties of saturated vapour. The article describes a mathematical model of physical processes that occur in the experiment, and proposes a numerical solution method for the acquired system of equations. It shows that the results of numerical simulation coincide with the self-oscillation parameters from the real experiment. The proposed installation can also be considered as a model of a thermal engine.
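As a small numerical aside on the steep temperature dependence of the saturated vapour pressure invoked here, the following sketch evaluates the Antoine equation for water; the constants are standard literature values for roughly 1-100 °C and are not taken from the article.

def p_sat_water_mmHg(t_celsius):
    # Antoine equation for water: log10(P/mmHg) = A - B / (C + T/degC)
    A, B, C = 8.07131, 1730.63, 233.426
    return 10 ** (A - B / (C + t_celsius))

for t in (25, 60, 100):
    print(t, "degC ->", round(p_sat_water_mmHg(t), 1), "mmHg")
# the pressure rises steeply with temperature, in line with Le Chatelier's principle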
Physics-based enzyme design: predicting binding affinity and catalytic activity.
Sirin, Sarah; Pearlman, David A; Sherman, Woody
2014-12-01
Computational enzyme design is an emerging field that has yielded promising success stories, but where numerous challenges remain. Accurate methods to rapidly evaluate possible enzyme design variants could provide significant value when combined with experimental efforts by reducing the number of variants needed to be synthesized and speeding the time to reach the desired endpoint of the design. To that end, extending our computational methods to model the fundamental physical-chemical principles that regulate activity in a protocol that is automated and accessible to a broad population of enzyme design researchers is essential. Here, we apply a physics-based implicit solvent MM-GBSA scoring approach to enzyme design and benchmark the computational predictions against experimentally determined activities. Specifically, we evaluate the ability of MM-GBSA to predict changes in affinity for a steroid binder protein, catalytic turnover for a Kemp eliminase, and catalytic activity for α-Gliadin peptidase variants. Using the enzyme design framework developed here, we accurately rank the most experimentally active enzyme variants, suggesting that this approach could provide enrichment of active variants in real-world enzyme design applications. © 2014 Wiley Periodicals, Inc.
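Because the benchmark described here amounts to checking whether MM-GBSA scores rank the experimentally most active variants first, a rank-correlation check is a natural summary; the scores and activities below are invented placeholders, not data from the paper.

import numpy as np
from scipy.stats import spearmanr

# hypothetical per-variant data: MM-GBSA scores (more negative = better predicted binding)
# and measured activities (higher = more active)
predicted_scores = np.array([-45.2, -38.7, -51.9, -41.3, -47.8])
measured_activity = np.array([12.0, 4.5, 20.1, 7.3, 15.8])

rho, p_value = spearmanr(-predicted_scores, measured_activity)  # negate so "better" aligns
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")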
A reinterpretation of transparency perception in terms of gamut relativity.
Vladusich, Tony
2013-03-01
Classical approaches to transparency perception assume that transparency constitutes a perceptual dimension corresponding to the physical dimension of transmittance. Here I present an alternative theory, termed gamut relativity, that naturally explains key aspects of transparency perception. Rather than being computed as values along a perceptual dimension corresponding to transmittance, gamut relativity postulates that transparency is built directly into the fabric of the visual system's representation of surface color. The theory, originally developed to explain properties of brightness and lightness perception, proposes how the relativity of the achromatic color gamut in a perceptual blackness-whiteness space underlies the representation of foreground and background surface layers. Whereas brightness and lightness perception were previously reanalyzed in terms of the relativity of the achromatic color gamut with respect to illumination level, transparency perception is here reinterpreted in terms of relativity with respect to physical transmittance. The relativity of the achromatic color gamut thus emerges as a fundamental computational principle underlying surface perception. A duality theorem relates the definition of transparency provided in gamut relativity with the classical definition underlying the physical blending models of computer graphics.
Relativistic covariance of Ohm's law
NASA Astrophysics Data System (ADS)
Starke, R.; Schober, G. A. H.
2016-04-01
The derivation of Lorentz-covariant generalizations of Ohm's law has been a long-term issue in theoretical physics with deep implications for the study of relativistic effects in optical and atomic physics. In this article, we propose an alternative route to this problem, which is motivated by the tremendous progress in first-principles materials physics in general and ab initio electronic structure theory in particular. We start from the most general, Lorentz-covariant first-order response law, which is written in terms of the fundamental response tensor χμν relating induced four-currents to external four-potentials. By showing the equivalence of this description to Ohm's law, we prove the validity of Ohm's law in every inertial frame. We further use the universal relation between χμν and the microscopic conductivity tensor σkℓ to derive a fully relativistic transformation law for the latter, which includes all effects of anisotropy and relativistic retardation. In the special case of a constant, scalar conductivity, this transformation law can be used to rederive a standard textbook generalization of Ohm's law.
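For reference, one common textbook covariant form of Ohm's law for a medium with four-velocity u^ν and constant scalar conductivity σ reads (SI units, metric signature (+,−,−,−); stated here as standard background, not necessarily in the convention adopted by the authors):

J^{\mu}_{\mathrm{cond}} = \sigma\, F^{\mu\nu} u_{\nu}

Its spatial components reduce to J = σ(E + v × B) in the nonrelativistic limit and to the familiar J = σE in the rest frame of the medium.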
The Notion of Scientific Knowledge in Biology
NASA Astrophysics Data System (ADS)
Morante, Silvia; Rossi, Giancarlo
2016-03-01
The purpose of this work is to reconsider and critically discuss the conceptual foundations of modern biology and the bio-sciences in general, and to provide an epistemological guideline to help frame the teaching of these disciplines and enhance the quality of their presentation in High School, Master and Ph.D. courses. After discussing the methodological problems that arise in trying to construct a sensible and useful scientific approach applicable to the study of living systems, we illustrate the general requirements that a workable scheme of investigation should meet to comply with the principles of the Galilean method. The amazing success of basic physics, the Galilean science par excellence, can be traced back to the development of a radically "reductionistic" approach in the interpretation of experiments and a systematic procedure, tailored on the paradigm of "falsifiability", aimed at consistently incorporating new information into extended models/theories. The development of the bio-sciences seems to fit with neither reductionism (the deeper the level of description of a biological phenomenon, the more difficult it is to find general and simple laws) nor falsifiability (experiments do not always provide a yes-or-no answer). Should we conclude that biology is not a science in the Galilean sense? We want to show that this is not so. Rather, in the study of living systems the novel interpretative paradigm of "complexity" has been developed that, without ever conflicting with the basic principles of physics, allows organizing ideas, conceiving new models and understanding the puzzling lack of reproducibility that seems to affect experiments in biology and in other modern areas of investigation. In the delicate task of conveying scientific concepts and principles to students, as well as in popularising the bio-sciences to a wider audience, it is of the utmost importance for the success of the learning process to highlight the internal logical consistency of biology and its compliance with the fundamental laws of physics.
Identity of Particles and Continuum Hypothesis
NASA Astrophysics Data System (ADS)
Berezin, Alexander A.
2001-04-01
Why are all electrons the same? Unlike other objects, particles and atoms (of the same isotope) are forbidden to have individuality or a personal history (or to reveal their hidden variables, even if they do have them). Or at least, what we commonly call physics has so far been unable to disprove particle sameness (Berezin and Nakhmanson, Physics Essays, 1990). Consider two opposing hypotheses: (A) particles are indeed absolutely the same, or (B) they do have individuality, but it is beyond our capacity to demonstrate it. This dilemma sounds akin to the undecidability of the Continuum Hypothesis on the existence (or not) of intermediate cardinalities between the integers and the reals (P. Cohen): both its yes and its no are true. Thus, the (alleged) sameness of electrons and atoms may be a physical translation (embodiment) of this fundamental Goedelian undecidability. Experiments are unlikely to help: even if we find that all electrons are the same within 30 decimal digits, could their masses (or charges) still differ in the 100th digit? Within (B), personalized, informationally rich (infinitely rich?) digital tails (starting at, say, the 100th decimal) may carry an individual record of each particle's history. Within (A), the parameters (m, q) are indeed exactly the same in all digits, and their sameness is based on some inherent (meta)physical principle akin to Platonism or Eddington-type numerology.
Nonequilibrium thermodynamics and information theory: basic concepts and relaxing dynamics
NASA Astrophysics Data System (ADS)
Altaner, Bernhard
2017-11-01
Thermodynamics is based on the notions of energy and entropy. While energy is the elementary quantity governing physical dynamics, entropy is the fundamental concept in information theory. In this work, starting from first principles, we give a detailed didactic account of the relations between energy and entropy and thus between physics and information theory. We show that thermodynamic process inequalities, like the second law, are equivalent to the requirement that an effective description for physical dynamics is strongly relaxing. From the perspective of information theory, strongly relaxing dynamics govern the irreversible convergence of a statistical ensemble towards the maximally non-committal probability distribution that is compatible with the thermodynamic equilibrium parameters. In particular, Markov processes that converge to a thermodynamic equilibrium state are strongly relaxing. Our framework generalizes previous results to arbitrary open and driven systems, yielding novel thermodynamic bounds for idealized and real processes.
ERIC Educational Resources Information Center
Robertson, L. Paul
Designed for use in basic electronics programs, this curriculum guide is comprised of twenty-nine units of instruction in five major content areas: Orientation, Basic Principles of Electricity/Electronics, Fundamentals of Direct Current, Fundamentals of Alternating Current, and Applying for a Job. Each instructional unit includes some or all of…
NASA Astrophysics Data System (ADS)
Uzan, Jean-Philippe
2013-02-01
Fundamental constants play a central role in many modern developments in gravitation and cosmology. Most extensions of general relativity lead to the conclusion that dimensionless constants are actually dynamical fields. Any detection of their variation on sub-Hubble scales would signal a violation of the Einstein equivalence principle and hence lead to gravity beyond general relativity. On super-Hubble scales, or maybe we should say on super-universe scales, such variations are invoked as a solution to the fine-tuning problem, in connection with an anthropic approach.
Traynor, Andrew P; Boyle, Cynthia J; Janke, Kristin K
2013-12-16
To assist administrators and faculty members in colleges and schools of pharmacy by gathering expert opinion to frame, direct, and support investments in student leadership development. Twenty-six leadership instructors participated in a 3-round, online, modified Delphi process to define doctor of pharmacy (PharmD) student leadership instruction. Round 1 asked open-ended questions about leadership knowledge, skills, and attitudes to begin the generation of student leadership development guiding principles and competencies. Statements were identified as guiding principles when they were perceived as foundational to the instructional approach. Round 2 grouped responses for agreement rating and comment. Group consensus with a statement as a guiding principle was set prospectively at 80%. Round 3 allowed rating and comment on guidelines, modified from feedback in round 2, that did not meet consensus. The principles were verified by identifying common contemporary leadership development approaches in the literature. Twelve guiding principles, related to concepts of leadership and educational philosophy, were defined and could be linked to contemporary leadership development thought. These guiding principles describe the motivation for teaching leadership, the fundamental precepts of student leadership development, and the core tenets for leadership instruction. Expert opinion gathered using a Delphi process resulted in guiding principles that help to address many of the fundamental questions that arise when implementing or refining leadership curricula. The principles identified are supported by common contemporary leadership development thought.
Beyond the Virtues-Principles Debate.
ERIC Educational Resources Information Center
Keat, Marilyn S.
1992-01-01
Indicates basic ontological assumptions in the virtues-principles debate in moral philosophy, noting Aristotle's and Kant's fundamental ideas about morality and considering a hermeneutic synthesis of theories. The article discusses what acceptance of the synthesis might mean in the theory and practice of moral pedagogy, offering examples of…
Linguistic Recycling and the Open Community.
ERIC Educational Resources Information Center
Dasgupta, Probal
2001-01-01
Examines linguistic recycling in the context of domestic Esperanto use. Argues that word-meaning recycling reflects the same fundamental principles as sentential recursion, and that a linguistics theoretically sensitive to these principles strengthens practical efforts towards the social goal of an open speech community. (Author/VWL)
NASA Astrophysics Data System (ADS)
Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf
2017-09-01
There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.
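For concreteness, the degenerate Shannon-Boltzmann-Gibbs form quoted above can be evaluated in a few lines; this is a generic numerical illustration, independent of the specific processes studied in the paper.

import numpy as np

def shannon_entropy(p):
    # H(p) = -sum_i p_i log p_i for a discrete distribution (natural logarithm)
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 log 0 = 0
    return -np.sum(p * np.log(p))

print(shannon_entropy([0.5, 0.5]))         # ln 2 ~ 0.693, the maximum for two outcomes
print(shannon_entropy([0.9, 0.05, 0.05]))  # lower value for a more peaked distribution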
NASA Astrophysics Data System (ADS)
Lucio Rapoport, Diego
2013-04-01
We present a unified principle for science that surmounts dualism, in terms of torsion fields and the non-orientable surfaces, notably the Klein Bottle and its logic, the Möbius strip and the projective plane. We apply it to the complex numbers and cosmology, to non-linear systems integrating the issue of hyperbolic divergences with the change of orientability, to the biomechanics of vision and the mammal heart, to the morphogenesis of crustal shapes on Earth in connection with the wavefronts of gravitation, elasticity and electromagnetism, to pattern recognition of artificial images and visual recognition, to neurology and the topographic maps of the sensorium, and to perception, in particular of music. We develop it in terms of the fundamental 2:1 resonance inherent to the Möbius strip and the Klein Bottle, the minimal-surface representation of the wavefronts, and the non-dual Klein Bottle logic inherent to pattern recognition and to the harmonic functions and vector fields that lie at the basis of geophysics and physics at large. We discuss the relation between the topographic maps of the sensorium and the issue of the turning inside-out of the visual world as a general principle for cognition, topological chemistry, cell biology and biological morphogenesis, in particular in embryology.
Scale relativity theory and integrative systems biology: 1. Founding principles and scale laws.
Auffray, Charles; Nottale, Laurent
2008-05-01
In these two companion papers, we provide an overview and a brief history of the multiple roots, current developments and recent advances of integrative systems biology and identify multiscale integration as its grand challenge. Then we introduce the fundamental principles and the successive steps that have been followed in the construction of the scale relativity theory, and discuss how scale laws of increasing complexity can be used to model and understand the behaviour of complex biological systems. In scale relativity theory, the geometry of space is considered to be continuous but non-differentiable, therefore fractal (i.e., explicitly scale-dependent). One writes the equations of motion in such a space as geodesics equations, under the constraint of the principle of relativity of all scales in nature. To this purpose, covariant derivatives are constructed that implement the various effects of the non-differentiable and fractal geometry. In this first review paper, the scale laws that describe the new dependence on resolutions of physical quantities are obtained as solutions of differential equations acting in the scale space. This leads to several possible levels of description for these laws, from the simplest scale invariant laws to generalized laws with variable fractal dimensions. Initial applications of these laws to the study of species evolution, embryogenesis and cell confinement are discussed.
NASA Astrophysics Data System (ADS)
Nesvizhevsky, Valery
2013-03-01
The 'whispering gallery' effect has been known since ancient times for sound waves in air, later in water and more recently for a broad range of electromagnetic waves: radio, optical, X-ray and so on. It is intensively used and explored due to its numerous crucial applications. It consists of wave localization near a curved reflecting surface and is expected for waves of various natures, for instance for neutrons and (anti)atoms. For (anti)matter waves it includes a new feature: a massive particle settles into quantum states, with parameters depending on its mass. In this talk, we present the first observation of the quantum whispering-gallery effect for matter particles (cold neutrons) [1, 2]. This phenomenon provides an example of an exactly solvable problem analogous to the 'quantum bouncer'; it is complementary to the recently discovered gravitational quantum states of neutrons [3]. These two phenomena provide a direct demonstration of the weak equivalence principle for a massive particle in a quantum state. Deeply bound long-living states are weakly sensitive to the surface potential; highly excited short-living states are very sensitive to the shape of the wall's nuclear potential. Therefore, they are a promising tool for studying fundamental neutron-matter interactions, quantum neutron optics and surface physics effects. Analogous phenomena could be measured with atoms and anti-atoms [4, 5].
Fluid flow measurements by means of vibration monitoring
NASA Astrophysics Data System (ADS)
Campagna, Mauro M.; Dinardo, Giuseppe; Fabbiano, Laura; Vacca, Gaetano
2015-11-01
The achievement of accurate fluid flow measurements is fundamental whenever the control and monitoring of certain physical quantities governing an industrial process are required. In such cases, non-intrusive devices are preferable, but these are often more sophisticated and expensive than more common devices (such as nozzles, diaphragms, Coriolis flowmeters and so on). In this paper, a novel, non-intrusive, simple and inexpensive methodology is presented for measuring the fluid flow rate (in a turbulent regime), whose physical principle is based on the acquisition of the transversal vibrational signals induced by the fluid itself on the walls of the pipe it is flowing through. Such a principle of operation would permit the use of micro-accelerometers capable of acquiring and transmitting the signals, even by means of wireless technology, to a control room for the monitoring of the process under control. A possible application (whose feasibility will be investigated by the authors in a further study) of this technology is the employment of a network of micro-accelerometers installed on the pipeline networks of aqueducts. This apparatus could lead to faster and easier detection and location of possible fluid leaks affecting the pipeline network, at more affordable cost. The authors, who have previously proven the linear dependence of the acceleration harmonics amplitude on the flow rate, here discuss an experimental analysis of this functional relation as the physical properties of the pipe, in terms of its diameter and constituent material, are varied, in order to find possible limits to the practical application of the measurement methodology.
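A minimal sketch of how the reported linear amplitude-flow relation could serve as a calibration curve in practice; all numbers below are invented placeholders, not the authors' data.

import numpy as np

# hypothetical calibration data: RMS amplitude of a wall-vibration harmonic (m/s^2)
# recorded at known flow rates (m^3/h) in the turbulent regime
flow_rate = np.array([10.0, 15.0, 20.0, 25.0, 30.0])
harmonic_amp = np.array([0.021, 0.032, 0.043, 0.052, 0.064])

slope, intercept = np.polyfit(flow_rate, harmonic_amp, 1)  # linear law assumed, as reported

def estimate_flow(amplitude):
    # invert the linear calibration to read the flow rate from a vibration amplitude
    return (amplitude - intercept) / slope

print(f"estimated flow at 0.048 m/s^2: {estimate_flow(0.048):.1f} m^3/h")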
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chantler, C.T.
2003-01-24
Richard Deslattes passed away on 16 May 2001 after a life dedicated to fundamental metrology. Although the themes of calibrating light, matter and fundamental constants can give three guiding principles through his career, the wide-ranging nature of his areas of interest are encompassed by over 165 refereed publications with several cited over 100 times. He has left an enduring legacy to science.
Moore, Michael D; Shi, Zhenqi; Wildfong, Peter L D
2010-12-01
To develop a method for drawing statistical inferences from differences between multiple experimental pair distribution function (PDF) transforms of powder X-ray diffraction (PXRD) data. The appropriate treatment of initial PXRD error estimates using traditional error propagation algorithms was tested using Monte Carlo simulations on amorphous ketoconazole. An amorphous felodipine:polyvinyl pyrrolidone:vinyl acetate (PVPva) physical mixture was prepared to define an error threshold. Co-solidified products of felodipine:PVPva and terfenadine:PVPva were prepared using a melt-quench method and subsequently analyzed using PXRD and PDF. Differential scanning calorimetry (DSC) was used as an additional characterization method. The appropriate manipulation of initial PXRD error estimates through the PDF transform was confirmed using the Monte Carlo simulations for amorphous ketoconazole. PDF analysis of the felodipine:PVPva physical mixture determined ±3σ to be an appropriate error threshold. Using the PDF and error propagation principles, the felodipine:PVPva co-solidified product was determined to be completely miscible, while the terfenadine:PVPva co-solidified product, although having the appearance of an amorphous molecular solid dispersion by DSC, was determined to be phase-separated. Statistically based inferences were successfully drawn from PDF transforms of PXRD patterns obtained from composite systems. The principles applied herein may be universally adapted to many different systems and provide a fundamentally sound basis for drawing structural conclusions from PDF studies.
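The error-propagation logic described here can be sketched generically: resample the diffraction intensities within their counting errors, push each resample through a (toy) PDF transform, and flag r-points where the difference between two patterns exceeds ±3σ. The simplified transform, the Gaussian test patterns and the error level below are illustrative assumptions, not the published protocol.

import numpy as np

rng = np.random.default_rng(1)

def toy_pdf_transform(intensity, q, r):
    # schematic reduced-PDF transform, G(r) ~ integral of q * I(q) * sin(q r) dq
    # (not the fully normalized transform used for real PXRD data)
    return np.trapz(q * intensity * np.sin(np.outer(r, q)), q, axis=1)

q = np.linspace(0.5, 20.0, 400)      # scattering-vector grid (1/angstrom)
r = np.linspace(1.0, 20.0, 200)      # real-space distances (angstrom)
I_a = np.exp(-(q - 3.0) ** 2) + 0.1  # two hypothetical diffuse patterns
I_b = np.exp(-(q - 3.2) ** 2) + 0.1
sigma_I = 0.02                       # assumed counting-error estimate on the intensities

# propagate the intensity errors through the transform by Monte Carlo resampling
G_a = np.array([toy_pdf_transform(I_a + rng.normal(0, sigma_I, q.size), q, r) for _ in range(500)])
G_b = np.array([toy_pdf_transform(I_b + rng.normal(0, sigma_I, q.size), q, r) for _ in range(500)])

diff = G_a.mean(axis=0) - G_b.mean(axis=0)
sigma_diff = np.sqrt(G_a.var(axis=0) + G_b.var(axis=0))
significant = np.abs(diff) > 3 * sigma_diff  # the +/- 3 sigma criterion described above
print(f"{significant.sum()} of {r.size} r-points differ beyond 3 sigma")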
Implications of Einstein-Weyl Causality on Quantum Mechanics
NASA Astrophysics Data System (ADS)
Bendaniel, David
A fundamental physical principle that has consequences for the topology of space-time is the principle of Einstein-Weyl causality. This also has quantum mechanical manifestations. Borchers and Sen have rigorously investigated the mathematical implications of Einstein-Weyl causality and shown that a denumerable space-time Q^2 would be implied. They were left with important philosophical paradoxes regarding the nature of the physical real line E, e.g., whether E = R, the real line of mathematics. In order to remove these paradoxes, an investigation into a constructible foundation is suggested. We have pursued such a program and find it indeed provides a dense, denumerable space-time and, moreover, an interesting connection with quantum mechanics. We first show that this constructible theory contains polynomial functions which are locally homeomorphic with a dense, denumerable metric space R* and are inherently quantized. Eigenfunctions governing fields can then be effectively obtained by computational iteration. Postulating a Lagrangian for fields in a compactified space-time, we get a general description of which the Schrödinger equation is a special case. From these results we can then also show that this denumerable space-time is relational (in the sense that space is not infinitesimally small if and only if it contains a quantized field) and, since Q^2 is embedded in R*^2, it directly fulfills the strict topological requirements for Einstein-Weyl causality. Therefore, the theory predicts that E = R*.
Paleophysiology: From Fossils to the Future.
Vermeij, Geerat J
2015-10-01
Future environments may resemble conditions that have not existed for millions of years. To assess the adaptive options available to organisms evolving under such circumstances, it is instructive to probe paleophysiology, the ways in which ancient life coped with its physical and chemical surroundings. To do this, we need reliable proxies that are based on fundamental principles, quantitatively verified in living species, and observable in fossil remains. Insights have already come from vertebrates and plants, and others will likely emerge for marine animals if promising proxies are validated. Many questions remain about the circumstances for the evolution of environmental tolerances, metabolic rates, biomineralization, and physiological responses to interacting species, and about how living organisms will perform under exceptional conditions. Copyright © 2015 Elsevier Ltd. All rights reserved.
Optical properties of amyloid stained by Congo red: history and mechanisms.
Howie, Alexander J; Brewer, Douglas B
2009-04-01
Amyloid stained by Congo red has striking optical properties that generally have been poorly described and inadequately explained, although they can be understood from principles of physical optics. Molecules of Congo red are orientated on amyloid fibrils, and so the dye becomes dichroic and birefringent. The birefringence varies with wavelength in accordance with a fundamental property of all light-transmitting materials called anomalous dispersion of the refractive index around an absorption peak. The combination of this and absorption of light, with modification by any additional birefringence in the optical system, explains the various colours that can be seen in Congo red-stained amyloid between crossed polariser and analyser, and also when the polariser and analyser are progressively uncrossed. These are called anomalous colours.
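A minimal numerical sketch of anomalous dispersion using a single Lorentz oscillator follows; all parameters (resonance wavelength, damping, oscillator strength) are illustrative assumptions, not values fitted to Congo red or amyloid.

```python
import numpy as np

# Single Lorentz oscillator: the real refractive index drops steeply across
# the absorption peak ("anomalous dispersion"), which is why the induced
# birefringence of dye-stained amyloid is so strongly wavelength dependent
# near the Congo red absorption band. All parameters are illustrative.
wavelength = np.linspace(400e-9, 700e-9, 301)   # m
omega = 2 * np.pi * 3e8 / wavelength            # angular frequency, rad/s
omega0 = 2 * np.pi * 3e8 / 497e-9               # resonance near 497 nm (assumed)
gamma = 0.05 * omega0                           # damping (assumed)
strength = 0.2 * omega0**2                      # oscillator strength (assumed)

chi = strength / (omega0**2 - omega**2 - 1j * gamma * omega)
n_real = np.sqrt(1 + chi).real                  # real part of refractive index

peak_region = np.abs(wavelength - 497e-9) < 30e-9
print("n varies between %.3f and %.3f across the absorption band"
      % (n_real[peak_region].min(), n_real[peak_region].max()))
```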
Cosmological Inflation: A Personal Perspective
NASA Technical Reports Server (NTRS)
Kazanas, Demos
2008-01-01
We present a review of the sequence of events and circumstances that led to the introduction of the interplay between the physics associated with phase transitions in the early universe and their effects on its expansion dynamics, with the goal of resolving the horizon problem, an interplay that has since become known as Cosmological Inflation. We then provide a brief review of the fundamentals and the solutions of a theory of gravity based on local scale invariance, known as Weyl gravity, that have been elaborated by the presenter and his collaborator P. D. Mannheim. We point out that this theory provides, from first principles, a characteristic universal acceleration whose value appears to be in agreement with observations across a vast range of length scales in the universe.
The waveguide laser - A review
NASA Technical Reports Server (NTRS)
Degnan, J. J.
1976-01-01
The present article reviews the fundamental physical principles essential to an understanding of waveguide gas and liquid lasers, and the current technological state of these devices. At the present time, waveguide laser transitions span the visible through submillimeter regions of the wavelength spectrum. The introduction discusses the many applications of waveguide lasers and the wide variety of laser configurations that are possible. Section 1 summarizes the properties of modes in hollow dielectric waveguides of circular, rectangular, and planar cross section. Section 2 considers various approaches to optical feedback including internal and external mirror Fabry-Perot type resonators, hollow waveguide distributed feedback structures, and ring-resonant configurations. Section 3 discusses those aspects of molecular kinetic and laser theory pertinent to the design and optimization of waveguide gas lasers.
Han, Ning-Xu; Xing, Feng
2016-01-01
A review of the research activities and achievements at Shenzhen University is conducted in this paper concerning the creation and further development of novel microcapsule based self-resilience systems for their application in concrete structures. After a brief description of pioneering works in the field starting about 10 years ago, the principles raised in the relevant research are examined, where fundamental terms related to the concept of resilience are discussed. Several breakthrough points are highlighted concerning the three adopted comprehensive self-resilience systems, namely physical, chemical and microbial systems. The major challenges regarding evaluation are emphasized and further development concerning self-resilience in concrete structures will be addressed. PMID:28772362
Subquantum information and computation
NASA Astrophysics Data System (ADS)
Valentini, Antony
2002-08-01
It is argued that immense physical resources -- for nonlocal communication, espionage, and exponentially-fast computation -- are hidden from us by quantum noise, and that this noise is not fundamental but merely a property of an equilibrium state in which the universe happens to be at the present time. It is suggested that `non-quantum' or nonequilibrium matter might exist today in the form of relic particles from the early universe. We describe how such matter could be detected and put to practical use. Nonequilibrium matter could be used to send instantaneous signals, to violate the uncertainty principle, to distinguish non-orthogonal quantum states without disturbing them, to eavesdrop on quantum key distribution, and to outpace quantum computation (solving NP-complete problems in polynomial time).
NASA Astrophysics Data System (ADS)
Crutchfield, James; Wiesner, Karoline
2010-02-01
Is anything ever simple? When confronted with a complicated system, scientists typically strive to identify underlying simplicity, which we articulate as natural laws and fundamental principles. This simplicity is what makes nature appear so organized. Atomic physics, for example, approached a solid theoretical foundation when Niels Bohr uncovered the organization of electronic energy levels, which only later were redescribed as quantum wavefunctions. Charles Darwin's revolutionary idea about the "origin" of species emerged by mapping how species are organized and discovering why they came to be that way. And James Watson and Francis Crick's interpretation of DNA diffraction spectra was a discovery of the structural organization of genetic information - it was neither about the molecule's disorder (thermodynamic entropy) nor about the statistical randomness of its base-pair sequences.
Elastic Waves: Mental Models and Teaching/Learning Sequences
NASA Astrophysics Data System (ADS)
Tarantino, Giovanni
In recent years, many research studies have pointed out relevant student difficulties in understanding the physics of mechanical waves. Moreover, it has been reported that these difficulties concern some fundamental concepts, such as the role of the medium in wave propagation, the superposition principle, and the mathematical description of waves involving the use of functions of two variables. In the context of pre-service courses for teacher preparation, a teaching/learning (T/L) sequence based on simple RTL experiments and interactive simulation environments, aimed at showing the effect of medium properties on the propagation speed of a wave pulse, has been trialled. Here, preliminary results of investigations carried out with a group of 120 trainee teachers (TTs) are reported and discussed.
The Superconducting Cavity Stabilized Oscillator
NASA Technical Reports Server (NTRS)
Turneaure, J. P.; Buchman, Saps; Lipa, John
1997-01-01
Superconducting Cavity Stabilized Oscillators (SCSOs) have produced the most stable clocks to date for integration times between 10^2 and 10^3 seconds, achieving a fractional frequency stability of 2 × 10^-16 for a sampling time of 100 s. The principal contributors to cavity frequency variations are: (1) acceleration effects due to gravity and vibrations; (2) temperature variations; (3) variations in the energy stored in the cavity; and (4) noise introduced by the frequency stabilization circuit. We discuss the prospects for improvements in all these areas for both ground-based and space-based SCSOs, which may lead to SCSOs with fractional frequency stabilities below 10^-17. SCSOs of this frequency stability will be useful for testing fundamental physical principles.
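A quick sketch of why dimensional stability dominates such oscillators: for a resonant cavity, the fractional frequency shift tracks the fractional change in effective cavity length, df/f = -dL/L. The cavity length below is an assumed illustrative value, not the actual SCSO geometry.

```python
# df/f = -dL/L for a resonant cavity: the dimensional stability needed to
# reach a given fractional frequency goal. Illustrative numbers only.
cavity_length = 0.03            # m, assumed superconducting cavity length
target_stability = 1e-17        # fractional frequency stability goal

max_length_change = target_stability * cavity_length
print(f"allowed cavity length variation: {max_length_change:.1e} m "
      f"(~{max_length_change / 1e-18:.1f} attometres)")
```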
Perception of Mirror Symmetry in Autism Spectrum Disorders
ERIC Educational Resources Information Center
Falter, Christine M.; Bailey, Anthony J.
2012-01-01
Gestalt grouping in autism spectrum disorders (ASD) is selectively impaired for certain organization principles but not for others. Symmetry is a fundamental Gestalt principle characterizing many biological shapes. Sensitivity to symmetry was tested using the Picture Symmetry Test, which requires finding symmetry lines on pictures. Individuals…
Dual matter-wave inertial sensors in weightlessness
Barrett, Brynle; Antoni-Micollier, Laura; Chichet, Laure; Battelier, Baptiste; Lévèque, Thomas; Landragin, Arnaud; Bouyer, Philippe
2016-01-01
Quantum technology based on cold-atom interferometers is showing great promise for fields such as inertial sensing and fundamental physics. However, the finite free-fall time of the atoms limits the precision achievable on Earth, while in space interrogation times of many seconds will lead to unprecedented sensitivity. Here we realize simultaneous 87Rb–39K interferometers capable of operating in the weightless environment produced during parabolic flight. Large vibration levels (10−2 g Hz−1/2), variations in acceleration (0–1.8 g) and rotation rates (5° s−1) onboard the aircraft present significant challenges. We demonstrate the capability of our correlated quantum system by measuring the Eötvös parameter with systematic-limited uncertainties of 1.1 × 10−3 and 3.0 × 10−4 during standard- and microgravity, respectively. This constitutes a fundamental test of the equivalence principle using quantum sensors in a free-falling vehicle. Our results are applicable to inertial navigation, and can be extended to the trajectory of a satellite for future space missions. PMID:27941928
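The Eötvös parameter quoted above compares the free-fall accelerations of two test species; a nonzero value would signal a weak-equivalence-principle violation. A minimal sketch with made-up accelerations (not the 87Rb–39K values measured in the flight campaign):

```python
a_rb = 9.8156732           # m/s^2, hypothetical acceleration of species 1
a_k = 9.8156741            # m/s^2, hypothetical acceleration of species 2

# Eotvos parameter: normalized differential acceleration of the two species.
eta = 2.0 * (a_rb - a_k) / (a_rb + a_k)
print(f"Eotvos parameter eta = {eta:.2e}")   # ~ -9e-8 for these toy numbers
```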
Jirauschek, Christian; Huber, Robert
2015-01-01
We analyze the physics behind the newest generation of rapidly wavelength tunable sources for optical coherence tomography (OCT), retaining a single longitudinal cavity mode during operation without repeated build up of lasing. In this context, we theoretically investigate the currently existing concepts of rapidly wavelength-swept lasers based on tuning of the cavity length or refractive index, leading to an altered optical path length inside the resonator. Specifically, we consider vertical-cavity surface-emitting lasers (VCSELs) with microelectromechanical system (MEMS) mirrors as well as Fourier domain mode-locked (FDML) and Vernier-tuned distributed Bragg reflector (VT-DBR) lasers. Based on heuristic arguments and exact analytical solutions of Maxwell’s equations for a fundamental laser resonator model, we show that adiabatic wavelength tuning is achieved, i.e., hopping between cavity modes associated with a repeated build up of lasing is avoided, and the photon number is conserved. As a consequence, no fundamental limit exists for the wavelength tuning speed, in principle enabling wide-range wavelength sweeps at arbitrary tuning speeds with narrow instantaneous linewidth. PMID:26203373
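A minimal sketch of the adiabatic-tuning idea for an ideal two-mirror cavity follows: if the optical path length is swept while the field stays in the same longitudinal mode, the emission wavelength follows continuously without mode hopping. The cavity length, index and displacement values are illustrative, not those of an actual MEMS-VCSEL or FDML laser.

```python
# Longitudinal mode of an ideal two-mirror cavity: lambda_m = 2 n L / m.
# Sweeping the optical path length n*L while the field remains in the same
# mode index m tunes the wavelength continuously (adiabatic tuning); hopping
# to a different m would require lasing to build up again.
n = 1.0                      # intracavity refractive index (assumed)
L0 = 1.3e-6                  # m, initial cavity length (assumed, VCSEL-like)
lam0 = 1300e-9               # m, initial wavelength near 1300 nm (assumed)

m = round(2 * n * L0 / lam0)             # longitudinal mode index
for dL in (0.0, 10e-9, 20e-9, 30e-9):    # MEMS mirror displacement sweep
    lam = 2 * n * (L0 + dL) / m
    print(f"dL = {dL*1e9:4.0f} nm  ->  lambda = {lam*1e9:7.2f} nm")
```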
Self-organization: the fundament of cell biology
Betz, Timo
2018-01-01
Self-organization refers to the emergence of an overall order in time and space of a given system that results from the collective interactions of its individual components. This concept has been widely recognized as a core principle in pattern formation for multi-component systems of the physical, chemical and biological world. It can be distinguished from self-assembly by the constant input of energy required to maintain order—and self-organization therefore typically occurs in non-equilibrium or dissipative systems. Cells, with their constant energy consumption and myriads of local interactions between distinct proteins, lipids, carbohydrates and nucleic acids, represent the perfect playground for self-organization. It therefore comes as no surprise that many properties and features of self-organized systems, such as spontaneous formation of patterns, nonlinear coupling of reactions, bi-stable switches, waves and oscillations, are found in all aspects of modern cell biology. Ultimately, self-organization lies at the heart of the robustness and adaptability found in cellular and organismal organization, and hence constitutes a fundamental basis for natural selection and evolution. This article is part of the theme issue ‘Self-organization in cell biology’. PMID:29632257
Progress in piezo-phototronic effect modulated photovoltaics.
Que, Miaoling; Zhou, Ranran; Wang, Xiandi; Yuan, Zuqing; Hu, Guofeng; Pan, Caofeng
2016-11-02
Wurtzite structured materials, like ZnO, GaN, CdS, and InN, simultaneously possess semiconductor and piezoelectric properties. The inner-crystal piezopotential induced by external strain can effectively tune/control the carrier generation, transport and separation/combination processes at the metal-semiconductor contact or p-n junction, which is called the piezo-phototronic effect. This effect can efficiently enhance the performance of photovoltaic devices based on piezoelectric semiconductor materials by utilizing the piezo-polarization charges at the junction induced by straining, which can modulate the energy band of the piezoelectric material and then accelerate or prevent the separation process of the photon-generated electrons and holes. This paper introduces the fundamental physics principles of the piezo-phototronic effect, and reviews recent progress in piezo-phototronic effect enhanced solar cells, including solar cells based on semiconductor nanowires, organic/inorganic materials, quantum dots, and perovskite. The piezo-phototronic effect is suggested as a suitable basis for the development of an innovative method to enhance the performance of solar cells based on piezoelectric semiconductors by applying external strains, which might be appropriate for fundamental research and potential applications in various areas of optoelectronics.
The Principle of General Tovariance
NASA Astrophysics Data System (ADS)
Heunen, C.; Landsman, N. P.; Spitters, B.
2008-06-01
We tentatively propose two guiding principles for the construction of theories of physics, which should be satisfied by a possible future theory of quantum gravity. These principles are inspired by those that led Einstein to his theory of general relativity, viz. his principle of general covariance and his equivalence principle, as well as by the two mysterious dogmas of Bohr's interpretation of quantum mechanics, i.e. his doctrine of classical concepts and his principle of complementarity. An appropriate mathematical language for combining these ideas is topos theory, a framework earlier proposed for physics by Isham and collaborators. Our principle of general tovariance states that any mathematical structure appearing in the laws of physics must be definable in an arbitrary topos (with natural numbers object) and must be preserved under so-called geometric morphisms. This principle identifies geometric logic as the mathematical language of physics and restricts the constructions and theorems to those valid in intuitionism: neither Aristotle's principle of the excluded third nor Zermelo's Axiom of Choice may be invoked. Subsequently, our equivalence principle states that any algebra of observables (initially defined in the topos Sets) is empirically equivalent to a commutative one in some other topos.
Von Neumann's impossibility proof: Mathematics in the service of rhetorics
NASA Astrophysics Data System (ADS)
Dieks, Dennis
2017-11-01
According to what has become a standard history of quantum mechanics, in 1932 von Neumann persuaded the physics community that hidden variables are impossible as a matter of principle, after which leading proponents of the Copenhagen interpretation put the situation to good use by arguing that the completeness of quantum mechanics was undeniable. This state of affairs lasted, so the story continues, until Bell in 1966 exposed von Neumann's proof as obviously wrong. The realization that von Neumann's proof was fallacious then rehabilitated hidden variables and made serious foundational research possible again. It is often added in recent accounts that von Neumann's error had been spotted almost immediately by Grete Hermann, but that her discovery was of no effect due to the dominant Copenhagen Zeitgeist. We shall attempt to tell a story that is more historically accurate and less ideologically charged. Most importantly, von Neumann never claimed to have shown the impossibility of hidden variables tout court, but argued that hidden-variable theories must possess a structure that deviates fundamentally from that of quantum mechanics. Both Hermann and Bell appear to have missed this point; moreover, both raised unjustified technical objections to the proof. Von Neumann's argument was basically that hidden-variables schemes must violate the "quantum principle" that physical quantities are to be represented by operators in a Hilbert space. As a consequence, hidden-variables schemes, though possible in principle, necessarily exhibit a certain kind of contextuality. As we shall illustrate, early reactions to Bohm's theory are in agreement with this account. Leading physicists pointed out that Bohm's theory has the strange feature that pre-existing particle properties do not generally reveal themselves in measurements, in accordance with von Neumann's result. They did not conclude that the "impossible was done" and that von Neumann had been shown wrong.
Kant and the Conservation of Matter
NASA Astrophysics Data System (ADS)
Morris, Joel
This dissertation is an examination of Kant's rather notorious claim that natural science, or physics, has a priori principles, understood as the claim that physics is constrained by rules warranted by the essential nature of thought. The overall direction of this study is towards examining Kant's claim by close study of a particular principle of physics, the principle of the conservation of matter. If indeed this is a principle of physics, and Kant can successfully show that it is a priori, then it will be reasonable to conclude, in company with Kant, that physics has a priori principles. Although Kant's proof of the principle of the conservation of matter has been traditionally regarded as a reasonably straightforward consequence of his First Analogy of Experience, a careful reading of his proof reveals that this is not really the case. Rather, Kant's proof of the conservation of matter is a consequence of (i) his schematisation of the category of substance in terms of permanence, and (ii) his identification of matter as substance, by appeal to what he thinks is the empirical criterion of substance, activity. Careful examination of Kant's argument in defence of the principle of the conservation of matter, however, reveals a number of deficiencies, and it is concluded that Kant cannot be said to have satisfactorily demonstrated the principle of the conservation of matter or to have convincingly illustrated his claim that physics has a priori principles by appeal to this instance.
ERIC Educational Resources Information Center
Mendenhall, Anne M.
2012-01-01
Merrill (2002a) created a set of fundamental principles of instruction that can lead to effective, efficient, and engaging (e[superscript 3]) instruction. The First Principles of Instruction (Merrill, 2002a) are a prescriptive set of interrelated instructional design practices that consist of activating prior knowledge, using specific portrayals…
The Basic Principle of Calculus?
ERIC Educational Resources Information Center
Hardy, Michael
2011-01-01
A simple partial version of the Fundamental Theorem of Calculus can be presented on the first day of the first-year calculus course, and then relied upon repeatedly in assigned problems throughout the course. With that experience behind them, students can use the partial version to understand the full-fledged Fundamental Theorem, with further…
The Tractor Electrical System. A Teaching Reference.
ERIC Educational Resources Information Center
American Association for Vocational Instructional Materials, Athens, GA.
The fundamental principles underlying the application of electricity to tractors and farm equipment are presented. An understanding of the material in the basic manual will enable the service man to understand better the service procedures covered in service manuals on electrical equipment. Topics dealt with are fundamentals of electricity,…
Regenerative endodontics--Creating new horizons.
Dhillon, Harnoor; Kaushik, Mamta; Sharma, Roshni
2016-05-01
Trauma to the dental pulp, physical or microbiologic, can lead to inflammation of the pulp followed by necrosis. The current treatment modality for such cases is non-surgical root canal treatment. The damaged tissue is extirpated and the root canal system prepared. It is then obturated with an inert material such as gutta-percha. In spite of advances in techniques and materials, 10%-15% of the cases may end in failure of treatment. Regenerative endodontics combines principles of endodontics, cell biology, and tissue engineering to provide an ideal treatment for inflamed and necrotic pulp. It utilizes mesenchymal stem cells, growth factors, and organ tissue culture to provide treatment. Potential treatment modalities include induction of blood clot for pulp revascularization, scaffold aided regeneration, and pulp implantation. Although in its infancy, successful treatment of damaged pulp tissue has been performed using principles of regenerative endodontics. This field is dynamic and exciting with the ability to shape the future of endodontics. This article highlights the fundamental concepts, protocol for treatment, and possible avenues for research in regenerative endodontics. © 2015 Wiley Periodicals, Inc.
Magnetism: Principles and Applications
NASA Astrophysics Data System (ADS)
Craik, Derek J.
2003-09-01
If you are studying physics, chemistry, materials science, electrical engineering, information technology or medicine, then you'll know that understanding magnetism is fundamental to success in your studies, and here is the key to unlocking its mysteries. You can: obtain a simple overview of magnetism, including the roles of B and H, resonances and special techniques; take full advantage of modern magnets with a wealth of expressions for fields and forces; develop realistic general design programmes using isoparametric finite elements; study the subtleties of the general theory of magnetic moments and their dynamics; follow the development of outstanding materials; appreciate how magnetism encompasses topics as diverse as rock magnetism, chemical reaction rates, biological compasses, medical therapies, superconductivity and levitation; and understand the basis and remarkable achievements of magnetic resonance imaging. In his new book, Magnetism, Derek Craik throws light on the principles and applications of this fascinating subject. From formulae for calculating fields to quantum theory, the secrets of magnetism are exposed, ensuring that whether you are a chemist or engineer, physicist, medic or materials scientist, Magnetism is the book for your course.
NASA Astrophysics Data System (ADS)
Zhang, Zhongyang; Nian, Qiong; Doumanidis, Charalabos C.; Liao, Yiliang
2018-02-01
Nanosecond pulsed laser shock processing (LSP) techniques, including laser shock peening, laser peen forming, and laser shock imprinting, have been employed for widespread industrial applications. In these processes, the main beneficial characteristic is the laser-induced shockwave with a high pressure (on the order of GPa), which leads to plastic deformation at an ultrahigh strain rate (10^5-10^6 /s) on the surface of target materials. Although LSP processes have been extensively studied experimentally, little effort has been devoted to elucidating the underlying process mechanisms through a physics-based process model. In particular, the development of a first-principles model is critical for process optimization and novel process design. This work aims at introducing such a theoretical model for a fundamental understanding of process mechanisms in LSP. Emphasis is placed on the laser-matter interaction and plasma dynamics. This model is found to offer capabilities in predicting key parameters including electron and ion temperatures, plasma state variables (temperature, density, and pressure), and the propagation of the laser shockwave. The modeling results were validated by experimental data.
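For orientation only, the widely used confined-ablation scaling of Fabbro et al. relates peak shock pressure to laser intensity and the acoustic impedances of target and confining medium. This is a standard textbook estimate offered as an illustrative sketch, not the first-principles model developed in this work, and the material parameters below are assumptions.

```python
import math

def peak_pressure_gpa(intensity_gw_cm2, z_target, z_confine, alpha=0.1):
    """Peak shock pressure (GPa) from the confined-ablation scaling
    P = 0.01 * sqrt(alpha / (2*alpha + 3)) * sqrt(Z) * sqrt(I0),
    with Z the reduced acoustic impedance in g cm^-2 s^-1, I0 in GW/cm^2,
    and alpha the fraction of plasma energy converted to thermal energy."""
    z_reduced = 2.0 / (1.0 / z_target + 1.0 / z_confine)
    return (0.01 * math.sqrt(alpha / (2 * alpha + 3))
            * math.sqrt(z_reduced) * math.sqrt(intensity_gw_cm2))

# Assumed impedances (g cm^-2 s^-1): aluminium target, water confinement.
p = peak_pressure_gpa(intensity_gw_cm2=4.0, z_target=1.5e6, z_confine=0.165e6)
print(f"estimated peak pressure: {p:.1f} GPa")   # ~2 GPa for these inputs
```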
PACS—Realization of an adaptive concept using pressure actuated cellular structures
NASA Astrophysics Data System (ADS)
Gramüller, B.; Boblenz, J.; Hühne, C.
2014-10-01
A biologically inspired concept is investigated which can be utilized to develop energy-efficient, lightweight and flexibly applicable adaptive structures. Building a real-life morphing unit is an ambitious task, as the numerous works in this field show. Summarizing the fundamental demands on, and barriers to, shape-changing structures, the basic challenges of designing morphing structures are listed. The concept of Pressure Actuated Cellular Structures (PACS) is placed within recent morphing activities, and it is shown that it complies with the underlying demands. Systematically divided into energy-related and structural subcomponents, the working principle is elucidated and relationships between basic design parameters are expressed. The analytical background describing the physical mechanisms of PACS is presented in concentrated form. This work focuses on the procedure of dimensioning, realizing and experimentally testing a single cell and a single-row cantilever made of PACS. The experimental outcomes, as well as the results from the FEM computations, are used to evaluate the analytical methods. The functionality of the basic principle is thus validated, and open issues pointing the way ahead are identified.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martins, C.J.A.P.; Pinho, A.M.M.; Alves, R.F.C.
2015-08-01
Astrophysical tests of the stability of fundamental couplings, such as the fine-structure constant α, are becoming an increasingly powerful probe of new physics. Here we discuss how these measurements, combined with local atomic clock tests and Type Ia supernova and Hubble parameter data, constrain the simplest class of dynamical dark energy models where the same degree of freedom is assumed to provide both the dark energy and (through a dimensionless coupling, ζ, to the electromagnetic sector) the α variation. Specifically, current data tightly constrains a combination of ζ and the present dark energy equation of state w_0. Moreover, in these models the new degree of freedom inevitably couples to nucleons (through the α dependence of their masses) and leads to violations of the Weak Equivalence Principle. We obtain indirect bounds on the Eötvös parameter η that are typically stronger than the current direct ones. We discuss the model-dependence of our results and briefly comment on how the forthcoming generation of high-resolution ultra-stable spectrographs will enable significantly tighter constraints.
Dahmash, Eman Z; Mohammed, Afzal R
2015-01-01
Production of functionalised particles using dry powder coating is a one-step, environmentally friendly process that paves the way for the development of particles with targeted properties and diverse functionalities. By applying first principles of the physical science of powders, fine guest particles can be homogeneously dispersed over the surface of larger host particles to develop functionalised particles. Multiple functionalities can be modified including: flowability, dispersibility, fluidisation, homogeneity, content uniformity and dissolution profile. The current publication seeks to understand the fundamental underpinning principles and science governing the dry coating process, evaluate key technologies developed to produce functionalised particles along with outlining their advantages, limitations and applications, and discusses in detail the resultant functionalities and their applications. Dry particle coating is a promising solvent-free manufacturing technology to produce particles with targeted functionalities. Progress within this area requires the development of continuous processing devices that can overcome challenges encountered with current technologies such as heat generation and particle attrition. Growth within this field requires extensive research to further understand the impact of process design and material properties on resultant functionalities.
How not to criticize the precautionary principle.
Hughes, Jonathan
2006-10-01
The precautionary principle has its origins in debates about environmental policy, but is increasingly invoked in bioethical contexts. John Harris and Søren Holm argue that the principle should be rejected as incoherent, irrational, and representing a fundamental threat to scientific advance and technological progress. This article argues that while there are problems with standard formulations of the principle, Harris and Holm's rejection of all its forms is mistaken. In particular, they focus on strong versions of the principle and fail to recognize that weaker forms, which may escape their criticisms, are both possible and advocated in the literature.
Fundamental movement skills and habitual physical activity in young children.
Fisher, Abigail; Reilly, John J; Kelly, Louise A; Montgomery, Colette; Williamson, Avril; Paton, James Y; Grant, Stan
2005-04-01
To test for relationships between objectively measured habitual physical activity and fundamental movement skills in a relatively large and representative sample of preschool children. Physical activity was measured over 6 d using the Computer Science and Applications (CSA) accelerometer in 394 boys and girls (mean age 4.2, SD 0.5 yr). Children were scored on 15 fundamental movement skills, based on the Movement Assessment Battery, by a single observer. Total physical activity (r=0.10, P<0.05) and percent time spent in moderate to vigorous physical activity (MVPA) (r=0.18, P<0.001) were significantly correlated with total movement skills score. Time spent in light-intensity physical activity was not significantly correlated with motor skills score (r=0.02, P>0.05). In this sample and setting, fundamental movement skills were significantly associated with habitual physical activity, but the association between the two variables was weak. The present study questions whether the widely assumed relationships between motor skills and habitual physical activity actually exist in young children.
Environmental Science. An Experimental Programme for Primary Teachers.
ERIC Educational Resources Information Center
Linke, R. D.
An experimental course covering some of the fundamental principles and terminology associated with environmental science and the application of these principles to various contemporary problems is summarized in this report. The course involved a series of lectures together with a program of specific seminar and discussion topics presented by the…
Institutional Ethics and Values. The Fundamentals. Board Basics.
ERIC Educational Resources Information Center
Corts, Thomas E.
1998-01-01
This booklet offers trustees of institutions of higher education a guide to the ethical principles upon which the institution should base its decisions. An introductory section offers actual examples of unethical decisions by single officials for which the institution was responsible. The following sections each explain a principle and offer…
Epistemic Autonomy: A Criterion for Virtue?
ERIC Educational Resources Information Center
Mudd, Sasha
2013-01-01
Catherine Elgin proposes a novel principle for identifying epistemic virtue. Based loosely on Kant's Categorical Imperative, it identifies autonomy as our fundamental epistemic responsibility, and defines the epistemic virtues as those traits of character needed to exercise epistemic autonomy. I argue that Elgin's principle fails as a…
ERIC Educational Resources Information Center
Turner, Kenneth; Tevaarwerk, Emma; Unterman, Nathan; Grdinic, Marcel; Campbell, Jason; Chandrasekhar, Venkat; Chang, R. P. H.
2006-01-01
Nanoscience refers to the fundamental study of scientific phenomena, which occur at the nanoscale--nanotechnology to the exploitation of novel properties and functions of materials in the sub-100 nm size range. One of the underlying principles of science is development of models of observed phenomena. In biology, the Hardy-Weinberg principle is a…
ERIC Educational Resources Information Center
Ellis Associates, Inc., College Park, MD.
This training package is designed to present the basic principles of pesticide use, handling, and application. Included in this package is information on federal laws and regulations, personal safety, environmental implications, storage and disposal considerations, proper application procedures, and fundamentals of pest management. Successful…
ERIC Educational Resources Information Center
Ellis Associates, Inc., College Park, MD.
The training package is designed to present the basic principles of pesticide use, handling, and application. Included in this package is information on Federal laws and regulations, personal safety, environmental implications, storage and disposal considerations, proper application procedures, and fundamentals of pest management. Successful…
Basic Comfort Heating Principles.
ERIC Educational Resources Information Center
Dempster, Chalmer T.
The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background…
This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...
Development of Canonical Transformations from Hamilton's Principle.
ERIC Educational Resources Information Center
Quade, C. Richard
1979-01-01
The theory of canonical transformations and its development are discussed with regard to its application to Hamilton's principle. Included are the derivation of the equations of motion, a lack of symmetry in the formulation with respect to the Lagrangian, and the fundamental commutator relations of quantum mechanics. (Author/SA)
How Do Students in an Innovative Principle-Based Mechanics Course Understand Energy Concepts?
ERIC Educational Resources Information Center
Ding, Lin; Chabay, Ruth; Sherwood, Bruce
2013-01-01
We investigated students' conceptual learning of energy topics in an innovative college-level introductory mechanics course, entitled Matter & Interactions (M&I) Modern Mechanics. This course differs from traditional curricula in that it emphasizes application of a small number of fundamental principles across various scales, involving…
Basic principles of variable speed drives
NASA Technical Reports Server (NTRS)
Loewenthal, S. H.
1973-01-01
The principles that govern variable speed drive operation, an understanding of which is needed for successful drive application, are discussed. The fundamental factors of torque, speed ratio, and power as they relate to drive selection are examined. The basic types of variable speed drives, their operating characteristics and their applications are also presented.
Providing Experiential Business and Management Training for Biomedical Research Trainees
ERIC Educational Resources Information Center
Petrie, Kimberly A.; Carnahan, Robert H.; Brown, Abigail M.; Gould, Kathleen L.
2017-01-01
Many biomedical PhD trainees lack exposure to business principles, which limits their competitiveness and effectiveness in academic and industry careers. To fill this training gap, we developed Business and Management Principles for Scientists, a semester-long program that combined didactic exposure to business fundamentals with practical…
Experimental simulation of the Unruh effect on an NMR quantum simulator
NASA Astrophysics Data System (ADS)
Jin, FangZhou; Chen, HongWei; Rong, Xing; Zhou, Hui; Shi, MingJun; Zhang, Qi; Ju, ChenYong; Cai, YiFu; Luo, ShunLong; Peng, XinHua; Du, JiangFeng
2016-03-01
The Unruh effect is one of the most fundamental manifestations of the fact that the particle content of a field theory is observer dependent. However, so far there has been no experimental verification of this effect, as the associated temperatures lie far below any observable threshold. Recently, physical phenomena that pose great experimental challenges have been investigated by quantum simulation in various fields. Here we perform a proof-of-principle simulation of the evolution of fermionic modes under the Unruh effect with a nuclear magnetic resonance (NMR) quantum simulator. Using the quantum simulator, we experimentally demonstrate the behavior of the Unruh temperature with acceleration, and we further investigate the quantum correlations, quantified by quantum discord, between two fermionic modes as seen by two relatively accelerated observers. It is shown that quantum correlations can be created by the Unruh effect from classically correlated states. Our work may provide a promising way to explore the quantum physics of accelerated systems.
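The reason direct detection is so hard can be seen from the Unruh temperature formula T = ħa/(2πck_B); a short illustrative calculation (accelerations chosen for illustration, not taken from the experiment):

```python
import math

# Unruh temperature T = hbar * a / (2 * pi * c * k_B): the thermal temperature
# a uniformly accelerated observer attributes to the Minkowski vacuum.
hbar = 1.054571817e-34   # J s
c = 2.99792458e8         # m/s
k_B = 1.380649e-23       # J/K

def unruh_temperature(acceleration):
    return hbar * acceleration / (2 * math.pi * c * k_B)

for a in (9.81, 1e20):   # Earth gravity vs. an extreme illustrative value
    print(f"a = {a:.2e} m/s^2  ->  T_Unruh = {unruh_temperature(a):.2e} K")
```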
Generalized Knudsen Number for Oscillatory Flows Generated by MEMS and NEMS Resonators
NASA Astrophysics Data System (ADS)
Ekinci, Kamil; Kara, Vural; Yakhot, Victor
2017-11-01
We have explored the scaling behavior of oscillatory flows that are generated by the oscillations of MEMS and NEMS resonators in a gas. If the gas is gradually rarefied, the Navier-Stokes equations begin to fail and a kinetic description of the flow becomes more appropriate. The failure of the Navier-Stokes equations can be thought to take place via two different physical mechanisms: either the continuum hypothesis breaks down as a result of a finite size effect; or local equilibrium is violated due to the high rate of strain. By independently tuning the relevant linear dimensions and the frequencies of the MEMS and NEMS resonators, we experimentally observe these two different physical mechanisms. All the experimental data, however, can be collapsed using a single dimensionless scaling parameter that combines the linear dimension and the frequency of each resonator. This proposed Knudsen number for oscillatory flows is rooted in a fundamental symmetry principle, namely Galilean invariance. We acknowledge support from US NSF through Grant No. CBET-1604075.
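The two classical dimensionless groups whose interplay the abstract describes are the Knudsen number Kn = λ/L (finite-size rarefaction) and the Weissenberg number Wi = ωτ (loss of local equilibrium at high oscillation frequency). The sketch below computes only these two standard groups; the single combined scaling parameter proposed in the work is not reproduced here, and the device dimensions and frequency are assumed illustrative NEMS-like values.

```python
import math

mean_free_path = 68e-9        # m, air at atmospheric pressure (approx.)
relaxation_time = 1.5e-10     # s, momentum relaxation time of air (approx.)

L = 300e-9                    # m, assumed resonator width
f = 25e6                      # Hz, assumed resonance frequency

Kn = mean_free_path / L       # continuum hypothesis strained as Kn -> 1
Wi = 2 * math.pi * f * relaxation_time   # local equilibrium strained as Wi -> 1
print(f"Kn = {Kn:.2f}, Wi = {Wi:.3f}")
```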
Gamma-ray lines from neutron stars as probes of fundamental physics
NASA Technical Reports Server (NTRS)
Brecher, K.
1978-01-01
The detection of gamma-ray lines produced at the surface of neutron stars will serve to test both the strong and gravitational interactions under conditions unavailable in terrestrial laboratories. Observation of a single redshifted gamma-ray line, combined with an estimate of the mass of the star will serve as a strong constraint on allowable equations of state of matter at supernuclear densities. Detection of two redshifted lines arising from different physical processes at the neutron star surface can provide a test of the strong principle of equivalence. Expected fluxes of nuclear gamma-ray lines from accreting neutron stars were calculated, including threshold, radiative transfer and redshift effects. The most promising probes of neutron star structure are the deuterium formation line and the positron annihilation line. Detection of sharp redshifted gamma-ray lines from X-ray sources such as Cyg X-1 would argue strongly in favor of a neutron star rather than black hole identification for the object.
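A short sketch of the redshift constraint described above: the gravitational redshift of a surface line, 1 + z = (1 - 2GM/Rc^2)^(-1/2), ties the observed line energy to the stellar compactness. The mass and radius below are canonical assumed values, not measurements.

```python
import math

G = 6.674e-11        # m^3 kg^-1 s^-2
c = 2.998e8          # m/s
M_sun = 1.989e30     # kg

M = 1.4 * M_sun      # assumed neutron-star mass
R = 12e3             # m, assumed radius

# Surface gravitational redshift and the observed energy of the
# deuterium-formation line emitted at 2.223 MeV at the surface.
z = 1.0 / math.sqrt(1.0 - 2 * G * M / (R * c * c)) - 1.0
E_emitted = 2.223    # MeV
print(f"z = {z:.3f}; observed line energy ~ {E_emitted / (1 + z):.3f} MeV")
```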
Barrett, Harrison H; Myers, Kyle J; Caucci, Luca
2014-08-17
A fundamental way of describing a photon-limited imaging system is in terms of a Poisson random process in spatial, angular and wavelength variables. The mean of this random process is the spectral radiance. The principle of conservation of radiance then allows a full characterization of the noise in the image (conditional on viewing a specified object). To elucidate these connections, we first review the definitions and basic properties of radiance as defined in terms of geometrical optics, radiology, physical optics and quantum optics. The propagation and conservation laws for radiance in each of these domains are reviewed. Then we distinguish four categories of imaging detectors that all respond in some way to the incident radiance, including the new category of photon-processing detectors. The relation between the radiance and the statistical properties of the detector output is discussed and related to task-based measures of image quality and the information content of a single detected photon.
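A minimal sketch of the Poisson description of a photon-limited image follows; the mean-count map is an arbitrary placeholder standing in for the radiance integrated over each pixel's étendue, exposure time and spectral band.

```python
import numpy as np

# Each pixel registers a Poisson-distributed count whose mean is set by the
# spectral radiance integrated over the pixel's collection geometry and
# exposure. One noisy realization, plus the Poisson signal-to-noise ratio.
rng = np.random.default_rng(1)

mean_counts = np.array([[ 5.0,  20.0],
                        [80.0, 320.0]])        # assumed mean photons per pixel
image = rng.poisson(mean_counts)               # one noisy realization

snr = np.sqrt(mean_counts)                     # Poisson SNR = mean / sqrt(mean)
print("counts:\n", image)
print("per-pixel SNR:\n", snr)
```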
Quantum Cause of Gravity Waves and Dark Matter
NASA Astrophysics Data System (ADS)
Goradia, Shantilal; Goradia Team
2016-09-01
Per Einstein's theory, mass tells space how to curve and space tells mass how to move. How do they "tell"? The question boils down to information created by quantum particles blinking ON and OFF, analogous to 'Yin and Yang', or in some more complex ways that may include dark matter. If not, what creates the curvature of space-time? Consciousness, dark matter, quantum physics, the uncertainty principle, constants of nature like the strong coupling, the fine structure constant, the cosmological constant introduced by Einstein, information, gravitation, etc. are fundamentally consequences of that ONE TOE. Vedic philosophers, who impressed Schrödinger so much, called it ATMA, split into the categories of AnuAtma (particle soul), JivAtma (life soul) and ParamAtma (omnipresent soul), which we relate to quantum physics, biology and cosmology. There is no separate TOE for any one thing. The long-range relativistic propagations of the strong and weak couplings of the microscopic black holes are just gravity waves. What else could they be?
Modern Aspects of Liquid Metal Engineering
NASA Astrophysics Data System (ADS)
Czerwinski, Frank
2017-02-01
Liquid metal engineering (LME) refers to a variety of physical and/or chemical treatments of molten metals aimed at influencing their solidification characteristics. Although the fundamentals have been known for decades, only recent progress in understanding solidification mechanisms has renewed an interest in opportunities this technique creates for an improvement of castings. This review covers conventional and novel concepts of LME with their application to modern manufacturing techniques based not only on liquid but also on semisolid routes. The role of external forces applied to the melt combined with grain nucleation control is explained along with laboratory- and commercial-scale equipment designed for implementation of various concepts exploring mechanical, electromagnetic, and ultrasound principles. An influence of melt treatments on quality of the final product is considered through distinguishing between internal integrity of net shape components and the alloy microstructure. Recent global developments indicate that exploring the synergy of melt chemistry and physical treatments achieved through LME allows creating the optimum conditions for nucleation and growth during solidification, positively affecting quality of castings.
The physical mechanisms of complete denture retention.
Darvell, B W; Clark, R K
2000-09-09
The purpose of this article is to assist the practitioner to understand which factors are relevant to complete denture retention in the light of the current understanding of physics and materials science and thus to guide design. Atmospheric pressure, vacuum, adhesion, cohesion, surface tension, viscosity, base adaption, border seal, seating force and muscular control have all been cited at one time or another as major or contributory factors, but usually as an opinion without proper reference to fundamental principles. Although there has been a detailed analysis published, it seems appropriate that a restatement of the points in a collated form be made. In fact, denture retention is a dynamic issue dependent on the control of the flow of interposed fluid and thus its viscosity and film thickness, while the timescale of displacement loading affects the assessment. Surface tension forces at the periphery contribute to retention, but the most important concerns are good base adaptation and border seal. These must be achieved if full advantage is to be taken of the saliva flow-related effects.
Chapelle, D; Fragu, M; Mallet, V; Moireau, P
2013-11-01
We present the fundamental principles of data assimilation underlying the Verdandi library, and how they are articulated with the modular architecture of the library. This translates--in particular--into the definition of standardized interfaces through which the data assimilation library interoperates with the model simulation software and the so-called observation manager. We also survey various examples of data assimilation applied to the personalization of biophysical models, in particular, for cardiac modeling applications within the euHeart European project. This illustrates the power of data assimilation concepts in such novel applications, with tremendous potential in clinical diagnosis assistance.
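A minimal sketch of the sequential (Kalman-type) estimation step that underlies such data assimilation libraries is given below, written as generic Python rather than Verdandi's actual C++/Python API; the state, model operator, observation operator and noise covariances are all illustrative.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One generic predict/update cycle of sequential data assimilation:
    propagate the model state, then correct it with an observation."""
    # Prediction with the (linearized) model operator F
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Correction with the observation operator H and measurement z
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy example: estimating a 2-component state from a scalar observation.
x = np.array([0.0, 1.0]); P = np.eye(2)
F = np.array([[1.0, 0.1], [0.0, 1.0]]); H = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2); R = np.array([[0.1]])
x, P = kalman_step(x, P, z=np.array([0.2]), F=F, H=H, Q=Q, R=R)
print("updated state estimate:", x)
```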
A defense of fundamental principles and human rights: a reply to Robert Baker.
Macklin, Ruth
1998-12-01
This article seeks to rebut Robert Baker's contention that attempts to ground international bioethics in fundamental principles cannot withstand the challenges posed by multiculturalism and postmodernism. First, several corrections are provided of Baker's account of the conclusions reached by the Advisory Committee on Human Radiation Experiments. Second, a rebuttal is offered to Baker's claim that an unbridgeable moral gap exists between Western individualism and non-Western communalism. In conclusion, this article argues that Baker's "nonnegotiable primary goods" cannot do the work of "classical human rights" and that the latter framework is preferable from both a practical and a theoretical standpoint.
Fundamental principles in periodontal plastic surgery and mucosal augmentation--a narrative review.
Burkhardt, Rino; Lang, Niklaus P
2014-04-01
To provide a narrative review of the current literature elaborating on fundamental principles of periodontal plastic surgical procedures. Based on a presumptive outline of the narrative review, MESH terms have been used to search the relevant literature electronically in the PubMed and Cochrane Collaboration databases. If possible, systematic reviews were included. The review is divided into three phases associated with periodontal plastic surgery: a) pre-operative phase, b) surgical procedures and c) post-surgical care. The surgical procedures were discussed in the light of a) flap design and preparation, b) flap mobilization and c) flap adaptation and stabilization. Pre-operative paradigms include the optimal plaque control and smoking counselling. Fundamental principles in surgical procedures address basic knowledge in anatomy and vascularity, leading to novel appropriate flap designs with papilla preservation. Flap mobilization based on releasing incisions can be performed up to 5 mm. Flap adaptation and stabilization depend on appropriate wound bed characteristics, undisturbed blood clot formation, revascularization and wound stability through adequate suturing. Delicate tissue handling and tension free wound closure represent prerequisites for optimal healing outcomes. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Autonomous intelligent cars: proof that the EPSRC Principles are future-proof
NASA Astrophysics Data System (ADS)
de Cock Buning, Madeleine; de Bruin, Roeland
2017-07-01
Principle 2 of the EPSRC's principles of robotics (AISB workshop on Principles of Robotics, 2016) proves to be future-proof when applied to the current state of the art of law and technology surrounding autonomous intelligent cars (AICs). Humans, not AICs, are responsible agents. AICs should be designed and operated, as far as is practicable, to comply with existing laws and fundamental rights and freedoms, including privacy by design. The paper shows that some legal questions arising from autonomous intelligent driving technology can be answered by the technology itself.
Safe use of cellular telephones in hospitals: fundamental principles and case studies.
Cohen, Ted; Ellis, Willard S; Morrissey, Joseph J; Bakuzonis, Craig; David, Yadin; Paperman, W David
2005-01-01
Many industries and individuals have embraced cellular telephones. They provide mobile, synchronous communication, which could hypothetically increase the efficiency and safety of inpatient healthcare. However, reports of early analog cellular telephones interfering with critical life-support machines had led many hospitals to strictly prohibit cellular telephones. A literature search revealed that individual hospitals now are allowing cellular telephone use with various policies to prevent electromagnetic interference with medical devices. The fundamental principles underlying electromagnetic interference are immunity, frequency, modulation technology, distance, and power. Electromagnetic interference risk mitigation methods based on these principles have been successfully implemented. In one case study, a minimum distance between cellular telephones and medical devices is maintained, with restrictions in critical areas. In another case study, cellular telephone coverage is augmented to automatically control the power of the cellular telephone. While no uniform safety standard yet exists, cellular telephones can be safely used in hospitals when their use is managed carefully.
Detection principles of biological and chemical FET sensors.
Kaisti, Matti
2017-12-15
The seminal importance of detecting ions and molecules for point-of-care tests has driven the search for more sensitive, specific, and robust sensors. Electronic detection holds promise for future miniaturized in-situ applications and can be integrated into existing electronic manufacturing processes and technology. The resulting small devices will be inherently well suited for multiplexed and parallel detection. In this review, different field-effect transistor (FET) structures and detection principles are discussed, including label-free and indirect detection mechanisms. The fundamental detection principle governing every potentiometric sensor is introduced, and different state-of-the-art FET sensor structures are reviewed. This is followed by an analysis of electrolyte interfaces and their influence on sensor operation. Finally, the fundamentals of different detection mechanisms are reviewed and some detection schemes are discussed. In the conclusion, current commercial efforts are briefly considered. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
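The fundamental detection principle of any potentiometric sensor sets a thermodynamic ceiling on sensitivity: the Nernstian slope of 2.303RT/(zF), about 59 mV per decade of ion activity for a monovalent ion at room temperature. A short illustrative calculation (not tied to any specific FET structure from the review):

```python
import math

R = 8.314462      # J mol^-1 K^-1
F = 96485.332     # C mol^-1

def nernst_slope_mV_per_decade(T=298.15, z=1):
    # Ideal potentiometric sensitivity: 2.303*R*T/(z*F) volts per decade.
    return 1000.0 * math.log(10) * R * T / (z * F)

print(f"ideal slope at 25 C: {nernst_slope_mV_per_decade():.1f} mV/decade")
# Real ISFET-type sensors often show sub-Nernstian response; the ideal value
# is an upper bound set by thermodynamics, not a device specification.
```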
Electric current activated/assisted sintering (ECAS): a review of patents 1906–2008
Grasso, Salvatore; Sakka, Yoshio; Maizza, Giovanni
2009-01-01
The electric current activated/assisted sintering (ECAS) is an ever-growing class of versatile techniques for sintering particulate materials. Despite the tremendous advances over the last two decades in ECASed materials and products, there is a lack of comprehensive reviews on ECAS apparatuses and methods. This paper fills the gap by tracing the progress of ECAS technology from 1906 to 2008 and surveys 642 ECAS patents published over more than a century. It is found that the ECAS technology was pioneered by Bloxam (1906 GB Patent No. 9020), who developed the first resistive sintering apparatus. The patents were searched by keywords or by cross-links and were retrieved from the Japanese Patent Office (342 patents), the United States Patent and Trademark Office (175 patents), the Chinese State Intellectual Property Office of P.R.C. (69 patents) and the World Intellectual Property Organization (12 patents). A subset of 119 (out of 642) ECAS patents on methods and apparatuses was selected and described in detail with respect to their fundamental concepts, physical principles and importance in either present ECAS apparatuses or future ECAS technologies for enhancing efficiency, reliability, repeatability, controllability and productivity. The paper is divided into two parts: the first deals with the basic concepts, features and definitions of basic ECAS, and the second analyzes the auxiliary devices/peripherals. The basic ECAS is classified with reference to discharge time (fast and ultrafast ECAS). The fundamental principles and definitions of ECAS are outlined in accordance with the scientific and patent literature. PMID:27877308
West, Geoffrey B; Brown, James H
2005-05-01
Life is the most complex physical phenomenon in the Universe, manifesting an extraordinary diversity of form and function over an enormous scale from the largest animals and plants to the smallest microbes and subcellular units. Despite this, many of its most fundamental and complex phenomena scale with size in a surprisingly simple fashion. For example, metabolic rate scales as the 3/4-power of mass over 27 orders of magnitude, from molecular and intracellular levels up to the largest organisms. Similarly, time-scales (such as lifespans and growth rates) and sizes (such as bacterial genome lengths, tree heights and mitochondrial densities) scale with exponents that are typically simple powers of 1/4. The universality and simplicity of these relationships suggest that fundamental universal principles underlie much of the coarse-grained generic structure and organisation of living systems. We have proposed a set of principles based on the observation that almost all life is sustained by hierarchical branching networks, which we assume have invariant terminal units, are space-filling and are optimised by the process of natural selection. We show how these general constraints explain quarter power scaling and lead to a quantitative, predictive theory that captures many of the essential features of diverse biological systems. Examples considered include animal circulatory systems, plant vascular systems, growth, mitochondrial densities, and the concept of a universal molecular clock. Temperature considerations, dimensionality and the role of invariants are discussed. Criticisms and controversies associated with this approach are also addressed.
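The allometric relations summarized in this abstract can be written compactly in their standard form (our shorthand, not the paper's notation):

\[
B = B_{0}\,M^{3/4}, \qquad \tau \propto M^{1/4}, \qquad f \propto M^{-1/4},
\]

where \(B\) is metabolic rate, \(M\) body mass, \(\tau\) a characteristic biological time (e.g. lifespan or growth time) and \(f\) a characteristic rate (e.g. heart rate); the common quarter-power exponents are what the branching-network theory sets out to explain.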
NASA Astrophysics Data System (ADS)
Salleh, Khalijah Mohd; Abdullah, Abu Bakar Bin
2008-05-01
An explorative study was carried out to confirm Malaysian Physics teachers' perception that Archimedes' principle is a difficult topic for secondary level students. The interview method was used for data collection. The study sample comprised nine teachers from national secondary schools in Miri, Sarawak. The data was analysed qualitatively using the Atlas-ti version 5.2 software. The findings of the study showed that i) Archimedes' principle as compared to Bernoulli's and Pascal's is the most difficult principle of hydrodynamics for students, ii) more time was given in the teaching and learning (TL) of Archimedes' principle compared to the other two principles, iii) the major TL problems include conceptual understanding, application of physics principles and ideas, and lack of mathematical skills. These findings indicate the need to develop corresponding instructional materials and learning kits that can assist students' understanding of Archimedes' principle.
A Content-Driven Approach to Visual Literacy: Gestalt Rediscovered.
ERIC Educational Resources Information Center
Schamber, Linda
The goal of an introductory graphics course is fundamental visual literacy, which includes learning to appreciate the power of visuals in communication and to express ideas visually. Traditional principles of design--the focus of the course--are based on more fundamental gestalt theory, which relates to human pattern-seeking behavior, particularly…
On-Line Syntax: Thoughts on the Temporality of Spoken Language
ERIC Educational Resources Information Center
Auer, Peter
2009-01-01
One fundamental difference between spoken and written language has to do with the "linearity" of speaking in time, in that the temporal structure of speaking is inherently the outcome of an interactive process between speaker and listener. But despite the status of "linearity" as one of Saussure's fundamental principles, in practice little more…
Religious Fundamentalism among Young Muslims in Egypt and Saudi Arabia
ERIC Educational Resources Information Center
Moaddel, Mansoor; Karabenick, Stuart A.
2008-01-01
Religious fundamentalism is conceived as a distinctive set of beliefs and attitudes toward one's religion, including obedience to religious norms, belief in the universality and immutability of its principles, the validity of its claims, and its indispensability for human happiness. Surveys of Egyptian and Saudi youth, ages 18-25, reveal that…
Introduction to the Control of Electric Motors.
ERIC Educational Resources Information Center
Spencer, Frederick
The fundamentals of electric circuits and electric machines are presented in the text, with an emphasis on the practical operation rather than on mathematical analyses of theories involved. The material contained in the text includes the fundamentals of both D.C. and A.C. circuits together with the principles of magnetism and electro-magnetic…
The Sun to the Earth - and Beyond: A Decadal Research Strategy in Solar and Space Physics
NASA Technical Reports Server (NTRS)
2003-01-01
The sun is the source of energy for life on earth and is the strongest modulator of the human physical environment. In fact, the Sun's influence extends throughout the solar system, both through photons, which provide heat, light, and ionization, and through the continuous outflow of a magnetized, supersonic ionized gas known as the solar wind. While the accomplishments of the past decade have answered important questions about the physics of the Sun, the interplanetary medium, and the space environments of Earth and other solar system bodies, they have also highlighted other questions, some of which are long-standing and fundamental. The Sun to the Earth--and Beyond organizes these questions in terms of five challenges that are expected to be the focus of scientific investigations in solar and space physics during the coming decade and beyond: Challenge 1: Understanding the structure and dynamics of the Sun's interior, the generation of solar magnetic fields, the origin of the solar cycle, the causes of solar activity, and the structure and dynamics of the corona. Challenge 2: Understanding heliospheric structure, the distribution of magnetic fields and matter throughout the solar system, and the interaction of the solar atmosphere with the local interstellar medium. Challenge 3: Understanding the space environments of Earth and other solar system bodies and their dynamical response to external and internal influences. Challenge 4: Understanding the basic physical principles manifest in processes observed in solar and space plasmas. Challenge 5: Developing a near-real-time predictive capability for understanding and quantifying the impact on human activities of dynamical processes at the Sun, in the interplanetary medium, and in Earth's magnetosphere and ionosphere. This report summarizes the state of knowledge about the total heliospheric system, poses key scientific questions for further research, and presents an integrated research strategy, with prioritized initiatives, for the next decade. The recommended strategy embraces both basic research programs and targeted basic research activities that will enhance knowledge and prediction of space weather effects on Earth. The report emphasizes the importance of understanding the Sun, the heliosphere, and planetary magnetospheres and ionospheres as astrophysical objects and as laboratories for the investigation of fundamental plasma physics phenomena.
Applying Universal Design for Learning in Online Courses: Pedagogical and Practical Considerations
ERIC Educational Resources Information Center
Dell, Cindy Ann; Dell, Thomas F.; Blackwell, Terry L.
2015-01-01
Inclusion of the universal design for learning (UDL) model as a guiding set of principles for online curriculum development in higher education is discussed. Fundamentally, UDL provides the student with multiple means of accessing the course based on three overarching principles: presentation; action and expression; and engagement and interaction.…
Devising Principles of Design for Numeracy Tasks
ERIC Educational Resources Information Center
Geiger, Vince; Forgasz, Helen; Goos, Merrilyn; Bennison, Anne
2014-01-01
Numeracy is a fundamental component of the Australian National Curriculum as a General Capability identified in each F-10 subject. In this paper, we consider the principles of design necessary for the development of numeracy tasks specific to subjects other than mathematics--in this case, the subject of English. We explore the nature of potential…
Invisible Ink Revealed: Concept, Context, and Chemical Principles of "Cold War" Writing
ERIC Educational Resources Information Center
Macrakis, Kristie; Bell, Elizabeth K.; Perry, Dale L.; Sweeder, Ryan D.
2012-01-01
By modifying secret writing formulas uncovered from the archives of the East German Ministry of State Security (MfS or Stasi), a novel general chemistry secret writing laboratory was developed. The laboratory combines science and history that highlights several fundamental chemical principles related to the writing. These include catalysis, redox…
Daniel J. Fairbanks
2001-01-01
In 1866, Gregor Mendel published his experiments on heredity in the garden pea (Pisum sativum). The fundamental principles of inheritance derived from his work apply to nearly all eukaryotic species and are now known as Mendelian principles. Since 1900, Mendel has been recognized as the founder of genetics. In 1900, three botanists, Carl Correns, Hugo De Vries, and...
ERIC Educational Resources Information Center
Barroso, Luciana R.; Morgan, James R.
2012-01-01
This paper describes the creation and evolution of an undergraduate dynamics and vibrations course for civil engineering students. Incorporating vibrations into the course allows students to see and study "real" civil engineering applications of the course content. This connection of academic principles to real life situations is in…
The Didactic Principles and Their Applications in the Didactic Activity
ERIC Educational Resources Information Center
Marius-Costel, Esi
2010-01-01
The evaluation and reevaluation of the fundamental didactic principles suppose the acceptance at the level of an instructive-educative activity of a new educational paradigm. Thus, its understanding implies an assumption at a conceptual-theoretical level of some approaches where the didactic aspects find their usefulness by relating to value…
From Barrier Free to Safe Environments: The New Zealand Experience. Monograph #44.
ERIC Educational Resources Information Center
Wrightson, William; Pope, Campbell
Intrinsically safe design is presented as a logical extension of the principles of barrier free design, and as a higher level design strategy for effecting widespread implementation of the basic accessibility requirements for people with disabilities. Two fundamental planning procedures are proposed: including principles of safe and accessible…
Compression as a Universal Principle of Animal Behavior
ERIC Educational Resources Information Center
Ferrer-i-Cancho, Ramon; Hernández-Fernández, Antoni; Lusseau, David; Agoramoorthy, Govindasamy; Hsu, Minna J.; Semple, Stuart
2013-01-01
A key aim in biology and psychology is to identify fundamental principles underpinning the behavior of animals, including humans. Analyses of human language and the behavior of a range of non-human animal species have provided evidence for a common pattern underlying diverse behavioral phenomena: Words follow Zipf's law of brevity (the…
ERIC Educational Resources Information Center
McIlvane, William J.
2009-01-01
Throughout its history, laboratory research in the experimental analysis of behavior has been successful in elucidating and clarifying basic learning principles and processes in both humans and nonhumans. In parallel, applied behavior analysis has shown how fundamental behavior-analytic principles and procedures can be employed to promote…
Margaritelis, Nikos V; Cobley, James N; Paschalis, Vassilis; Veskoukis, Aristidis S; Theodorou, Anastasios A; Kyparos, Antonios; Nikolaidis, Michalis G
2016-04-01
The equivocal role of reactive species and redox signaling in exercise responses and adaptations is an example clearly showing the inadequacy of current redox biology research to shed light on fundamental biological processes in vivo. Part of the answer probably lies in the extreme complexity of the in vivo redox biology and the limitations of the currently applied methodological and experimental tools. We propose six fundamental principles that should be considered in future studies to mechanistically link reactive species production to exercise responses or adaptations: 1) identify and quantify the reactive species, 2) determine the potential signaling properties of the reactive species, 3) detect the sources of reactive species, 4) locate the domain modified and verify the (ir)reversibility of post-translational modifications, 5) establish causality between redox and physiological measurements, and 6) use selective and targeted antioxidants. Fulfilling these principles requires an idealized human experimental setting, which is certainly a utopia. Thus, researchers should choose to satisfy those principles, which, based on scientific evidence, are most critical for their specific research question. Copyright © 2015 Elsevier Inc. All rights reserved.
The Fundamental Neutron Physics Facilities at NIST.
Nico, J S; Arif, M; Dewey, M S; Gentile, T R; Gilliam, D M; Huffman, P R; Jacobson, D L; Thompson, A K
2005-01-01
The program in fundamental neutron physics at the National Institute of Standards and Technology (NIST) began nearly two decades ago. The Neutron Interactions and Dosimetry Group currently maintains four neutron beam lines dedicated to studies of fundamental neutron interactions. The neutrons are provided by the NIST Center for Neutron Research, a national user facility for studies that include condensed matter physics, materials science, nuclear chemistry, and biological science. The beam lines for fundamental physics experiments include a high-intensity polychromatic beam, a 0.496 nm monochromatic beam, a 0.89 nm monochromatic beam, and a neutron interferometer and optics facility. This paper discusses some of the parameters of the beam lines along with brief presentations of some of the experiments performed at the facilities.
Measuring Motor Skill Learning--A Practical Application
ERIC Educational Resources Information Center
Kovacs, Christopher R.
2008-01-01
The assessment of fundamental motor skills in early learners is critical to the overall well-being and physical development of the students within the physical education setting. Olrich (2002) has suggested that any physical education program must be designed to assess both measures of physical fitness and fundamental motor skills in all students.…
The principles of teratology: are they still true?
Friedman, Jan M
2010-10-01
James Wilson originally proposed a set of "Principles of Teratology" in 1959, the year before he helped to found the Teratology Society. By 1977, when these Principles were presented in a more definitive form in Wilson and Fraser's Handbook of Teratology, they had become a standard formulation of the basic tenets of the field. Wilson's Principles have continued to guide scientific research in teratology, and they are widely used in teaching. Recent advances in our knowledge of the molecular and cellular bases of embryogenesis serve only to provide a deeper understanding of the fundamental developmental mechanisms that underlie Wilson's Principles of Teratology. © 2010 Wiley-Liss, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Energy Literacy: Essential Principles and Fundamental Concepts for Energy Education presents energy concepts that, if understood and applied, will help individuals and communities make informed energy decisions.
On Ruch's Principle of Decreasing Mixing Distance in classical statistical physics
NASA Astrophysics Data System (ADS)
Busch, Paul; Quadt, Ralf
1990-10-01
Ruch's Principle of Decreasing Mixing Distance is reviewed as a statistical physical principle, and its basic supporting result and geometric interpretation, the Ruch-Schranner-Seligman theorem, is generalized so as to apply to a large representative class of classical statistical systems.
NASA Astrophysics Data System (ADS)
Yakovlev, A. A.; Sorokin, V. S.; Mishustina, S. N.; Proidakova, N. V.; Postupaeva, S. G.
2017-01-01
The article describes a new method for the search design of refrigerating systems, which is based on a graph model of the physical operating principle built on a thermodynamic description of the physical processes. The mathematical model of the physical operating principle has been substantiated, and the basic abstract theorems concerning the semantic load assigned to the nodes and edges of the graph have been presented. The necessity and sufficiency of the physical operating principle for the given model and the considered device class were demonstrated by the example of a vapour-compression refrigerating plant. The example of obtaining a multitude of engineering solutions for a vapour-compression refrigerating plant has been considered.
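The abstract does not reproduce the graph formalism itself. As a purely illustrative sketch of the general idea (component names, attribute keys and process labels below are our own assumptions, not the authors' notation), a vapour-compression cycle can be encoded as a small directed graph whose edges carry a coarse thermodynamic label, i.e. the "semantic load" placed on graph elements:

# Illustrative only: vapour-compression cycle as a directed graph with
# thermodynamic annotations on the edges (the "semantic load").
cycle = {
    "compressor":      {"to": "condenser",       "process": "adiabatic compression (P up, h up)"},
    "condenser":       {"to": "expansion_valve", "process": "isobaric heat rejection (h down)"},
    "expansion_valve": {"to": "evaporator",      "process": "isenthalpic throttling (P down)"},
    "evaporator":      {"to": "compressor",      "process": "isobaric heat absorption (h up)"},
}

for node, edge in cycle.items():
    print(f"{node} -> {edge['to']}: {edge['process']}")

Search design would then amount to enumerating and transforming such graphs under the constraints imposed on their nodes and edges.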
Simulation and Experimentation in an Astronomy Laboratory, Part II
NASA Astrophysics Data System (ADS)
Maloney, F. P.; Maurone, P. A.; Hones, M.
1995-12-01
The availability of low-cost, high-performance computing hardware and software has transformed the manner by which astronomical concepts can be re-discovered and explored in a laboratory that accompanies an astronomy course for non-scientist students. We report on a strategy for allowing each student to understand fundamental scientific principles by interactively confronting astronomical and physical phenomena, through direct observation and by computer simulation. Direct observation of physical phenomena, such as Hooke's Law, begins by using a computer and hardware interface as a data-collection and presentation tool. In this way, the student is encouraged to explore the physical conditions of the experiment and re-discover the fundamentals involved. The hardware frees the student from the tedium of manual data collection and presentation, and permits experimental design which utilizes data that would otherwise be too fleeting, too imprecise, or too voluminous. Computer simulation of astronomical phenomena allows the student to travel in time and space, freed from the vagaries of weather, to re-discover such phenomena as the daily and yearly cycles, the reason for the seasons, the saros, and Kepler's Laws. By integrating the knowledge gained by experimentation and simulation, the student can understand both the scientific concepts and the methods by which they are discovered and explored. Further, students are encouraged to place these discoveries in an historical context, by discovering, for example, the night sky as seen by the survivors of the sinking Titanic, or Halley's comet as depicted on the Bayeux tapestry. We report on the continuing development of these laboratory experiments. Further details and the text for the experiments are available at the following site: http://astro4.ast.vill.edu/ This work is supported by a grant from The Pew Charitable Trusts.
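As an example of the kind of computer-assisted data reduction described above (the measurements below are invented for illustration and are not from the course materials), a Hooke's-Law data set collected through such an interface reduces to a spring constant with a one-line least-squares fit:

import numpy as np

# Hypothetical measurements: applied force (N) versus spring extension (m).
force = np.array([0.0, 0.49, 0.98, 1.47, 1.96])          # weights of 0-200 g in 50 g steps
extension = np.array([0.000, 0.012, 0.025, 0.036, 0.049])

k, intercept = np.polyfit(extension, force, 1)            # linear model F = k*x + b
print(f"spring constant k ≈ {k:.1f} N/m")

Automating this step is what frees the student to focus on the physical relationship rather than on tabulating readings by hand.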
Intense fusion neutron sources
NASA Astrophysics Data System (ADS)
Kuteev, B. V.; Goncharov, P. R.; Sergeev, V. Yu.; Khripunov, V. I.
2010-04-01
The review describes physical principles underlying efficient production of free neutrons, up-to-date possibilities and prospects of creating fission and fusion neutron sources with intensities of 10^15-10^21 neutrons/s, and schemes of production and application of neutrons in fusion-fission hybrid systems. The physical processes and parameters of high-temperature plasmas are considered at which optimal conditions for producing the largest number of fusion neutrons in systems with magnetic and inertial plasma confinement are achieved. The proposed plasma methods for neutron production are compared with other methods based on fusion reactions in nonplasma media, fission reactions, spallation, and muon catalysis. At present, intense neutron fluxes are mainly used in nanotechnology, biotechnology, materials science, and military and fundamental research. In the near future (10-20 years), it will be possible to apply high-power neutron sources in fusion-fission hybrid systems for producing hydrogen, electric power, and technological heat, as well as for manufacturing synthetic nuclear fuel and closing the nuclear fuel cycle. Neutron sources with intensities approaching 10^20 neutrons/s may radically change the structure of power industry and considerably influence the fundamental and applied science and innovation technologies. Along with utilizing the energy produced in fusion reactions, the achievement of such high neutron intensities may stimulate wide application of subcritical fast nuclear reactors controlled by neutron sources. Superpower neutron sources will allow one to solve many problems of neutron diagnostics, monitor nano- and biological objects, and carry out radiation testing and modification of volumetric properties of materials at the industrial level. Such sources will considerably (up to 100 times) improve the accuracy of neutron physics experiments and will provide a better understanding of the structure of matter, including that of the neutron itself.
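The plasma-based neutron production discussed in the review rests on the deuterium-tritium reaction and the standard volumetric reaction-rate estimate; these are textbook relations added here for context, not formulas taken from the paper:

\[
\mathrm{D} + \mathrm{T} \;\to\; {}^{4}\mathrm{He}\,(3.5\ \text{MeV}) + n\,(14.1\ \text{MeV}),
\qquad
S = n_{\mathrm D}\, n_{\mathrm T}\, \langle \sigma v \rangle\, V \ \ [\text{neutrons/s}],
\]

where \(n_{\mathrm D}\) and \(n_{\mathrm T}\) are the reactant densities, \(\langle\sigma v\rangle\) the Maxwellian-averaged reactivity and \(V\) the plasma volume; optimizing density, temperature and confinement to maximize \(S\) is what the cited intensity targets of \(10^{15}\)-\(10^{21}\) neutrons/s amount to.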
Contemporary Physics Education Project - CPEP
Chart topics: Fundamental Particles; Plasma Physics & Fusion; History & Fate of the Universe; Nuclear Science. The charts present the current understanding of the fundamental nature of matter and energy, incorporating the major research…
Selecting the correct solution to a physics problem when given several possibilities
NASA Astrophysics Data System (ADS)
Richards, Evan Thomas
Despite decades of research on what learning actions are associated with effective learners (Palincsar and Brown, 1984; Atkinson, et al., 2000), the literature has not fully addressed how to cue those actions (particularly within the realm of physics). Recent reforms that integrate incorrect solutions suggest a possible avenue to reach those actions. However, there is only a limited understanding as to what actions are invoked with such reforms (Grosse and Renkl, 2007). This paper reports on a study that tasked participants with selecting the correct solution to a physics problem when given three possible solutions, where only one of the solutions was correct and the other two solutions contained errors. Think aloud protocol data (Ericsson and Simon, 1993) was analyzed per a framework adapted from Palincsar and Brown (1984). Cued actions were indeed connected to those identified in the worked example literature. Particularly satisfying is the presence of internal consistency checks (i.e., are the solutions self-consistent?), which is a behavior predicted by the Palincsar and Brown (1984) framework, but not explored in the worked example realm. Participant discussions were also found to be associated with those physics-related solution features that were varied across solutions (such as fundamental principle selection or system and surroundings selections).
Development of a Hands-On Survey Course in the Physics of Living Systems
NASA Astrophysics Data System (ADS)
Matthews, Megan; Goldman, Daniel I.
Due to the widespread availability and technological capabilities of modern smartphones, many biophysical systems can be investigated using easily accessible, low-cost, and/or "homemade" equipment. Our survey course is structured to provide students with an overview of research in the physics of living systems, emphasizing the interplay between measurement, mechanism, and modeling required to understand principles at the intersection of physics and biology. The course proceeds through seven modules, each consisting of one week of lectures and one week of hands-on experiments, called "microlabs". Using smartphones, Arduinos, and 3D printed materials, students create their own laboratory equipment, including a 150X van Leeuwenhoek microscope, a shaking incubator, and an oscilloscope, and then use them to study biological systems ranging in length scales from nanometers to meters. These systems include population dynamics of rotifer/algae cultures, experimental evolution of multicellularity in budding yeast, and the bio- & neuromechanics involved in animal locomotion, among others. In each module, students are introduced to fundamental biological and physical concepts as well as theoretical and computational tools (nonlinear dynamics, molecular dynamics simulation, and statistical mechanics). At the end of the course, students apply these concepts and tools to the creation of their own microlab that integrates hands-on experimentation and modeling in the study of their chosen biophysical system.
Generalized statistical mechanics approaches to earthquakes and tectonics.
Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios
2016-12-01
Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.
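For readers unfamiliar with NESM, the framework referred to here generalizes the Boltzmann-Gibbs entropy to the Tsallis form; the definitions below are the standard ones, added for context rather than reproduced from this review:

\[
S_q = k_B\,\frac{1 - \sum_i p_i^{\,q}}{q - 1}\;\xrightarrow{\;q \to 1\;}\; -k_B \sum_i p_i \ln p_i ,
\qquad
\exp_q(x) = \bigl[\,1 + (1-q)\,x\,\bigr]^{1/(1-q)},
\]

and maximizing \(S_q\) under suitable constraints yields q-exponential distributions, which are the functional forms used to describe the observed scaling of earthquake magnitudes, inter-event times and fault lengths.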
Physics Without Causality — Theory and Evidence
NASA Astrophysics Data System (ADS)
Shoup, Richard
2006-10-01
The principle of cause and effect is deeply rooted in human experience, so much so that it is routinely and tacitly assumed throughout science, even by scientists working in areas where time symmetry is theoretically ingrained, as it is in both classical and quantum physics. Experiments are said to cause their results, not the other way around. In this informal paper, we argue that this assumption should be replaced with a more general notion of mutual influence — bi-directional relations or constraints on joint values of two or more variables. From an analysis based on quantum entropy, it is proposed that quantum measurement is a unitary three-interaction, with no collapse, no fundamental randomness, and no barrier to backward influence. Experimental results suggesting retrocausality are seen frequently in well-controlled laboratory experiments in parapsychology and elsewhere, especially where a random element is included. Certain common characteristics of these experiments give the appearance of contradicting well-established physical laws, thus providing an opportunity for deeper understanding and important clues that must be addressed by any explanatory theory. We discuss how retrocausal effects and other anomalous phenomena can be explained without major injury to existing physical theory. A modified quantum formalism can give new insights into the nature of quantum measurement, randomness, entanglement, causality, and time.
NASA Astrophysics Data System (ADS)
Sabino, Fernando P.; Besse, Rafael; Oliveira, Luiz Nunes; Wei, Su-Huai; Da Silva, Juarez L. F.
2015-11-01
Good transparent conducting oxides (TCOs), such as In2O3:Sn (ITO), usually combine large optical band gaps, essential for high transparency, with relatively small fundamental band gaps due to low conduction-band minima, which favor n-type doping and enhance the electrical conductivity. It has been understood that the optical band gaps are wider than the fundamental band gaps because optical transitions between the band-edge states are forbidden. The mechanism blocking such transitions, which can play a crucial role in the designing of alternative TCOs, nonetheless remains obscure. Here, based on first-principles density functional theory calculations and symmetry analysis of three oxides, M2O3 (M = Al, Ga, In), we identify the physical origin of the gap disparities. Three conditions are necessary: (1) the crystal structure must have global inversion symmetry; (2) in order to belong to the Ag or A1g irreducible representations, the states at the conduction-band minimum must have cation and oxygen s character; (3) in order to have g parity, the oxygen p orbitals constituting the states near the valence-band maximum must be strongly coupled to the cation d orbitals. Under these conditions, optical excitations across the fundamental gap will be forbidden. The three criteria explain the trends in the M2O3 (M = Al, Ga, In) sequence, in particular, explaining why In2O3 in the bixbyite structure yields the highest figure of merit. Our study provides guidelines expected to be instrumental in the search for new TCO materials.
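The symmetry argument invoked here is the usual electric-dipole parity selection rule; schematically, in our paraphrase rather than the paper's notation,

\[
M_{cv} = \langle \psi_c \,|\, \hat{\mathbf p} \,|\, \psi_v \rangle = 0
\quad \text{whenever } \psi_c \text{ and } \psi_v \text{ both have even (g) parity,}
\]

since the momentum (or position) operator is odd under inversion. The lowest dipole-allowed transition must then start from deeper, odd-parity valence states, which is why the optical gap exceeds the fundamental gap when the three conditions above are met.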
Clifford Algebra Implying Three Fermion Generations Revisited
NASA Astrophysics Data System (ADS)
Krolikowski, Wojciech
2002-09-01
The author's idea of algebraic compositeness of fundamental particles, allowing one to understand the existence in Nature of three fermion generations, is revisited. It is based on two postulates. Primo, for all fundamental particles of matter the Dirac square-root procedure sqrt(p^2) → Γ^(N)·p works, leading to a sequence N = 1,2,3, ... of Dirac-type equations, where four Dirac-type matrices Γ^(N)_μ are embedded into a Clifford algebra via a Jacobi definition introducing four "centre-of-mass" and (N-1) × four "relative" Dirac-type matrices. These define one "centre-of-mass" and (N-1) "relative" Dirac bispinor indices. Secundo, the "centre-of-mass" Dirac bispinor index is coupled to the Standard Model gauge fields, while (N-1) "relative" Dirac bispinor indices are all free indistinguishable physical objects obeying Fermi statistics along with the Pauli principle which requires the full antisymmetry with respect to "relative" Dirac indices. This allows only for three Dirac-type equations with N = 1,3,5 in the case of N odd, and two with N = 2,4 in the case of N even. The first of these results implies unavoidably the existence of three and only three generations of fundamental fermions, namely leptons and quarks, as labelled by the Standard Model signature. At the end, a comment is added on the possible shape of Dirac 3×3 mass matrices for four sorts of spin-1/2 fundamental fermions appearing in three generations. For charged leptons a prediction is mτ = 1776.80 MeV, when the input of experimental me and mμ is used.
Space shuttle’s liftoff: a didactical model
NASA Astrophysics Data System (ADS)
Borghi, Riccardo; Spinozzi, Turi Maria
2017-07-01
The pedagogical aim of the present paper, intended for an undergraduate audience, is to help students appreciate how the development of elementary models based on physics first principles is a fundamental and necessary preliminary step for grasping the behaviour of complex real systems with minimal amounts of math. In some particularly fortunate cases, such models also show reasonably good results when they are compared to reality. The speed behaviour of the Space Shuttle during its first two minutes of flight from liftoff is here analysed from such a didactical point of view. Only the momentum conservation law is employed to develop the model, which is eventually applied to quantitatively interpret the telemetry of the final 2011 launches of Shuttle Discovery and Shuttle Endeavour. To the STS-51-L and STS-107 astronauts, in memoriam.
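Although the paper's own derivation is not reproduced here, a momentum-conservation model of the kind it describes takes the familiar variable-mass form; under our added assumptions of vertical flight, constant exhaust speed \(u_e\), constant propellant flow rate \(\dot m\) and negligible drag,

\[
m(t)\,\frac{dv}{dt} = \dot m\,u_e - m(t)\,g,
\qquad
v(t) = u_e \ln\!\frac{m_0}{m(t)} - g\,t,
\qquad
m(t) = m_0 - \dot m\,t,
\]

i.e. the Tsiolkovsky result corrected for gravity losses, which is the kind of expression that can be compared directly with the launch telemetry.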
NASA Astrophysics Data System (ADS)
Grein, C. H.; John, Sajeev
1990-04-01
We present the results of a parameter-free first-principles theory for the fine structure of the Urbach optical-absorption edge in crystalline and disordered semiconductors. The dominant features are recaptured by means of a simple physical argument based on the most probable potential-well analogy. At finite temperatures, the overall linear exponential Urbach behavior of the subgap optical-absorption coefficient is a consequence of multiple LA-phonon emission and absorption sidebands that accompany the electronic transition. The fine structure of subgap absorption spectra observed in some materials is accounted for by multiple TO-, LO-, and TA-phonon absorption and emission sidebands. Good agreement is found with experimental data on crystalline silicon. The effects of nonadiabaticity in the electron-phonon interaction are calculated.
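The "linear exponential Urbach behavior" referred to above is the empirical Urbach rule for the sub-gap absorption coefficient; in its standard form (not quoted from the paper),

\[
\alpha(\hbar\omega, T) = \alpha_{0}\,
\exp\!\left[\frac{\hbar\omega - E_{0}}{E_{U}(T)}\right],
\qquad \hbar\omega < E_{0},
\]

where the Urbach energy \(E_U(T)\) sets the slope of the exponential tail; the fine structure discussed in the abstract appears as phonon-sideband modulations superimposed on this exponential envelope.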
Quantum Dots in Diagnostics and Detection: Principles and Paradigms
Pisanic, T. R.; Zhang, Y.; Wang, T. H.
2014-01-01
Quantum dots are semiconductor nanocrystals that exhibit exceptional optical and electrical behaviors not found in their bulk counterparts. Following seminal work in the development of water-soluble quantum dots in the late 1990's, researchers have sought to develop interesting and novel ways of exploiting the extraordinary properties of quantum dots for biomedical applications. Since that time, over 10,000 articles have been published related to the use of quantum dots in biomedicine, many of which regard their use in detection and diagnostic bioassays. This review presents a didactic overview of fundamental physical phenomena associated with quantum dots and paradigm examples of how these phenomena can and have been readily exploited for manifold uses in nanobiotechnology with a specific focus on their implementation in in vitro diagnostic assays and biodetection. PMID:24770716
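The size-tunable optics exploited in such assays follow from quantum confinement. In the simplest effective-mass (Brus-type) picture, quoted here only as rough orientation (the 1.8 prefactor of the Coulomb term is the usual approximate estimate, not a value from this review), the lowest excitonic transition of a dot of radius \(R\) shifts as

\[
E(R) \approx E_{g}^{\text{bulk}}
+ \frac{\hbar^{2}\pi^{2}}{2R^{2}}\!\left(\frac{1}{m_{e}^{*}} + \frac{1}{m_{h}^{*}}\right)
- \frac{1.8\,e^{2}}{4\pi\varepsilon\varepsilon_{0}R},
\]

so smaller dots emit at shorter wavelengths, which is what allows a single excitation source to address several spectrally distinct quantum-dot labels in multiplexed diagnostic assays.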
Zhang, Zhi-jian; Liu, Meng; Zhu, Jun
2013-05-01
There is growing attention to the environmental pollution and the loss of potential resource regeneration caused by the poor handling of organic wastes, while earthworm vermicomposting and larvae bioconversion are well known as two promising biotechnologies for sustainable waste treatment, in which earthworms or housefly larvae are employed to convert the organic wastes into humus-like material together with value-added worm products. Taking the earthworm (Eisenia foetida) and the housefly larva (Musca domestica) as model species, this work illustrates the fundamental definitions and principles, operational processes, technical mechanisms, main factors, and biochemical features of the organisms involved in these two technologies. Integrating the physical and biochemical mechanisms, the processes of biomass conversion, intestinal digestion, enzymatic degradation and microfloral decomposition are comprehensively reviewed for waste treatments aimed at waste reduction, value addition, and stabilization.