Science.gov

Sample records for minimization fundamental principles

  1. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality 〈e^(-βW)〉 = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)]. PMID:26382367
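
    The convergence issue described in this abstract can be seen in a few lines of Python. The sketch below is illustrative only, not the authors' protocol: it assumes a toy Gaussian work distribution (for which the exact answer is ΔF = μ - βσ²/2) and shows that the Jarzynski estimate of ΔF converges more slowly when the fluctuations of e^(-βW) are larger.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0  # inverse temperature (illustrative units)

def jarzynski_estimate(mu, sigma, n_samples):
    """Estimate the free energy difference from n_samples work values
    via the Jarzynski equality: dF = -(1/beta) * ln<exp(-beta*W)>."""
    W = rng.normal(mu, sigma, n_samples)  # toy Gaussian work distribution
    return -np.log(np.mean(np.exp(-beta * W))) / beta

# For Gaussian work the exact answer is dF = mu - beta*sigma**2/2, so the
# estimator's error at a fixed sample size grows with the spread sigma.
for sigma in (0.5, 2.0):
    exact = 1.0 - beta * sigma**2 / 2
    est = jarzynski_estimate(mu=1.0, sigma=sigma, n_samples=10_000)
    print(f"sigma={sigma}: exact dF={exact:+.3f}, estimate={est:+.3f}")
```

    Rerunning with a larger σ (larger work fluctuations) exhibits the slow, biased convergence that motivates protocols minimizing the fluctuations of e^(-βW).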

  2. Principle of minimal work fluctuations

    NASA Astrophysics Data System (ADS)

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality 〈e^(-βW)〉 = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014), 10.1103/PhysRevE.90.052132].

  3. Fundamental base closure environmental principles

    SciTech Connect

    Yim, R.A.

    1994-12-31

    Military base closures present a paradox. The rate, scale, and timing of military base closures are historically unique. However, each base itself typically does not present unique problems. Thus, the challenge is to design innovative solutions to base redevelopment and remediation issues, while simultaneously adopting common, streamlined, or pre-approved strategies for shared problems. The author presents six environmental principles that are fundamental to base closure. They are: remediation, not cleanup; remediation will impact reuse; reuse will impact remediation; remediation and reuse must be coordinated; environmental contamination must be evaluated like any other initial physical constraint on development, not as an overlay after plans are created; and remediation will impact development, financing, and marketability.

  4. Responsible gambling: general principles and minimal requirements.

    PubMed

    Blaszczynski, Alex; Collins, Peter; Fong, Davis; Ladouceur, Robert; Nower, Lia; Shaffer, Howard J; Tavares, Hermano; Venisse, Jean-Luc

    2011-12-01

    Many international jurisdictions have introduced responsible gambling programs. These programs intend to minimize negative consequences of excessive gambling, but vary considerably in their aims, focus, and content. Many responsible gambling programs lack a conceptual framework and, in the absence of empirical data, their components are based only on general considerations and impressions. This paper outlines the consensus viewpoint of an international group of researchers suggesting fundamental responsible gambling principles, roles of key stakeholders, and minimal requirements that stakeholders can use to frame and inform responsible gambling programs across jurisdictions. Such a framework does not purport to offer value statements regarding the legal status of gambling or its expansion. Rather, it proposes gambling-related initiatives aimed at government, industry, and individuals to promote responsible gambling and consumer protection. This paper argues that there is a set of basic principles and minimal requirements that should form the basis for every responsible gambling program. PMID:21359586

  5. Gas cell neutralizers (Fundamental principles)

    SciTech Connect

    Fuehrer, B.

    1985-06-01

    Neutralizing an ion-beam of the size and energy levels involved in the neutral-particle-beam program represents a considerable extension of the state of the art of neutralizer technology. Many different media (e.g., solid, liquid, gas, plasma, photons) can be used to strip the hydrogen ion of its extra electron. A large, multidisciplinary R&D effort will no doubt be required to sort out all of the "pros and cons" of these various techniques. The purpose of this particular presentation is to discuss some basic configurations and fundamental principles of the gas type of neutralizer cell. Particular emphasis is placed on the "Gasdynamic Free-Jet" neutralizer, since this configuration has the potential of being much shorter than other types of gas cells (in the beam direction) and it could operate in a nearly continuous mode (CW) if necessary. These were important considerations in the ATSU design, which is discussed in some detail in the second presentation, entitled "ATSU Point Design".

  6. Fundamental principles of robot vision

    NASA Astrophysics Data System (ADS)

    Hall, Ernest L.

    1993-08-01

    Robot vision is a specialty of intelligent machines that describes the interaction between robotic manipulators and machine vision. Early robot vision systems were built to demonstrate that a robot with vision could adapt to changes in its environment. More recently, attention has been directed toward machines with expanded adaptation and learning capabilities. The use of robot vision for automatic inspection and recognition of objects for manipulation by an industrial robot, and for guidance of a mobile robot, are two primary applications. Adaptation and learning characteristics are often lacking in industrial automation; if they can be added successfully, they result in a more robust system. Due to real-time requirements, the robot vision methods that have proven most successful have been those which could be reduced to a simple, fast computation. The purpose of this paper is to discuss some of the fundamental concepts in sufficient detail to provide a starting point for the interested engineer or scientist. A detailed example of a camera system viewing an object is presented for a simple, two-dimensional robot vision system. Finally, conclusions and recommendations for further study are presented.
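
    The kind of "simple, fast computation" favored in this paper can be illustrated with the most basic camera model. The pinhole projection below is a generic textbook sketch, not the paper's specific example; the focal length is an assumed value.

```python
def project(point, f=0.05):
    """Pinhole-camera perspective projection: map a 3-D point (X, Y, Z),
    in metres, to image-plane coordinates (x, y) = (f*X/Z, f*Y/Z)."""
    X, Y, Z = point
    if Z <= 0:
        raise ValueError("point must lie in front of the camera")
    return (f * X / Z, f * Y / Z)

# A point 10 m ahead, 1 m to the right and 2 m up, seen through an
# assumed 50 mm (0.05 m) focal length:
x, y = project((1.0, 2.0, 10.0))
```

    Computations of this form, one multiply and one divide per coordinate, are cheap enough to meet the real-time constraint the paper emphasizes.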

  7. Two Fundamental Principles of Nature's Interactions

    NASA Astrophysics Data System (ADS)

    Ma, Tian; Wang, Shouhong

    2014-03-01

    In this talk, we present two fundamental principles of nature's interactions, the principle of interaction dynamics (PID) and the principle of representation invariance (PRI). Intuitively, PID takes the variation of the action functional under energy-momentum conservation constraint. PID offers a completely different and natural way of introducing Higgs fields. PRI requires that physical laws be independent of representations of the gauge groups. These two principles give rise to a unified field model for four interactions, which can be naturally decoupled to study individual interactions. With these two principles, we are able to derive 1) a unified theory for dark matter and dark energy, 2) layered strong and weak interaction potentials, and 3) the energy levels of subatomic particles. Supported in part by NSF, ONR and Chinese NSF.

  8. Stem cell bioprocessing: fundamentals and principles.

    PubMed

    Placzek, Mark R; Chung, I-Ming; Macedo, Hugo M; Ismail, Siti; Mortera Blanco, Teresa; Lim, Mayasari; Cha, Jae Min; Fauzi, Iliana; Kang, Yunyi; Yeo, David C L; Ma, Chi Yip Joan; Polak, Julia M; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2009-03-01

    In recent years, the potential of stem cell research for tissue engineering-based therapies and regenerative medicine clinical applications has become well established. In 2006, Chung pioneered the first entire organ transplant using adult stem cells and a scaffold for clinical evaluation. With this, a new milestone was achieved, with seven patients with myelomeningocele receiving stem cell-derived bladder transplants resulting in substantial improvements in their quality of life. While a bladder is a relatively simple organ, the breakthrough highlights the incredible benefits that can be gained from the cross-disciplinary nature of tissue engineering and regenerative medicine (TERM) that encompasses stem cell research and stem cell bioprocessing. Unquestionably, the development of bioprocess technologies for the transfer of the current laboratory-based practice of stem cell tissue culture to the clinic as therapeutics necessitates the application of engineering principles and practices to achieve control, reproducibility, automation, validation and safety of the process and the product. The successful translation will require contributions from fundamental research (from developmental biology to the 'omics' technologies and advances in immunology) and from existing industrial practice (biologics), especially on automation, quality assurance and regulation. The timely development, integration and execution of various components will be critical; failures of the past (such as in the commercialization of skin equivalents) in marketing, pricing, production and advertising should not be repeated. This review aims to address the principles required for successful stem cell bioprocessing so that they can be applied deftly to clinical applications. PMID:19033137

  9. Stem cell bioprocessing: fundamentals and principles

    PubMed Central

    Placzek, Mark R.; Chung, I-Ming; Macedo, Hugo M.; Ismail, Siti; Mortera Blanco, Teresa; Lim, Mayasari; Min Cha, Jae; Fauzi, Iliana; Kang, Yunyi; Yeo, David C.L.; Yip Joan Ma, Chi; Polak, Julia M.; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2008-01-01

    In recent years, the potential of stem cell research for tissue engineering-based therapies and regenerative medicine clinical applications has become well established. In 2006, Chung pioneered the first entire organ transplant using adult stem cells and a scaffold for clinical evaluation. With this a new milestone was achieved, with seven patients with myelomeningocele receiving stem cell-derived bladder transplants resulting in substantial improvements in their quality of life. While a bladder is a relatively simple organ, the breakthrough highlights the incredible benefits that can be gained from the cross-disciplinary nature of tissue engineering and regenerative medicine (TERM) that encompasses stem cell research and stem cell bioprocessing. Unquestionably, the development of bioprocess technologies for the transfer of the current laboratory-based practice of stem cell tissue culture to the clinic as therapeutics necessitates the application of engineering principles and practices to achieve control, reproducibility, automation, validation and safety of the process and the product. The successful translation will require contributions from fundamental research (from developmental biology to the ‘omics’ technologies and advances in immunology) and from existing industrial practice (biologics), especially on automation, quality assurance and regulation. The timely development, integration and execution of various components will be critical—failures of the past (such as in the commercialization of skin equivalents) on marketing, pricing, production and advertising should not be repeated. This review aims to address the principles required for successful stem cell bioprocessing so that they can be applied deftly to clinical applications. PMID:19033137

  10. [Isotopy and multicorticality: two fundamental principles].

    PubMed

    Muratori, G

    1991-05-15

    Starting out from the pioneers of oral implantology, the author comes down to the present situation, in an effort to show which values should be considered the true ones. "Nothing new under the skies" is the author's comment when he examines all the techniques and materials presented as "new" for commercial purposes, whereas they are not new at all. To prove his statements he goes back to the work of some implantology pioneers, such as Formiggini, Perron Andrès, Cherchève, Strock, Pasqualini, Muratori, Tramonte, Linkow and others. In going over their most remarkable techniques, he maintains that what is being proposed nowadays as brand new was actually done long ago. Only the names are now different: the process now called fibrous osseointegration used to be named osteofibrosis, and what is now called osseointegration was known as complete ossification. In order to remove the great confusion now prevailing in the dozens of implant systems, as well as in implant philosophy itself, the author maintains that good implantologists should follow two fundamental principles: 1) implants should be built in a great variety of sizes, in order to take full advantage of cortical bones. They should be multicortical, generally quadricortical, since they should rest on the sinus floor cortical bone, on the alveolar ridge, and on the palatal and buccal cortical bones (this is true for the elements implanted in the upper arch and in the front-mesial arch). (ABSTRACT TRUNCATED AT 250 WORDS) PMID:1864418

  11. Fundamental Principles of Proper Space Kinematics

    NASA Astrophysics Data System (ADS)

    Wade, Sean

    It is desirable to understand the movement of both matter and energy in the universe based upon fundamental principles of space and time. Time dilation and length contraction are features of Special Relativity derived from the observed constancy of the speed of light. Quantum Mechanics asserts that motion in the universe is probabilistic and not deterministic. While the practicality of these dissimilar theories is well established through widespread application, inconsistencies in their marriage persist, marring their utility and preventing their full expression. After identifying an error in perspective, the current theories are tested by modifying logical assumptions to eliminate paradoxical contradictions. Analysis of simultaneous frames of reference leads to a new formulation of space and time that predicts the motion of both kinds of particles. Proper Space is a real, three-dimensional space clocked by proper time that is undergoing a densification at the rate of c. Coordinate transformations to a familiar object space and a mathematical stationary space clarify the counterintuitive aspects of Special Relativity. These symmetries demonstrate that within the local universe stationary observers are a forbidden frame of reference; all is in motion. In lieu of Quantum Mechanics and Uncertainty, the use of the imaginary number i is restricted to the labeling of mass as either material or immaterial. This material phase difference accounts for both the perceived constant velocity of light and its apparent statistical nature. The application of Proper Space Kinematics will advance more accurate representations of microscopic, macroscopic, and cosmological processes and serve as a foundation for further study and reflection, thereafter leading to greater insight.

  12. Lighting fundamentals handbook: Lighting fundamentals and principles for utility personnel

    SciTech Connect

    Eley, C.; Tolen, T. (Associates, San Francisco, CA); Benya, J.R.

    1992-12-01

    Lighting accounts for approximately 30% of overall electricity use and demand in commercial buildings. This handbook for utility personnel provides a source of basic information on lighting principles, lighting equipment, and other considerations related to lighting design. The handbook is divided into three parts. Part One, Physics of Light, has chapters on light, vision, optics, and photometry. Part Two, Lighting Equipment and Technology, focuses on lamps, luminaires, and lighting controls. Part Three, Lighting Design Decisions, deals with the manner in which lighting design decisions are made and reviews relevant methods and issues. These include the quantity and quality of light needed for visual tasks, calculation methods for verifying that lighting needs are satisfied, lighting economics and methods for evaluating investments in efficient lighting systems, and miscellaneous design issues including energy codes, power quality, photobiology, and disposal of lighting equipment. The handbook contains a discussion of the role of the utility in promoting the use of energy-efficient lighting. The handbook also includes a lighting glossary and a list of references for additional information. This convenient and comprehensive handbook is designed to enable utility lighting personnel to assist their customers in developing high-quality, energy-efficient lighting systems. The handbook is not intended to be an up-to-date reference on lighting products and equipment.

  13. The "Fundamental Pedagogical Principle" in Second Language Teaching.

    ERIC Educational Resources Information Center

    Krashen, Stephen D.

    1981-01-01

    A fundamental principle of second language acquisition is stated and applied to language teaching. The principle states that learners acquire a second language when they receive comprehensible input in situations where their affective filters are sufficiently low. The theoretical background of this principle consists of five hypotheses: the…

  14. Fundamental Ethical Principles in Sports Medicine.

    PubMed

    Devitt, Brian M

    2016-04-01

    In sports medicine, the practice of ethics presents many unique challenges because of the unusual clinical environment of caring for players within the context of a team whose primary goal is to win. Ethical issues frequently arise because a doctor-patient-team triad often replaces the traditional doctor-patient relationship. Conflict may exist when the team's priority clashes with or even replaces the doctor's obligation to player well-being. Customary ethical norms that govern most forms of clinical practice, such as autonomy and confidentiality, are not easily translated to sports medicine. Ethical principles and examples of how they relate to sports medicine are discussed. PMID:26832970

  15. Fundamental principles and applications of microfluidic systems.

    PubMed

    Ong, Soon-Eng; Zhang, Sam; Du, Hejun; Fu, Yongqing

    2008-01-01

    Microelectromechanical systems (MEMS) technology has provided the platform for the miniaturization of analytical devices for biological applications. Besides the fabrication technology, the study and understanding of the flow characteristics of fluids at the micrometer or even nanometer scale is vital for the successful implementation of such miniaturized systems. Microfluidics is currently under the spotlight for medical diagnostics and many other bio-analyses, as its physical size manifests numerous advantages over lab-based devices. In this review, elementary concepts of fluids and their flow characteristics, together with various transport processes and microchannel conditions, are presented. They are among the fundamental building blocks for success in microfluidic systems. Selected application examples include biological cell handling employing different schemes of manipulation, and DNA amplification using different microreactor arrangements and fluid flow regimes. PMID:17981751

  16. Fundamental principles of energy consumption for gene expression.

    PubMed

    Huang, Lifang; Yuan, Zhanjiang; Yu, Jianshe; Zhou, Tianshou

    2015-12-01

    How energy is consumed in gene expression is largely unknown, mainly due to the complexity of the non-equilibrium mechanisms affecting expression levels. Here, by analyzing a representative gene model that considers the complexity of gene expression, we show that negative feedback increases energy consumption but positive feedback has the opposite effect; promoter leakage always reduces energy consumption; generating more bursts requires consuming more energy; and the speed of promoter switching comes at the cost of energy consumption. We also find that the relationship between energy consumption and expression noise is multi-mode, depending on both the type of feedback and the speed of promoter switching. Altogether, these results constitute fundamental principles of energy consumption for gene expression, which lay a foundation for designing biologically reasonable gene modules. In addition, we discuss possible biological implications of these principles by combining experimental facts. PMID:26723140

  17. Rigorous force field optimization principles based on statistical distance minimization

    SciTech Connect

    Vlcek, Lukas; Chialvo, Ariel A.

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
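
    For discrete distributions, the statistical distance between a model and its target can be written as the Bhattacharyya angle between the two probability vectors. The sketch below is a minimal illustration of the quantity being minimized, under the assumption of normalized discrete histograms; it is not the authors' implementation.

```python
import numpy as np

def statistical_distance(p, q):
    """Statistical (Bhattacharyya) angle between two discrete probability
    distributions: s = arccos(sum_i sqrt(p_i * q_i))."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    overlap = np.sum(np.sqrt(p * q))          # in [0, 1] for normalized p, q
    return np.arccos(np.clip(overlap, 0.0, 1.0))

# Identical distributions are indistinguishable (distance 0);
# non-overlapping ones are maximally distinguishable (pi/2).
same = statistical_distance([0.5, 0.5], [0.5, 0.5])
far = statistical_distance([1.0, 0.0], [0.0, 1.0])
```

    Driving this distance to zero between model-generated and target histograms is the sense in which minimization makes the two systems statistically indistinguishable.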

  18. Negative-Refraction Metamaterials: Fundamental Principles and Applications

    NASA Astrophysics Data System (ADS)

    Eleftheriades, G. V.; Balmain, K. G.

    2005-06-01

    Learn about the revolutionary new technology of negative-refraction metamaterials. Negative-Refraction Metamaterials: Fundamental Principles and Applications introduces artificial materials that support the unusual electromagnetic property of negative refraction. Readers will discover several classes of negative-refraction materials along with their exciting, groundbreaking applications, such as lenses and antennas, imaging with super-resolution, microwave devices, dispersion-compensating interconnects, radar, and defense. The book begins with a chapter describing the fundamentals of isotropic metamaterials in which a negative index of refraction is defined. In the following chapters, the text builds on the fundamentals by describing a range of useful microwave devices and antennas. Next, a broad spectrum of exciting new research and emerging applications is examined, including: theory and experiments behind a super-resolving, negative-refractive-index transmission-line lens; 3-D transmission-line metamaterials with a negative refractive index; numerical simulation studies of negative refraction of Gaussian beams and associated focusing phenomena; unique advantages and theory of shaped lenses made of negative-refractive-index metamaterials; a new type of transmission-line metamaterial that is anisotropic and supports the formation of sharp steerable beams (resonance cones); implementations of negative-refraction metamaterials at optical frequencies; unusual propagation phenomena in metallic waveguides partially filled with negative-refractive-index metamaterials; and metamaterials in which the refractive index and the underlying group velocity are both negative. This work brings together the best minds in this cutting-edge field. It is fascinating reading for scientists, engineers, and graduate-level students in physics, chemistry, materials science, photonics, and electrical engineering.

  19. Minimal self-models and the free energy principle

    PubMed Central

    Limanowski, Jakub; Blankenburg, Felix

    2013-01-01

    The term “minimal phenomenal selfhood” (MPS) describes the basic, pre-reflective experience of being a self (Blanke and Metzinger, 2009). Theoretical accounts of the minimal self have long recognized the importance and the ambivalence of the body as both part of the physical world, and the enabling condition for being in this world (Gallagher, 2005a; Grafton, 2009). A recent account of MPS (Metzinger, 2004a) centers on the consideration that minimal selfhood emerges as the result of basic self-modeling mechanisms, thereby being founded on pre-reflective bodily processes. The free energy principle (FEP; Friston, 2010) is a novel unified theory of cortical function built upon the imperative that self-organizing systems entail hierarchical generative models of the causes of their sensory input, which are optimized by minimizing free energy as an approximation of the log-likelihood of the model. The implementation of the FEP via predictive coding mechanisms and in particular the active inference principle emphasizes the role of embodiment for predictive self-modeling, which has been appreciated in recent publications. In this review, we provide an overview of these conceptions and illustrate thereby the potential power of the FEP in explaining the mechanisms underlying minimal selfhood and its key constituents, multisensory integration, interoception, agency, perspective, and the experience of mineness. We conclude that the conceptualization of MPS can be well mapped onto a hierarchical generative model furnished by the FEP and may constitute the basis for higher-level, cognitive forms of self-referral, as well as the understanding of other minds. PMID:24062658
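
    The free-energy optimization described in this review can be made concrete with a one-latent-variable toy model, in the spirit of standard predictive-coding tutorials. The linear Gaussian generative model and all numerical values below are assumptions for illustration, not the review's own: belief updating is gradient descent on free energy, which for this model lands on the exact Bayesian posterior mean.

```python
# Toy free-energy minimization: a latent cause v with prior N(v_p, s_p)
# generates a sensory sample u = v + noise of variance s_u. Gradient
# descent on free energy w.r.t. the belief phi balances the prior
# prediction error against the sensory prediction error.
v_p, s_p = 3.0, 1.0   # prior mean and variance (assumed)
s_u, u = 0.5, 2.0     # sensory noise variance and observed input (assumed)

phi = v_p             # initialize the belief at the prior mean
for _ in range(5000):
    dF = (v_p - phi) / s_p + (u - phi) / s_u   # negative free-energy gradient
    phi += 0.001 * dF                          # small Euler step

# For a linear Gaussian model the free-energy minimum coincides with the
# exact posterior mean: (s_u * v_p + s_p * u) / (s_u + s_p).
posterior_mean = (s_u * v_p + s_p * u) / (s_u + s_p)
```

    The two terms of the gradient are exactly the "prediction errors" of predictive coding: one from the prior, one from the senses, each weighted by its precision.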

  20. Application of trajectory optimization principles to minimize aircraft operating costs

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.; Morello, S. A.; Erzberger, H.

    1979-01-01

    This paper summarizes various applications of trajectory optimization principles that have been or are being devised by both government and industrial researchers to minimize aircraft direct operating costs (DOC). These costs (time and fuel) are computed for aircraft constrained to fly over a fixed range. Optimization theory is briefly outlined, and specific algorithms which have resulted from application of this theory are described. Typical results demonstrating the use of these algorithms and the potential savings they can produce are given. Finally, the need for further trajectory optimization research is presented.
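
    The DOC trade-off studied here (time cost versus fuel cost over a fixed range) can be sketched as a toy cruise-speed optimization. The fuel-flow model and all cost coefficients below are invented for illustration and are not from the paper.

```python
import numpy as np

R = 3000.0      # fixed range, km
c_time = 30.0   # cost of time, $/min (assumed)
c_fuel = 0.5    # cost of fuel, $/kg (assumed)

def direct_operating_cost(v):
    """DOC = time cost + fuel cost at constant cruise speed v (km/min).
    Fuel flow uses a toy drag-like model: 2 + 0.01*v**3 kg/min."""
    t = R / v                        # block time, min
    fuel = (2.0 + 0.01 * v**3) * t   # total trip fuel, kg
    return c_time * t + c_fuel * fuel

speeds = np.linspace(5.0, 20.0, 1501)   # candidate cruise speeds, km/min
v_opt = speeds[np.argmin(direct_operating_cost(speeds))]
```

    Flying faster cuts the time charge, but the cubic fuel-flow term eventually dominates, so the optimum is interior: for this toy model, at v**3 = (c_time + 2*c_fuel) / (0.02*c_fuel), about 14.6 km/min.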

  1. Classical Dynamics Based on the Minimal Length Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Chung, Won Sang

    2016-02-01

    In this paper we consider the quadratic modification of the Heisenberg algebra and its classical limit, which we call the β-deformed Poisson bracket for the corresponding classical variables. We use the β-deformed Poisson bracket to discuss some physical problems in β-deformed classical dynamics. Finally, we consider the (α, β)-deformed classical dynamics in which the minimal length uncertainty principle is given by [x̂, p̂] = iℏ(1 + αx̂² + βp̂²). For two small parameters α, β, we discuss the free fall of a particle and a composite system in a uniform gravitational field.
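
    The minimal length implied by such a deformed bracket follows in two lines from the generalized uncertainty relation. The derivation below is the standard textbook argument for the β-only case (α = 0), not a result taken from this paper.

```latex
% Generalized uncertainty from [\hat{x},\hat{p}] = i\hbar\,(1+\beta\hat{p}^2):
\Delta x \,\Delta p \;\ge\; \tfrac{1}{2}\,\bigl|\langle[\hat{x},\hat{p}]\rangle\bigr|
  \;=\; \tfrac{\hbar}{2}\,\bigl(1+\beta(\Delta p)^2+\beta\langle\hat{p}\rangle^2\bigr).
% Minimizing the bound over \Delta p (the optimum is \Delta p = 1/\sqrt{\beta},
% taken at \langle\hat{p}\rangle = 0) gives a smallest resolvable length:
\Delta x_{\min} \;=\; \hbar\sqrt{\beta}.
```

    Unlike the ordinary Heisenberg relation, the right-hand side grows again at large Δp, which is what forbids arbitrarily sharp position resolution.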

  2. The Principle of Minimal Resistance in Non-equilibrium Thermodynamics

    NASA Astrophysics Data System (ADS)

    Mauri, Roberto

    2016-04-01

    Analytical models describing the motion of colloidal particles in given force fields are presented. In addition to local approaches, leading to well-known master equations such as the Langevin and the Fokker-Planck equations, a global description based on path integration is reviewed. A new result is presented, showing that under very broad conditions, during its evolution a dissipative system tends to minimize its energy dissipation in such a way as to keep constant the Hamiltonian time rate, equal to the difference between the flux-based and the force-based Rayleigh dissipation functions. In fact, the Fokker-Planck equation can be interpreted as the Hamilton-Jacobi equation resulting from such a minimum principle. At steady state, the Hamiltonian time rate is maximized, leading to a minimum resistance principle. In the unsteady case, we consider the relaxation to equilibrium of harmonic oscillators and the motion of a Brownian particle in shear flow, obtaining results that coincide with the solution of the Fokker-Planck and the Langevin equations.
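
    The harmonic-oscillator relaxation mentioned at the end of the abstract can be reproduced with a few lines of overdamped Langevin simulation. This is an independent illustration with assumed parameters, not the paper's calculation: the ensemble relaxes to the Boltzmann distribution, whose position variance here is D/k.

```python
import numpy as np

rng = np.random.default_rng(1)
k, D, dt = 1.0, 1.0, 0.01       # spring constant, diffusivity, time step (assumed)
n_paths, n_steps = 5000, 2000   # ensemble size and horizon (t = 20)

# Euler-Maruyama integration of the overdamped Langevin equation
# dx = -k*x*dt + sqrt(2*D)*dW for a harmonic potential U = k*x**2/2.
x = np.zeros(n_paths)           # start the whole ensemble at the origin
for _ in range(n_steps):
    x += -k * x * dt + np.sqrt(2.0 * D * dt) * rng.normal(size=n_paths)

# At long times the ensemble variance approaches the equilibrium value D/k,
# the same stationary solution the Fokker-Planck equation predicts.
equilibrium_variance = D / k
```

    The same trajectories sampled at intermediate times trace the transient relaxation that the paper recovers from its minimum principle.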

  3. Ultrafast mid-IR laser scalpel: protein signals of the fundamental limits to minimally invasive surgery.

    PubMed

    Amini-Nik, Saeid; Kraemer, Darren; Cowan, Michael L; Gunaratne, Keith; Nadesan, Puviindran; Alman, Benjamin A; Miller, R J Dwayne

    2010-01-01

    Lasers have in principle the capability to cut at the level of a single cell, the fundamental limit to minimally invasive procedures and restructuring biological tissues. To date, this limit has not been achieved, due to collateral damage on the macroscale arising from thermal and shock-wave-induced damage of surrounding tissue. Here, we report on a novel concept using a specifically designed Picosecond IR Laser (PIRL) that selectively energizes water molecules in the tissue to drive the ablation or cutting process faster than thermal exchange of energy and shock wave propagation, without plasma formation or ionizing radiation effects. The targeted laser process imparts the least amount of energy in the remaining tissue, without any of the deleterious photochemical or photothermal effects that accompany other laser wavelengths and pulse parameters. Full-thickness incisional and excisional wounds were generated in CD1 mice using the Picosecond IR Laser, a conventional surgical laser (DELight Er:YAG) or mechanical surgical tools. Transmission and scanning electron microscopy showed that the PIRL produced minimal tissue ablation, with less damage of surrounding tissues than wounds formed using the other modalities. The width of scars formed by wounds made with the PIRL was half that of the scars produced using either a conventional surgical laser or a scalpel. Aniline blue staining showed higher levels of collagen in the early stage of the wounds produced using the PIRL, suggesting that these wounds mature faster. More viable cells were extracted from skin using the PIRL, suggesting less cellular damage. β-catenin and TGF-β signalling, which are activated during the proliferative phase of wound healing, and whose levels of activation correlate with the size of wounds, were lower in wounds generated by the PIRL system. Wounds created with the PIRL system also showed a lower rate of cell proliferation. Direct comparison of wound

  4. Fundamental Principles of Classical Mechanics: a Geometrical Perspective

    NASA Astrophysics Data System (ADS)

    Lam, Kai S.

    2014-07-01

    Classical mechanics is the quantitative study of the laws of motion for macroscopic physical systems with mass. The fundamental laws of this subject, known as Newton's Laws of Motion, are expressed in terms of second-order differential equations governing the time evolution of vectors in a so-called configuration space of a system (see Chapter 12). In an elementary setting, these are usually vectors in 3-dimensional Euclidean space, such as position vectors of point particles; but typically they can be vectors in higher dimensional and more abstract spaces. A general knowledge of the mathematical properties of vectors, not only in their most intuitive incarnations as directed arrows in physical space but as elements of abstract linear vector spaces, and those of linear operators (transformations) on vector spaces as well, is then indispensable in laying the groundwork for both the physical and the more advanced mathematical - more precisely topological and geometrical - concepts that will prove to be vital in our subject. In this beginning chapter we will review these properties, and introduce the all-important related notions of dual spaces and tensor products of vector spaces. The notational convention for vectorial and tensorial indices used for the rest of this book (except when otherwise specified) will also be established...

  5. Fundamental Principles of Network Formation among Preschool Children

    PubMed Central

    Schaefer, David R.; Light, John M.; Fabes, Richard A.; Hanish, Laura D.; Martin, Carol Lynn

    2009-01-01

    The goal of this research was to investigate the origins of social networks by examining the formation of children’s peer relationships in 11 preschool classes throughout the school year. We investigated whether several fundamental processes of relationship formation were evident at this age, including reciprocity, popularity, and triadic closure effects. We expected these mechanisms to change in importance over time as the network crystallizes, allowing more complex structures to evolve from simpler ones in a process we refer to as structural cascading. We analyzed intensive longitudinal observational data of children’s interactions using the SIENA actor-based model. We found evidence that reciprocity, popularity, and triadic closure all shaped the formation of preschool children’s networks. The influence of reciprocity remained consistent, whereas popularity and triadic closure became increasingly important over the course of the school year. Interactions between age and endogenous network effects were nonsignificant, suggesting that these network formation processes were not moderated by age in this sample of young children. We discuss the implications of our longitudinal network approach and findings for the study of early network developmental processes. PMID:20161606

  6. Lighting fundamentals handbook: Lighting fundamentals and principles for utility personnel. Final report

    SciTech Connect

    Eley, C.; Tolen, T.; Benya, J.R.

    1992-12-01

    Lighting accounts for approximately 30% of overall electricity use and demand in commercial buildings. This handbook for utility personnel provides a source of basic information on lighting principles, lighting equipment, and other considerations related to lighting design. The handbook is divided into three parts. Part One, Physics of Light, has chapters on light, vision, optics, and photometry. Part Two, Lighting Equipment and Technology, focuses on lamps, luminaires, and lighting controls. Part Three, Lighting Design Decisions, deals with the manner in which lighting design decisions are made and reviews relevant methods and issues. These include the quantity and quality of light needed for visual tasks, calculation methods for verifying that lighting needs are satisfied, lighting economics and methods for evaluating investments in efficient lighting systems, and miscellaneous design issues including energy codes, power quality, photobiology, and disposal of lighting equipment. The handbook contains a discussion of the role of the utility in promoting the use of energy-efficient lighting. The handbook also includes a lighting glossary and a list of references for additional information. This convenient and comprehensive handbook is designed to enable utility lighting personnel to assist their customers in developing high-quality, energy-efficient lighting systems. The handbook is not intended to be an up-to-date reference on lighting products and equipment.

  7. [The input of medical community into development of fundamental principles of Zemstvo medicine of Russia].

    PubMed

    Yegorysheva, I V

    2013-01-01

    The article considers the participation of the medical community in the formation of the fundamental principles of a unique system of public health: Zemstvo medicine. This was reflected in the activities of medical scientific societies and congresses and in the periodical medical press. PMID:24649614

  8. Free minimization of the fundamental measure theory functional: Freezing of parallel hard squares and cubes.

    PubMed

    Belli, S; Dijkstra, M; van Roij, R

    2012-09-28

    Due to remarkable advances in colloid synthesis techniques, systems of squares and cubes, once an academic abstraction for theorists and simulators, are nowadays an experimental reality. By means of a free minimization of the free-energy functional, we apply fundamental measure theory to analyze the phase behavior of parallel hard squares and hard cubes. We compare our results with those obtained by the traditional approach based on the Gaussian parameterization, finding small deviations and good overall agreement between the two methods. For hard squares, our predictions feature at intermediate packing fraction a smectic phase, which is however expected to be unstable due to thermal fluctuations. Due to this inconsistency, we cannot determine unambiguously the prediction of the theory for the expected fluid-to-crystal transition of parallel hard squares, but we deduce two alternative scenarios: (i) a second-order transition with a coexisting vacancy-rich crystal or (ii) a higher-density first-order transition with a coexisting crystal characterized by a lower vacancy concentration. In accordance with previous studies, a second-order transition with a high vacancy concentration is predicted for hard cubes. PMID:23020342

  9. A defense of fundamental principles and human rights: a reply to Robert Baker.

    PubMed

    Macklin, Ruth

    1998-12-01

    This article seeks to rebut Robert Baker's contention that attempts to ground international bioethics in fundamental principles cannot withstand the challenges posed by multiculturalism and postmodernism. First, several corrections are provided of Baker's account of the conclusions reached by the Advisory Committee on Human Radiation Experiments. Second, a rebuttal is offered to Baker's claim that an unbridgeable moral gap exists between Western individualism and non-Western communalism. In conclusion, this article argues that Baker's "nonnegotiable primary goods" cannot do the work of "classical human rights" and that the latter framework is preferable from both a practical and a theoretical standpoint. PMID:11657320

  10. The uncertainty threshold principle - Fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1976-01-01

    The fundamental limitations of the optimal control of dynamic systems with random parameters are analyzed by studying a scalar linear-quadratic optimal control example. It is demonstrated that optimum long-range decision making is possible only if the dynamic uncertainty (quantified by the means and covariances of the random parameters) is below a certain threshold. If this threshold is exceeded, no optimum decision rules exist. This phenomenon is called the 'uncertainty threshold principle'. The implications of this phenomenon for the fields of modelling, identification, and adaptive control are discussed.
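
    The scalar analysis described above can be sketched numerically. Below is a hedged sketch, assuming the standard scalar stochastic Riccati recursion for x_{k+1} = a_k x_k + b_k u_k with white random parameters (a_k, b_k); the threshold condition E[a^2] - (E[ab])^2/E[b^2] < 1 and all numerical values are illustrative assumptions, not quoted from the paper.

```python
# Hedged sketch: scalar stochastic Riccati recursion for the LQ problem
# with random parameters. K is the quadratic cost-to-go coefficient.

def riccati_iterates(Ea2, Eb2, Eab, q=1.0, r=1.0, steps=200):
    """Iterate K <- q + E[a^2] K - (E[ab] K)^2 / (r + E[b^2] K)."""
    K = q
    out = [K]
    for _ in range(steps):
        K = q + Ea2 * K - (Eab * K) ** 2 / (r + Eb2 * K)
        out.append(K)
    return out

# Below the assumed threshold (E[a^2] - E[ab]^2/E[b^2] = 0.56 < 1):
# the iterates stay bounded and settle near a fixed point.
below = riccati_iterates(1.2, 1.0, 0.8)
# Above it (2.5 - 1.0 = 1.5 > 1): the iterates grow without bound,
# so no steady-state optimal decision rule exists.
above = riccati_iterates(2.5, 1.0, 1.0)
```

    The first sequence converges to a finite cost-to-go while the second diverges geometrically, mirroring the existence/non-existence dichotomy the abstract describes.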

  11. Astronomical Tests of Relativity: Beyond Parameterized Post-Newtonian Formalism (PPN), to Testing Fundamental Principles

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik

    2009-05-01

    By the early 1970s, the improved accuracy of astrometric and time measurements enabled researchers not only to experimentally compare relativistic gravity with the Newtonian predictions, but also to compare different relativistic gravitational theories (e.g., the Brans-Dicke Scalar-Tensor Theory of Gravitation). For this comparison, Kip Thorne and others developed the Parameterized Post-Newtonian Formalism (PPN), and derived the dependence of different astronomically observable effects on the values of the corresponding parameters. Since then, all the observations have confirmed General Relativity. In other words, the question of which relativistic gravitation theory is in the best accordance with the experiments has been largely settled. This does not mean that General Relativity is the final theory of gravitation: it needs to be reconciled with quantum physics (into quantum gravity), it may also need to be reconciled with numerous surprising cosmological observations, etc. It is therefore reasonable to prepare an extended version of the PPN formalism that will enable us to test possible quantum-related modifications of General Relativity. In particular, we need to include the possibility of violating fundamental principles that underlie the PPN formalism but that may be violated in quantum physics, such as scale-invariance, T-invariance, P-invariance, energy conservation, spatial isotropy, etc. In this talk, we present the first attempt to design the corresponding extended PPN formalism, with a (partial) analysis of the relation between the corresponding fundamental physical principles.

  12. Astronomical tests of relativity: beyond parameterized post-Newtonian formalism (PPN), to testing fundamental principles

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik

    2010-01-01

    By the early 1970s, the improved accuracy of astrometric and time measurements enabled researchers not only to experimentally compare relativistic gravity with the Newtonian predictions, but also to compare different relativistic gravitational theories (e.g., the Brans-Dicke Scalar-Tensor Theory of Gravitation). For this comparison, Kip Thorne and others developed the Parameterized Post-Newtonian Formalism (PPN), and derived the dependence of different astronomically observable effects on the values of the corresponding parameters. Since then, all the observations have confirmed General Relativity. In other words, the question of which relativistic gravitation theory is in the best accordance with the experiments has been largely settled. This does not mean that General Relativity is the final theory of gravitation: it needs to be reconciled with quantum physics (into quantum gravity), it may also need to be reconciled with numerous surprising cosmological observations, etc. It is, therefore, reasonable to prepare an extended version of the PPN formalism that will enable us to test possible quantum-related modifications of General Relativity. In particular, we need to include the possibility of violating fundamental principles that underlie the PPN formalism but that may be violated in quantum physics, such as scale-invariance, T-invariance, P-invariance, energy conservation, spatial isotropy, etc. In this paper, we present the first attempt to design the corresponding extended PPN formalism, with a (partial) analysis of the relation between the corresponding fundamental physical principles.

  13. Polynomial-time algorithms for the integer minimal principle for centrosymmetric structures.

    PubMed

    Vaia, Anastasia; Sahinidis, Nikolaos V

    2005-07-01

    The minimal principle for structure determination from single-crystal X-ray diffraction measurements has recently been formulated as an integer linear optimization model for the case of centrosymmetric structures. Solution of this model via established combinatorial branch-and-bound algorithms provides the true global minimum of the minimal principle while operating exclusively in reciprocal space. However, integer programming techniques may require an exponential number of iterations to exhaust the search space. In this paper, a new approach is developed to solve the integer minimal principle to global optimality without requiring the solution of an optimization problem. Instead, properties of the solution of the optimization problem, as observed in a large number of computational experiments, are exploited in order to reduce the optimization formulation to a system of linear equations over the field of two elements, GF(2). Two specialized Gaussian elimination algorithms are then developed to solve this system of equations in polynomial time in the number of atoms. Computational results on a collection of 38 structures demonstrate that the proposed approach provides very fast and accurate solutions to the phase problem for centrosymmetric structures. This approach also provided much better crystallographic R values than SHELXS for all 38 structures tested. PMID:15972998
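
    The key computational step the abstract describes, Gaussian elimination over the field of two elements, can be sketched as follows. Addition in GF(2) is XOR, which is what makes the polynomial-time solve possible. The tiny system here is a hypothetical stand-in; the actual equations come from the diffraction data and are not reproduced.

```python
# Hedged sketch: Gaussian elimination over GF(2). Each row of A is packed
# into an integer bitmask; row addition mod 2 is a single XOR.

def gf2_solve(rows, rhs, n):
    """Solve A x = b over GF(2); free variables are set to 0.
    Assumes the system is consistent (no 0 = 1 check is done here)."""
    rows, rhs = list(rows), list(rhs)
    pivot_of = {}                       # pivot column -> row index
    r = 0
    for col in range(n):
        pivot = next((i for i in range(r, len(rows)) if rows[i] >> col & 1), None)
        if pivot is None:
            continue                    # no pivot: free variable
        rows[r], rows[pivot] = rows[pivot], rows[r]
        rhs[r], rhs[pivot] = rhs[pivot], rhs[r]
        for i in range(len(rows)):      # full reduction above and below
            if i != r and rows[i] >> col & 1:
                rows[i] ^= rows[r]
                rhs[i] ^= rhs[r]
        pivot_of[col] = r
        r += 1
    x = [0] * n
    for col, row in pivot_of.items():
        x[col] = rhs[row]
    return x

# Hypothetical 3-variable system: x0+x1=1, x1+x2=0, x0+x2=1 (mod 2).
solution = gf2_solve([0b011, 0b110, 0b101], [1, 0, 1], 3)
```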

  14. Contemporary extracorporeal membrane oxygenation therapy in adults: Fundamental principles and systematic review of the evidence.

    PubMed

    Squiers, John J; Lima, Brian; DiMaio, J Michael

    2016-07-01

    Extracorporeal membrane oxygenation (ECMO) provides days to weeks of support for patients with respiratory, cardiac, or combined cardiopulmonary failure. Since ECMO was first reported in 1974, nearly 70,000 runs of ECMO have been implemented, and the use of ECMO in adults increased by more than 400% from 2006 to 2011 in the United States. A variety of factors, including the 2009 influenza A epidemic, results from recent clinical trials, and improvements in ECMO technology, have motivated this increased use in adults. Because ECMO is increasingly becoming available to a diverse population of critically ill patients, we provide an overview of its fundamental principles and a systematic review of the evidence base for this treatment modality across a variety of indications in adults. PMID:27060027

  15. Position-sensitive detection of slow neutrons: Survey of fundamental principles

    SciTech Connect

    Crawford, R.K.

    1992-07-01

    This paper sets forth the fundamental principles governing the development of position-sensitive detection systems for slow neutrons. Since neutrons are only weakly interacting with most materials, it is not generally practical to detect slow neutrons directly. Therefore all practical slow neutron detection mechanisms depend on the use of nuclear reactions to "convert" the neutron to one or more charged particles, followed by the subsequent detection of the charged particles. The different conversion reactions which can be used are discussed, along with the relative merits of each. This is followed with a discussion of the various methods of charged particle detection, how these lend themselves to position-sensitive encoding, and the means of position encoding which can be applied to each case. Detector performance characteristics which may be of importance to the end user are discussed and related to these various detection and position-encoding mechanisms.

  16. Position-sensitive detection of slow neutrons: Survey of fundamental principles

    SciTech Connect

    Crawford, R.K.

    1992-01-01

    This paper sets forth the fundamental principles governing the development of position-sensitive detection systems for slow neutrons. Since neutrons are only weakly interacting with most materials, it is not generally practical to detect slow neutrons directly. Therefore all practical slow neutron detection mechanisms depend on the use of nuclear reactions to "convert" the neutron to one or more charged particles, followed by the subsequent detection of the charged particles. The different conversion reactions which can be used are discussed, along with the relative merits of each. This is followed with a discussion of the various methods of charged particle detection, how these lend themselves to position-sensitive encoding, and the means of position encoding which can be applied to each case. Detector performance characteristics which may be of importance to the end user are discussed and related to these various detection and position-encoding mechanisms.

  17. Minimization principles for the coupled problem of Darcy-Biot-type fluid transport in porous media linked to phase field modeling of fracture

    NASA Astrophysics Data System (ADS)

    Miehe, Christian; Mauthe, Steffen; Teichtmeister, Stephan

    2015-09-01

    This work develops new minimization and saddle point principles for the coupled problem of Darcy-Biot-type fluid transport in porous media at fracture. It shows that the quasi-static problem of elastically deforming, fluid-saturated porous media is related to a minimization principle for the evolution problem. This two-field principle determines the rate of deformation and the fluid mass flux vector. It provides a canonically compact model structure, where the stress equilibrium and the inverse Darcy's law appear as the Euler equations of a variational statement. A Legendre transformation of the dissipation potential relates the minimization principle to a characteristic three-field saddle point principle, whose Euler equations determine the evolutions of deformation and fluid content as well as Darcy's law. A further geometric assumption results in modified variational principles for a simplified theory, where the fluid content is linked to the volumetric deformation. The existence of these variational principles underlines inherent symmetries of Darcy-Biot theories of porous media. This can be exploited in the numerical implementation by the construction of time- and space-discrete variational principles, which fully determine the update problems of typical time stepping schemes. Here, the proposed minimization principle for the coupled problem is advantageous with regard to a new unconstrained stable finite element design, while space discretizations of the saddle point principles are constrained by the LBB condition. The variational principles developed provide the most fundamental approach to the discretization of nonlinear fluid-structure interactions, showing symmetric systems in algebraic update procedures. They also provide an excellent starting point for extensions towards more complex problems. This is demonstrated by developing a minimization principle for a phase field description of fracture in fluid-saturated porous media. It is designed for an

  18. The fundamental operating principles of electronic root canal length measurement devices.

    PubMed

    Nekoofar, M H; Ghandi, M M; Hayes, S J; Dummer, P M H

    2006-08-01

    It is generally accepted that root canal treatment procedures should be confined within the root canal system. To achieve this objective the canal terminus must be detected accurately during canal preparation and precise control of working length during the process must be maintained. Several techniques have been used for determining the apical canal terminus including electronic methods. However, the fundamental electronic operating principles and classification of the electronic devices used in this method are often unknown and a matter of controversy. The basic assumption with all electronic length measuring devices is that human tissues have certain characteristics that can be modelled by a combination of electrical components. Therefore, by measuring the electrical properties of the model, such as resistance and impedance, it should be possible to detect the canal terminus. The root canal system is surrounded by dentine and cementum that are insulators to electrical current. At the minor apical foramen, however, there is a small hole in which conductive materials within the canal space (tissue, fluid) are electrically connected to the periodontal ligament that is itself a conductor of electric current. Thus, dentine, along with tissue and fluid inside the canal, forms a resistor, the value of which depends on their dimensions, and their inherent resistivity. When an endodontic file penetrates inside the canal and approaches the minor apical foramen, the resistance between the endodontic file and the foramen decreases, because the effective length of the resistive material (dentine, tissue, fluid) decreases. As well as resistive properties, the structure of the tooth root has capacitive characteristics. Therefore, various electronic methods have been developed that use a variety of other principles to detect the canal terminus. Whilst the simplest devices measure resistance, other devices measure impedance using either high frequency, two frequencies, or
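
    The resistance argument above admits a toy illustration: if the conductive column between the file tip and the minor apical foramen is modelled as a uniform resistor, resistance falls in proportion to the remaining length. The resistivity and cross-section values below are purely illustrative, not clinical measurements.

```python
# Toy model (illustrative values): R = rho * L / A for the conductive
# column of tissue/fluid between the file tip and the apical foramen.

def canal_resistance(remaining_mm, resistivity_ohm_mm=6.5e3, area_mm2=0.2):
    """Resistance of the remaining conductive column, in ohms."""
    return resistivity_ohm_mm * remaining_mm / area_mm2

# As the file advances (remaining length shrinks), resistance drops;
# this falling resistance is what a resistance-based apex locator tracks.
readings = [canal_resistance(d) for d in (5.0, 3.0, 1.0, 0.1)]
```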

  19. Driving an Active Vibration Balancer to Minimize Vibrations at the Fundamental and Harmonic Frequencies

    NASA Technical Reports Server (NTRS)

    Holliday, Ezekiel S. (Inventor)

    2014-01-01

    Vibrations of a principal machine are reduced at the fundamental and harmonic frequencies by driving the drive motor of an active balancer with balancing signals at the fundamental and selected harmonics. Vibrations are sensed to provide a signal representing the mechanical vibrations. A balancing signal generator for the fundamental and for each selected harmonic processes the sensed vibration signal with adaptive filter algorithms of adaptive filters for each frequency to generate a balancing signal for each frequency. Reference inputs for each frequency are applied to the adaptive filter algorithms of each balancing signal generator at the frequency assigned to the generator. The harmonic balancing signals for all of the frequencies are summed and applied to drive the drive motor. The harmonic balancing signals drive the drive motor with a drive voltage component in opposition to the vibration at each frequency.
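
    The scheme described above can be sketched with a per-frequency two-weight adaptive filter: each filter receives sine/cosine reference inputs at its assigned frequency, the balancing signals are summed into one drive, and each filter adapts on the sensed residual vibration. The LMS update rule, the unity response from drive to sensed vibration, and all frequencies and gains are illustrative assumptions rather than the patent's specification.

```python
import math

# Hedged sketch: harmonic vibration balancing with one two-weight LMS
# adaptive filter per frequency (fundamental + selected harmonic).

def simulate(num_steps, freqs, amps, fs=1000.0, mu=0.05):
    """Return the residual vibration while the balancer adapts."""
    w = [[0.0, 0.0] for _ in freqs]        # (sin, cos) weights per frequency
    residuals = []
    for n in range(num_steps):
        t = n / fs
        vib = sum(a * math.sin(2 * math.pi * f * t) for f, a in zip(freqs, amps))
        refs = [(math.sin(2 * math.pi * f * t), math.cos(2 * math.pi * f * t))
                for f in freqs]
        # summed balancing signal drives the motor (unity plant assumed)
        drive = sum(wi[0] * s + wi[1] * c for wi, (s, c) in zip(w, refs))
        residual = vib - drive             # sensed vibration after cancellation
        for wi, (s, c) in zip(w, refs):    # LMS update per frequency
            wi[0] += mu * residual * s
            wi[1] += mu * residual * c
        residuals.append(residual)
    return residuals

# Fundamental at 50 Hz plus its 2nd harmonic at 100 Hz (illustrative).
res = simulate(4000, [50.0, 100.0], [1.0, 0.5])
```

    In this sketch the residual decays toward zero as each filter's weights converge on the amplitude and phase of its vibration component.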

  20. Mobility analysis tool based on the fundamental principle of conservation of energy.

    SciTech Connect

    Spletzer, Barry Louis; Nho, Hyuchul C.; Salton, Jonathan Robert

    2007-08-01

    In the past decade, a great deal of effort has been focused on research and development of versatile robotic ground vehicles without understanding their performance in a particular operating environment. As the usage of robotic ground vehicles for intelligence applications increases, understanding mobility of the vehicles becomes critical to increase the probability of their successful operations. This paper describes a framework based on conservation of energy to predict the maximum mobility of robotic ground vehicles over general terrain. The basis of the prediction is the difference between traction capability and energy loss at the vehicle-terrain interface. The mission success of a robotic ground vehicle is primarily a function of mobility. Mobility of a vehicle is defined as the overall capability of a vehicle to move from place to place while retaining its ability to perform its primary mission. A mobility analysis tool based on the fundamental principle of conservation of energy is described in this document. The tool is a graphical user interface application. The mobility analysis tool has been developed at Sandia National Laboratories, Albuquerque, NM. The tool is at an initial stage of development. In the future, the tool will be expanded to include all vehicles and terrain types.
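
    The energy-balance idea can be illustrated with a minimal traversal test: motion over a terrain segment is feasible where the available traction work exceeds the motion-resistance losses plus the gain in potential energy. The Coulomb-style coefficients below are illustrative assumptions, not parameters of the Sandia tool.

```python
import math

# Hedged sketch: go/no-go mobility check from conservation of energy.
# Traction work available at the vehicle-terrain interface must cover
# resistance losses plus the climb in potential energy.

def can_traverse(mass_kg, mu_traction, c_resist, slope_deg, distance_m, g=9.81):
    theta = math.radians(slope_deg)
    normal = mass_kg * g * math.cos(theta)           # normal load on terrain
    traction_work = mu_traction * normal * distance_m
    losses = c_resist * normal * distance_m          # rolling/soil resistance
    climb = mass_kg * g * math.sin(theta) * distance_m
    return traction_work >= losses + climb

# A 50 kg robot on level ground vs. a 45-degree slope (illustrative numbers).
flat_ok = can_traverse(50.0, 0.6, 0.1, 0.0, 10.0)
steep_ok = can_traverse(50.0, 0.6, 0.1, 45.0, 10.0)
```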

  1. [Fundamental ethical principles in the European framework programmes for research and development].

    PubMed

    Hirsch, François; Karatzas, Isidoros; Zilgalvis, Pēteris

    2009-01-01

    The European Commission is one of the most important international funding bodies for research conducted in Europe and beyond, including in developing countries and countries in transition. Through its framework programmes for research and development, the European Union finances a vast array of projects in fields affecting citizens' health, as well as researchers' mobility, the development of new technologies and the safeguarding of the environment. With the agreement of the European Parliament and the Council of Ministers, the two decision-making authorities of the European Union, the 7th Framework Programme was launched in December 2006. This programme has a budget of 54 billion euros to be distributed over a seven-year period. With it, the European Union aims to meet the target set by the Lisbon European Council (March 2000) of devoting 3% of the Member States' GDP to research and development. One of the important conditions set by the Members of the European Parliament for allocating this funding is to ensure that "the funded research activities respect the fundamental ethical principles". In this article, we discuss this aspect of the evaluation. PMID:19765393

  2. Designing nanomaterials to maximize performance and minimize undesirable implications guided by the Principles of Green Chemistry.

    PubMed

    Gilbertson, Leanne M; Zimmerman, Julie B; Plata, Desiree L; Hutchison, James E; Anastas, Paul T

    2015-08-21

    The Twelve Principles of Green Chemistry were first published in 1998 and provide a framework that has been adopted not only by chemists, but also by design practitioners and decision-makers (e.g., materials scientists and regulators). The development of the Principles was initially motivated by the need to address decades of unintended environmental pollution and human health impacts from the production and use of hazardous chemicals. Yet, for over a decade now, the Principles have been applied to the synthesis and production of engineered nanomaterials (ENMs) and the products they enable. While the combined efforts of the global scientific community have led to promising advances in the field of nanotechnology, there remain significant research gaps and the opportunity to leverage the potential global economic, societal and environmental benefits of ENMs safely and sustainably. As such, this tutorial review benchmarks the successes to date and identifies critical research gaps to be considered as future opportunities for the community to address. A sustainable material design framework is proposed that emphasizes the importance of establishing structure-property-function (SPF) and structure-property-hazard (SPH) relationships to guide the rational design of ENMs. The goal is to achieve or exceed the functional performance of current materials and the technologies they enable, while minimizing inherent hazard to avoid risk to human health and the environment at all stages of the life cycle. PMID:25955514

  3. Simple approach to sediment provenance tracing using element analysis and fundamental principles

    NASA Astrophysics Data System (ADS)

    Matys Grygar, Tomas; Elznicova, Jitka; Popelka, Jan

    2016-04-01

    Common sediment fingerprinting techniques use either (1) extensive analytical datasets, sometimes nearly complete with respect to accessible characterization techniques, processed by multidimensional statistics that rest on certain assumptions about the distribution functions of the analytical results and the conservativeness/additivity of some components, or (2) analytically demanding characteristics such as isotope ratios, assumed to be unequivocal "labels" of the parent material unaltered by any catchment process. The inherent problem with approach (1) is that the interpretation of statistical components ("sources") is done ex post and remains purely formal. The problem with approach (2) is that catchment processes (weathering, transport, deposition) can modify most geochemical parameters of soils and sediments; in other words, the idea that some geochemical parameters are "conservative" may be idealistic. Grain-size effects and sediment provenance jointly influence the chemical composition of fluvial sediments and are indeed not easy to disentangle. Attempts to separate these two main components using statistics alone seem risky and equivocal, because the grain-size dependence of element composition is nearly individual for each element and reflects sediment maturity and catchment-specific formation and transport processes. We suppose that the use of less extensive datasets of analytical results, interpreted with respect for fundamental principles, should be more robust than purely statistical tools applied to overwhelming datasets. We examined sediment composition, both published by other researchers and gathered by us, and we found some general principles which are, in our opinion, relevant for fingerprinting: (1) concentrations of all elements are grain-size sensitive, i.e. there are no "conservative" elements in the conventional sense of provenance or transport-pathway tracing; (2) fractionation by catchment processes and fluvial transport changes

  4. D^{1,2}(R^N) versus C(R^N) local minimizer and a Hopf-type maximum principle

    NASA Astrophysics Data System (ADS)

    Carl, Siegfried; Costa, David G.; Tehrani, Hossein

    2016-08-01

    We consider functionals of the form Φ(u) = (1/2)∫_{R^N} |∇u|² − ∫_{R^N} b(x)G(u) on D^{1,2}(R^N), N ≥ 3, whose critical points are the weak solutions of a corresponding elliptic equation in the whole of R^N. We present a Brezis-Nirenberg-type result and a Hopf-type maximum principle in the context of the space D^{1,2}(R^N). More precisely, we prove that a local minimizer of Φ in the topology of the subspace V must be a local minimizer of Φ in the D^{1,2}(R^N) topology, where V := { v ∈ D^{1,2}(R^N) : v ∈ C(R^N) with sup_{x ∈ R^N} (1 + |x|^{N−2}) |v(x)| < ∞ }. It is well known that the Brezis-Nirenberg result has proved to be a strong tool in the study of multiple solutions for elliptic boundary value problems in bounded domains. We believe that the result obtained in this paper may play a similar role for elliptic problems in R^N.

  5. Prediction of Metabolic Flux Distribution from Gene Expression Data Based on the Flux Minimization Principle

    PubMed Central

    Song, Hyun-Seob; Reifman, Jaques; Wallqvist, Anders

    2014-01-01

    Prediction of possible flux distributions in a metabolic network provides detailed phenotypic information that links metabolism to cellular physiology. To estimate metabolic steady-state fluxes, the most common approach is to solve a set of macroscopic mass balance equations subjected to stoichiometric constraints while attempting to optimize an assumed optimal objective function. This assumption is justifiable in specific cases but may be invalid when tested across different conditions, cell populations, or other organisms. With an aim to providing a more consistent and reliable prediction of flux distributions over a wide range of conditions, in this article we propose a framework that uses the flux minimization principle to predict active metabolic pathways from mRNA expression data. The proposed algorithm minimizes a weighted sum of flux magnitudes, while biomass production can be bounded to fit an ample range from very low to very high values according to the analyzed context. We have formulated the flux weights as a function of the corresponding enzyme reaction's gene expression value, enabling the creation of context-specific fluxes based on a generic metabolic network. In case studies of wild-type Saccharomyces cerevisiae, and wild-type and mutant Escherichia coli strains, our method achieved high prediction accuracy, as gauged by correlation coefficients and sums of squared error, with respect to the experimentally measured values. In contrast to other approaches, our method was able to provide quantitative predictions for both model organisms under a variety of conditions. Our approach requires no prior knowledge or assumption of a context-specific metabolic functionality and does not require trial-and-error parameter adjustments. Thus, our framework is of general applicability for modeling the transcription-dependent metabolism of bacteria and yeasts. PMID:25397773
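
    The formulation can be sketched as a small linear program: minimize a weighted sum of flux magnitudes subject to steady-state mass balance S·v = 0, with the export (biomass proxy) flux pinned and weights derived from expression data. The two-pathway toy network, the expression values, and the choice weight = 1/expression are illustrative assumptions, not the paper's exact weighting scheme. Requires NumPy and SciPy.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A, two parallel pathways A -> B, export B ->.
# Columns: v_up, v_path1, v_path2, v_export (all irreversible here,
# so |v| = v and the weighted flux minimization is a plain LP).
S = np.array([[1.0, -1.0, -1.0,  0.0],    # mass balance on metabolite A
              [0.0,  1.0,  1.0, -1.0]])   # mass balance on metabolite B

expression = np.array([1.0, 8.0, 2.0, 1.0])  # toy mRNA levels per reaction
weights = 1.0 / expression                   # highly expressed => cheap flux
weights[[0, 3]] = 0.0                        # don't penalize exchange fluxes

bounds = [(0, 10), (0, 10), (0, 10), (1, 1)]  # export flux pinned at 1

res = linprog(c=weights, A_eq=S, b_eq=np.zeros(2), bounds=bounds,
              method="highs")
v = res.x  # optimal flux distribution
```

    With these weights the optimum routes the entire unit of export flux through the highly expressed pathway (v_path1 = 1, v_path2 = 0), which is the qualitative behaviour the expression-weighted objective is meant to produce.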

  6. Emergent features and perceptual objects: re-examining fundamental principles in analogical display design.

    PubMed

    Holt, Jerred; Bennett, Kevin B; Flach, John M

    2015-01-01

    Two sets of design principles for analogical visual displays, based on the concepts of emergent features and perceptual objects, are described. An interpretation of previous empirical findings for three displays (bar graph, polar graphic, alphanumeric) is provided from both perspectives. A fourth display (configural coordinate) was designed using principles of ecological interface design (i.e. direct perception). An experiment was conducted to evaluate performance (accuracy and latency of state identification) with these four displays. Numerous significant effects were obtained and a clear rank ordering of performance emerged (from best to worst): configural coordinate, bar graph, alphanumeric and polar graphic. These findings are consistent with principles of design based on emergent features; they are inconsistent with principles based on perceptual objects. Some limitations of the configural coordinate display are discussed and a redesign is provided. Practitioner Summary: Principles of ecological interface design, which emphasise the quality of very specific mappings between domain, display and observer constraints, are described; these principles are applicable to the design of all analogical graphical displays. PMID:26218496

  7. Developing a Dynamics and Vibrations Course for Civil Engineering Students Based on Fundamental-Principles

    ERIC Educational Resources Information Center

    Barroso, Luciana R.; Morgan, James R.

    2012-01-01

    This paper describes the creation and evolution of an undergraduate dynamics and vibrations course for civil engineering students. Incorporating vibrations into the course allows students to see and study "real" civil engineering applications of the course content. This connection of academic principles to real life situations is in…

  8. The Uncertainty Threshold Principle: Some Fundamental Limitations of Optimal Decision Making Under Dynamic Uncertainity

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.
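The scalar linear-quadratic example can be reproduced numerically. The recursion below is a sketch for a special case, assumed here for illustration: a deterministic unit input gain and a random state coefficient with mean m and variance s2, where the infinite-horizon cost-to-go x^2*K exists only for s2 below 1:

```python
# Scalar system x_{k+1} = a x_k + u_k, with a drawn independently each step
# (mean m, variance s2), and stage cost x^2 + u^2.  The cost-to-go x^2 K
# obeys the dynamic-programming recursion below; as K grows, the map behaves
# like K -> 1 + s2*K, so the infinite-horizon limit exists only if s2 < 1.
def riccati_limit(m, s2, steps=200):
    Ea2 = m * m + s2  # E[a^2]
    K = 0.0
    for _ in range(steps):
        K = 1.0 + Ea2 * K - (m * K) ** 2 / (1.0 + K)
    return K

K_low = riccati_limit(m=0.5, s2=0.5)    # below the threshold: K converges
K_high = riccati_limit(m=0.5, s2=1.5)   # above the threshold: K diverges
print(K_low, K_high)
```

The divergence above the threshold is the uncertainty threshold principle in miniature: no amount of extra horizon helps once parameter uncertainty is too large.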

  10. Fundamental principles of hollow-cathode-discharge operations in space and the design of a rocket-borne demonstration

    NASA Technical Reports Server (NTRS)

    Szuszczewicz, Edward P.; Mccoy, James E.; Bonifazi, Carlo; Dobrowolny, Marino

    1988-01-01

    The issue of hollow-cathode operations in space is treated from the point of view of fundamental principles of plasma interactions and their control over currents involving the device, the spaceborne vehicle, and the ambient space plasma. Particular attention is given to collective plasma processes, the effects of the ambient magnetic field, and the high probability of plasma turbulence triggered by hollow-cathode operations. The paper presents a rocket payload and experiment scenario designed for accommodation on a Black Brant booster, launched from a midlatitude site to an apogee in excess of 400 km.

  11. Bench-to-bedside review: Fundamental principles of acid-base physiology

    PubMed Central

    Corey, Howard E

    2005-01-01

    Complex acid–base disorders arise frequently in critically ill patients, especially in those with multiorgan failure. In order to diagnose and treat these disorders better, some intensivists have abandoned traditional theories in favor of revisionist models of acid–base balance. With claimed superiority over the traditional approach, the new methods have rekindled debate over the fundamental principles of acid–base physiology. In order to shed light on this controversy, we review the derivation and application of new models of acid–base balance. PMID:15774076

  12. Al-Air Batteries: Fundamental Thermodynamic Limitations from First Principles Theory

    NASA Astrophysics Data System (ADS)

    Chen, Leanne D.; Noerskov, Jens K.; Luntz, Alan C.

    2015-03-01

    The Al-air battery possesses high theoretical specific energy (4140 Wh/kg) and is therefore an attractive candidate for vehicle propulsion applications. However, the experimentally observed open-circuit potential is much lower than what thermodynamics predicts, and this potential loss is widely believed to be an effect of corrosion. We present a detailed study of the Al-air battery using density functional theory. The results suggest that the difference between bulk thermodynamic and surface potentials is due to both the effects of asymmetry in multi-electron transfer reactions that define the anodic dissolution of Al and, more importantly, a large chemical step inherent to the formation of bulk Al(OH)3 from surface intermediates. The former results in an energy loss of 3%, while the latter accounts for 14-29% of the total thermodynamic energy depending on the surface site where dissolution occurs. Therefore, the maximum open-circuit potential of the Al anode is only -1.87 V vs. SHE in the absence of thermal excitations, contrary to -2.34 V predicted by bulk thermodynamics at pH 14.6. This is a fundamental limitation of the system and governs the maximum output potential, which cannot be improved even if corrosion effects were completely suppressed. Supported by the Natural Sciences and Engineering Research Council of Canada and the ReLiable Project (#11-116792) funded by the Danish Council for Strategic Research.

  13. Al-Air Batteries: Fundamental Thermodynamic Limitations from First-Principles Theory.

    PubMed

    Chen, Leanne D; Nørskov, Jens K; Luntz, Alan C

    2015-01-01

    The Al-air battery possesses high theoretical specific energy (4140 W h/kg) and is therefore an attractive candidate for vehicle propulsion. However, the experimentally observed open-circuit potential is much lower than what bulk thermodynamics predicts, and this potential loss is typically attributed to corrosion. Similarly, large Tafel slopes associated with the battery are assumed to be due to film formation. We present a detailed thermodynamic study of the Al-air battery using density functional theory. The results suggest that the maximum open-circuit potential of the Al anode is only -1.87 V versus the standard hydrogen electrode at pH 14.6 instead of the traditionally assumed -2.34 V and that large Tafel slopes are inherent in the electrochemistry. These deviations from the bulk thermodynamics are intrinsic to the electrochemical surface processes that define Al anodic dissolution. This has contributions from both asymmetry in multielectron transfers and, more importantly, a large chemical stabilization inherent to the formation of bulk Al(OH)3 from surface intermediates. These are fundamental limitations that cannot be improved even if corrosion and film effects are completely suppressed. PMID:26263108

  14. A covariant action principle for dissipative fluid dynamics: from formalism to fundamental physics

    NASA Astrophysics Data System (ADS)

    Andersson, N.; Comer, G. L.

    2015-04-01

    We present a new variational framework for dissipative general relativistic fluid dynamics. The model extends the convective variational principle for multi-fluid systems to account for a range of dissipation channels. The key ingredients in the construction are (i) the use of a lower dimensional matter space for each fluid component, and (ii) an extended functional dependence for the associated volume forms. In an effort to make the concepts clear, the formalism is developed step-by-step with model examples considered at each level. Thus we consider a model for heat flow, derive the relativistic Navier-Stokes equations and discuss why the individual dissipative stress tensors need not be spacetime symmetric. We argue that the new formalism, which notably does not involve an expansion away from an assumed equilibrium state, provides a conceptual breakthrough in this area of research. We also provide an ambitious list of directions in which one may want to extend it in the future. This involves an exciting set of problems, relating to both applications and foundational issues.

  15. First Principles Studies of Tapered Silicon Nanowires: Fundamental Insights and Practical Applications

    NASA Astrophysics Data System (ADS)

    Wu, Zhigang

    2008-03-01

    Nanowires (NWs) are often observed experimentally to be tapered rather than straight-edged, with diameters (d) shrinking by as much as 1 nm per 10 nm of vertical growth. Previous theoretical studies have examined the electronic properties of straight-edged nanowires (SNWs), although the effects of tapering on quantum confinement may be of both fundamental and practical importance. We have employed ab initio calculations to study the structural and electronic properties of tapered Si NWs. As one may expect, tapered nanowires (TNWs) possess axially-dependent electronic properties; their local energy gaps vary along the wire axis, with the largest gap occurring at the narrowest point of the wire. In contrast to SNWs, where confinement tends to shift valence bands more than conduction bands away from the bulk gap, the unoccupied states in TNWs are much more sensitive to d than the occupied states. In addition, tapering causes the band-edge states to be spatially separated along the wire axis, a consequence of the interplay between a strong variation in quantum confinement strength with diameter and the tapering-induced charge transfer. This property may be exploited in electronic and optical applications, for example, in photovoltaic devices where the separation of the valence and conduction band states could be used to transport excited charges during the thermalization process. In order to gain insight into TNW photovoltaic properties, we have also carried out calculations of the dipole matrix elements near the band edges as well as the role of metal contacts on TNW electronic properties. Finally, a combination of ab initio total energy calculations and classical molecular dynamics (MD) simulations are employed to suggest a new technique for bringing nanoscale objects together to form ordered, ultra high-aspect ratio nanowires. This work was supported in part by the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.

  16. A Greatly Under-Appreciated Fundamental Principle of Physical Organic Chemistry

    PubMed Central

    Cox, Robin A.

    2011-01-01

    If a species does not have a finite lifetime in the reaction medium, it cannot be a mechanistic intermediate. This principle was first enunciated by Jencks, as the concept of an enforced mechanism. For instance, neither primary nor secondary carbocations have long enough lifetimes to exist in an aqueous medium, so SN1 reactions involving these substrates are not possible, and an SN2 mechanism is enforced. Only tertiary carbocations and those stabilized by resonance (benzyl cations, acylium ions) are stable enough to be reaction intermediates. More importantly, it is now known that neither H3O+ nor HO− exist as such in dilute aqueous solution. Several recent high-level calculations on large proton clusters are unable to localize the positive charge; it is found to be simply “on the cluster” as a whole. The lifetime of any ionized water species is exceedingly short, a few molecular vibrations at most; the best experimental study, using modern IR instrumentation, has the most probable hydrated proton structure as H13O6+, but only an estimated quarter of the protons are present even in this form at any given instant. Thanks to the Grotthuss mechanism of chain transfer along hydrogen bonds, in reality a proton or a hydroxide ion is simply instantly available anywhere it is needed for reaction. Important mechanistic consequences result. Any charged oxygen species (e.g., a tetrahedral intermediate) is also not going to exist long enough to be a reaction intermediate, unless the charge is stabilized in some way, usually by resonance. General acid catalysis is the rule in reactions in concentrated aqueous acids. The Grotthuss mechanism also means that reactions involving neutral water are favored; the solvent is already highly structured, so the entropy involved in bringing several solvent molecules to the reaction center is unimportant. Examples are given. PMID:22272074

  17. Structural phase transitions and fundamental band gaps of MgxZn1-xO alloys from first principles

    SciTech Connect

    Maznichenko, I. V.; Ernst, Arthur; Bouhassoune, M.; Henk, J.; Daene, Markus W; Lueders, Martin; Bruno, Patrick; Hergert, Wolfram; Mertig, I.; Szotek, Zdzislawa; Temmerman, Walter M

    2009-01-01

    The structural phase transitions and the fundamental band gaps of MgxZn1-xO alloys are investigated by detailed first-principles calculations in the entire range of Mg concentrations x, applying a multiple-scattering theoretical approach (Korringa-Kohn-Rostoker method). Disordered alloys are treated within the coherent-potential approximation. The calculations for various crystal phases have given rise to a phase diagram in good agreement with experiments and other theoretical approaches. The phase transition from the wurtzite to the rock-salt structure is predicted at the Mg concentration of x=0.33, which is close to the experimental value of 0.33-0.40. The size of the fundamental band gap, typically underestimated by the local-density approximation, is considerably improved by the self-interaction correction. The increase in the gap upon alloying ZnO with Mg corroborates experimental trends. Our findings are relevant for applications in optical, electrical, and, in particular, in magnetoelectric devices.

  18. On classes of non-Gaussian asymptotic minimizers in entropic uncertainty principles

    NASA Astrophysics Data System (ADS)

    Zozor, S.; Vignat, C.

    2007-03-01

    In this paper we revisit the Bialynicki-Birula and Mycielski uncertainty principle and its cases of equality. This Shannon entropic version of the well-known Heisenberg uncertainty principle can be used when dealing with variables that admit no variance. In this paper, we extend this uncertainty principle to Rényi entropies. We recall that in both the Shannon and Rényi cases, and for a given dimension n, the only case of equality occurs for Gaussian random vectors. We show that as n grows, however, the bound is also asymptotically attained in the cases of n-dimensional Student-t and Student-r distributions. A complete analytical study is performed in a special case of a Student-t distribution. We also show numerically that this effect exists for the particular case of an n-dimensional Cauchy variable, whatever the Rényi entropy considered, extending the results of Abe and illustrating the analytical asymptotic study of the Student-t case. In the Student-r case, we show numerically that the same behavior occurs for uniformly distributed vectors. These particular cases and other ones investigated in this paper are interesting since they show that this asymptotic behavior cannot be considered as a “Gaussianization” of the vector when the dimension increases.
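The Gaussian equality case of the Shannon-entropic (Bialynicki-Birula and Mycielski) bound can be checked directly from the closed-form entropy of a Gaussian. The sketch below assumes hbar = 1 and a 1-D wave packet, for which a position variance of sigma^2 implies a conjugate momentum variance of 1/(4 sigma^2):

```python
import math

def gaussian_entropy(var):
    # Differential (Shannon) entropy of a 1-D Gaussian density, in nats.
    return 0.5 * math.log(2 * math.pi * math.e * var)

sigma2 = 0.37  # any position variance will do; hbar = 1
H_x = gaussian_entropy(sigma2)               # position density |psi|^2
H_p = gaussian_entropy(1.0 / (4 * sigma2))   # conjugate momentum density
# Gaussians saturate the bound H_x + H_p >= ln(pi * e) in one dimension.
print(H_x + H_p, math.log(math.pi * math.e))
```

The sum is independent of sigma2, which is exactly the equality case the paper starts from before turning to Rényi entropies and the non-Gaussian asymptotic minimizers.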

  19. Quantum statistical entropy and minimal length of 5D Ricci-flat black string with generalized uncertainty principle

    SciTech Connect

    Liu Molin; Gui Yuanxing; Liu Hongya

    2008-12-15

    In this paper, we study the quantum statistical entropy in a 5D Ricci-flat black string solution, which contains a 4D Schwarzschild-de Sitter black hole on the brane, by using the improved thin-layer method with the generalized uncertainty principle. The entropy is the linear sum of the areas of the event horizon and the cosmological horizon without any cutoff and without any constraint on the bulk's configuration, in contrast to the case of the usual uncertainty principle. The system's density of states and free energy are convergent in the neighborhood of the horizons. The small-mass approximation is determined by the asymptotic behavior of the metric function near the horizons. Meanwhile, we obtain the minimal length of the position, Δx, which is constrained by the surface gravities and the thickness of the layer near the horizons.

  20. First-principles study of the minimal model of magnetic interactions in Fe-based superconductors

    NASA Astrophysics Data System (ADS)

    Glasbrenner, J. K.; Velev, J. P.; Mazin, I. I.

    2014-02-01

    Using noncollinear first-principles calculations, we perform a systematic study of the magnetic order in several families of ferropnictides. We find a fairly universal energy dependence on the magnetization order in all cases. Our results confirm that a simple Heisenberg model fails to account for the energy dependence of the magnetization in a couple of ways: first, a biquadratic term is present in all cases and, second, the magnetic moment softens depending on the orientation. We also find that hole doping substantially reduces the biquadratic contribution, although the antiferromagnetic stripe state remains stable within the whole range of doping concentrations, and thus the reported lack of the orthorhombicity in Na-doped BaFe2As2 is probably due to factors other than a sign reversal of the biquadratic term. Finally, we discover that even with the biquadratic term, there is a limit to the accuracy of mapping the density functional theory energetics onto Heisenberg-type models, independent of the range of the model.
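The mapping described above, fitting a Heisenberg term plus a biquadratic term to energy-versus-angle data, can be sketched as a least-squares fit. The angles, energies, and coefficients below are synthetic stand-ins for illustration, not the paper's DFT results:

```python
import numpy as np

# Synthetic "DFT" energies versus the relative angle theta between sublattice
# moments, generated from a Heisenberg + biquadratic model (values invented).
theta = np.linspace(0.0, np.pi, 25)
E0, J_true, K_true = -5.0, 40.0, -12.0  # meV per formula unit, hypothetical
E = E0 + J_true * np.cos(theta) + K_true * np.cos(theta) ** 2

# Least-squares fit of E(theta) = E0 + J cos(theta) + K cos^2(theta).
A = np.column_stack([np.ones_like(theta), np.cos(theta), np.cos(theta) ** 2])
coef, *_ = np.linalg.lstsq(A, E, rcond=None)
print(coef)  # recovers [E0, J, K] on this noiseless synthetic data
```

With real DFT energetics the residual of such a fit is one way to quantify the paper's point that Heisenberg-type models, even with the biquadratic term, map the density functional theory energetics only up to a limited accuracy.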

  1. Dielectric properties of agricultural products – fundamental principles, influencing factors, and measurement techniques. Chapter 4. Electrotechnologies for Food Processing: Book Series. Volume 3. Radio-Frequency Heating

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this chapter, definitions of dielectric properties, or permittivity, of materials and a brief discussion of the fundamental principles governing their behavior with respect to influencing factors are presented. The basic physics of the influence of frequency of the electric fields and temperature...

  2. The inactivation principle: mathematical solutions minimizing the absolute work and biological implications for the planning of arm movements.

    PubMed

    Berret, Bastien; Darlot, Christian; Jean, Frédéric; Pozzo, Thierry; Papaxanthis, Charalambos; Gauthier, Jean Paul

    2008-10-01

    An important question in the literature focusing on motor control is to determine which laws drive biological limb movements. This question has prompted numerous investigations analyzing arm movements in both humans and monkeys. Many theories assume that among all possible movements the one actually performed satisfies an optimality criterion. In the framework of optimal control theory, a first approach is to choose a cost function and test whether the proposed model fits with experimental data. A second approach (generally considered as the more difficult) is to infer the cost function from behavioral data. The cost proposed here includes a term called the absolute work of forces, reflecting the mechanical energy expenditure. Contrary to most investigations studying optimality principles of arm movements, this model has the particularity of using a cost function that is not smooth. First, a mathematical theory related to both direct and inverse optimal control approaches is presented. The first theoretical result is the Inactivation Principle, according to which minimizing a term similar to the absolute work implies simultaneous inactivation of agonistic and antagonistic muscles acting on a single joint, near the time of peak velocity. The second theoretical result is that, conversely, the presence of non-smoothness in the cost function is a necessary condition for the existence of such inactivation. Second, during an experimental study, participants were asked to perform fast vertical arm movements with one, two, and three degrees of freedom. Observed trajectories, velocity profiles, and final postures were accurately simulated by the model. Accordingly, electromyographic signals showed brief simultaneous inactivation of opposing muscles during movements. Thus, assuming that human movements are optimal with respect to a certain integral cost, the minimization of an absolute-work-like cost is supported by experimental observations. Such types of optimality

  3. Locomotor control of limb force switches from minimal intervention principle in early adaptation to noise reduction in late adaptation

    PubMed Central

    Selgrade, Brian P.

    2014-01-01

    During movement, errors are typically corrected only if they hinder performance. Preferential correction of task-relevant deviations is described by the minimal intervention principle but has not been demonstrated in the joints during locomotor adaptation. We studied hopping as a tractable model of locomotor adaptation of the joints within the context of a limb-force-specific task space. Subjects hopped while adapting to shifted visual feedback that induced them to increase peak ground reaction force (GRF). We hypothesized subjects would preferentially reduce task-relevant joint torque deviations over task-irrelevant deviations to increase peak GRF. We employed a modified uncontrolled manifold analysis to quantify task-relevant and task-irrelevant joint torque deviations for each individual hop cycle. As would be expected by the explicit goal of the task, peak GRF errors decreased in early adaptation before reaching steady state during late adaptation. Interestingly, during the early adaptation performance improvement phase, subjects reduced GRF errors by decreasing only the task-relevant joint torque deviations. In contrast, during the late adaptation performance maintenance phase, all torque deviations decreased in unison regardless of task relevance. In deadaptation, when the shift in visual feedback was removed, all torque deviations decreased in unison, possibly because performance improvement was too rapid to detect changes in only the task-relevant dimension. We conclude that limb force adaptation in hopping switches from a minimal intervention strategy during performance improvement to a noise reduction strategy during performance maintenance, which may represent a general control strategy for locomotor adaptation of limb force in other bouncing gaits, such as running. PMID:25475343
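The task-relevant versus task-irrelevant split used in an uncontrolled-manifold-style analysis can be sketched as projections onto the row space and null space of a Jacobian. The 1x3 Jacobian and deviation vector below are hypothetical numbers for illustration, not the study's data:

```python
import numpy as np

# Hypothetical 1x3 Jacobian mapping joint torque changes (hip, knee, ankle)
# to the change in peak vertical ground reaction force (GRF).
J = np.array([[0.8, 1.2, 0.5]])

tau_dev = np.array([0.3, -0.4, 0.6])  # one hop's joint torque deviation

P = np.linalg.pinv(J) @ J   # projector onto the row space of J
d_rel = P @ tau_dev          # task-relevant part: changes the GRF
d_irr = tau_dev - d_rel      # task-irrelevant (null-space) part: GRF-neutral
print(np.linalg.norm(d_rel), np.linalg.norm(d_irr))
```

Tracking the two norms across hop cycles is the kind of per-cycle decomposition that distinguishes a minimal-intervention strategy (only the task-relevant norm shrinks) from a noise-reduction strategy (both shrink together).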

  4. Lessons that Bear Repeating and Repeating that Bears Lessons: An Interdisciplinary Unit on Principles of Minimalism in Modern Music, Art, and Poetry (Grades 4-8)

    ERIC Educational Resources Information Center

    Smigel, Eric; McDonald, Nan L.

    2012-01-01

    This theory-to-practice article focuses on interdisciplinary classroom activities based on principles of minimalism in modern music, art, and poetry. A lesson sequence was designed for an inner-city Grades 4 and 5 general classroom of English language learners, where the unit was taught, assessed, and documented by the authors. Included in the…

  5. A review of the fundamentals of polymer-modified asphalts: Asphalt/polymer interactions and principles of compatibility.

    PubMed

    Polacco, Giovanni; Filippi, Sara; Merusi, Filippo; Stastna, George

    2015-10-01

    During the last decades, the number of vehicles per citizen, as well as traffic speed and load, has increased dramatically. This sudden and somewhat unplanned overloading has strongly shortened the life of pavements and increased maintenance costs and risks to users. In order to limit the deterioration of road networks, it is necessary to improve the quality and performance of pavements, which has been achieved through the addition of a polymer to the bituminous binder. Since their introduction in the second half of the twentieth century, polymer-modified asphalts have gained in importance, and they now play a fundamental role in the field of road paving. During high-temperature and high-shear mixing with asphalt, the polymer incorporates asphalt molecules, thereby forming a swollen network that involves the entire binder and results in a significant improvement of the viscoelastic properties in comparison with those of the unmodified binder. Such a process encounters the well-known difficulties related to the poor solubility of polymers, which limits the number of macromolecules able not only to form such a structure but also to maintain it during high-temperature storage in static conditions, which may be necessary before laying the binder. Therefore, polymer-modified asphalts have been the subject of numerous studies aimed at understanding and optimizing their structure and storage stability, which gradually attracted polymer scientists into this field initially explored by civil engineers. The analytical techniques of polymer science have been applied to polymer-modified asphalts, resulting in a good understanding of their internal structure. Nevertheless, the complexity and variability of asphalt composition render it nearly impossible to generalize the results and univocally predict the properties of a given polymer/asphalt pair. The aim of this paper is to review these aspects of polymer-modified asphalts, together with a brief description of...

  6. PET/CT: fundamental principles.

    PubMed

    Seemann, Marcus D

    2004-05-28

    Positron emission tomography (PET) facilitates the evaluation of metabolic and molecular characteristics of a wide variety of cancers, but is limited in its ability to visualize anatomical structures. Computed tomography (CT) facilitates the evaluation of anatomical structures of cancers, but cannot visualize their metabolic and molecular aspects. Therefore, the combination of PET and CT provides the ability to accurately register metabolic and molecular aspects of disease with anatomical findings, adding further information to the diagnosis and staging of tumors. The recent generation of high-performance PET/CT scanners combines a state-of-the-art full-ring 3D PET scanner and a high-end 16-slice CT scanner. In PET/CT scanners, a CT examination is used for attenuation correction of PET images rather than standard transmission scanning using 68Ge sources. This reduces the examination time, but metallic objects and contrast agents that alter the CT image quality and quantitative measurements of standardized uptake values (SUV) may lead to artifacts in the PET images. Hybrid PET/CT imaging will be very important in oncological applications in the decades to come, and possibly for use in cancer screening and cardiac imaging. PMID:15257877
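CT-based attenuation correction rests on rescaling CT attenuation values, measured at an effective energy of roughly 70 keV, to the 511 keV of annihilation photons. A common approach is a bilinear mapping from Hounsfield units; the coefficients below are approximate illustrative values, not a vendor calibration:

```python
def mu_511kev(hu, mu_water=0.096, bone_slope=5.1e-5):
    # Illustrative bilinear scaling (coefficients are approximate):
    # for HU <= 0, scale linearly between air (-1000 HU, mu = 0) and
    # water (0 HU); for HU > 0, use a shallower slope reflecting bone's
    # different energy dependence between CT energies and 511 keV.
    if hu <= 0:
        return mu_water * (hu + 1000.0) / 1000.0
    return mu_water + bone_slope * hu

for hu in (-1000, 0, 1000):  # air, water, dense bone
    print(hu, mu_511kev(hu))  # linear attenuation coefficient in 1/cm
```

Metal implants and concentrated contrast agents break this mapping, which is one reason such objects propagate artifacts from the CT into the attenuation-corrected PET images.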

  7. Animal and robot experiments to discover principles behind the evolution of a minimal locomotor apparatus for robust legged locomotion

    NASA Astrophysics Data System (ADS)

    McInroe, Benjamin; Astley, Henry; Kawano, Sandy; Blob, Richard; Goldman, Daniel I.

    2015-03-01

    In the evolutionary transition from an aquatic to a terrestrial environment, early walkers adapted to the challenges of locomotion on complex, flowable substrates (e.g. sand and mud). Our previous biological and robotic studies have demonstrated that locomotion on such substrates is sensitive to both limb morphology and kinematics. Although reconstructions of early vertebrate skeletal morphologies exist, the kinematic strategies required for successful locomotion by these organisms have not yet been explored. To gain insight into how early walkers contended with complex substrates, we developed a robotic model with appendage morphology inspired by a model analog organism, the mudskipper. We tested mudskippers and the robot on different substrates, including rigid ground and dry granular media, varying incline angle. The mudskippers moved effectively on all level substrates using a fin-driven gait. But as incline angle increased, the animals used their tails in concert with their fins to generate propulsion. Adding an actuated tail to the robot improved robustness, making possible locomotion on otherwise inaccessible inclines. With these discoveries, we are elucidating a minimal template that may have allowed the early walkers to adapt to locomotion on land. This work was supported by NSF PoLS.

  8. Transplant of bone marrow and cord blood hematopoietic stem cells in pediatric practice, revisited according to the fundamental principles of bioethics.

    PubMed

    Burgio, G R; Locatelli, F

    1997-06-01

    The two most widely used sources of hematopoietic stem cells for allogeneic transplants in pediatric practice are bone marrow (BM) and cord blood (CB). While bone marrow transplantation (BMT) is reaching its 30th year of application, human umbilical cord blood transplantation (HUCBT) is approaching its 10th. Although these procedures have basically the same purpose, a number of biological differences distinguish them. In particular, the intrinsically limited quantity of CB stem cells and their immunological naiveté confer peculiar characteristics to these hematopoietic progenitors. From a bioethical point of view, the problems which have repeatedly been raised when the BM donor is a child are well-known. Different but no less important ethical problems are raised when one considers HUCBT; in this regard the most important issues are the easier propensity of programming a CB donor in comparison with a BM donor (clearly due to the shorter time interval needed to collect the hematopoietic progenitors); the in utero HLA-typing; the implication of employing 'blood belonging to a neonate' for a third party; the need to perform a number of investigations both on the CB of the donor and on the mother and the implications that the discovery of disease may have for them, but also the need to establish banks for storing CB, with the accompanying administration and management problems. All these different aspects of UCBT will be discussed in the light of the four fundamental and traditional principles of bioethics, namely autonomy, nonmaleficence, beneficence and justice. PMID:9208108

  9. Buridan's Principle

    NASA Astrophysics Data System (ADS)

    Lamport, Leslie

    2012-08-01

    Buridan's principle asserts that a discrete decision based upon input having a continuous range of values cannot be made within a bounded length of time. It appears to be a fundamental law of nature. Engineers aware of it can design devices so they have an infinitesimal probability of not making a decision quickly enough. Ignorance of the principle could have serious consequences.
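The principle can be illustrated with a bistable "decider" whose settling time diverges as the input approaches the decision boundary. The dynamics, threshold, and step size below are an illustrative toy model, not taken from the paper:

```python
def decision_time(x0, dt=1e-3, threshold=0.5, t_max=100.0):
    # Euler simulation of a bistable element dx/dt = x - x^3, which commits
    # to +1 or -1 once |x| crosses the threshold.  The closer the input x0
    # is to the boundary at 0, the longer the commitment takes; t_max guards
    # against the (unbounded) worst case, mirroring Buridan's principle.
    x, t = x0, 0.0
    while abs(x) < threshold and t < t_max:
        x += dt * (x - x ** 3)
        t += dt
    return t

print(decision_time(0.1), decision_time(1e-6))  # nearer the boundary is slower
```

This is the same metastability phenomenon engineers confront in arbiters and synchronizers: the probability of an undecided state can be made infinitesimal, but never zero.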

  10. Fundamentals of Diesel Engines.

    ERIC Educational Resources Information Center

    Marine Corps Inst., Washington, DC.

    This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the fundamentals of diesel engine mechanics. Addressed in the three individual units of the course are the following topics: basic principles of diesel mechanics; principles, mechanics, and…

  11. Use of minimal invasive extracorporeal circulation in cardiac surgery: principles, definitions and potential benefits. A position paper from the Minimal invasive Extra-Corporeal Technologies international Society (MiECTiS).

    PubMed

    Anastasiadis, Kyriakos; Murkin, John; Antonitsis, Polychronis; Bauer, Adrian; Ranucci, Marco; Gygax, Erich; Schaarschmidt, Jan; Fromes, Yves; Philipp, Alois; Eberle, Balthasar; Punjabi, Prakash; Argiriadou, Helena; Kadner, Alexander; Jenni, Hansjoerg; Albrecht, Guenter; van Boven, Wim; Liebold, Andreas; de Somer, Fillip; Hausmann, Harald; Deliopoulos, Apostolos; El-Essawi, Aschraf; Mazzei, Valerio; Biancari, Fausto; Fernandez, Adam; Weerwind, Patrick; Puehler, Thomas; Serrick, Cyril; Waanders, Frans; Gunaydin, Serdar; Ohri, Sunil; Gummert, Jan; Angelini, Gianni; Falk, Volkmar; Carrel, Thierry

    2016-05-01

    Minimal invasive extracorporeal circulation (MiECC) systems have initiated important efforts within science and technology to further improve the biocompatibility of cardiopulmonary bypass components, to minimize adverse effects and improve end-organ protection. The Minimal invasive Extra-Corporeal Technologies international Society was founded to create an international forum for the exchange of ideas on the clinical application and research of minimal invasive extracorporeal circulation technology. The present work is a consensus document developed to standardize the terminology and the definition of minimal invasive extracorporeal circulation technology, as well as to provide recommendations for clinical practice. The goal of this manuscript is to promote the adoption of MiECC systems in clinical practice as a multidisciplinary strategy involving cardiac surgeons, anaesthesiologists and perfusionists. PMID:26819269

  12. Quantum computing with photons: introduction to the circuit model, the one-way quantum computer, and the fundamental principles of photonic experiments

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie

    2015-04-01

    Quantum physics has revolutionized our understanding of information processing and enables computational speed-ups that are unattainable using classical computers. This tutorial reviews the fundamental tools of photonic quantum information processing. The basics of theoretical quantum computing are presented and the quantum circuit model as well as measurement-based models of quantum computing are introduced. Furthermore, it is shown how these concepts can be implemented experimentally using photonic qubits, where information is encoded in the photons’ polarization.
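
    As a minimal illustration of the polarization encoding described in the tutorial (illustrative Jones-calculus sketch, not taken from the paper), the following applies the Jones matrix of a half-wave plate with its fast axis at 22.5° to a horizontally polarized qubit, which realizes a Hadamard gate:

```python
import math

# Polarization qubit: |H> = [1, 0], |V> = [0, 1] (real amplitudes for simplicity)
H_state = [1.0, 0.0]

def half_wave_plate(theta):
    """Jones matrix (up to a global phase) of a half-wave plate with fast axis
    at angle theta; at theta = pi/8 (22.5 deg) it acts as a Hadamard gate."""
    c, s = math.cos(2 * theta), math.sin(2 * theta)
    return [[c, s], [s, -c]]

def apply(gate, state):
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

out = apply(half_wave_plate(math.pi / 8), H_state)
probs = [a * a for a in out]  # measurement probabilities in the H/V basis
print(f"P(H) = {probs[0]:.3f}, P(V) = {probs[1]:.3f}")
```

    The Hadamard maps |H> to an equal superposition, so detection in the H/V basis yields each outcome with probability 1/2.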

  13. Fundamentals of fluid lubrication

    NASA Technical Reports Server (NTRS)

    Hamrock, Bernard J.

    1991-01-01

    The aim is to coordinate the topics of design, engineering dynamics, and fluid dynamics in order to aid researchers in the area of fluid film lubrication. The lubrication principles that are covered can serve as a basis for the engineering design of machine elements. The fundamentals of fluid film lubrication are presented clearly so that students who use the book will have confidence in their ability to apply these principles to a wide range of lubrication situations. Some guidance on applying these fundamentals to the solution of engineering problems is also provided.

  14. Homeschooling and Religious Fundamentalism

    ERIC Educational Resources Information Center

    Kunzman, Robert

    2010-01-01

    This article considers the relationship between homeschooling and religious fundamentalism by focusing on their intersection in the philosophies and practices of conservative Christian homeschoolers in the United States. Homeschooling provides an ideal educational setting to support several core fundamentalist principles: resistance to…

  15. Fundamental electrode kinetics

    NASA Technical Reports Server (NTRS)

    Elder, J. P.

    1968-01-01

    This report presents the fundamentals of electrode kinetics and the methods used in evaluating the characteristic parameters of rapid charge-transfer processes at electrode-electrolyte interfaces. The concept of electrode kinetics is outlined, followed by the principles underlying the experimental techniques for the investigation of electrode kinetics.

  16. Fundamental Reaction Pathway for Peptide Metabolism by Proteasome: Insights from First-principles Quantum Mechanical/Molecular Mechanical Free Energy Calculations

    PubMed Central

    Wei, Donghui; Fang, Lei; Tang, Mingsheng; Zhan, Chang-Guo

    2013-01-01

    The proteasome is the major component of the crucial nonlysosomal protein degradation pathway in cells, but the detailed reaction pathway is unclear. In this study, first-principles quantum mechanical/molecular mechanical free energy calculations have been performed to explore, for the first time, possible reaction pathways for proteasomal proteolysis/hydrolysis of a representative peptide, succinyl-leucyl-leucyl-valyl-tyrosyl-7-amino-4-methylcoumarin (Suc-LLVY-AMC). The computational results reveal that the most favorable reaction pathway consists of six steps. The first is a water-assisted proton transfer within proteasome, activating Thr1-Oγ. The second is a nucleophilic attack on the carbonyl carbon of a Tyr residue of substrate by the negatively charged Thr1-Oγ, followed by the dissociation of the amine AMC (third step). The fourth step is a nucleophilic attack on the carbonyl carbon of the Tyr residue of substrate by a water molecule, accompanied by a proton transfer from the water molecule to Thr1-Nz. Then, Suc-LLVY is dissociated (fifth step), and Thr1 is regenerated via a direct proton transfer from Thr1-Nz to Thr1-Oγ (sixth step). According to the calculated energetic results, the overall reaction energy barrier of the proteasomal hydrolysis is associated with the transition state (TS3b) for the third step involving a water-assisted proton transfer. The determined most favorable reaction pathway and the rate-determining step have provided a reasonable interpretation of the reported experimental observations concerning the substituent and isotopic effects on the kinetics. The calculated overall free energy barrier of 18.2 kcal/mol is close to the experimentally-derived activation free energy of ~18.3–19.4 kcal/mol, suggesting that the computational results are reasonable. PMID:24111489
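
    To connect a free energy barrier of this magnitude to an observable rate, a back-of-the-envelope conventional transition-state-theory estimate can be sketched (an illustration, not a calculation from the paper):

```python
import math

def eyring_rate(dG_kcal_per_mol, T=298.15):
    """Eyring transition-state-theory rate: k = (k_B * T / h) * exp(-dG / (R * T))."""
    k_B = 1.380649e-23   # J/K
    h = 6.62607015e-34   # J*s
    R = 1.987204e-3      # gas constant in kcal/(mol*K)
    return (k_B * T / h) * math.exp(-dG_kcal_per_mol / (R * T))

k = eyring_rate(18.2)
print(f"barrier of 18.2 kcal/mol -> k ~ {k:.2f} s^-1 at 298 K")
```

    A barrier near 18 kcal/mol corresponds to turnover on the order of once per few seconds, consistent with a slow enzymatic hydrolysis step.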

  17. Minimal cosmography

    NASA Astrophysics Data System (ADS)

    Piazza, Federico; Schücker, Thomas

    2016-04-01

    The minimal requirement for cosmography—a non-dynamical description of the universe—is a prescription for calculating null geodesics, and time-like geodesics as a function of their proper time. In this paper, we consider the most general linear connection compatible with homogeneity and isotropy, but not necessarily with a metric. A light-cone structure is assigned by choosing a set of geodesics representing light rays. This defines a "scale factor" and a local notion of distance, as that travelled by light in a given proper time interval. We find that the velocities and relativistic energies of free-falling bodies decrease in time as a consequence of cosmic expansion, but at a rate that can be different than that dictated by the usual metric framework. By extrapolating this behavior to photons' redshift, we find that the latter is in principle independent of the "scale factor". Interestingly, redshift-distance relations and other standard geometric observables are modified in this extended framework, in a way that could be experimentally tested. An extremely tight constraint on the model, however, is represented by the blackbody-ness of the cosmic microwave background. Finally, as a check, we also consider the effects of a non-metric connection in a different set-up, namely, that of a static, spherically symmetric spacetime.

  18. Monte Carlo fundamentals

    SciTech Connect

    Brown, F.B.; Sutton, T.M.

    1996-02-01

    This report is composed of the lecture notes from the first half of a 32-hour graduate-level course on Monte Carlo methods offered at KAPL. These notes, prepared by two of the principal developers of KAPL's RACER Monte Carlo code, cover the fundamental theory, concepts, and practices for Monte Carlo analysis. In particular, a thorough grounding in the basic fundamentals of Monte Carlo methods is presented, including random number generation, random sampling, the Monte Carlo approach to solving transport problems, computational geometry, collision physics, tallies, and eigenvalue calculations. Furthermore, modern computational algorithms for vector and parallel approaches to Monte Carlo calculations are covered in detail, including fundamental parallel and vector concepts, the event-based algorithm, master/slave schemes, parallel scaling laws, and portability issues.
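
    One of the listed topics, random sampling for transport problems, can be sketched in a few lines: the distance to the next collision is drawn from an exponential distribution by inverting its CDF. The cross-section value here is hypothetical.

```python
import math
import random

def sample_free_path(sigma_t, rng=random):
    """Inverse-CDF sampling of the distance to the next collision in a medium
    with total macroscopic cross section sigma_t (1/cm):
    p(s) = sigma_t * exp(-sigma_t * s)  =>  s = -ln(xi) / sigma_t."""
    xi = 1.0 - rng.random()  # uniform in (0, 1], avoids log(0)
    return -math.log(xi) / sigma_t

random.seed(1)
sigma_t = 2.0  # 1/cm; hypothetical cross section
n = 100_000
mean_path = sum(sample_free_path(sigma_t) for _ in range(n)) / n
print(f"estimated mean free path: {mean_path:.4f} cm (analytic: {1.0 / sigma_t:.4f})")
```

    The sample mean converges to the analytic mean free path 1/sigma_t at the usual 1/sqrt(n) Monte Carlo rate.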

  19. A Yoga Strengthening Program Designed to Minimize the Knee Adduction Moment for Women with Knee Osteoarthritis: A Proof-Of-Principle Cohort Study

    PubMed Central

    2015-01-01

    People with knee osteoarthritis may benefit from exercise prescriptions that minimize knee loads in the frontal plane. The primary objective of this study was to determine whether a novel 12-week strengthening program designed to minimize exposure to the knee adduction moment (KAM) could improve symptoms and knee strength in women with symptomatic knee osteoarthritis. A secondary objective was to determine whether the program could improve mobility and fitness, and decrease peak KAM during gait. The tertiary objective was to evaluate the biomechanical characteristics of this yoga program. In particular, we compared the peak KAM during gait with that during yoga postures at baseline. We also compared lower limb normalized mean electromyography (EMG) amplitudes during yoga postures between baseline and follow-up. Primary measures included self-reported pain and physical function (Knee injury and Osteoarthritis Outcome Score) and knee strength (extensor and flexor torques). Secondary measures included mobility (six-minute walk, 30-second chair stand, stair climbing), fitness (submaximal cycle ergometer test), and clinical gait analysis using motion capture synchronized with electromyography and force measurement. Also, KAM and normalized mean EMG amplitudes were collected during yoga postures. Forty-five women over age 50 with symptomatic knee osteoarthritis, consistent with the American College of Rheumatology criteria, enrolled in our 12-week (3 sessions per week) program. Data from 38 were analyzed (six drop-outs; one lost to co-intervention). Participants experienced reduced pain (mean improvement 10.1–20.1 normalized to 100; p<0.001), increased knee extensor strength (mean improvement 0.01 Nm/kg; p = 0.004), and increased flexor strength (mean improvement 0.01 Nm/kg; p = 0.001) at follow-up compared to baseline. 
Participants improved mobility on the six-minute walk (mean improvement 37.7 m; p<0.001) and 30-second chair stand (mean improvement 1.3; p = 0.006) at

  20. Marketing fundamentals.

    PubMed

    Redmond, W H

    2001-01-01

    This chapter outlines current marketing practice from a managerial perspective. The role of marketing within an organization is discussed in relation to efficiency and adaptation to changing environments. Fundamental terms and concepts are presented in an applied context. The implementation of marketing plans is organized around the four P's of marketing: product (or service), promotion (including advertising), place of delivery, and pricing. These are the tools with which marketers seek to better serve their clients and form the basis for competing with other organizations. Basic concepts of strategic relationship management are outlined. Lastly, alternate viewpoints on the role of advertising in healthcare markets are examined. PMID:11401791

  1. Fundamentals of fossil simulator instructor training

    SciTech Connect

    Not Available

    1984-01-01

    This single-volume, looseleaf text introduces the beginning instructor to fundamental instructor training principles, and then shows how to apply those principles to fossil simulator training. Topics include the fundamentals of classroom instruction, the learning process, course development, and the specifics of simulator training program development.

  2. Ethics fundamentals.

    PubMed

    Chambers, David W

    2011-01-01

    Ethics is about studying the right and the good; morality is about acting as one should. Although there are differences among what is legal, charitable, professional, ethical, and moral, these desirable characteristics tend to cluster and are treasured in dentistry. The traditional approach to professionalism in dentistry is based on a theory of biomedical ethics advanced 30 years ago. Known as the principles approach, general ideals such as respect for autonomy, nonmaleficence, beneficence, justice, and veracity, are offered as guides. Growth in professionalism consists in learning to interpret the application of these principles as one's peers do. Moral behavior is conceived as a continuous cycle of sensitivity to situations requiring moral response, moral reasoning, the moral courage to take action when necessary, and integration of habits of moral behavior into one's character. This essay is the first of two papers that provide the backbone for the IDEA Project of the College--an online, multiformat, interactive "textbook" of ethics for the profession. PMID:22263371

  3. VCSEL Fundamentals

    NASA Astrophysics Data System (ADS)

    Michalzik, Rainer

    In this chapter we outline major principles of vertical-cavity surface-emitting laser (VCSEL) design and operation. Basic device properties and generally applicable cavity design rules are introduced. Characteristic parameters like threshold gain and current, differential quantum efficiency and power conversion efficiency, as well as thermal resistance are discussed. We describe the design of Bragg reflectors and explain the transfer matrix method as a convenient tool to compute VCSEL resonator properties in a one-dimensional approximation. Experimental results illustrate the emission characteristics of high-efficiency VCSELs that apply selective oxidation for current and photon confinement. Both the 850 and 980 nm wavelength regions are considered. The basic treatment of laser dynamics and noise behavior is presented in terms of the small-signal modulation response as well as the relative intensity noise. Finally we give some examples of VCSEL applications in fiber-based optical interconnects, i.e., optical data transmission over short distances.
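
    The transfer matrix method mentioned above can be sketched for a quarter-wave Bragg reflector at normal incidence; the layer indices and pair count below are illustrative, not taken from the chapter.

```python
import cmath

def mat_mul(A, B):
    """2x2 complex matrix product."""
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def dbr_reflectance(n0, ns, n_hi, n_lo, pairs, lam0, lam):
    """Normal-incidence reflectance of a quarter-wave Bragg stack designed for
    wavelength lam0, evaluated at lam, via the characteristic (transfer) matrix."""
    M = [[1.0, 0.0], [0.0, 1.0]]
    layers = [(n_hi, lam0 / (4 * n_hi)), (n_lo, lam0 / (4 * n_lo))] * pairs
    for n, d in layers:  # layers in the order the light traverses them
        delta = 2 * cmath.pi * n * d / lam
        L = [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
             [1j * n * cmath.sin(delta), cmath.cos(delta)]]
        M = mat_mul(M, L)
    B = M[0][0] + M[0][1] * ns   # [B; C] = M @ [1; ns]
    C = M[1][0] + M[1][1] * ns
    r = (n0 * B - C) / (n0 * B + C)
    return abs(r) ** 2

# Illustrative GaAs/AlAs-like indices: 20 quarter-wave pairs on a high-index substrate
R = dbr_reflectance(n0=1.0, ns=3.5, n_hi=3.5, n_lo=3.0, pairs=20, lam0=980e-9, lam=980e-9)
print(f"peak reflectance of the 20-pair mirror: {R:.4f}")
```

    Even a modest index contrast yields reflectance above 99% at the design wavelength once enough pairs are stacked, which is why VCSEL mirrors need tens of Bragg pairs.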

  4. How fundamental are fundamental constants?

    NASA Astrophysics Data System (ADS)

    Duff, M. J.

    2015-01-01

    I argue that the laws of physics should be independent of one's choice of units or measuring apparatus. This is the case if they are framed in terms of dimensionless numbers such as the fine structure constant, α. For example, the standard model of particle physics has 19 such dimensionless parameters whose values all observers can agree on, irrespective of what clocks, rulers or scales they use to measure them. Dimensional constants, on the other hand, such as ħ, c, G, e and k_B, are merely human constructs whose number and values differ from one choice of units to the next. In this sense, only dimensionless constants are 'fundamental'. Similarly, the possible time variation of dimensionless fundamental 'constants' of nature is operationally well defined and a legitimate subject of physical enquiry. By contrast, the time variation of dimensional constants such as c or G, on which a good many (in my opinion, confusing) papers have been written, is a unit-dependent phenomenon on which different observers might disagree depending on their apparatus. All these confusions disappear if one asks only unit-independent questions. We provide a selection of opposing opinions in the literature and respond accordingly.

  5. Minimal metabolic pathway structure is consistent with associated biomolecular interactions

    PubMed Central

    Bordbar, Aarash; Nagarajan, Harish; Lewis, Nathan E; Latif, Haythem; Ebrahim, Ali; Federowicz, Stephen; Schellenberger, Jan; Palsson, Bernhard O

    2014-01-01

    Pathways are a universal paradigm for functionally describing cellular processes. Even though advances in high-throughput data generation have transformed biology, the core of our biological understanding, and hence data interpretation, is still predicated on human-defined pathways. Here, we introduce an unbiased, pathway structure for genome-scale metabolic networks defined based on principles of parsimony that do not mimic canonical human-defined textbook pathways. Instead, these minimal pathways better describe multiple independent pathway-associated biomolecular interaction datasets suggesting a functional organization for metabolism based on parsimonious use of cellular components. We use the inherent predictive capability of these pathways to experimentally discover novel transcriptional regulatory interactions in Escherichia coli metabolism for three transcription factors, effectively doubling the known regulatory roles for Nac and MntR. This study suggests an underlying and fundamental principle in the evolutionary selection of pathway structures; namely, that pathways may be minimal, independent, and segregated. PMID:24987116

  7. Does osteoderm growth follow energy minimization principles?

    PubMed

    Sensale, Sebastián; Jones, Washington; Blanco, R Ernesto

    2014-08-01

    Although the growth and development of tissues and organs of extinct species cannot be directly observed, their fossils can record and preserve evidence of these mechanisms. It is generally accepted that bone architecture is the result of genetically based biomechanical constraints, but what about osteoderms? In this article, the influence of physical constraints on cranial osteoderm growth is assessed. Comparisons among lepidosaurs, synapsids, and archosaurs are performed; according to these analyses, lepidosaur osteoderm growth is predicted to be less energy demanding than that of synapsids and archosaurs. The results also show that, from an energetic viewpoint, ankylosaurid osteoderm growth more closely resembles that of mammals than that of reptiles, adding evidence to the debate over whether dinosaurs were warm- or cold-blooded. PMID:24634089

  8. Commentary: Minimizing Evaluation Misuse as Principled Practice

    ERIC Educational Resources Information Center

    Cousins, J. Bradley

    2004-01-01

    "Ethical Challenges," in my experience, is invariably interesting, often instructive and sometimes amusing. Some of the most engaging stimulus scenarios raise thorny evaluation practice issues that ultimately lead to disparate points of view about the nature of the issue and how to handle it (Datta, 2002; Smith, 2002). Despite my poor performance…

  9. Minimal length uncertainty and accelerating universe

    NASA Astrophysics Data System (ADS)

    Farmany, A.; Mortazavi, S. S.

    2016-06-01

    In this paper, the minimal length uncertainty is used as a constraint to solve the Friedmann equation. It is shown that, based on the minimal length uncertainty principle, the Hubble scale is decreasing, which corresponds to an accelerating universe.

  10. Fundamentals of Environmental Education. Report.

    ERIC Educational Resources Information Center

    1976

    An outline of fundamental definitions, relationships, and human responsibilities related to environment provides a basis from which a variety of materials, programs, and activities can be developed. The outline can be used in elementary, secondary, higher education, or adult education programs. The framework is based on principles of the science…

  11. Fundamentals of Structural Geology

    NASA Astrophysics Data System (ADS)

    Pollard, David D.; Fletcher, Raymond C.

    2005-09-01

    Fundamentals of Structural Geology provides a new framework for the investigation of geological structures by integrating field mapping and mechanical analysis. Assuming a basic knowledge of physical geology, introductory calculus and physics, it emphasizes the observational data, modern mapping technology, principles of continuum mechanics, and the mathematical and computational skills necessary to quantitatively map, describe, model, and explain deformation in Earth's lithosphere. By starting from the fundamental conservation laws of mass and momentum, the constitutive laws of material behavior, and the kinematic relationships for strain and rate of deformation, the authors demonstrate the relevance of solid and fluid mechanics to structural geology. This book offers a modern quantitative approach to structural geology for advanced students and researchers in structural geology and tectonics. The book integrates field mapping using modern technology with the analysis of structures based on a complete mechanics. MATLAB is used to visualize physical fields and analytical results, and MATLAB scripts can be downloaded from the supporting website to recreate textbook graphics and enable students to explore their choice of parameters and boundary conditions. The website also hosts color images of outcrop photographs used in the text, supplementary color images, and images of textbook figures for classroom presentations, as well as student exercises designed to instill the fundamental relationships and to encourage visualization of the evolution of geological structures; solutions to the exercises are available to instructors.

  12. Evolutionary principles and their practical application

    PubMed Central

    Hendry, Andrew P; Kinnison, Michael T; Heino, Mikko; Day, Troy; Smith, Thomas B; Fitt, Gary; Bergstrom, Carl T; Oakeshott, John; Jørgensen, Peter S; Zalucki, Myron P; Gilchrist, George; Southerton, Simon; Sih, Andrew; Strauss, Sharon; Denison, Robert F; Carroll, Scott P

    2011-01-01

    Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently in these different fields, even though the underlying fundamental concepts are the same. We explore these fundamental concepts under four main themes: variation, selection, connectivity, and eco-evolutionary dynamics. Within each theme, we present several key evolutionary principles and illustrate their use in addressing applied problems. We hope that the resulting primer of evolutionary concepts and their practical utility helps to advance a unified multidisciplinary field of applied evolutionary biology. PMID:25567966

  13. Dynamic sealing principles

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamental principles governing dynamic sealing operation are discussed. Different seals are described in terms of these principles. Despite the large variety of detailed construction, there appear to be some basic principles, or combinations of basic principles, by which all seals function; these are presented and discussed. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin film seals, which enable leakage calculations to be made, are also presented. The concept of flow functions, the application of the Reynolds lubrication equation, flow beyond the scope of the lubrication equation, friction and wear, and seal lubrication regimes are explained.
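
    A representative thin-film leakage calculation of the kind enabled by these equations is the pressure-driven flow through a uniform parallel film, a standard result of lubrication theory; the numbers below are hypothetical.

```python
def film_leakage(h, dp, width, length, mu):
    """Laminar leakage rate (m^3/s) through a uniform thin film of thickness h (m),
    from the Reynolds lubrication equation for pure pressure-driven flow:
    Q = h^3 * width * dp / (12 * mu * length)."""
    return h**3 * width * dp / (12.0 * mu * length)

# Hypothetical face-seal numbers: 5 um film, 0.5 MPa drop across a 5 mm land,
# 0.3 m sealing circumference, oil viscosity 0.03 Pa*s
Q = film_leakage(h=5e-6, dp=0.5e6, width=0.3, length=5e-3, mu=0.03)
print(f"leakage: {Q * 1e9:.1f} mm^3/s")
```

    The cubic dependence on film thickness is the key design lever: halving the film gap cuts leakage by a factor of eight.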

  14. The minimal work cost of information processing

    PubMed Central

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-01-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics. PMID:26151678
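
    The familiar per-bit form of Landauer's bound, W ≥ k_B T ln 2, which this work refines to a conditional-entropy expression, can be evaluated directly:

```python
import math

def landauer_bound(bits_erased, temperature_K):
    """Minimal work (J) required to erase information: W >= k_B * T * ln(2) per bit."""
    k_B = 1.380649e-23  # J/K (exact SI value)
    return bits_erased * k_B * temperature_K * math.log(2)

w = landauer_bound(1, 300.0)
print(f"erasing one bit at 300 K dissipates at least {w:.3e} J")
```

    At room temperature the bound is about 3 zeptojoules per bit, many orders of magnitude below the dissipation of present-day logic gates, which is why the principle is becoming practically relevant only as devices shrink.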

  16. Fundamentals in Nuclear Physics

    NASA Astrophysics Data System (ADS)

    Basdevant, Jean-Louis; Rich, James; Spiro, Michel

    This course on nuclear physics leads the reader to the exploration of the field from nuclei to astrophysical issues. Much nuclear phenomenology can be understood from simple arguments such as those based on the Pauli principle and the Coulomb barrier. This book is concerned with extrapolating from such arguments and illustrating nuclear systematics with experimental data. Starting with the basic concepts in nuclear physics, nuclear models, and reactions, the book covers nuclear decays and the fundamental electro-weak interactions, radioactivity, and nuclear energy. After the discussions of fission and fusion leading into nuclear astrophysics, there is a presentation of the latest ideas about cosmology. As a primer this course will lay the foundations for more specialized subjects. This book emerged from a series of topical courses the authors delivered at the Ecole Polytechnique and will be useful for graduate students and for scientists in a variety of fields.

  17. Minimal Reduplication

    ERIC Educational Resources Information Center

    Kirchner, Jesse Saba

    2010-01-01

    This dissertation introduces Minimal Reduplication, a new theory and framework within generative grammar for analyzing reduplication in human language. I argue that reduplication is an emergent property in multiple components of the grammar. In particular, reduplication occurs independently in the phonology and syntax components, and in both cases…

  18. Classical tests of general relativity: Brane-world Sun from minimal geometric deformation

    NASA Astrophysics Data System (ADS)

    Casadio, R.; Ovalle, J.; da Rocha, Roldão

    2015-05-01

    We consider a solution of the effective four-dimensional brane-world equations, obtained from the general relativistic Schwarzschild metric via the principle of minimal geometric deformation, and investigate the corresponding signatures stemming from the possible existence of a warped extra-dimension. In particular, we derive bounds on an extra-dimensional parameter, closely related with the fundamental gravitational length, from the experimental results of the classical tests of general relativity in the Solar system.

  19. Criticism of generally accepted fundamentals and methodologies of traffic and transportation theory: A brief review

    NASA Astrophysics Data System (ADS)

    Kerner, Boris S.

    2013-11-01

    It is explained why the set of the fundamental empirical features of traffic breakdown (a transition from free flow to congested traffic) should be the empirical basis for any traffic and transportation theory that can be reliably used for control and optimization in traffic networks. It is shown that the generally accepted fundamentals and methodologies of the traffic and transportation theory are not consistent with the set of the fundamental empirical features of traffic breakdown at a highway bottleneck. To these fundamentals and methodologies of the traffic and transportation theory belong (i) Lighthill-Whitham-Richards (LWR) theory, (ii) the General Motors (GM) model class (for example, Herman, Gazis et al. GM model, Gipps’s model, Payne’s model, Newell’s optimal velocity (OV) model, Wiedemann’s model, Bando et al. OV model, Treiber’s IDM, Krauß’s model), (iii) the understanding of highway capacity as a particular (fixed or stochastic) value, and (iv) principles for traffic and transportation network optimization and control (for example, Wardrop’s user equilibrium (UE) and system optimum (SO) principles). Alternatively to these generally accepted fundamentals and methodologies of the traffic and transportation theory, we discuss the three-phase traffic theory as the basis for traffic flow modeling as well as briefly consider the network breakdown minimization (BM) principle for the optimization of traffic and transportation networks with road bottlenecks.
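
    For concreteness, the LWR theory criticized here rests on a fixed flow-density relation (a fundamental diagram); a classic Greenshields example, with hypothetical parameter values, is sketched below.

```python
def greenshields_flux(rho, v_free=30.0, rho_jam=0.12):
    """Greenshields fundamental diagram underlying classic LWR theory:
    flow q (veh/s) as a function of density rho (veh/m),
    q = v_free * rho * (1 - rho / rho_jam)."""
    return v_free * rho * (1.0 - rho / rho_jam)

# In this model, capacity (maximum flow) sits at exactly half the jam density
rho_c = 0.12 / 2
q_max = greenshields_flux(rho_c)
print(f"critical density {rho_c:.2f} veh/m, capacity {q_max * 3600:.0f} veh/h")
```

    Treating capacity as the single fixed value q_max is precisely the assumption (iii) that the abstract argues is inconsistent with the empirical features of traffic breakdown.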

  20. Criticism of generally accepted fundamentals and methodologies of traffic and transportation theory

    SciTech Connect

    Kerner, Boris S.

    2015-03-10

    It is explained why the set of the fundamental empirical features of traffic breakdown (a transition from free flow to congested traffic) should be the empirical basis for any traffic and transportation theory that can be reliably used for control and optimization in traffic networks. It is shown that generally accepted fundamentals and methodologies of traffic and transportation theory are not consistent with the set of the fundamental empirical features of traffic breakdown at a highway bottleneck. To these fundamentals and methodologies of traffic and transportation theory belong (i) Lighthill-Whitham-Richards (LWR) theory, (ii) the General Motors (GM) model class (for example, Herman, Gazis et al. GM model, Gipps’s model, Payne’s model, Newell’s optimal velocity (OV) model, Wiedemann’s model, Bando et al. OV model, Treiber’s IDM, Krauß’s model), (iii) the understanding of highway capacity as a particular stochastic value, and (iv) principles for traffic and transportation network optimization and control (for example, Wardrop’s user equilibrium (UE) and system optimum (SO) principles). Alternatively to these generally accepted fundamentals and methodologies of traffic and transportation theory, we discuss three-phase traffic theory as the basis for traffic flow modeling as well as briefly consider the network breakdown minimization (BM) principle for the optimization of traffic and transportation networks with road bottlenecks.

  1. Testing Our Fundamental Assumptions

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons, regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests! [Artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones] Explaining different arrival times: suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics. (1) Intrinsic delay: the photons may simply have been emitted at two different times by the astrophysical source. (2) Delay due to Lorentz invariance violation: perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect. (3) Special-relativistic delay: maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong. This, too, would cause photon velocities to be energy-dependent. (4) Delay due to gravitational potential: perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies. This would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect. If we now turn this problem around, then by measuring the arrival-time delay between photons of different energies from various astrophysical sources (the further away, the better), we can provide constraints on these
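
    For the Lorentz-invariance-violation case, the leading-order (linear) delay is often written Δt ≈ (ΔE/E_QG) · D/c. A rough order-of-magnitude sketch, assuming a Planck-scale quantum-gravity energy and an illustrative source distance (the bare D/c factor ignores cosmological expansion, which real analyses must include):

```python
C = 2.998e8          # speed of light, m/s
GPC_IN_M = 3.086e25  # one gigaparsec in metres

def liv_delay_s(e_low_gev, e_high_gev, distance_m, e_qg_gev=1.22e19):
    # Linear (n = 1) LIV dispersion v(E) ~ c * (1 - E / E_QG):
    # the higher-energy photon lags by (D / c) * (dE / E_QG) seconds.
    return (distance_m / C) * (e_high_gev - e_low_gev) / e_qg_gev

# A 10 GeV photon vs a 10 MeV photon from a source ~1 Gpc away:
# the expected lag comes out below a second, which is why bright,
# rapidly varying transients are needed to constrain E_QG.
dt = liv_delay_s(0.01, 10.0, GPC_IN_M)
```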

  2. Taxonomic minimalism.

    PubMed

    Beattie, A J; Oliver, I

    1994-12-01

    Biological surveys are in increasing demand while taxonomic resources continue to decline. How much formal taxonomy is required to get the job done? The answer depends on the kind of job, but it is possible that taxonomic minimalism, especially (1) the use of higher taxonomic ranks, (2) the use of morphospecies rather than species (as identified by Latin binomials), and (3) the involvement of taxonomic specialists only for training and verification, may offer advantages for biodiversity assessment, environmental monitoring and ecological research. As such, formal taxonomy remains central to the process of biological inventory and survey, but resources may be allocated more efficiently. For example, if formal identification is not required, resources may be concentrated on replication and increasing sample sizes. Taxonomic minimalism may also facilitate the inclusion in these activities of important but neglected groups, especially among the invertebrates, and perhaps even microorganisms. PMID:21236933

  3. Generating minimal living systems from non-living materials and increasing their evolutionary abilities.

    PubMed

    Rasmussen, Steen; Constantinescu, Adi; Svaneborg, Carsten

    2016-08-19

    We review lessons learned about evolutionary transitions from a bottom-up construction of minimal life. We use a particular systemic protocell design process as a starting point for exploring two fundamental questions: (i) how may minimal living systems emerge from non-living materials? and (ii) how may minimal living systems support increasingly more evolutionary richness? Under (i), we present what has been accomplished so far and discuss the remaining open challenges and their possible solutions. Under (ii), we present a design principle we have used successfully both for our computational and experimental protocellular investigations, and we conjecture how this design principle can be extended for enhancing the evolutionary potential for a wide range of systems. This article is part of the themed issue 'The major synthetic evolutionary transitions'. PMID:27431518

  4. Fundamentals of Pharmacogenetics in Personalized, Precision Medicine.

    PubMed

    Valdes, Roland; Yin, DeLu Tyler

    2016-09-01

    This article introduces fundamental principles of pharmacogenetics as applied to personalized and precision medicine. Pharmacogenetics establishes relationships between pharmacology and genetics by connecting phenotypes and genotypes in predicting the response of therapeutics in individual patients. We describe differences between precision and personalized medicine and relate principles of pharmacokinetics and pharmacodynamics to applications in laboratory medicine. We also review basic principles of pharmacogenetics, including its evolution, how it enables the practice of personalized therapeutics, and the role of the clinical laboratory. These fundamentals are a segue for understanding specific clinical applications of pharmacogenetics described in subsequent articles in this issue. PMID:27514461

  5. Fundamentals of the Control of Gas-Turbine Power Plants for Aircraft. Part 2; Principles of Control Common to Jet, Turbine-Propeller Jet, and Ducted-Fan Jet Power Plants

    NASA Technical Reports Server (NTRS)

    Kuehl, H.

    1947-01-01

    After defining the aims and requirements to be set for a control system of gas-turbine power plants for aircraft, the report will deal with devices that prevent the quantity of fuel supplied per unit of time from exceeding the value permissible at a given moment. The general principles of the actuation of the adjustable parts of the power plant are also discussed.

  7. Periodic minimal surfaces

    NASA Astrophysics Data System (ADS)

    Mackay, Alan L.

    1985-04-01

    A minimal surface is one for which, like a soap film with the same pressure on each side, the mean curvature is zero and, thus, is one where the two principal curvatures are equal and opposite at every point. For every closed circuit in the surface, the area is a minimum. Schwarz [1] and Neovius [2] showed that elements of such surfaces could be put together to give surfaces periodic in three dimensions. These periodic minimal surfaces are geometrical invariants, as are the regular polyhedra, but the former are curved. Minimal surfaces are appropriate for the description of various structures where internal surfaces are prominent and seek to adopt a minimum area or a zero mean curvature subject to their topology; thus they merit more complete numerical characterization. There seem to be at least 18 such surfaces [3], with various symmetries and topologies, related to the crystallographic space groups. Recently, glyceryl mono-oleate (GMO) was shown by Longley and McIntosh [4] to take the shape of the F-surface. The structure postulated is shown here to be in good agreement with an analysis of the fundamental geometry of periodic minimal surfaces.
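
    Triply periodic minimal surfaces are commonly approximated by nodal (level-set) surfaces; for instance, the zero set of cos x + cos y + cos z is a standard approximation to the Schwarz P surface (an approximation only, not the exact minimal surface). A small sketch checking the triple periodicity and a point on the nodal set:

```python
import math

def schwarz_p(x, y, z):
    # Nodal (level-set) approximation: the zero set of this function
    # approximates the triply periodic Schwarz P minimal surface.
    return math.cos(x) + math.cos(y) + math.cos(z)

# The function has period 2*pi in each coordinate direction...
p = schwarz_p(0.3, 1.1, 2.0)
q = schwarz_p(0.3 + 2.0 * math.pi, 1.1, 2.0)

# ...and (pi/2, pi/2, pi/2) lies on the nodal surface itself,
# since all three cosines vanish there.
on_surface = schwarz_p(math.pi / 2, math.pi / 2, math.pi / 2)
```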

  8. The ZOOM minimization package

    SciTech Connect

    Fischler, Mark S.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.
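
    ZOOM itself is a C++ package in the Minuit tradition; as a toy illustration of the core task such packages automate, here is a one-dimensional, derivative-free golden-section search in Python (Minuit/ZOOM additionally handle multidimensional objectives, gradients, and parameter-error estimates, none of which this sketch attempts):

```python
import math

def golden_section_minimize(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by golden-section search:
    shrink the bracket by the inverse golden ratio each iteration."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c                  # minimum lies in [a, d]
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d                  # minimum lies in [c, b]
            d = a + inv_phi * (b - a)
    return (a + b) / 2.0

# A simple quadratic with its minimum at x = 1.5:
x_min = golden_section_minimize(lambda x: (x - 1.5) ** 2 + 0.25, 0.0, 4.0)
```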

  9. Basic Comfort Heating Principles.

    ERIC Educational Resources Information Center

    Dempster, Chalmer T.

    The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background…

  10. Bernoulli's Principle

    ERIC Educational Resources Information Center

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  11. Fundamentals of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Tang, C. L.

    2005-06-01

    Quantum mechanics has evolved from a subject of study in pure physics to one with a wide range of applications in many diverse fields. The basic concepts of quantum mechanics are explained in this book in a concise and easy-to-read manner, emphasising applications in solid state electronics and modern optics. Following a logical sequence, the book is focused on the key ideas and is conceptually and mathematically self-contained. The fundamental principles of quantum mechanics are illustrated by showing their application to systems such as the hydrogen atom, multi-electron ions and atoms, the formation of simple organic molecules and crystalline solids of practical importance. It leads on from these basic concepts to discuss some of the most important applications in modern semiconductor electronics and optics. Containing many homework problems and worked examples, the book is suitable for senior-level undergraduate and graduate level students in electrical engineering, materials science and applied physics. Key features: a clear exposition of quantum mechanics written in a concise and accessible style; a precise physical interpretation of the mathematical foundations of quantum mechanics; illustration of the important concepts and results by reference to real-world examples in electronics and optoelectronics; homework problems and worked examples, with solutions available for instructors.

  12. GRBs and Fundamental Physics

    NASA Astrophysics Data System (ADS)

    Petitjean, Patrick; Wang, F. Y.; Wu, X. F.; Wei, J. J.

    2016-02-01

    Gamma-ray bursts (GRBs) are short, intense flashes at cosmological distances and the most luminous explosions in the Universe. The high luminosities of GRBs make them detectable out to the edge of the visible universe, so they are unique tools to probe the properties of the high-redshift universe, including the cosmic expansion and dark energy, the star formation rate, the reionization epoch and the metal evolution of the Universe. First, they can be used to constrain the history of cosmic acceleration and the evolution of dark energy in a redshift range hardly achievable by other cosmological probes. Second, long GRBs are believed to be formed by the collapse of massive stars, so they can be used to derive the high-redshift star formation rate, which cannot be probed by current observations. Moreover, the use of GRBs as cosmological tools could unveil the reionization history and metal evolution of the Universe, the intergalactic medium (IGM) properties and the nature of the first stars in the early universe. Beyond that, GRB high-energy photons can be applied to constrain Lorentz invariance violation (LIV) and to test Einstein's Equivalence Principle (EEP). In this paper, we review the progress on GRB cosmology and fundamental physics probed by GRBs.

  13. Fundamentals of Cryogenics

    NASA Technical Reports Server (NTRS)

    Johnson, Wesley; Tomsik, Thomas; Moder, Jeff

    2014-01-01

    Analysis of the extreme conditions encountered in cryogenic systems demands great effort from analysts and engineers. Due to the costs and complexity associated with the extremely cold temperatures involved, testing is sometimes minimized and extra analysis is often relied upon. This short course is designed as an introduction to cryogenic engineering and analysis; it is intended to introduce the basic concepts related to cryogenic analysis and testing and to help the analyst understand the impacts of various requests on a test facility. Discussion will revolve around operational functions often found in cryogenic systems, hardware for both tests and facilities, and what design or modeling tools are available for performing the analysis. Emphasis will be placed on which hardware and analysis tools to use in which scenarios to get the desired results. The class will provide a review of first principles, engineering practices, and those relations directly applicable to this subject, including such topics as cryogenic fluids, thermodynamics and heat transfer, material properties at low temperature, insulation, cryogenic equipment, instrumentation, refrigeration, testing of cryogenic systems, cryogenic safety, and typical thermal and fluid analysis used by the engineer. The class will provide references for further learning on various topics in cryogenics for those who want to dive deeper into the subject or have encountered specific problems.

  14. Towards a Minimal System for Cell Division

    NASA Astrophysics Data System (ADS)

    Schwille, Petra

    We have entered the "omics" era of the life sciences, meaning that our general knowledge about biological systems has become vast, complex, and almost impossible to fully comprehend. Consequently, the challenge for quantitative biology and biophysics is to identify appropriate procedures and protocols that allow the researcher to strip down the complexity of a biological system to a level that can be reliably modeled but still retains the essential features of its "real" counterpart. The virtue of physics has always been the reductionist approach, which allowed scientists to identify the underlying basic principles of seemingly complex phenomena and subject them to rigorous mathematical treatment. Biological systems are obviously among the most complex phenomena we can think of, and it is fair to state that our rapidly increasing knowledge does not make it easier to identify a small set of fundamental principles of the big concept of "life" that can be defined and quantitatively understood. Nevertheless, it is becoming evident that only by tight cooperation and interdisciplinary exchange between the life sciences and quantitative sciences, and by applying intelligent reductionist approaches also to biology, will we be able to meet the intellectual challenges of the twenty-first century. These include not only the collection and proper categorization of the data, but also their true understanding and harnessing such that we can solve important practical problems imposed by medicine or the worldwide need for new energy sources. Many of these approaches are reflected by the modern buzzword "synthetic biology"; therefore, I briefly discuss this term in the first section. Further, I outline some endeavors of our and other groups to model minimal biological systems, with particular focus on the possibility of generating a minimal system for cell division.

  15. The Seven Cardinal Principles Revisited

    ERIC Educational Resources Information Center

    Shane, Harold G.

    1976-01-01

    The seven cardinal principles of education as stated in 1918--health, command of fundamental processes, worthy home membership, vocation, citizenship, use of leisure, and ethical character--were reassessed by panelists, and the future development of each principle was examined in the light of a changing world. (JD)

  16. Toward systematic integration between self-determination theory and motivational interviewing as examples of top-down and bottom-up intervention development: autonomy or volition as a fundamental theoretical principle

    PubMed Central

    2012-01-01

    Clinical interventions can be developed through two distinct pathways. In the first, which we call top-down, a well-articulated theory drives the development of the intervention, whereas in the case of a bottom-up approach, clinical experience, more so than a dedicated theoretical perspective, drives the intervention. Using this dialectic, this paper discusses Self-Determination Theory (SDT) [1,2] and Motivational Interviewing (MI) [3] as prototypical examples of top-down and bottom-up approaches, respectively. We sketch the different starting points, foci and developmental processes of SDT and MI, but equally note the complementary character and the potential for systematic integration between both approaches. Nevertheless, for a deeper integration to take place, we contend that MI researchers might want to embrace autonomy as a fundamental basic process underlying therapeutic change, and we discuss the advantages of doing so. PMID:22385828

  17. Plasma Modeling Enabled Technology Development Empowered by Fundamental Scattering Data

    NASA Astrophysics Data System (ADS)

    Kushner, Mark J.

    2016-05-01

    Technology development increasingly relies on modeling to speed the innovation cycle. This is particularly true for systems using low temperature plasmas (LTPs) and their role in enabling energy efficient processes with minimal environmental impact. In the innovation cycle, LTP modeling supports investigation of fundamental processes that seed the cycle, optimization of newly developed technologies, and prediction of performance of unbuilt systems for new applications. Although proof-of-principle modeling may be performed for idealized systems in simple gases, technology development must address physically complex systems that use complex gas mixtures that now may be multi-phase (e.g., in contact with liquids). The variety of fundamental electron and ion scattering, and radiation transport data (FSRD) required for this modeling increases as the innovation cycle progresses, while the accuracy required of that data depends on the intended outcome. In all cases, the fidelity, depth and impact of the modeling depends on the availability of FSRD. Modeling and technology development are, in fact, empowered by the availability and robustness of FSRD. In this talk, examples of the impact of and requirements for FSRD in the innovation cycle enabled by plasma modeling will be discussed using results from multidimensional and global models. Examples of fundamental studies and technology optimization will focus on microelectronics fabrication and on optically pumped lasers. Modeling of systems as yet unbuilt will address the interaction of atmospheric pressure plasmas with liquids. Work supported by DOE Office of Fusion Energy Science and the National Science Foundation.

  18. Combustion Fundamentals Research

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Increased emphasis is placed on fundamental and generic research at Lewis Research Center, with fewer systems development efforts. This is especially true in combustion research, where the study of combustion fundamentals has grown significantly in order to better address the perceived long-term technical needs of the aerospace industry. The main thrusts for this combustion fundamentals program area are as follows: analytical models of combustion processes, model verification experiments, fundamental combustion experiments, and advanced numerical techniques.

  19. Exchange Rates and Fundamentals.

    ERIC Educational Resources Information Center

    Engel, Charles; West, Kenneth D.

    2005-01-01

    We show analytically that in a rational expectations present-value model, an asset price manifests near-random walk behavior if fundamentals are I (1) and the factor for discounting future fundamentals is near one. We argue that this result helps explain the well-known puzzle that fundamental variables such as relative money supplies, outputs,…

  20. Reconstruction of fundamental SUSY parameters

    SciTech Connect

    P. M. Zerwas et al.

    2003-09-25

    We summarize methods and expected accuracies in determining the basic low-energy SUSY parameters from experiments at future e⁺e⁻ linear colliders in the TeV energy range, combined with results from LHC. In a second step we demonstrate how, based on this set of parameters, the fundamental supersymmetric theory can be reconstructed at high scales near the grand unification or Planck scale. These analyses have been carried out for minimal supergravity [confronted with GMSB for comparison], and for a string effective theory.

  1. Making the Most of Minimalism in Music.

    ERIC Educational Resources Information Center

    Geiersbach, Frederick J.

    1998-01-01

    Describes the minimalist movement in music. Discusses generations of minimalist musicians and, in general, the minimalist approach. Considers various ways that minimalist strategies can be integrated into the music classroom focusing on (1) minimalism and (2) student-centered composition and principles of minimalism for use with elementary band…

  2. Itch Management: General Principles.

    PubMed

    Misery, Laurent

    2016-01-01

    Like pain, itch is a challenging condition that needs to be managed. Within this setting, the first principle of itch management is to get an appropriate diagnosis to perform an etiology-oriented therapy. In several cases it is not possible to treat the cause, the etiology is undetermined, there are several causes, or the etiological treatment is not effective enough to alleviate itch completely. This is also why there is need for symptomatic treatment. In all patients, psychological support and associated pragmatic measures might be helpful. General principles and guidelines are required, yet patient-centered individual care remains fundamental. PMID:27578069

  3. Basic principles of the Stirling cycle

    NASA Astrophysics Data System (ADS)

    1983-03-01

    The basic principles of the Stirling cycle are outlined. From an elementary theory, the general properties of the cycle are derived, with a discussion of the most important losses. The performance of the fundamental, ideal (isothermal) cycle is described, as is the actual cycle, which differs from the ideal one by the occurrence of losses. In the ideal Stirling cycle, the cold is produced by the reversible expansion of a gas. The gas performs a closed cycle, during which it is alternately compressed at ambient temperature in a compression space and expanded at the desired low temperature in an expansion space, reciprocating between these spaces through one connecting duct, wherein a regenerator provides for the heat exchange between the outgoing and the returning gas flow. The problem of how to minimize the total sum of the losses is examined.
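
    The ideal cycle described above can be summarized numerically: with isothermal compression at the ambient temperature, isothermal expansion at the low temperature, and a perfect regenerator, the per-cycle heats are Q = nRT ln(r) for volume ratio r, and the coefficient of performance reduces to the Carnot value T_c/(T_h − T_c). A minimal sketch with illustrative numbers (300 K ambient, 80 K cold space, r = 2):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def ideal_stirling_refrigerator(t_hot, t_cold, ratio, n_mol=1.0):
    """Per-cycle heats and work for the ideal isothermal Stirling
    refrigeration cycle with a perfect regenerator (losses neglected)."""
    q_cold = n_mol * R * t_cold * math.log(ratio)  # absorbed during expansion, J
    q_hot = n_mol * R * t_hot * math.log(ratio)    # rejected during compression, J
    w_in = q_hot - q_cold                          # net work input per cycle, J
    return q_cold, w_in, q_cold / w_in             # COP equals Tc / (Th - Tc)

qc, w, cop = ideal_stirling_refrigerator(300.0, 80.0, 2.0)
```

    The real cycle's COP falls below this bound by exactly the losses the report goes on to analyze (imperfect regeneration, dead volume, flow and heat-transfer losses).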

  4. Swarm robotics and minimalism

    NASA Astrophysics Data System (ADS)

    Sharkey, Amanda J. C.

    2007-09-01

    Swarm Robotics (SR) is closely related to Swarm Intelligence, and both were initially inspired by studies of social insects. Their guiding principles are based on their biological inspiration and take the form of an emphasis on decentralized local control and communication. Earlier studies went a step further in emphasizing the use of simple reactive robots that only communicate indirectly through the environment. More recently SR studies have moved beyond these constraints to explore the use of non-reactive robots that communicate directly, and that can learn and represent their environment. There is no clear agreement in the literature about how far such extensions of the original principles could go. Should there be any limitations on the individual abilities of the robots used in SR studies? Should knowledge of the capabilities of social insects lead to constraints on the capabilities of individual robots in SR studies? There is a lack of explicit discussion of such questions, and researchers have adopted a variety of constraints for a variety of reasons. A simple taxonomy of swarm robotics is presented here with the aim of addressing and clarifying these questions. The taxonomy distinguishes subareas of SR based on the emphases and justifications for minimalism and individual simplicity.

  5. A correspondence principle

    NASA Astrophysics Data System (ADS)

    Hughes, Barry D.; Ninham, Barry W.

    2016-02-01

    A single mathematical theme underpins disparate physical phenomena in classical, quantum and statistical mechanical contexts. This mathematical "correspondence principle", a kind of wave-particle duality with glorious realizations in classical and modern mathematical analysis, embodies fundamental geometrical and physical order, and yet in some sense sits on the edge of chaos. Illustrative cases discussed are drawn from classical and anomalous diffusion, quantum mechanics of single particles and ideal gases, quasicrystals and Casimir forces.

  6. A Fundamental Theorem on Particle Acceleration

    SciTech Connect

    Xie, Ming

    2003-05-01

    A fundamental theorem on particle acceleration is derived from the reciprocity principle of electromagnetism, and a rigorous proof of the theorem is presented. The theorem establishes a relation between acceleration and radiation, which is particularly useful for insightful understanding of, and practical calculation of, first-order acceleration, in which the energy gain of the accelerated particle is linearly proportional to the accelerating field.

  7. Sensors, Volume 1, Fundamentals and General Aspects

    NASA Astrophysics Data System (ADS)

    Grandke, Thomas; Ko, Wen H.

    1996-12-01

    'Sensors' is the first self-contained series to deal with the whole area of sensors. It describes general aspects, technical and physical fundamentals, construction, function, applications and developments of the various types of sensors. This volume deals with the fundamentals and common principles of sensors and covers the wide areas of principles, technologies, signal processing, and applications. Contents include: Sensor Fundamentals, e.g. Sensor Parameters, Modeling, Design and Packaging; Basic Sensor Technologies, e.g. Thin and Thick Films, Integrated Magnetic Sensors, Optical Fibres and Integrated Optics, Ceramics and Oxides; Sensor Interfaces, e.g. Signal Processing, Multisensor Signal Processing, Smart Sensors, Interface Systems; Sensor Applications, e.g. Automotive: On-board Sensors, Traffic Surveillance and Control, Home Appliances, Environmental Monitoring, etc. This volume is an indispensable reference work and textbook for both specialists and newcomers, researchers and developers.

  8. The New Minimal Standard Model

    SciTech Connect

    Davoudiasl, Hooman; Kitano, Ryuichiro; Li, Tianjun; Murayama, Hitoshi

    2005-01-13

    We construct the New Minimal Standard Model that incorporates the new discoveries of physics beyond the Minimal Standard Model (MSM): Dark Energy, non-baryonic Dark Matter, neutrino masses, as well as baryon asymmetry and cosmic inflation, adopting the principle of minimal particle content and the most general renormalizable Lagrangian. We base the model purely on empirical facts rather than aesthetics. We need only six new degrees of freedom beyond the MSM. It is free from excessive flavor-changing effects, CP violation, too-rapid proton decay, problems with electroweak precision data, and unwanted cosmological relics. Any model of physics beyond the MSM should be measured against the phenomenological success of this model.

  9. The Design of MACs (Minimal Actin Cortices)

    PubMed Central

    Vogel, Sven K; Heinemann, Fabian; Chwastek, Grzegorz; Schwille, Petra

    2013-01-01

    The actin cell cortex in eukaryotic cells is a key player in controlling and maintaining the shape of cells, and in driving major shape changes such as in cytokinesis. It is thereby constantly being remodeled. Cell shape changes require forces acting on membranes that are generated by the interplay of membrane-coupled actin filaments and assemblies of myosin motors. Little is known about how their interaction regulates actin cell cortex remodeling and cell shape changes. Because of the vital importance of actin, myosin motors and the cell membrane, selective in vivo experiments and manipulations are often difficult to perform or not feasible. Thus, the intelligent design of minimal in vitro systems for actin-myosin-membrane interactions could pave the way for investigating actin cell cortex mechanics in a detailed and quantitative manner. Here, we present and discuss the design of several bottom-up in vitro systems accomplishing the coupling of actin filaments to artificial membranes, where key parameters such as actin densities and membrane properties can be varied in a controlled manner. Insights gained from these in vitro systems may help to uncover fundamental principles of how exactly actin-myosin-membrane interactions govern actin cortex remodeling and membrane properties for cell shape changes. © 2013 Wiley Periodicals, Inc. PMID:24039068

  10. Principled Narrative

    ERIC Educational Resources Information Center

    MacBeath, John; Swaffield, Sue; Frost, David

    2009-01-01

    This article provides an overview of the "Carpe Vitam: Leadership for Learning" project, accounting for its provenance and purposes, before focusing on the principles for practice that constitute an important part of the project's legacy. These principles framed the dialogic process that was a dominant feature of the project and are presented,…

  11. Fundamental Physical Constants

    National Institute of Standards and Technology Data Gateway

    SRD 121 CODATA Fundamental Physical Constants (Web, free access)   This site, developed in the Physics Laboratory at NIST, addresses three topics: fundamental physical constants, the International System of Units (SI), which is the modern metric system, and expressing the uncertainty of measurement results.

  12. Pattern formation in a minimal model of continuum dislocation plasticity

    NASA Astrophysics Data System (ADS)

    Sandfeld, Stefan; Zaiser, Michael

    2015-09-01

    The spontaneous emergence of heterogeneous dislocation patterns is a conspicuous feature of plastic deformation and strain hardening of crystalline solids. Despite long-standing efforts in the materials science and physics of defect communities, there is no general consensus regarding the physical mechanism which leads to the formation of dislocation patterns. In order to establish the fundamental mechanism, we formulate an extremely simplified, minimal model to investigate the formation of patterns based on the continuum theory of fluxes of curved dislocations. We demonstrate that strain hardening as embodied in a Taylor-type dislocation density dependence of the flow stress, in conjunction with the structure of the kinematic equations that govern dislocation motion under the action of external stresses, is already sufficient for the formation of dislocation patterns that are consistent with the principle of similitude.
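
The Taylor-type dislocation density dependence of the flow stress invoked in this abstract can be sketched numerically: flow stress scales as the square root of dislocation density, tau = alpha * G * b * sqrt(rho). The coefficient and the aluminium-like material constants below are illustrative assumptions, not values taken from the paper.

```python
import math

# Hedged sketch of Taylor-type strain hardening. The constants are
# illustrative (roughly aluminium), not from Sandfeld & Zaiser's model.
ALPHA = 0.3          # dimensionless Taylor coefficient (typically ~0.1-0.5)
G = 27e9             # shear modulus, Pa
B = 2.86e-10         # Burgers vector magnitude, m

def taylor_flow_stress(rho: float) -> float:
    """Flow stress (Pa) for dislocation density rho (m^-2): alpha*G*b*sqrt(rho)."""
    return ALPHA * G * B * math.sqrt(rho)

for rho in (1e12, 1e14, 1e16):
    print(f"rho = {rho:.0e} m^-2 -> tau = {taylor_flow_stress(rho) / 1e6:.1f} MPa")
```

The square-root scaling means a hundredfold increase in dislocation density only raises the flow stress tenfold, which is the hardening behavior the abstract couples to the kinematic equations.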

  13. Esophagectomy - minimally invasive

    MedlinePlus

    Minimally invasive esophagectomy; Robotic esophagectomy; Removal of the esophagus - minimally invasive; Achalasia - esophagectomy; Barrett esophagus - esophagectomy; Esophageal cancer - esophagectomy - laparoscopic; Cancer of the ...

  14. Fundamental principles and applications of natural gas hydrates

    NASA Astrophysics Data System (ADS)

    Sloan, E. Dendy

    2003-11-01

    Natural gas hydrates are solid, non-stoichiometric compounds of small gas molecules and water. They form when the constituents come into contact at low temperature and high pressure. The physical properties of these compounds, most notably that they are non-flowing crystalline solids that are denser than typical fluid hydrocarbons and that the gas molecules they contain are effectively compressed, give rise to numerous applications in the broad areas of energy and climate effects. In particular, they have an important bearing on flow assurance and safety issues in oil and gas pipelines, they offer a largely unexploited means of energy recovery and transportation, and they could play a significant role in past and future climate change.

  15. Fundamental principles and applications of natural gas hydrates.

    PubMed

    Sloan, E Dendy

    2003-11-20

    Natural gas hydrates are solid, non-stoichiometric compounds of small gas molecules and water. They form when the constituents come into contact at low temperature and high pressure. The physical properties of these compounds, most notably that they are non-flowing crystalline solids that are denser than typical fluid hydrocarbons and that the gas molecules they contain are effectively compressed, give rise to numerous applications in the broad areas of energy and climate effects. In particular, they have an important bearing on flow assurance and safety issues in oil and gas pipelines, they offer a largely unexploited means of energy recovery and transportation, and they could play a significant role in past and future climate change. PMID:14628065

  16. Fundamental Principles of Writing a Successful Grant Proposal

    PubMed Central

    Chung, Kevin C.; Shauver, Melissa J.

    2015-01-01

    It is important for the field of hand surgery to develop a new generation of surgeon-scientists who can produce high impact studies to raise the profile of this specialty. To this end, organizations such as the American Society for Surgery of the Hand have initiated programs to promote multicenter clinical research that can be competitive for fiscal support from the National Institutes of Health and other funding agencies. Crafting a well-structured grant proposal is critical to securing adequate funding to investigate the many ambitious clinical and basic science projects in hand surgery. In this paper, we present the key elements to a successful grant proposal to help potential applicants to navigate the complex pathways in the grant application process. PMID:18406962

  17. Governing during an Institutional Crisis: 10 Fundamental Principles

    ERIC Educational Resources Information Center

    White, Lawrence

    2012-01-01

    In today's world, managing a campus crisis poses special challenges for an institution's governing board, which may operate some distance removed from the immediate events giving rise to the crisis. In its most challenging form, a campus crisis--a shooting, a natural disaster, a fraternity hazing death, the arrest of a prominent campus…

  18. 47 CFR 36.2 - Fundamental principles underlying procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... interexchange classification, there are three broad types of plant, i.e., operator systems, switching plant, trunk equipment and subscriber plant. Subscriber plant... basis for measuring the use of local and toll switching plant. (iii) Conversation-minute-kilometers...

  19. Developing Fundamental Principles for Teacher Education Programs and Practices

    ERIC Educational Resources Information Center

    Korthagen, Fred; Loughran, John; Russell, Tom

    2006-01-01

    Traditional approaches to teacher education are increasingly critiqued for their limited relationship to student teachers' needs and for their meager impact on practice. Many pleas are heard for a radical new and effective pedagogy of teacher education in which theory and practice are linked effectively. Although various attempts to restructure…

  20. DESIGN AND ANALYSIS FOR THEMATIC MAP ACCURACY ASSESSMENT: FUNDAMENTAL PRINCIPLES

    EPA Science Inventory

    Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...

  1. Fundamental monogamy relation between contextuality and nonlocality.

    PubMed

    Kurzyński, Paweł; Cabello, Adán; Kaszlikowski, Dagomir

    2014-03-14

    We show that the no-disturbance principle imposes a tradeoff between locally contextual correlations violating the Klyachko-Can-Binicioǧlu-Shumovski inequality and spatially separated correlations violating the Clauser-Horne-Shimony-Holt inequality. The violation of one inequality forbids the violation of the other. We also obtain the corresponding monogamy relation imposed by quantum theory for a qutrit-qubit system. Our results show the existence of fundamental monogamy relations between contextuality and nonlocality that suggest that entanglement might be a particular form of a more fundamental resource. PMID:24679270

  2. Laser Wakefield Acceleration and Fundamental Physics

    SciTech Connect

    Tajima, Toshiki

    2011-06-20

    The laser wakefield acceleration (LWFA) along with the now available laser technology allows us to look at TeV physics both in leptons and hadrons. Near future proof-of-principle experiments for a collider as well as high energy frontier experiments without a collider paradigm are suggested. The intense laser can also contribute to other fundamental physics explorations such as those of dark matter and dark energy candidates. Finally the combination of intense laser and laser-accelerated particles (electrons, hadrons, gammas) provides a further avenue of fundamental research.

  3. Fundamental Monogamy Relation between Contextuality and Nonlocality

    NASA Astrophysics Data System (ADS)

    Kurzyński, Paweł; Cabello, Adán; Kaszlikowski, Dagomir

    2014-03-01

    We show that the no-disturbance principle imposes a tradeoff between locally contextual correlations violating the Klyachko-Can-Binicioǧlu-Shumovski inequality and spatially separated correlations violating the Clauser-Horne-Shimony-Holt inequality. The violation of one inequality forbids the violation of the other. We also obtain the corresponding monogamy relation imposed by quantum theory for a qutrit-qubit system. Our results show the existence of fundamental monogamy relations between contextuality and nonlocality that suggest that entanglement might be a particular form of a more fundamental resource.

  4. The Subordination of Aesthetic Fundamentals in College Art Instruction

    ERIC Educational Resources Information Center

    Lavender, Randall

    2003-01-01

    Opportunities for college students of art and design to study fundamentals of visual aesthetics, integrity of form, and principles of composition are limited today by a number of factors. With the well-documented prominence of postmodern critical theory in the world of contemporary art, the study of aesthetic fundamentals is largely subordinated…

  5. Toward a Minimal Artificial Axon.

    PubMed

    Ariyaratne, Amila; Zocchi, Giovanni

    2016-07-01

    The electrophysiology of action potentials is usually studied in neurons, through relatively demanding experiments which are difficult to scale up to a defined network. Here we pursue instead the minimal artificial system based on the essential biological components-ion channels and lipid bilayers-where action potentials can be generated, propagated, and eventually networked. The fundamental unit is the classic supported bilayer: a planar bilayer patch with embedded ion channels in a fluidic environment where an ionic gradient is imposed across the bilayer. Two such units electrically connected form the basic building block for a network. The system is minimal in that we demonstrate that one kind of ion channel and correspondingly a gradient of only one ionic species is sufficient to generate an excitable system which shows amplification and threshold behavior. PMID:27049652

  6. Basic principles of remote sensing. [bibliography

    NASA Technical Reports Server (NTRS)

    Clapp, J. L.

    1973-01-01

    Forty eight selected bibliographic references dealing with the remote sensing of the environment are given. Emphasis was placed on data that deal with fundamental aspects and principles of the technique.

  7. Prepulse minimization in KALI-5000.

    PubMed

    Kumar, D Durga Praveen; Mitra, S; Senthil, K; Sharma, Vishnu K; Singh, S K; Roy, A; Sharma, Archana; Nagesh, K V; Chakravarthy, D P

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given. PMID:19655979

  8. Prepulse minimization in KALI-5000

    NASA Astrophysics Data System (ADS)

    Kumar, D. Durga Praveen; Mitra, S.; Senthil, K.; Sharma, Vishnu K.; Singh, S. K.; Roy, A.; Sharma, Archana; Nagesh, K. V.; Chakravarthy, D. P.

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given.

  9. Fundamentals of fluid sealing

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamentals of fluid sealing, including seal operating regimes, are discussed and the general fluid-flow equations for fluid sealing are developed. Seal performance parameters such as leakage and power loss are presented. Included in the discussion are the effects of geometry, surface deformations, rotation, and both laminar and turbulent flows. The concept of pressure balancing is presented, as are differences between liquid and gas sealing. Mechanisms of seal surface separation, fundamental friction and wear concepts applicable to seals, seal materials, and pressure-velocity (PV) criteria are discussed.

  10. Common Principles and Multiculturalism

    PubMed Central

    Zahedi, Farzaneh; Larijani, Bagher

    2009-01-01

    Judgment on the rightness and wrongness of beliefs and behaviors is a central issue in bioethics. Over the centuries, great philosophers and ethicists have debated the appropriate tools for determining which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has produced the misconception that absolute, Westernized principles are the appropriate tools for ethical decision making in all cultures. We discuss this issue by introducing a clinical case. Considering the variety of cultural beliefs around the world, and although it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, are consistent with this idea. PMID:23908720

  11. The principle of reciprocity.

    PubMed

    Hoult, D I

    2011-12-01

    The circumstances surrounding the realisation that NMR signal reception could be quantified in a simple fundamental manner using Lorentz's Principle of Reciprocity are described. The poor signal-to-noise ratio of the first European superconducting magnet is identified as a major motivating factor, together with the author's need to understand phenomena at a basic level. A summary is then given of the thought processes leading to the very simple pseudo-static formula that has been the basis of signal-to-noise calculations for over a generation. PMID:21889377

  12. About the ZOOM minimization package

    SciTech Connect

    Fischler, M.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.
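
The core task such a package automates, numerical minimization of a user-supplied objective function, can be illustrated with a toy finite-difference gradient descent. This is our own Python sketch of the general technique; it mimics none of ZOOM's (or Minuit's) actual C++ API.

```python
# Illustrative minimizer sketch, not ZOOM's interface: estimate the gradient
# by central differences and step downhill until (near) convergence.
def grad_descent(f, x0, lr=0.1, steps=2000, h=1e-6):
    x = list(x0)
    for _ in range(steps):
        g = []
        for i in range(len(x)):
            xp = x[:]; xp[i] += h
            xm = x[:]; xm[i] -= h
            g.append((f(xp) - f(xm)) / (2 * h))   # central-difference gradient
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# A convex test objective with its minimum at (1, -2).
paraboloid = lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2
print(grad_descent(paraboloid, [0.0, 0.0]))   # approaches (1, -2)
```

Real HEP minimizers add line searches, Hessian estimates, and parameter-error analysis on top of this basic idea, which is where an extensible object-oriented design pays off.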

  13. Reading Is Fundamental, 1977.

    ERIC Educational Resources Information Center

    Smithsonian Institution, Washington, DC. National Reading is Fundamental Program.

    Reading Is Fundamental (RIF) is a national, nonprofit organization designed to motivate children to read by making a wide variety of inexpensive books available to them and allowing the children to choose and keep books that interest them. This annual report for 1977 contains the following information on the RIF project: an account of the…

  14. Fundamentals of soil science

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study guide provides comments and references for professional soil scientists who are studying for the soil science fundamentals exam needed as the first step for certification. The performance objectives were determined by the Soil Science Society of America's Council of Soil Science Examiners...

  15. Fundamentals of tribology

    SciTech Connect

    Suh, N.P.; Saka, N.

    1980-01-01

    This book presents the proceedings of the June 1978 International Conference on the Fundamentals of Tribology. The papers discuss the effects of surface topography and of the properties of materials on wear; friction, wear, and thermomechanical effects; wear mechanisms in metal processing; polymer wear; wear monitoring and prevention; and lubrication. (LCL)

  16. Food Service Fundamentals.

    ERIC Educational Resources Information Center

    Marine Corps Inst., Washington, DC.

    Developed as part of the Marine Corps Institute (MCI) correspondence training program, this course on food service fundamentals is designed to provide a general background in the basic aspects of the food service program in the Marine Corps; it is adaptable for nonmilitary instruction. Introductory materials include specific information for MCI…

  17. Unification of Fundamental Forces

    NASA Astrophysics Data System (ADS)

    Salam, Abdus; Taylor, Foreword by John C.

    2005-10-01

    Foreword John C. Taylor; 1. Unification of fundamental forces Abdus Salam; 2. History unfolding: an introduction to the two 1968 lectures by W. Heisenberg and P. A. M. Dirac Abdus Salam; 3. Theory, criticism, and a philosophy Werner Heisenberg; 4. Methods in theoretical physics Paul Adrien Maurice Dirac.

  18. Fundamentals of Library Instruction

    ERIC Educational Resources Information Center

    McAdoo, Monty L.

    2012-01-01

    Being a great teacher is part and parcel of being a great librarian. In this book, veteran instruction services librarian McAdoo lays out the fundamentals of the discipline in easily accessible language. Succinctly covering the topic from top to bottom, he: (1) Offers an overview of the historical context of library instruction, drawing on recent…

  19. Fundamental research data base

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A fundamental research data base containing ground truth, image, and Badhwar profile feature data for 17 North Dakota, South Dakota, and Minnesota agricultural sites is described. Image data was provided for a minimum of four acquisition dates for each site and all four images were registered to one another.

  20. Laser Fundamentals and Experiments.

    ERIC Educational Resources Information Center

    Van Pelt, W. F.; And Others

    As a result of work performed at the Southwestern Radiological Health Laboratory with respect to lasers, this manual was prepared in response to the increasing use of lasers in high schools and colleges. It is directed primarily toward the high school instructor who may use the text for a short course in laser fundamentals. The definition of the…

  1. The Fundamental Property Relation.

    ERIC Educational Resources Information Center

    Martin, Joseph J.

    1983-01-01

    Discusses a basic equation in thermodynamics (the fundamental property relation), focusing on a logical approach to the development of the relation where effects other than thermal, compression, and exchange of matter with the surroundings are considered. Also demonstrates erroneous treatments of the relation in three well-known textbooks. (JN)

  2. Basic Publication Fundamentals.

    ERIC Educational Resources Information Center

    Savedge, Charles E., Ed.

    Designed for students who produce newspapers and newsmagazines in junior high, middle, and elementary schools, this booklet is both a scorebook and a fundamentals text. The scorebook provides realistic criteria for judging publication excellence at these educational levels. All the basics for good publications are included in the text of the…

  3. The 4th Thermodynamic Principle?

    SciTech Connect

    Montero Garcia, Jose de la Luz; Novoa Blanco, Jesus Francisco

    2007-04-28

    It should be emphasized that the 4th Principle formulated above is a thermodynamic principle and, at the same time, quantum-mechanical and relativistic, as it inevitably must be; its absence has been one of the main theoretical limitations of physical theory until today. We show that from the theoretical discovery of the Dimensional Primitive Octet of Matter, the 4th Thermodynamic Principle, the Quantum Hexet of Matter, the Global Hexagonal Subsystem of Fundamental Constants of Energy, and the Measurement or Connected Global Scale or Universal Existential Interval of Matter, it is possible to arrive at a global formulation of the four 'forces' or fundamental interactions of nature. Einstein's golden dream is possible.

  4. Fundamentals of Refrigeration.

    ERIC Educational Resources Information Center

    Sutliff, Ronald D.; And Others

    This self-study course is designed to familiarize Marine enlisted personnel with the principles of the refrigeration process. The course contains five study units. Each study unit begins with a general objective, which is a statement of what the student should learn from the unit. The study units are divided into numbered work units, each…

  5. Marine Electrician--Fundamentals.

    ERIC Educational Resources Information Center

    Sutliff, Ronald D.; And Others

    This self-study course is designed to familiarize Marine Corps enlisted personnel with the principles of electricity, safety, and tools. The course contains three study units. Each study unit begins with a general objective, which is a statement of what the student should learn from the unit. The study units are divided into numbered work units,…

  6. FUNDAMENTALS OF TELEVISION SYSTEMS.

    ERIC Educational Resources Information Center

    KESSLER, WILLIAM J.

    DESIGNED FOR A READER WITHOUT SPECIAL TECHNICAL KNOWLEDGE, THIS ILLUSTRATED RESOURCE PAPER EXPLAINS THE COMPONENTS OF A TELEVISION SYSTEM AND RELATES THEM TO THE COMPLETE SYSTEM. SUBJECTS DISCUSSED ARE THE FOLLOWING--STUDIO ORGANIZATION AND COMPATIBLE COLOR TELEVISION PRINCIPLES, WIRED AND RADIO TRANSMISSION SYSTEMS, DIRECT VIEW AND PROJECTION…

  7. Minimal change disease

    MedlinePlus

    ... seen under a very powerful microscope called an electron microscope. Minimal change disease is the most common ... biopsy and examination of the tissue with an electron microscope can show signs of minimal change disease.

  8. Minimal change disease

    MedlinePlus

    Minimal change nephrotic syndrome; Nil disease; Lipoid nephrosis; Idiopathic nephrotic syndrome of childhood ... which filter blood and produce urine. In minimal change disease, there is damage to the glomeruli. These ...

  9. Ecological Principles and Guidelines for Managing the Use of Land

    SciTech Connect

    Dale, Virginia H; Brown, Sandra; Haeuber, R A; Hobbs, N T; Huntly, N; Naiman, R J; Riebsame, W E; Turner, M G; Valone, T J

    2014-01-01

    The many ways that people have used and managed land throughout history has emerged as a primary cause of land-cover change around the world. Thus, land use and land management increasingly represent a fundamental source of change in the global environment. Despite their global importance, however, many decisions about the management and use of land are made with scant attention to ecological impacts. Thus, ecologists' knowledge of the functioning of Earth's ecosystems is needed to broaden the scientific basis of decisions on land use and management. In response to this need, the Ecological Society of America established a committee to examine the ways that land-use decisions are made and the ways that ecologists could help inform those decisions. This paper reports the scientific findings of that committee. Five principles of ecological science have particular implications for land use and can assure that fundamental processes of Earth's ecosystems are sustained. These ecological principles deal with time, species, place, disturbance, and the landscape. The recognition that ecological processes occur within a temporal setting and change over time is fundamental to analyzing the effects of land use. In addition, individual species and networks of interacting species have strong and far-reaching effects on ecological processes. Furthermore, each site or region has a unique set of organisms and abiotic conditions influencing and constraining ecological processes. Disturbances are important and ubiquitous ecological events whose effects may strongly influence population, community, and ecosystem dynamics. Finally, the size, shape, and spatial relationships of habitat patches on the landscape affect the structure and function of ecosystems. The responses of the land to changes in use and management by people depend on expressions of these fundamental principles in nature. These principles dictate several guidelines for land use. The guidelines give practical

  10. Fundamentals of Polarized Light

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael

    2003-01-01

    The analytical and numerical basis for describing scattering properties of media composed of small discrete particles is formed by the classical electromagnetic theory. Although there are several excellent textbooks outlining the fundamentals of this theory, it is convenient for our purposes to begin with a summary of those concepts and equations that are central to the subject of this book and will be used extensively in the following chapters. We start by formulating Maxwell's equations and constitutive relations for time-harmonic macroscopic electromagnetic fields and derive the simplest plane-wave solution that underlies the basic optical idea of a monochromatic parallel beam of light. This solution naturally leads to the introduction of such fundamental quantities as the refractive index and the Stokes parameters. Finally, we define the concept of a quasi-monochromatic beam of light and discuss its implications.
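
The Stokes parameters mentioned here can be computed directly from the complex transverse field amplitudes of a plane wave. The sketch below uses one common (I, Q, U, V) convention; sign conventions, particularly for V, differ between texts, so treat it as illustrative rather than as the book's definition.

```python
# Hedged sketch: Stokes parameters of a monochromatic plane wave from its
# complex amplitudes along two orthogonal transverse axes. One common
# convention; the sign of V varies between references.
def stokes(ex: complex, ey: complex):
    i = abs(ex) ** 2 + abs(ey) ** 2           # total intensity
    q = abs(ex) ** 2 - abs(ey) ** 2           # x vs. y linear polarization
    u = 2 * (ex * ey.conjugate()).real        # +45 vs. -45 linear polarization
    v = -2 * (ex * ey.conjugate()).imag       # circular polarization
    return i, q, u, v

print(stokes(1 + 0j, 0 + 1j))   # equal amplitudes, 90° phase: fully circular
print(stokes(1 + 0j, 0 + 0j))   # horizontal linear: all intensity in Q
```

For any fully polarized wave these satisfy I² = Q² + U² + V², which is a quick consistency check on the convention chosen.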

  11. Fundamentals of Geophysics

    NASA Astrophysics Data System (ADS)

    Frohlich, Cliff

    Choosing an intermediate-level geophysics text is always problematic: What should we teach students after they have had introductory courses in geology, math, and physics, but little else? Fundamentals of Geophysics is aimed specifically at these intermediate-level students, and the author's stated approach is to construct a text “using abundant diagrams, a simplified mathematical treatment, and equations in which the student can follow each derivation step-by-step.” Moreover, for Lowrie, the Earth is round, not flat—the “fundamentals of geophysics” here are the essential properties of our Earth the planet, rather than useful techniques for finding oil and minerals. Thus this book is comparable in both level and approach to C. M. R. Fowler's The Solid Earth (Cambridge University Press, 1990).

  12. Fundamental limits on EMC

    NASA Astrophysics Data System (ADS)

    Showers, R. M.; Lin, S.-Y.; Schulz, R. B.

    1981-02-01

    Both fundamental and state-of-the-art limits are treated with emphasis on the former. Fundamental limits result from both natural and man-made electromagnetic noise which then affect two basic ratios, signal-to-noise (S/N) and extraneous-input-to-noise (I/N). Tolerable S/N values are discussed for both digital and analog communications systems. These lead to tolerable signal-to-extraneous-input (S/I) ratios, again for digital and analog communications systems, as well as radar and sonar. State-of-the-art limits for transmitters include RF noise emission, spurious emissions, and intermodulation. Receiver limits include adjacent-channel interactions, image, IF, and other spurious responses, including cross modulation, intermodulation, and desensitization. Unintentional emitters and receivers are also discussed. Coupling limitations between undesired sources and receptors are considered from mechanisms including radiation, induction, and conduction.
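
Because EMC budgets combine such ratios logarithmically, the S/N, I/N, and S/I ratios discussed above are normally worked in decibels, where multiplication of power ratios becomes addition. A small helper illustrates this; the power levels are invented for illustration.

```python
import math

# Decibel helper for power ratios (our addition, not from the paper):
# 10*log10 of a power ratio, so S/I (dB) = S/N (dB) - I/N (dB).
def db(ratio: float) -> float:
    return 10 * math.log10(ratio)

signal, noise, interference = 1e-6, 1e-12, 1e-10   # watts, illustrative only
print(f"S/N = {db(signal / noise):.1f} dB")        # 60.0 dB
print(f"I/N = {db(interference / noise):.1f} dB")  # 20.0 dB
print(f"S/I = {db(signal / interference):.1f} dB") # 40.0 dB
```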

  13. Free-energy minimization and the dark-room problem.

    PubMed

    Friston, Karl; Thornton, Christopher; Clark, Andy

    2012-01-01

    Recent years have seen the emergence of an important new fundamental theory of brain function. This theory brings information-theoretic, Bayesian, neuroscientific, and machine learning approaches into a single framework whose overarching principle is the minimization of surprise (or, equivalently, the maximization of expectation). The most comprehensive such treatment is the "free-energy minimization" formulation due to Karl Friston (see e.g., Friston and Stephan, 2007; Friston, 2010a,b - see also Fiorillo, 2010; Thornton, 2010). A recurrent puzzle raised by critics of these models is that biological systems do not seem to avoid surprises. We do not simply seek a dark, unchanging chamber, and stay there. This is the "Dark-Room Problem." Here, we describe the problem and further unpack the issues to which it speaks. Using the same format as the prolog of Eddington's Space, Time, and Gravitation (Eddington, 1920) we present our discussion as a conversation between: an information theorist (Thornton), a physicist (Friston), and a philosopher (Clark). PMID:22586414
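
As a toy illustration of surprise minimization (our own construction, not Friston's actual free-energy formulation), the average surprise -ln p(o) of a stream of observations can be compared under two generative models: the model that matches the world incurs less average surprise than a "dark room" prior that the world keeps violating.

```python
import math

# Toy sketch of surprise minimization. "Surprise" of an observation o under
# a generative model p is -ln p(o); averaging it over observations gives the
# cross-entropy between world and model.
def surprise(p_obs: float) -> float:
    return -math.log(p_obs)

def avg_surprise(model: dict, obs: list) -> float:
    """Average surprise (cross-entropy) of the observations under a model."""
    return sum(surprise(model[o]) for o in obs) / len(obs)

obs = ["light", "light", "dark", "light"]   # what the world actually delivers
model_a = {"light": 0.75, "dark": 0.25}     # roughly matches the world
model_b = {"light": 0.05, "dark": 0.95}     # a "dark room" prior
print(avg_surprise(model_a, obs))   # lower: the matching model is less surprised
print(avg_surprise(model_b, obs))   # higher: the world keeps violating this prior
```

The Dark-Room Problem asks why agents minimizing surprise do not simply seek trivially predictable environments; the toy shows only the bookkeeping, not the resolution discussed in the paper.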

  14. Fundamental studies in geodynamics

    NASA Technical Reports Server (NTRS)

    Anderson, D. L.; Hager, B. H.; Kanamori, H.

    1981-01-01

    Research in fundamental studies in geodynamics continued in a number of fields including seismic observations and analysis, synthesis of geochemical data, theoretical investigation of geoid anomalies, extensive numerical experiments in a number of geodynamical contexts, and a new field, seismic volcanology. Summaries of work in progress or completed during this report period are given. Abstracts of publications submitted from work in progress during this report period are attached as an appendix.

  15. Value of Fundamental Science

    NASA Astrophysics Data System (ADS)

    Burov, Alexey

    Fundamental science is a hard, long-term human adventure that has required high devotion and social support, especially significant in our epoch of Mega-science. The measure of this devotion and this support expresses the real value of fundamental science in public opinion. Why does fundamental science have value? What determines its strength and what endangers it? The dominant answer is that the value of science arises out of curiosity and is supported by technological progress. Is this really a good, astute answer? When trying to attract public support, we talk about the ``mystery of the universe''. Why do these words sound so attractive? What is implied by and what is incompatible with them? More than two centuries ago, Immanuel Kant asserted an inseparable entanglement between ethics and metaphysics. Thus, we may ask: which metaphysics supports the value of scientific cognition, and which does not? Should we continue to neglect the dependence of the value of pure science on metaphysics? If not, how can this issue be addressed in public outreach? Is the public alienated by one or another message coming from the face of science? What does it mean to be politically correct in this sort of discussion?

  16. [Principle of least action, physiology of vision, and conditioned reflexes theory].

    PubMed

    Shelepin, Iu E; Krasil'nikov, N N

    2003-06-01

Variational principles such as Maupertuis's principle of least action (1740) and Fermat's principle (1660) are fundamental to physics. They make it possible to identify the property that distinguishes the actual state of a system from all of its possible states. The variational approach allows the equations of motion and equilibrium of a material system to be established from one common rule, which reduces to a search for the extrema of a function describing this property of the system. For optical systems, what is crucial is the time, not the length of the path: according to Fermat's principle, light "chooses" from all possible paths connecting two points the one that requires the least time. The generality of variational principles guarantees the success of their use in investigations of brain function. Among the various attempts to apply variational principles to psychology and linguistics, Zipf's principle of least effort stands out: Zipf (1949) demonstrated that natural languages and some artificial codes satisfy a least principle. In brain physiology, classical conditioned reflex theory is an ideal area for applying variational principles; in this approach, conditioning amounts to finding an extremum during fixation of the temporal link. In vision, physiological investigation is difficult because the signal has many dimensions. For example, during perception of the spatial properties of the surrounding world, the visual system minimizes (reduces) the spatial-frequency spectrum of the scene. Receptive fields provide optimal accumulation of the signal, and in ontogenesis the signal-to-noise ratio becomes optimal as receptive fields minimize the internal noise spectrum. According to matched-filter theory, recognition in the visual system is carried out by minimizing the difference between the description of an image in the visual system and a template of that image stored in human memory. The variational principles help to discover the physical property of
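The Zipf rank-frequency law mentioned above is easy to state quantitatively. A minimal sketch, not from the paper itself (the function name and parameters are illustrative): the r-th most frequent word occurs with relative frequency proportional to 1/r^s.

```python
def zipf_frequency(rank, vocab_size, s=1.0):
    """Relative frequency of the rank-th most common word under Zipf's
    law f(r) = r**-s / H, where H normalizes over the whole vocabulary."""
    H = sum(1.0 / k**s for k in range(1, vocab_size + 1))
    return (1.0 / rank**s) / H

# With s = 1 the second-ranked word is half as frequent as the first.
ratio = zipf_frequency(1, 1000) / zipf_frequency(2, 1000)
```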

  17. Quantum mechanics and the generalized uncertainty principle

    SciTech Connect

    Bang, Jang Young; Berger, Micheal S.

    2006-12-15

    The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.
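In the minimal-length literature, the generalized uncertainty principle referred to here is commonly written in the following standard form (quoted for orientation, not taken verbatim from this paper):

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^2\right)
\quad\Longrightarrow\quad
\Delta x \;\ge\; \frac{\hbar}{2}\left(\frac{1}{\Delta p} + \beta\,\Delta p\right),
```

so that minimizing the right-hand side over $\Delta p$ (at $\Delta p = 1/\sqrt{\beta}$) yields a minimal position uncertainty $\Delta x_{\min} = \hbar\sqrt{\beta}$.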

  18. Review of receptor model fundamentals

    NASA Astrophysics Data System (ADS)

    Henry, Ronald C.; Lewis, Charles W.; Hopke, Philip K.; Williamson, Hugh J.

    There are several broad classes of mathematical models used to apportion the aerosol measured at a receptor site to its likely sources. This paper surveys the two types applied in exercises for the Mathematical and Empirical Receptor Models Workshop (Quail Roost II): chemical mass balance models and multivariate models. The fundamental principles of each are reviewed. Also considered are the specific models available within each class. These include: tracer element, linear programming, ordinary linear least-squares, effective variance least-squares and ridge regression (all solutions to the chemical mass balance equation), and factor analysis, target transformation factor analysis, multiple linear regression and extended Q-mode factor analysis (all multivariate models). In practical application of chemical mass balance models, a frequent problem is the presence of two or more emission sources whose signatures are very similar. Several techniques to reduce the effects of such multicollinearity are discussed. The propagation of errors for source contribution estimates, another practical concern, also is given special attention.
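The ordinary linear least-squares solution of the chemical mass balance equation can be sketched in a few lines: measured receptor concentrations c are modeled as F·s, where the columns of F are source profiles and s holds the source contributions. The profile values below are made up for illustration.

```python
import numpy as np

# Hypothetical source profiles: rows are chemical species, columns are
# sources; entries are mass fractions of each species in each source.
F = np.array([
    [0.40, 0.05],   # species 1
    [0.10, 0.30],   # species 2
    [0.05, 0.25],   # species 3
])

s_true = np.array([10.0, 4.0])   # "unknown" source contributions, ug/m^3
c = F @ s_true                   # ambient concentrations at the receptor

# Ordinary linear least-squares solution of the mass balance c = F s
s_hat, *_ = np.linalg.lstsq(F, c, rcond=None)
print(s_hat)   # recovers s_true for this noise-free synthetic data
```

With noisy data or near-collinear source profiles (the multicollinearity problem the abstract mentions), the effective-variance or ridge-regression variants weight or regularize this same system.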

  19. Fundamental experiments in velocimetry

    SciTech Connect

    Briggs, Matthew Ellsworth; Hull, Larry; Shinas, Michael

    2009-01-01

    One can understand what velocimetry does and does not measure by understanding a few fundamental experiments. Photon Doppler Velocimetry (PDV) is an interferometer that will produce fringe shifts when the length of one of the legs changes, so we might expect the fringes to change whenever the distance from the probe to the target changes. However, by making PDV measurements of tilted moving surfaces, we have shown that fringe shifts from diffuse surfaces are actually measured only from the changes caused by the component of velocity along the beam. This is an important simplification in the interpretation of PDV results, arising because surface roughness randomizes the scattered phases.
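The simplification described above, that a diffuse surface produces fringe shifts only from the velocity component along the beam, is captured by the standard Doppler beat-frequency relation f_b = 2·v_beam/λ. A small sketch (the 1550 nm wavelength is a typical PDV choice, assumed here):

```python
import math

LAMBDA = 1550e-9  # laser wavelength in meters (typical for PDV; assumed)

def beat_frequency(speed, tilt_deg):
    """Beat frequency (Hz) for a diffuse surface moving at `speed` (m/s)
    whose velocity makes angle `tilt_deg` with the probe beam.
    Only the component along the beam shifts the fringes."""
    v_along_beam = speed * math.cos(math.radians(tilt_deg))
    return 2.0 * v_along_beam / LAMBDA

f0 = beat_frequency(100.0, 0.0)    # head-on: ~129 MHz at 100 m/s
f60 = beat_frequency(100.0, 60.0)  # tilted 60 degrees: half the shift
```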

  20. Fundamental research data base

    NASA Technical Reports Server (NTRS)

    1983-01-01

A fundamental research data base was created on a single 9-track 1600 BPI tape containing ground truth, image, and Badhwar profile feature data for 17 North Dakota, South Dakota, and Minnesota agricultural sites. Each site is 5 x 6 nautical miles in area. Image data has been provided for a minimum of four acquisition dates for each site, and all four images have been registered to one another. A list of the order of the files on tape and the dates of acquisition is provided.

  1. Fundamentals of satellite navigation

    NASA Astrophysics Data System (ADS)

    Stiller, A. H.

    The basic operating principles and capabilities of conventional and satellite-based navigation systems for air, sea, and land vehicles are reviewed and illustrated with diagrams. Consideration is given to autonomous onboard systems; systems based on visible or radio beacons; the Transit, Cicada, Navstar-GPS, and Glonass satellite systems; the physical laws and parameters of satellite motion; the definition of time in satellite systems; and the content of the demodulated GPS data signal. The GPS and Glonass data format frames are presented graphically, and tables listing the GPS and Glonass satellites, their technical characteristics, and the (past or scheduled) launch dates are provided.
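The core of satellite navigation reviewed here is the pseudorange observable: measured range equals geometric range plus the range error from the receiver clock offset, which is why an extra satellite beyond the number of position coordinates is always needed. A toy 2-D sketch (three hypothetical "satellites", positions and bias invented) solving for position and clock bias by Gauss-Newton iteration:

```python
import numpy as np

# Three "satellites" at known positions (m); receiver position and clock
# bias are unknown. Bias is expressed in meters (b = c * dt).
sats = np.array([[0.0, 20.2e6], [15.0e6, 14.0e6], [-12.0e6, 16.0e6]])
rx_true = np.array([1.0e6, 2.0e5])
bias_true = 30.0

# Measured pseudoranges: geometric range plus clock-bias range error
ranges = np.linalg.norm(sats - rx_true, axis=1) + bias_true

# Gauss-Newton on the unknowns (x, y, b), starting from the origin
est = np.array([0.0, 0.0, 0.0])
for _ in range(10):
    d = np.linalg.norm(sats - est[:2], axis=1)
    residual = ranges - (d + est[2])
    # Jacobian of predicted pseudorange w.r.t. (x, y, b):
    # unit vectors from receiver toward satellites (negated), plus ones
    J = np.hstack([-(sats - est[:2]) / d[:, None], np.ones((3, 1))])
    est += np.linalg.lstsq(J, residual, rcond=None)[0]
```

In 3-D the same scheme needs four satellites for the four unknowns (x, y, z, b), which is the familiar GPS minimum.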

  2. The Role of Fisher Information Theory in the Development of Fundamental Laws in Physical Chemistry

    ERIC Educational Resources Information Center

    Honig, J. M.

    2009-01-01

    The unifying principle that involves rendering the Fisher information measure an extremum is reviewed. It is shown that with this principle, in conjunction with appropriate constraints, a large number of fundamental laws can be derived from a common source in a unified manner. The resulting economy of thought pertaining to fundamental principles…

  3. Semi-analytical formulation of modal dispersion parameter of an optical fiber with Kerr nonlinearity and using a novel fundamental modal field approximation

    NASA Astrophysics Data System (ADS)

    Choudhury, Raja Roy; Choudhury, Arundhati Roy; Ghose, Mrinal Kanti

    2013-09-01

To characterize nonlinear optical fibers, a semi-analytical formulation using the variational principle and the Nelder-Mead simplex method for nonlinear unconstrained minimization is proposed. The number of optimization parameters used in optimizing the core parameter U has been increased to give more flexibility to an innovative form of the fundamental modal field. This formulation provides accurate analytical expressions for the modal dispersion parameter (g) of an optical fiber with Kerr nonlinearity. The minimization of the core parameter (U), which involves the Kerr nonlinearity through the nonstationary expression for the propagation constant, is carried out by the Nelder-Mead simplex method of nonlinear unconstrained minimization, which suits problems with nonsmooth functions because it requires no derivative information. The formulation carries a smaller computational burden for calculating modal parameters than full numerical methods.
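The Nelder-Mead simplex method used above is derivative-free: it moves a simplex of trial points by reflection, expansion, contraction, and shrink steps. A bare-bones sketch (the quadratic objective below is a stand-in for the paper's core-parameter expression, which involves the Kerr-modified propagation constant):

```python
def nelder_mead(f, x0, step=0.5, tol=1e-10, max_iter=5000):
    """Minimal Nelder-Mead simplex minimizer: reflection, expansion,
    contraction, shrink. No derivative information is required."""
    n = len(x0)
    simplex = [list(x0)]
    for i in range(n):                 # initial simplex: x0 plus offsets
        v = list(x0)
        v[i] += step
        simplex.append(v)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        centroid = [sum(v[i] for v in simplex[:-1]) / n for i in range(n)]
        refl = [2 * c - w for c, w in zip(centroid, worst)]
        if f(refl) < f(best):          # very good: try expanding further
            expd = [3 * c - 2 * w for c, w in zip(centroid, worst)]
            simplex[-1] = expd if f(expd) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):  # acceptable: keep the reflection
            simplex[-1] = refl
        else:                          # poor: contract toward the centroid
            contr = [0.5 * (c + w) for c, w in zip(centroid, worst)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:                      # last resort: shrink toward best
                simplex = [best] + [[0.5 * (b + x) for b, x in zip(best, v)]
                                    for v in simplex[1:]]
    return min(simplex, key=f)

# Smooth stand-in objective with known minimum at (1.2, -0.7)
obj = lambda p: (p[0] - 1.2) ** 2 + (p[1] + 0.7) ** 2
x_min = nelder_mead(obj, [0.0, 0.0])
```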

  4. Fundamental Atomtronic Circuit Elements

    NASA Astrophysics Data System (ADS)

    Lee, Jeffrey; McIlvain, Brian; Lobb, Christopher; Hill, Wendell T., III

    2012-06-01

    Recent experiments with neutral superfluid gases have shown that it is possible to create atomtronic circuits analogous to existing superconducting circuits. The goals of these experiments are to create complex systems such as Josephson junctions. In addition, there are theoretical models for active atomtronic components analogous to diodes, transistors and oscillators. In order for any of these devices to function, an understanding of the more fundamental atomtronic elements is needed. Here we describe the first experimental realization of these more fundamental elements. We have created an atomtronic capacitor that is discharged through a resistance and inductance. We will discuss a theoretical description of the system that allows us to determine values for the capacitance, resistance and inductance. The resistance is shown to be analogous to the Sharvin resistance, and the inductance analogous to kinetic inductance in electronics. This atomtronic circuit is implemented with a thermal sample of laser cooled rubidium atoms. The atoms are confined using what we call free-space atom chips, a novel optical dipole trap produced using a generalized phase-contrast imaging technique. We will also discuss progress toward implementing this atomtronic system in a degenerate Bose gas.

  5. Fundamentals of electrokinetics

    NASA Astrophysics Data System (ADS)

    Kozak, M. W.

    The study of electrokinetics is a very mature field. Experimental studies date from the early 1800s, and acceptable theoretical analyses have existed since the early 1900s. The use of electrokinetics in practical field problems is more recent, but it is still quite mature. Most developments in the fundamental understanding of electrokinetics are in the colloid science literature. A significant and increasing divergence between the theoretical understanding of electrokinetics found in the colloid science literature and the theoretical analyses used in interpreting applied experimental studies in soil science and waste remediation has developed. The soil science literature has to date restricted itself to the use of very early theories, with their associated limitations. The purpose of this contribution is to review fundamental aspects of electrokinetic phenomena from a colloid science viewpoint. It is hoped that a bridge can be built between the two branches of the literature, from which both will benefit. Attention is paid to special topics such as the effects of overlapping double layers, applications in unsaturated soils, the influence of dispersivity, and the differences between electrokinetic theory and conductivity theory.

  6. Controlling molecular transport in minimal emulsions

    NASA Astrophysics Data System (ADS)

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of `minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions.

  7. Controlling molecular transport in minimal emulsions

    PubMed Central

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of ‘minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions. PMID:26797564

  8. Controlling molecular transport in minimal emulsions.

    PubMed

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of 'minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions. PMID:26797564

  9. Analysis of lipid flow on minimal surfaces

    NASA Astrophysics Data System (ADS)

    Bahmani, Fatemeh; Christenson, Joel; Rangamani, Padmini

    2016-03-01

    Interaction between the bilayer shape and surface flow is important for capturing the flow of lipids in many biological membranes. Recent microscopy evidence has shown that minimal surfaces (planes, catenoids, and helicoids) occur often in cellular membranes. In this study, we explore lipid flow in these geometries using a `stream function' formulation for viscoelastic lipid bilayers. Using this formulation, we derive two-dimensional lipid flow equations for the commonly occurring minimal surfaces in lipid bilayers. We show that for three minimal surfaces (planes, catenoids, and helicoids), the surface flow equations satisfy Stokes flow equations. In helicoids and catenoids, we show that the tangential velocity field is a Killing vector field. Thus, our analysis provides fundamental insight into the flow patterns of lipids on intracellular organelle membranes that are characterized by fixed shapes reminiscent of minimal surfaces.
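For reference, the three minimal surfaces named above admit simple parametrizations (standard differential-geometry facts, not notation from this paper): besides the plane, the catenoid

```latex
\mathbf{r}(u,v) = \bigl(c\cosh(v/c)\cos u,\; c\cosh(v/c)\sin u,\; v\bigr),
```

and the helicoid

```latex
\mathbf{r}(u,v) = \bigl(v\cos u,\; v\sin u,\; c\,u\bigr),
```

each of which has zero mean curvature everywhere, the defining property of a minimal surface.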

  10. Achieving sustainable plant disease management through evolutionary principles.

    PubMed

    Zhan, Jiasui; Thrall, Peter H; Burdon, Jeremy J

    2014-09-01

    Plants and their pathogens are engaged in continuous evolutionary battles and sustainable disease management requires novel systems to create environments conducive for short-term and long-term disease control. In this opinion article, we argue that knowledge of the fundamental factors that drive host-pathogen coevolution in wild systems can provide new insights into disease development in agriculture. Such evolutionary principles can be used to guide the formulation of sustainable disease management strategies which can minimize disease epidemics while simultaneously reducing pressure on pathogens to evolve increased infectivity and aggressiveness. To ensure agricultural sustainability, disease management programs that reflect the dynamism of pathogen population structure are essential and evolutionary biologists should play an increasing role in their design. PMID:24853471

  11. Fundamentals of gel dosimeters

    NASA Astrophysics Data System (ADS)

    McAuley, K. B.; Nasr, A. T.

    2013-06-01

    Fundamental chemical and physical phenomena that occur in Fricke gel dosimeters, polymer gel dosimeters, micelle gel dosimeters and genipin gel dosimeters are discussed. Fricke gel dosimeters are effective even though their radiation sensitivity depends on oxygen concentration. Oxygen contamination can cause severe problems in polymer gel dosimeters, even when THPC is used. Oxygen leakage must be prevented between manufacturing and irradiation of polymer gels, and internal calibration methods should be used so that contamination problems can be detected. Micelle gel dosimeters are promising due to their favourable diffusion properties. The introduction of micelles to gel dosimetry may open up new areas of dosimetry research wherein a range of water-insoluble radiochromic materials can be explored as reporter molecules.

  12. Fundamentals of battery dynamics

    NASA Astrophysics Data System (ADS)

    Jossen, Andreas

Modern applications, such as wireless communication systems or hybrid electric vehicles, operate under high power fluctuations. In some applications, where the power frequencies are high (above some 10 or 100 Hz), it is possible to filter out the high frequencies using passive components, though this adds cost. In other applications, where the dynamic time constants range up to some seconds, filtering cannot be done, and batteries are hence operated under dynamic loads. But what happens under these dynamic operating conditions? This paper describes the fundamentals of the dynamic characteristics of batteries over a frequency range from some MHz down to the mHz range. Because the dynamic behaviour depends on the actual state of charge (SOC) and the state of health (SOH), it is possible to gain information on the battery state by analysing the dynamic behaviour. High dynamic loads can influence the battery temperature, the battery performance, and the battery lifetime.

  13. Fundamentals of zoological scaling

    NASA Astrophysics Data System (ADS)

    Lin, Herbert

    1982-01-01

    Most introductory physics courses emphasize highly idealized problems with unique well-defined answers. Though many textbooks complement these problems with estimation problems, few books present anything more than an elementary discussion of scaling. This paper presents some fundamentals of scaling in the zoological domain—a domain complex by any standard, but one also well suited to illustrate the power of very simple physical ideas. We consider the following animal characteristics: skeletal weight, speed of running, height and range of jumping, food consumption, heart rate, lifetime, locomotive efficiency, frequency of wing flapping, and maximum sizes of animals that fly and hover. These relationships are compared to zoological data and everyday experience, and match reasonably well.
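Scaling arguments of this kind reduce to allometric power laws y = y_ref·(M/M_ref)^k. A hedged sketch using textbook exponents (these particular values are standard allometry, not necessarily the ones this paper derives): heart rate scales roughly as M^(-1/4) and lifetime as M^(1/4), so the number of heartbeats per lifetime is roughly mass-independent.

```python
def scaled(y_ref, m_ref, m, exponent):
    """Allometric scaling law: y = y_ref * (m / m_ref)**exponent."""
    return y_ref * (m / m_ref) ** exponent

# Reference point: a 70 kg human at ~60 beats/min and ~70-year lifespan.
hr = scaled(60.0, 70.0, 0.03, -0.25)   # heart rate of a 30 g animal
life = scaled(70.0, 70.0, 0.03, 0.25)  # lifetime of the same animal
# hr * life stays equal to 60 * 70: heartbeats per lifetime is invariant.
```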

  14. Unification of Fundamental Forces

    NASA Astrophysics Data System (ADS)

    Salam, Abdus

    1990-05-01

This is an expanded version of the third Dirac Memorial Lecture, given in 1988 by the Nobel Laureate Abdus Salam. Salam's lecture presents an overview of the developments in modern particle physics from its inception at the turn of the century to the present theories seeking to unify all the fundamental forces. In addition, two previously unpublished lectures, by Paul Dirac and Werner Heisenberg, are included. These lectures provide a fascinating insight into their approach to research and the developments in particle physics at that time. Nonspecialists, undergraduates, and researchers will find this a fascinating book. It contains a clear introduction to the major themes of particle physics and cosmology by one of the most distinguished contemporary physicists.

  15. Wall of fundamental constants

    SciTech Connect

    Olive, Keith A.; Peloso, Marco; Uzan, Jean-Philippe

    2011-02-15

    We consider the signatures of a domain wall produced in the spontaneous symmetry breaking involving a dilatonlike scalar field coupled to electromagnetism. Domains on either side of the wall exhibit slight differences in their respective values of the fine-structure constant, {alpha}. If such a wall is present within our Hubble volume, absorption spectra at large redshifts may or may not provide a variation in {alpha} relative to the terrestrial value, depending on our relative position with respect to the wall. This wall could resolve the contradiction between claims of a variation of {alpha} based on Keck/Hires data and of the constancy of {alpha} based on Very Large Telescope data. We derive the properties of the wall and the parameters of the underlying microscopic model required to reproduce the possible spatial variation of {alpha}. We discuss the constraints on the existence of the low-energy domain wall and describe its observational implications concerning the variation of the fundamental constants.

  16. Fundamentals of plasma simulation

    SciTech Connect

    Forslund, D.W.

    1985-01-01

    With the increasing size and speed of modern computers, the incredibly complex nonlinear properties of plasmas in the laboratory and in space are being successfully explored in increasing depth. Of particular importance have been numerical simulation techniques involving finite size particles on a discrete mesh. After discussing the importance of this means of understanding a variety of nonlinear plasma phenomena, we describe the basic elements of particle-in-cell simulation and their limitations and advantages. The differencing techniques, stability and accuracy issues, data management and optimization issues are discussed by means of a simple example of a particle-in-cell code. Recent advances in simulation methods allowing large space and time scales to be treated with minimal sacrifice in physics are reviewed. Various examples of nonlinear processes successfully studied by plasma simulation will be given.
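The "finite size particles on a discrete mesh" idea can be illustrated with the two smallest building blocks of a 1-D electrostatic PIC code: cloud-in-cell (CIC) charge deposition and the leapfrog push. This is a sketch with invented grid parameters, not a complete simulation (no field solve is shown).

```python
import numpy as np

NG, L = 16, 1.0        # number of grid cells and domain length (assumed)
dx = L / NG

def deposit(positions, q=1.0):
    """Cloud-in-cell deposition: each finite-size particle spreads its
    charge linearly over the two nearest grid points (periodic mesh)."""
    rho = np.zeros(NG)
    cell = np.floor(positions / dx).astype(int) % NG
    frac = positions / dx - np.floor(positions / dx)
    np.add.at(rho, cell, q * (1.0 - frac) / dx)
    np.add.at(rho, (cell + 1) % NG, q * frac / dx)
    return rho

def push(x, v, E_at_x, qm, dt):
    """Leapfrog: advance velocity, then position (periodic boundary)."""
    v = v + qm * E_at_x * dt
    x = (x + v * dt) % L
    return x, v

rho = deposit(np.array([0.53]))
# rho.sum() * dx ~ 1.0: CIC conserves the total deposited charge.
```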

  17. Complementary Huygens Principle for Geometrical and Nongeometrical Optics

    ERIC Educational Resources Information Center

    Luis, Alfredo

    2007-01-01

    We develop a fundamental principle depicting the generalized ray formulation of optics provided by the Wigner function. This principle is formally identical to the Huygens-Fresnel principle but in terms of opposite concepts, rays instead of waves, and incoherent superpositions instead of coherent ones. This ray picture naturally includes…

  18. DOE Fundamentals Handbook: Instrumentation and Control, Volume 1

    SciTech Connect

    Not Available

    1992-06-01

    The Instrumentation and Control Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of instrumentation and control systems. The handbook includes information on temperature, pressure, flow, and level detection systems; position indication systems; process control systems; and radiation detection principles. This information will provide personnel with an understanding of the basic operation of various types of DOE nuclear facility instrumentation and control systems.

  19. A systems approach to theoretical fluid mechanics: Fundamentals

    NASA Technical Reports Server (NTRS)

    Anyiwo, J. C.

    1978-01-01

A preliminary application of the underlying principles of the investigator's general system theory to the description and analysis of fluid flow systems is presented. An attempt is made to establish practical models, or elements, of the general fluid flow system from the point of view of the fundamental principles of general system theory. The results obtained are applied to a simple experimental fluid flow system as a test case, with particular emphasis on understanding fluid flow instability, transition, and turbulence.

  20. Minimally Invasive Valve Surgery

    PubMed Central

    Pope, Nicolas H.; Ailawadi, Gorav

    2014-01-01

    Cardiac valve surgery is life saving for many patients. The advent of minimally invasive surgical techniques has historically allowed for improvement in both post-operative convalescence and important clinical outcomes. The development of minimally invasive cardiac valve repair and replacement surgery over the past decade is poised to revolutionize the care of cardiac valve patients. Here, we present a review of the history and current trends in minimally invasive aortic and mitral valve repair and replacement, including the development of sutureless bioprosthetic valves. PMID:24797148

  1. Fundamentals of Atmospheric Radiation

    NASA Astrophysics Data System (ADS)

    Bohren, Craig F.; Clothiaux, Eugene E.

    2006-02-01

This textbook fills a gap in the literature for teaching material suitable for students of atmospheric science and courses on atmospheric radiation. It covers the fundamentals of emission, absorption, and scattering of electromagnetic radiation from ultraviolet to infrared and beyond, and much of the book applies to planetary atmospheres generally. The authors are physicists who teach at the largest meteorology department in the US, at Penn State. Craig F. Bohren taught the atmospheric radiation course there for the past 20 years without a textbook; Eugene Clothiaux has taken over and added to the course notes. Problems given in the text come from students, colleagues, and correspondents. The figures were designed especially for this book to ease comprehension. Discussions take a graded approach, treating subjects such as single scattering by particles at different levels of complexity. The treatment of multiple scattering begins with piles of plates; this simple theory introduces concepts used in more advanced theories, such as optical thickness, single-scattering albedo, and the asymmetry parameter. The more complicated two-stream theory then takes the reader beyond the pile-of-plates picture. Ideal for advanced undergraduate and graduate students of atmospheric science.
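The pile-of-plates introduction to multiple scattering can be sketched numerically: two incoherent layers combine by summing the geometric series of reflections bouncing between them, and a pile is built by adding one plate at a time. (Function name and sample r, t values are illustrative, not from the book.)

```python
def pile_of_plates(n, r, t):
    """Reflectance and transmittance of a pile of n identical plates,
    each with single-plate reflectance r and transmittance t, combined
    incoherently: R_new = R + T^2 r / (1 - R r), T_new = T t / (1 - R r),
    where the denominators sum the geometric series of inter-reflections."""
    R, T = r, t
    for _ in range(n - 1):
        denom = 1.0 - R * r
        R, T = R + T * T * r / denom, T * t / denom
    return R, T

# Non-absorbing plates (r + t = 1): energy is conserved, R + T = 1,
# and R approaches 1 as the pile grows.
R, T = pile_of_plates(10, 0.1, 0.9)
```

For non-absorbing plates this reproduces the classic closed form R_n = n·r/(1 + (n-1)·r).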

  2. Overlay accuracy fundamentals

    NASA Astrophysics Data System (ADS)

    Kandel, Daniel; Levinski, Vladimir; Sapiens, Noam; Cohen, Guy; Amit, Eran; Klein, Dana; Vakshtein, Irina

    2012-03-01

Currently, the performance of overlay metrology is evaluated mainly on the basis of random error contributions such as precision and TIS variability. With the expected shrinkage of the overlay metrology budget to < 0.5nm, it becomes crucial to also include systematic error contributions, which affect the accuracy of the metrology. Here we discuss fundamental aspects of overlay accuracy and a methodology to improve accuracy significantly. We identify overlay mark imperfections, and their interaction with the metrology technology, as the main source of overlay inaccuracy. The most important type of mark imperfection is mark asymmetry, which leads to a geometrical ambiguity in the definition of overlay that can be ~1nm or less. It is shown theoretically and in simulations that the metrology may enhance the effect of overlay mark asymmetry significantly, leading to a metrology inaccuracy of ~10nm, much larger than the geometrical ambiguity. The analysis is carried out for two different overlay metrology technologies: imaging overlay and DBO (1st-order diffraction-based overlay). It is demonstrated that DBO is more sensitive to overlay mark asymmetry than imaging overlay. Finally, we show that a recently developed measurement quality metric serves as a valuable tool for improving overlay metrology accuracy. Simulation results demonstrate that the accuracy of imaging overlay can be improved significantly by a recipe setup optimized using the quality metric. We conclude that imaging overlay metrology, complemented by appropriate use of the measurement quality metric, results in optimal overlay accuracy.

  3. System level electrochemical principles

    NASA Technical Reports Server (NTRS)

    Thaller, L. H.

    1985-01-01

The traditional electrochemical storage concepts are difficult to translate into high-power, high-voltage multikilowatt storage systems. Battery technology has responded with increased use of electronics and with electrochemical couples that minimize the difficulties associated with corrective measures to reduce cell-to-cell capacity dispersion. Actively cooled bipolar concepts, described here, represent attractive alternative system concepts: they are projected to have higher energy densities and lower volumes than current concepts, should be easier to scale from one capacity to another, and maintain a closer cell-to-cell capacity balance. These newer storage system concepts are also easier to manage, since they are designed as fully integrated batteries. These ideas are referred to as system-level electrochemistry. The hydrogen-oxygen regenerative fuel cell (RFC) is probably the best example of the integrated use of these principles.

  4. Fundamentals of neurogastroenterology

    PubMed Central

    Wood, J; Alpers, D; Andrews, P

    1999-01-01

    Current concepts and basic principles of neurogastroenterology in relation to functional gastrointestinal disorders are reviewed. Neurogastroenterology is emphasized as a new and advancing subspecialty of clinical gastroenterology and digestive science. As such, it embraces the investigative sciences dealing with functions, malfunctions, and malformations in the brain and spinal cord, and the sympathetic, parasympathetic and enteric divisions of the autonomic innervation of the digestive tract. Somatomotor systems are included insofar as pharyngeal phases of swallowing and pelvic floor involvement in defecation, continence, and pelvic pain are concerned. Inclusion of basic physiology of smooth muscle, mucosal epithelium, and the enteric immune system in the neurogastroenterologic domain relates to requirements for compatibility with neural control mechanisms. Psychologic and psychiatric relations to functional gastrointestinal disorders are included because they are significant components of neurogastroenterology, especially in relation to projections of discomfort and pain to the digestive tract.


Keywords: enteric nervous system; brain-gut axis; autonomic nervous system; nausea; gut motility; mast cells; gastrointestinal pain; Rome II PMID:10457039

  5. Role of Fundamental Physics in Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Turyshev, Slava

    2004-01-01

This talk will discuss the critical role that fundamental physics research plays in human space exploration. In particular, currently available technologies can already provide significant radiation reduction, minimize bone loss, increase crew productivity and, thus, uniquely contribute to overall mission success. I will discuss how fundamental physics research and emerging technologies may not only further reduce the risks of space travel, but also increase crew mobility, enhance safety, and increase the value of space exploration in the near future.

  6. Fundamentals and Techniques of Nonimaging

    SciTech Connect

    O'Gallagher, J. J.; Winston, R.

    2003-07-10

    This is the final report describing a long term basic research program in nonimaging optics that has led to major advances in important areas, including solar energy, fiber optics, illumination techniques, light detectors, and a great many other applications. The term ''nonimaging optics'' refers to the optics of extended sources in systems for which image forming is not important, but effective and efficient collection, concentration, transport, and distribution of light energy is. Although some of the most widely known developments of the early concepts have been in the field of solar energy, a broad variety of other uses have emerged. Most important, under the auspices of this program in fundamental research in nonimaging optics established at the University of Chicago with support from the Office of Basic Energy Sciences at the Department of Energy, the field has become very dynamic, with new ideas and concepts continuing to develop, while applications of the early concepts continue to be pursued. While the subject began as part of classical geometrical optics, it has been extended subsequently to the wave optics domain. Particularly relevant to potential new research directions are recent developments in the formalism of statistical and wave optics, which may be important in understanding energy transport on the nanoscale. Nonimaging optics permits the design of optical systems that achieve the maximum possible concentration allowed by physical conservation laws. The earliest designs were constructed by optimizing the collection of the extreme rays from a source to the desired target: the so-called ''edge-ray'' principle. Later, new concentrator types were generated by placing reflectors along the flow lines of the ''vector flux'' emanating from lambertian emitters in various geometries. A few years ago, a new development occurred with the discovery that making the design edge-ray a functional of some other system parameter permits the construction of whole

  7. Quantum measurements and Landauer's principle

    NASA Astrophysics Data System (ADS)

    Shevchenko, V.

    2015-05-01

    Information processing systems must obey the laws of physics. One particular example of this general statement is known as Landauer's principle: irreversible operations (such as erasure) performed by any computing device at finite temperature must dissipate an amount of heat bounded from below. Together with other results of this kind, Landauer's principle represents a fundamental limit that any modern or future computer must obey. We discuss the interpretation of the physics behind Landauer's principle using a model of an Unruh-DeWitt detector. Of particular interest is the validity of this limit in the quantum domain. We systematically study finite-time effects. It is shown, in particular, that in the high-temperature limit the finiteness of the measurement time leads to a renormalization of the detector's temperature.
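    The lower bound referred to above is the standard Landauer bound (stated here for reference; the explicit form is not quoted from the abstract): erasing one bit of information in contact with a reservoir at absolute temperature T dissipates at least

```latex
% Landauer bound: minimum heat dissipated when one bit is erased
% at absolute temperature T (k_B is Boltzmann's constant).
Q \;\ge\; k_B T \ln 2 \approx 0.693\, k_B T
```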

  8. Revisiting Tversky's diagnosticity principle.

    PubMed

    Evers, Ellen R K; Lakens, Daniël

    2014-01-01

    Similarity is a fundamental concept in cognition. In 1977, Amos Tversky published a highly influential feature-based model of how people judge the similarity between objects. The model highlights the context-dependence of similarity judgments, and challenged geometric models of similarity. One of the context-dependent effects Tversky describes is the diagnosticity principle. The diagnosticity principle determines which features are used to cluster multiple objects into subgroups. Perceived similarity between items within clusters is expected to increase, while similarity between items in different clusters decreases. Here, we present two pre-registered replications of the studies on the diagnosticity effect reported in Tversky (1977). Additionally, one alternative mechanism that has been proposed to play a role in the original studies, an increase in the choice for distractor items (a substitution effect, see Medin et al., 1995), is examined. Our results replicate those found by Tversky (1977), revealing an average diagnosticity-effect of 4.75%. However, when we eliminate the possibility of substitution effects confounding the results, a meta-analysis of the data provides no indication of any remaining effect of diagnosticity. PMID:25161638

  9. Revisiting Tversky's diagnosticity principle

    PubMed Central

    Evers, Ellen R. K.; Lakens, Daniël

    2013-01-01

    Similarity is a fundamental concept in cognition. In 1977, Amos Tversky published a highly influential feature-based model of how people judge the similarity between objects. The model highlights the context-dependence of similarity judgments, and challenged geometric models of similarity. One of the context-dependent effects Tversky describes is the diagnosticity principle. The diagnosticity principle determines which features are used to cluster multiple objects into subgroups. Perceived similarity between items within clusters is expected to increase, while similarity between items in different clusters decreases. Here, we present two pre-registered replications of the studies on the diagnosticity effect reported in Tversky (1977). Additionally, one alternative mechanism that has been proposed to play a role in the original studies, an increase in the choice for distractor items (a substitution effect, see Medin et al., 1995), is examined. Our results replicate those found by Tversky (1977), revealing an average diagnosticity-effect of 4.75%. However, when we eliminate the possibility of substitution effects confounding the results, a meta-analysis of the data provides no indication of any remaining effect of diagnosticity. PMID:25161638

  10. Prostate resection - minimally invasive

    MedlinePlus

    ... are: Erection problems (impotence) No symptom improvement Passing semen back into your bladder instead of out through ... Whelan JP, Goeree L. Systematic review and meta-analysis of transurethral resection of the prostate versus minimally ...

  11. Minimizing Shortness of Breath

    MedlinePlus


  12. Minimalism. Clip and Save.

    ERIC Educational Resources Information Center

    Hubbard, Guy

    2002-01-01

    Provides background information on the art movement called "Minimalism" discussing why it started and its characteristics. Includes learning activities and information on the artist, Donald Judd. Includes a reproduction of one of his art works and discusses its content. (CMK)

  13. Minimal Orderings Revisited

    SciTech Connect

    Peyton, B.W.

    1999-07-01

    When minimum orderings proved too difficult to deal with, Rose, Tarjan, and Lueker instead studied minimal orderings and how to compute them (Algorithmic aspects of vertex elimination on graphs, SIAM J. Comput., 5:266-283, 1976). This paper introduces an algorithm that is capable of computing much better minimal orderings much more efficiently than the algorithm in Rose et al. The new insight is a way to use certain structures and concepts from modern sparse Cholesky solvers to re-express one of the basic results in Rose et al. The new algorithm begins with any initial ordering and then refines it until a minimal ordering is obtained. It is simple to obtain high-quality low-cost minimal orderings by using fill-reducing heuristic orderings as initial orderings for the algorithm. We examine several such initial orderings in some detail.
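    The quantity a minimal ordering minimizes is the set of fill edges created by vertex elimination: an ordering is minimal if no other ordering produces a strictly smaller fill set. As an illustrative sketch (not the algorithm from the paper), computing the fill of a given elimination order might look like:

```python
# Illustrative sketch (not the algorithm from the paper): compute the
# fill edges produced by eliminating the vertices of a graph in a given
# order. Eliminating v connects its still-uneliminated neighbours into
# a clique; edges added this way are the "fill".

def fill_in(n, edges, order):
    """Return the set of fill edges created by the elimination order."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    fill = set()
    eliminated = set()
    for v in order:
        nbrs = adj[v] - eliminated          # still-uneliminated neighbours
        for a in nbrs:                      # connect them pairwise
            for b in nbrs:
                if a < b and b not in adj[a]:
                    adj[a].add(b)
                    adj[b].add(a)
                    fill.add((a, b))
        eliminated.add(v)
    return fill

# 4-cycle 0-1-2-3-0: eliminating vertex 0 first adds the chord (1, 3).
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(fill_in(4, cycle, [0, 1, 2, 3]))   # -> {(1, 3)}
```

A refinement pass in the spirit of the paper would repeatedly look for orderings whose fill set is a strict subset of the current one.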

  14. Minimally invasive hip replacement

    MedlinePlus

    ... Smits SA, Swinford RR, Bahamonde RE. A randomized, prospective study of 3 minimally invasive surgical approaches in total hip arthroplasty: comprehensive gait analysis. J Arthroplasty . 2008;23:68-73. PMID: 18722305 ...

  15. Minimum Principles in Motor Control.

    PubMed

    Engelbrecht, Sascha E.

    2001-06-01

    Minimum (or minimal) principles are mathematical laws that were first used in physics: Hamilton's principle and Fermat's principle of least time are two famous examples. In the past decade, a number of motor control theories have been proposed that are formally of the same kind as the minimum principles of physics, and some of these have been quite successful at predicting motor performance in a variety of tasks. The present paper provides a comprehensive review of this work. Particular attention is given to the relation between minimum theories in motor control and those used in other disciplines. Other issues around which the review is organized include: (1) the relation between minimum principles and structural models of motor planning and motor control, (2) the empirically-driven development of minimum principles and the danger of circular theorizing, and (3) the design of critical tests for minimum theories. Some perspectives for future research are discussed in the concluding section of the paper. Copyright 2001 Academic Press. PMID:11401453

  16. Solar astrophysical fundamental parameters

    NASA Astrophysics Data System (ADS)

    Meftah, M.; Irbah, A.; Hauchecorne, A.

    2014-08-01

    The accurate determination of the solar photospheric radius has been an important problem in astronomy for many centuries. From the measurements made by the PICARD spacecraft during the transit of Venus in 2012, we obtained a solar radius of 696,156±145 kilometres. This value is consistent with recent measurements carried out outside the atmosphere. This observation leads us to propose a change of the canonical value obtained by Arthur Auwers in 1891. An accurate value for total solar irradiance (TSI) is crucial for the Sun-Earth connection, and represents another solar astrophysical fundamental parameter. Based on measurements collected from different space instruments over the past 35 years, the absolute value of the TSI, representative of a quiet Sun, has gradually decreased from 1,371 W.m-2 in 1978 to around 1,362 W.m-2 in 2013, mainly due to calibration differences between the radiometers. Based on the PICARD data, and in agreement with Total Irradiance Monitor measurements, we predicted the TSI input at the top of the Earth's atmosphere at a distance of one astronomical unit (149,597,870 kilometres) from the Sun to be 1,362±2.4 W.m-2, which may be proposed as a reference value. To conclude, from the measurements made by the PICARD spacecraft, we obtained a solar photospheric equator-to-pole radius difference of 5.9±0.5 kilometres. This value is consistent with measurements made by different space instruments, and can be given as a reference value.
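    The 1 AU reference value quoted above scales with the inverse square of the Sun-observer distance. A minimal sketch (the constants come from the abstract; the function name and structure are ours):

```python
# Sketch: scaling the total solar irradiance (TSI) reference value from
# the abstract (1,362 W/m^2 at 1 AU) to another Sun-observer distance
# via the inverse-square law.

AU_KM = 149_597_870.0   # one astronomical unit in km (from the abstract)
TSI_1AU = 1362.0        # W/m^2, reference value proposed in the abstract

def tsi_at(distance_km):
    """TSI at a given distance from the Sun, by inverse-square scaling."""
    return TSI_1AU * (AU_KM / distance_km) ** 2

print(round(tsi_at(AU_KM), 1))        # -> 1362.0
print(round(tsi_at(1.524 * AU_KM)))   # Mars' mean distance (~1.524 AU)
```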

  17. Li-O2 Kinetic Overpotentials: Tafel Plots from Experiment and First-Principles Theory.

    PubMed

    Viswanathan, V; Nørskov, J K; Speidel, A; Scheffler, R; Gowda, S; Luntz, A C

    2013-02-21

    We report the current dependence of the fundamental kinetic overpotentials for Li-O2 discharge and charge (Tafel plots) that define the optimal cycle efficiency in a Li-air battery. Comparison of the unusual experimental Tafel plots obtained in a bulk electrolysis cell with those obtained by first-principles theory is semiquantitative. The kinetic overpotentials for any practical current density are very small, considerably less than polarization losses due to iR drops from the cell impedance in Li-O2 batteries. If only the kinetic overpotentials were present, then a discharge-charge voltaic cycle efficiency of ∼85% should be possible at ∼10 mA/cm² superficial current density in a battery of ∼0.1 m² total cathode area. We therefore suggest that minimizing the cell impedance is a more important problem than minimizing the kinetic overpotentials to develop higher current Li-air batteries. PMID:26281865
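    A Tafel plot relates overpotential to the logarithm of current density, so its slope is recovered by a straight-line fit in semi-log coordinates. A generic sketch (synthetic numbers, not data or parameters from this study):

```python
# Generic Tafel-analysis sketch (synthetic data, not from the study
# above): overpotential eta = b * log10(i / i0), so fitting eta against
# log10(i) by least squares recovers the Tafel slope b.
import math

def tafel_fit(currents, etas):
    """Least-squares fit eta = a + b*log10(i); returns (a, b)."""
    xs = [math.log10(i) for i in currents]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(etas) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, etas)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Synthetic data: slope 0.12 V/decade, exchange current density 1e-6 A/cm^2.
i0, b_true = 1e-6, 0.12
currents = [1e-5, 1e-4, 1e-3, 1e-2]
etas = [b_true * math.log10(i / i0) for i in currents]
a, b = tafel_fit(currents, etas)
print(round(b, 3))   # -> 0.12 (recovered Tafel slope, V/decade)
```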

  18. Fundamentals of phosphate transfer.

    PubMed

    Kirby, Anthony J; Nome, Faruk

    2015-07-21

    Historically, the chemistry of phosphate transfer-a class of reactions fundamental to the chemistry of Life-has been discussed almost exclusively in terms of the nucleophile and the leaving group. Reactivity always depends significantly on both factors; but recent results for reactions of phosphate triesters have shown that it can also depend strongly on the nature of the nonleaving or "spectator" groups. The extreme stabilities of fully ionised mono- and dialkyl phosphate esters can be seen as extensions of the same effect, with one or two triester OR groups replaced by O⁻. Our chosen lead reaction is hydrolysis-phosphate transfer to water: because water is the medium in which biological chemistry takes place; because the half-life of a system in water is an accepted basic index of stability; and because the typical mechanisms of hydrolysis, with solvent H2O providing specific molecules to act as nucleophiles and as general acids or bases, are models for reactions involving better nucleophiles and stronger general species catalysts. Not least those available in enzyme active sites. Alkyl monoester dianions compete with alkyl diester monoanions for the slowest estimated rates of spontaneous hydrolysis. High stability at physiological pH is a vital factor in the biological roles of organic phosphates, but a significant limitation for experimental investigations. Almost all kinetic measurements of phosphate transfer reactions involving mono- and diesters have been followed by UV-visible spectroscopy using activated systems, conveniently compounds with good leaving groups. (A "good leaving group" OR* is electron-withdrawing, and can be displaced to generate an anion R*O⁻ in water near pH 7.) Reactivities at normal temperatures of P-O-alkyl derivatives-better models for typical biological substrates-have typically had to be estimated: by extended extrapolation from linear free energy relationships, or from rate measurements at high temperatures. 
Calculation is free

  19. Fundamentals of Space Medicine

    NASA Astrophysics Data System (ADS)

    Clément, Gilles

    2005-03-01

    A total of more than 240 human space flights have been completed to date, involving about 450 astronauts from various countries, for a combined total presence in space of more than 70 years. The seventh long-duration expedition crew is currently in residence aboard the International Space Station, continuing a permanent presence in space that began in October 2000. During that time, investigations have been conducted on both humans and animal models to study the bone demineralization and muscle deconditioning, space motion sickness, the causes and possible treatment of postflight orthostatic intolerance, the changes in immune function, crew and crew-ground interactions, and the medical issues of living in a space environment, such as the effects of radiation or the risk of developing kidney stones. Some results of these investigations have led to fundamental discoveries about the adaptation of the human body to the space environment. Gilles Clément has been active in this research. This readable text presents the findings from the life science experiments conducted during and after space missions. Topics discussed in this book include: adaptation of sensory-motor, cardio-vascular, bone, and muscle systems to the microgravity of spaceflight; psychological and sociological issues of living in a confined, isolated, and stressful environment; operational space medicine, such as crew selection, training and in-flight health monitoring, countermeasures and support; results of space biology experiments on individual cells, plants, and animal models; and the impact of long-duration missions such as the human mission to Mars. The author also provides a detailed description of how to fly a space experiment, based on his own experience with research projects conducted onboard Salyut-7, Mir, Spacelab, and the Space Shuttle. Now is the time to look at the future of human spaceflight and what comes next. 
The future human exploration of Mars captures the imagination of both the

  20. Fundamentals of Space Medicine

    NASA Astrophysics Data System (ADS)

    Clément, G.

    2003-10-01

    As of today, a total of more than 240 human space flights have been completed, involving about 450 astronauts from various countries, for a combined total presence in space of more than 70 years. The seventh long-duration expedition crew is currently in residence aboard the International Space Station, continuing a permanent presence in space that began in October 2000. During that time, investigations have been conducted on both humans and animal models to study the bone demineralization and muscle deconditioning, space motion sickness, the causes and possible treatment of postflight orthostatic intolerance, the changes in immune function, crew and crew-ground interactions, and the medical issues of living in a space environment, such as the effects of radiation or the risk of developing kidney stones. Some results of these investigations have led to fundamental discoveries about the adaptation of the human body to the space environment. Gilles Clément has been active in this research. This book presents in a readable text the findings from the life science experiments conducted during and after space missions. Topics discussed in this book include: adaptation of sensory-motor, cardiovascular, bone and muscle systems to the microgravity of spaceflight; psychological and sociological issues of living in a confined, isolated and stressful environment; operational space medicine, such as crew selection, training and in-flight health monitoring, countermeasures and support; results of space biology experiments on individual cells, plants, and animal models; and the impact of long-duration missions such as the human mission to Mars. The author also provides a detailed description of how to fly a space experiment, based on his own experience with research projects conducted onboard Salyut-7, Mir, Spacelab, and the Space Shuttle. Now is the time to look at the future of human spaceflight and what comes next. 
The future human exploration of Mars captures the imagination

  1. Minimally invasive procedures

    PubMed Central

    Baltayiannis, Nikolaos; Michail, Chandrinos; Lazaridis, George; Anagnostopoulos, Dimitrios; Baka, Sofia; Mpoukovinas, Ioannis; Karavasilis, Vasilis; Lampaki, Sofia; Papaiwannou, Antonis; Karavergou, Anastasia; Kioumis, Ioannis; Pitsiou, Georgia; Katsikogiannis, Nikolaos; Tsakiridis, Kosmas; Rapti, Aggeliki; Trakada, Georgia; Zissimopoulos, Athanasios; Zarogoulidis, Konstantinos

    2015-01-01

    Minimally invasive procedures, which include laparoscopic surgery, use state-of-the-art technology to reduce the damage to human tissue when performing surgery. Minimally invasive procedures require small “ports” from which the surgeon inserts thin tubes called trocars. Carbon dioxide gas may be used to inflate the area, creating a space between the internal organs and the skin. Then a miniature camera (usually a laparoscope or endoscope) is placed through one of the trocars so the surgical team can view the procedure as a magnified image on video monitors in the operating room. Specialized equipment is inserted through the trocars based on the type of surgery. There are some advanced minimally invasive surgical procedures that can be performed almost exclusively through a single point of entry—meaning only one small incision, like the “uniport” video-assisted thoracoscopic surgery (VATS). Not only do these procedures usually provide equivalent outcomes to traditional “open” surgery (which sometimes require a large incision), but minimally invasive procedures (using small incisions) may offer significant benefits as well: (I) faster recovery; (II) fewer days in the hospital; (III) less scarring and (IV) less pain. In our current mini review we will present the minimally invasive procedures for thoracic surgery. PMID:25861610

  2. Fundamentals of Nursing Science: Units 1 through 8.

    ERIC Educational Resources Information Center

    Einstoss, Esther

    A description is provided of "Fundamentals of Nursing," a two-year college course designed to introduce nursing students to the basic principles of patient care. First, information is presented on the place of the course in the nursing curriculum, in-class time allotments, and course prerequisites. A section on course content includes a statement…

  3. 48 CFR 9904.405-40 - Fundamental requirement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... 9904.405-40 Section 9904.405-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.405-40 Fundamental requirement. (a) Costs expressly... be subject to the same cost accounting principles governing cost allocability as allowable costs....

  4. 48 CFR 9904.405-40 - Fundamental requirement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... 9904.405-40 Section 9904.405-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.405-40 Fundamental requirement. (a) Costs expressly... be subject to the same cost accounting principles governing cost allocability as allowable costs....

  5. 48 CFR 9904.405-40 - Fundamental requirement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... 9904.405-40 Section 9904.405-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.405-40 Fundamental requirement. (a) Costs expressly... be subject to the same cost accounting principles governing cost allocability as allowable costs....

  6. Religious Fundamentalism among Young Muslims in Egypt and Saudi Arabia

    ERIC Educational Resources Information Center

    Moaddel, Mansoor; Karabenick, Stuart A.

    2008-01-01

    Religious fundamentalism is conceived as a distinctive set of beliefs and attitudes toward one's religion, including obedience to religious norms, belief in the universality and immutability of its principles, the validity of its claims, and its indispensability for human happiness. Surveys of Egyptian and Saudi youth, ages 18-25, reveal that…

  7. A Matter of Principle: The Principles of Quantum Theory, Dirac's Equation, and Quantum Information

    NASA Astrophysics Data System (ADS)

    Plotnitsky, Arkady

    2015-10-01

    This article is concerned with the role of fundamental principles in theoretical physics, especially quantum theory. The fundamental principles of relativity are addressed as well, in view of their role in quantum electrodynamics and quantum field theory. The main focus of the article is Dirac's work, in particular his derivation of the relativistic equation of the electron from the principles of relativity and quantum theory. I shall also consider Heisenberg's earlier work leading him to the discovery of quantum mechanics, which inspired Dirac's work. I argue that Heisenberg's and Dirac's work was guided by their adherence to, and their confidence in, the fundamental principles of quantum theory. The final section of the article discusses the recent work by D'Ariano and coworkers on the principles of quantum information theory, which extends quantum theory and its principles in a new direction. This extension enabled them to offer a new derivation of Dirac's equation from these principles alone, without using the principles of relativity.

  8. Promoting patient-centred fundamental care in acute healthcare systems.

    PubMed

    Feo, Rebecca; Kitson, Alison

    2016-05-01

    Meeting patients' fundamental care needs is essential for optimal safety and recovery and positive experiences within any healthcare setting. There is growing international evidence, however, that these fundamentals are often poorly executed in acute care settings, resulting in patient safety threats, poorer and costly care outcomes, and dehumanising experiences for patients and families. Whilst care standards and policy initiatives are attempting to address these issues, their impact has been limited. This discussion paper explores, through a series of propositions, why fundamental care can be overlooked in sophisticated, high technology acute care settings. We argue that the central problem lies in the invisibility and subsequent devaluing of fundamental care. Such care is perceived to involve simple tasks that require little skill to execute and have minimal impact on patient outcomes. The propositions explore the potential origins of this prevailing perception, focusing upon the impact of the biomedical model, the consequences of managerial approaches that drive healthcare cultures, and the devaluing of fundamental care by nurses themselves. These multiple sources of invisibility and devaluing surrounding fundamental care have rendered the concept underdeveloped and misunderstood both conceptually and theoretically. Likewise, there remains minimal role clarification around who should be responsible for and deliver such care, and a dearth of empirical evidence and evidence-based metrics. In explicating these propositions, we argue that key to transforming the delivery of acute healthcare is a substantial shift in the conceptualisation of fundamental care. The propositions present a cogent argument that counters the prevailing perception that fundamental care is basic and does not require systematic investigation. We conclude by calling for the explicit valuing and embedding of fundamental care in healthcare education, research, practice and policy. 
Without this

  9. Free-Energy Minimization and the Dark-Room Problem

    PubMed Central

    Friston, Karl; Thornton, Christopher; Clark, Andy

    2012-01-01

    Recent years have seen the emergence of an important new fundamental theory of brain function. This theory brings information-theoretic, Bayesian, neuroscientific, and machine learning approaches into a single framework whose overarching principle is the minimization of surprise (or, equivalently, the maximization of expectation). The most comprehensive such treatment is the “free-energy minimization” formulation due to Karl Friston (see e.g., Friston and Stephan, 2007; Friston, 2010a,b – see also Fiorillo, 2010; Thornton, 2010). A recurrent puzzle raised by critics of these models is that biological systems do not seem to avoid surprises. We do not simply seek a dark, unchanging chamber, and stay there. This is the “Dark-Room Problem.” Here, we describe the problem and further unpack the issues to which it speaks. Using the same format as the prolog of Eddington’s Space, Time, and Gravitation (Eddington, 1920) we present our discussion as a conversation between: an information theorist (Thornton), a physicist (Friston), and a philosopher (Clark). PMID:22586414

  10. Minimally Invasive Radiofrequency Devices.

    PubMed

    Sadick, Neil; Rothaus, Kenneth O

    2016-07-01

    This article reviews minimally invasive radiofrequency options for skin tightening, focusing on describing their mechanism of action and clinical profile in terms of safety and efficacy and presenting peer-reviewed articles associated with the specific technologies. Treatments offered by minimally invasive radiofrequency devices (fractional, microneedling, temperature-controlled) are increasing in popularity due to the dramatic effects they can have without requiring skin excision, downtime, or even extreme financial burden from the patient's perspective. Clinical applications thus far have yielded impressive results in treating signs of the aging face and neck, either as stand-alone or as postoperative maintenance treatments. PMID:27363771

  11. Equivalence principles and electromagnetism

    NASA Technical Reports Server (NTRS)

    Ni, W.-T.

    1977-01-01

    The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

  12. Effects of Phonetic Context on Relative Fundamental Frequency

    ERIC Educational Resources Information Center

    Lien, Yu-An S.; Gattuccio, Caitlin I.; Stepp, Cara E.

    2014-01-01

    Purpose: The effect of phonetic context on relative fundamental frequency (RFF) was examined, in order to develop stimuli sets with minimal within-speaker variability that can be implemented in future clinical protocols. Method: Sixteen speakers with healthy voices produced RFF stimuli. Uniform utterances consisted of 3 repetitions of the same…

  13. Ways To Minimize Bullying.

    ERIC Educational Resources Information Center

    Mueller, Mary Ellen; Parisi, Mary Joy

    This report delineates a series of interventions aimed at minimizing incidences of bullying in a suburban elementary school. The social services staff was scheduled to initiate an anti-bullying incentive in fall 2001 due to the increased occurrences of bullying during the prior year. The target population consisted of third- and fourth-grade…

  14. The Minimal Era

    ERIC Educational Resources Information Center

    Van Ness, Wilhelmina

    1974-01-01

    Described the development of Minimal Art, a composite name that has been applied to the scattering of bland, bleak, non-objective fine arts painting and sculpture forms that proliferated slightly mysteriously in the middle 1960's as Pop Art began to decline. (Author/RK)

  15. Minimally invasive pancreatic surgery.

    PubMed

    Yiannakopoulou, E

    2015-12-01

    Minimally invasive pancreatic surgery is feasible and safe. Laparoscopic distal pancreatectomy should be widely adopted for benign lesions of the pancreas. Laparoscopic pancreaticoduodenectomy, although technically demanding, has a number of advantages in the setting of pancreatic ductal adenocarcinoma, including shorter hospital stay and faster recovery, allowing patients to recover in a timelier manner and pursue adjuvant treatment options. Furthermore, it seems that progression-free survival is longer in patients undergoing laparoscopic pancreaticoduodenectomy than in those undergoing open pancreaticoduodenectomy. Minimally invasive middle pancreatectomy seems appropriate for benign or borderline tumors of the neck of the pancreas. Technological advances, including intraoperative ultrasound and intraoperative fluorescence imaging systems, are expected to facilitate the wide adoption of minimally invasive pancreatic surgery. Although the oncological outcome seems similar to that of open surgery, concerns remain, as the majority of relevant evidence comes from retrospective studies. Large multicenter randomized studies comparing laparoscopic with open pancreatectomy, as well as robotic-assisted with both open and laparoscopic approaches, are needed. The robotic approach may prove less invasive than the conventional laparoscopic approach through less traumatic intra-abdominal handling of tissues. In addition, the robotic approach could enable wide adoption of the technique by surgeons who are less extensively trained in advanced laparoscopic surgery. A putative clinical benefit of minimally invasive pancreatic surgery is an attenuated surgical stress response, leading to reduced morbidity and mortality, as well as the absence of a detrimental immunosuppressive effect, which is especially relevant for oncological patients. PMID:26530291

  16. The Future of Financial Aid: Principles, Problems, Probable Outcomes.

    ERIC Educational Resources Information Center

    Johnstone, D. Bruce

    1986-01-01

    Forces threatening the fundamental principles and practices of student financial aid are examined, and some advice to the profession is offered. A large dedicated profession has emerged that is skilled at bringing together students, colleges, and resources. (MLW)

  17. Water Balance Covers For Waste Containment: Principles and Practice

    EPA Science Inventory

    Water Balance Covers for Waste Containment: Principles and Practices introduces water balance covers and compares them with conventional approaches to waste containment. The authors provided detailed analysis of the fundamentals of soil physics and design issues, introduce appl...

  18. Minimally invasive radioguided parathyroidectomy.

    PubMed

    Costello, D; Norman, J

    1999-07-01

    The last decade has been characterized by an emphasis on minimizing interventional techniques, hospital stays, and overall costs of patient care. It is clear that most patients with sporadic HPT do not require a complete neck exploration. We now know that a minimal approach is appropriate for this disease. Importantly, the MIRP technique can be applied to most patients with sporadic HPT and can be performed by surgeons with modest advanced training. The use of a gamma probe as a surgical tool converts the sestamibi to a functional and anatomical scan eliminating the need for any other preoperative localizing study. Quantification of the radioactivity within the removed gland eliminates the need for routine frozen section histologic examination and obviates the need for costly intraoperative parathyroid hormone measurements. This radioguided technique allows the benefit of local anesthesia, dramatically reduces operative times, eliminates postoperative blood tests, provides a smaller scar, requires minimal time spent in the hospital, and almost assures a rapid, near pain-free recovery. This combination is beneficial to the patient whereas helping achieve a reduction in overall costs. PMID:10448697

  19. Waste Minimization Crosscut Plan

    SciTech Connect

    Not Available

    1992-05-13

    On November 27, 1991, the Secretary of Energy directed that a Department of Energy (DOE) crosscut plan for waste minimization (WMin) be prepared and submitted by March 1, 1992. This Waste Minimization Crosscut Plan responds to the Secretary's direction and supports the National Energy Strategy (NES) goals of achieving greater energy security, increasing energy and economic efficiency, and enhancing environmental quality. It provides a DOE-wide planning framework for effective coordination of all DOE WMin activities. This Plan was jointly prepared by the following Program Secretarial Officer (PSO) organizations: Civilian Radioactive Waste Management (RW); Conservation and Renewable Energy (CE); Defense Programs (DP); Environmental Restoration and Waste Management (EM), lead; Energy Research (ER); Fossil Energy (FE); Nuclear Energy (NE); and New Production Reactors (NP). Assistance and guidance were provided by the offices of Policy, Planning, and Analysis (PE) and Environment, Safety and Health (EH). Comprehensive application of waste minimization within the Department and in both the public and private sectors will provide significant benefits and support National Energy Strategy goals. These benefits include conservation of a substantial proportion of the energy now used by industry and Government, improved environmental quality, reduced health risks, improved production efficiencies, and longer useful life of disposal capacity. Taken together, these benefits will mean improved US global competitiveness, expanded job opportunities, and a better quality of life for all citizens.

  20. Beyond the Virtues-Principles Debate.

    ERIC Educational Resources Information Center

    Keat, Marilyn S.

    1992-01-01

    Indicates basic ontological assumptions in the virtues-principles debate in moral philosophy, noting Aristotle's and Kant's fundamental ideas about morality and considering a hermeneutic synthesis of theories. The article discusses what acceptance of the synthesis might mean in the theory and practice of moral pedagogy, offering examples of…

  1. Development of Canonical Transformations from Hamilton's Principle.

    ERIC Educational Resources Information Center

    Quade, C. Richard

    1979-01-01

    The theory of canonical transformations and its development are discussed with regard to its application to Hamilton's principle. Included are the derivation of the equations of motion, a lack of symmetry in the formulation with respect to the Lagrangian, and the fundamental commutator relations of quantum mechanics. (Author/SA)

  2. Basic principles of variable speed drives

    NASA Technical Reports Server (NTRS)

    Loewenthal, S. H.

    1973-01-01

    An understanding of the principles which govern variable speed drive operation, necessary for successful drive application, is presented. The fundamental factors of torque, speed ratio, and power as they relate to drive selection are discussed. The basic types of variable speed drives, their operating characteristics and their applications are also presented.
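The torque-speed-power relationship central to drive selection can be sketched numerically. This is a minimal illustration of the physics summarized above; the function names and example values are assumed, not taken from the report:

```python
import math

def shaft_power_w(torque_nm, speed_rpm):
    """Mechanical power P = T * omega, with the speed converted to rad/s."""
    omega = speed_rpm * 2.0 * math.pi / 60.0
    return torque_nm * omega

def output_torque_nm(input_torque_nm, speed_ratio):
    """For an ideal (lossless) drive, input power equals output power,
    so torque scales inversely with speed_ratio = output_speed / input_speed."""
    return input_torque_nm / speed_ratio

print(round(shaft_power_w(10.0, 1500)))  # 10 N*m at 1500 rpm -> 1571 W
print(output_torque_nm(10.0, 0.5))       # halving the speed doubles the torque
```

This inverse torque-speed trade-off at constant power is the basic constraint when matching a variable speed drive to a load.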

  3. Principles of Guided Missiles and Nuclear Weapons.

    ERIC Educational Resources Information Center

    Naval Personnel Program Support Activity, Washington, DC.

    Fundamentals of missile and nuclear weapons systems are presented in this book which is primarily prepared as the second text of a three-volume series for students of the Navy Reserve Officers' Training Corps and the Officer Candidate School. Following an introduction to guided missiles and nuclear physics, basic principles and theories are…

  4. Against Explanatory Minimalism in Psychiatry

    PubMed Central

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell’s criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein’s Zettel. But attention to the context of Wittgenstein’s remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation. PMID:26696908

  6. Principles of nanoscience: an overview.

    PubMed

    Behari, Jitendra

    2010-10-01

    The scientific basis of nanotechnology as envisaged from first principles is compared to bulk behavior. The development of nanoparticles with controllable physical and electronic properties has opened up the possibility of designing artificial solids. Top-down and bottom-up approaches are emphasized. The role of nanoparticle (quantum dot) applications in nanophotonics (photovoltaic cells) and in drug delivery vehicles is discussed. The fundamentals of DNA structure, the prime site of bionanotechnological manipulation, are also discussed. A summary of presently available devices and applications is presented. PMID:21299044

  7. Design of the fundamental power coupler and photocathode inserts for the 112 MHz superconducting electron gun

    SciTech Connect

    Xin, T.; Ben-Zvi, I.; Belomestnykh, S.; Chang, X.; Rao, T.; Skaritka, J.; Wu, Q.; Wang, E.; Liang, X.

    2011-07-25

    A 112 MHz superconducting quarter-wave resonator electron gun will be used as the injector of the Coherent Electron Cooling (CEC) proof-of-principle experiment at BNL. Furthermore, this electron gun can serve as a test cavity for various photocathodes. In this paper, we present the design of the cathode stalks and a Fundamental Power Coupler (FPC) intended for the future experiments. Two types of cathode stalks are discussed. The stalk is specially shaped to minimize the RF power loss, and the location of the cathode plane is optimized to enable the extraction of a low-emittance beam. The coaxial waveguide structure of the FPC provides a tunable coupling factor and small interference with the electron beam output. The optimization of the coupling factor and the location of the FPC are discussed in detail. Based on transmission line theory, we designed a half-wavelength cathode stalk which significantly brings down the voltage drop between the cavity and the stalk, from more than 5.6 kV to 0.1 kV. The transverse field distribution on the cathode has been optimized by carefully choosing the position of the cathode stalk inside the cavity. Moreover, in order to decrease the RF power loss, a variable-diameter cathode stalk has been adopted; compared to a stalk of uniform diameter, this design gives much smaller power losses at the important locations. We also propose a fundamental power coupler based on the designed beam parameters for the future proof-of-principle CEC experiment. This FPC should give strong enough coupling, with an external Q tunable from 1.5e7 to 2.6e8.
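The half-wavelength stalk mentioned in the abstract exploits a standard transmission-line identity: a lossless line one half-wavelength long reproduces its load impedance at its input, which is why such a stalk can track the cavity potential and keep the voltage drop small. A hedged sketch of that identity (the characteristic and load impedances, and the vacuum-line assumption, are illustrative, not the authors' design values):

```python
import cmath
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def input_impedance(z0, z_load, freq_hz, length_m):
    """Input impedance of a lossless transmission line (vacuum dielectric assumed)."""
    beta = 2.0 * math.pi * freq_hz / C   # phase constant
    t = cmath.tan(beta * length_m)
    return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

f = 112e6                    # cavity frequency, Hz
half_wave = C / f / 2.0      # lambda/2, about 1.34 m
quarter_wave = C / f / 4.0   # lambda/4

# A half-wave section reproduces the load impedance at its input...
print(abs(input_impedance(50.0, 75.0, f, half_wave)))     # ~75
# ...while a quarter-wave section transforms it to Z0^2 / ZL.
print(abs(input_impedance(50.0, 75.0, f, quarter_wave)))  # ~33.3
```

The actual stalk in the paper is shaped and loaded by the cavity geometry, so this is only the idealized principle behind the half-wavelength choice.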

  8. Magnetism: Principles and Applications

    NASA Astrophysics Data System (ADS)

    Craik, Derek J.

    2003-09-01

    If you are studying physics, chemistry, materials science, electrical engineering, information technology or medicine, then you'll know that understanding magnetism is fundamental to success in your studies and here is the key to unlocking the mysteries of magnetism....... You can: obtain a simple overview of magnetism, including the roles of B and H, resonances and special techniques take full advantage of modern magnets with a wealth of expressions for fields and forces develop realistic general design programmes using isoparametric finite elements study the subtleties of the general theory of magnetic moments and their dynamics follow the development of outstanding materials appreciate how magnetism encompasses topics as diverse as rock magnetism, chemical reaction rates, biological compasses, medical therapies, superconductivity and levitation understand the basis and remarkable achievements of magnetic resonance imaging In his new book, Magnetism, Derek Craik throws light on the principles and applications of this fascinating subject. From formulae for calculating fields to quantum theory, the secrets of magnetism are exposed, ensuring that whether you are a chemist or engineer, physicist, medic or materials scientist Magnetism is the book for our course.

  9. The validity of the extended energy principle

    SciTech Connect

    Chance, M.S.; Johnson, J.L.; Kulsrud, R.M.

    1994-04-01

    A recent analysis of plasma stability based on modifications of the extended energy principle for magnetohydrodynamic stability led to conclusions that are too optimistic. The original interpretation of this principle is indeed applicable. The present analysis demonstrates explicitly the fallacy of using the wrong functional for δW in the extended energy principle. It then shows that the original energy principle functional δW_B is also obtained for a model in which a surface mass is incorporated to provide pressure balance. This work therefore indicates, but does not prove, that the eigenfunctions that are obtained from a minimization of the extended energy principle with the proper kinetic energy norm provide a good representation of what would be achieved with an exact treatment.
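For readers unfamiliar with the formalism: in the standard textbook statement of the extended energy principle (given here schematically, not the paper's specific modified functionals), an ideal-MHD equilibrium is stable if and only if the potential-energy functional, split into fluid, surface, and vacuum contributions, is non-negative for every admissible displacement, and growth rates follow from a Rayleigh quotient with the kinetic-energy norm:

```latex
\delta W(\boldsymbol{\xi}, \boldsymbol{\xi})
  = \delta W_F + \delta W_S + \delta W_V \;\ge\; 0
  \quad \text{for all admissible } \boldsymbol{\xi}
  \;\Longleftrightarrow\; \text{stability},
\qquad
\omega^2 = \frac{\delta W(\boldsymbol{\xi}, \boldsymbol{\xi})}
                {K(\boldsymbol{\xi}, \boldsymbol{\xi})},
\quad
K(\boldsymbol{\xi}, \boldsymbol{\xi}) = \frac{1}{2} \int \rho \, \lvert \boldsymbol{\xi} \rvert^2 \, dV .
```

The abstract's point fits this picture: minimizing δW with the wrong kinetic-energy norm K can yield eigenfunctions that misrepresent the true stability behavior.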

  10. Fundamentals of natural computing: an overview

    NASA Astrophysics Data System (ADS)

    de Castro, Leandro Nunes

    2007-03-01

    Natural computing is a term introduced to encompass three classes of methods: (1) those that take inspiration from nature for the development of novel problem-solving techniques; (2) those that are based on the use of computers to synthesize natural phenomena; and (3) those that employ natural materials (e.g., molecules) to compute. The main fields of research that compose these three branches are artificial neural networks, evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, and quantum computing, among others. This paper provides an overview of the fundamentals of natural computing, particularly the fields listed above, emphasizing the biological motivation, some design principles, their scope of applications, current research trends and open problems. The presentation is concluded with a discussion about natural computing, and when it should be used.
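As a concrete taste of the first branch (nature-inspired problem solving), an evolutionary algorithm can be written in a few lines. This (1+1) evolution strategy minimizing the classic sphere benchmark is a generic illustration, not an algorithm taken from the paper; all parameter values are arbitrary:

```python
import random

def evolve(fitness, dim=3, sigma=0.3, generations=500, seed=1):
    """Minimal (1+1) evolution strategy: perturb the parent with Gaussian
    noise each generation and keep whichever of parent/child is fitter."""
    rng = random.Random(seed)
    parent = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
    for _ in range(generations):
        child = [x + rng.gauss(0.0, sigma) for x in parent]
        if fitness(child) < fitness(parent):
            parent = child
    return parent

def sphere(v):
    """Classic benchmark: minimum value 0 at the origin."""
    return sum(x * x for x in v)

best = evolve(sphere)
print(f"best fitness: {sphere(best):.4f}")
```

With a fixed step size the search stagnates at a scale set by sigma; practical evolutionary algorithms adapt the step size (e.g., the 1/5 success rule) and evolve populations rather than a single individual.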