Science.gov

Sample records for minimization fundamental principles

  1. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^{-βW}⟩ = e^{-βΔF}, a change in the fluctuations of e^{-βW} may impact how rapidly the statistical average of e^{-βW} converges towards the theoretical value e^{-βΔF}, where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^{-βW}. In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^{-βW}, where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^{-βW}. Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)]. PMID:26382367
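
    A minimal numerical sketch (ours, not code from the paper; all parameter values hypothetical) of the convergence issue described above: for Gaussian-distributed work W the Jarzynski equality holds exactly, and larger fluctuations in e^{-βW} visibly slow the convergence of the sample average toward e^{-βΔF}.

    ```python
    import numpy as np

    # Toy demo (not from the paper): for Gaussian work W ~ N(mu, sigma^2) the
    # Jarzynski equality <exp(-beta*W)> = exp(-beta*dF) holds exactly, with
    # dF = mu - beta*sigma**2/2.  Larger work fluctuations make the sample
    # average converge much more slowly toward the exact value.
    rng = np.random.default_rng(0)
    beta, mu = 1.0, 1.0
    for sigma in (0.2, 1.0):
        dF = mu - beta * sigma**2 / 2          # exact free-energy difference
        W = rng.normal(mu, sigma, 100_000)     # simulated work measurements
        estimate = np.exp(-beta * W).mean()    # statistical average of exp(-beta*W)
        print(f"sigma={sigma}: estimate={estimate:.4f}  exact={np.exp(-beta*dF):.4f}")
    ```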

  2. Principle of minimal work fluctuations

    NASA Astrophysics Data System (ADS)

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^{-βW}⟩ = e^{-βΔF}, a change in the fluctuations of e^{-βW} may impact how rapidly the statistical average of e^{-βW} converges towards the theoretical value e^{-βΔF}, where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^{-βW}. In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^{-βW}, where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^{-βW}. Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014), 10.1103/PhysRevE.90.052132].

  3. Fundamental base closure environmental principles

    SciTech Connect

    Yim, R.A.

    1994-12-31

    Military base closures present a paradox. The rate, scale, and timing of military base closures are historically unique. However, each base itself typically does not present unique problems. Thus, the challenge is to design innovative solutions to base redevelopment and remediation issues, while simultaneously adopting common, streamlined, or pre-approved strategies for shared problems. The author presents six environmental principles that are fundamental to base closure: remediation, not cleanup; remediation will impact reuse; reuse will impact remediation; remediation and reuse must be coordinated; environmental contamination must be evaluated as any other initial physical constraint on development, not as an overlay after plans are created; and remediation will impact development, financing, and marketability.

  4. Responsible gambling: general principles and minimal requirements.

    PubMed

    Blaszczynski, Alex; Collins, Peter; Fong, Davis; Ladouceur, Robert; Nower, Lia; Shaffer, Howard J; Tavares, Hermano; Venisse, Jean-Luc

    2011-12-01

    Many international jurisdictions have introduced responsible gambling programs. These programs intend to minimize negative consequences of excessive gambling, but vary considerably in their aims, focus, and content. Many responsible gambling programs lack a conceptual framework and, in the absence of empirical data, their components are based only on general considerations and impressions. This paper outlines the consensus viewpoint of an international group of researchers suggesting fundamental responsible gambling principles, roles of key stakeholders, and minimal requirements that stakeholders can use to frame and inform responsible gambling programs across jurisdictions. Such a framework does not purport to offer value statements regarding the legal status of gambling or its expansion. Rather, it proposes gambling-related initiatives aimed at government, industry, and individuals to promote responsible gambling and consumer protection. This paper argues that there is a set of basic principles and minimal requirements that should form the basis for every responsible gambling program. PMID:21359586

  5. Gas cell neutralizers (Fundamental principles)

    SciTech Connect

    Fuehrer, B.

    1985-06-01

    Neutralizing an ion beam of the size and energy levels involved in the neutral-particle-beam program represents a considerable extension of the state of the art of neutralizer technology. Many different media (e.g., solid, liquid, gas, plasma, photons) can be used to strip the hydrogen ion of its extra electron. A large, multidisciplinary R and D effort will no doubt be required to sort out all of the "pros and cons" of these various techniques. The purpose of this particular presentation is to discuss some basic configurations and fundamental principles of the gas type of neutralizer cell. Particular emphasis is placed on the "Gasdynamic Free-Jet" neutralizer, since this configuration has the potential of being much shorter than other types of gas cells (in the beam direction) and could operate in a nearly continuous mode (CW) if necessary. These were important considerations in the ATSU design, which is discussed in some detail in the second presentation, entitled "ATSU Point Design".

  6. Fundamental principles of robot vision

    NASA Astrophysics Data System (ADS)

    Hall, Ernest L.

    1993-08-01

    Robot vision is a specialty of intelligent machines which describes the interaction between robotic manipulators and machine vision. Early robot vision systems were built to demonstrate that a robot with vision could adapt to changes in its environment. More recently, attention has been directed toward machines with expanded adaptation and learning capabilities. Two primary applications are the use of robot vision for automatic inspection and recognition of objects for manipulation by an industrial robot, and for guidance of a mobile robot. Adaptation and learning characteristics are often lacking in industrial automation and, if they can be added successfully, they result in a more robust system. Because of real-time requirements, the robot vision methods that have proven most successful have been those which could be reduced to a simple, fast computation. The purpose of this paper is to discuss some of the fundamental concepts in sufficient detail to provide a starting point for the interested engineer or scientist. A detailed example of a camera system viewing an object is presented for a simple, two-dimensional robot vision system. Finally, conclusions and recommendations for further study are presented.

  7. Two Fundamental Principles of Nature's Interactions

    NASA Astrophysics Data System (ADS)

    Ma, Tian; Wang, Shouhong

    2014-03-01

    In this talk, we present two fundamental principles of nature's interactions, the principle of interaction dynamics (PID) and the principle of representation invariance (PRI). Intuitively, PID takes the variation of the action functional under energy-momentum conservation constraint. PID offers a completely different and natural way of introducing Higgs fields. PRI requires that physical laws be independent of representations of the gauge groups. These two principles give rise to a unified field model for four interactions, which can be naturally decoupled to study individual interactions. With these two principles, we are able to derive 1) a unified theory for dark matter and dark energy, 2) layered strong and weak interaction potentials, and 3) the energy levels of subatomic particles. Supported in part by NSF, ONR and Chinese NSF.

  8. Stem cell bioprocessing: fundamentals and principles.

    PubMed

    Placzek, Mark R; Chung, I-Ming; Macedo, Hugo M; Ismail, Siti; Mortera Blanco, Teresa; Lim, Mayasari; Cha, Jae Min; Fauzi, Iliana; Kang, Yunyi; Yeo, David C L; Ma, Chi Yip Joan; Polak, Julia M; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2009-03-01

    In recent years, the potential of stem cell research for tissue engineering-based therapies and regenerative medicine clinical applications has become well established. In 2006, Chung pioneered the first entire organ transplant using adult stem cells and a scaffold for clinical evaluation. With this a new milestone was achieved, with seven patients with myelomeningocele receiving stem cell-derived bladder transplants resulting in substantial improvements in their quality of life. While a bladder is a relatively simple organ, the breakthrough highlights the incredible benefits that can be gained from the cross-disciplinary nature of tissue engineering and regenerative medicine (TERM) that encompasses stem cell research and stem cell bioprocessing. Unquestionably, the development of bioprocess technologies for the transfer of the current laboratory-based practice of stem cell tissue culture to the clinic as therapeutics necessitates the application of engineering principles and practices to achieve control, reproducibility, automation, validation and safety of the process and the product. The successful translation will require contributions from fundamental research (from developmental biology to the 'omics' technologies and advances in immunology) and from existing industrial practice (biologics), especially on automation, quality assurance and regulation. The timely development, integration and execution of various components will be critical-failures of the past (such as in the commercialization of skin equivalents) on marketing, pricing, production and advertising should not be repeated. This review aims to address the principles required for successful stem cell bioprocessing so that they can be applied deftly to clinical applications. PMID:19033137

  9. Stem cell bioprocessing: fundamentals and principles

    PubMed Central

    Placzek, Mark R.; Chung, I-Ming; Macedo, Hugo M.; Ismail, Siti; Mortera Blanco, Teresa; Lim, Mayasari; Min Cha, Jae; Fauzi, Iliana; Kang, Yunyi; Yeo, David C.L.; Yip Joan Ma, Chi; Polak, Julia M.; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2008-01-01

    In recent years, the potential of stem cell research for tissue engineering-based therapies and regenerative medicine clinical applications has become well established. In 2006, Chung pioneered the first entire organ transplant using adult stem cells and a scaffold for clinical evaluation. With this a new milestone was achieved, with seven patients with myelomeningocele receiving stem cell-derived bladder transplants resulting in substantial improvements in their quality of life. While a bladder is a relatively simple organ, the breakthrough highlights the incredible benefits that can be gained from the cross-disciplinary nature of tissue engineering and regenerative medicine (TERM) that encompasses stem cell research and stem cell bioprocessing. Unquestionably, the development of bioprocess technologies for the transfer of the current laboratory-based practice of stem cell tissue culture to the clinic as therapeutics necessitates the application of engineering principles and practices to achieve control, reproducibility, automation, validation and safety of the process and the product. The successful translation will require contributions from fundamental research (from developmental biology to the ‘omics’ technologies and advances in immunology) and from existing industrial practice (biologics), especially on automation, quality assurance and regulation. The timely development, integration and execution of various components will be critical—failures of the past (such as in the commercialization of skin equivalents) on marketing, pricing, production and advertising should not be repeated. This review aims to address the principles required for successful stem cell bioprocessing so that they can be applied deftly to clinical applications. PMID:19033137

  10. [Isotopy and multicorticality: two fundamental principles].

    PubMed

    Muratori, G

    1991-05-15

    Starting out from Oral Implantology pioneers, the Author comes down to the present situation, in an effort to show which values should be considered as the true ones. "Nothing new under the skies" is the Author's comment, when he examines all the techniques and materials presented as "new" for commercial purposes, whereas they are not new at all. To prove his statements he goes back to the work of some implantology pioneers, such as Formiggini, Perron Andrès, Cherchève, Strock, Pasqualini, Muratori, Tramonte, Linkow and others. In going over their most remarkable techniques, he maintains that what is being proposed nowadays as brand new was actually done long ago. Only names are now different: the process now called fibrous osseointegration used to be named osteofibrosis, and what is now called osseointegration was known as complete ossification. In order to remove the great confusion now prevailing in the dozens of implant systems, as well as in implant philosophy itself, the Author maintains that good implantologists should follow two fundamental principles: 1) implants should be built in a great variety of sizes, in order to take full advantage of cortical bones. They should be multicortical, generally quadricortical, since they should rest on the sinus floor cortical bone, on the alveolar ridge, the palatal and the buccal cortical bones (this is true for the elements implanted in the upper arch and in the front-mesial arch).(ABSTRACT TRUNCATED AT 250 WORDS) PMID:1864418

  11. Fundamental Principles of Proper Space Kinematics

    NASA Astrophysics Data System (ADS)

    Wade, Sean

    It is desirable to understand the movement of both matter and energy in the universe based upon fundamental principles of space and time. Time dilation and length contraction are features of Special Relativity derived from the observed constancy of the speed of light. Quantum Mechanics asserts that motion in the universe is probabilistic and not deterministic. While the practicality of these dissimilar theories is well established through widespread application, inconsistencies in their marriage persist, marring their utility and preventing their full expression. After identifying an error in perspective, the current theories are tested by modifying logical assumptions to eliminate paradoxical contradictions. Analysis of simultaneous frames of reference leads to a new formulation of space and time that predicts the motion of both kinds of particles. Proper Space is a real, three-dimensional space clocked by proper time that is undergoing a densification at the rate of c. Coordinate transformations to a familiar object space and a mathematical stationary space clarify the counterintuitive aspects of Special Relativity. These symmetries demonstrate that within the local universe stationary observers are a forbidden frame of reference; all is in motion. In lieu of Quantum Mechanics and Uncertainty, the use of the imaginary number i is restricted for application to the labeling of mass as either material or immaterial. This material phase difference accounts for both the perceived constant velocity of light and its apparent statistical nature. The application of Proper Space Kinematics will advance more accurate representations of microscopic, macroscopic, and cosmological processes and serve as a foundation for further study and reflection, thereafter leading to greater insight.

  12. Lighting fundamentals handbook: Lighting fundamentals and principles for utility personnel

    SciTech Connect

    Eley, C.; Tolen, T.; Benya, J.R.

    1992-12-01

    Lighting accounts for approximately 30% of overall electricity use and demand in commercial buildings. This handbook for utility personnel provides a source of basic information on lighting principles, lighting equipment, and other considerations related to lighting design. The handbook is divided into three parts. Part One, Physics of Light, has chapters on light, vision, optics, and photometry. Part Two, Lighting Equipment and Technology, focuses on lamps, luminaires, and lighting controls. Part Three, Lighting Design Decisions, deals with the manner in which lighting design decisions are made and reviews relevant methods and issues. These include the quantity and quality of light needed for visual tasks, calculation methods for verifying that lighting needs are satisfied, lighting economics and methods for evaluating investments in efficient lighting systems, and miscellaneous design issues including energy codes, power quality, photobiology, and disposal of lighting equipment. The handbook contains a discussion of the role of the utility in promoting the use of energy-efficient lighting. The handbook also includes a lighting glossary and a list of references for additional information. This convenient and comprehensive handbook is designed to enable utility lighting personnel to assist their customers in developing high-quality, energy-efficient lighting systems. The handbook is not intended to be an up-to-date reference on lighting products and equipment.

  13. The "Fundamental Pedogagical Principle" in Second Language Teaching.

    ERIC Educational Resources Information Center

    Krashen, Stephen D.

    1981-01-01

    A fundamental principle of second language acquisition is stated and applied to language teaching. The principle states that learners acquire a second language when they receive comprehensible input in situations where their affective filters are sufficiently low. The theoretical background of this principle consists of five hypotheses: the…

  14. Fundamental Ethical Principles in Sports Medicine.

    PubMed

    Devitt, Brian M

    2016-04-01

    In sports medicine, the practice of ethics presents many unique challenges because of the unusual clinical environment of caring for players within the context of a team whose primary goal is to win. Ethical issues frequently arise because a doctor-patient-team triad often replaces the traditional doctor-patient relationship. Conflict may exist when the team's priority clashes with or even replaces the doctor's obligation to player well-being. Customary ethical norms that govern most forms of clinical practice, such as autonomy and confidentiality, are not easily translated to sports medicine. Ethical principles and examples of how they relate to sports medicine are discussed. PMID:26832970

  15. Fundamental principles and applications of microfluidic systems.

    PubMed

    Ong, Soon-Eng; Zhang, Sam; Du, Hejun; Fu, Yongqing

    2008-01-01

    Microelectromechanical systems (MEMS) technology has provided the platform for the miniaturization of analytical devices for biological applications. Besides the fabrication technology, the study and understanding of the flow characteristics of fluids at the micrometer or even nanometer scale is vital for the successful implementation of such miniaturized systems. Microfluidics is currently under the spotlight for medical diagnostics and many other bio-analyses, as its small physical scale offers numerous advantages over lab-based devices. In this review, elementary concepts of fluids and their flow characteristics, together with various transport processes and microchannel conditions, are presented. They are among the fundamental building blocks for success in microfluidic systems. Selected application examples include biological cell handling employing different schemes of manipulation and DNA amplification using different microreactor arrangements and fluid flow regimes. PMID:17981751
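
    As a quick worked example of why flow in such devices behaves so differently from bench-scale equipment (our illustration; the numbers are typical textbook values, not from this review), the Reynolds number of water in a 100 µm channel sits far below the laminar-turbulent transition:

    ```python
    # Hypothetical but typical microchannel numbers: water in a 100-micron channel.
    rho = 1000.0   # density of water, kg/m^3
    mu = 1.0e-3    # dynamic viscosity of water, Pa*s
    d = 100e-6     # channel hydraulic diameter, m
    v = 1e-3       # mean flow velocity, m/s

    Re = rho * v * d / mu    # ratio of inertial to viscous forces
    print(f"Re = {Re:.2f}")  # ~0.1 -> deeply laminar: mixing relies on diffusion
    ```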

  16. Fundamental principles of energy consumption for gene expression.

    PubMed

    Huang, Lifang; Yuan, Zhanjiang; Yu, Jianshe; Zhou, Tianshou

    2015-12-01

    How energy is consumed in gene expression is largely unknown mainly due to complexity of non-equilibrium mechanisms affecting expression levels. Here, by analyzing a representative gene model that considers complexity of gene expression, we show that negative feedback increases energy consumption but positive feedback has an opposite effect; promoter leakage always reduces energy consumption; generating more bursts needs to consume more energy; and the speed of promoter switching is at the cost of energy consumption. We also find that the relationship between energy consumption and expression noise is multi-mode, depending on both the type of feedback and the speed of promoter switching. Altogether, these results constitute fundamental principles of energy consumption for gene expression, which lay a foundation for designing biologically reasonable gene modules. In addition, we discuss possible biological implications of these principles by combining experimental facts. PMID:26723140

  17. Rigorous force field optimization principles based on statistical distance minimization

    SciTech Connect

    Vlcek, Lukas; Chialvo, Ariel A.

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
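
    A schematic sketch of the idea (our construction, not the authors' implementation): tune a model parameter so that the statistical (Bhattacharyya) angle between the model's Boltzmann distribution and a target distribution is minimized. The target and model energy functions below are arbitrary stand-ins.

    ```python
    import numpy as np

    # Sketch: minimize the statistical distance s = arccos(sum_i sqrt(p_i * q_i))
    # between a target distribution p and a parameterized model distribution q.
    x = np.linspace(-2.0, 2.0, 201)

    def boltzmann(energy):
        w = np.exp(-energy)
        return w / w.sum()

    target = boltzmann(0.8 * x**2 + 0.3 * x**4)      # stand-in "experimental" target

    def stat_distance(k):
        model = boltzmann(k * x**2)                  # harmonic model, parameter k
        return np.arccos(np.clip(np.sum(np.sqrt(target * model)), -1.0, 1.0))

    ks = np.linspace(0.1, 3.0, 300)                  # simple grid search over k
    best = ks[np.argmin([stat_distance(k) for k in ks])]
    print(f"least-distinguishable model parameter: k = {best:.3f}")
    ```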

  18. Negative-Refraction Metamaterials: Fundamental Principles and Applications

    NASA Astrophysics Data System (ADS)

    Eleftheriades, G. V.; Balmain, K. G.

    2005-06-01

    Learn about the revolutionary new technology of negative-refraction metamaterials. Negative-Refraction Metamaterials: Fundamental Principles and Applications introduces artificial materials that support the unusual electromagnetic property of negative refraction. Readers will discover several classes of negative-refraction materials along with their exciting, groundbreaking applications, such as lenses and antennas, imaging with super-resolution, microwave devices, dispersion-compensating interconnects, radar, and defense. The book begins with a chapter describing the fundamentals of isotropic metamaterials in which a negative index of refraction is defined. In the following chapters, the text builds on the fundamentals by describing a range of useful microwave devices and antennas. Next, a broad spectrum of exciting new research and emerging applications is examined, including: theory and experiments behind a super-resolving, negative-refractive-index transmission-line lens; 3-D transmission-line metamaterials with a negative refractive index; numerical simulation studies of negative refraction of Gaussian beams and associated focusing phenomena; unique advantages and theory of shaped lenses made of negative-refractive-index metamaterials; a new type of transmission-line metamaterial that is anisotropic and supports the formation of sharp steerable beams (resonance cones); implementations of negative-refraction metamaterials at optical frequencies; unusual propagation phenomena in metallic waveguides partially filled with negative-refractive-index metamaterials; and metamaterials in which the refractive index and the underlying group velocity are both negative. This work brings together the best minds in this cutting-edge field. It is fascinating reading for scientists, engineers, and graduate-level students in physics, chemistry, materials science, photonics, and electrical engineering.

  19. Minimal self-models and the free energy principle

    PubMed Central

    Limanowski, Jakub; Blankenburg, Felix

    2013-01-01

    The term “minimal phenomenal selfhood” (MPS) describes the basic, pre-reflective experience of being a self (Blanke and Metzinger, 2009). Theoretical accounts of the minimal self have long recognized the importance and the ambivalence of the body as both part of the physical world, and the enabling condition for being in this world (Gallagher, 2005a; Grafton, 2009). A recent account of MPS (Metzinger, 2004a) centers on the consideration that minimal selfhood emerges as the result of basic self-modeling mechanisms, thereby being founded on pre-reflective bodily processes. The free energy principle (FEP; Friston, 2010) is a novel unified theory of cortical function built upon the imperative that self-organizing systems entail hierarchical generative models of the causes of their sensory input, which are optimized by minimizing free energy as an approximation of the log-likelihood of the model. The implementation of the FEP via predictive coding mechanisms and in particular the active inference principle emphasizes the role of embodiment for predictive self-modeling, which has been appreciated in recent publications. In this review, we provide an overview of these conceptions and illustrate thereby the potential power of the FEP in explaining the mechanisms underlying minimal selfhood and its key constituents, multisensory integration, interoception, agency, perspective, and the experience of mineness. We conclude that the conceptualization of MPS can be well mapped onto a hierarchical generative model furnished by the FEP and may constitute the basis for higher-level, cognitive forms of self-referral, as well as the understanding of other minds. PMID:24062658
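
    To make the free-energy optimization concrete, here is a deliberately minimal toy (ours, not from the review): a single-level Gaussian generative model in which gradient descent on the variational free energy of a point belief implements the perception step of predictive coding. All precisions and observations are hypothetical.

    ```python
    # Minimal free-energy sketch: prior p(x) = N(mu_p, s_p), likelihood
    # p(y|x) = N(x, s_y).  For a point belief mu, the free energy is
    # F(mu) = (y - mu)^2/(2*s_y) + (mu - mu_p)^2/(2*s_p) + const, and
    # gradient descent on F drives mu to the exact posterior mean.
    y, mu_p, s_y, s_p = 2.0, 0.0, 1.0, 1.0
    mu, lr = 0.0, 0.1
    for _ in range(100):
        dF = -(y - mu) / s_y + (mu - mu_p) / s_p   # precision-weighted prediction errors
        mu -= lr * dF
    print(f"posterior belief mu = {mu:.3f}")       # -> 1.000, the exact posterior mean
    ```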

  20. Application of trajectory optimization principles to minimize aircraft operating costs

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.; Morello, S. A.; Erzberger, H.

    1979-01-01

    This paper summarizes various applications of trajectory optimization principles that have been or are being devised by both government and industrial researchers to minimize aircraft direct operating costs (DOC). These costs (time and fuel) are computed for aircraft constrained to fly over a fixed range. Optimization theory is briefly outlined, and specific algorithms which have resulted from application of this theory are described. Typical results which demonstrate use of these algorithms and the potential savings which they can produce are given. Finally, the need for further trajectory optimization research is presented.
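
    A toy version of the underlying trade-off (our sketch; the cost and fuel-flow coefficients are invented, and real DOC algorithms model the aircraft far more carefully): for a fixed range, flying faster cuts the time cost but raises the fuel cost, so DOC has an interior minimum over cruise speed.

    ```python
    import numpy as np

    # Toy cruise-speed trade-off: over a fixed range R, DOC = Ct*T + Cf*F with
    # flight time T = R/v and a crude cubic fuel-flow model F = (a + b*v**3) * T.
    R = 2000.0               # fixed range, km
    Ct, Cf = 3000.0, 0.8     # time cost ($/h) and fuel cost ($/kg), hypothetical
    a, b = 900.0, 5.0e-6     # fuel-flow coefficients (kg/h), hypothetical

    v = np.linspace(500.0, 950.0, 451)       # candidate cruise speeds, km/h
    T = R / v                                # block time, h
    DOC = Ct * T + Cf * (a + b * v**3) * T   # direct operating cost, $
    print(f"DOC-optimal cruise speed: {v[np.argmin(DOC)]:.0f} km/h")
    ```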

  1. Classical Dynamics Based on the Minimal Length Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Chung, Won Sang

    2016-02-01

    In this paper we consider the quadratic modification of the Heisenberg algebra and its classical limit version, which we call the β-deformed Poisson bracket for the corresponding classical variables. We use the β-deformed Poisson bracket to discuss some physical problems in the β-deformed classical dynamics. Finally, we consider the (α, β)-deformed classical dynamics in which the minimal length uncertainty principle is given by [x̂, p̂] = iℏ(1 + αx̂² + βp̂²). For two small parameters α, β, we discuss the free fall of a particle and a composite system in a uniform gravitational field.
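
    A rough numerical sketch of such deformed dynamics (our reconstruction from the bracket quoted above, not the paper's calculation): with the classical bracket {x, p} = 1 + αx² + βp², Hamilton's equations pick up the bracket as a prefactor, slightly modifying free fall. The parameter values are arbitrary small numbers.

    ```python
    # Deformed free fall: xdot = {x,p} * dH/dp, pdot = -{x,p} * dH/dx,
    # with {x,p} = 1 + alpha*x**2 + beta*p**2 and H = p**2/(2m) + m*g*x.
    alpha, beta = 1e-4, 1e-4     # deformation parameters (hypothetical)
    m, g, dt = 1.0, 9.81, 1e-4
    x, p = 0.0, 0.0
    for _ in range(10_000):                  # integrate 1 s of fall (Euler)
        bracket = 1.0 + alpha * x**2 + beta * p**2
        x += dt * bracket * (p / m)          # xdot = {x,p} dH/dp
        p -= dt * bracket * (m * g)          # pdot = -{x,p} dH/dx
    print(f"deformed x(1 s) = {x:.4f} m  vs undeformed {-0.5 * g:.4f} m")
    ```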

  2. The Principle of Minimal Resistance in Non-equilibrium Thermodynamics

    NASA Astrophysics Data System (ADS)

    Mauri, Roberto

    2016-04-01

    Analytical models describing the motion of colloidal particles in given force fields are presented. In addition to local approaches, leading to well-known master equations such as the Langevin and the Fokker-Planck equations, a global description based on path integration is reviewed. A new result is presented, showing that, under very broad conditions, during its evolution a dissipative system tends to minimize its energy dissipation in such a way as to keep the Hamiltonian time rate constant, equal to the difference between the flux-based and the force-based Rayleigh dissipation functions. In fact, the Fokker-Planck equation can be interpreted as the Hamilton-Jacobi equation resulting from such a minimum principle. At steady state, the Hamiltonian time rate is maximized, leading to a minimum resistance principle. In the unsteady case, we consider the relaxation to equilibrium of harmonic oscillators and the motion of a Brownian particle in shear flow, obtaining results that coincide with the solution of the Fokker-Planck and the Langevin equations.
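
    For the harmonic-oscillator relaxation mentioned above, a minimal simulation (ours; all parameters hypothetical) of the overdamped Langevin equation reproduces the equilibrium statistics that solve the corresponding Fokker-Planck equation:

    ```python
    import numpy as np

    # Overdamped Langevin relaxation of a harmonic oscillator:
    # xdot = -k*x/zeta + sqrt(2*D)*noise, with D = kT/zeta (Einstein relation).
    rng = np.random.default_rng(1)
    k, zeta, kT, dt, n = 1.0, 1.0, 1.0, 1e-3, 10_000
    D = kT / zeta
    x = np.full(n, 3.0)                  # ensemble starts far from equilibrium
    for _ in range(5000):                # relax toward the Boltzmann distribution
        x += -k * x / zeta * dt + np.sqrt(2 * D * dt) * rng.standard_normal(n)
    print(f"<x> = {x.mean():+.3f}, <x^2> = {(x**2).mean():.3f} (theory: 0 and kT/k = 1)")
    ```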

  3. Ultrafast mid-IR laser scalpel: protein signals of the fundamental limits to minimally invasive surgery.

    PubMed

    Amini-Nik, Saeid; Kraemer, Darren; Cowan, Michael L; Gunaratne, Keith; Nadesan, Puviindran; Alman, Benjamin A; Miller, R J Dwayne

    2010-01-01

    Lasers have in principle the capability to cut at the level of a single cell, the fundamental limit to minimally invasive procedures and restructuring biological tissues. To date, this limit has not been achieved, due to macroscale collateral damage of surrounding tissue arising from thermal and shock wave effects. Here, we report on a novel concept using a specifically designed Picosecond IR Laser (PIRL) that selectively energizes water molecules in the tissue to drive the ablation or cutting process faster than thermal exchange of energy and shock wave propagation, without plasma formation or ionizing radiation effects. The targeted laser process imparts the least amount of energy in the remaining tissue, without any of the deleterious photochemical or photothermal effects that accompany other laser wavelengths and pulse parameters. Full-thickness incisional and excisional wounds were generated in CD1 mice using the Picosecond IR Laser, a conventional surgical laser (DELight Er:YAG) or mechanical surgical tools. Transmission and scanning electron microscopy showed that the PIRL laser produced minimal tissue ablation, with less damage of surrounding tissues than wounds formed using the other modalities. The width of scars formed by wounds made by the PIRL laser was half that of the scars produced using either a conventional surgical laser or a scalpel. Aniline blue staining showed higher levels of collagen in the early stage of the wounds produced using the PIRL laser, suggesting that these wounds mature faster. More viable cells were extracted from skin using the PIRL laser, suggesting less cellular damage. β-catenin and TGF-β signalling, which are activated during the proliferative phase of wound healing and whose level of activation correlates with wound size, were lower in wounds generated by the PIRL system. Wounds created with the PIRL system also showed a lower rate of cell proliferation. Direct comparison of wound

  4. Fundamental Principles of Network Formation among Preschool Children

    PubMed Central

    Schaefer, David R.; Light, John M.; Fabes, Richard A.; Hanish, Laura D.; Martin, Carol Lynn

    2009-01-01

    The goal of this research was to investigate the origins of social networks by examining the formation of children’s peer relationships in 11 preschool classes throughout the school year. We investigated whether several fundamental processes of relationship formation were evident at this age, including reciprocity, popularity, and triadic closure effects. We expected these mechanisms to change in importance over time as the network crystallizes, allowing more complex structures to evolve from simpler ones in a process we refer to as structural cascading. We analyzed intensive longitudinal observational data of children’s interactions using the SIENA actor-based model. We found evidence that reciprocity, popularity, and triadic closure all shaped the formation of preschool children’s networks. The influence of reciprocity remained consistent, whereas popularity and triadic closure became increasingly important over the course of the school year. Interactions between age and endogenous network effects were nonsignificant, suggesting that these network formation processes were not moderated by age in this sample of young children. We discuss the implications of our longitudinal network approach and findings for the study of early network developmental processes. PMID:20161606
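
    The three mechanisms the study estimates can be illustrated on a toy directed interaction network (our example data, unrelated to the study's classroom observations and far simpler than the SIENA actor-based model):

    ```python
    # Toy directed network: (i, j) means child i directed play at child j.
    ties = {(0, 1), (1, 0), (1, 2), (0, 2), (2, 3), (3, 1)}
    nodes = {i for edge in ties for i in edge}

    # reciprocity: ties that are returned
    reciprocated = sum((j, i) in ties for (i, j) in ties) // 2
    # popularity: in-degree, the basis of preferential-attachment-like effects
    in_degree = {n: sum(j == n for (_, j) in ties) for n in nodes}
    # triadic closure: two-paths i -> k -> j closed by a direct tie i -> j
    closed = sum((i, j) in ties
                 for (i, k) in ties for (k2, j) in ties
                 if k == k2 and i != j)

    print(f"reciprocated pairs: {reciprocated}")
    print(f"highest in-degree:  child {max(in_degree, key=in_degree.get)}")
    print(f"closed two-paths:   {closed}")
    ```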

  5. Fundamental Principles of Classical Mechanics: a Geometrical Perspective

    NASA Astrophysics Data System (ADS)

    Lam, Kai S.

    2014-07-01

    Classical mechanics is the quantitative study of the laws of motion for macroscopic physical systems with mass. The fundamental laws of this subject, known as Newton's Laws of Motion, are expressed in terms of second-order differential equations governing the time evolution of vectors in a so-called configuration space of a system (see Chapter 12). In an elementary setting, these are usually vectors in 3-dimensional Euclidean space, such as position vectors of point particles; but typically they can be vectors in higher dimensional and more abstract spaces. A general knowledge of the mathematical properties of vectors, not only in their most intuitive incarnations as directed arrows in physical space but as elements of abstract linear vector spaces, and those of linear operators (transformations) on vector spaces as well, is then indispensable in laying the groundwork for both the physical and the more advanced mathematical - more precisely topological and geometrical - concepts that will prove to be vital in our subject. In this beginning chapter we will review these properties, and introduce the all-important related notions of dual spaces and tensor products of vector spaces. The notational convention for vectorial and tensorial indices used for the rest of this book (except when otherwise specified) will also be established...

  6. Lighting fundamentals handbook: Lighting fundamentals and principles for utility personnel. Final report

    SciTech Connect

    Eley, C.; Tolen, T.; Benya, J.R.

    1992-12-01

    Lighting accounts for approximately 30% of overall electricity use and demand in commercial buildings. This handbook for utility personnel provides a source of basic information on lighting principles, lighting equipment, and other considerations related to lighting design. The handbook is divided into three parts. Part One, Physics of Light, has chapters on light, vision, optics, and photometry. Part Two, Lighting Equipment and Technology, focuses on lamps, luminaires, and lighting controls. Part Three, Lighting Design Decisions, deals with the manner in which lighting design decisions are made and reviews relevant methods and issues. These include the quantity and quality of light needed for visual tasks, calculation methods for verifying that lighting needs are satisfied, lighting economics and methods for evaluating investments in efficient lighting systems, and miscellaneous design issues including energy codes, power quality, photobiology, and disposal of lighting equipment. The handbook contains a discussion of the role of the utility in promoting the use of energy-efficient lighting. The handbook also includes a lighting glossary and a list of references for additional information. This convenient and comprehensive handbook is designed to enable utility lighting personnel to assist their customers in developing high-quality, energy-efficient lighting systems. The handbook is not intended to be an up-to-date reference on lighting products and equipment.

  7. [The input of medical community into development of fundamental principles of Zemstvo medicine of Russia].

    PubMed

    Yegorysheva, I V

    2013-01-01

    The article considers the participation of the medical community in the formation of the fundamental principles of a unique system of public health: the Zemstvo medicine. This was reflected in the activities of medical scientific societies and congresses and in the periodical medical press. PMID:24649614

  8. Free minimization of the fundamental measure theory functional: Freezing of parallel hard squares and cubes.

    PubMed

    Belli, S; Dijkstra, M; van Roij, R

    2012-09-28

    Due to remarkable advances in colloid synthesis techniques, systems of squares and cubes, once an academic abstraction for theorists and simulators, are nowadays an experimental reality. By means of a free minimization of the free-energy functional, we apply fundamental measure theory to analyze the phase behavior of parallel hard squares and hard cubes. We compare our results with those obtained by the traditional approach based on the Gaussian parameterization, finding small deviations and good overall agreement between the two methods. For hard squares, our predictions feature at intermediate packing fraction a smectic phase, which is however expected to be unstable due to thermal fluctuations. Due to this inconsistency, we cannot determine unambiguously the prediction of the theory for the expected fluid-to-crystal transition of parallel hard squares, but we deduce two alternative scenarios: (i) a second-order transition with a coexisting vacancy-rich crystal or (ii) a higher-density first-order transition with a coexisting crystal characterized by a lower vacancy concentration. In accordance with previous studies, a second-order transition with a high vacancy concentration is predicted for hard cubes. PMID:23020342

  9. A defense of fundamental principles and human rights: a reply to Robert Baker.

    PubMed

    Macklin, Ruth

    1998-12-01

    This article seeks to rebut Robert Baker's contention that attempts to ground international bioethics in fundamental principles cannot withstand the challenges posed by multiculturalism and postmodernism. First, several corrections are provided of Baker's account of the conclusions reached by the Advisory Committee on Human Radiation Experiments. Second, a rebuttal is offered to Baker's claim that an unbridgeable moral gap exists between Western individualism and non-Western communalism. In conclusion, this article argues that Baker's "nonnegotiable primary goods" cannot do the work of "classical human rights" and that the latter framework is preferable from both a practical and a theoretical standpoint. PMID:11657320

  10. The uncertainty threshold principle - Fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1976-01-01

    The fundamental limitations of the optimal control of dynamic systems with random parameters are analyzed by studying a scalar linear-quadratic optimal control example. It is demonstrated that optimum long-range decision making is possible only if the dynamic uncertainty (quantified by the means and covariances of the random parameters) is below a certain threshold. If this threshold is exceeded, there do not exist optimum decision rules. This phenomenon is called the 'uncertainty threshold principle'. The implications of this phenomenon to the field of modelling, identification, and adaptive control are discussed.
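
    A sketch of the scalar example (our reconstruction of the recursion's general form, not the paper's exact equations): iterating the Riccati-like cost recursion for x_{k+1} = a·x_k + b·u_k with random a and b stays bounded below the uncertainty threshold and blows up above it.

    ```python
    # Scalar LQ sketch: independent random parameters a ~ (ma, va), b ~ (mb, vb).
    # The expected cost-to-go K stays bounded only while the uncertainty measure
    # T = ma**2 + va - (ma*mb)**2 / (mb**2 + vb) is at most 1.
    def cost_recursion(ma, va, mb, vb, q=1.0, r=1.0, steps=200):
        K = q
        for _ in range(steps):   # Riccati-like backward recursion, scalar case
            K = q + (ma**2 + va) * K - (ma * mb * K) ** 2 / (r + (mb**2 + vb) * K)
        return K

    ma, mb, vb = 1.0, 1.0, 0.5
    for va in (0.1, 2.0):        # below vs above the uncertainty threshold
        T = ma**2 + va - (ma * mb) ** 2 / (mb**2 + vb)
        print(f"va={va}: T={T:.2f}, K after 200 steps = {cost_recursion(ma, va, mb, vb):.3g}")
    ```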

  11. Astronomical Tests of Relativity: Beyond Parameterized Post-Newtonian Formalism (PPN), to Testing Fundamental Principles

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik

    2009-05-01

    By the early 1970s, the improved accuracy of astrometric and time measurements enabled researchers not only to experimentally compare relativistic gravity with the Newtonian predictions, but also to compare different relativistic gravitational theories (e.g., the Brans-Dicke Scalar-Tensor Theory of Gravitation). For this comparison, Kip Thorne and others developed the Parameterized Post-Newtonian Formalism (PPN), and derived the dependence of different astronomically observable effects on the values of the corresponding parameters. Since then, all the observations have confirmed General Relativity. In other words, the question of which relativistic gravitation theory is in the best accordance with the experiments has been largely settled. This does not mean that General Relativity is the final theory of gravitation: it needs to be reconciled with quantum physics (into quantum gravity), it may also need to be reconciled with numerous surprising cosmological observations, etc. It is therefore reasonable to prepare an extended version of the PPN formalism, that will enable us to test possible quantum-related modifications of General Relativity. In particular, we need to include the possibility of violating fundamental principles that underlie the PPN formalism but that may be violated in quantum physics, such as scale-invariance, T-invariance, P-invariance, energy conservation, spatial isotropy violations, etc. In this talk, we present the first attempt to design the corresponding extended PPN formalism, with the (partial) analysis of the relation between the corresponding fundamental physical principles.

  12. Astronomical tests of relativity: beyond parameterized post-Newtonian formalism (PPN), to testing fundamental principles

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik

    2010-01-01

    By the early 1970s, the improved accuracy of astrometric and time measurements enabled researchers not only to experimentally compare relativistic gravity with the Newtonian predictions, but also to compare different relativistic gravitational theories (e.g., the Brans-Dicke Scalar-Tensor Theory of Gravitation). For this comparison, Kip Thorne and others developed the Parameterized Post-Newtonian Formalism (PPN), and derived the dependence of different astronomically observable effects on the values of the corresponding parameters. Since then, all the observations have confirmed General Relativity. In other words, the question of which relativistic gravitation theory is in the best accordance with the experiments has been largely settled. This does not mean that General Relativity is the final theory of gravitation: it needs to be reconciled with quantum physics (into quantum gravity), it may also need to be reconciled with numerous surprising cosmological observations, etc. It is, therefore, reasonable to prepare an extended version of the PPN formalism, that will enable us to test possible quantum-related modifications of General Relativity. In particular, we need to include the possibility of violating fundamental principles that underlie the PPN formalism but that may be violated in quantum physics, such as scale-invariance, T-invariance, P-invariance, energy conservation, spatial isotropy violations, etc. In this paper, we present the first attempt to design the corresponding extended PPN formalism, with the (partial) analysis of the relation between the corresponding fundamental physical principles.

  13. Polynomial-time algorithms for the integer minimal principle for centrosymmetric structures.

    PubMed

    Vaia, Anastasia; Sahinidis, Nikolaos V

    2005-07-01

    The minimal principle for structure determination from single-crystal X-ray diffraction measurements has recently been formulated as an integer linear optimization model for the case of centrosymmetric structures. Solution of this model via established combinatorial branch-and-bound algorithms provides the true global minimum of the minimal principle while operating exclusively in reciprocal space. However, integer programming techniques may require an exponential number of iterations to exhaust the search space. In this paper, a new approach is developed to solve the integer minimal principle to global optimality without requiring the solution of an optimization problem. Instead, properties of the solution of the optimization problem, as observed in a large number of computational experiments, are exploited in order to reduce the optimization formulation to a system of linear equations over the field of two elements (F2). Two specialized Gaussian elimination algorithms are then developed to solve this system of equations in polynomial time in the number of atoms. Computational results on a collection of 38 structures demonstrate that the proposed approach provides very fast and accurate solutions to the phase problem for centrosymmetric structures. This approach also provided much better crystallographic R values than SHELXS for all 38 structures tested. PMID:15972998
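
    The specialized algorithms themselves are not reproduced in the abstract, but the key ingredient, Gaussian elimination over F2 (where addition is XOR), runs in polynomial time and can be sketched generically (our code, not the authors'):

    ```python
    def solve_gf2(rows, rhs, n):
        """Solve A x = b over GF(2); rows[i] is an n-bit mask, rhs[i] in {0, 1}."""
        pivots = []                                 # (pivot column, row mask, rhs bit)
        for row, b in zip(rows, rhs):
            for col, r, rb in pivots:               # reduce the new row by each pivot
                if row >> col & 1:
                    row ^= r
                    b ^= rb
            if row == 0:
                if b:
                    return None                     # 0 = 1: inconsistent system
                continue
            col = row.bit_length() - 1              # highest set bit is the new pivot
            pivots = [(c, r ^ row, rb ^ b) if r >> col & 1 else (c, r, rb)
                      for c, r, rb in pivots]       # keep all pivot rows fully reduced
            pivots.append((col, row, b))
        x = 0
        for col, row, b in pivots:                  # free variables default to 0, so
            if b:                                   # each pivot variable equals its rhs
                x |= 1 << col
        return [x >> i & 1 for i in range(n)]

    # x0 + x1 = 1, x1 + x2 = 1, x0 + x2 = 0  ->  one solution is x = (0, 1, 0)
    print(solve_gf2([0b011, 0b110, 0b101], [1, 1, 0], 3))
    ```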

  14. Contemporary extracorporeal membrane oxygenation therapy in adults: Fundamental principles and systematic review of the evidence.

    PubMed

    Squiers, John J; Lima, Brian; DiMaio, J Michael

    2016-07-01

    Extracorporeal membrane oxygenation (ECMO) provides days to weeks of support for patients with respiratory, cardiac, or combined cardiopulmonary failure. Since ECMO was first reported in 1974, nearly 70,000 runs of ECMO have been implemented, and the use of ECMO in adults increased by more than 400% from 2006 to 2011 in the United States. A variety of factors, including the 2009 influenza A epidemic, results from recent clinical trials, and improvements in ECMO technology, have motivated this increased use in adults. Because ECMO is increasingly becoming available to a diverse population of critically ill patients, we provide an overview of its fundamental principles and a systematic review of the evidence basis of this treatment modality for a variety of indications in adults. PMID:27060027

  15. Position-sensitive detection of slow neutrons: Survey of fundamental principles

    SciTech Connect

    Crawford, R.K.

    1992-07-01

    This paper sets forth the fundamental principles governing the development of position-sensitive detection systems for slow neutrons. Since neutrons are only weakly interacting with most materials, it is not generally practical to detect slow neutrons directly. Therefore all practical slow neutron detection mechanisms depend on the use of nuclear reactions to "convert" the neutron to one or more charged particles, followed by the subsequent detection of the charged particles. The different conversion reactions which can be used are discussed, along with the relative merits of each. This is followed with a discussion of the various methods of charged particle detection, how these lend themselves to position-sensitive encoding, and the means of position encoding which can be applied to each case. Detector performance characteristics which may be of importance to the end user are discussed and related to these various detection and position-encoding mechanisms.

  16. Position-sensitive detection of slow neutrons: Survey of fundamental principles

    SciTech Connect

    Crawford, R.K.

    1992-01-01

    This paper sets forth the fundamental principles governing the development of position-sensitive detection systems for slow neutrons. Since neutrons are only weakly interacting with most materials, it is not generally practical to detect slow neutrons directly. Therefore all practical slow neutron detection mechanisms depend on the use of nuclear reactions to "convert" the neutron to one or more charged particles, followed by the subsequent detection of the charged particles. The different conversion reactions which can be used are discussed, along with the relative merits of each. This is followed with a discussion of the various methods of charged particle detection, how these lend themselves to position-sensitive encoding, and the means of position encoding which can be applied to each case. Detector performance characteristics which may be of importance to the end user are discussed and related to these various detection and position-encoding mechanisms.

  17. Minimization principles for the coupled problem of Darcy-Biot-type fluid transport in porous media linked to phase field modeling of fracture

    NASA Astrophysics Data System (ADS)

    Miehe, Christian; Mauthe, Steffen; Teichtmeister, Stephan

    2015-09-01

    This work develops new minimization and saddle point principles for the coupled problem of Darcy-Biot-type fluid transport in porous media at fracture. It shows that the quasi-static problem of elastically deforming, fluid-saturated porous media is related to a minimization principle for the evolution problem. This two-field principle determines the rate of deformation and the fluid mass flux vector. It provides a canonically compact model structure, where the stress equilibrium and the inverse Darcy's law appear as the Euler equations of a variational statement. A Legendre transformation of the dissipation potential relates the minimization principle to a characteristic three field saddle point principle, whose Euler equations determine the evolutions of deformation and fluid content as well as Darcy's law. A further geometric assumption results in modified variational principles for a simplified theory, where the fluid content is linked to the volumetric deformation. The existence of these variational principles underlines inherent symmetries of Darcy-Biot theories of porous media. This can be exploited in the numerical implementation by the construction of time- and space-discrete variational principles, which fully determine the update problems of typical time stepping schemes. Here, the proposed minimization principle for the coupled problem is advantageous with regard to a new unconstrained stable finite element design, while space discretizations of the saddle point principles are constrained by the LBB condition. The variational principles developed provide the most fundamental approach to the discretization of nonlinear fluid-structure interactions, showing symmetric systems in algebraic update procedures. They also provide an excellent starting point for extensions towards more complex problems. This is demonstrated by developing a minimization principle for a phase field description of fracture in fluid-saturated porous media. It is designed for an

  18. The fundamental operating principles of electronic root canal length measurement devices.

    PubMed

    Nekoofar, M H; Ghandi, M M; Hayes, S J; Dummer, P M H

    2006-08-01

    It is generally accepted that root canal treatment procedures should be confined within the root canal system. To achieve this objective the canal terminus must be detected accurately during canal preparation and precise control of working length during the process must be maintained. Several techniques have been used for determining the apical canal terminus including electronic methods. However, the fundamental electronic operating principles and classification of the electronic devices used in this method are often unknown and a matter of controversy. The basic assumption with all electronic length measuring devices is that human tissues have certain characteristics that can be modelled by a combination of electrical components. Therefore, by measuring the electrical properties of the model, such as resistance and impedance, it should be possible to detect the canal terminus. The root canal system is surrounded by dentine and cementum that are insulators to electrical current. At the minor apical foramen, however, there is a small hole in which conductive materials within the canal space (tissue, fluid) are electrically connected to the periodontal ligament that is itself a conductor of electric current. Thus, dentine, along with tissue and fluid inside the canal, forms a resistor, the value of which depends on their dimensions, and their inherent resistivity. When an endodontic file penetrates inside the canal and approaches the minor apical foramen, the resistance between the endodontic file and the foramen decreases, because the effective length of the resistive material (dentine, tissue, fluid) decreases. As well as resistive properties, the structure of the tooth root has capacitive characteristics. Therefore, various electronic methods have been developed that use a variety of other principles to detect the canal terminus. Whilst the simplest devices measure resistance, other devices measure impedance using either high frequency, two frequencies, or
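
    The resistance argument can be put in numbers with a deliberately crude model (ours; the resistivity and cross-section values are invented): treating the fluid column between the file tip and the foramen as a resistor R = ρL/A, R falls steadily as the file advances.

    ```python
    # Crude resistor model of the canal: R = rho * L / A for the fluid column
    # of length L remaining between file tip and apical foramen.
    rho = 70.0      # effective resistivity of canal fluid, ohm*cm (hypothetical)
    area = 0.002    # effective conduction cross-section, cm^2 (hypothetical)

    for remaining_mm in (10, 5, 2, 1, 0.5):
        R = rho * (remaining_mm / 10.0) / area   # mm converted to cm
        print(f"{remaining_mm:>4} mm from foramen: R = {R:7.0f} ohm")
    ```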

  19. Driving an Active Vibration Balancer to Minimize Vibrations at the Fundamental and Harmonic Frequencies

    NASA Technical Reports Server (NTRS)

    Holliday, Ezekiel S. (Inventor)

    2014-01-01

    Vibrations of a principal machine are reduced at the fundamental and harmonic frequencies by driving the drive motor of an active balancer with balancing signals at the fundamental and selected harmonics. Vibrations are sensed to provide a signal representing the mechanical vibrations. A balancing signal generator for the fundamental and for each selected harmonic processes the sensed vibration signal with adaptive filter algorithms of adaptive filters for each frequency to generate a balancing signal for each frequency. Reference inputs for each frequency are applied to the adaptive filter algorithms of each balancing signal generator at the frequency assigned to the generator. The harmonic balancing signals for all of the frequencies are summed and applied to drive the drive motor. The harmonic balancing signals drive the drive motor with a drive voltage component in opposition to the vibration at each frequency.
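
    In miniature, the scheme reads as one LMS adaptive filter per selected frequency, each driven by quadrature references and summed into the drive signal. The sketch below is our simplification (it omits the motor and sensor dynamics of the actual invention; all constants are invented):

    ```python
    import numpy as np

    # One two-weight LMS filter per frequency, fed cos/sin references at the
    # fundamental f0 and selected harmonics; summed outputs oppose the vibration.
    fs, f0 = 1000.0, 25.0                      # sample rate and fundamental, Hz
    harmonics, mu = (1, 2, 3), 0.01            # selected harmonics, adaptation rate
    w = {h: np.zeros(2) for h in harmonics}    # [cos, sin] weights per frequency

    def vib(n):  # sensed vibration: fundamental plus second harmonic
        t = n / fs
        return 1.0 * np.cos(2 * np.pi * f0 * t) + 0.5 * np.sin(2 * np.pi * 2 * f0 * t)

    for n in range(20_000):
        t = n / fs
        refs = {h: np.array([np.cos(2 * np.pi * h * f0 * t),
                             np.sin(2 * np.pi * h * f0 * t)]) for h in harmonics}
        drive = sum(w[h] @ refs[h] for h in harmonics)   # summed balancing signal
        residual = vib(n) + drive                        # what the sensor now sees
        for h in harmonics:                              # LMS update per frequency
            w[h] -= 2 * mu * residual * refs[h]
    print(f"residual vibration after adaptation: {abs(residual):.2e}")
    ```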

  20. Mobility analysis tool based on the fundamental principle of conservation of energy.

    SciTech Connect

    Spletzer, Barry Louis; Nho, Hyuchul C.; Salton, Jonathan Robert

    2007-08-01

    In the past decade, a great deal of effort has been focused on the research and development of versatile robotic ground vehicles without understanding their performance in a particular operating environment. As the usage of robotic ground vehicles for intelligence applications increases, understanding the mobility of the vehicles becomes critical to increasing the probability of their successful operations. This paper describes a framework based on conservation of energy to predict the maximum mobility of robotic ground vehicles over general terrain. The basis of the prediction is the difference between traction capability and energy loss at the vehicle-terrain interface. The mission success of a robotic ground vehicle is primarily a function of mobility. Mobility of a vehicle is defined as the overall capability of a vehicle to move from place to place while retaining its ability to perform its primary mission. A mobility analysis tool based on the fundamental principle of conservation of energy is described in this document. The tool is a graphical user interface application. The mobility analysis tool has been developed at Sandia National Laboratories, Albuquerque, NM. The tool is at an initial stage of development. In the future, the tool will be expanded to include all vehicles and terrain types.
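
    As a flavor of the energy-balance reasoning (our toy, far simpler than the described tool): a slope is traversable only while available traction work covers the interface losses plus the gain in potential energy, which for quasi-static motion reduces to tan(θ) ≤ μ - c_r, with hypothetical coefficients below.

    ```python
    import math

    # Quasi-static energy balance per unit distance: traction mu*m*g*cos(th)
    # must cover rolling loss c_r*m*g*cos(th) plus gravity m*g*sin(th),
    # i.e. tan(th) <= mu - c_r at the steepest negotiable slope.
    def max_slope_deg(mu_traction, c_rolling):
        return math.degrees(math.atan(mu_traction - c_rolling))

    print(f"loose sand (mu=0.3, c_r=0.10): {max_slope_deg(0.3, 0.10):.1f} deg")
    print(f"gravel     (mu=0.6, c_r=0.05): {max_slope_deg(0.6, 0.05):.1f} deg")
    ```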

  1. [Fundamental ethical principles in the European framework programmes for research and development].

    PubMed

    Hirsch, François; Karatzas, Isidoros; Zilgalvis, Pēteris

    2009-01-01

    The European Commission is one of the most important international funding bodies for research conducted in Europe and beyond, including in developing countries and countries in transition. Through its framework programmes for research and development, the European Union finances a vast array of projects in fields affecting citizens' health, as well as researchers' mobility, the development of new technologies, and the safeguarding of the environment. With the agreement of the European Parliament and the Council of Ministers, the two decisional authorities of the European Union, the 7th Framework Programme was started in December 2006. This programme has a budget of 54 billion euros to be distributed over a 7-year period. The European Union thereby aims to meet the challenge set by the European Council of Lisbon (March 2000), which proposed devoting 3% of the GDP of all Member States to research and development. One of the important conditions set by the Members of the European Parliament for allocating this funding is to ensure that "the funded research activities respect the fundamental ethical principles". In this article, we discuss this aspect of the evaluation. PMID:19765393

  2. Designing nanomaterials to maximize performance and minimize undesirable implications guided by the Principles of Green Chemistry.

    PubMed

    Gilbertson, Leanne M; Zimmerman, Julie B; Plata, Desiree L; Hutchison, James E; Anastas, Paul T

    2015-08-21

    The Twelve Principles of Green Chemistry were first published in 1998 and provide a framework that has been adopted not only by chemists, but also by design practitioners and decision-makers (e.g., materials scientists and regulators). The development of the Principles was initially motivated by the need to address decades of unintended environmental pollution and human health impacts from the production and use of hazardous chemicals. Yet, for over a decade now, the Principles have been applied to the synthesis and production of engineered nanomaterials (ENMs) and the products they enable. While the combined efforts of the global scientific community have led to promising advances in the field of nanotechnology, there remain significant research gaps and the opportunity to leverage the potential global economic, societal and environmental benefits of ENMs safely and sustainably. As such, this tutorial review benchmarks the successes to date and identifies critical research gaps to be considered as future opportunities for the community to address. A sustainable material design framework is proposed that emphasizes the importance of establishing structure-property-function (SPF) and structure-property-hazard (SPH) relationships to guide the rational design of ENMs. The goal is to achieve or exceed the functional performance of current materials and the technologies they enable, while minimizing inherent hazard to avoid risk to human health and the environment at all stages of the life cycle. PMID:25955514

  3. Simple approach to sediment provenance tracing using element analysis and fundamental principles

    NASA Astrophysics Data System (ADS)

    Matys Grygar, Tomas; Elznicova, Jitka; Popelka, Jan

    2016-04-01

    Common sediment fingerprinting techniques use either (1) extensive analytical datasets, sometimes nearly complete with respect to accessible characterization techniques, processed by multidimensional statistics that rest on certain assumptions about the distribution functions of analytical results and the conservativeness/additivity of some components, or (2) analytically demanding characteristics such as isotope ratios, assumed to be unequivocal "labels" of the parent material unaltered by any catchment process. The inherent problem of approach (1) is that the interpretation of statistical components ("sources") is done ex post and remains purely formal. The problem of approach (2) is that catchment processes (weathering, transport, deposition) can modify most geochemical parameters of soils and sediments; in other words, the idea that some geochemical parameters are "conservative" may be idealistic. Grain-size effects and sediment provenance have a joint influence on the chemical composition of fluvial sediments that is not easy to disentangle. Attempts to separate these two main components using statistics alone seem risky and equivocal, because the grain-size dependence of element composition is nearly individual for each element and reflects sediment maturity and catchment-specific formation and transport processes. We suppose that the use of less extensive datasets of analytical results, interpreted with respect for fundamental principles, should be more robust than purely statistical tools applied to overwhelming datasets. We examined sediment composition, both published by other researchers and gathered by us, and we found some general principles which are, in our opinion, relevant for fingerprinting: (1) concentrations of all elements are grain-size sensitive, i.e. there are no "conservative" elements in the conventional sense of provenance- or transport-pathway tracing, (2) fractionation by catchment processes and fluvial transport changes

  4. $D^{1,2}(\mathbb{R}^N)$ versus $C(\mathbb{R}^N)$ local minimizer and a Hopf-type maximum principle

    NASA Astrophysics Data System (ADS)

    Carl, Siegfried; Costa, David G.; Tehrani, Hossein

    2016-08-01

    We consider functionals of the form $\Phi(u) = \frac{1}{2}\int_{\mathbb{R}^N} |\nabla u|^2 - \int_{\mathbb{R}^N} b(x)\,G(u)$ on $D^{1,2}(\mathbb{R}^N)$, $N \ge 3$, whose critical points are the weak solutions of a corresponding elliptic equation in the whole $\mathbb{R}^N$. We present a Brezis-Nirenberg type result and a Hopf-type maximum principle in the context of the space $D^{1,2}(\mathbb{R}^N)$. More precisely, we prove that a local minimizer of $\Phi$ in the topology of the subspace $V$ must be a local minimizer of $\Phi$ in the $D^{1,2}(\mathbb{R}^N)$-topology, where $V := \{ v \in D^{1,2}(\mathbb{R}^N) : v \in C(\mathbb{R}^N) \text{ with } \sup_{x \in \mathbb{R}^N} (1 + |x|^{N-2})\,|v(x)| < \infty \}$. It is well known that the Brezis-Nirenberg result has proved to be a strong tool in the study of multiple solutions for elliptic boundary value problems in bounded domains. We believe that the result obtained in this paper may play a similar role for elliptic problems in $\mathbb{R}^N$.

  5. Prediction of Metabolic Flux Distribution from Gene Expression Data Based on the Flux Minimization Principle

    PubMed Central

    Song, Hyun-Seob; Reifman, Jaques; Wallqvist, Anders

    2014-01-01

    Prediction of possible flux distributions in a metabolic network provides detailed phenotypic information that links metabolism to cellular physiology. To estimate metabolic steady-state fluxes, the most common approach is to solve a set of macroscopic mass balance equations subject to stoichiometric constraints while attempting to optimize an assumed optimal objective function. This assumption is justifiable in specific cases but may be invalid when tested across different conditions, cell populations, or other organisms. With the aim of providing a more consistent and reliable prediction of flux distributions over a wide range of conditions, in this article we propose a framework that uses the flux minimization principle to predict active metabolic pathways from mRNA expression data. The proposed algorithm minimizes a weighted sum of flux magnitudes, while biomass production can be bounded to fit an ample range from very low to very high values according to the analyzed context. We have formulated the flux weights as a function of the corresponding enzymatic reaction's gene expression value, enabling the creation of context-specific fluxes based on a generic metabolic network. In case studies of wild-type Saccharomyces cerevisiae, and wild-type and mutant Escherichia coli strains, our method achieved high prediction accuracy, as gauged by correlation coefficients and sums of squared error, with respect to the experimentally measured values. In contrast to other approaches, our method was able to provide quantitative predictions for both model organisms under a variety of conditions. Our approach requires no prior knowledge or assumption of a context-specific metabolic functionality and does not require trial-and-error parameter adjustments. Thus, our framework is of general applicability for modeling the transcription-dependent metabolism of bacteria and yeasts. PMID:25397773
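
    The core optimization described above, minimizing a weighted sum of flux magnitudes under steady-state stoichiometry, is a linear program once each flux is split into positive and negative parts. The sketch below is a toy illustration of that idea on a made-up four-reaction network, with weights as an invented decreasing function of expression; it is not the paper's exact formulation.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy flux minimization: minimize sum_i w_i * |v_i| subject to S v = 0
    # and a required biomass-producing flux. Network and numbers invented.
    # Reactions: R1: A_ext -> A, R2: A -> B, R3: B -> biomass, R4: A -> B.
    #              R1   R2   R3   R4
    S = np.array([[ 1,  -1,   0,  -1],   # metabolite A
                  [ 0,   1,  -1,   1]])  # metabolite B
    expression = np.array([5.0, 1.0, 5.0, 4.0])   # mRNA levels (hypothetical)
    w = 1.0 / (1.0 + expression)                  # higher expression -> lower weight

    n = S.shape[1]
    # Split v = p - q with p, q >= 0 so that |v| = p + q in the objective.
    c = np.concatenate([w, w])
    A_eq = np.hstack([S, -S])
    b_eq = np.zeros(S.shape[0])
    bounds = [(0, None)] * (2 * n)   # p, q >= 0
    bounds[2] = (1.0, None)          # p-part of R3: require >= 1 unit of biomass flux
    bounds[n + 2] = (0.0, 0.0)       # q-part of R3: irreversible, so v3 = p3 >= 1

    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
    v = res.x[:n] - res.x[n:]
    print("fluxes:", np.round(v, 3))  # the cheaper (highly expressed) route is used
    ```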

  6. Emergent features and perceptual objects: re-examining fundamental principles in analogical display design.

    PubMed

    Holt, Jerred; Bennett, Kevin B; Flach, John M

    2015-01-01

    Two sets of design principles for analogical visual displays, based on the concepts of emergent features and perceptual objects, are described. An interpretation of previous empirical findings for three displays (bar graph, polar graphic, alphanumeric) is provided from both perspectives. A fourth display (configural coordinate) was designed using principles of ecological interface design (i.e. direct perception). An experiment was conducted to evaluate performance (accuracy and latency of state identification) with these four displays. Numerous significant effects were obtained and a clear rank ordering of performance emerged (from best to worst): configural coordinate, bar graph, alphanumeric and polar graphic. These findings are consistent with principles of design based on emergent features; they are inconsistent with principles based on perceptual objects. Some limitations of the configural coordinate display are discussed and a redesign is provided. Practitioner Summary: Principles of ecological interface design, which emphasise the quality of very specific mappings between domain, display and observer constraints, are described; these principles are applicable to the design of all analogical graphical displays. PMID:26218496

  7. Developing a Dynamics and Vibrations Course for Civil Engineering Students Based on Fundamental-Principles

    ERIC Educational Resources Information Center

    Barroso, Luciana R.; Morgan, James R.

    2012-01-01

    This paper describes the creation and evolution of an undergraduate dynamics and vibrations course for civil engineering students. Incorporating vibrations into the course allows students to see and study "real" civil engineering applications of the course content. This connection of academic principles to real life situations is in…

  8. The Uncertainty Threshold Principle: Some Fundamental Limitations of Optimal Decision Making Under Dynamic Uncertainity

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.

  9. The uncertainty threshold principle - Some fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.

  10. Fundamental principles of hollow-cathode-discharge operations in space and the design of a rocket-borne demonstration

    NASA Technical Reports Server (NTRS)

    Szuszczewicz, Edward P.; Mccoy, James E.; Bonifazi, Carlo; Dobrowolny, Marino

    1988-01-01

    The issue of hollow-cathode operations in space is treated from the point of view of fundamental principles of plasma interactions and their control over currents involving the device, the spaceborne vehicle, and the ambient space plasma. Particular attention is given to collective plasma processes, the effects of the ambient magnetic field, and the high probability of plasma turbulence triggered by hollow-cathode operations. The paper presents a rocket payload and experiment scenario designed for accommodation on a Black Brant booster, launched from a midlatitude site to an apogee in excess of 400 km.

  11. Bench-to-bedside review: Fundamental principles of acid-base physiology

    PubMed Central

    Corey, Howard E

    2005-01-01

    Complex acid–base disorders arise frequently in critically ill patients, especially in those with multiorgan failure. In order to diagnose and treat these disorders better, some intensivists have abandoned traditional theories in favor of revisionist models of acid–base balance. With claimed superiority over the traditional approach, the new methods have rekindled debate over the fundamental principles of acid–base physiology. In order to shed light on this controversy, we review the derivation and application of new models of acid–base balance. PMID:15774076

  12. Al-Air Batteries: Fundamental Thermodynamic Limitations from First Principles Theory

    NASA Astrophysics Data System (ADS)

    Chen, Leanne D.; Noerskov, Jens K.; Luntz, Alan C.

    2015-03-01

    The Al-air battery possesses high theoretical specific energy (4140 Wh/kg) and is therefore an attractive candidate for vehicle propulsion applications. However, the experimentally observed open-circuit potential is much lower than what thermodynamics predicts, and this potential loss is widely believed to be an effect of corrosion. We present a detailed study of the Al-air battery using density functional theory. The results suggest that the difference between bulk thermodynamic and surface potentials is due to both the effects of asymmetry in multi-electron transfer reactions that define the anodic dissolution of Al and, more importantly, a large chemical step inherent to the formation of bulk Al(OH)3 from surface intermediates. The former results in an energy loss of 3%, while the latter accounts for 14-29% of the total thermodynamic energy depending on the surface site where dissolution occurs. Therefore, the maximum open-circuit potential of the Al anode is only -1.87 V vs. SHE in the absence of thermal excitations, contrary to -2.34 V predicted by bulk thermodynamics at pH 14.6. This is a fundamental limitation of the system and governs the maximum output potential, which cannot be improved even if corrosion effects were completely suppressed. Supported by the Natural Sciences and Engineering Research Council of Canada and the ReLiable Project (#11-116792) funded by the Danish Council for Strategic Research.

  13. Al-Air Batteries: Fundamental Thermodynamic Limitations from First-Principles Theory.

    PubMed

    Chen, Leanne D; Nørskov, Jens K; Luntz, Alan C

    2015-01-01

    The Al-air battery possesses high theoretical specific energy (4140 W h/kg) and is therefore an attractive candidate for vehicle propulsion. However, the experimentally observed open-circuit potential is much lower than what bulk thermodynamics predicts, and this potential loss is typically attributed to corrosion. Similarly, large Tafel slopes associated with the battery are assumed to be due to film formation. We present a detailed thermodynamic study of the Al-air battery using density functional theory. The results suggest that the maximum open-circuit potential of the Al anode is only -1.87 V versus the standard hydrogen electrode at pH 14.6 instead of the traditionally assumed -2.34 V and that large Tafel slopes are inherent in the electrochemistry. These deviations from the bulk thermodynamics are intrinsic to the electrochemical surface processes that define Al anodic dissolution. This has contributions from both asymmetry in multielectron transfers and, more importantly, a large chemical stabilization inherent to the formation of bulk Al(OH)3 from surface intermediates. These are fundamental limitations that cannot be improved even if corrosion and film effects are completely suppressed. PMID:26263108

  14. A covariant action principle for dissipative fluid dynamics: from formalism to fundamental physics

    NASA Astrophysics Data System (ADS)

    Andersson, N.; Comer, G. L.

    2015-04-01

    We present a new variational framework for dissipative general relativistic fluid dynamics. The model extends the convective variational principle for multi-fluid systems to account for a range of dissipation channels. The key ingredients in the construction are (i) the use of a lower dimensional matter space for each fluid component, and (ii) an extended functional dependence for the associated volume forms. In an effort to make the concepts clear, the formalism is developed step-by-step with model examples considered at each level. Thus we consider a model for heat flow, derive the relativistic Navier-Stokes equations and discuss why the individual dissipative stress tensors need not be spacetime symmetric. We argue that the new formalism, which notably does not involve an expansion away from an assumed equilibrium state, provides a conceptual breakthrough in this area of research. We also provide an ambitious list of directions in which one may want to extend it in the future. This involves an exciting set of problems, relating to both applications and foundational issues.

  15. First Principles Studies of Tapered Silicon Nanowires: Fundamental Insights and Practical Applications

    NASA Astrophysics Data System (ADS)

    Wu, Zhigang

    2008-03-01

    Nanowires (NWs) are often observed experimentally to be tapered rather than straight-edged, with diameters (d) shrinking by as much as 1 nm per 10 nm of vertical growth. Previous theoretical studies have examined the electronic properties of straight-edged nanowires (SNWs), although the effects of tapering on quantum confinement may be of both fundamental and practical importance. We have employed ab initio calculations to study the structural and electronic properties of tapered Si NWs. As one may expect, tapered nanowires (TNWs) possess axially-dependent electronic properties; their local energy gaps vary along the wire axis, with the largest gap occurring at the narrowest point of the wire. In contrast to SNWs, where confinement tends to shift valence bands more than conduction bands away from the bulk gap, the unoccupied states in TNWs are much more sensitive to d than the occupied states. In addition, tapering causes the band-edge states to be spatially separated along the wire axis, a consequence of the interplay between a strong variation in quantum confinement strength with diameter and the tapering-induced charge transfer. This property may be exploited in electronic and optical applications, for example, in photovoltaic devices where the separation of the valence and conduction band states could be used to transport excited charges during the thermalization process. In order to gain insight into TNW photovoltaic properties, we have also carried out calculations of the dipole matrix elements near the band edges as well as the role of metal contacts on TNW electronic properties. Finally, a combination of ab initio total energy calculations and classical molecular dynamics (MD) simulations are employed to suggest a new technique for bringing nanoscale objects together to form ordered, ultra high-aspect ratio nanowires. This work was supported in part by the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.

  16. A Greatly Under-Appreciated Fundamental Principle of Physical Organic Chemistry

    PubMed Central

    Cox, Robin A.

    2011-01-01

    If a species does not have a finite lifetime in the reaction medium, it cannot be a mechanistic intermediate. This principle was first enunciated by Jencks, as the concept of an enforced mechanism. For instance, neither primary nor secondary carbocations have long enough lifetimes to exist in an aqueous medium, so SN1 reactions involving these substrates are not possible, and an SN2 mechanism is enforced. Only tertiary carbocations and those stabilized by resonance (benzyl cations, acylium ions) are stable enough to be reaction intermediates. More importantly, it is now known that neither H3O+ nor HO− exist as such in dilute aqueous solution. Several recent high-level calculations on large proton clusters are unable to localize the positive charge; it is found to be simply “on the cluster” as a whole. The lifetime of any ionized water species is exceedingly short, a few molecular vibrations at most; the best experimental study, using modern IR instrumentation, has the most probable hydrated proton structure as H13O6+, but only an estimated quarter of the protons are present even in this form at any given instant. Thanks to the Grotthuss mechanism of chain transfer along hydrogen bonds, in reality a proton or a hydroxide ion is simply instantly available anywhere it is needed for reaction. Important mechanistic consequences result. Any charged oxygen species (e.g., a tetrahedral intermediate) is also not going to exist long enough to be a reaction intermediate, unless the charge is stabilized in some way, usually by resonance. General acid catalysis is the rule in reactions in concentrated aqueous acids. The Grotthuss mechanism also means that reactions involving neutral water are favored; the solvent is already highly structured, so the entropy involved in bringing several solvent molecules to the reaction center is unimportant. Examples are given. PMID:22272074

  17. Structural phase transitions and fundamental band gaps of MgxZn1-xO alloys from first principles

    SciTech Connect

    Maznichenko, I. V.; Ernst, Arthur; Bouhassoune, M.; Henk, J.; Daene, Markus W; Lueders, Martin; Bruno, Patrick; Wolfam, Hergert; Mertig, I.; Szotek, Zdzislawa; Temmerman, Walter M

    2009-01-01

    The structural phase transitions and the fundamental band gaps of MgxZn1-xO alloys are investigated by detailed first-principles calculations in the entire range of Mg concentrations x, applying a multiple-scattering theoretical approach (Korringa-Kohn-Rostoker method). Disordered alloys are treated within the coherent-potential approximation. The calculations for various crystal phases have given rise to a phase diagram in good agreement with experiments and other theoretical approaches. The phase transition from the wurtzite to the rock-salt structure is predicted at a Mg concentration of x=0.33, which is close to the experimental value of 0.33–0.40. The size of the fundamental band gap, typically underestimated by the local-density approximation, is considerably improved by the self-interaction correction. The increase in the gap upon alloying ZnO with Mg corroborates experimental trends. Our findings are relevant for applications in optical, electrical, and, in particular, magnetoelectric devices.

  18. On classes of non-Gaussian asymptotic minimizers in entropic uncertainty principles

    NASA Astrophysics Data System (ADS)

    Zozor, S.; Vignat, C.

    2007-03-01

    In this paper we revisit the Bialynicki-Birula and Mycielski uncertainty principle and its cases of equality. This Shannon entropic version of the well-known Heisenberg uncertainty principle can be used when dealing with variables that admit no variance. In this paper, we extend this uncertainty principle to Rényi entropies. We recall that in both the Shannon and Rényi cases, and for a given dimension n, the only case of equality occurs for Gaussian random vectors. We show that as n grows, however, the bound is also asymptotically attained in the cases of n-dimensional Student-t and Student-r distributions. A complete analytical study is performed in a special case of a Student-t distribution. We also show numerically that this effect exists for the particular case of an n-dimensional Cauchy variable, whatever the Rényi entropy considered, extending the results of Abe and illustrating the analytical asymptotic study of the Student-t case. In the Student-r case, we show numerically that the same behavior occurs for uniformly distributed vectors. These particular cases and other ones investigated in this paper are interesting since they show that this asymptotic behavior cannot be considered as a "Gaussianization" of the vector when the dimension increases.
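
    For orientation, the Shannon-entropy inequality referred to above (Bialynicki-Birula and Mycielski) can be stated compactly. The form below assumes the symmetric Fourier-transform convention; the n-dimensional bound is saturated exactly by Gaussian vectors, which is the case of equality the paper revisits.

    ```latex
    % Bialynicki-Birula--Mycielski inequality (symmetric Fourier convention):
    % for an n-dimensional wavefunction \psi and its transform \hat{\psi},
    H\!\left[\,|\psi|^{2}\right] + H\!\left[\,|\hat{\psi}|^{2}\right]
      \;\ge\; n\left(1+\ln\pi\right),
    \qquad
    H[\rho] \;=\; -\int \rho(x)\,\ln\rho(x)\,\mathrm{d}x ,
    % with equality if and only if the random vector is Gaussian.
    ```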

  19. Quantum statistical entropy and minimal length of 5D Ricci-flat black string with generalized uncertainty principle

    SciTech Connect

    Liu Molin; Gui Yuanxing; Liu Hongya

    2008-12-15

    In this paper, we study the quantum statistical entropy of a 5D Ricci-flat black string solution, which contains a 4D Schwarzschild-de Sitter black hole on the brane, by using the improved thin-layer method with the generalized uncertainty principle. The entropy is the linear sum of the areas of the event horizon and the cosmological horizon, obtained without any cutoff and without any constraint on the bulk's configuration, in contrast to the case of the usual uncertainty principle. The system's density of states and free energy are convergent in the neighborhood of the horizons. The small-mass approximation is determined by the asymptotic behavior of the metric function near the horizons. Meanwhile, we obtain the minimal length of the position Δx, which is restrained by the surface gravities and the thickness of the layer near the horizons.

  20. First-principles study of the minimal model of magnetic interactions in Fe-based superconductors

    NASA Astrophysics Data System (ADS)

    Glasbrenner, J. K.; Velev, J. P.; Mazin, I. I.

    2014-02-01

    Using noncollinear first-principles calculations, we perform a systematic study of the magnetic order in several families of ferropnictides. We find a fairly universal energy dependence on the magnetization order in all cases. Our results confirm that a simple Heisenberg model fails to account for the energy dependence of the magnetization in a couple of ways: first, a biquadratic term is present in all cases and, second, the magnetic moment softens depending on the orientation. We also find that hole doping substantially reduces the biquadratic contribution, although the antiferromagnetic stripe state remains stable within the whole range of doping concentrations, and thus the reported lack of the orthorhombicity in Na-doped BaFe2As2 is probably due to factors other than a sign reversal of the biquadratic term. Finally, we discover that even with the biquadratic term, there is a limit to the accuracy of mapping the density functional theory energetics onto Heisenberg-type models, independent of the range of the model.

  1. Dielectric properties of agricultural products – fundamental principles, influencing factors, and measurement techniques. Chapter 4. Electrotechnologies for Food Processing: Book Series. Volume 3. Radio-Frequency Heating

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this chapter, definitions of dielectric properties, or permittivity, of materials and a brief discussion of the fundamental principles governing their behavior with respect to influencing factors are presented. The basic physics of the influence of frequency of the electric fields and temperatur...

  2. The inactivation principle: mathematical solutions minimizing the absolute work and biological implications for the planning of arm movements.

    PubMed

    Berret, Bastien; Darlot, Christian; Jean, Frédéric; Pozzo, Thierry; Papaxanthis, Charalambos; Gauthier, Jean Paul

    2008-10-01

    An important question in the literature focusing on motor control is to determine which laws drive biological limb movements. This question has prompted numerous investigations analyzing arm movements in both humans and monkeys. Many theories assume that among all possible movements the one actually performed satisfies an optimality criterion. In the framework of optimal control theory, a first approach is to choose a cost function and test whether the proposed model fits with experimental data. A second approach (generally considered as the more difficult) is to infer the cost function from behavioral data. The cost proposed here includes a term called the absolute work of forces, reflecting the mechanical energy expenditure. Contrary to most investigations studying optimality principles of arm movements, this model has the particularity of using a cost function that is not smooth. First, a mathematical theory related to both direct and inverse optimal control approaches is presented. The first theoretical result is the Inactivation Principle, according to which minimizing a term similar to the absolute work implies simultaneous inactivation of agonistic and antagonistic muscles acting on a single joint, near the time of peak velocity. The second theoretical result is that, conversely, the presence of non-smoothness in the cost function is a necessary condition for the existence of such inactivation. Second, during an experimental study, participants were asked to perform fast vertical arm movements with one, two, and three degrees of freedom. Observed trajectories, velocity profiles, and final postures were accurately simulated by the model. In accordance, electromyographic signals showed brief simultaneous inactivation of opposing muscles during movements. Thus, assuming that human movements are optimal with respect to a certain integral cost, the minimization of an absolute-work-like cost is supported by experimental observations. Such types of optimality
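
    The non-smooth cost at the heart of the Inactivation Principle can be made concrete. The expression below is a hedged sketch of an "absolute work of forces" term of the kind described above, in my notation (joint torques τ_i, joint velocities θ̇_i); the paper's full cost includes additional terms.

    ```latex
    % Sketch of an absolute-work term (notation assumed, not the paper's exact cost):
    A_w \;=\; \int_{0}^{T} \sum_{i} \bigl|\, \tau_i(t)\, \dot{\theta}_i(t) \,\bigr| \,\mathrm{d}t .
    % The absolute value makes the integrand non-smooth; by the paper's second
    % theoretical result, such non-smoothness is a necessary condition for the
    % simultaneous inactivation of opposing muscles near peak velocity.
    ```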

  3. Locomotor control of limb force switches from minimal intervention principle in early adaptation to noise reduction in late adaptation

    PubMed Central

    Selgrade, Brian P.

    2014-01-01

    During movement, errors are typically corrected only if they hinder performance. Preferential correction of task-relevant deviations is described by the minimal intervention principle but has not been demonstrated in the joints during locomotor adaptation. We studied hopping as a tractable model of locomotor adaptation of the joints within the context of a limb-force-specific task space. Subjects hopped while adapting to shifted visual feedback that induced them to increase peak ground reaction force (GRF). We hypothesized subjects would preferentially reduce task-relevant joint torque deviations over task-irrelevant deviations to increase peak GRF. We employed a modified uncontrolled manifold analysis to quantify task-relevant and task-irrelevant joint torque deviations for each individual hop cycle. As would be expected by the explicit goal of the task, peak GRF errors decreased in early adaptation before reaching steady state during late adaptation. Interestingly, during the early adaptation performance improvement phase, subjects reduced GRF errors by decreasing only the task-relevant joint torque deviations. In contrast, during the late adaption performance maintenance phase, all torque deviations decreased in unison regardless of task relevance. In deadaptation, when the shift in visual feedback was removed, all torque deviations decreased in unison, possibly because performance improvement was too rapid to detect changes in only the task-relevant dimension. We conclude that limb force adaptation in hopping switches from a minimal intervention strategy during performance improvement to a noise reduction strategy during performance maintenance, which may represent a general control strategy for locomotor adaptation of limb force in other bouncing gaits, such as running. PMID:25475343
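
    The task-relevant/task-irrelevant split used above follows from a null-space decomposition. The sketch below shows the linear-algebra core of an uncontrolled-manifold style analysis, assuming a made-up 1x3 Jacobian mapping joint-torque deviations (ankle, knee, hip) to the ground reaction force; it illustrates only the projection step, not the authors' full method.

    ```python
    import numpy as np

    # Deviations along the Jacobian of the torque-to-GRF map change the task
    # variable (task-relevant); deviations in its null space do not.
    J = np.array([[0.8, 1.2, 0.5]])          # d(GRF)/d(torque), hypothetical
    dev = np.array([0.10, -0.05, 0.08])      # one hop's torque deviation from mean

    # Orthogonal projectors onto the row space of J and its null space.
    P_task = J.T @ np.linalg.pinv(J.T)       # projector onto task-relevant space
    P_null = np.eye(3) - P_task              # projector onto task-irrelevant space

    task_relevant = P_task @ dev
    task_irrelevant = P_null @ dev

    print("task-relevant   norm:", np.linalg.norm(task_relevant).round(4))
    print("task-irrelevant norm:", np.linalg.norm(task_irrelevant).round(4))
    ```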

  4. Lessons that Bear Repeating and Repeating that Bears Lessons: An Interdisciplinary Unit on Principles of Minimalism in Modern Music, Art, and Poetry (Grades 4-8)

    ERIC Educational Resources Information Center

    Smigel, Eric; McDonald, Nan L.

    2012-01-01

    This theory-to-practice article focuses on interdisciplinary classroom activities based on principles of minimalism in modern music, art, and poetry. A lesson sequence was designed for an inner-city Grades 4 and 5 general classroom of English language learners, where the unit was taught, assessed, and documented by the authors. Included in the…

  5. A review of the fundamentals of polymer-modified asphalts: Asphalt/polymer interactions and principles of compatibility.

    PubMed

    Polacco, Giovanni; Filippi, Sara; Merusi, Filippo; Stastna, George

    2015-10-01

    During the last decades, the number of vehicles per citizen, as well as traffic speed and load, has increased dramatically. This sudden and somewhat unplanned overloading has strongly shortened the life of pavements and increased maintenance costs and risks to users. In order to limit the deterioration of road networks, it is necessary to improve the quality and performance of pavements, which has been achieved through the addition of a polymer to the bituminous binder. Since their introduction in the second half of the twentieth century, polymer-modified asphalts have gained steadily in importance, and they now play a fundamental role in the field of road paving. During high-temperature, high-shear mixing with asphalt, the polymer incorporates asphalt molecules, thereby forming a swollen network that involves the entire binder and results in a significant improvement of the viscoelastic properties in comparison with those of the unmodified binder. Such a process encounters the well-known difficulties related to the poor solubility of polymers, which limits the number of macromolecules able not only to form such a structure but also to maintain it during high-temperature storage in static conditions, which may be necessary before laying the binder. Therefore, polymer-modified asphalts have been the subject of numerous studies aimed at understanding and optimizing their structure and storage stability, which gradually attracted polymer scientists into this field initially explored by civil engineers. The analytical techniques of polymer science have been applied to polymer-modified asphalts, resulting in a good understanding of their internal structure. Nevertheless, the complexity and variability of asphalt composition render it nearly impossible to generalize the results and univocally predict the properties of a given polymer/asphalt pair. The aim of this paper is to review these aspects of polymer-modified asphalts. Together with a brief description of

  6. PET/CT: fundamental principles.

    PubMed

    Seemann, Marcus D

    2004-05-28

    Positron emission tomography (PET) facilitates the evaluation of metabolic and molecular characteristics of a wide variety of cancers, but is limited in its ability to visualize anatomical structures. Computed tomography (CT) facilitates the evaluation of the anatomical structures of cancers, but cannot visualize their metabolic and molecular aspects. Therefore, the combination of PET and CT provides the ability to accurately register metabolic and molecular aspects of disease with anatomical findings, adding further information to the diagnosis and staging of tumors. The recent generation of high-performance PET/CT scanners combines a state-of-the-art full-ring 3D PET scanner and a high-end 16-slice CT scanner. In PET/CT scanners, a CT examination is used for attenuation correction of the PET images rather than standard transmission scanning using 68Ge sources. This reduces the examination time, but metallic objects and contrast agents that alter CT image quality and the quantitative measurement of standardized uptake values (SUV) may lead to artifacts in the PET images. Hybrid PET/CT imaging will be very important in oncological applications in the decades to come, and possibly for use in cancer screening and cardiac imaging. PMID:15257877
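
    CT-based attenuation correction, mentioned above, hinges on converting CT numbers to linear attenuation coefficients at the PET photon energy of 511 keV. The sketch below uses the widely described bilinear mapping (one slope for soft tissue, another where bone dominates); the breakpoint and slopes are typical literature-style values assumed for illustration, not values from this paper.

    ```python
    import numpy as np

    # Bilinear CT-number to 511 keV attenuation mapping (illustrative values).
    MU_WATER_511 = 0.096   # cm^-1 at 511 keV (approximate)

    def hu_to_mu511(hu, breakpoint=50.0, soft_slope=9.6e-5, bone_slope=5.1e-5):
        """Map Hounsfield units to mu at 511 keV with a bilinear function."""
        hu = np.asarray(hu, dtype=float)
        mu_soft = MU_WATER_511 + soft_slope * hu   # air/water/soft-tissue region
        mu_bone = MU_WATER_511 + bone_slope * hu   # shallower slope above breakpoint
        return np.where(hu <= breakpoint, mu_soft, mu_bone)

    # air, water, soft tissue, dense bone
    print(hu_to_mu511([-1000, 0, 40, 1000]))
    ```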

  7. Animal and robot experiments to discover principles behind the evolution of a minimal locomotor apparatus for robust legged locomotion

    NASA Astrophysics Data System (ADS)

    McInroe, Benjamin; Astley, Henry; Kawano, Sandy; Blob, Richard; Goldman, Daniel I.

    2015-03-01

    In the evolutionary transition from an aquatic to a terrestrial environment, early walkers adapted to the challenges of locomotion on complex, flowable substrates (e.g. sand and mud). Our previous biological and robotic studies have demonstrated that locomotion on such substrates is sensitive to both limb morphology and kinematics. Although reconstructions of early vertebrate skeletal morphologies exist, the kinematic strategies required for successful locomotion by these organisms have not yet been explored. To gain insight into how early walkers contended with complex substrates, we developed a robotic model with appendage morphology inspired by a model analog organism, the mudskipper. We tested mudskippers and the robot on different substrates, including rigid ground and dry granular media, varying incline angle. The mudskippers moved effectively on all level substrates using a fin-driven gait. But as incline angle increased, the animals used their tails in concert with their fins to generate propulsion. Adding an actuated tail to the robot improved robustness, making possible locomotion on otherwise inaccessible inclines. With these discoveries, we are elucidating a minimal template that may have allowed the early walkers to adapt to locomotion on land. This work was supported by NSF PoLS.

  8. Transplant of bone marrow and cord blood hematopoietic stem cells in pediatric practice, revisited according to the fundamental principles of bioethics.

    PubMed

    Burgio, G R; Locatelli, F

    1997-06-01

    The two most widely used sources of hematopoietic stem cells for allogeneic transplants in pediatric practice are bone marrow (BM) and cord blood (CB). While bone marrow transplantation (BMT) is reaching its 30th year of application, human umbilical cord blood transplantation (HUCBT) is approaching its 10th. Although these procedures have basically the same purpose, a number of biological differences distinguish them. In particular, the intrinsically limited quantity of CB stem cells and their immunological naiveté confer peculiar characteristics to these hematopoietic progenitors. From a bioethical point of view, the problems which have repeatedly been raised when the BM donor is a child are well-known. Different but no less important ethical problems are raised when one considers HUCBT; in this regard the most important issues are the easier propensity of programming a CB donor in comparison with a BM donor (clearly due to the shorter time interval needed to collect the hematopoietic progenitors); the in utero HLA-typing; the implication of employing 'blood belonging to a neonate' for a third party; the need to perform a number of investigations both on the CB of the donor and on the mother and the implications that the discovery of disease may have for them, but also the need to establish banks for storing CB, with the accompanying administration and management problems. All these different aspects of UCBT will be discussed in the light of the four fundamental and traditional principles of bioethics, namely autonomy, nonmaleficence, beneficence and justice. PMID:9208108

  9. Buridan's Principle

    NASA Astrophysics Data System (ADS)

    Lamport, Leslie

    2012-08-01

    Buridan's principle asserts that a discrete decision based upon input having a continuous range of values cannot be made within a bounded length of time. It appears to be a fundamental law of nature. Engineers aware of it can design devices so that they have an infinitesimal probability of not making a decision quickly enough. Ignorance of the principle could have serious consequences.

  10. Fundamentals of Diesel Engines.

    ERIC Educational Resources Information Center

    Marine Corps Inst., Washington, DC.

    This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the fundamentals of diesel engine mechanics. Addressed in the three individual units of the course are the following topics: basic principles of diesel mechanics; principles, mechanics, and…

  11. Use of minimal invasive extracorporeal circulation in cardiac surgery: principles, definitions and potential benefits. A position paper from the Minimal invasive Extra-Corporeal Technologies international Society (MiECTiS).

    PubMed

    Anastasiadis, Kyriakos; Murkin, John; Antonitsis, Polychronis; Bauer, Adrian; Ranucci, Marco; Gygax, Erich; Schaarschmidt, Jan; Fromes, Yves; Philipp, Alois; Eberle, Balthasar; Punjabi, Prakash; Argiriadou, Helena; Kadner, Alexander; Jenni, Hansjoerg; Albrecht, Guenter; van Boven, Wim; Liebold, Andreas; de Somer, Fillip; Hausmann, Harald; Deliopoulos, Apostolos; El-Essawi, Aschraf; Mazzei, Valerio; Biancari, Fausto; Fernandez, Adam; Weerwind, Patrick; Puehler, Thomas; Serrick, Cyril; Waanders, Frans; Gunaydin, Serdar; Ohri, Sunil; Gummert, Jan; Angelini, Gianni; Falk, Volkmar; Carrel, Thierry

    2016-05-01

    Minimal invasive extracorporeal circulation (MiECC) systems have initiated important efforts within science and technology to further improve the biocompatibility of cardiopulmonary bypass components, to minimize adverse effects and improve end-organ protection. The Minimal invasive Extra-Corporeal Technologies international Society was founded to create an international forum for the exchange of ideas on the clinical application of, and research on, minimal invasive extracorporeal circulation technology. The present work is a consensus document developed to standardize the terminology and definition of minimal invasive extracorporeal circulation technology, as well as to provide recommendations for clinical practice. The goal of this manuscript is to promote the adoption of MiECC systems in clinical practice as a multidisciplinary strategy involving cardiac surgeons, anaesthesiologists and perfusionists. PMID:26819269

  12. Quantum computing with photons: introduction to the circuit model, the one-way quantum computer, and the fundamental principles of photonic experiments

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie

    2015-04-01

    Quantum physics has revolutionized our understanding of information processing and enables computational speed-ups that are unattainable using classical computers. This tutorial reviews the fundamental tools of photonic quantum information processing. The basics of theoretical quantum computing are presented and the quantum circuit model as well as measurement-based models of quantum computing are introduced. Furthermore, it is shown how these concepts can be implemented experimentally using photonic qubits, where information is encoded in the photons’ polarization.
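
    The polarization encoding mentioned above has a compact linear-algebra description. The sketch below is a toy illustration, not the tutorial's code: |H> and |V> as basis states, a half-wave plate at 22.5 degrees acting as a Hadamard gate (up to a global phase), and measurement probabilities in the H/V basis.

    ```python
    import numpy as np

    # Polarization qubit: |H> = |0>, |V> = |1>; waveplates act as Jones matrices.
    H = np.array([1.0, 0.0], dtype=complex)   # horizontal polarization
    V = np.array([0.0, 1.0], dtype=complex)   # vertical polarization

    def half_wave_plate(theta):
        """Jones matrix of a half-wave plate with fast axis at angle theta."""
        c, s = np.cos(2 * theta), np.sin(2 * theta)
        return np.array([[c, s], [s, -c]], dtype=complex)

    hadamard = half_wave_plate(np.pi / 8)   # HWP at 22.5 deg == Hadamard (up to phase)
    psi = hadamard @ H                      # (|H> + |V>) / sqrt(2)

    probs = np.abs(psi) ** 2                # Born rule in the H/V basis
    print("P(H), P(V):", probs.round(3))    # 0.5, 0.5
    ```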

  13. Fundamentals of fluid lubrication

    NASA Technical Reports Server (NTRS)

    Hamrock, Bernard J.

    1991-01-01

    The aim is to coordinate the topics of design, engineering dynamics, and fluid dynamics in order to aid researchers in the area of fluid film lubrication. The lubrication principles that are covered can serve as a basis for the engineering design of machine elements. The fundamentals of fluid film lubrication are presented clearly so that students that use the book will have confidence in their ability to apply these principles to a wide range of lubrication situations. Some guidance on applying these fundamentals to the solution of engineering problems is also provided.

  14. Homeschooling and Religious Fundamentalism

    ERIC Educational Resources Information Center

    Kunzman, Robert

    2010-01-01

    This article considers the relationship between homeschooling and religious fundamentalism by focusing on their intersection in the philosophies and practices of conservative Christian homeschoolers in the United States. Homeschooling provides an ideal educational setting to support several core fundamentalist principles: resistance to…

  15. Fundamental electrode kinetics

    NASA Technical Reports Server (NTRS)

    Elder, J. P.

    1968-01-01

    Report presents the fundamentals of electrode kinetics and the methods used in evaluating the characteristic parameters of rapid-charge transfer processes at electrode-electrolyte interfaces. The concept of electrode kinetics is outlined, followed by the principles underlying the experimental techniques for the investigation of electrode kinetics.

  16. Fundamental Reaction Pathway for Peptide Metabolism by Proteasome: Insights from First-principles Quantum Mechanical/Molecular Mechanical Free Energy Calculations

    PubMed Central

    Wei, Donghui; Fang, Lei; Tang, Mingsheng; Zhan, Chang-Guo

    2013-01-01

    Proteasome is the major component of the crucial nonlysosomal protein degradation pathway in the cells, but the detailed reaction pathway is unclear. In this study, first-principles quantum mechanical/molecular mechanical free energy calculations have been performed to explore, for the first time, possible reaction pathways for proteasomal proteolysis/hydrolysis of a representative peptide, succinyl-leucyl-leucyl-valyl-tyrosyl-7-amino-4-methylcoumarin (Suc-LLVY-AMC). The computational results reveal that the most favorable reaction pathway consists of six steps. The first is a water-assisted proton transfer within proteasome, activating Thr1-Oγ. The second is a nucleophilic attack on the carbonyl carbon of a Tyr residue of substrate by the negatively charged Thr1-Oγ, followed by the dissociation of the amine AMC (third step). The fourth step is a nucleophilic attack on the carbonyl carbon of the Tyr residue of substrate by a water molecule, accompanied by a proton transfer from the water molecule to Thr1-Nz. Then, Suc-LLVY is dissociated (fifth step), and Thr1 is regenerated via a direct proton transfer from Thr1-Nz to Thr1-Oγ. According to the calculated energetic results, the overall reaction energy barrier of the proteasomal hydrolysis is associated with the transition state (TS3b) for the third step involving a water-assisted proton transfer. The determined most favorable reaction pathway and the rate-determining step have provided a reasonable interpretation of the reported experimental observations concerning the substituent and isotopic effects on the kinetics. The calculated overall free energy barrier of 18.2 kcal/mol is close to the experimentally-derived activation free energy of ~18.3–19.4 kcal/mol, suggesting that the computational results are reasonable. PMID:24111489

  17. Minimal cosmography

    NASA Astrophysics Data System (ADS)

    Piazza, Federico; Schücker, Thomas

    2016-04-01

    The minimal requirement for cosmography—a non-dynamical description of the universe—is a prescription for calculating null geodesics, and time-like geodesics as a function of their proper time. In this paper, we consider the most general linear connection compatible with homogeneity and isotropy, but not necessarily with a metric. A light-cone structure is assigned by choosing a set of geodesics representing light rays. This defines a "scale factor" and a local notion of distance, as that travelled by light in a given proper time interval. We find that the velocities and relativistic energies of free-falling bodies decrease in time as a consequence of cosmic expansion, but at a rate that can be different than that dictated by the usual metric framework. By extrapolating this behavior to photons' redshift, we find that the latter is in principle independent of the "scale factor". Interestingly, redshift-distance relations and other standard geometric observables are modified in this extended framework, in a way that could be experimentally tested. An extremely tight constraint on the model, however, is represented by the blackbody-ness of the cosmic microwave background. Finally, as a check, we also consider the effects of a non-metric connection in a different set-up, namely, that of a static, spherically symmetric spacetime.

  18. Monte Carlo fundamentals

    SciTech Connect

    Brown, F.B.; Sutton, T.M.

    1996-02-01

    This report is composed of the lecture notes from the first half of a 32-hour graduate-level course on Monte Carlo methods offered at KAPL. These notes, prepared by two of the principal developers of KAPL's RACER Monte Carlo code, cover the fundamental theory, concepts, and practices for Monte Carlo analysis. In particular, a thorough grounding in the basic fundamentals of Monte Carlo methods is presented, including random number generation, random sampling, the Monte Carlo approach to solving transport problems, computational geometry, collision physics, tallies, and eigenvalue calculations. Furthermore, modern computational algorithms for vector and parallel approaches to Monte Carlo calculations are covered in detail, including fundamental parallel and vector concepts, the event-based algorithm, master/slave schemes, parallel scaling laws, and portability issues.
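
    As a minimal illustration of the random-sampling fundamentals listed above, the sketch below estimates uncollided transmission through a slab by sampling exponential free paths, d = -ln(U)/Σt, and compares the tally with the analytic answer. The cross section and thickness are illustrative values, not taken from the lecture notes.

    ```python
    import math
    import random

    # Estimate uncollided transmission exp(-Sigma_t * T) through a slab by
    # sampling exponential free paths for each source particle.
    SIGMA_T = 0.5      # total macroscopic cross section, cm^-1 (illustrative)
    THICKNESS = 3.0    # slab thickness, cm (illustrative)
    N = 100_000

    random.seed(1)
    transmitted = sum(
        1 for _ in range(N)
        # 1 - U is in (0, 1], so the logarithm is always defined.
        if -math.log(1.0 - random.random()) / SIGMA_T > THICKNESS
    )

    print("MC estimate:", transmitted / N)
    print("analytic:   ", math.exp(-SIGMA_T * THICKNESS))
    ```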

  19. A Yoga Strengthening Program Designed to Minimize the Knee Adduction Moment for Women with Knee Osteoarthritis: A Proof-Of-Principle Cohort Study

    PubMed Central

    2015-01-01

    People with knee osteoarthritis may benefit from exercise prescriptions that minimize knee loads in the frontal plane. The primary objective of this study was to determine whether a novel 12-week strengthening program designed to minimize exposure to the knee adduction moment (KAM) could improve symptoms and knee strength in women with symptomatic knee osteoarthritis. A secondary objective was to determine whether the program could improve mobility and fitness, and decrease peak KAM during gait. The tertiary objective was to evaluate the biomechanical characteristics of this yoga program. In particular, we compared the peak KAM during gait with that during yoga postures at baseline. We also compared lower limb normalized mean electromyography (EMG) amplitudes during yoga postures between baseline and follow-up. Primary measures included self-reported pain and physical function (Knee injury and Osteoarthritis Outcome Score) and knee strength (extensor and flexor torques). Secondary measures included mobility (six-minute walk, 30-second chair stand, stair climbing), fitness (submaximal cycle ergometer test), and clinical gait analysis using motion capture synchronized with electromyography and force measurement. Also, KAM and normalized mean EMG amplitudes were collected during yoga postures. Forty-five women over age 50 with symptomatic knee osteoarthritis, consistent with the American College of Rheumatology criteria, enrolled in our 12-week (3 sessions per week) program. Data from 38 were analyzed (six drop-outs; one lost to co-intervention). Participants experienced reduced pain (mean improvement 10.1–20.1 normalized to 100; p<0.001), increased knee extensor strength (mean improvement 0.01 Nm/kg; p = 0.004), and increased flexor strength (mean improvement 0.01 Nm/kg; p = 0.001) at follow-up compared to baseline. Participants improved mobility on the six-minute walk (mean improvement 37.7 m; p<0.001) and 30-second chair stand (mean improvement 1.3; p = 0.006) at

  20. Marketing fundamentals.

    PubMed

    Redmond, W H

    2001-01-01

    This chapter outlines current marketing practice from a managerial perspective. The role of marketing within an organization is discussed in relation to efficiency and adaptation to changing environments. Fundamental terms and concepts are presented in an applied context. The implementation of marketing plans is organized around the four P's of marketing: product (or service), promotion (including advertising), place of delivery, and pricing. These are the tools with which marketers seek to better serve their clients and form the basis for competing with other organizations. Basic concepts of strategic relationship management are outlined. Lastly, alternate viewpoints on the role of advertising in healthcare markets are examined. PMID:11401791

  1. Fundamentals of fossil simulator instructor training

    SciTech Connect

    Not Available

    1984-01-01

    This single-volume, looseleaf text introduces the beginning instructor to fundamental instructor training principles, and then shows how to apply those principles to fossil simulator training. Topics include the fundamentals of classroom instruction, the learning process, course development, and the specifics of simulator training program development.

  2. Ethics fundamentals.

    PubMed

    Chambers, David W

    2011-01-01

    Ethics is about studying the right and the good; morality is about acting as one should. Although there are differences among what is legal, charitable, professional, ethical, and moral, these desirable characteristics tend to cluster and are treasured in dentistry. The traditional approach to professionalism in dentistry is based on a theory of biomedical ethics advanced 30 years ago. Known as the principles approach, general ideals such as respect for autonomy, nonmaleficence, beneficence, justice, and veracity, are offered as guides. Growth in professionalism consists in learning to interpret the application of these principles as one's peers do. Moral behavior is conceived as a continuous cycle of sensitivity to situations requiring moral response, moral reasoning, the moral courage to take action when necessary, and integration of habits of moral behavior into one's character. This essay is the first of two papers that provide the backbone for the IDEA Project of the College--an online, multiformat, interactive "textbook" of ethics for the profession. PMID:22263371

  3. VCSEL Fundamentals

    NASA Astrophysics Data System (ADS)

    Michalzik, Rainer

    In this chapter we outline major principles of vertical-cavity surface-emitting laser (VCSEL) design and operation. Basic device properties and generally applicable cavity design rules are introduced. Characteristic parameters like threshold gain and current, differential quantum efficiency and power conversion efficiency, as well as thermal resistance are discussed. We describe the design of Bragg reflectors and explain the transfer matrix method as a convenient tool to compute VCSEL resonator properties in a one-dimensional approximation. Experimental results illustrate the emission characteristics of high-efficiency VCSELs that apply selective oxidation for current and photon confinement. Both the 850 and 980 nm wavelength regions are considered. The basic treatment of laser dynamics and noise behavior is presented in terms of the small-signal modulation response as well as the relative intensity noise. Finally we give some examples of VCSEL applications in fiber-based optical interconnects, i.e., optical data transmission over short distances.
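
    The transfer matrix method mentioned above reduces, at normal incidence, to multiplying 2x2 characteristic matrices. The sketch below computes the reflectance of a quarter-wave Bragg mirror with generic AlGaAs/GaAs-like indices; all numerical values are assumptions for illustration, not a specific VCSEL design.

    ```python
    import numpy as np

    def layer_matrix(n, d, lam):
        """Characteristic matrix of a homogeneous layer at normal incidence."""
        delta = 2 * np.pi * n * d / lam
        return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                         [1j * n * np.sin(delta), np.cos(delta)]])

    def dbr_reflectance(n_hi=3.5, n_lo=3.0, pairs=20, lam0=980e-9,
                        n_in=1.0, n_sub=3.5, lam=980e-9):
        d_hi, d_lo = lam0 / (4 * n_hi), lam0 / (4 * n_lo)   # quarter-wave layers
        M = np.eye(2, dtype=complex)
        for _ in range(pairs):
            M = M @ layer_matrix(n_hi, d_hi, lam) @ layer_matrix(n_lo, d_lo, lam)
        # Reflection coefficient of the stack between n_in and n_sub.
        num = n_in * M[0, 0] + n_in * n_sub * M[0, 1] - M[1, 0] - n_sub * M[1, 1]
        den = n_in * M[0, 0] + n_in * n_sub * M[0, 1] + M[1, 0] + n_sub * M[1, 1]
        return abs(num / den) ** 2

    print(f"R at design wavelength: {dbr_reflectance():.4f}")   # close to 1
    ```

    Sweeping lam around lam0 with this function traces out the mirror stop band, which is the kind of one-dimensional resonator property the chapter computes with this method.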

  4. How fundamental are fundamental constants?

    NASA Astrophysics Data System (ADS)

    Duff, M. J.

    2015-01-01

    I argue that the laws of physics should be independent of one's choice of units or measuring apparatus. This is the case if they are framed in terms of dimensionless numbers such as the fine structure constant, α. For example, the standard model of particle physics has 19 such dimensionless parameters whose values all observers can agree on, irrespective of what clocks, rulers or scales they use to measure them. Dimensional constants, on the other hand, such as ħ, c, G, e and k_B, are merely human constructs whose number and values differ from one choice of units to the next. In this sense, only dimensionless constants are 'fundamental'. Similarly, the possible time variation of dimensionless fundamental 'constants' of nature is operationally well defined and a legitimate subject of physical enquiry. By contrast, the time variation of dimensional constants such as c or G, on which a good many (in my opinion, confusing) papers have been written, is a unit-dependent phenomenon on which different observers might disagree depending on their apparatus. All these confusions disappear if one asks only unit-independent questions. We provide a selection of opposing opinions in the literature and respond accordingly.

  5. Minimal metabolic pathway structure is consistent with associated biomolecular interactions

    PubMed Central

    Bordbar, Aarash; Nagarajan, Harish; Lewis, Nathan E; Latif, Haythem; Ebrahim, Ali; Federowicz, Stephen; Schellenberger, Jan; Palsson, Bernhard O

    2014-01-01

    Pathways are a universal paradigm for functionally describing cellular processes. Even though advances in high-throughput data generation have transformed biology, the core of our biological understanding, and hence data interpretation, is still predicated on human-defined pathways. Here, we introduce an unbiased, pathway structure for genome-scale metabolic networks defined based on principles of parsimony that do not mimic canonical human-defined textbook pathways. Instead, these minimal pathways better describe multiple independent pathway-associated biomolecular interaction datasets suggesting a functional organization for metabolism based on parsimonious use of cellular components. We use the inherent predictive capability of these pathways to experimentally discover novel transcriptional regulatory interactions in Escherichia coli metabolism for three transcription factors, effectively doubling the known regulatory roles for Nac and MntR. This study suggests an underlying and fundamental principle in the evolutionary selection of pathway structures; namely, that pathways may be minimal, independent, and segregated. PMID:24987116

  6. Minimal metabolic pathway structure is consistent with associated biomolecular interactions.

    PubMed

    Bordbar, Aarash; Nagarajan, Harish; Lewis, Nathan E; Latif, Haythem; Ebrahim, Ali; Federowicz, Stephen; Schellenberger, Jan; Palsson, Bernhard O

    2014-01-01

    Pathways are a universal paradigm for functionally describing cellular processes. Even though advances in high-throughput data generation have transformed biology, the core of our biological understanding, and hence data interpretation, is still predicated on human-defined pathways. Here, we introduce an unbiased pathway structure for genome-scale metabolic networks defined based on principles of parsimony that do not mimic canonical human-defined textbook pathways. Instead, these minimal pathways better describe multiple independent pathway-associated biomolecular interaction datasets, suggesting a functional organization for metabolism based on parsimonious use of cellular components. We use the inherent predictive capability of these pathways to experimentally discover novel transcriptional regulatory interactions in Escherichia coli metabolism for three transcription factors, effectively doubling the known regulatory roles for Nac and MntR. This study suggests an underlying and fundamental principle in the evolutionary selection of pathway structures; namely, that pathways may be minimal, independent, and segregated. PMID:24987116

  7. Does osteoderm growth follow energy minimization principles?

    PubMed

    Sensale, Sebastián; Jones, Washington; Blanco, R Ernesto

    2014-08-01

    Although the growth and development of tissues and organs of extinct species cannot be directly observed, their fossils can record and preserve evidence of these mechanisms. It is generally accepted that bone architecture is the result of genetically based biomechanical constraints, but what about osteoderms? In this article, the influence of physical constraints on cranial osteoderm growth is assessed. Comparisons among lepidosaurs, synapsids, and archosaurs are performed; according to these analyses, lepidosaur osteoderm growth is predicted to be less energy demanding than that of synapsids and archosaurs. The results also show that, from an energetic viewpoint, ankylosaurid osteoderm growth resembles that of mammals more than that of reptiles, adding evidence to the debate over whether dinosaurs were hot- or cold-blooded. PMID:24634089

  8. Commentary: Minimizing Evaluation Misuse as Principled Practice

    ERIC Educational Resources Information Center

    Cousins, J. Bradley

    2004-01-01

    "Ethical Challenges," in my experience, is invariably interesting, often instructive and sometimes amusing. Some of the most engaging stimulus scenarios raise thorny evaluation practice issues that ultimately lead to disparate points of view about the nature of the issue and how to handle it (Datta, 2002; Smith, 2002). Despite my poor performance…

  9. Minimal length uncertainty and accelerating universe

    NASA Astrophysics Data System (ADS)

    Farmany, A.; Mortazavi, S. S.

    2016-06-01

    In this paper, minimal length uncertainty is used as a constraint to solve the Friedmann equation. It is shown that, based on the minimal length uncertainty principle, the Hubble scale is decreasing, which corresponds to an accelerating universe.

  10. Fundamentals of Environmental Education. Report.

    ERIC Educational Resources Information Center

    1976

    An outline of fundamental definitions, relationships, and human responsibilities related to environment provides a basis from which a variety of materials, programs, and activities can be developed. The outline can be used in elementary, secondary, higher education, or adult education programs. The framework is based on principles of the science…

  11. Fundamentals of Structural Geology

    NASA Astrophysics Data System (ADS)

    Pollard, David D.; Fletcher, Raymond C.

    2005-09-01

    Fundamentals of Structural Geology provides a new framework for the investigation of geological structures by integrating field mapping and mechanical analysis. Assuming a basic knowledge of physical geology, introductory calculus and physics, it emphasizes the observational data, modern mapping technology, principles of continuum mechanics, and the mathematical and computational skills necessary to quantitatively map, describe, model, and explain deformation in Earth's lithosphere. By starting from the fundamental conservation laws of mass and momentum, the constitutive laws of material behavior, and the kinematic relationships for strain and rate of deformation, the authors demonstrate the relevance of solid and fluid mechanics to structural geology. This book offers a modern quantitative approach to structural geology for advanced students and researchers in structural geology and tectonics. It integrates field mapping using modern technology with the analysis of structures based on a complete mechanics. MATLAB is used to visualize physical fields and analytical results, and MATLAB scripts can be downloaded from the supplementary website to recreate textbook graphics and enable students to explore their choice of parameters and boundary conditions. The website also hosts color images of outcrop photographs used in the text, supplementary color images, and images of textbook figures for classroom presentations, together with student exercises designed to instill the fundamental relationships and to encourage visualization of the evolution of geological structures; solutions to the exercises are available to instructors.

  12. Evolutionary principles and their practical application

    PubMed Central

    Hendry, Andrew P; Kinnison, Michael T; Heino, Mikko; Day, Troy; Smith, Thomas B; Fitt, Gary; Bergstrom, Carl T; Oakeshott, John; Jørgensen, Peter S; Zalucki, Myron P; Gilchrist, George; Southerton, Simon; Sih, Andrew; Strauss, Sharon; Denison, Robert F; Carroll, Scott P

    2011-01-01

    Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently in these different fields, even though the underlying fundamental concepts are the same. We explore these fundamental concepts under four main themes: variation, selection, connectivity, and eco-evolutionary dynamics. Within each theme, we present several key evolutionary principles and illustrate their use in addressing applied problems. We hope that the resulting primer of evolutionary concepts and their practical utility helps to advance a unified multidisciplinary field of applied evolutionary biology. PMID:25567966

  13. Dynamic sealing principles

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamental principles governing dynamic sealing operation are discussed. Different seals are described in terms of these principles. Despite the large variety of detailed constructions, there appear to be some basic principles, or combinations of basic principles, by which all seals function; these are presented and discussed. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin film seals, which enable leakage calculations to be made, are also presented. The concept of flow functions, the application of the Reynolds lubrication equation, non-lubrication-equation flow, friction and wear, and seal lubrication regimes are explained.
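
    The leakage relations referred to above are, in the laminar regime, of Poiseuille type. A minimal sketch assuming the standard parallel-plate thin-film approximation (the parameter values are hypothetical, not figures from the report):

```python
def laminar_leakage(dp, h, w, L, mu):
    """Thin-film laminar (Poiseuille) leakage: Q = w * h**3 * dp / (12 * mu * L).

    dp: pressure difference (Pa), h: film thickness (m), w: leakage-path width (m),
    L: land length in the flow direction (m), mu: dynamic viscosity (Pa s).
    """
    return w * h**3 * dp / (12.0 * mu * L)

# Hypothetical example: 1 MPa across a 5-micron film, 10 mm path width, 2 mm land, oil.
Q = laminar_leakage(dp=1.0e6, h=5e-6, w=0.010, L=0.002, mu=0.05)
print(f"leakage ~ {Q * 1e9:.2f} mm^3/s")
```

    The cubic dependence on film thickness is why small changes in surface separation dominate seal leakage.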

  14. The minimal work cost of information processing.

    PubMed

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-01-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics. PMID:26151678
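
    For orientation, the bound that this paper refines is Landauer's: erasing one bit at temperature T costs at least k_B T ln 2 of average work (the paper sharpens this to k_B T ln 2 times the entropy of the discarded information conditional on the output). A sketch of the classic bound only, offered as an editorial illustration:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the SI)

def landauer_bound(bits_erased, T):
    """Minimal average work (J) to erase `bits_erased` bits at temperature T (K)."""
    return bits_erased * k_B * T * math.log(2)

# Erasing one bit at room temperature:
print(f"{landauer_bound(1, 300):.3e} J")  # ~2.87e-21 J
```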

  15. The minimal work cost of information processing

    PubMed Central

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-01-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics. PMID:26151678

  16. Fundamentals in Nuclear Physics

    NASA Astrophysics Data System (ADS)

    Basdevant, Jean-Louis; Rich, James; Spiro, Michael

    This course on nuclear physics leads the reader to the exploration of the field from nuclei to astrophysical issues. Much nuclear phenomenology can be understood from simple arguments such as those based on the Pauli principle and the Coulomb barrier. This book is concerned with extrapolating from such arguments and illustrating nuclear systematics with experimental data. Starting with the basic concepts in nuclear physics, nuclear models, and reactions, the book covers nuclear decays and the fundamental electro-weak interactions, radioactivity, and nuclear energy. After the discussions of fission and fusion leading into nuclear astrophysics, there is a presentation of the latest ideas about cosmology. As a primer this course will lay the foundations for more specialized subjects. This book emerged from a series of topical courses the authors delivered at the Ecole Polytechnique and will be useful for graduate students and for scientists in a variety of fields.

  17. Minimal Reduplication

    ERIC Educational Resources Information Center

    Kirchner, Jesse Saba

    2010-01-01

    This dissertation introduces Minimal Reduplication, a new theory and framework within generative grammar for analyzing reduplication in human language. I argue that reduplication is an emergent property in multiple components of the grammar. In particular, reduplication occurs independently in the phonology and syntax components, and in both cases…

  18. Criticism of generally accepted fundamentals and methodologies of traffic and transportation theory: A brief review

    NASA Astrophysics Data System (ADS)

    Kerner, Boris S.

    2013-11-01

    It is explained why the set of the fundamental empirical features of traffic breakdown (a transition from free flow to congested traffic) should be the empirical basis for any traffic and transportation theory that can be reliably used for control and optimization in traffic networks. It is shown that the generally accepted fundamentals and methodologies of the traffic and transportation theory are not consistent with the set of the fundamental empirical features of traffic breakdown at a highway bottleneck. To these fundamentals and methodologies of the traffic and transportation theory belong (i) Lighthill-Whitham-Richards (LWR) theory, (ii) the General Motors (GM) model class (for example, Herman, Gazis et al. GM model, Gipps’s model, Payne’s model, Newell’s optimal velocity (OV) model, Wiedemann’s model, Bando et al. OV model, Treiber’s IDM, Krauß’s model), (iii) the understanding of highway capacity as a particular (fixed or stochastic) value, and (iv) principles for traffic and transportation network optimization and control (for example, Wardrop’s user equilibrium (UE) and system optimum (SO) principles). As an alternative to these generally accepted fundamentals and methodologies of the traffic and transportation theory, we discuss the three-phase traffic theory as the basis for traffic flow modeling, and we briefly consider the network breakdown minimization (BM) principle for the optimization of traffic and transportation networks with road bottlenecks.
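
    To make the criticized fixed-capacity picture concrete, the sketch below (an editorial illustration with hypothetical parameters, not a model from the paper) evaluates the Greenshields fundamental diagram used in LWR-type theories; its single flow maximum is precisely the "particular capacity value" that the review disputes.

```python
def greenshields_flow(rho, v_free=30.0, rho_jam=0.12):
    """LWR/Greenshields flux q(rho) = v_free * rho * (1 - rho / rho_jam).

    rho: density (veh/m per lane), v_free: free-flow speed (m/s),
    rho_jam: jam density (veh/m per lane). Returns flow in veh/s per lane.
    """
    return v_free * rho * (1.0 - rho / rho_jam)

# The model's capacity is the unique maximum of q, at rho = rho_jam / 2:
rho_crit = 0.12 / 2
q_max = greenshields_flow(rho_crit)
print(f"capacity ~ {q_max:.2f} veh/s ~ {q_max * 3600:.0f} veh/h per lane")
```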

  19. Criticism of generally accepted fundamentals and methodologies of traffic and transportation theory

    SciTech Connect

    Kerner, Boris S.

    2015-03-10

    It is explained why the set of the fundamental empirical features of traffic breakdown (a transition from free flow to congested traffic) should be the empirical basis for any traffic and transportation theory that can be reliably used for control and optimization in traffic networks. It is shown that generally accepted fundamentals and methodologies of traffic and transportation theory are not consistent with the set of the fundamental empirical features of traffic breakdown at a highway bottleneck. To these fundamentals and methodologies of traffic and transportation theory belong (i) Lighthill-Whitham-Richards (LWR) theory, (ii) the General Motors (GM) model class (for example, Herman, Gazis et al. GM model, Gipps’s model, Payne’s model, Newell’s optimal velocity (OV) model, Wiedemann’s model, Bando et al. OV model, Treiber’s IDM, Krauß’s model), (iii) the understanding of highway capacity as a particular stochastic value, and (iv) principles for traffic and transportation network optimization and control (for example, Wardrop’s user equilibrium (UE) and system optimum (SO) principles). As an alternative to these generally accepted fundamentals and methodologies of traffic and transportation theory, we discuss three-phase traffic theory as the basis for traffic flow modeling, and we briefly consider the network breakdown minimization (BM) principle for the optimization of traffic and transportation networks with road bottlenecks.

  20. Classical tests of general relativity: Brane-world Sun from minimal geometric deformation

    NASA Astrophysics Data System (ADS)

    Casadio, R.; Ovalle, J.; da Rocha, Roldão

    2015-05-01

    We consider a solution of the effective four-dimensional brane-world equations, obtained from the general relativistic Schwarzschild metric via the principle of minimal geometric deformation, and investigate the corresponding signatures stemming from the possible existence of a warped extra-dimension. In particular, we derive bounds on an extra-dimensional parameter, closely related with the fundamental gravitational length, from the experimental results of the classical tests of general relativity in the Solar system.

  1. Testing Our Fundamental Assumptions

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests! Explaining Different Arrival Times [figure: artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region; NASA/Swift/Mary Pat Hrybyk-Keith and John Jones]. Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics. Intrinsic delay: the photons may simply have been emitted at two different times by the astrophysical source. Delay due to Lorentz invariance violation: perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect. Special-relativistic delay: maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong; this, too, would cause photon velocities to be energy-dependent. Delay due to gravitational potential: perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies. This would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect. If we now turn this problem around, then by measuring the arrival time delay between photons of different energies from various astrophysical sources (the further away, the better) we can provide constraints on these

  2. Taxonomic minimalism.

    PubMed

    Beattie, A J; Oliver, I

    1994-12-01

    Biological surveys are in increasing demand while taxonomic resources continue to decline. How much formal taxonomy is required to get the job done? The answer depends on the kind of job, but it is possible that taxonomic minimalism, especially (1) the use of higher taxonomic ranks, (2) the use of morphospecies rather than species (as identified by Latin binomials), and (3) the involvement of taxonomic specialists only for training and verification, may offer advantages for biodiversity assessment, environmental monitoring and ecological research. As such, formal taxonomy remains central to the process of biological inventory and survey, but resources may be allocated more efficiently. For example, if formal identification is not required, resources may be concentrated on replication and increasing sample sizes. Taxonomic minimalism may also facilitate the inclusion in these activities of important but neglected groups, especially among the invertebrates, and perhaps even microorganisms. PMID:21236933

  3. Generating minimal living systems from non-living materials and increasing their evolutionary abilities.

    PubMed

    Rasmussen, Steen; Constantinescu, Adi; Svaneborg, Carsten

    2016-08-19

    We review lessons learned about evolutionary transitions from a bottom-up construction of minimal life. We use a particular systemic protocell design process as a starting point for exploring two fundamental questions: (i) how may minimal living systems emerge from non-living materials? and (ii) how may minimal living systems support increasing evolutionary richness? Under (i), we present what has been accomplished so far and discuss the remaining open challenges and their possible solutions. Under (ii), we present a design principle we have used successfully in both our computational and experimental protocellular investigations, and we conjecture how this design principle can be extended to enhance the evolutionary potential of a wide range of systems. This article is part of the themed issue 'The major synthetic evolutionary transitions'. PMID:27431518

  4. Fundamentals of Pharmacogenetics in Personalized, Precision Medicine.

    PubMed

    Valdes, Roland; Yin, DeLu Tyler

    2016-09-01

    This article introduces fundamental principles of pharmacogenetics as applied to personalized and precision medicine. Pharmacogenetics establishes relationships between pharmacology and genetics by connecting phenotypes and genotypes in predicting the response of therapeutics in individual patients. We describe differences between precision and personalized medicine and relate principles of pharmacokinetics and pharmacodynamics to applications in laboratory medicine. We also review basic principles of pharmacogenetics, including its evolution, how it enables the practice of personalized therapeutics, and the role of the clinical laboratory. These fundamentals are a segue for understanding specific clinical applications of pharmacogenetics described in subsequent articles in this issue. PMID:27514461

  5. Fundamentals of the Control of Gas-Turbine Power Plants for Aircraft. Part 2; Principles of Control Common to Jet, Turbine-Propeller Jet, and Ducted-Fan Jet Power Plants

    NASA Technical Reports Server (NTRS)

    Kuehl, H.

    1947-01-01

    After defining the aims and requirements to be set for a control system of gas-turbine power plants for aircraft, the report will deal with devices that prevent the quantity of fuel supplied per unit of time from exceeding the value permissible at a given moment. The general principles of the actuation of the adjustable parts of the power plant are also discussed.

  6. Testing Our Fundamental Assumptions

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests! Explaining Different Arrival Times [figure: artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region; NASA/Swift/Mary Pat Hrybyk-Keith and John Jones]. Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics. Intrinsic delay: the photons may simply have been emitted at two different times by the astrophysical source. Delay due to Lorentz invariance violation: perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect. Special-relativistic delay: maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong; this, too, would cause photon velocities to be energy-dependent. Delay due to gravitational potential: perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies. This would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect. If we now turn this problem around, then by measuring the arrival time delay between photons of different energies from various astrophysical sources (the further away, the better) we can provide constraints on these

  7. Periodic minimal surfaces

    NASA Astrophysics Data System (ADS)

    Mackay, Alan L.

    1985-04-01

    A minimal surface is one for which, like a soap film with the same pressure on each side, the mean curvature is zero and, thus, is one where the two principal curvatures are equal and opposite at every point. For every closed circuit in the surface, the area is a minimum. Schwarz [1] and Neovius [2] showed that elements of such surfaces could be put together to give surfaces periodic in three dimensions. These periodic minimal surfaces are geometrical invariants, as are the regular polyhedra, but the former are curved. Minimal surfaces are appropriate for the description of various structures where internal surfaces are prominent and seek to adopt a minimum area or a zero mean curvature subject to their topology; thus they merit more complete numerical characterization. There seem to be at least 18 such surfaces [3], with various symmetries and topologies, related to the crystallographic space groups. Recently, glyceryl mono-oleate (GMO) was shown by Longley and McIntosh [4] to take the shape of the F-surface. The structure postulated is shown here to be in good agreement with an analysis of the fundamental geometry of periodic minimal surfaces.
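
    Periodic minimal surfaces are commonly visualized through nodal (level-set) approximations. As an illustrative aside (a standard shorthand, not taken from this paper), points where cos 2πx + cos 2πy + cos 2πz = 0 approximate the triply periodic Schwarz P surface:

```python
import math

def schwarz_p(x, y, z):
    """Nodal approximation to the Schwarz P surface: its zero level set."""
    return math.cos(2 * math.pi * x) + math.cos(2 * math.pi * y) + math.cos(2 * math.pi * z)

# Crude test: count grid points of one cubic unit cell lying near the surface.
n = 40
near = sum(
    1
    for i in range(n) for j in range(n) for k in range(n)
    if abs(schwarz_p(i / n, j / n, k / n)) < 0.05
)
print(f"{near} of {n**3} sample points lie within the 0.05 level band")
```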

  8. The ZOOM minimization package

    SciTech Connect

    Fischler, Mark S.; Sachs, D. (Fermilab)

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.

  9. Basic Comfort Heating Principles.

    ERIC Educational Resources Information Center

    Dempster, Chalmer T.

    The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background…

  10. Bernoulli's Principle

    ERIC Educational Resources Information Center

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  11. GRBs and Fundamental Physics

    NASA Astrophysics Data System (ADS)

    Petitjean, Patrick; Wang, F. Y.; Wu, X. F.; Wei, J. J.

    2016-02-01

    Gamma-ray bursts (GRBs) are short and intense flashes at cosmological distances, and they are the most luminous explosions in the Universe. The high luminosities of GRBs make them detectable out to the edge of the visible universe, so they are unique tools to probe the properties of the high-redshift universe, including the cosmic expansion and dark energy, the star formation rate, the reionization epoch and the metal evolution of the Universe. First, they can be used to constrain the history of cosmic acceleration and the evolution of dark energy in a redshift range hardly achievable by other cosmological probes. Second, long GRBs are believed to be formed by the collapse of massive stars, so they can be used to derive the high-redshift star formation rate, which cannot be probed by current observations. Moreover, the use of GRBs as cosmological tools could unveil the reionization history and metal evolution of the Universe, the properties of the intergalactic medium (IGM) and the nature of the first stars in the early universe. Beyond that, GRB high-energy photons can be applied to constrain Lorentz invariance violation (LIV) and to test Einstein's Equivalence Principle (EEP). In this paper, we review the progress on GRB cosmology and fundamental physics probed by GRBs.

  12. Fundamentals of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Tang, C. L.

    2005-06-01

    Quantum mechanics has evolved from a subject of study in pure physics to one with a wide range of applications in many diverse fields. The basic concepts of quantum mechanics are explained in this book in a concise and easy-to-read manner, emphasising applications in solid state electronics and modern optics. Following a logical sequence, the book is focused on the key ideas and is conceptually and mathematically self-contained. The fundamental principles of quantum mechanics are illustrated by showing their application to systems such as the hydrogen atom, multi-electron ions and atoms, the formation of simple organic molecules and crystalline solids of practical importance. It leads on from these basic concepts to discuss some of the most important applications in modern semiconductor electronics and optics. The book offers a clear exposition of quantum mechanics in a concise and accessible style, a precise physical interpretation of its mathematical foundations, and illustrations of the important concepts and results by reference to real-world examples in electronics and optoelectronics. Containing many homework problems and worked examples, with solutions available for instructors, the book is suitable for senior-level undergraduate and graduate level students in electrical engineering, materials science and applied physics.
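
    As a worked instance of the hydrogen-atom material the description mentions (a textbook result, offered here as an editorial sketch), the Bohr levels E_n = -13.606 eV / n^2 give transition energies directly:

```python
RYDBERG_EV = 13.605693  # hydrogen ground-state binding energy, eV

def hydrogen_level(n):
    """Bohr energy of hydrogen level n, in eV: E_n = -13.606 / n**2."""
    return -RYDBERG_EV / n**2

# Lyman-alpha photon energy, from the n=2 -> n=1 transition:
print(f"E(2->1) = {hydrogen_level(2) - hydrogen_level(1):.3f} eV")  # ~10.204 eV
```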

  13. Fundamentals of Cryogenics

    NASA Technical Reports Server (NTRS)

    Johnson, Wesley; Tomsik, Thomas; Moder, Jeff

    2014-01-01

    Analysis of the extreme conditions encountered in cryogenic systems demands the most effort from analysts and engineers. Due to the costs and complexity associated with the extremely cold temperatures involved, testing is sometimes minimized and extra analysis is often relied upon. This short course is designed as an introduction to cryogenic engineering and analysis; it introduces the basic concepts related to cryogenic analysis and testing and helps the analyst understand the impact of various requests on a test facility. Discussion revolves around operational functions often found in cryogenic systems, hardware for both tests and facilities, and the design and modelling tools available for performing the analysis. Emphasis is placed on which hardware and analysis tools to use in a given scenario to obtain the desired results. The class provides a review of first principles, engineering practices, and the relations directly applicable to this subject, including such topics as cryogenic fluids, thermodynamics and heat transfer, material properties at low temperature, insulation, cryogenic equipment, instrumentation, refrigeration, testing of cryogenic systems, cryogenic safety, and the typical thermal and fluid analyses used by the engineer. The class also provides references for further learning on various topics in cryogenics for those who want to dive deeper into the subject or who have encountered specific problems.

  14. Towards a Minimal System for Cell Division

    NASA Astrophysics Data System (ADS)

    Schwille, Petra

    We have entered the "omics" era of the life sciences, meaning that our general knowledge about biological systems has become vast, complex, and almost impossible to fully comprehend. Consequently, the challenge for quantitative biology and biophysics is to identify appropriate procedures and protocols that allow the researcher to strip down the complexity of a biological system to a level that can be reliably modeled but still retains the essential features of its "real" counterpart. The virtue of physics has always been the reductionist approach, which allowed scientists to identify the underlying basic principles of seemingly complex phenomena, and subject them to rigorous mathematical treatment. Biological systems are obviously among the most complex phenomena we can think of, and it is fair to state that our rapidly increasing knowledge does not make it easier to identify a small set of fundamental principles of the big concept of "life" that can be defined and quantitatively understood. Nevertheless, it is becoming evident that only by tight cooperation and interdisciplinary exchange between the life sciences and quantitative sciences, and by applying intelligent reductionist approaches also to biology, will we be able to meet the intellectual challenges of the twenty-first century. These include not only the collection and proper categorization of the data, but also their true understanding and harnessing such that we can solve important practical problems imposed by medicine or the worldwide need for new energy sources. Many of these approaches are reflected by the modern buzz word "synthetic biology", therefore I briefly discuss this term in the first section. Further, I outline some endeavors of our and other groups to model minimal biological systems, with particular focus on the possibility of generating a minimal system for cell division.

  15. The Seven Cardinal Principles Revisited

    ERIC Educational Resources Information Center

    Shane, Harold G.

    1976-01-01

    The seven cardinal principles of education as stated in 1918--health, command of fundamental processes, worthy home membership, vocation, citizenship, use of leisure, and ethical character--were reassessed by panelists and the future development of each principle examined in the light of a changing world. (JD)

  16. Toward systematic integration between self-determination theory and motivational interviewing as examples of top-down and bottom-up intervention development: autonomy or volition as a fundamental theoretical principle

    PubMed Central

    2012-01-01

    Clinical interventions can be developed through two distinct pathways. In the first, which we call top-down, a well-articulated theory drives the development of the intervention, whereas in the case of a bottom-up approach, clinical experience, more so than a dedicated theoretical perspective, drives the intervention. Using this dialectic, this paper discusses Self-Determination Theory (SDT) [1,2] and Motivational Interviewing (MI) [3] as prototypical examples of a top-down and bottom-up approaches, respectively. We sketch the different starting points, foci and developmental processes of SDT and MI, but equally note the complementary character and the potential for systematic integration between both approaches. Nevertheless, for a deeper integration to take place, we contend that MI researchers might want to embrace autonomy as a fundamental basic process underlying therapeutic change and we discuss the advantages of doing so. PMID:22385828

  17. Plasma Modeling Enabled Technology Development Empowered by Fundamental Scattering Data

    NASA Astrophysics Data System (ADS)

    Kushner, Mark J.

    2016-05-01

    Technology development increasingly relies on modeling to speed the innovation cycle. This is particularly true for systems using low temperature plasmas (LTPs) and their role in enabling energy efficient processes with minimal environmental impact. In the innovation cycle, LTP modeling supports investigation of fundamental processes that seed the cycle, optimization of newly developed technologies, and prediction of the performance of unbuilt systems for new applications. Although proof-of-principle modeling may be performed for idealized systems in simple gases, technology development must address physically complex systems using complex gas mixtures, which may now be multi-phase (e.g., in contact with liquids). The variety of fundamental electron and ion scattering and radiation transport data (FSRD) required for this modeling increases as the innovation cycle progresses, while the accuracy required of those data depends on the intended outcome. In all cases, the fidelity, depth and impact of the modeling depend on the availability of FSRD. Modeling and technology development are, in fact, empowered by the availability and robustness of FSRD. In this talk, examples of the impact of and requirements for FSRD in the innovation cycle enabled by plasma modeling will be discussed using results from multidimensional and global models. Examples of fundamental studies and technology optimization will focus on microelectronics fabrication and on optically pumped lasers. Modeling of systems as yet unbuilt will address the interaction of atmospheric pressure plasmas with liquids. Work supported by DOE Office of Fusion Energy Science and the National Science Foundation.

  18. Combustion Fundamentals Research

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Increased emphasis is placed on fundamental and generic research at Lewis Research Center, with less effort devoted to systems development. This is especially true in combustion research, where the study of combustion fundamentals has grown significantly in order to better address the perceived long-term technical needs of the aerospace industry. The main thrusts for this combustion fundamentals program are: analytical models of combustion processes, model verification experiments, fundamental combustion experiments, and advanced numerical techniques.

  19. Exchange Rates and Fundamentals.

    ERIC Educational Resources Information Center

    Engel, Charles; West, Kenneth D.

    2005-01-01

    We show analytically that in a rational expectations present-value model, an asset price manifests near-random walk behavior if fundamentals are I(1) and the factor for discounting future fundamentals is near one. We argue that this result helps explain the well-known puzzle that fundamental variables such as relative money supplies, outputs,…

  20. Reconstruction of fundamental SUSY parameters

    SciTech Connect

    P. M. Zerwas et al.

    2003-09-25

    We summarize methods and expected accuracies in determining the basic low-energy SUSY parameters from experiments at future e+e- linear colliders in the TeV energy range, combined with results from the LHC. In a second step we demonstrate how, based on this set of parameters, the fundamental supersymmetric theory can be reconstructed at high scales near the grand unification or Planck scale. These analyses have been carried out for minimal supergravity [confronted with GMSB for comparison], and for a string effective theory.

  1. Making the Most of Minimalism in Music.

    ERIC Educational Resources Information Center

    Geiersbach, Frederick J.

    1998-01-01

    Describes the minimalist movement in music. Discusses generations of minimalist musicians and, in general, the minimalist approach. Considers various ways that minimalist strategies can be integrated into the music classroom focusing on (1) minimalism and (2) student-centered composition and principles of minimalism for use with elementary band…

  2. Itch Management: General Principles.

    PubMed

    Misery, Laurent

    2016-01-01

    Like pain, itch is a challenging condition that needs to be managed. Within this setting, the first principle of itch management is to obtain an appropriate diagnosis in order to perform an etiology-oriented therapy. In several cases it is not possible to treat the cause: the etiology may be undetermined, there may be several causes, or the etiological treatment may not be effective enough to alleviate itch completely. This is why there is also a need for symptomatic treatment. In all patients, psychological support and associated pragmatic measures might be helpful. General principles and guidelines are required, yet patient-centered individual care remains fundamental. PMID:27578069

  3. Basic principles of the Stirling cycle

    NASA Astrophysics Data System (ADS)

    1983-03-01

    The basic principles of the Stirling cycle are outlined. From an elementary theory, the general properties of the cycle are derived, with a discussion of the most important losses. The performance of the fundamental, ideal (isothermal) cycle is described, as is the actual cycle, which differs from the ideal one through the occurrence of losses. In the ideal Stirling cycle, the cold is produced by the reversible expansion of a gas. The gas performs a closed cycle, during which it is alternately compressed at ambient temperature in a compression space and expanded at the desired low temperature in an expansion space, thereby reciprocating between these spaces through one connecting duct, wherein a regenerator provides for the heat exchange between the outgoing and the returning gas flow. The problem of how to minimize the total sum of the losses is examined.
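
    Because the ideal cycle is reversible, its cooling performance follows from the two temperatures alone. A minimal sketch assuming the ideal isothermal cycle described above (the temperatures are hypothetical):

```python
def ideal_stirling_cop(t_cold, t_ambient):
    """Coefficient of performance of a reversible refrigeration cycle:
    COP = T_cold / (T_ambient - T_cold), temperatures in kelvin."""
    return t_cold / (t_ambient - t_cold)

# Producing cold at 80 K while compressing at 300 K ambient:
print(f"ideal COP = {ideal_stirling_cop(80.0, 300.0):.3f}")  # ~0.364
# Real machines fall well below this figure because of the losses discussed above.
```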

  4. Swarm robotics and minimalism

    NASA Astrophysics Data System (ADS)

    Sharkey, Amanda J. C.

    2007-09-01

    Swarm Robotics (SR) is closely related to Swarm Intelligence, and both were initially inspired by studies of social insects. Their guiding principles are based on their biological inspiration and take the form of an emphasis on decentralized local control and communication. Earlier studies went a step further in emphasizing the use of simple reactive robots that only communicate indirectly through the environment. More recently SR studies have moved beyond these constraints to explore the use of non-reactive robots that communicate directly, and that can learn and represent their environment. There is no clear agreement in the literature about how far such extensions of the original principles could go. Should there be any limitations on the individual abilities of the robots used in SR studies? Should knowledge of the capabilities of social insects lead to constraints on the capabilities of individual robots in SR studies? There is a lack of explicit discussion of such questions, and researchers have adopted a variety of constraints for a variety of reasons. A simple taxonomy of swarm robotics is presented here with the aim of addressing and clarifying these questions. The taxonomy distinguishes subareas of SR based on the emphases and justifications for minimalism and individual simplicity.

  5. A correspondence principle

    NASA Astrophysics Data System (ADS)

    Hughes, Barry D.; Ninham, Barry W.

    2016-02-01

    A single mathematical theme underpins disparate physical phenomena in classical, quantum and statistical mechanical contexts. This mathematical "correspondence principle", a kind of wave-particle duality with glorious realizations in classical and modern mathematical analysis, embodies fundamental geometrical and physical order, and yet in some sense sits on the edge of chaos. Illustrative cases discussed are drawn from classical and anomalous diffusion, quantum mechanics of single particles and ideal gases, quasicrystals and Casimir forces.

  6. A Fundamental Theorem on Particle Acceleration

    SciTech Connect

    Xie, Ming

    2003-05-01

    A fundamental theorem on particle acceleration is derived from the reciprocity principle of electromagnetism and a rigorous proof of the theorem is presented. The theorem establishes a relation between acceleration and radiation, which is particularly useful for insightful understanding of and practical calculation about the first order acceleration in which energy gain of the accelerated particle is linearly proportional to the accelerating field.

  7. Sensors, Volume 1, Fundamentals and General Aspects

    NASA Astrophysics Data System (ADS)

    Grandke, Thomas; Ko, Wen H.

    1996-12-01

    'Sensors' is the first self-contained series to deal with the whole area of sensors. It describes general aspects, technical and physical fundamentals, construction, function, applications and developments of the various types of sensors. This volume deals with the fundamentals and common principles of sensors and covers the wide areas of principles, technologies, signal processing, and applications. Contents include: Sensor Fundamentals, e.g. Sensor Parameters, Modeling, Design and Packaging; Basic Sensor Technologies, e.g. Thin and Thick Films, Integrated Magnetic Sensors, Optical Fibres and Integrated Optics, Ceramics and Oxides; Sensor Interfaces, e.g. Signal Processing, Multisensor Signal Processing, Smart Sensors, Interface Systems; Sensor Applications, e.g. Automotive: On-board Sensors, Traffic Surveillance and Control, Home Appliances, Environmental Monitoring, etc. This volume is an indispensable reference work and textbook for both specialists and newcomers, researchers and developers.

  8. The New Minimal Standard Model

    SciTech Connect

    Davoudiasl, Hooman; Kitano, Ryuichiro; Li, Tianjun; Murayama, Hitoshi

    2005-01-13

    We construct the New Minimal Standard Model that incorporates the new discoveries of physics beyond the Minimal Standard Model (MSM): Dark Energy, non-baryonic Dark Matter, neutrino masses, as well as baryon asymmetry and cosmic inflation, adopting the principle of minimal particle content and the most general renormalizable Lagrangian. We base the model purely on empirical facts rather than aesthetics. We need only six new degrees of freedom beyond the MSM. It is free from excessive flavor-changing effects, CP violation, too-rapid proton decay, problems with electroweak precision data, and unwanted cosmological relics. Any model of physics beyond the MSM should be measured against the phenomenological success of this model.

  9. Fundamental Physical Constants

    National Institute of Standards and Technology Data Gateway

    SRD 121 CODATA Fundamental Physical Constants (Web, free access)   This site, developed in the Physics Laboratory at NIST, addresses three topics: fundamental physical constants, the International System of Units (SI), which is the modern metric system, and expressing the uncertainty of measurement results.

  10. Principled Narrative

    ERIC Educational Resources Information Center

    MacBeath, John; Swaffield, Sue; Frost, David

    2009-01-01

    This article provides an overview of the "Carpe Vitam: Leadership for Learning" project, accounting for its provenance and purposes, before focusing on the principles for practice that constitute an important part of the project's legacy. These principles framed the dialogic process that was a dominant feature of the project and are presented,…

  11. The Design of MACs (Minimal Actin Cortices)

    PubMed Central

    Vogel, Sven K; Heinemann, Fabian; Chwastek, Grzegorz; Schwille, Petra

    2013-01-01

    The actin cell cortex in eukaryotic cells is a key player in controlling and maintaining the shape of cells, and in driving major shape changes such as in cytokinesis. It is thereby constantly being remodeled. Cell shape changes require forces acting on membranes that are generated by the interplay of membrane coupled actin filaments and assemblies of myosin motors. Little is known about how their interaction regulates actin cell cortex remodeling and cell shape changes. Because of the vital importance of actin, myosin motors and the cell membrane, selective in vivo experiments and manipulations are often difficult to perform or not feasible. Thus, the intelligent design of minimal in vitro systems for actin-myosin-membrane interactions could pave a way for investigating actin cell cortex mechanics in a detailed and quantitative manner. Here, we present and discuss the design of several bottom-up in vitro systems accomplishing the coupling of actin filaments to artificial membranes, where key parameters such as actin densities and membrane properties can be varied in a controlled manner. Insights gained from these in vitro systems may help to uncover fundamental principles of how exactly actin-myosin-membrane interactions govern actin cortex remodeling and membrane properties for cell shape changes. © 2013 Wiley Periodicals, Inc. PMID:24039068

  12. Pattern formation in a minimal model of continuum dislocation plasticity

    NASA Astrophysics Data System (ADS)

    Sandfeld, Stefan; Zaiser, Michael

    2015-09-01

    The spontaneous emergence of heterogeneous dislocation patterns is a conspicuous feature of plastic deformation and strain hardening of crystalline solids. Despite long-standing efforts in the materials science and defect physics communities, there is no general consensus regarding the physical mechanism which leads to the formation of dislocation patterns. In order to establish the fundamental mechanism, we formulate an extremely simplified, minimal model to investigate the formation of patterns based on the continuum theory of fluxes of curved dislocations. We demonstrate that strain hardening, as embodied in a Taylor-type dislocation density dependence of the flow stress, in conjunction with the structure of the kinematic equations that govern dislocation motion under the action of external stresses, is already sufficient for the formation of dislocation patterns that are consistent with the principle of similitude.
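
    The Taylor-type dependence invoked above ties the flow stress to the square root of the dislocation density, tau = alpha * mu * b * sqrt(rho). A hedged sketch with roughly aluminium-like parameters (illustrative values, not the paper's):

```python
import math

def taylor_stress(rho, alpha=0.3, mu=26e9, b=0.286e-9):
    """Taylor flow stress (Pa): tau = alpha * mu * b * sqrt(rho).

    rho: dislocation density (1/m^2), alpha: dimensionless constant,
    mu: shear modulus (Pa), b: Burgers vector magnitude (m).
    """
    return alpha * mu * b * math.sqrt(rho)

for rho in (1e12, 1e14, 1e16):  # annealed -> heavily deformed
    print(f"rho = {rho:.0e} 1/m^2 -> tau ~ {taylor_stress(rho) / 1e6:.1f} MPa")
```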

  13. Esophagectomy - minimally invasive

    MedlinePlus

    Minimally invasive esophagectomy; Robotic esophagectomy; Removal of the esophagus - minimally invasive; Achalasia - esophagectomy; Barrett esophagus - esophagectomy; Esophageal cancer - esophagectomy - laparoscopic; Cancer of the ...

  14. Governing during an Institutional Crisis: 10 Fundamental Principles

    ERIC Educational Resources Information Center

    White, Lawrence

    2012-01-01

    In today's world, managing a campus crisis poses special challenges for an institution's governing board, which may operate some distance removed from the immediate events giving rise to the crisis. In its most challenging form, a campus crisis--a shooting, a natural disaster, a fraternity hazing death, the arrest of a prominent campus…

  15. Fundamental principles and applications of natural gas hydrates

    NASA Astrophysics Data System (ADS)

    Sloan, E. Dendy

    2003-11-01

    Natural gas hydrates are solid, non-stoichiometric compounds of small gas molecules and water. They form when the constituents come into contact at low temperature and high pressure. The physical properties of these compounds, most notably that they are non-flowing crystalline solids that are denser than typical fluid hydrocarbons and that the gas molecules they contain are effectively compressed, give rise to numerous applications in the broad areas of energy and climate effects. In particular, they have an important bearing on flow assurance and safety issues in oil and gas pipelines, they offer a largely unexploited means of energy recovery and transportation, and they could play a significant role in past and future climate change.

  16. Fundamental principles and applications of natural gas hydrates.

    PubMed

    Sloan, E Dendy

    2003-11-20

    Natural gas hydrates are solid, non-stoichiometric compounds of small gas molecules and water. They form when the constituents come into contact at low temperature and high pressure. The physical properties of these compounds, most notably that they are non-flowing crystalline solids that are denser than typical fluid hydrocarbons and that the gas molecules they contain are effectively compressed, give rise to numerous applications in the broad areas of energy and climate effects. In particular, they have an important bearing on flow assurance and safety issues in oil and gas pipelines, they offer a largely unexploited means of energy recovery and transportation, and they could play a significant role in past and future climate change. PMID:14628065

  17. Fundamental Principles of Writing a Successful Grant Proposal

    PubMed Central

    Chung, Kevin C.; Shauver, Melissa J.

    2015-01-01

    It is important for the field of hand surgery to develop a new generation of surgeon-scientists who can produce high impact studies to raise the profile of this specialty. To this end, organizations such as the American Society for Surgery of the Hand have initiated programs to promote multicenter clinical research that can be competitive for fiscal support from the National Institutes of Health and other funding agencies. Crafting a well-structured grant proposal is critical to securing adequate funding to investigate the many ambitious clinical and basic science projects in hand surgery. In this paper, we present the key elements to a successful grant proposal to help potential applicants to navigate the complex pathways in the grant application process. PMID:18406962

  18. 47 CFR 36.2 - Fundamental principles underlying procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... interexchange classification, there are three broad types of plant, i.e., operator systems, switching plant, trunk equipment and subscriber plant. Subscriber plant... basis for measuring the use of local and toll switching plant. (iii) Conversation-minute-kilometers...

  19. Developing Fundamental Principles for Teacher Education Programs and Practices

    ERIC Educational Resources Information Center

    Korthagen, Fred; Loughran, John; Russell, Tom

    2006-01-01

    Traditional approaches to teacher education are increasingly critiqued for their limited relationship to student teachers' needs and for their meager impact on practice. Many pleas are heard for a radical new and effective pedagogy of teacher education in which theory and practice are linked effectively. Although various attempts to restructure…

  20. DESIGN AND ANALYSIS FOR THEMATIC MAP ACCURACY ASSESSMENT: FUNDAMENTAL PRINCIPLES

    EPA Science Inventory

    Before being used in scientific investigations and policy decisions, thematic maps constructed from remotely sensed data should be subjected to a statistically rigorous accuracy assessment. The three basic components of an accuracy assessment are: 1) the sampling design used to s...

  1. Fundamental monogamy relation between contextuality and nonlocality.

    PubMed

    Kurzyński, Paweł; Cabello, Adán; Kaszlikowski, Dagomir

    2014-03-14

    We show that the no-disturbance principle imposes a tradeoff between locally contextual correlations violating the Klyachko-Can-Binicioǧlu-Shumovski inequality and spatially separated correlations violating the Clauser-Horne-Shimony-Holt inequality. The violation of one inequality forbids the violation of the other. We also obtain the corresponding monogamy relation imposed by quantum theory for a qutrit-qubit system. Our results show the existence of fundamental monogamy relations between contextuality and nonlocality that suggest that entanglement might be a particular form of a more fundamental resource. PMID:24679270
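
    For context, the two inequalities named in this record take the following standard forms (quoted as textbook results, not from the paper itself), where all observables take values ±1:

```latex
% KCBS noncontextuality inequality (five cyclically compatible observables, indices mod 5):
\sum_{i=1}^{5} \langle A_i A_{i+1} \rangle \ge -3
% CHSH Bell inequality (A, A' on one side; B, B' on the other):
\left| \langle A B \rangle + \langle A B' \rangle + \langle A' B \rangle - \langle A' B' \rangle \right| \le 2
```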

  2. Laser Wakefield Acceleration and Fundamental Physics

    SciTech Connect

    Tajima, Toshiki

    2011-06-20

    Laser wakefield acceleration (LWFA), together with now-available laser technology, allows us to look at TeV physics in both leptons and hadrons. Near-future proof-of-principle experiments for a collider, as well as high energy frontier experiments without a collider paradigm, are suggested. The intense laser can also contribute to other fundamental physics explorations, such as those of dark matter and dark energy candidates. Finally, the combination of intense lasers and laser-accelerated particles (electrons, hadrons, gammas) provides a further avenue of fundamental research.

  3. Fundamental Monogamy Relation between Contextuality and Nonlocality

    NASA Astrophysics Data System (ADS)

    Kurzyński, Paweł; Cabello, Adán; Kaszlikowski, Dagomir

    2014-03-01

    We show that the no-disturbance principle imposes a tradeoff between locally contextual correlations violating the Klyachko-Can-Binicioǧlu-Shumovski inequality and spatially separated correlations violating the Clauser-Horne-Shimony-Holt inequality. The violation of one inequality forbids the violation of the other. We also obtain the corresponding monogamy relation imposed by quantum theory for a qutrit-qubit system. Our results show the existence of fundamental monogamy relations between contextuality and nonlocality that suggest that entanglement might be a particular form of a more fundamental resource.

  4. The Subordination of Aesthetic Fundamentals in College Art Instruction

    ERIC Educational Resources Information Center

    Lavender, Randall

    2003-01-01

    Opportunities for college students of art and design to study fundamentals of visual aesthetics, integrity of form, and principles of composition are limited today by a number of factors. With the well-documented prominence of postmodern critical theory in the world of contemporary art, the study of aesthetic fundamentals is largely subordinated…

  5. Basic principles of remote sensing. [bibliography

    NASA Technical Reports Server (NTRS)

    Clapp, J. L.

    1973-01-01

    Forty-eight selected bibliographic references dealing with the remote sensing of the environment are given. Emphasis was placed on data that deal with fundamental aspects and principles of the technique.

  6. Toward a Minimal Artificial Axon.

    PubMed

    Ariyaratne, Amila; Zocchi, Giovanni

    2016-07-01

    The electrophysiology of action potentials is usually studied in neurons, through relatively demanding experiments which are difficult to scale up to a defined network. Here we pursue instead the minimal artificial system based on the essential biological components-ion channels and lipid bilayers-where action potentials can be generated, propagated, and eventually networked. The fundamental unit is the classic supported bilayer: a planar bilayer patch with embedded ion channels in a fluidic environment where an ionic gradient is imposed across the bilayer. Two such units electrically connected form the basic building block for a network. The system is minimal in that we demonstrate that one kind of ion channel and correspondingly a gradient of only one ionic species is sufficient to generate an excitable system which shows amplification and threshold behavior. PMID:27049652
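
    As a generic illustration of the threshold and amplification behavior this record describes (using the classic FitzHugh-Nagumo model as a stand-in, NOT the authors' bilayer/ion-channel system), the sketch below shows how a brief stimulus above threshold triggers a full spike while weaker stimuli simply decay:

    ```python
    # FitzHugh-Nagumo: a generic excitable system used only to illustrate
    # threshold behavior -- not the authors' bilayer/ion-channel model.
    def fhn_step(v, w, i_ext, a=0.7, b=0.8, tau=12.5):
        dv = v - v**3 / 3 - w + i_ext   # fast, voltage-like variable
        dw = (v + a - b * w) / tau      # slow recovery variable
        return dv, dw

    def peak_response(i_pulse, t_end=100.0, dt=0.01, pulse_len=1.0):
        v, w = -1.2, -0.6               # approximate resting state
        peak = v
        for n in range(int(t_end / dt)):
            i_ext = i_pulse if n * dt < pulse_len else 0.0  # brief stimulus
            dv, dw = fhn_step(v, w, i_ext)
            v += dt * dv
            w += dt * dw
            peak = max(peak, v)
        return peak

    # Sub-threshold stimuli relax back; supra-threshold stimuli fire a spike.
    for i_pulse in (0.1, 0.3, 1.0, 1.5):
        print(f"pulse {i_pulse:.1f} -> peak v = {peak_response(i_pulse):+.2f}")
    ```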

  7. Fundamentals of fluid sealing

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamentals of fluid sealing, including seal operating regimes, are discussed and the general fluid-flow equations for fluid sealing are developed. Seal performance parameters such as leakage and power loss are presented. Included in the discussion are the effects of geometry, surface deformations, rotation, and both laminar and turbulent flows. The concept of pressure balancing is presented, as are differences between liquid and gas sealing. Mechanisms of seal surface separation, fundamental friction and wear concepts applicable to seals, seal materials, and pressure-velocity (PV) criteria are discussed.

  8. Prepulse minimization in KALI-5000.

    PubMed

    Kumar, D Durga Praveen; Mitra, S; Senthil, K; Sharma, Vishnu K; Singh, S K; Roy, A; Sharma, Archana; Nagesh, K V; Chakravarthy, D P

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given. PMID:19655979

  9. Prepulse minimization in KALI-5000

    NASA Astrophysics Data System (ADS)

    Kumar, D. Durga Praveen; Mitra, S.; Senthil, K.; Sharma, Vishnu K.; Singh, S. K.; Roy, A.; Sharma, Archana; Nagesh, K. V.; Chakravarthy, D. P.

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given.

  10. Common Principles and Multiculturalism

    PubMed Central

    Zahedi, Farzaneh; Larijani, Bagher

    2009-01-01

    Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have discussed suitable tools for determining which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has resulted in a misconception that absolute Westernized principles would be appropriate tools for ethical decision making in different cultures. We discuss this issue by introducing a clinical case. Considering the various cultural beliefs around the world, though it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, fall in with this idea. PMID:23908720

  11. The principle of reciprocity.

    PubMed

    Hoult, D I

    2011-12-01

    The circumstances surrounding the realisation that NMR signal reception could be quantified in a simple fundamental manner using Lorentz's Principle of Reciprocity are described. The poor signal-to-noise ratio of the first European superconducting magnet is identified as a major motivating factor, together with the author's need to understand phenomena at a basic level. A summary is then given of the thought processes leading to the very simple pseudo-static formula that has been the basis of signal-to-noise calculations for over a generation. PMID:21889377
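
    The pseudo-static reciprocity result referred to here is usually written as below (a standard textbook form; sign and unit conventions vary between treatments):

    ```latex
    % Reciprocity in NMR signal reception: the EMF induced in a receive coil
    % follows from the field B_1 that the SAME coil would produce per unit
    % current, projected onto the sample magnetization M.
    \xi(t) \;=\; -\,\frac{\partial}{\partial t}
      \int_{\text{sample}} \mathbf{B}_1(\mathbf{r}) \cdot
      \mathbf{M}(\mathbf{r}, t)\, dV
    ```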

  12. About the ZOOM minimization package

    SciTech Connect

    Fischler, M.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.
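
    The record emphasizes extensible, object-oriented design rather than any particular algorithm. A minimal sketch of that design idea (a hypothetical Python interface for illustration, not the actual ZOOM or Minuit C++ API) might look like this:

    ```python
    from abc import ABC, abstractmethod
    from typing import Callable, Sequence

    # Hypothetical extensible minimizer design: new algorithms plug in by
    # subclassing the abstract base class, so the framework stays fresh.
    class Minimizer(ABC):
        @abstractmethod
        def minimize(self, f: Callable[[Sequence[float]], float],
                     x0: Sequence[float]) -> list[float]:
            ...

    class CoordinateDescent(Minimizer):
        """A deliberately simple strategy; real packages ship many more."""
        def __init__(self, step: float = 0.1, iters: int = 1000):
            self.step, self.iters = step, iters

        def minimize(self, f, x0):
            x = list(x0)
            for _ in range(self.iters):
                for i in range(len(x)):
                    for delta in (+self.step, -self.step):
                        trial = x.copy()
                        trial[i] += delta
                        if f(trial) < f(x):   # greedy axis-aligned move
                            x = trial
            return x

    quadratic = lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2
    print(CoordinateDescent().minimize(quadratic, [0.0, 0.0]))  # ~[3, -1]
    ```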

  13. Food Service Fundamentals.

    ERIC Educational Resources Information Center

    Marine Corps Inst., Washington, DC.

    Developed as part of the Marine Corps Institute (MCI) correspondence training program, this course on food service fundamentals is designed to provide a general background in the basic aspects of the food service program in the Marine Corps; it is adaptable for nonmilitary instruction. Introductory materials include specific information for MCI…

  14. Unification of Fundamental Forces

    NASA Astrophysics Data System (ADS)

    Salam, Abdus; Taylor, Foreword by John C.

    2005-10-01

    Foreword John C. Taylor; 1. Unification of fundamental forces Abdus Salam; 2. History unfolding: an introduction to the two 1968 lectures by W. Heisenberg and P. A. M. Dirac Abdus Salam; 3. Theory, criticism, and a philosophy Werner Heisenberg; 4. Methods in theoretical physics Paul Adrian Maurice Dirac.

  15. Reading Is Fundamental, 1977.

    ERIC Educational Resources Information Center

    Smithsonian Institution, Washington, DC. National Reading is Fun-damental Program.

    Reading Is Fundamental (RIF) is a national, nonprofit organization designed to motivate children to read by making a wide variety of inexpensive books available to them and allowing the children to choose and keep books that interest them. This annual report for 1977 contains the following information on the RIF project: an account of the…

  16. Fundamentals of soil science

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study guide provides comments and references for professional soil scientists who are studying for the soil science fundamentals exam needed as the first step for certification. The performance objectives were determined by the Soil Science Society of America's Council of Soil Science Examiners...

  17. Fundamentals of tribology

    SciTech Connect

    Suh, N.P.; Saka, N.

    1980-01-01

    This book presents the proceedings of the June 1978 International Conference on the Fundamentals of Tribology. The papers discuss the effects of surface topography and of the properties of materials on wear; friction, wear, and thermomechanical effects; wear mechanisms in metal processing; polymer wear; wear monitoring and prevention; and lubrication. (LCL)

  18. Fundamental research data base

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A fundamental research data base containing ground truth, image, and Badhwar profile feature data for 17 North Dakota, South Dakota, and Minnesota agricultural sites is described. Image data was provided for a minimum of four acquisition dates for each site and all four images were registered to one another.

  19. Laser Fundamentals and Experiments.

    ERIC Educational Resources Information Center

    Van Pelt, W. F.; And Others

    As a result of work performed at the Southwestern Radiological Health Laboratory with respect to lasers, this manual was prepared in response to the increasing use of lasers in high schools and colleges. It is directed primarily toward the high school instructor who may use the text for a short course in laser fundamentals. The definition of the…

  20. The Fundamental Property Relation.

    ERIC Educational Resources Information Center

    Martin, Joseph J.

    1983-01-01

    Discusses a basic equation in thermodynamics (the fundamental property relation), focusing on a logical approach to the development of the relation where effects other than thermal, compression, and exchange of matter with the surroundings are considered. Also demonstrates erroneous treatments of the relation in three well-known textbooks. (JN)

  1. Fundamentals of Library Instruction

    ERIC Educational Resources Information Center

    McAdoo, Monty L.

    2012-01-01

    Being a great teacher is part and parcel of being a great librarian. In this book, veteran instruction services librarian McAdoo lays out the fundamentals of the discipline in easily accessible language. Succinctly covering the topic from top to bottom, he: (1) Offers an overview of the historical context of library instruction, drawing on recent…

  2. Basic Publication Fundamentals.

    ERIC Educational Resources Information Center

    Savedge, Charles E., Ed.

    Designed for students who produce newspapers and newsmagazines in junior high, middle, and elementary schools, this booklet is both a scorebook and a fundamentals text. The scorebook provides realistic criteria for judging publication excellence at these educational levels. All the basics for good publications are included in the text of the…

  3. The 4th Thermodynamic Principle?

    SciTech Connect

    Montero Garcia, Jose de la Luz; Novoa Blanco, Jesus Francisco

    2007-04-28

    It should be emphasized that the 4th Principle formulated above is a thermodynamic principle and, at the same time, is quantum-mechanical and relativistic, as it inevitably should be; its absence has been one of the main theoretical limitations of physical theory until today. We show that the theoretical discovery of the Dimensional Primitive Octet of Matter, the 4th Thermodynamic Principle, the Quantum Hexet of Matter, the Global Hexagonal Subsystem of Fundamental Constants of Energy, and the Measurement or Connected Global Scale or Universal Existential Interval of Matter makes it possible to arrive at a global formulation of the four 'forces' or fundamental interactions of nature. Einstein's golden dream is possible.

  4. Fundamentals of Refrigeration.

    ERIC Educational Resources Information Center

    Sutliff, Ronald D.; And Others

    This self-study course is designed to familiarize Marine enlisted personnel with the principles of the refrigeration process. The course contains five study units. Each study unit begins with a general objective, which is a statement of what the student should learn from the unit. The study units are divided into numbered work units, each…

  5. Marine Electrician--Fundamentals.

    ERIC Educational Resources Information Center

    Sutliff, Ronald D.; And Others

    This self-study course is designed to familiarize Marine Corps enlisted personnel with the principles of electricity, safety, and tools. The course contains three study units. Each study unit begins with a general objective, which is a statement of what the student should learn from the unit. The study units are divided into numbered work units,…

  6. FUNDAMENTALS OF TELEVISION SYSTEMS.

    ERIC Educational Resources Information Center

    KESSLER, WILLIAM J.

    DESIGNED FOR A READER WITHOUT SPECIAL TECHNICAL KNOWLEDGE, THIS ILLUSTRATED RESOURCE PAPER EXPLAINS THE COMPONENTS OF A TELEVISION SYSTEM AND RELATES THEM TO THE COMPLETE SYSTEM. SUBJECTS DISCUSSED ARE THE FOLLOWING--STUDIO ORGANIZATION AND COMPATIBLE COLOR TELEVISION PRINCIPLES, WIRED AND RADIO TRANSMISSION SYSTEMS, DIRECT VIEW AND PROJECTION…

  7. Minimal change disease

    MedlinePlus

    Minimal change nephrotic syndrome; Nil disease; Lipoid nephrosis; Idiopathic nephrotic syndrome of childhood ... which filter blood and produce urine. In minimal change disease, there is damage to the glomeruli. These ...

  8. Minimal change disease

    MedlinePlus

    ... seen under a very powerful microscope called an electron microscope. Minimal change disease is the most common ... biopsy and examination of the tissue with an electron microscope can show signs of minimal change disease.

  9. Ecological Principles and Guidelines for Managing the Use of Land

    SciTech Connect

    Dale, Virginia H; Brown, Sandra; Haeuber, R A; Hobbs, N T; Huntly, N; Naiman, R J; Riebsame, W E; Turner, M G; Valone, T J

    2014-01-01

    The many ways that people have used and managed land throughout history have emerged as a primary cause of land-cover change around the world. Thus, land use and land management increasingly represent a fundamental source of change in the global environment. Despite their global importance, however, many decisions about the management and use of land are made with scant attention to ecological impacts. Thus, ecologists' knowledge of the functioning of Earth's ecosystems is needed to broaden the scientific basis of decisions on land use and management. In response to this need, the Ecological Society of America established a committee to examine the ways that land-use decisions are made and the ways that ecologists could help inform those decisions. This paper reports the scientific findings of that committee. Five principles of ecological science have particular implications for land use and can assure that fundamental processes of Earth's ecosystems are sustained. These ecological principles deal with time, species, place, disturbance, and the landscape. The recognition that ecological processes occur within a temporal setting and change over time is fundamental to analyzing the effects of land use. In addition, individual species and networks of interacting species have strong and far-reaching effects on ecological processes. Furthermore, each site or region has a unique set of organisms and abiotic conditions influencing and constraining ecological processes. Disturbances are important and ubiquitous ecological events whose effects may strongly influence population, community, and ecosystem dynamics. Finally, the size, shape, and spatial relationships of habitat patches on the landscape affect the structure and function of ecosystems. The responses of the land to changes in use and management by people depend on expressions of these fundamental principles in nature. These principles dictate several guidelines for land use. The guidelines give practical

  10. Fundamentals of Polarized Light

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael

    2003-01-01

    The analytical and numerical basis for describing scattering properties of media composed of small discrete particles is formed by the classical electromagnetic theory. Although there are several excellent textbooks outlining the fundamentals of this theory, it is convenient for our purposes to begin with a summary of those concepts and equations that are central to the subject of this book and will be used extensively in the following chapters. We start by formulating Maxwell's equations and constitutive relations for time- harmonic macroscopic electromagnetic fields and derive the simplest plane-wave solution that underlies the basic optical idea of a monochromatic parallel beam of light. This solution naturally leads to the introduction of such fundamental quantities as the refractive index and the Stokes parameters. Finally, we define the concept of a quasi-monochromatic beam of light and discuss its implications.
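
    Since the abstract introduces the Stokes parameters without stating them, the standard definitions are given below (one common convention; the signs of U and V vary between texts):

    ```latex
    % Stokes parameters of a transverse field (E_x, E_y):
    I = \langle |E_x|^2 \rangle + \langle |E_y|^2 \rangle, \qquad
    Q = \langle |E_x|^2 \rangle - \langle |E_y|^2 \rangle,
    U = 2\,\mathrm{Re}\,\langle E_x E_y^{*} \rangle, \qquad
    V = -2\,\mathrm{Im}\,\langle E_x E_y^{*} \rangle
    % with I^2 >= Q^2 + U^2 + V^2 (equality for fully polarized light).
    ```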

  11. Fundamentals of Geophysics

    NASA Astrophysics Data System (ADS)

    Frohlich, Cliff

    Choosing an intermediate-level geophysics text is always problematic: What should we teach students after they have had introductory courses in geology, math, and physics, but little else? Fundamentals of Geophysics is aimed specifically at these intermediate-level students, and the author's stated approach is to construct a text “using abundant diagrams, a simplified mathematical treatment, and equations in which the student can follow each derivation step-by-step.” Moreover, for Lowrie, the Earth is round, not flat—the “fundamentals of geophysics” here are the essential properties of our Earth the planet, rather than useful techniques for finding oil and minerals. Thus this book is comparable in both level and approach to C. M. R. Fowler's The Solid Earth (Cambridge University Press, 1990).

  12. Fundamental limits on EMC

    NASA Astrophysics Data System (ADS)

    Showers, R. M.; Lin, S.-Y.; Schulz, R. B.

    1981-02-01

    Both fundamental and state-of-the-art limits are treated with emphasis on the former. Fundamental limits result from both natural and man-made electromagnetic noise which then affect two basic ratios, signal-to-noise (S/N) and extraneous-input-to-noise (I/N). Tolerable S/N values are discussed for both digital and analog communications systems. These lead to tolerable signal-to-extraneous-input (S/I) ratios, again for digital and analog communications systems, as well as radar and sonar. State-of-the-art limits for transmitters include RF noise emission, spurious emissions, and intermodulation. Receiver limits include adjacent-channel interactions, image, IF, and other spurious responses, including cross modulation, intermodulation, and desensitization. Unintentional emitters and receivers are also discussed. Coupling limitations between undesired sources and receptors are considered from mechanisms including radiation, induction, and conduction.

  13. Fundamental studies in geodynamics

    NASA Technical Reports Server (NTRS)

    Anderson, D. L.; Hager, B. H.; Kanamori, H.

    1981-01-01

    Research in fundamental studies in geodynamics continued in a number of fields including seismic observations and analysis, synthesis of geochemical data, theoretical investigation of geoid anomalies, extensive numerical experiments in a number of geodynamical contexts, and a new field seismic volcanology. Summaries of work in progress or completed during this report period are given. Abstracts of publications submitted from work in progress during this report period are attached as an appendix.

  14. Free-energy minimization and the dark-room problem.

    PubMed

    Friston, Karl; Thornton, Christopher; Clark, Andy

    2012-01-01

    Recent years have seen the emergence of an important new fundamental theory of brain function. This theory brings information-theoretic, Bayesian, neuroscientific, and machine learning approaches into a single framework whose overarching principle is the minimization of surprise (or, equivalently, the maximization of expectation). The most comprehensive such treatment is the "free-energy minimization" formulation due to Karl Friston (see e.g., Friston and Stephan, 2007; Friston, 2010a,b - see also Fiorillo, 2010; Thornton, 2010). A recurrent puzzle raised by critics of these models is that biological systems do not seem to avoid surprises. We do not simply seek a dark, unchanging chamber, and stay there. This is the "Dark-Room Problem." Here, we describe the problem and further unpack the issues to which it speaks. Using the same format as the prolog of Eddington's Space, Time, and Gravitation (Eddington, 1920) we present our discussion as a conversation between: an information theorist (Thornton), a physicist (Friston), and a philosopher (Clark). PMID:22586414
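
    For readers who want the formula behind the words: the variational free energy discussed here is standardly written as below (a textbook form, not a quotation from the paper), which makes explicit why minimizing it minimizes an upper bound on surprise:

    ```latex
    % Variational free energy F for observations o, hidden states s, and an
    % approximate posterior q(s):
    F \;=\; \mathbb{E}_{q(s)}\!\left[\ln q(s) - \ln p(o, s)\right]
      \;=\; \underbrace{D_{\mathrm{KL}}\!\left[q(s) \,\|\, p(s \mid o)\right]}_{\ge 0}
            \;-\; \ln p(o)
    % Since the KL term is nonnegative, F >= -ln p(o): free energy
    % upper-bounds surprise, so minimizing F minimizes a bound on surprise.
    ```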

  15. Value of Fundamental Science

    NASA Astrophysics Data System (ADS)

    Burov, Alexey

    Fundamental science is a hard, long-term human adventure that has required high devotion and social support, especially significant in our epoch of Mega-science. The measure of this devotion and this support expresses the real value of the fundamental science in public opinion. Why does fundamental science have value? What determines its strength and what endangers it? The dominant answer is that the value of science arises out of curiosity and is supported by the technological progress. Is this really a good, astute answer? When trying to attract public support, we talk about the ``mystery of the universe''. Why do these words sound so attractive? What is implied by and what is incompatible with them? More than two centuries ago, Immanuel Kant asserted an inseparable entanglement between ethics and metaphysics. Thus, we may ask: which metaphysics supports the value of scientific cognition, and which does not? Should we continue to neglect the dependence of value of pure science on metaphysics? If not, how can this issue be addressed in the public outreach? Is the public alienated by one or another message coming from the face of science? What does it mean to be politically correct in this sort of discussion?

  16. [Principle of least action, physiology of vision, and conditioned reflexes theory].

    PubMed

    Shelepin, Iu E; Krasil'nikov, N N

    2003-06-01

    Variational principles such as Maupertuis's principle of least action (1740) and Fermat's principle (1660) are fundamental to physics. They establish a property by which the actual state of a system differs from all of its possible states. The variational approach makes it possible to derive the equations of motion and equilibrium of a material system from one common rule, which reduces to the search for the extrema of a function describing this property of the system. For optical systems, it is the time, and not the length of the path, that is crucial: according to Fermat's principle, light "chooses" from all possible paths connecting two points the one that requires the least time. The generality of variational principles guarantees their usefulness in investigations of brain function. Among the attempts to apply variational principles to psychology and linguistics, Zipf's principle of least effort stands out: Zipf (1949) demonstrated that languages and some artificial codes satisfy a least principle. In brain physiology, classical conditioned reflex theory is an ideal area for the application of variational principles; on this view, conditioning leads to finding an extremum during fixation of the temporal link. In vision, physiological investigations are difficult because the signal has many dimensions. For example, during perception of the spatial properties of the surrounding world, the visual system minimizes (reduces) the spatial-frequency spectrum of the scene, and the receptive fields provide optimal accumulation of the signal. In ontogenesis, the signal-to-noise ratio becomes optimal as the receptive fields minimize the internal noise spectrum. According to the theory of matched filtering, recognition in the visual system is carried out by minimizing the difference between the description of an image in the visual system and the template of that image stored in human memory. Variational principles help to discover the physical property of
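
    The two variational principles named in this abstract have the compact standard statements below (given for orientation; the paper applies them qualitatively):

    ```latex
    % Principle of least (stationary) action:
    \delta S = \delta \int_{t_1}^{t_2} L(q, \dot{q}, t)\, dt = 0
    % Fermat's principle: among all paths between two points, light follows
    % the one that extremizes (typically minimizes) the travel time, i.e.
    % the optical path length for refractive index n(r):
    \delta \int n(\mathbf{r})\, ds = 0, \qquad
    T = \frac{1}{c} \int n(\mathbf{r})\, ds
    ```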

  17. Quantum mechanics and the generalized uncertainty principle

    SciTech Connect

    Bang, Jang Young; Berger, Micheal S.

    2006-12-15

    The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.
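
    A commonly used form of the generalized uncertainty principle is shown below (one standard parametrization; the paper's discrete-eigenvalue model may differ in detail):

    ```latex
    % Generalized uncertainty principle (GUP), with a deformation parameter
    % beta motivated by quantum-gravity arguments:
    \Delta x\, \Delta p \;\ge\; \frac{\hbar}{2}
      \left(1 + \beta\, (\Delta p)^2\right)
    % Minimizing the right-hand side over \Delta p (at \Delta p = 1/\sqrt{\beta})
    % yields a minimal position uncertainty \Delta x_{\min} = \hbar\sqrt{\beta}.
    ```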

  18. Review of receptor model fundamentals

    NASA Astrophysics Data System (ADS)

    Henry, Ronald C.; Lewis, Charles W.; Hopke, Philip K.; Williamson, Hugh J.

    There are several broad classes of mathematical models used to apportion the aerosol measured at a receptor site to its likely sources. This paper surveys the two types applied in exercises for the Mathematical and Empirical Receptor Models Workshop (Quail Roost II): chemical mass balance models and multivariate models. The fundamental principles of each are reviewed. Also considered are the specific models available within each class. These include: tracer element, linear programming, ordinary linear least-squares, effective variance least-squares and ridge regression (all solutions to the chemical mass balance equation), and factor analysis, target transformation factor analysis, multiple linear regression and extended Q-mode factor analysis (all multivariate models). In practical application of chemical mass balance models, a frequent problem is the presence of two or more emission sources whose signatures are very similar. Several techniques to reduce the effects of such multicollinearity are discussed. The propagation of errors for source contribution estimates, another practical concern, also is given special attention.
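
    The chemical mass balance model described here is linear, so the simplest of the listed solution methods (ordinary least squares) fits in a few lines. A sketch with made-up numbers (hypothetical source profiles, not real data):

    ```python
    import numpy as np

    # Chemical mass balance (CMB): measured concentrations c_i of species i
    # at a receptor are modeled as a linear mix of source profiles a_ij
    # (fraction of species i in source j) times unknown contributions s_j.
    A = np.array([
        [0.20, 0.01],   # species 1: dominated by source 1
        [0.05, 0.30],   # species 2: dominated by source 2
        [0.10, 0.10],   # species 3: common to both
    ])
    s_true = np.array([5.0, 2.0])   # hypothetical true contributions
    c = A @ s_true + np.random.default_rng(0).normal(0, 0.01, 3)  # noisy data

    s_hat, *_ = np.linalg.lstsq(A, c, rcond=None)  # ordinary least squares
    print("estimated source contributions:", s_hat)
    ```

    Effective-variance weighting and the other listed methods refine this same linear solve when measurement uncertainties or collinear source profiles matter.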

  19. Fundamental experiments in velocimetry

    SciTech Connect

    Briggs, Matthew Ellsworth; Hull, Larry; Shinas, Michael

    2009-01-01

    One can understand what velocimetry does and does not measure by understanding a few fundamental experiments. Photon Doppler Velocimetry (PDV) is an interferometer that will produce fringe shifts when the length of one of the legs changes, so we might expect the fringes to change whenever the distance from the probe to the target changes. However, by making PDV measurements of tilted moving surfaces, we have shown that fringe shifts from diffuse surfaces are actually measured only from the changes caused by the component of velocity along the beam. This is an important simplification in the interpretation of PDV results, arising because surface roughness randomizes the scattered phases.

  20. Fundamental research data base

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A fundamental research data base was created on a single 9-track 1600 BPI tape containing ground truth, image, and Badhwar profile feature data for 17 North Dakota, South Dakota, and Minnesota agricultural sites. Each site is 5x6 nm in area. Image data has been provided for a minimum of four acquisition dates for each site. All four images have been registered to one another. A list of the order of the files on tape and the dates of acquisition is provided.

  1. Fundamentals of satellite navigation

    NASA Astrophysics Data System (ADS)

    Stiller, A. H.

    The basic operating principles and capabilities of conventional and satellite-based navigation systems for air, sea, and land vehicles are reviewed and illustrated with diagrams. Consideration is given to autonomous onboard systems; systems based on visible or radio beacons; the Transit, Cicada, Navstar-GPS, and Glonass satellite systems; the physical laws and parameters of satellite motion; the definition of time in satellite systems; and the content of the demodulated GPS data signal. The GPS and Glonass data format frames are presented graphically, and tables listing the GPS and Glonass satellites, their technical characteristics, and the (past or scheduled) launch dates are provided.
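
    The basic position fix behind such satellite systems reduces to solving for a receiver position and clock bias from pseudoranges. A minimal Gauss-Newton sketch with fabricated geometry (illustrative only, not any operational system's algorithm):

    ```python
    import numpy as np

    # Solve receiver position x and clock bias b (both in meters) from
    # pseudoranges rho_k = |x - p_k| + b via Gauss-Newton iteration.
    def position_fix(sats, rho, iters=10):
        x, b = np.zeros(3), 0.0            # crude initial guess
        for _ in range(iters):
            d = np.linalg.norm(sats - x, axis=1)
            residual = rho - (d + b)
            # Jacobian of the modeled pseudorange w.r.t. (x, b)
            J = np.hstack([(x - sats) / d[:, None], np.ones((len(rho), 1))])
            delta, *_ = np.linalg.lstsq(J, residual, rcond=None)
            x, b = x + delta[:3], b + delta[3]
        return x, b

    rng = np.random.default_rng(1)
    sats = rng.normal(size=(6, 3))          # made-up satellite directions
    sats = 2.66e7 * sats / np.linalg.norm(sats, axis=1, keepdims=True)
    x_true, b_true = np.array([1.1e6, -4.8e6, 3.9e6]), 150.0
    rho = np.linalg.norm(sats - x_true, axis=1) + b_true
    x_est, b_est = position_fix(sats, rho)
    print("position error [m]:", np.round(x_est - x_true, 6))
    print("clock-bias error [m]:", round(b_est - b_true, 6))
    ```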

  2. The Role of Fisher Information Theory in the Development of Fundamental Laws in Physical Chemistry

    ERIC Educational Resources Information Center

    Honig, J. M.

    2009-01-01

    The unifying principle that involves rendering the Fisher information measure an extremum is reviewed. It is shown that with this principle, in conjunction with appropriate constraints, a large number of fundamental laws can be derived from a common source in a unified manner. The resulting economy of thought pertaining to fundamental principles…

  3. Semi-analytical formulation of modal dispersion parameter of an optical fiber with Kerr nonlinearity and using a novel fundamental modal field approximation

    NASA Astrophysics Data System (ADS)

    Choudhury, Raja Roy; Choudhury, Arundhati Roy; Ghose, Mrinal Kanti

    2013-09-01

    To characterize nonlinear optical fiber, a semi-analytical formulation using the variational principle and the Nelder-Mead simplex method for nonlinear unconstrained minimization is proposed. The number of optimizing parameters has been increased in order to optimize the core parameter U, incorporating more flexibility into the formulation of an innovative form of the fundamental modal field. This formulation provides accurate analytical expressions for the modal dispersion parameter (g) of optical fiber with Kerr nonlinearity. The minimization of the core parameter (U), which involves Kerr nonlinearity through the nonstationary expression of the propagation constant, is carried out by the Nelder-Mead simplex method of nonlinear unconstrained minimization, which is suitable for problems with nonsmooth functions because it does not require any derivative information. This formulation carries less computational burden for the calculation of modal parameters than full numerical methods.
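
    The Nelder-Mead simplex method mentioned here is available off the shelf. A tiny sketch on a stand-in nonsmooth objective (the paper's actual core-parameter functional is not reproduced) shows why a derivative-free method fits:

    ```python
    from scipy.optimize import minimize

    # Nelder-Mead is derivative-free, which suits nonsmooth objectives.
    # Stand-in objective only: the paper's functional for the core
    # parameter U (with Kerr nonlinearity) is not reproduced here.
    def objective(p):
        a, b = p
        return (a - 1.3) ** 2 + abs(b - 0.7)   # |.| makes it nonsmooth

    result = minimize(objective, x0=[0.0, 0.0], method="Nelder-Mead",
                      options={"xatol": 1e-9, "fatol": 1e-9})
    print(result.x)   # expected to approach [1.3, 0.7]
    ```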

  4. Fundamentals of electrokinetics

    NASA Astrophysics Data System (ADS)

    Kozak, M. W.

    The study of electrokinetics is a very mature field. Experimental studies date from the early 1800s, and acceptable theoretical analyses have existed since the early 1900s. The use of electrokinetics in practical field problems is more recent, but it is still quite mature. Most developments in the fundamental understanding of electrokinetics are in the colloid science literature. A significant and increasing divergence between the theoretical understanding of electrokinetics found in the colloid science literature and the theoretical analyses used in interpreting applied experimental studies in soil science and waste remediation has developed. The soil science literature has to date restricted itself to the use of very early theories, with their associated limitations. The purpose of this contribution is to review fundamental aspects of electrokinetic phenomena from a colloid science viewpoint. It is hoped that a bridge can be built between the two branches of the literature, from which both will benefit. Attention is paid to special topics such as the effects of overlapping double layers, applications in unsaturated soils, the influence of dispersivity, and the differences between electrokinetic theory and conductivity theory.

  5. Fundamental Atomtronic Circuit Elements

    NASA Astrophysics Data System (ADS)

    Lee, Jeffrey; McIlvain, Brian; Lobb, Christopher; Hill, Wendell T., III

    2012-06-01

    Recent experiments with neutral superfluid gases have shown that it is possible to create atomtronic circuits analogous to existing superconducting circuits. The goals of these experiments are to create complex systems such as Josephson junctions. In addition, there are theoretical models for active atomtronic components analogous to diodes, transistors and oscillators. In order for any of these devices to function, an understanding of the more fundamental atomtronic elements is needed. Here we describe the first experimental realization of these more fundamental elements. We have created an atomtronic capacitor that is discharged through a resistance and inductance. We will discuss a theoretical description of the system that allows us to determine values for the capacitance, resistance and inductance. The resistance is shown to be analogous to the Sharvin resistance, and the inductance analogous to kinetic inductance in electronics. This atomtronic circuit is implemented with a thermal sample of laser cooled rubidium atoms. The atoms are confined using what we call free-space atom chips, a novel optical dipole trap produced using a generalized phase-contrast imaging technique. We will also discuss progress toward implementing this atomtronic system in a degenerate Bose gas.

  6. Controlling molecular transport in minimal emulsions

    PubMed Central

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of ‘minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions. PMID:26797564

  7. Controlling molecular transport in minimal emulsions

    NASA Astrophysics Data System (ADS)

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of `minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions.

  8. Controlling molecular transport in minimal emulsions.

    PubMed

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of 'minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions. PMID:26797564

  9. Analysis of lipid flow on minimal surfaces

    NASA Astrophysics Data System (ADS)

    Bahmani, Fatemeh; Christenson, Joel; Rangamani, Padmini

    2016-03-01

    Interaction between the bilayer shape and surface flow is important for capturing the flow of lipids in many biological membranes. Recent microscopy evidence has shown that minimal surfaces (planes, catenoids, and helicoids) occur often in cellular membranes. In this study, we explore lipid flow in these geometries using a `stream function' formulation for viscoelastic lipid bilayers. Using this formulation, we derive two-dimensional lipid flow equations for the commonly occurring minimal surfaces in lipid bilayers. We show that for three minimal surfaces (planes, catenoids, and helicoids), the surface flow equations satisfy Stokes flow equations. In helicoids and catenoids, we show that the tangential velocity field is a Killing vector field. Thus, our analysis provides fundamental insight into the flow patterns of lipids on intracellular organelle membranes that are characterized by fixed shapes reminiscent of minimal surfaces.
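
    In the flat (planar) case the stream-function formulation referred to here takes a familiar form, stated below for orientation; on catenoids and helicoids the derivatives generalize to surface operators:

    ```latex
    % Planar incompressible Stokes flow via a stream function \psi:
    u_x = \partial_y \psi, \qquad u_y = -\partial_x \psi
    % (continuity is satisfied automatically), and momentum balance
    % reduces to the biharmonic equation
    \nabla^4 \psi = 0
    % A tangential velocity field u that is a Killing field satisfies
    \nabla_a u_b + \nabla_b u_a = 0,
    % i.e. it generates an isometry and produces no in-plane shear.
    ```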

  10. Achieving sustainable plant disease management through evolutionary principles.

    PubMed

    Zhan, Jiasui; Thrall, Peter H; Burdon, Jeremy J

    2014-09-01

    Plants and their pathogens are engaged in continuous evolutionary battles and sustainable disease management requires novel systems to create environments conducive for short-term and long-term disease control. In this opinion article, we argue that knowledge of the fundamental factors that drive host-pathogen coevolution in wild systems can provide new insights into disease development in agriculture. Such evolutionary principles can be used to guide the formulation of sustainable disease management strategies which can minimize disease epidemics while simultaneously reducing pressure on pathogens to evolve increased infectivity and aggressiveness. To ensure agricultural sustainability, disease management programs that reflect the dynamism of pathogen population structure are essential and evolutionary biologists should play an increasing role in their design. PMID:24853471

  11. Unification of Fundamental Forces

    NASA Astrophysics Data System (ADS)

    Salam, Abdus

    1990-05-01

    This is an expanded version of the third Dirac Memorial Lecture, given in 1988 by the Nobel Laureate Abdus Salam. Salam's lecture presents an overview of the developments in modern particle physics from its inception at the turn of the century to the present theories seeking to unify all the fundamental forces. In addition, two previously unpublished lectures by Paul Dirac, and Werner Heisenberg are included. These lectures provide a fascinating insight into their approach to research and the developments in particle physics at that time. Nonspecialists, undergraduates and researchers will find this a fascinating book. It contains a clear introduction to the major themes of particle physics and cosmology by one of the most distinguished contemporary physicists.

  12. Wall of fundamental constants

    SciTech Connect

    Olive, Keith A.; Peloso, Marco; Uzan, Jean-Philippe

    2011-02-15

    We consider the signatures of a domain wall produced in the spontaneous symmetry breaking involving a dilatonlike scalar field coupled to electromagnetism. Domains on either side of the wall exhibit slight differences in their respective values of the fine-structure constant, {alpha}. If such a wall is present within our Hubble volume, absorption spectra at large redshifts may or may not provide a variation in {alpha} relative to the terrestrial value, depending on our relative position with respect to the wall. This wall could resolve the contradiction between claims of a variation of {alpha} based on Keck/Hires data and of the constancy of {alpha} based on Very Large Telescope data. We derive the properties of the wall and the parameters of the underlying microscopic model required to reproduce the possible spatial variation of {alpha}. We discuss the constraints on the existence of the low-energy domain wall and describe its observational implications concerning the variation of the fundamental constants.

  13. Fundamentals of battery dynamics

    NASA Astrophysics Data System (ADS)

    Jossen, Andreas

    Modern applications, such as wireless communication systems or hybrid electric vehicles, operate with high power fluctuations. For some applications, where the power frequencies are high (above some 10 or 100 Hz), it is possible to filter the high frequencies using passive components; yet this results in additional costs. In other applications, where the dynamic time constants are in the range of up to some seconds, filtering cannot be done, and batteries are hence operated under dynamic loads. But what happens under these dynamic operating conditions? This paper describes the fundamentals of the dynamic characteristics of batteries in a frequency range from some MHz down to the mHz range. As the dynamic behaviour depends on the actual state of charge (SOC) and the state of health (SOH), it is possible to gain information on the battery state by analysing the dynamic behaviour. High dynamic loads can influence the battery temperature, the battery performance, and the battery lifetime.
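
    Frequency-dependent battery behaviour of this kind is commonly summarized by an equivalent-circuit impedance. A toy sketch (made-up parameter values, not a fitted battery model):

    ```python
    import numpy as np

    # Toy equivalent circuit: series resistance R0 plus a parallel R1-C1
    # charge-transfer branch; Z(w) = R0 + R1 / (1 + j*w*R1*C1).
    R0, R1, C1 = 0.02, 0.05, 10.0          # ohms, ohms, farads (made up)
    f = np.logspace(-3, 6, 10)              # mHz ... MHz, as in the abstract
    w = 2 * np.pi * f
    Z = R0 + R1 / (1 + 1j * w * R1 * C1)
    for fi, zi in zip(f, Z):
        print(f"{fi:10.3e} Hz  |Z| = {abs(zi):.4f} ohm  "
              f"phase = {np.degrees(np.angle(zi)):+6.2f} deg")
    ```

    At low frequency the impedance approaches R0 + R1; at high frequency the capacitor shorts the branch and only R0 remains, which is the qualitative SOC/SOH-dependent signature the paper exploits.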

  14. Fundamentals of zoological scaling

    NASA Astrophysics Data System (ADS)

    Lin, Herbert

    1982-01-01

    Most introductory physics courses emphasize highly idealized problems with unique well-defined answers. Though many textbooks complement these problems with estimation problems, few books present anything more than an elementary discussion of scaling. This paper presents some fundamentals of scaling in the zoological domain—a domain complex by any standard, but one also well suited to illustrate the power of very simple physical ideas. We consider the following animal characteristics: skeletal weight, speed of running, height and range of jumping, food consumption, heart rate, lifetime, locomotive efficiency, frequency of wing flapping, and maximum sizes of animals that fly and hover. These relationships are compared to zoological data and everyday experience, and match reasonably well.
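
    Typical relationships of the kind surveyed here are the classic allometric scalings below (standard empirical results, quoted for orientation; the paper's simple geometric arguments may yield slightly different exponents):

    ```latex
    % Classic allometric scalings with body mass M:
    P_{\text{metabolic}} \propto M^{3/4}, \qquad
    f_{\text{heart}} \propto M^{-1/4}, \qquad
    \tau_{\text{life}} \propto M^{1/4}
    % so the number of heartbeats per lifetime,
    % f_{\text{heart}} \cdot \tau_{\text{life}}, is roughly mass-independent.
    ```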

  15. Fundamentals of gel dosimeters

    NASA Astrophysics Data System (ADS)

    McAuley, K. B.; Nasr, A. T.

    2013-06-01

    Fundamental chemical and physical phenomena that occur in Fricke gel dosimeters, polymer gel dosimeters, micelle gel dosimeters and genipin gel dosimeters are discussed. Fricke gel dosimeters are effective even though their radiation sensitivity depends on oxygen concentration. Oxygen contamination can cause severe problems in polymer gel dosimeters, even when THPC is used. Oxygen leakage must be prevented between manufacturing and irradiation of polymer gels, and internal calibration methods should be used so that contamination problems can be detected. Micelle gel dosimeters are promising due to their favourable diffusion properties. The introduction of micelles to gel dosimetry may open up new areas of dosimetry research wherein a range of water-insoluble radiochromic materials can be explored as reporter molecules.

  16. Fundamentals of plasma simulation

    SciTech Connect

    Forslund, D.W.

    1985-01-01

    With the increasing size and speed of modern computers, the incredibly complex nonlinear properties of plasmas in the laboratory and in space are being successfully explored in increasing depth. Of particular importance have been numerical simulation techniques involving finite size particles on a discrete mesh. After discussing the importance of this means of understanding a variety of nonlinear plasma phenomena, we describe the basic elements of particle-in-cell simulation and their limitations and advantages. The differencing techniques, stability and accuracy issues, data management and optimization issues are discussed by means of a simple example of a particle-in-cell code. Recent advances in simulation methods allowing large space and time scales to be treated with minimal sacrifice in physics are reviewed. Various examples of nonlinear processes successfully studied by plasma simulation will be given.

  17. DOE Fundamentals Handbook: Instrumentation and Control, Volume 1

    SciTech Connect

    Not Available

    1992-06-01

    The Instrumentation and Control Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of instrumentation and control systems. The handbook includes information on temperature, pressure, flow, and level detection systems; position indication systems; process control systems; and radiation detection principles. This information will provide personnel with an understanding of the basic operation of various types of DOE nuclear facility instrumentation and control systems.

  18. Complementary Huygens Principle for Geometrical and Nongeometrical Optics

    ERIC Educational Resources Information Center

    Luis, Alfredo

    2007-01-01

    We develop a fundamental principle depicting the generalized ray formulation of optics provided by the Wigner function. This principle is formally identical to the Huygens-Fresnel principle but in terms of opposite concepts, rays instead of waves, and incoherent superpositions instead of coherent ones. This ray picture naturally includes…

  19. A systems approach to theoretical fluid mechanics: Fundamentals

    NASA Technical Reports Server (NTRS)

    Anyiwo, J. C.

    1978-01-01

    A preliminary application of the underlying principles of the investigator's general system theory to the description and analyses of the fluid flow system is presented. An attempt is made to establish practical models, or elements of the general fluid flow system from the point of view of the general system theory fundamental principles. Results obtained are applied to a simple experimental fluid flow system, as test case, with particular emphasis on the understanding of fluid flow instability, transition and turbulence.

  20. Fundamentals of Atmospheric Radiation

    NASA Astrophysics Data System (ADS)

    Bohren, Craig F.; Clothiaux, Eugene E.

    2006-02-01

    This textbook fills a gap in the literature for teaching material suitable for students of atmospheric science and courses on atmospheric radiation. It covers the fundamentals of emission, absorption, and scattering of electromagnetic radiation from ultraviolet to infrared and beyond. Much of the book applies to planetary atmospheres. The authors are physicists and teach at the largest meteorology department of the US at Penn State. Craig F. Bohren has taught the atmospheric radiation course there for the past 20 years with no book. Eugene Clothiaux has taken over and added to the course notes. Problems given in the text come from students, colleagues, and correspondents. The figures were designed especially for this book to ease comprehension. Discussions have a graded approach with a thorough treatment of subjects, such as single scattering by particles, at different levels of complexity. The discussion of multiple scattering theory begins with piles of plates. This simple theory introduces concepts found in more advanced theories, i.e. optical thickness, single-scattering albedo, asymmetry parameter. The more complicated theory, the two-stream theory, then takes the reader beyond the pile-of-plates theory. Ideal for advanced undergraduate and graduate students of atmospheric science.
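
    The pile-of-plates idea mentioned in this description composes layers by summing inter-reflections. A small sketch using the standard two-layer adding relations (illustrative plate values, not taken from the book):

    ```python
    # Combine two layers with reflectances/transmittances (Ra, Ta) and
    # (Rb, Tb), accounting for the geometric series of inter-reflections.
    def add_layers(Ra, Ta, Rb, Tb):
        denom = 1.0 - Ra * Rb                 # sum of multiple bounces
        return Ra + Ta**2 * Rb / denom, Ta * Tb / denom   # (R_ab, T_ab)

    r, t = 0.04, 0.96            # one nonabsorbing glass-like plate (made up)
    R, T = r, t
    for n in range(2, 11):       # stack up to 10 plates
        R, T = add_layers(R, T, r, t)
        print(f"{n:2d} plates: R = {R:.4f}, T = {T:.4f}, R+T = {R + T:.4f}")
    ```

    For nonabsorbing plates R + T stays 1 while T steadily drops, illustrating how optical thickness emerges from stacking.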

  1. Overlay accuracy fundamentals

    NASA Astrophysics Data System (ADS)

    Kandel, Daniel; Levinski, Vladimir; Sapiens, Noam; Cohen, Guy; Amit, Eran; Klein, Dana; Vakshtein, Irina

    2012-03-01

    Currently, the performance of overlay metrology is evaluated mainly based on random error contributions such as precision and TIS variability. With the expected shrinkage of the overlay metrology budget to < 0.5nm, it becomes crucial to include also systematic error contributions which affect the accuracy of the metrology. Here we discuss fundamental aspects of overlay accuracy and a methodology to improve accuracy significantly. We identify overlay mark imperfections and their interaction with the metrology technology, as the main source of overlay inaccuracy. The most important type of mark imperfection is mark asymmetry. Overlay mark asymmetry leads to a geometrical ambiguity in the definition of overlay, which can be ~1nm or less. It is shown theoretically and in simulations that the metrology may enhance the effect of overlay mark asymmetry significantly and lead to metrology inaccuracy ~10nm, much larger than the geometrical ambiguity. The analysis is carried out for two different overlay metrology technologies: Imaging overlay and DBO (1st order diffraction based overlay). It is demonstrated that the sensitivity of DBO to overlay mark asymmetry is larger than the sensitivity of imaging overlay. Finally, we show that a recently developed measurement quality metric serves as a valuable tool for improving overlay metrology accuracy. Simulation results demonstrate that the accuracy of imaging overlay can be improved significantly by recipe setup optimized using the quality metric. We conclude that imaging overlay metrology, complemented by appropriate use of measurement quality metric, results in optimal overlay accuracy.

  2. Minimally Invasive Valve Surgery

    PubMed Central

    Pope, Nicolas H.; Ailawadi, Gorav

    2014-01-01

    Cardiac valve surgery is life saving for many patients. The advent of minimally invasive surgical techniques has historically allowed for improvement in both post-operative convalescence and important clinical outcomes. The development of minimally invasive cardiac valve repair and replacement surgery over the past decade is poised to revolutionize the care of cardiac valve patients. Here, we present a review of the history and current trends in minimally invasive aortic and mitral valve repair and replacement, including the development of sutureless bioprosthetic valves. PMID:24797148

  3. Fundamentals of neurogastroenterology

    PubMed Central

    Wood, J; Alpers, D; Andrews, P

    1999-01-01

    Current concepts and basic principles of neurogastroenterology in relation to functional gastrointestinal disorders are reviewed. Neurogastroenterology is emphasized as a new and advancing subspecialty of clinical gastroenterology and digestive science. As such, it embraces the investigative sciences dealing with functions, malfunctions, and malformations in the brain and spinal cord, and the sympathetic, parasympathetic and enteric divisions of the autonomic innervation of the digestive tract. Somatomotor systems are included insofar as pharyngeal phases of swallowing and pelvic floor involvement in defecation, continence, and pelvic pain are concerned. Inclusion of basic physiology of smooth muscle, mucosal epithelium, and the enteric immune system in the neurogastroenterologic domain relates to requirements for compatibility with neural control mechanisms. Psychologic and psychiatric relations to functional gastrointestinal disorders are included because they are significant components of neurogastroenterology, especially in relation to projections of discomfort and pain to the digestive tract.


Keywords: enteric nervous system; brain-gut axis; autonomic nervous system; nausea; gut motility; mast cells; gastrointestinal pain; Rome II PMID:10457039

  4. Role of Fundamental Physics in Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Turyshev, Slava

    2004-01-01

    This talk will discuss the critical role that fundamental physics research plays for the human space exploration. In particular, the currently available technologies can already provide significant radiation reduction, minimize bone loss, increase crew productivity and, thus, uniquely contribute to overall mission success. I will discuss how fundamental physics research and emerging technologies may not only further reduce the risks of space travel, but also increase the crew mobility, enhance safety and increase the value of space exploration in the near future.

  5. System level electrochemical principles

    NASA Technical Reports Server (NTRS)

    Thaller, L. H.

    1985-01-01

    The traditional electrochemical storage concepts are difficult to translate into high-power, high-voltage multikilowatt storage systems. Battery technology has adopted the increased use of electronics and of electrochemical couples that minimize the difficulties associated with corrective measures to reduce cell-to-cell capacity dispersion. Actively cooled bipolar concepts, which represent attractive alternative system concepts, are described. They are projected to have higher energy densities and lower volumes than current concepts, should be easier to scale from one capacity to another, and should maintain a closer cell-to-cell capacity balance. These newer storage system concepts are easier to manage since they are designed as a fully integrated battery. These ideas are referred to as system-level electrochemistry. The hydrogen-oxygen regenerative fuel cell (RFC) is probably the best example of the integrated use of these principles.

  6. Fundamentals and Techniques of Nonimaging

    SciTech Connect

    O'Gallagher, J. J.; Winston, R.

    2003-07-10

    This is the final report describing a long term basic research program in nonimaging optics that has led to major advances in important areas, including solar energy, fiber optics, illumination techniques, light detectors, and a great many other applications. The term ''nonimaging optics'' refers to the optics of extended sources in systems for which image forming is not important, but effective and efficient collection, concentration, transport, and distribution of light energy is. Although some of the most widely known developments of the early concepts have been in the field of solar energy, a broad variety of other uses have emerged. Most important, under the auspices of this program in fundamental research in nonimaging optics established at the University of Chicago with support from the Office of Basic Energy Sciences at the Department of Energy, the field has become very dynamic, with new ideas and concepts continuing to develop, while applications of the early concepts continue to be pursued. While the subject began as part of classical geometrical optics, it has been extended subsequently to the wave optics domain. Particularly relevant to potential new research directions are recent developments in the formalism of statistical and wave optics, which may be important in understanding energy transport on the nanoscale. Nonimaging optics permits the design of optical systems that achieve the maximum possible concentration allowed by physical conservation laws. The earliest designs were constructed by optimizing the collection of the extreme rays from a source to the desired target: the so-called ''edge-ray'' principle. Later, new concentrator types were generated by placing reflectors along the flow lines of the ''vector flux'' emanating from lambertian emitters in various geometries. A few years ago, a new development occurred with the discovery that making the design edge-ray a functional of some other system parameter permits the construction of whole
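
    The "maximum possible concentration allowed by physical conservation laws" mentioned here has a compact standard statement, the sine law of concentration, which edge-ray designs such as the CPC approach:

    ```latex
    % Sine law of concentration (from conservation of etendue), for
    % acceptance half-angle \theta_a and exit medium of refractive index n:
    C_{\max}^{\mathrm{2D}} = \frac{n}{\sin\theta_a}, \qquad
    C_{\max}^{\mathrm{3D}} = \left(\frac{n}{\sin\theta_a}\right)^{2}
    ```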

  7. Quantum measurements and Landauer's principle

    NASA Astrophysics Data System (ADS)

    Shevchenko, V.

    2015-05-01

    Information processing systems must obey the laws of physics. One particular example of this general statement is known as Landauer's principle: irreversible operations (such as erasure) performed by any computing device at finite temperature have to dissipate an amount of heat bounded from below. Together with other results of this kind, Landauer's principle represents a fundamental limit that any modern or future computer must obey. We discuss the interpretation of the physics behind Landauer's principle using a model of an Unruh-DeWitt detector. Of particular interest is the validity of this limit in the quantum domain. We systematically study finite-time effects. It is shown, in particular, that in the high-temperature limit the finiteness of the measurement time leads to a renormalization of the detector's temperature.
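
    The lower bound referred to here is standardly quantified as follows (textbook statement):

    ```latex
    % Landauer's bound: erasing one bit at temperature T dissipates at least
    E_{\text{diss}} \;\ge\; k_B T \ln 2
    % (about 2.9e-21 J per bit at T = 300 K).
    ```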

  8. Revisiting Tversky's diagnosticity principle.

    PubMed

    Evers, Ellen R K; Lakens, Daniël

    2014-01-01

    Similarity is a fundamental concept in cognition. In 1977, Amos Tversky published a highly influential feature-based model of how people judge the similarity between objects. The model highlights the context-dependence of similarity judgments, and challenged geometric models of similarity. One of the context-dependent effects Tversky describes is the diagnosticity principle. The diagnosticity principle determines which features are used to cluster multiple objects into subgroups. Perceived similarity between items within clusters is expected to increase, while similarity between items in different clusters decreases. Here, we present two pre-registered replications of the studies on the diagnosticity effect reported in Tversky (1977). Additionally, one alternative mechanism that has been proposed to play a role in the original studies, an increase in the choice for distractor items (a substitution effect, see Medin et al., 1995), is examined. Our results replicate those found by Tversky (1977), revealing an average diagnosticity-effect of 4.75%. However, when we eliminate the possibility of substitution effects confounding the results, a meta-analysis of the data provides no indication of any remaining effect of diagnosticity. PMID:25161638

  9. Revisiting Tversky's diagnosticity principle

    PubMed Central

    Evers, Ellen R. K.; Lakens, Daniël

    2013-01-01

    Similarity is a fundamental concept in cognition. In 1977, Amos Tversky published a highly influential feature-based model of how people judge the similarity between objects. The model highlights the context-dependence of similarity judgments, and challenged geometric models of similarity. One of the context-dependent effects Tversky describes is the diagnosticity principle. The diagnosticity principle determines which features are used to cluster multiple objects into subgroups. Perceived similarity between items within clusters is expected to increase, while similarity between items in different clusters decreases. Here, we present two pre-registered replications of the studies on the diagnosticity effect reported in Tversky (1977). Additionally, one alternative mechanism that has been proposed to play a role in the original studies, an increase in the choice for distractor items (a substitution effect, see Medin et al., 1995), is examined. Our results replicate those found by Tversky (1977), revealing an average diagnosticity-effect of 4.75%. However, when we eliminate the possibility of substitution effects confounding the results, a meta-analysis of the data provides no indication of any remaining effect of diagnosticity. PMID:25161638

  10. Prostate resection - minimally invasive

    MedlinePlus

    ... are: Erection problems (impotence) No symptom improvement Passing semen back into your bladder instead of out through ... Whelan JP, Goeree L. Systematic review and meta-analysis of transurethral resection of the prostate versus minimally ...

  11. Minimizing Shortness of Breath

    MedlinePlus


  12. Minimally invasive hip replacement

    MedlinePlus

    ... Smits SA, Swinford RR, Bahamonde RE. A randomized, prospective study of 3 minimally invasive surgical approaches in total hip arthroplasty: comprehensive gait analysis. J Arthroplasty . 2008;23:68-73. PMID: 18722305 ...

  13. Minimal Orderings Revisited

    SciTech Connect

    Peyton, B.W.

    1999-07-01

    When minimum orderings proved too difficult to deal with, Rose, Tarjan, and Lueker instead studied minimal orderings and how to compute them (Algorithmic aspects of vertex elimination on graphs, SIAM J. Comput., 5:266-283, 1976). This paper introduces an algorithm that is capable of computing much better minimal orderings much more efficiently than the algorithm in Rose et al. The new insight is a way to use certain structures and concepts from modern sparse Cholesky solvers to re-express one of the basic results in Rose et al. The new algorithm begins with any initial ordering and then refines it until a minimal ordering is obtained. It is simple to obtain high-quality, low-cost minimal orderings by using fill-reducing heuristic orderings as initial orderings for the algorithm. We examine several such initial orderings in some detail.
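
    The quantity at stake here is fill-in: the edges added when the vertices of a sparse matrix graph are eliminated in a given order, and a minimal ordering is one whose fill cannot be shrunk by removing any added edge. The sketch below only plays the classic "elimination game" to count the fill an ordering produces; it illustrates the notion being minimized, not the Rose-Tarjan-Lueker algorithm or the refinement scheme of this paper:

        def fill_in(adj, order):
            """Count fill edges created by eliminating vertices in 'order'.
            adj: dict mapping each vertex to the set of its neighbors."""
            g = {v: set(nbrs) for v, nbrs in adj.items()}
            eliminated, fill = set(), 0
            for v in order:
                live = [u for u in g[v] if u not in eliminated]
                # Eliminating v makes its remaining neighbors pairwise adjacent.
                for i, a in enumerate(live):
                    for b in live[i + 1:]:
                        if b not in g[a]:
                            g[a].add(b)
                            g[b].add(a)
                            fill += 1
                eliminated.add(v)
            return fill

        # On the 4-cycle a-b-c-d, eliminating 'a' first adds the chord b-d:
        # one fill edge. On large graphs, different orders differ enormously,
        # which is why fill-reducing initial orderings matter.
        cycle = {"a": {"b", "d"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c", "a"}}
        print(fill_in(cycle, ["a", "b", "c", "d"]))  # 1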

  14. Minimalism. Clip and Save.

    ERIC Educational Resources Information Center

    Hubbard, Guy

    2002-01-01

    Provides background information on the art movement called "Minimalism" discussing why it started and its characteristics. Includes learning activities and information on the artist, Donald Judd. Includes a reproduction of one of his art works and discusses its content. (CMK)

  15. Minimum Principles in Motor Control.

    PubMed

    Engelbrecht, Sascha E.

    2001-06-01

    Minimum (or minimal) principles are mathematical laws that were first used in physics: Hamilton's principle and Fermat's principle of least time are two famous examples. In the past decade, a number of motor control theories have been proposed that are formally of the same kind as the minimum principles of physics, and some of these have been quite successful at predicting motor performance in a variety of tasks. The present paper provides a comprehensive review of this work. Particular attention is given to the relation between minimum theories in motor control and those used in other disciplines. Other issues around which the review is organized include: (1) the relation between minimum principles and structural models of motor planning and motor control, (2) the empirically-driven development of minimum principles and the danger of circular theorizing, and (3) the design of critical tests for minimum theories. Some perspectives for future research are discussed in the concluding section of the paper. Copyright 2001 Academic Press. PMID:11401453
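
    A concrete instance of the kind of theory surveyed here is the minimum-jerk model of reaching (Flash and Hogan, 1985), which predicts that point-to-point hand trajectories minimize the integrated squared third derivative of position. Its closed-form solution is standard and is sketched below for illustration; it is one example of the class, not the review's own model:

        def minimum_jerk(x0: float, xf: float, t: float, duration: float) -> float:
            """Minimum-jerk position profile for a point-to-point movement:
            x(t) = x0 + (xf - x0) * (10 tau^3 - 15 tau^4 + 6 tau^5), tau = t/T.
            Velocity and acceleration vanish at both endpoints."""
            tau = t / duration
            return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

        # Midpoint of a 10 cm reach lasting 0.5 s: exactly halfway, at peak speed.
        print(minimum_jerk(0.0, 10.0, 0.25, 0.5))  # 5.0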

  16. Solar astrophysical fundamental parameters

    NASA Astrophysics Data System (ADS)

    Meftah, M.; Irbah, A.; Hauchecorne, A.

    2014-08-01

    The accurate determination of the solar photospheric radius has been an important problem in astronomy for many centuries. From the measurements made by the PICARD spacecraft during the transit of Venus in 2012, we obtained a solar radius of 696,156±145 kilometres. This value is consistent with recent measurements carried out outside the atmosphere. This observation leads us to propose a change of the canonical value obtained by Arthur Auwers in 1891. An accurate value for total solar irradiance (TSI) is crucial for the Sun-Earth connection, and represents another solar astrophysical fundamental parameter. Based on measurements collected from different space instruments over the past 35 years, the absolute value of the TSI, representative of a quiet Sun, has gradually decreased from 1,371 W/m2 in 1978 to around 1,362 W/m2 in 2013, mainly due to radiometer calibration differences. Based on the PICARD data and in agreement with Total Irradiance Monitor measurements, we predicted the TSI input at the top of the Earth's atmosphere at a distance of one astronomical unit (149,597,870 kilometres) from the Sun to be 1,362±2.4 W/m2, which may be proposed as a reference value. To conclude, from the measurements made by the PICARD spacecraft, we obtained a solar photospheric equator-to-pole radius difference value of 5.9±0.5 kilometres. This value is consistent with measurements made by different space instruments, and can be given as a reference value.
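
    A reference TSI value translates directly into a total solar luminosity via L = 4 * pi * d^2 * S at d = 1 AU, a useful consistency check on the numbers quoted above (the short script below is illustrative, not part of the PICARD analysis):

        import math

        AU_M = 1.495978707e11  # astronomical unit, m

        def solar_luminosity(tsi_w_m2: float) -> float:
            """Total solar luminosity implied by the irradiance measured at 1 AU."""
            return 4.0 * math.pi * AU_M**2 * tsi_w_m2

        # The 1,362 W/m2 reference value implies L of about 3.83e26 W.
        print(solar_luminosity(1362.0))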

  17. Li-O2 Kinetic Overpotentials: Tafel Plots from Experiment and First-Principles Theory.

    PubMed

    Viswanathan, V; Nørskov, J K; Speidel, A; Scheffler, R; Gowda, S; Luntz, A C

    2013-02-21

    We report the current dependence of the fundamental kinetic overpotentials for Li-O2 discharge and charge (Tafel plots) that define the optimal cycle efficiency in a Li-air battery. Comparison of the unusual experimental Tafel plots obtained in a bulk electrolysis cell with those obtained by first-principles theory is semiquantitative. The kinetic overpotentials for any practical current density are very small, considerably less than polarization losses due to iR drops from the cell impedance in Li-O2 batteries. If only the kinetic overpotentials were present, then a discharge-charge voltaic cycle efficiency of ∼85% should be possible at ∼10 mA/cm(2) superficial current density in a battery of ∼0.1 m(2) total cathode area. We therefore suggest that minimizing the cell impedance is a more important problem than minimizing the kinetic overpotentials to develop higher current Li-air batteries. PMID:26281865
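
    A Tafel plot is the overpotential plotted against the logarithm of current density. In the simplest single-electron Butler-Volmer picture the slope is b = 2.303*R*T/(alpha*F), so eta = b*log10(i/i0). The sketch below uses this generic form with invented parameter values; the paper's point is precisely that the measured Li-O2 Tafel behavior is unusual, so this is background, not their data:

        import math

        R, F = 8.314, 96485.0  # gas constant (J/mol/K), Faraday constant (C/mol)

        def tafel_overpotential(i: float, i0: float, alpha: float = 0.5,
                                temp_k: float = 298.15) -> float:
            """Kinetic overpotential (V) in the Tafel approximation,
            eta = b * log10(i/i0) with b = 2.303 R T / (alpha F)."""
            b = 2.303 * R * temp_k / (alpha * F)  # ~0.118 V/decade at alpha = 0.5
            return b * math.log10(i / i0)

        # Illustration: i = 10 mA/cm2 with an assumed i0 = 1 mA/cm2 costs
        # about one Tafel slope (~0.12 V) of kinetic overpotential.
        print(tafel_overpotential(10e-3, 1e-3))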

  18. Fundamentals of phosphate transfer.

    PubMed

    Kirby, Anthony J; Nome, Faruk

    2015-07-21

    Historically, the chemistry of phosphate transfer-a class of reactions fundamental to the chemistry of Life-has been discussed almost exclusively in terms of the nucleophile and the leaving group. Reactivity always depends significantly on both factors; but recent results for reactions of phosphate triesters have shown that it can also depend strongly on the nature of the nonleaving or "spectator" groups. The extreme stabilities of fully ionised mono- and dialkyl phosphate esters can be seen as extensions of the same effect, with one or two triester OR groups replaced by O(-). Our chosen lead reaction is hydrolysis-phosphate transfer to water: because water is the medium in which biological chemistry takes place; because the half-life of a system in water is an accepted basic index of stability; and because the typical mechanisms of hydrolysis, with solvent H2O providing specific molecules to act as nucleophiles and as general acids or bases, are models for reactions involving better nucleophiles and stronger general species catalysts, not least those available in enzyme active sites. Alkyl monoester dianions compete with alkyl diester monoanions for the slowest estimated rates of spontaneous hydrolysis. High stability at physiological pH is a vital factor in the biological roles of organic phosphates, but a significant limitation for experimental investigations. Almost all kinetic measurements of phosphate transfer reactions involving mono- and diesters have been followed by UV-visible spectroscopy using activated systems, conveniently compounds with good leaving groups. (A "good leaving group" OR* is electron-withdrawing, and can be displaced to generate an anion R*O(-) in water near pH 7.) Reactivities at normal temperatures of P-O-alkyl derivatives-better models for typical biological substrates-have typically had to be estimated: by extended extrapolation from linear free energy relationships, or from rate measurements at high temperatures. Calculation is free
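
    The high-temperature route mentioned at the end is usually an Arrhenius extrapolation: measure k at elevated temperatures, fit ln k against 1/T, and project down to 25 degrees C, where the half-life follows as t_1/2 = ln 2 / k. A minimal sketch with invented illustrative numbers (not data from this account):

        import math

        R = 8.314  # gas constant, J/mol/K

        def extrapolate_rate(k_ref: float, t_ref_k: float, t_target_k: float,
                             ea_j_mol: float) -> float:
            """Arrhenius extrapolation: k(T2) = k(T1)*exp(-(Ea/R)*(1/T2 - 1/T1))."""
            return k_ref * math.exp(-(ea_j_mol / R) * (1.0 / t_target_k - 1.0 / t_ref_k))

        # Hypothetical ester: k = 1e-6 s^-1 at 120 C, assumed Ea = 100 kJ/mol.
        k25 = extrapolate_rate(1e-6, 393.15, 298.15, 100e3)
        half_life_years = math.log(2) / k25 / (3600 * 24 * 365.25)
        print(k25, half_life_years)  # ~6e-11 s^-1 -> a half-life of centuries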

  19. Fundamentals of Space Medicine

    NASA Astrophysics Data System (ADS)

    Clément, Gilles

    2005-03-01

    A total of more than 240 human space flights have been completed to date, involving about 450 astronauts from various countries, for a combined total presence in space of more than 70 years. The seventh long-duration expedition crew is currently in residence aboard the International Space Station, continuing a permanent presence in space that began in October 2000. During that time, investigations have been conducted on both humans and animal models to study bone demineralization and muscle deconditioning, space motion sickness, the causes and possible treatment of postflight orthostatic intolerance, the changes in immune function, crew and crew-ground interactions, and the medical issues of living in a space environment, such as the effects of radiation or the risk of developing kidney stones. Some results of these investigations have led to fundamental discoveries about the adaptation of the human body to the space environment. Gilles Clément has been active in this research. This readable text presents the findings from the life science experiments conducted during and after space missions. Topics discussed in this book include: adaptation of sensory-motor, cardio-vascular, bone, and muscle systems to the microgravity of spaceflight; psychological and sociological issues of living in a confined, isolated, and stressful environment; operational space medicine, such as crew selection, training and in-flight health monitoring, countermeasures and support; results of space biology experiments on individual cells, plants, and animal models; and the impact of long-duration missions such as the human mission to Mars. The author also provides a detailed description of how to fly a space experiment, based on his own experience with research projects conducted onboard Salyut-7, Mir, Spacelab, and the Space Shuttle. Now is the time to look at the future of human spaceflight and what comes next. The future human exploration of Mars captures the imagination of both the

  20. Fundamentals of Space Medicine

    NASA Astrophysics Data System (ADS)

    Clément, G.

    2003-10-01

    As of today, a total of more than 240 human space flights have been completed, involving about 450 astronauts from various countries, for a combined total presence in space of more than 70 years. The seventh long-duration expedition crew is currently in residence aboard the International Space Station, continuing a permanent presence in space that began in October 2000. During that time, investigations have been conducted on both humans and animal models to study bone demineralization and muscle deconditioning, space motion sickness, the causes and possible treatment of postflight orthostatic intolerance, the changes in immune function, crew and crew-ground interactions, and the medical issues of living in a space environment, such as the effects of radiation or the risk of developing kidney stones. Some results of these investigations have led to fundamental discoveries about the adaptation of the human body to the space environment. Gilles Clément has been active in this research. This book presents, in readable form, the findings from the life science experiments conducted during and after space missions. Topics discussed in this book include: adaptation of sensory-motor, cardiovascular, bone and muscle systems to the microgravity of spaceflight; psychological and sociological issues of living in a confined, isolated and stressful environment; operational space medicine, such as crew selection, training and in-flight health monitoring, countermeasures and support; results of space biology experiments on individual cells, plants, and animal models; and the impact of long-duration missions such as the human mission to Mars. The author also provides a detailed description of how to fly a space experiment, based on his own experience with research projects conducted onboard Salyut-7, Mir, Spacelab, and the Space Shuttle. Now is the time to look at the future of human spaceflight and what comes next. The future human exploration of Mars captures the imagination

  1. Minimally invasive procedures

    PubMed Central

    Baltayiannis, Nikolaos; Michail, Chandrinos; Lazaridis, George; Anagnostopoulos, Dimitrios; Baka, Sofia; Mpoukovinas, Ioannis; Karavasilis, Vasilis; Lampaki, Sofia; Papaiwannou, Antonis; Karavergou, Anastasia; Kioumis, Ioannis; Pitsiou, Georgia; Katsikogiannis, Nikolaos; Tsakiridis, Kosmas; Rapti, Aggeliki; Trakada, Georgia; Zissimopoulos, Athanasios; Zarogoulidis, Konstantinos

    2015-01-01

    Minimally invasive procedures, which include laparoscopic surgery, use state-of-the-art technology to reduce the damage to human tissue when performing surgery. Minimally invasive procedures require small “ports” from which the surgeon inserts thin tubes called trocars. Carbon dioxide gas may be used to inflate the area, creating a space between the internal organs and the skin. Then a miniature camera (usually a laparoscope or endoscope) is placed through one of the trocars so the surgical team can view the procedure as a magnified image on video monitors in the operating room. Specialized equipment is inserted through the trocars based on the type of surgery. There are some advanced minimally invasive surgical procedures that can be performed almost exclusively through a single point of entry—meaning only one small incision, like the “uniport” video-assisted thoracoscopic surgery (VATS). Not only do these procedures usually provide equivalent outcomes to traditional “open” surgery (which sometimes require a large incision), but minimally invasive procedures (using small incisions) may offer significant benefits as well: (I) faster recovery; (II) shorter hospital stays; (III) less scarring and (IV) less pain. In our current mini review we will present the minimally invasive procedures for thoracic surgery. PMID:25861610

  2. Fundamentals of Nursing Science: Units 1 through 8.

    ERIC Educational Resources Information Center

    Einstoss, Esther

    A description is provided of "Fundamentals of Nursing," a two-year college course designed to introduce nursing students to the basic principles of patient care. First, information is presented on the place of the course in the nursing curriculum, in-class time allotments, and course prerequisites. A section on course content includes a statement…

  3. Religious Fundamentalism among Young Muslims in Egypt and Saudi Arabia

    ERIC Educational Resources Information Center

    Moaddel, Mansoor; Karabenick, Stuart A.

    2008-01-01

    Religious fundamentalism is conceived as a distinctive set of beliefs and attitudes toward one's religion, including obedience to religious norms, belief in the universality and immutability of its principles, the validity of its claims, and its indispensability for human happiness. Surveys of Egyptian and Saudi youth, ages 18-25, reveal that…

  4. 48 CFR 9904.405-40 - Fundamental requirement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... 9904.405-40 Section 9904.405-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.405-40 Fundamental requirement. (a) Costs expressly... be subject to the same cost accounting principles governing cost allocability as allowable costs....

  5. 48 CFR 9904.405-40 - Fundamental requirement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... 9904.405-40 Section 9904.405-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.405-40 Fundamental requirement. (a) Costs expressly... be subject to the same cost accounting principles governing cost allocability as allowable costs....

  6. 48 CFR 9904.405-40 - Fundamental requirement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... 9904.405-40 Section 9904.405-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.405-40 Fundamental requirement. (a) Costs expressly... be subject to the same cost accounting principles governing cost allocability as allowable costs....

  7. Promoting patient-centred fundamental care in acute healthcare systems.

    PubMed

    Feo, Rebecca; Kitson, Alison

    2016-05-01

    Meeting patients' fundamental care needs is essential for optimal safety and recovery and positive experiences within any healthcare setting. There is growing international evidence, however, that these fundamentals are often poorly executed in acute care settings, resulting in patient safety threats, poorer and costly care outcomes, and dehumanising experiences for patients and families. Whilst care standards and policy initiatives are attempting to address these issues, their impact has been limited. This discussion paper explores, through a series of propositions, why fundamental care can be overlooked in sophisticated, high technology acute care settings. We argue that the central problem lies in the invisibility and subsequent devaluing of fundamental care. Such care is perceived to involve simple tasks that require little skill to execute and have minimal impact on patient outcomes. The propositions explore the potential origins of this prevailing perception, focusing upon the impact of the biomedical model, the consequences of managerial approaches that drive healthcare cultures, and the devaluing of fundamental care by nurses themselves. These multiple sources of invisibility and devaluing surrounding fundamental care have rendered the concept underdeveloped and misunderstood both conceptually and theoretically. Likewise, there remains minimal role clarification around who should be responsible for and deliver such care, and a dearth of empirical evidence and evidence-based metrics. In explicating these propositions, we argue that key to transforming the delivery of acute healthcare is a substantial shift in the conceptualisation of fundamental care. The propositions present a cogent argument that counters the prevailing perception that fundamental care is basic and does not require systematic investigation. We conclude by calling for the explicit valuing and embedding of fundamental care in healthcare education, research, practice and policy. Without this

  8. A Matter of Principle: The Principles of Quantum Theory, Dirac's Equation, and Quantum Information

    NASA Astrophysics Data System (ADS)

    Plotnitsky, Arkady

    2015-10-01

    This article is concerned with the role of fundamental principles in theoretical physics, especially quantum theory. The fundamental principles of relativity will be addressed as well, in view of their role in quantum electrodynamics and quantum field theory, specifically in Dirac's work; in particular, Dirac's derivation of his relativistic equation of the electron from the principles of relativity and quantum theory is the main focus of this article. I shall also consider Heisenberg's earlier work leading him to the discovery of quantum mechanics, which inspired Dirac's work. I argue that Heisenberg's and Dirac's work was guided by their adherence to and their confidence in the fundamental principles of quantum theory. The final section of the article discusses the recent work by D'Ariano and coworkers on the principles of quantum information theory, which extends quantum theory and its principles in a new direction. This extension enabled them to offer a new derivation of Dirac's equation from these principles alone, without using the principles of relativity.

  9. Free-Energy Minimization and the Dark-Room Problem

    PubMed Central

    Friston, Karl; Thornton, Christopher; Clark, Andy

    2012-01-01

    Recent years have seen the emergence of an important new fundamental theory of brain function. This theory brings information-theoretic, Bayesian, neuroscientific, and machine learning approaches into a single framework whose overarching principle is the minimization of surprise (or, equivalently, the maximization of expectation). The most comprehensive such treatment is the “free-energy minimization” formulation due to Karl Friston (see e.g., Friston and Stephan, 2007; Friston, 2010a,b – see also Fiorillo, 2010; Thornton, 2010). A recurrent puzzle raised by critics of these models is that biological systems do not seem to avoid surprises. We do not simply seek a dark, unchanging chamber, and stay there. This is the “Dark-Room Problem.” Here, we describe the problem and further unpack the issues to which it speaks. Using the same format as the prolog of Eddington’s Space, Time, and Gravitation (Eddington, 1920) we present our discussion as a conversation between: an information theorist (Thornton), a physicist (Friston), and a philosopher (Clark). PMID:22586414
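
    In its simplest discrete form the quantity being minimized is F = E_q[ln q(s) - ln p(o, s)], which equals the surprise -ln p(o) plus the KL divergence between q(s) and the true posterior p(s|o); F therefore upper-bounds surprise and touches it exactly when q is the posterior. A toy numerical sketch of that identity (the probabilities are invented for illustration):

        import math

        def free_energy(q: dict, joint: dict) -> float:
            """Variational free energy F = sum_s q(s)*(ln q(s) - ln p(o, s)),
            for a fixed observation o, with joint[s] = p(o, s)."""
            return sum(q[s] * (math.log(q[s]) - math.log(joint[s]))
                       for s in q if q[s] > 0)

        joint = {"s1": 0.08, "s2": 0.02}   # p(o, s) for the observed o
        evidence = sum(joint.values())      # p(o) = 0.10
        surprise = -math.log(evidence)      # ~2.303 nats
        posterior = {s: p / evidence for s, p in joint.items()}

        print(free_energy(posterior, joint), surprise)     # equal: the bound is tight
        print(free_energy({"s1": 0.5, "s2": 0.5}, joint))  # any other q gives F > surprise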

  10. Minimally Invasive Radiofrequency Devices.

    PubMed

    Sadick, Neil; Rothaus, Kenneth O

    2016-07-01

    This article reviews minimally invasive radiofrequency options for skin tightening, focusing on describing their mechanism of action and clinical profile in terms of safety and efficacy and presenting peer-reviewed articles associated with the specific technologies. Treatments offered by minimally invasive radiofrequency devices (fractional, microneedling, temperature-controlled) are increasing in popularity due to the dramatic effects they can have without requiring skin excision, downtime, or even extreme financial burden from the patient's perspective. Clinical applications thus far have yielded impressive results in treating signs of the aging face and neck, either as stand-alone or as postoperative maintenance treatments. PMID:27363771

  11. Equivalence principles and electromagnetism

    NASA Technical Reports Server (NTRS)

    Ni, W.-T.

    1977-01-01

    The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

  12. Effects of Phonetic Context on Relative Fundamental Frequency

    ERIC Educational Resources Information Center

    Lien, Yu-An S.; Gattuccio, Caitlin I.; Stepp, Cara E.

    2014-01-01

    Purpose: The effect of phonetic context on relative fundamental frequency (RFF) was examined, in order to develop stimuli sets with minimal within-speaker variability that can be implemented in future clinical protocols. Method: Sixteen speakers with healthy voices produced RFF stimuli. Uniform utterances consisted of 3 repetitions of the same…

  13. Ways To Minimize Bullying.

    ERIC Educational Resources Information Center

    Mueller, Mary Ellen; Parisi, Mary Joy

    This report delineates a series of interventions aimed at minimizing incidences of bullying in a suburban elementary school. The social services staff was scheduled to initiate an anti-bullying incentive in fall 2001 due to the increased occurrences of bullying during the prior year. The target population consisted of third- and fourth-grade…

  14. Minimally invasive pancreatic surgery.

    PubMed

    Yiannakopoulou, E

    2015-12-01

    Minimally invasive pancreatic surgery is feasible and safe. Laparoscopic distal pancreatectomy should be widely adopted for benign lesions of the pancreas. Laparoscopic pancreaticoduodenectomy, although technically demanding, in the setting of pancreatic ductal adenocarcinoma has a number of advantages including shorter hospital stay and faster recovery, allowing patients to recover in a timelier manner and pursue adjuvant treatment options. Furthermore, it seems that progression-free survival is longer in patients undergoing laparoscopic pancreaticoduodenectomy in comparison with those undergoing open pancreaticoduodenectomy. Minimally invasive middle pancreatectomy seems appropriate for benign or borderline tumors of the neck of the pancreas. Technological advances including intraoperative ultrasound and intraoperative fluorescence imaging systems are expected to facilitate the wide adoption of minimally invasive pancreatic surgery. Although the oncological outcome seems similar to that of open surgery, there are still concerns, as the majority of relevant evidence comes from retrospective studies. Large multicenter randomized studies comparing laparoscopic with open pancreatectomy, as well as robotic-assisted with both open and laparoscopic approaches, are needed. The robotic approach may prove to be less invasive than the conventional laparoscopic approach through less traumatic intra-abdominal handling of tissues. In addition, the robotic approach could enable wide adoption of the technique by surgeons who are not extensively trained in advanced laparoscopic surgery. A putative clinical benefit of minimally invasive pancreatic surgery could be the attenuated surgical stress response leading to reduced morbidity and mortality, as well as the lack of a detrimental immunosuppressive effect, especially for oncological patients. PMID:26530291

  15. The Minimal Era

    ERIC Educational Resources Information Center

    Van Ness, Wilhelmina

    1974-01-01

    Described the development of Minimal Art, a composite name that has been applied to the scattering of bland, bleak, non-objective fine arts painting and sculpture forms that proliferated slightly mysteriously in the middle 1960's as Pop Art began to decline. (Author/RK)

  16. Water Balance Covers For Waste Containment: Principles and Practice

    EPA Science Inventory

    Water Balance Covers for Waste Containment: Principles and Practices introduces water balance covers and compares them with conventional approaches to waste containment. The authors provide a detailed analysis of the fundamentals of soil physics and design issues, introduce appl...

  17. The Future of Financial Aid: Principles, Problems, Probable Outcomes.

    ERIC Educational Resources Information Center

    Johnstone, D. Bruce

    1986-01-01

    Forces threatening the fundamental principles and practices of student financial aid are examined, and some advice to the profession is offered. A large dedicated profession has emerged that is skilled at bringing together students, colleges, and resources. (MLW)

  18. Waste Minimization Crosscut Plan

    SciTech Connect

    Not Available

    1992-05-13

    On November 27, 1991, the Secretary of Energy directed that a Department of Energy (DOE) crosscut plan for waste minimization (WMin) be prepared and submitted by March 1, 1992. This Waste Minimization Crosscut Plan responds to the Secretary's direction and supports the National Energy Strategy (NES) goals of achieving greater energy security, increasing energy and economic efficiency, and enhancing environmental quality. It provides a DOE-wide planning framework for effective coordination of all DOE WMin activities. This Plan was jointly prepared by the following Program Secretarial Officer (PSO) organizations: Civilian Radioactive Waste Management (RW); Conservation and Renewable Energy (CE); Defense Programs (DP); Environmental Restoration and Waste Management (EM), lead; Energy Research (ER); Fossil Energy (FE); Nuclear Energy (NE); and New Production Reactors (NP). Assistance and guidance was provided by the offices of Policy, Planning, and Analysis (PE) and Environment, Safety and Health (EH). Comprehensive application of waste minimization within the Department and in both the public and private sectors will provide significant benefits and support National Energy Strategy goals. These benefits include conservation of a substantial proportion of the energy now used by industry and Government, improved environmental quality, reduced health risks, improved production efficiencies, and longer useful life of disposal capacity. Taken together, these benefits will mean improved US global competitiveness, expanded job opportunities, and a better quality of life for all citizens.

  19. Minimally invasive radioguided parathyroidectomy.

    PubMed

    Costello, D; Norman, J

    1999-07-01

    The last decade has been characterized by an emphasis on minimizing interventional techniques, hospital stays, and overall costs of patient care. It is clear that most patients with sporadic HPT do not require a complete neck exploration. We now know that a minimal approach is appropriate for this disease. Importantly, the MIRP technique can be applied to most patients with sporadic HPT and can be performed by surgeons with modest advanced training. The use of a gamma probe as a surgical tool converts the sestamibi scan into a functional and anatomical scan, eliminating the need for any other preoperative localizing study. Quantification of the radioactivity within the removed gland eliminates the need for routine frozen section histologic examination and obviates the need for costly intraoperative parathyroid hormone measurements. This radioguided technique allows the benefit of local anesthesia, dramatically reduces operative times, eliminates postoperative blood tests, provides a smaller scar, requires minimal time spent in the hospital, and almost assures a rapid, near pain-free recovery. This combination is beneficial to the patient while helping to achieve a reduction in overall costs. PMID:10448697

  20. Basic principles of variable speed drives

    NASA Technical Reports Server (NTRS)

    Loewenthal, S. H.

    1973-01-01

    The principles that govern variable speed drive operation are discussed as a basis for successful drive application. The fundamental factors of torque, speed ratio, and power as they relate to drive selection are discussed. The basic types of variable speed drives, their operating characteristics, and their applications are also presented.

  1. Beyond the Virtues-Principles Debate.

    ERIC Educational Resources Information Center

    Keat, Marilyn S.

    1992-01-01

    Indicates basic ontological assumptions in the virtues-principles debate in moral philosophy, noting Aristotle's and Kant's fundamental ideas about morality and considering a hermeneutic synthesis of theories. The article discusses what acceptance of the synthesis might mean in the theory and practice of moral pedagogy, offering examples of…

  2. Development of Canonical Transformations from Hamilton's Principle.

    ERIC Educational Resources Information Center

    Quade, C. Richard

    1979-01-01

    The theory of canonical transformations and its development are discussed with regard to their application to Hamilton's principle. Included are the derivation of the equations of motion, a lack of symmetry in the formulation with respect to the Lagrangian, and the fundamental commutator relations of quantum mechanics. (Author/SA)

  3. Principles of Guided Missiles and Nuclear Weapons.

    ERIC Educational Resources Information Center

    Naval Personnel Program Support Activity, Washington, DC.

    Fundamentals of missile and nuclear weapons systems are presented in this book which is primarily prepared as the second text of a three-volume series for students of the Navy Reserve Officers' Training Corps and the Officer Candidate School. Following an introduction to guided missiles and nuclear physics, basic principles and theories are…

  4. Against Explanatory Minimalism in Psychiatry

    PubMed Central

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell’s criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein’s Zettel. But attention to the context of Wittgenstein’s remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation. PMID:26696908

  5. Against Explanatory Minimalism in Psychiatry.

    PubMed

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell's criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein's Zettel. But attention to the context of Wittgenstein's remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation. PMID:26696908

  6. Principles of nanoscience: an overview.

    PubMed

    Behari, Jitendra

    2010-10-01

    The scientific basis of nanotechnology as envisaged from first principles is compared to bulk behavior. The development of nanoparticles having controllable physical and electronic properties has opened up the possibility of designing artificial solids. Top-down and bottom-up approaches are emphasized. The role of nanoparticle (quantum dot) applications in nanophotonics (photovoltaic cells) and as drug delivery vehicles is discussed. Fundamentals of DNA structure, as the prime site of bionanotechnological manipulation, are also discussed. A summary of presently available devices and applications is presented. PMID:21299044

  7. Design of the fundamental power coupler and photocathode inserts for the 112MHz superconducting electron gun

    SciTech Connect

    Xin, T.; Ben-Zvi, I.; Belomestnykh, S.; Chang, X.; Rao, T.; Skaritka, J.; Wu, Q.; Wang, E.; Liang, X.

    2011-07-25

    A 112 MHz superconducting quarter-wave resonator electron gun will be used as the injector of the Coherent Electron Cooling (CEC) proof-of-principle experiment at BNL. Furthermore, this electron gun can serve as a testing cavity for various photocathodes. In this paper, we present the design of the cathode stalks and of a Fundamental Power Coupler (FPC) intended for the future experiments. Two types of cathode stalk are discussed. A specially shaped stalk is used in order to minimize the RF power loss. The location of the cathode plane is also optimized to enable the extraction of a low-emittance beam. The coaxial-waveguide FPC has a tunable coupling factor and causes little interference with the electron beam output. The optimization of the coupling factor and the location of the FPC are discussed in detail. Based on transmission line theory, we designed a half-wavelength cathode stalk which significantly brings down the voltage drop between the cavity and the stalk, from more than 5.6 kV to 0.1 kV. The transverse field distribution on the cathode has been optimized by carefully choosing the position of the cathode stalk inside the cavity. Moreover, in order to decrease the RF power loss, a variable-diameter cathode stalk design has been adopted. Compared to a uniform stalk, this design gives much smaller power losses at important locations. In addition, we propose a fundamental power coupler based on the designed beam parameters for the future proof-of-principle CEC experiment. This FPC should provide sufficiently strong coupling, with an external Q ranging from 1.5×10^7 to 2.6×10^8.
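
    The half-wavelength figure follows directly from the 112 MHz operating frequency: a line half a wavelength long reproduces the boundary condition at its far end, which is what suppresses the cavity-to-stalk voltage drop. A back-of-envelope sketch (free-space wavelength only; the real stalk length is shifted by loading and by the variable-diameter profile):

        C = 2.99792458e8  # speed of light, m/s

        def wavelength(freq_hz: float) -> float:
            return C / freq_hz

        lam = wavelength(112e6)
        print(lam)        # ~2.68 m
        print(lam / 2.0)  # ~1.34 m: nominal half-wavelength stalk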

  8. Magnetism: Principles and Applications

    NASA Astrophysics Data System (ADS)

    Craik, Derek J.

    2003-09-01

    If you are studying physics, chemistry, materials science, electrical engineering, information technology or medicine, then you'll know that understanding magnetism is fundamental to success in your studies, and here is the key to unlocking its mysteries. You can: obtain a simple overview of magnetism, including the roles of B and H, resonances and special techniques; take full advantage of modern magnets with a wealth of expressions for fields and forces; develop realistic general design programmes using isoparametric finite elements; study the subtleties of the general theory of magnetic moments and their dynamics; follow the development of outstanding materials; appreciate how magnetism encompasses topics as diverse as rock magnetism, chemical reaction rates, biological compasses, medical therapies, superconductivity and levitation; and understand the basis and remarkable achievements of magnetic resonance imaging. In his new book, Magnetism, Derek Craik throws light on the principles and applications of this fascinating subject. From formulae for calculating fields to quantum theory, the secrets of magnetism are exposed, ensuring that whether you are a chemist or engineer, physicist, medic or materials scientist, Magnetism is the book for your course.

  9. The validity of the extended energy principle

    SciTech Connect

    Chance, M.S.; Johnson, J.L.; Kulsrud, R.M.

    1994-04-01

    A recent analysis of plasma stability based on modifications of the extended energy principle for magnetohydrodynamic stability led to conclusions that are too optimistic. The original interpretation of this principle is indeed applicable. The present analysis demonstrates explicitly the fallacy of using the wrong functional for {delta}W in the extended energy principle. It then shows that the original energy principle functional {delta}W{sub B} is also obtained for a model in which a surface mass is incorporated to provide pressure balance. This work therefore indicates, but does not prove, that the eigenfunctions that are obtained from a minimization of the extended energy principle with the proper kinetic energy norm provide a good representation of what would be achieved with an exact treatment.

  10. Fundamentals of natural computing: an overview

    NASA Astrophysics Data System (ADS)

    de Castro, Leandro Nunes

    2007-03-01

    Natural computing is a term introduced to encompass three classes of methods: (1) those that take inspiration from nature for the development of novel problem-solving techniques; (2) those that are based on the use of computers to synthesize natural phenomena; and (3) those that employ natural materials (e.g., molecules) to compute. The main fields of research that compose these three branches are artificial neural networks, evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, and quantum computing, among others. This paper provides an overview of the fundamentals of natural computing, particularly the fields listed above, emphasizing the biological motivation, some design principles, their scope of applications, current research trends and open problems. The presentation is concluded with a discussion of natural computing and when it should be used.
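
    As a concrete taste of branch (1), nature-inspired problem solving, the sketch below runs a minimal (1+1) evolutionary algorithm on the OneMax toy problem (maximize the number of 1-bits). It is a generic textbook example, not an algorithm from this overview:

        import random

        def one_plus_one_ea(n: int = 40, max_iters: int = 10_000, seed: int = 1):
            """(1+1)-EA: keep one parent, flip each bit with probability 1/n,
            accept the offspring whenever it is at least as fit."""
            rng = random.Random(seed)
            parent = [rng.randint(0, 1) for _ in range(n)]
            for _ in range(max_iters):
                child = [b ^ (rng.random() < 1.0 / n) for b in parent]
                if sum(child) >= sum(parent):
                    parent = child
                if sum(parent) == n:
                    break
            return parent

        print(sum(one_plus_one_ea()))  # typically reaches the optimum, 40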

  11. Ablative Thermal Protection System Fundamentals

    NASA Technical Reports Server (NTRS)

    Beck, Robin A. S.

    2013-01-01

    This is the presentation for a short course on the fundamentals of ablative thermal protection systems. It covers the definition of ablation, description of ablative materials, how they work, how to analyze them and how to model them.

  12. Fundamentals of freeze-drying.

    PubMed

    Nail, Steven L; Jiang, Shan; Chongprasert, Suchart; Knopp, Shawn A

    2002-01-01

    Given the increasing importance of reducing development time for new pharmaceutical products, formulation and process development scientists must continually look for ways to "work smarter, not harder." Within the product development arena, this means reducing the amount of trial and error empiricism in arriving at a formulation and identification of processing conditions which will result in a quality final dosage form. Characterization of the freezing behavior of the intended formulation is necessary for developing processing conditions which will result in the shortest drying time while maintaining all critical quality attributes of the freeze-dried product. Analysis of frozen systems was discussed in detail, particularly with respect to the glass transition as the physical event underlying collapse during freeze-drying, eutectic mixture formation, and crystallization events upon warming of frozen systems. Experiments to determine how freezing and freeze-drying behavior is affected by changes in the composition of the formulation are often useful in establishing the "robustness" of a formulation. It is not uncommon for seemingly subtle changes in composition of the formulation, such as a change in formulation pH, buffer salt, drug concentration, or an additional excipient, to result in striking differences in freezing and freeze-drying behavior. With regard to selecting a formulation, it is wise to keep the formulation as simple as possible. If a buffer is needed, a minimum concentration should be used. The same principle applies to added salts: If used at all, the concentration should be kept to a minimum. For many proteins a combination of an amorphous excipient, such as a disaccharide, and a crystallizing excipient, such as glycine, will result in a suitable combination of chemical stability and physical stability of the freeze-dried solid. Concepts of heat and mass transfer are valuable in rational design of processing conditions. Heat transfer by conduction

  13. Minimal length in quantum gravity and gravitational measurements

    NASA Astrophysics Data System (ADS)

    Farag Ali, Ahmed; Khalil, Mohammed M.; Vagenas, Elias C.

    2015-10-01

    The existence of a minimal length is a common prediction of various theories of quantum gravity. This minimal length leads to a modification of the Heisenberg uncertainty principle to a Generalized Uncertainty Principle (GUP). Various studies showed that a GUP modifies the Hawking radiation of black holes. In this paper, we propose a modification of the Schwarzschild metric based on the modified Hawking temperature derived from the GUP. Based on this modified metric, we calculate corrections to the deflection of light, time delay of light, perihelion precession, and gravitational redshift. We compare our results with gravitational measurements to set an upper bound on the GUP parameter.
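
    The minimal length itself falls out of the GUP in two lines. With the common quadratic form of the modified uncertainty relation (a standard derivation with GUP parameter beta, shown here as background; the paper's contribution is the modified metric built on the resulting temperature):

        \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1+\beta(\Delta p)^2\right)
        \quad\Longrightarrow\quad
        \Delta x \;\ge\; \frac{\hbar}{2}\left(\frac{1}{\Delta p}+\beta\,\Delta p\right)

    The right-hand side is minimized at \Delta p = 1/\sqrt{\beta}, giving \Delta x_{\min} = \hbar\sqrt{\beta}; for \beta of order 1/(M_{Pl} c)^2 this places the minimal resolvable length at the Planck scale.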

  14. A Colorful Demonstration of Le Châtelier's Principle.

    ERIC Educational Resources Information Center

    Last, Arthur M.; Slade, Peter W.

    1997-01-01

    Le Châtelier's Principle states that, when a system at equilibrium is subjected to stress, the system will respond in such a way as to minimize the effect of the stress. Describes a lecture demonstration that illustrates shifts in the position of equilibrium caused by a variety of factors. The equilibrium mixture contains iron (III) and…
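
    The direction of the shift can be read off by comparing the reaction quotient Q with the equilibrium constant K: the stressed system moves so as to restore Q = K. A minimal sketch for a generic aA + bB <=> cC equilibrium (illustrative numbers, not the iron(III)-thiocyanate demonstration itself):

        def reaction_quotient(c_conc: float, a_conc: float, b_conc: float,
                              a: int = 1, b: int = 1, c: int = 1) -> float:
            """Q = [C]^c / ([A]^a [B]^b) for the equilibrium aA + bB <=> cC."""
            return c_conc**c / (a_conc**a * b_conc**b)

        def shift_direction(q: float, k: float) -> str:
            """Le Chatelier in quotient form: the system evolves until Q = K."""
            if q < k:
                return "shifts forward (toward products)"
            return "at equilibrium" if q == k else "shifts reverse (toward reactants)"

        # Adding reactant A lowers Q below K, so the equilibrium shifts forward.
        q = reaction_quotient(c_conc=0.5, a_conc=2.0, b_conc=0.1)  # Q = 2.5
        print(shift_direction(q, k=10.0))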

  15. Minimally invasive mediastinal surgery

    PubMed Central

    Melfi, Franca M. A.; Mussi, Alfredo

    2016-01-01

    In the past, mediastinal surgery was associated with the necessity of a maximum exposure, which was accomplished through various approaches. In the early 1990s, many surgical fields, including thoracic surgery, observed the development of minimally invasive techniques. These included video-assisted thoracic surgery (VATS), which confers clear advantages over an open approach, such as less trauma, short hospital stay, improved cosmetic results and preservation of lung function. However, VATS is associated with several disadvantages. For this reason, it is not routinely performed for resection of mediastinal mass lesions, especially those located in the anterior mediastinum, a tiny and remote space that contains vital structures at risk of injury. Robotic systems can overcome the limits of VATS, offering three-dimensional (3D) vision and wristed instruments, and are being increasingly used. With regard to thymectomy for myasthenia gravis (MG), unilateral and bilateral VATS approaches have demonstrated good long-term neurologic results with low complication rates. Nevertheless, some authors still advocate the necessity of maximum exposure, especially when considering the distribution of normal and ectopic thymic tissue. In recent studies, the robotic approach has been shown to provide similar neurological outcomes when compared to transsternal and VATS approaches, and is associated with a low morbidity. Importantly, through a unilateral robotic technique, it is possible to dissect and remove at least the same amount of mediastinal fat tissue. Preliminary results on early-stage thymomatous disease indicated that minimally invasive approaches are safe and feasible, with a low rate of pleural recurrence, underlining the necessity of a “no-touch” technique. However, especially for thymomatous disease characterized by an indolent nature, further studies with a long follow-up period are necessary in order to assess oncologic and neurologic results through minimally

  16. Minimally invasive mediastinal surgery.

    PubMed

    Melfi, Franca M A; Fanucchi, Olivia; Mussi, Alfredo

    2016-01-01

    In the past, mediastinal surgery was associated with the necessity of a maximum exposure, which was accomplished through various approaches. In the early 1990s, many surgical fields, including thoracic surgery, observed the development of minimally invasive techniques. These included video-assisted thoracic surgery (VATS), which confers clear advantages over an open approach, such as less trauma, short hospital stay, improved cosmetic results and preservation of lung function. However, VATS is associated with several disadvantages. For this reason, it is not routinely performed for resection of mediastinal mass lesions, especially those located in the anterior mediastinum, a tiny and remote space that contains vital structures at risk of injury. Robotic systems can overcome the limits of VATS, offering three-dimensional (3D) vision and wristed instruments, and are being increasingly used. With regard to thymectomy for myasthenia gravis (MG), unilateral and bilateral VATS approaches have demonstrated good long-term neurologic results with low complication rates. Nevertheless, some authors still advocate the necessity of maximum exposure, especially when considering the distribution of normal and ectopic thymic tissue. In recent studies, the robotic approach has been shown to provide similar neurological outcomes when compared to transsternal and VATS approaches, and is associated with a low morbidity. Importantly, through a unilateral robotic technique, it is possible to dissect and remove at least the same amount of mediastinal fat tissue. Preliminary results on early-stage thymomatous disease indicated that minimally invasive approaches are safe and feasible, with a low rate of pleural recurrence, underlining the necessity of a "no-touch" technique. However, especially for thymomatous disease characterized by an indolent nature, further studies with a long follow-up period are necessary in order to assess oncologic and neurologic results through minimally invasive

  17. Minimally refined biomass fuel

    DOEpatents

    Pearson, Richard K.; Hirschfeld, Tomas B.

    1984-01-01

    A minimally refined fluid composition, suitable as a fuel mixture and derived from biomass material, is comprised of one or more water-soluble carbohydrates such as sucrose, one or more alcohols having less than four carbons, and water. The carbohydrate provides the fuel source; water solubilizes the carbohydrates; and the alcohol aids in the combustion of the carbohydrate and reduces the viscosity of the carbohydrate/water solution. Because less energy is required to obtain the carbohydrate from the raw biomass than to obtain alcohol, an overall energy savings is realized compared to fuels employing alcohol as the primary fuel.

  18. Wake Vortex Minimization

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A status report is presented on research directed at reducing the vortex disturbances of aircraft wakes. The objective of such a reduction is to minimize the hazard to smaller aircraft that might encounter these wakes. Inviscid modeling was used to study trailing vortices, and viscous effects were investigated. Laser velocimeters were utilized in the measurement of aircraft wakes. Flight and wind tunnel tests were performed on scale-model and full-scale aircraft of various designs. Parameters investigated included the effect of wing span, wing flaps, spoilers, splines and engine thrust on vortex attenuation. Results indicate that vortices may be alleviated through aerodynamic means.

  19. Minimally Invasive Parathyroidectomy

    PubMed Central

    Starker, Lee F.; Fonseca, Annabelle L.; Carling, Tobias; Udelsman, Robert

    2011-01-01

    Minimally invasive parathyroidectomy (MIP) is an operative approach for the treatment of primary hyperparathyroidism (pHPT). Currently, routine use of improved preoperative localization studies, cervical block anesthesia in the conscious patient, and intraoperative parathyroid hormone analyses aid in guiding surgical therapy. MIP requires less surgical dissection causing decreased trauma to tissues, can be performed safely in the ambulatory setting, and is at least as effective as standard cervical exploration. This paper reviews advances in preoperative localization, anesthetic techniques, and intraoperative management of patients undergoing MIP for the treatment of pHPT. PMID:21747851

  20. Highly precise clocks to test fundamental physics

    NASA Astrophysics Data System (ADS)

    Bize, S.; Wolf, P.

    2012-12-01

    Highly precise atomic clocks and precision oscillators are excellent tools to test founding principles, such as the Equivalence Principle, which are the basis of modern physics. A large variety of tests are possible, including tests of Local Lorentz Invariance and of Local Position Invariance, such as tests of the variability of natural constants with time and with gravitational potential, tests of the isotropy of space, etc. Over several decades, SYRTE has developed an ensemble of highly accurate atomic clocks and oscillators using a large diversity of atomic species and methods. The SYRTE clock ensemble comprises hydrogen masers, Cs and Rb atomic fountain clocks, Sr and Hg optical lattice clocks, as well as ultra stable oscillators both in the microwave domain (cryogenic sapphire oscillator) and in the optical domain (Fabry-Perot cavity stabilized ultra stable lasers) and means to compare these clocks locally or remotely (fiber links in the RF and the optical domain, femtosecond optical frequency combs, satellite time and frequency transfer methods). In this paper, we list the fundamental physics tests that have been performed over the years with the SYRTE clock ensemble. Several of these tests are done thanks to the collaboration with partner institutes including the University of Western Australia, the Max Planck Institut für Quantenoptik in Germany, and others.

  1. Generalized uncertainty principle: Approaches and applications

    NASA Astrophysics Data System (ADS)

    Tawfik, A.; Diab, A.

    2014-11-01

    In this paper, we review some highlights from string theory, black hole physics and doubly special relativity, and some thought experiments which were suggested to probe the shortest distances and/or maximum momentum at the Planck scale. Furthermore, all models developed in order to implement the minimal length scale and/or the maximum momentum in different physical systems are analyzed and compared. They entered the literature as the generalized uncertainty principle (GUP), assuming a modified dispersion relation, and therefore allow for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, Salecker-Wigner inequalities, the entropic nature of gravitational laws, Friedmann equations, minimal time measurement and thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty. A second one predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles, including the universality of the gravitational redshift, the free fall and the law of reciprocal action, and on the kinetic energy of composite systems. The existence of a minimal length and a maximum momentum accuracy is favored by various physical observations. The concern about compatibility with the equivalence principles, the universality of gravitational redshift, the free fall and the law of reciprocal action should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.

  2. Fundamental concepts and limitations in precision pointing and tracking problems

    NASA Astrophysics Data System (ADS)

    Johnson, Carroll D.; Masten, Michael K.

    1993-10-01

    In this paper, we first describe the generic pointing and tracking problems in a general dynamical system/state-space context. Then, we analyze the information-theoretical aspects of the various uncertain signals in those problems, and establish some fundamental performance limitations those uncertainties induce, using various results and principles of modern control theory. It is shown that the introduction of 'waveform models' for uncertain signals, leading to an extended-state formulation of pointing and tracking problems, is the most effective rational means of coping with those fundamental limitations.
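
    The "waveform model" idea can be made concrete with a short sketch. The example below is my own minimal illustration, not the authors' formulation: an unknown constant torque disturbance acting on a double-integrator pointing axis is modeled as an extra state (d' = 0) and estimated with an observer; all gains and pole locations are illustrative assumptions.

        import numpy as np
        from scipy.signal import place_poles

        # Double-integrator pointing axis: state x = [angle, rate].
        A = np.array([[0.0, 1.0],
                      [0.0, 0.0]])
        B = np.array([[0.0],
                      [1.0]])
        C = np.array([[1.0, 0.0]])            # only the angle is measured

        # Waveform model for the disturbance (constant bias: d' = 0),
        # appended as a third state of the augmented system.
        Aa = np.block([[A, B],
                       [np.zeros((1, 3))]])
        Ba = np.vstack([B, np.zeros((1, 1))])
        Ca = np.hstack([C, np.zeros((1, 1))])

        # Observer gain for the augmented system (poles are illustrative);
        # integrate x_hat' = Aa@x_hat + Ba*u + L@(y - Ca@x_hat).
        L = place_poles(Aa.T, Ca.T, [-4.0, -5.0, -6.0]).gain_matrix.T

        # Feedback on the plant states plus cancellation of the estimated
        # disturbance: u = -K@x_hat - d_hat.
        K = place_poles(A, B, [-1.0, -2.0]).gain_matrix

        def control(x_hat_aug):
            x_hat, d_hat = x_hat_aug[:2], x_hat_aug[2]
            return -(K @ x_hat)[0] - d_hat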

  3. Minimizing hazardous waste

    SciTech Connect

    DeClue, S.C.

    1996-06-01

    Hazardous waste minimization is a broad term often associated with pollution prevention, saving the environment, or protecting Mother Earth; some associate it with saving money. Thousands of hazardous materials are used in processes every day, but when these hazardous materials become hazardous wastes, dollars must be spent on disposal. When hazardous waste is reduced, an organization spends less money on hazardous waste disposal. In 1993, Fort Bragg reduced its hazardous waste generation by over 100,000 pounds and spent nearly $90,000 less on hazardous waste disposal than in 1992. Fort Bragg generates a variety of wastes: vehicle maintenance wastes such as antifreeze, oil, grease, and solvents; helicopter maintenance wastes, including solvents, adhesives, lubricants, and paints; communication operation wastes such as lithium, magnesium, mercury, and nickel-cadmium batteries; and chemical defense wastes, such as detection and decontamination items and protective mask filters. The Hazardous Waste Office has the responsibility to properly identify, characterize, classify, and dispose of these waste items in accordance with US Environmental Protection Agency (EPA) and US Department of Transportation (DOT) regulations.

  4. A minimal fate-selection switch.

    PubMed

    Weinberger, Leor S

    2015-12-01

    To preserve fitness in unpredictable, fluctuating environments, a range of biological systems probabilistically generate variant phenotypes--a process often referred to as 'bet-hedging', after the financial practice of diversifying assets to minimize risk in volatile markets. The molecular mechanisms enabling bet-hedging have remained elusive. Here, we review how HIV makes a bet-hedging decision between active replication and proviral latency, a long-lived dormant state that is the chief barrier to an HIV cure. The discovery of a virus-encoded bet-hedging circuit in HIV revealed an ancient evolutionary role for latency and identified core regulatory principles, such as feedback and stochastic 'noise', that enable cell-fate decisions. These core principles were later extended to fate selection in stem cells and cancer, exposed new therapeutic targets for HIV, and led to a potentially broad strategy of using 'noise modulation' to redirect cell fate. PMID:26611210
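
    The role of stochastic 'noise' in fate selection can be illustrated with a toy simulation. The sketch below is not the HIV Tat circuit; it is a generic one-species positive-feedback switch with made-up rates, simulated with the Gillespie algorithm and started at its unstable threshold so that noise alone decides each trajectory's fate.

        import numpy as np

        rng = np.random.default_rng(0)

        def propensities(x, k_basal=0.2, k_fb=12.0, K=50.0, gamma=0.1):
            # Birth: basal rate plus Hill-type positive feedback;
            # death: first-order decay. All rates are illustrative.
            birth = k_basal + k_fb * x**2 / (K**2 + x**2)
            death = gamma * x
            return birth, death

        def gillespie(x0=25, t_end=400.0):
            # x0 sits near the unstable threshold between the low
            # (latency-like) and high (active-replication-like) states.
            t, x = 0.0, x0
            while t < t_end:
                birth, death = propensities(x)
                total = birth + death
                t += rng.exponential(1.0 / total)
                x += 1 if rng.random() < birth / total else -1
            return x

        fates = [gillespie() for _ in range(50)]
        print(sum(f > 60 for f in fates), "of 50 trajectories committed high")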

  5. The Elements and Principles of Design: A Baseline Study

    ERIC Educational Resources Information Center

    Adams, Erin

    2013-01-01

    Critical to the discipline, both professionally and academically, are the fundamentals of interior design. These fundamentals include the elements and principles of interior design: the commonly accepted tools and vocabulary used to create and communicate successful interior environments. Research indicates a lack of consistency in both the…

  6. A review of the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Nasser Tawfik, Abdel; Magied Diab, Abdel

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some ‘thought’ experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed.

  7. A review of the generalized uncertainty principle.

    PubMed

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed. PMID:26512022

  8. The minimal length and quantum partition functions

    NASA Astrophysics Data System (ADS)

    Abbasiyan-Motlaq, M.; Pedram, P.

    2014-08-01

    We study the thermodynamics of various physical systems in the framework of the generalized uncertainty principle, which implies a minimal length uncertainty proportional to the Planck length. We present a general scheme to analytically calculate the quantum partition function of physical systems to first order in the deformation parameter, based on the behavior of the modified energy spectrum, and compare our results with the classical approach. We also find the modified internal energy and heat capacity of the systems in the anti-Snyder framework.
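
    The program of the paper can be mimicked numerically. The sketch below is illustrative only: it assumes the first-order GUP-corrected oscillator spectrum E_n = hbar*omega*(n + 1/2) + (beta*m*hbar^2*omega^2/4)*(2n^2 + 2n + 1), which follows from the representation p = p0*(1 + beta*p0^2/3) of [x, p] = i*hbar*(1 + beta*p^2); the numerical prefactor depends on conventions and need not match the paper's.

        import numpy as np

        hbar = omega = m = kT = 1.0   # natural units, illustrative
        beta = 1e-3                   # small GUP deformation parameter

        n = np.arange(2000)           # enough levels to converge at kT = 1
        E0 = hbar * omega * (n + 0.5)                  # ordinary oscillator
        E1 = E0 + 0.25 * beta * m * (hbar * omega)**2 * (2*n**2 + 2*n + 1)

        def Z(E):
            # Canonical partition function, truncated sum over levels.
            return np.sum(np.exp(-E / kT))

        def U(E):
            # Internal energy as the Boltzmann-weighted mean of E.
            w = np.exp(-E / kT)
            return np.sum(E * w) / np.sum(w)

        print(f"Z: {Z(E0):.6f} -> {Z(E1):.6f}")
        print(f"U: {U(E0):.6f} -> {U(E1):.6f}")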

  9. What Metadata Principles Apply to Scientific Data?

    NASA Astrophysics Data System (ADS)

    Mayernik, M. S.

    2014-12-01

    Information researchers and professionals based in the library and information science fields often approach their work through developing and applying defined sets of principles. For example, for over 100 years, the evolution of library cataloging practice has largely been driven by debates (which are still ongoing) about the fundamental principles of cataloging and how those principles should manifest in rules for cataloging. Similarly, the development of archival research and practices over the past century has proceeded hand-in-hand with the emergence of principles of archival arrangement and description, such as maintaining the original order of records and documenting provenance. This project examines principles related to the creation of metadata for scientific data. The presentation will outline: 1) how understandings and implementations of metadata can range broadly depending on the institutional context, and 2) how metadata principles developed by the library and information science community might apply to metadata developments for scientific data. The development and formalization of such principles would contribute to the development of metadata practices and standards in a wide range of institutions, including data repositories, libraries, and research centers. Shared metadata principles would potentially be useful in streamlining data discovery and integration, and would also benefit the growing efforts to formalize data curation education.

  10. Minimal noise subsystems

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoting; Byrd, Mark; Jacobs, Kurt

    2016-03-01

    A system subjected to noise contains a decoherence-free subspace or subsystem (DFS) only if the noise possesses an exact symmetry. Here we consider noise models in which a perturbation breaks a symmetry of the noise, so that if S is a DFS under a given noise process, it is no longer so under the new perturbed noise process. We ask whether there is a subspace or subsystem that is more robust to the perturbed noise than S. To answer this question we develop a numerical method that allows us to search for subspaces or subsystems that are maximally robust to arbitrary noise processes. We apply this method to a number of examples, and find that a subsystem that is a DFS is often not the subsystem that experiences minimal noise when the symmetry of the noise is broken by a perturbation. We discuss which classes of noise have this property.

  11. Minimal quiver standard model

    SciTech Connect

    Berenstein, David; Pinansky, Samuel

    2007-05-01

    This paper discusses the minimal quiver gauge theory embedding of the standard model that could arise from brane-world-type string theory constructions. It is based on the low-energy effective field theory of D-branes in the perturbative regime. The model differs from the standard model by the addition of one extra massive gauge boson, and it contains only one parameter beyond the standard model: the mass of this new particle. The coupling of this new particle to the standard model is uniquely determined by input from the standard model and consistency conditions of perturbative string theory. We also study some aspects of the phenomenology of this model and bounds on its possible observation at the Large Hadron Collider.

  12. [Minimally invasive breast surgery].

    PubMed

    Mátrai, Zoltán; Gulyás, Gusztáv; Kunos, Csaba; Sávolt, Akos; Farkas, Emil; Szollár, András; Kásler, Miklós

    2014-02-01

    Due to developments in medical science and industrial technology, minimally invasive procedures have appeared in the surgery of benign and malignant breast diseases. In general, such interventions result in significantly reduced breast and chest wall scars, shorter hospitalization, and less pain, but they require specific, expensive devices and longer surgical times compared with open surgery. Furthermore, indications and oncological safety have not yet been established. It is quite likely that minimally invasive surgical procedures with high-tech devices, as in other surgical subspecialties, will gradually become popular and may even form part of routine breast surgery. Vacuum-assisted core biopsy with a therapeutic indication is suitable for the removal of benign fibroadenomas, leaving behind an almost invisible scar, while endoscopically assisted skin-sparing and nipple-sparing mastectomy, axillary staging, and reconstruction with a latissimus dorsi muscle flap are all feasible through the same short axillary incision. Endoscopic techniques are also suitable for the diagnosis and treatment of intracapsular complications of implant-based breast reconstructions (intracapsular fluid, implant rupture, capsular contracture) and for the biopsy of intracapsular lesions with uncertain pathology. Assessing the role of radiofrequency ablation of breast tumors requires further hands-on experience, but it is likely that it can serve as a replacement for surgical removal of a portion of primary tumors in the future, owing to developments in functional imaging and anticancer drugs. With the reduction in the price of ductoscopes, routine examination of the ductal branch system, guided microdochectomy, and targeted surgical removal of terminal ducto-lobular units or a "sick lobe" as an anatomical unit may become feasible. The paper presents the experience of the authors and provides a literature review, for the first time in the Hungarian language, on the subject. Orv. Hetil.

  13. Minimally invasive parathyroid surgery

    PubMed Central

    Noureldine, Salem I.; Gooi, Zhen

    2015-01-01

    Traditionally, bilateral cervical exploration for localization of all four parathyroid glands and removal of any that are grossly enlarged has been the standard surgical treatment for primary hyperparathyroidism (PHPT). With advances in preoperative localization studies and greater public demand for less invasive procedures, novel targeted, minimally invasive techniques for the parathyroid glands have been described and practiced over the past two decades. Minimally invasive parathyroidectomy (MIP) can be done through the standard Kocher incision or a smaller midline incision, with video assistance (purely endoscopic and video-assisted techniques), or through an ectopically placed, extracervical incision. In current practice, once PHPT is diagnosed, preoperative evaluation using high-resolution radiographic imaging to localize the offending parathyroid gland is essential if MIP is to be considered. The imaging study results suggest where the surgeon should begin the focused procedure and serve as a road map, allowing an efficient, imaging-guided dissection while eliminating unnecessary dissection of multiple glands or a bilateral exploration. Intraoperative parathyroid hormone (IOPTH) levels may be measured during the procedure, or a gamma probe used during radioguided parathyroidectomy, to ascertain that the correct gland has been excised and that no other hyperfunctional tissue is present. MIP has many advantages over the traditional bilateral four-gland exploration: it can be performed under local anesthesia, requires less operative time, results in fewer complications, and offers an improved cosmetic result and greater patient satisfaction. Additional advantages of MIP are earlier hospital discharge and decreased overall associated costs. This article addresses the considerations for accomplishing MIP, including the role of preoperative imaging studies, intraoperative adjuncts, and surgical techniques. PMID:26425454

  14. Minimal complexity control law synthesis

    NASA Technical Reports Server (NTRS)

    Bernstein, Dennis S.; Haddad, Wassim M.; Nett, Carl N.

    1989-01-01

    A paradigm for control law design for modern engineering systems is proposed: minimize control law complexity subject to the achievement of a specified accuracy in the face of a specified level of uncertainty. Correspondingly, the overall goal is to make progress towards a control law design methodology which supports this paradigm. The authors achieve this goal by developing a general theory of optimal constrained-structure dynamic output feedback compensation, where constrained-structure means that the dynamic structure (e.g., dynamic order, pole locations, zero locations, etc.) of the output feedback compensation is constrained in some way. By applying this theory iteratively over the choice of the compensator dynamic structure, the paradigm stated above can, in principle, be realized. The optimal constrained-structure dynamic output feedback problem is formulated in general terms. An elegant method for reducing optimal constrained-structure dynamic output feedback problems to optimal static output feedback problems is then developed. This reduction procedure makes use of star products, linear fractional transformations, and linear fractional decompositions, and yields as a byproduct a complete characterization of the class of optimal constrained-structure dynamic output feedback problems which can be reduced to optimal static output feedback problems. Issues such as operational/physical constraints, operating-point variations, and processor throughput/memory limitations are considered, and it is shown how anti-windup/bumpless transfer, gain scheduling, and digital processor implementation can be facilitated by constraining the controller dynamic structure in an appropriate fashion.
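
    Schematically, and in my own notation rather than the report's exact equations, the constrained-structure problem is the fixed-order LQG problem:

        % Plant with process noise w_1 and measurement noise w_2:
        \dot{x} = A x + B u + w_{1}, \qquad y = C x + w_{2}
        % Compensator of prescribed (constrained) order n_c:
        \dot{x}_{c} = A_{c} x_{c} + B_{c} y, \qquad u = C_{c} x_{c}
        % Choose (A_c, B_c, C_c) to minimize the steady-state quadratic cost:
        \min_{A_{c},B_{c},C_{c}}\; J = \lim_{t\to\infty}\mathbb{E}\!\left[x^{\mathsf T}R_{1}x + u^{\mathsf T}R_{2}u\right]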

  15. Fundamentals of direct digital control

    SciTech Connect

    Zimmerman, A.J.

    1996-05-01

    The 14th Century British philosopher William of Occam introduced the principle known today as Occam's Razor, which can be paraphrased as: the right answer to a given problem requires only the minimum assumptions necessary to explain it adequately. In this article the author uses Occam's Razor to carve away the excess rhetoric and verbiage surrounding direct digital control (DDC). It is often surprising just how simple in principle a DDC system can be while producing sophisticated performance. This article examines the basic components and functions common to most DDC systems for commercial building HVAC control, from the point of view of specifier, owner, and operator.

  16. Testing the Pauli Exclusion Principle for Electrons

    NASA Astrophysics Data System (ADS)

    Marton, J.; Bartalucci, S.; Bertolucci, S.; Berucci, C.; Bragadireanu, M.; Cargnelli, M.; Curceanu (Petrascu), C.; Di Matteo, S.; Egger, J.-P.; Guaraldo, C.; Iliescu, M.; Ishiwatari, T.; Laubenstein, M.; Milotti, E.; Pietreanu, D.; Piscicchia, K.; Ponta, T.; Romero Vidal, A.; Scordo, A.; Sirghi, D. L.; Sirghi, F.; Sperandio, L.; Vazquez Doce, O.; Widmann, E.; Zmeskal, J.

    2013-07-01

    The Pauli Exclusion Principle is one of the fundamental rules of nature and a pillar of quantum theory, and thus of modern physics. Many observations show that this principle is extremely well fulfilled. Nevertheless, numerous experiments have been performed to search for tiny violations of this rule in various systems. The VIP experiment at the Gran Sasso underground laboratory is searching for possible small violations of the Pauli Exclusion Principle for electrons that would lead to forbidden X-ray transitions in copper atoms. VIP aims at testing the Pauli Exclusion Principle for electrons with high accuracy, down to the level of 10^-29 to 10^-30, thus improving the previous limit by 3-4 orders of magnitude. The experimental method, the results obtained so far, and new developments within VIP2 (a follow-up experiment at Gran Sasso, in preparation) to further increase the precision by two orders of magnitude will be presented.

  17. Electronic Coolers Based on Superconducting Tunnel Junctions: Fundamentals and Applications

    NASA Astrophysics Data System (ADS)

    Courtois, H.; Hekking, F. W. J.; Nguyen, H. Q.; Winkelmann, C. B.

    2014-06-01

    Thermo-electric transport at the nano-scale is a rapidly developing topic, in particular in superconductor-based hybrid devices. In this review paper, we first discuss the fundamental principles of electronic cooling in mesoscopic superconducting hybrid structures, the related limitations and applications. We review recent work performed in Grenoble on the effects of Andreev reflection, photonic heat transport, phonon cooling, as well as on an innovative fabrication technique for powerful coolers.
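
    The central quantity in such devices is the heat extracted from the normal electrode of a normal-metal/insulator/superconductor (NIS) junction. A standard form of the cooling power, generic to the literature rather than specific to this review, is:

        P_{\rm cool}(V) = \frac{1}{e^{2}R_{T}}\int_{-\infty}^{\infty}
        (E-eV)\,n_{S}(E)\left[f_{N}(E-eV)-f_{S}(E)\right]dE,
        \qquad
        n_{S}(E)=\frac{|E|}{\sqrt{E^{2}-\Delta^{2}}}\,\theta(|E|-\Delta)

    Here R_T is the junction's tunnel resistance, Delta the superconducting gap, and f_N, f_S the electrode distribution functions; cooling is strongest for bias voltages just below Delta/e.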

  18. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1973-01-01

    Two topics are discussed: (1) Stomach Upset Caused by Aspirin, illustrating principles of acid-base equilibrium and solubility; (2) Physical Chemistry of the Drinking Duck, illustrating principles of phase equilibria and thermodynamics. (DF)

  19. Principles of project management

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.

  20. Minimizing Accidents and Risks in High Adventure Outdoor Pursuits.

    ERIC Educational Resources Information Center

    Meier, Joel

    The fundamental dilemma in adventure programming is eliminating unreasonable risks to participants without also reducing levels of excitement, challenge, and stress. Most accidents are caused by a combination of unsafe conditions, unsafe acts, and errors in judgment. The best and only way to minimize critical human error in adventure programs is…

  1. Environmental Law: Fundamentals for Schools.

    ERIC Educational Resources Information Center

    Day, David R.

    This booklet outlines the environmental problems most likely to arise in schools. An overview provides a fundamental analysis of environmental issues rather than comprehensive analysis and advice. The text examines the concerns that surround superfund cleanups, focusing on the legal framework, and furnishes some practical pointers, such as what to…

  2. Fundamental Cycles of Cognitive Growth.

    ERIC Educational Resources Information Center

    Pegg, John

    Over recent years, various theories have arisen to explain and predict cognitive development in mathematics education. We focus on an underlying theme that recurs throughout such theories: a fundamental cycle of growth in the learning of specific concepts, which we frame within broader global theories of individual cognitive growth. Our purpose is…

  3. Fundamentals of the Slide Library.

    ERIC Educational Resources Information Center

    Boerner, Susan Zee

    This paper is an introduction to the fundamentals of the art (including architecture) slide library, with some emphasis on basic procedures of the science slide library. Information in this paper is particularly relevant to the college, university, and museum slide library. Topics addressed include: (1) history of the slide library; (2) duties of…

  4. Lighting Fundamentals. Monograph Number 13.

    ERIC Educational Resources Information Center

    Locatis, Craig N.; Gerlach, Vernon S.

    Using an accompanying, specified film that consists of 10-second pictures separated by blanks, the learner can, with the 203-step, self-correcting questions and answers provided in this program, come to understand the fundamentals of lighting in photography. The learner should, by the end of the program, be able to describe and identify the…

  5. Fundamentals of Microelectronics Processing (VLSI).

    ERIC Educational Resources Information Center

    Takoudis, Christos G.

    1987-01-01

    Describes a 15-week course in the fundamentals of microelectronics processing in chemical engineering, which emphasizes the use of very large scale integration (VLSI). Provides a listing of the topics covered in the course outline, along with a sample of some of the final projects done by students. (TW)

  6. Brake Fundamentals. Automotive Articulation Project.

    ERIC Educational Resources Information Center

    Cunningham, Larry; And Others

    Designed for secondary and postsecondary auto mechanics programs, this curriculum guide contains learning exercises in seven areas: (1) brake fundamentals; (2) brake lines, fluid, and hoses; (3) drum brakes; (4) disc brake system and service; (5) master cylinder, power boost, and control valves; (6) parking brakes; and (7) trouble shooting. Each…

  7. Museum Techniques in Fundamental Education.

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Paris (France).

    Some museum techniques and methods can be used in fundamental educational programs without elaborate buildings or equipment; exhibitions should be based on valid presumptions and should take into account the "common sense" beliefs of people for whom the exhibit is designed. They can be used profitably in the economic development of local cultural…

  8. Fundamentals of Welding. Teacher Edition.

    ERIC Educational Resources Information Center

    Fortney, Clarence; And Others

    These instructional materials assist teachers in improving instruction on the fundamentals of welding. The following introductory information is included: use of this publication; competency profile; instructional/task analysis; related academic and workplace skills list; tools, materials, and equipment list; and 27 references. Seven units of…

  9. Status of Fundamental Physics Program

    NASA Technical Reports Server (NTRS)

    Lee, Mark C.

    2003-01-01

    Update on the Fundamental Physics Program. JEM/EF slip: two-year delay. Reduced budget. Community support and advocacy led by Professor Nick Bigelow. Reprogramming led by the Fred O'Callaghan/JPL team. LTMPF M1 mission (DYNAMX and SUMO). PARCS. Carrier re-baselined on JEM/EF.

  10. Light as a Fundamental Particle

    ERIC Educational Resources Information Center

    Weinberg, Steven

    1975-01-01

    Presents two arguments concerning the role of the photon. One states that the photon is just another particle distinguished by a particular value of charge, spin, mass, lifetime, and interaction properties. The second states that the photon plays a fundamental role with a deep relation to ultimate formulas of physics. (GS)

  11. Minimal Marking: A Success Story

    ERIC Educational Resources Information Center

    McNeilly, Anne

    2014-01-01

    The minimal-marking project conducted in Ryerson's School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The "minimal-marking" concept (Haswell, 1983), which requires…

  12. Principles of Modern Soccer.

    ERIC Educational Resources Information Center

    Beim, George

    This book is written to give a better understanding of the principles of modern soccer to coaches and players. In nine chapters the following elements of the game are covered: (1) the development of systems; (2) the principles of attack; (3) the principles of defense; (4) training games; (5) strategies employed in restarts; (6) physical fitness…

  13. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1970-01-01

    This is the first of a new series of brief anecdotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…

  14. Fundamentals of microelectronic equipment design

    NASA Astrophysics Data System (ADS)

    Nenashev, A. P.; Koledov, L. A.

    Microelectronic equipment design is reviewed with reference to general design principles and criteria, role of standardization, design of microcircuit components, and assembly techniques. Particular attention is given to the design of hybrid integrated circuitry and discrete radioelectronic components. Design methods for providing adequate temperature conditions and protection from humidity and mechanical overloads as well as component layout criteria are also discussed.

  15. Fundamentals of Aqueous Microwave Chemistry

    EPA Science Inventory

    The first chemical revolution changed modern life with a host of excellent amenities and services, but created serious problems related to environmental pollution. After 150 years of current chemistry principles and practices, we need a radical change to a new type of chemistry k...

  16. Diesel Fundamentals. Teacher Edition (Revised).

    ERIC Educational Resources Information Center

    Clark, Elton; And Others

    This module is one of a series of teaching guides that cover diesel mechanics. The module contains 4 sections and 19 units. Section A--Orientation includes the following units: introduction to diesel mechanics and shop safety; basic shop tools; test equipment and service tools; fasteners; bearings; and seals. Section B--Engine Principles and…

  17. Superpower nuclear minimalism

    SciTech Connect

    Graben, E.K.

    1992-01-01

    During the Cold War, the United States and the Soviet Union competed in building weapons; now it seems that America and Russia are competing to get rid of them the fastest. The lengthy process of formal arms control has been replaced by exchanges of unilateral force reductions and proposals for reciprocal reductions not necessarily codified by treaty. Should superpower nuclear strategies change along with force postures? President Bush has yet to make a formal pronouncement on post-Cold War American nuclear strategy, and it is uncertain whether the Soviet/Russian doctrine of reasonable sufficiency formulated in the Gorbachev era actually heralds a change in strategy. Some of the provisions in the most recent round of unilateral proposals put forth by Presidents Bush and Yeltsin in January 1992 are compatible with a change in strategy; whether such a change has actually occurred remains to be seen. With the end of the Cold War and the breakup of the Soviet Union, the strategic environment has fundamentally changed, so it would seem logical to reexamine strategy as well. There are two main schools of nuclear strategic thought: a maximalist school, mutual assured destruction (MAD), which emphasizes counterforce superiority and nuclear war-fighting capability, and a MAD-plus school, which emphasizes survivability of an assured destruction capability along with the ability to deliver small, limited nuclear attacks in the event that conflict occurs. The MAD-plus strategy is based on an attempt to conventionalize nuclear weapons, which is unrealistic.

  18. Radiation protection principles of NCRP.

    PubMed

    Kase, Kenneth R

    2004-09-01

    The current recommendations of the National Council on Radiation Protection and Measurements (NCRP) relative to ionizing radiation are based on radiation protection principles that developed historically as information about radiation effects on human populations became available. Because the NCRP Charter states that the NCRP will cooperate with the International Commission on Radiological Protection (ICRP), the basic principles and recommendations for radiation protection of the NCRP are closely coupled with those of the ICRP. Thus, the fundamental principles of justification, optimization, and dose limitation as initially stated in ICRP Publication 26 have been adopted and applied by the NCRP in its recommendations. ICRP and NCRP recommendations on dose limitation for the general public and for occupationally exposed individuals are based on the same analyses of radiation risk, and, while similar, there are differences reflecting the aspects of radiation application and exposure circumstances unique to the United States. The NCRP has recently extended its guidance to address exposure to individuals engaged in space activities. Several reports have been issued or are in preparation to provide recommendations on dose limitation and the development of radiation safety programs to apply the radiation protection principles in space activities. The biological basis for these recommendations is provided in these and accompanying NCRP reports. Recommendations for the application of basic radiation protection principles have been made in many reports over the years. Those that are most current appear in approximately 50 reports published in the last 15 y. These address radiation safety practices in industrial and medical institutions, control of radionuclides in the environment, protection of the public, and assessment of radiation risk. Some of the aspects of these recommendations will be discussed. Current recommendations related to radiation safety practice are based

  19. Minimally legally invasive dentistry.

    PubMed

    Lam, R

    2014-12-01

    One disadvantage of the rapid advances in modern dentistry is that treatment options have never been more varied or confusing. This is compounded by a more educated population, greatly assisted by online information, in an increasingly litigious society; a major concern in recent times is increased litigation against health practitioners. The manner in which courts handle disputes is ambiguous, and what is considered fair or just may not be reflected in the judicial process. Although legal decisions in Australia follow a doctrine of precedent, the law is not static and often reflects community sentiment. In medical litigation, this has seen the rejection of the Bolam principle in favour of greater patient rights. Recent court decisions may change the practice of dentistry, and it is important that the clinician is not caught unaware. The aim of this article is to discuss legal issues pertinent to the practice of modern dentistry through an analysis of legal cases that have shaped health law. Through these discussions, the importance of continuing professional development, professional association, and informed consent will be realized as a means to limit the legal complications of dental practice. PMID:25160114

  20. Minimal Length, Maximal Momentum and the Entropic Force Law

    NASA Astrophysics Data System (ADS)

    Nozari, Kourosh; Pedram, Pouria; Molkara, M.

    2012-04-01

    Different candidate quantum gravity proposals, such as string theory, noncommutative geometry, loop quantum gravity, and doubly special relativity, all predict the existence of a minimum observable length and/or a maximal momentum which modify the standard Heisenberg uncertainty principle. In this paper, we study the effects of minimal length and maximal momentum on the entropic force law formulated recently by E. Verlinde.
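
    For context, the unmodified entropic-force argument that such papers deform runs as follows (Verlinde's standard derivation, sketched in my notation):

        % Entropy change when a mass m is displaced by Delta x near the screen:
        \Delta S = 2\pi k_{B}\,\frac{mc}{\hbar}\,\Delta x
        % Holographic bit count and equipartition on a spherical screen:
        N = \frac{Ac^{3}}{G\hbar},\qquad A = 4\pi R^{2},\qquad
        E = Mc^{2} = \tfrac{1}{2}N k_{B}T
        % The entropic force then reproduces Newtonian gravity:
        F = T\,\frac{\Delta S}{\Delta x} = \frac{GMm}{R^{2}}

    A minimal length or maximal momentum modifies the entropy-area relation (the bit count N), which is the kind of correction to the force law studied here.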

  1. Minimal distances between SCFTs

    NASA Astrophysics Data System (ADS)

    Buican, Matthew

    2014-01-01

    We study lower bounds on the minimal distance in theory space between four-dimensional superconformal field theories (SCFTs) connected via broad classes of renormalization group (RG) flows preserving various amounts of supersymmetry (SUSY). For N = 1 RG flows, the ultraviolet (UV) and infrared (IR) endpoints of the flow can be parametrically close. On the other hand, for RG flows emanating from a maximally supersymmetric SCFT, the distance to the IR theory cannot be arbitrarily small, regardless of the amount of (non-trivial) SUSY preserved along the flow. The case of RG flows from N = 2 UV SCFTs is more subtle. We argue that for RG flows preserving the full N = 2 SUSY, there are various obstructions to finding examples with parametrically close UV and IR endpoints. Under reasonable assumptions, these obstructions include: unitarity, known bounds on the c central charge derived from associativity of the operator product expansion, and the central charge bounds of Hofman and Maldacena. On the other hand, for RG flows that break N = 2 to N = 1, it is possible to find IR fixed points that are parametrically close to the UV ones. In this case, we argue that if the UV SCFT possesses a single stress tensor, then such RG flows excite on the order of all the degrees of freedom of the UV theory. Furthermore, if the UV theory has some flavor symmetry, we argue that the UV central charges should not be too large relative to certain parameters in the theory.

  2. Slow Magic Angle Sample Spinning: A Non- or Minimally Invasive Method for High-Resolution 1H Nuclear Magnetic Resonance (NMR) Metabolic Profiling

    SciTech Connect

    Hu, Jian Z.

    2011-05-01

    High-resolution 1H magic angle spinning nuclear magnetic resonance (NMR), using a sample spinning rate of several kHz or more (i.e., high-resolution magic angle spinning (hr-MAS)), is a well-established method for metabolic profiling in intact tissues without the need for sample extraction. The one shortcoming of hr-MAS is that it is invasive and thus unusable for non-destructive detection. Recently, a method called slow-MAS, built on concepts from two-dimensional NMR spectroscopy, has emerged as an alternative for non- or minimally invasive metabolomics in intact tissues, including live animals, owing to the slow or ultra-slow sample spinning used. Although slow-MAS is a powerful method, its applications are hindered by experimental challenges. Correctly designing the experiment and choosing the appropriate slow-MAS method both require a fundamental understanding of the operating principles, in particular the details of line narrowing in the presence of molecular diffusion. However, these fundamental principles have not been fully disclosed in previous publications. The goal of this chapter is to provide an in-depth evaluation of the principles associated with slow-MAS techniques, emphasizing the challenges associated with a phantom sample consisting of glass beads and H2O, where an unusually large magnetic susceptibility field gradient is obtained.

  3. Cosmic polarization rotation: An astrophysical test of fundamental physics

    NASA Astrophysics Data System (ADS)

    di Serego Alighieri, Sperello

    2015-02-01

    Possible violations of fundamental physical principles, e.g. the Einstein equivalence principle on which all metric theories of gravity are based, including general relativity (GR), would lead to a rotation of the plane of polarization for linearly polarized radiation traveling over cosmological distances, the so-called cosmic polarization rotation (CPR). We review here the astrophysical tests which have been carried out so far to check if CPR exists. These are using the radio and ultraviolet polarization of radio galaxies and the polarization of the cosmic microwave background (both E-mode and B-mode). These tests so far have been negative, leading to upper limits of the order of one degree on any CPR angle, thereby increasing our confidence in those physical principles, including GR. We also discuss future prospects in detecting CPR or improving the constraints on it.

  4. The minimal autopoietic unit.

    PubMed

    Luisi, Pier Luigi

    2014-12-01

    It is argued that closed, cell-like compartments may have existed in prebiotic times, showing a simplified metabolism that brought about a primitive form of stationary state, a kind of homeostasis. The autopoietic primitive cell can be taken as an example, and there are preliminary experimental data supporting the possible existence of this primitive form of cell activity. The genetic code permits, among other things, the continuous self-reproduction of proteins; enzymic proteins permit the synthesis of nucleic acids, and in this way there is a perfect recycling between the two most important classes of biopolymers in our life. On the other hand, the genetic code is a complex machinery, which cannot be posited at the very earliest stages of the origin of life. The question then arises whether some form of alternative beginning, prior to the genetic code, would have been possible; this is the core of the question asked here. Is something with the flavor of early life conceivable prior to the genetic code? My answer is positive, although I am well aware that the term "conceivable" does not mean that this something can easily be realized experimentally. To illustrate my answer, I first go back to the operational description of cellular life given by the theory of autopoiesis. Accordingly, a living cell is an open system capable of self-maintenance, due to a process of internal self-regeneration of its components, all within a boundary which is itself a product from within. This is a universal code, valid not only for a cell but for any living macroscopic entity, as no living system exists on Earth which does not obey this principle. I added to that definition the term "open system" (which is not present in the primary literature (Varela, et al., 1974)) to make clear that every living system is indeed an open system; without this addition, it may seem that

  5. Driving Toward Guiding Principles

    PubMed Central

    Buckovich, Suzy A.; Rippen, Helga E.; Rozen, Michael J.

    1999-01-01

    As health care moves from paper to electronic data collection, providing easier access and dissemination of health information, the development of guiding privacy, confidentiality, and security principles is necessary to help balance the protection of patients' privacy interests against appropriate information access. A comparative review and analysis was done, based on a compilation of privacy, confidentiality, and security principles from many sources. Principles derived from ten identified sources were compared with each of the compiled principles to assess support level, uniformity, and inconsistencies. Of 28 compiled principles, 23 were supported by at least 50 percent of the sources. Technology could address at least 12 of the principles. Notable consistencies among the principles could provide a basis for consensus for further legislative and organizational work. It is imperative that all participants in our health care system work actively toward a viable resolution of this information privacy debate. PMID:10094065

  6. Minimal Higgs inflation

    NASA Astrophysics Data System (ADS)

    Hamada, Yuta; Kawai, Hikaru; Oda, Kin-ya

    2014-02-01

    We consider the possibility that the Higgs field in the Standard Model (SM) serves as an inflaton when its value is around the Planck scale. We assume that the SM is valid up to an ultraviolet cutoff scale Λ, which is slightly below the Planck scale, and that the Higgs potential becomes almost flat above Λ. Contrary to the ordinary Higgs inflation scenario, we do not assume the huge non-minimal coupling, of O(10^4), of the Higgs field to the Ricci scalar. We find that Λ must be less than 5×10^17 GeV in order to explain the observed fluctuation of the cosmic microwave background, no matter how we extrapolate the Higgs potential above Λ. The scale 10^17 GeV coincides with the perturbative string scale, which suggests that the SM is directly connected with string theory. For this to be true, the top quark mass is restricted to around 171 GeV, with which Λ can exceed 10^17 GeV. As a concrete example of the potential above Λ, we propose a simple log-type potential. The predictions of this specific model for e-foldings N_* = 50-60 are consistent with current observations; namely, the scalar spectral index is n_s = 0.977-0.983 and the tensor-to-scalar ratio 0
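
    Predictions of this type follow from the standard slow-roll machinery; the generic formulae (usual conventions, not the paper's specific potential) are:

        \epsilon = \frac{M_{\rm Pl}^{2}}{2}\left(\frac{V'}{V}\right)^{2},\qquad
        \eta = M_{\rm Pl}^{2}\,\frac{V''}{V},
        \qquad
        n_{s} \simeq 1 - 6\epsilon + 2\eta,\qquad r \simeq 16\epsilon,
        \qquad
        N_{*} = \frac{1}{M_{\rm Pl}^{2}}\int_{\phi_{\rm end}}^{\phi_{*}}\frac{V}{V'}\,d\phi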

  7. DOE Fundamentals Handbook: Classical Physics

    SciTech Connect

    Not Available

    1992-06-01

    The Classical Physics Fundamentals Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and technical staff with the fundamentals training necessary to ensure a basic understanding of physical forces and their properties. The handbook includes information on the units used to measure physical properties; vectors and how they are used to show the net effect of various forces; Newton's laws of motion and how to use them in force and motion applications; and the concepts of energy, work, and power, and how to measure and calculate the energy involved in various applications. This information provides personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility systems and equipment.

  8. Fundamental neutron physics at LANSCE

    SciTech Connect

    Greene, G.

    1995-10-01

    Modern neutron sources and science share a common origin in mid-20th-century scientific investigations concerned with the study of the fundamental interactions between elementary particles. Since the time of that common origin, neutron science and the study of elementary particles have evolved into quite disparate disciplines. The neutron became recognized as a powerful tool for studying condensed matter, with modern neutron sources being used (and justified) primarily as tools for neutron scattering and materials science research. The study of elementary particles has, of course, led to the development of rather different tools and is now dominated by activities performed at extremely high energies. Notwithstanding this trend, the study of fundamental interactions using neutrons has continued and remains a vigorous activity at many contemporary neutron sources. This research, like neutron scattering research, has benefited enormously from the development of modern high-flux neutron facilities. Future sources, particularly high-power spallation sources, offer exciting possibilities for continuing this research.

  9. Fundamentals of gas measurement I

    SciTech Connect

    Dodds, D.E.

    1995-12-01

    To truly understand gas measurement, a person must understand gas measurement fundamentals. These include the units of measurement, the behavior of the gas molecule, the properties of gases, the gas laws, and the methods and means of measuring gas. Since the quality of gas is often the responsibility of the gas measurement technician, it is important that he or she have a knowledge of natural gas chemistry.
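
    A minimal worked example of these fundamentals is the conversion of a metered volume at flowing conditions to base (standard) conditions with the real-gas law PV = ZnRT. The sketch below is my own; the base conditions and Z values are illustrative assumptions.

        def volume_at_base(v_flow_m3, p_flow_kpa, t_flow_k, z_flow,
                           p_base_kpa=101.325, t_base_k=288.15, z_base=1.0):
            """Same moles at two states: P1*V1/(Z1*T1) = P2*V2/(Z2*T2)."""
            return (v_flow_m3 * (p_flow_kpa / p_base_kpa)
                              * (t_base_k / t_flow_k)
                              * (z_base / z_flow))

        # 1000 m^3 metered at 3500 kPa(a) and 15 C with compressibility 0.92:
        print(f"{volume_at_base(1000.0, 3500.0, 288.15, 0.92):,.0f} m^3 at base")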

  10. TLS from fundamentals to practice

    PubMed Central

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Adams, Paul D.

    2014-01-01

    The Translation-Libration-Screw-rotation (TLS) model of rigid-body harmonic displacements, introduced in crystallography by Schomaker & Trueblood (1968), is now a routine tool in macromolecular studies and is a feature of most modern crystallographic structure refinement packages. In this review we consider a number of simple examples that illustrate important features of the TLS model. Based on these examples, simplified formulae are given for several special cases that may occur in structure modeling and refinement. The derivation of the general TLS formulae from basic principles is also provided. This manuscript describes the principles of TLS modeling, as well as select algorithmic details for practical application. An extensive list of references to applications of TLS in macromolecular crystallography refinement is provided. PMID:25249713
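
    How a TLS model generates per-atom anisotropic displacement parameters can be sketched in a few lines. Sign and S-matrix conventions vary between programs; the sketch below assumes the displacement u = t + (lambda x r) with S = <lambda t^T>, and all numbers are illustrative.

        import numpy as np

        def skew(r):
            """Matrix [r]_x such that [r]_x @ v = r x v."""
            x, y, z = r
            return np.array([[0.0, -z,  y],
                             [ z, 0.0, -x],
                             [-y,  x, 0.0]])

        def adp_from_tls(T, L, S, r):
            """U(r) = T + D L D^T + D S + S^T D^T, with D = -[r]_x."""
            D = -skew(r)
            return T + D @ L @ D.T + D @ S + S.T @ D.T

        # Isotropic translation (A^2) plus libration about z only (rad^2):
        T = 0.02 * np.eye(3)
        L = np.diag([0.0, 0.0, 0.001])
        S = np.zeros((3, 3))
        print(adp_from_tls(T, L, S, r=np.array([5.0, 0.0, 0.0])))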

  11. Dynamic sealing principles. [design configurations for fluid leakage control

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamental principles governing dynamic sealing operation are discussed. Different seals are described in terms of these principles. Despite the large variety of detailed construction, there appear to be some basic principles, or combinations of basic principles, by which all seals function. They are: (1) selection and control of seal geometry, (2) control of leakage fluid properties, and (3) control of forces acting on leakage fluids. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin-film seals, which enable leakage calculations to be made, are also presented.

  12. Fundamentals of small animal orthodontics.

    PubMed

    Surgeon, Thoulton W

    2005-07-01

    The basic principles and concepts that govern the discipline of orthodontics are explored. The movement of teeth is mediated primarily through the periodontal ligament. When the periodontal ligament is stretched, bone apposition occurs. Conversely, in areas of compression, bone resorption occurs. The subject tooth moves in the direction of the force. The orthodontist must be cognizant of the prevailing ethical guidelines and the functional needs of the patient. PMID:15979517

  13. Fundamentals of interpretation in echocardiography

    SciTech Connect

    Harrigan, P.; Lee, R.M.

    1985-01-01

    This illustrated book provides familiarity with the many clinical, physical, and electronic factors that bear on echocardiographic interpretation. Physical and clinical principles are integrated with considerations of anatomy and physiology to address interpretive problems. This approach yields, for example, sections on the physics and electronics of M-mode, cross-sectional, and Doppler systems which are informal, full of echocardiograms, virtually devoid of mathematics, and rigorously related to common issues faced by echocardiogram interpreters.

  14. Bacillus subtilis and Escherichia coli essential genes and minimal cell factories after one decade of genome engineering.

    PubMed

    Juhas, Mario; Reuß, Daniel R; Zhu, Bingyao; Commichau, Fabian M

    2014-11-01

    Investigation of essential genes, besides contributing to understanding the fundamental principles of life, has numerous practical applications. Essential genes can be exploited as building blocks of a tightly controlled cell 'chassis'. Bacillus subtilis and Escherichia coli K-12 are both well-characterized model bacteria used as hosts for a plethora of biotechnological applications. Determination of the essential genes that constitute the B. subtilis and E. coli minimal genomes is therefore of the highest importance. Recent advances have led to the modification of the original B. subtilis and E. coli essential gene sets identified 10 years ago. Furthermore, significant progress has been made in the area of genome minimization of both model bacteria. This review provides an update, with particular emphasis on the current essential gene sets and their comparison with the original gene sets identified 10 years ago. Special attention is focused on the genome reduction analyses in B. subtilis and E. coli and the construction of minimal cell factories for industrial applications. PMID:25092907

  15. Systems Biology Perspectives on Minimal and Simpler Cells

    PubMed Central

    Xavier, Joana C.; Patil, Kiran Raosaheb

    2014-01-01

    SUMMARY The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. PMID:25184563

  16. Low Temperature Detectors: Principles and Applications

    SciTech Connect

    Hilton, G. C.

    2009-12-16

    Despite the added cost and complexity of operating at sub-Kelvin temperatures, there are many measurement applications where the sensitivity and precision provided by low temperature detectors greatly outweigh any disadvantages. As a result, low temperature detectors are now finding wide application for measurements ranging from cosmology to homeland defense. In this tutorial I will introduce the basic operating principles and fundamental performance limits of several types of low temperature detectors.

  17. Principles of thermoacoustic energy harvesting

    NASA Astrophysics Data System (ADS)

    Avent, A. W.; Bowen, C. R.

    2015-11-01

    Thermoacoustic devices exploit a temperature gradient to produce powerful acoustic pressure waves, and the technology has a key role to play in energy harvesting systems. A timeline of the development of thermoacoustics is presented, from its earliest recorded example in glass blowing, through the Sondhauss and Rijke tubes, to Stirling engines and pulse-tube cryocooling. The review sets the current literature in context, identifies key publications, and highlights promising areas of research. The fundamental principles of thermoacoustic phenomena are explained, and design challenges and factors influencing efficiency are explored. Thermoacoustic processes involve complex multiphysical coupling and transient, highly non-linear relationships which are computationally expensive to model; appropriate numerical modelling techniques and options for analysis are presented. Potential methods of harvesting the energy in the acoustic waves are also examined.

  18. Fundamental Scientific Problems in Magnetic Recording

    SciTech Connect

    Schulthess, T.C.; Miller, M.K.

    2007-06-27

    Magnetic data storage technology is presently leading the high-tech industry in advancing device integration, doubling the storage density every 12 months. To continue these advancements and to achieve terabit-per-square-inch recording densities, new approaches to store and access data will be needed in about 3-5 years. In this project, a collaboration between Oak Ridge National Laboratory (ORNL), the Center for Materials for Information Technology (MINT) at the University of Alabama (UA), Imago Scientific Instruments, and Seagate Technologies was undertaken to address the fundamental scientific problems confronting the industry in meeting the upcoming challenges. The foci of this study were to: (1) develop atom probe tomography for atomic-scale imaging of the magnetic heterostructures used in magnetic data storage technology; (2) develop first-principles-based tools for the study of exchange bias, aimed at finding new antiferromagnetic materials to reduce the thickness of the pinning layer in the read head; and (3) develop high-moment magnetic materials and tools to study magnetic switching in nanostructures, aimed at developing improved writers for high-anisotropy magnetic storage media.

  19. Fundamental Limits to Cellular Sensing

    NASA Astrophysics Data System (ADS)

    ten Wolde, Pieter Rein; Becker, Nils B.; Ouldridge, Thomas E.; Mugler, Andrew

    2016-03-01

    In recent years experiments have demonstrated that living cells can measure low chemical concentrations with high precision, and much progress has been made in understanding what sets the fundamental limit to the precision of chemical sensing. Chemical concentration measurements start with the binding of ligand molecules to receptor proteins, which is an inherently noisy process, especially at low concentrations. The signaling networks that transmit the information on the ligand concentration from the receptors into the cell have to filter this receptor input noise as much as possible. These networks, however, are also intrinsically stochastic in nature, which means that they will also add noise to the transmitted signal. In this review, we will first discuss how the diffusive transport and binding of ligand to the receptor sets the receptor correlation time, which is the timescale over which fluctuations in the state of the receptor, arising from the stochastic receptor-ligand binding, decay. We then describe how downstream signaling pathways integrate these receptor-state fluctuations, and how the number of receptors, the receptor correlation time, and the effective integration time set by the downstream network, together impose a fundamental limit on the precision of sensing. We then discuss how cells can remove the receptor input noise while simultaneously suppressing the intrinsic noise in the signaling network. We describe why this mechanism of time integration requires three classes (groups) of resources—receptors and their integration time, readout molecules, energy—and how each resource class sets a fundamental sensing limit. We also briefly discuss the scheme of maximum-likelihood estimation, the role of receptor cooperativity, and how cellular copy protocols differ from canonical copy protocols typically considered in the computational literature, explaining why cellular sensing systems can never reach the Landauer limit on the optimal trade
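
    The canonical benchmark behind these statements is the Berg-Purcell bound, quoted here in its standard form (up to O(1) factors) together with the receptor correlation time for simple binding kinetics:

        \left(\frac{\delta c}{\bar{c}}\right)^{2} \;\gtrsim\; \frac{1}{\pi D a \bar{c}\,T},
        \qquad
        \tau_{c} = \frac{1}{k_{\rm on}\bar{c} + k_{\rm off}}

    Here D is the ligand diffusion constant, a the receptor size, c-bar the mean concentration, and T the integration time set by the downstream network.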

  20. Solid Lubrication Fundamentals and Applications

    NASA Technical Reports Server (NTRS)

    Miyoshi, Kazuhisa

    2001-01-01

    Solid Lubrication Fundamentals and Applications provides a description of the adhesion, friction, abrasion, and wear behavior of solid-film lubricants and related tribological materials, including diamond and diamond-like solid films. The book details the properties of solid surfaces, clean surfaces, and contaminated surfaces, and discusses the structures and mechanical properties of natural and synthetic diamonds; chemical-vapor-deposited diamond film; and surface design and engineering toward wear-resistant, self-lubricating diamond films and coatings. The author provides selection and design criteria as well as applications for synthetic and natural coatings in the commercial, industrial, and aerospace industries.

  1. Fundamentals of Clinical Outcomes Assessment for Spinal Disorders: Clinical Outcome Instruments and Applications.

    PubMed

    Vavken, Patrick; Ganal-Antonio, Anne Kathleen B; Quidde, Julia; Shen, Francis H; Chapman, Jens R; Samartzis, Dino

    2015-08-01

    Study Design A broad narrative review. Objectives Outcome assessment in spinal disorders is imperative to help monitor the safety and efficacy of treatment in an effort to change clinical practice and improve patient outcomes. The following article, part two of a two-part series, discusses the various outcome tools and instruments utilized to address spinal disorders and their management. Methods A thorough review of the peer-reviewed literature was performed, irrespective of language, addressing outcome research, instruments and tools, and applications. Results Numerous articles addressing the development and implementation of health-related quality-of-life, neck and low back pain, overall pain, spinal deformity, and other condition-specific outcome instruments have been reported. Their applications in the context of clinical trials, economic analyses, and overall evidence-based orthopedics have been noted. Additional issues regarding the problems and potential sources of bias in utilizing outcome scales and the concept of the minimal clinically important difference are discussed. Conclusion Continuing research needs to assess the outcome instruments and tools used in clinical outcome assessment for spinal disorders. Understanding the fundamental principles of spinal outcome assessment may also advance the field of "personalized spine care." PMID:26225283

  2. PRINCIPLE OF INTERACTION REGION LOCAL CORRECTION

    SciTech Connect

    WEI,J.

    1999-09-07

    For hadron storage rings like the Relativistic Heavy Ion Collider (RHIC) and the Large Hadron Collider (LHC), the machine performance at collision is usually limited by the field quality of the interaction region (IR) magnets. A robust local correction for the IR region is valuable in improving the dynamic aperture with practically achievable magnet field quality. The authors present in this paper the action-angle kick minimization principle on which the local IR correction for both RHIC and the LHC is based.

  3. Application of Kick Minimization to the RTML 'Front End'

    SciTech Connect

    Tenenbaum, P.; /SLAC

    2007-02-03

    The "front end" of the ILC RTML constitutes the sections of the RTML which are upstream of the first RF cavity of the first-stage bunch compressor: specifically, the SKEW, COLL, TURN, SPIN, and EMIT sections. Although in principle it should be easy to transport the beam through these sections with low emittance growth, since the energy spread of the beam is relatively low, in practice it is difficult because of the large number of betatron wavelengths and strong focusing, especially in the TURN section. We report here on the use of the Kick Minimization Method for limiting the emittance growth in the "front end" of the RTML. Kick Minimization (KM) is a steering method which balances two optima: minimization of the RMS measured orbit on the BPMs (often called 1:1 steering), and minimization of the RMS corrector strength [1]. The simulation program used for these studies is Lucretia [2].
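
    The balance described above can be cast as a regularized least-squares problem: minimize the orbit RMS plus a weighted penalty on corrector strength. The sketch below is a generic illustration of that trade-off under assumed names (R is a hypothetical orbit-response matrix), not the actual Lucretia implementation:

      import numpy as np

      # Illustrative only: R maps corrector kicks to BPM readings (hypothetical
      # response matrix), y is the measured orbit, and w sets the balance
      # between orbit flatness (1:1 steering) and RMS corrector strength.
      def kick_minimization(R, y, w):
          # Solve min_theta ||y + R @ theta||**2 + w * ||theta||**2,
          # i.e. kicks that flatten the orbit without excessive strength.
          n = R.shape[1]
          A = R.T @ R + w * np.eye(n)
          return np.linalg.solve(A, -R.T @ y)

      rng = np.random.default_rng(0)
      R = rng.normal(size=(20, 8))   # 20 BPMs, 8 correctors (stand-in data)
      y = rng.normal(size=20)        # measured orbit readings
      theta = kick_minimization(R, y, w=0.1)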

  4. An entropy method for induced drag minimization

    NASA Technical Reports Server (NTRS)

    Greene, George C.

    1989-01-01

    A fundamentally new approach to the aircraft minimum induced drag problem is presented. The method, a 'viscous lifting line', is based on the minimum entropy production principle and does not require the planar wake assumption. An approximate, closed-form solution is obtained for several wing configurations, including a comparison of wing extension, winglets, and in-plane wing sweep, with and without a constraint on wing-root bending moment. Like the classical lifting-line theory, this theory predicts that induced drag is proportional to the square of the lift coefficient and inversely proportional to the wing aspect ratio. Unlike the classical theory, it predicts that induced drag is Reynolds number dependent and that the optimum spanwise circulation distribution is non-elliptic.
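
    For reference, the classical elliptic-loading result that this theory generalizes is the standard lifting-line formula (the Reynolds-number dependence described above is the paper's extension beyond it):

      \[
        C_{D,i} \;=\; \frac{C_L^{2}}{\pi\, e\, \mathrm{AR}} ,
      \]

    where AR is the wing aspect ratio and e is the span efficiency factor (e = 1 for an elliptic spanwise circulation distribution).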

  5. Influenza SIRS with Minimal Pneumonitis

    PubMed Central

    Erramilli, Shruti; Mannam, Praveen; Manthous, Constantine A.

    2016-01-01

    Although systemic inflammatory response syndrome (SIRS) is a known complication of severe influenza pneumonia, it has been reported very rarely in patients with minimal parenchymal lung disease. Here we report a case of severe SIRS, anasarca, and marked vascular phenomena with minimal or no pneumonitis. This case highlights that viruses, including influenza, may cause vascular dysregulation leading to SIRS, even without substantial visceral organ involvement.

  6. Guidelines for mixed waste minimization

    SciTech Connect

    Owens, C.

    1992-02-01

    Currently, there is no commercial mixed waste disposal available in the United States. Storage and treatment for commercial mixed waste are limited. Host state and compact region officials are encouraging their mixed waste generators to minimize their mixed wastes because of these management limitations. This document provides a guide to mixed waste minimization.

  7. Waste minimization handbook, Volume 1

    SciTech Connect

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  8. Guiding Principles for Evaluators.

    ERIC Educational Resources Information Center

    Shadish, William R., Ed.; And Others

    1995-01-01

    The 12 articles (including an index) of this theme issue are devoted to documenting and critiquing the American Evaluation Association's "Guiding Principles for Evaluators," a code of ethics and standards. The development of these principles is traced, and their strengths and weaknesses are analyzed at general and specific levels. (SLD)

  9. Assessment Principles and Tools

    PubMed Central

    Golnik, Karl C.

    2014-01-01

    The goal of ophthalmology residency training is to produce competent ophthalmologists. Competence can only be determined by appropriately assessing resident performance. There are accepted guiding principles that should be applied to competence assessment methods. These principles are enumerated herein and ophthalmology-specific assessment tools that are available are described. PMID:24791100

  10. Principled Grammar Teaching

    ERIC Educational Resources Information Center

    Batstone, Rob; Ellis, Rod

    2009-01-01

    A key aspect of the acquisition of grammar for second language learners involves learning how to make appropriate connections between grammatical forms and the meanings which they typically signal. We argue that learning form/function mappings involves three interrelated principles. The first is the Given-to-New Principle, where existing world…

  11. Hamilton's Principle for Beginners

    ERIC Educational Resources Information Center

    Brun, J. L.

    2007-01-01

    I find that students have difficulty with Hamilton's principle, at least the first time they come into contact with it, and therefore it is worth designing some examples to help students grasp its complex meaning. This paper supplies the simplest example to consolidate the learning of the quoted principle: that of a free particle moving along a…

  12. The genetic difference principle.

    PubMed

    Farrelly, Colin

    2004-01-01

    In the newly emerging debates about genetics and justice three distinct principles have begun to emerge concerning what the distributive aim of genetic interventions should be. These principles are: genetic equality, a genetic decent minimum, and the genetic difference principle. In this paper, I examine the rationale of each of these principles and argue that genetic equality and a genetic decent minimum are ill-equipped to tackle what I call the currency problem and the problem of weight. The genetic difference principle is the most promising of the three principles and I develop this principle so that it takes seriously the concerns of just health care and distributive justice in general. Given the strains on public funds for other important social programmes, the costs of pursuing genetic interventions and the nature of genetic interventions, I conclude that a more lax interpretation of the genetic difference principle is appropriate. This interpretation stipulates that genetic inequalities should be arranged so that they are to the greatest reasonable benefit of the least advantaged. Such a proposal is consistent with prioritarianism and provides some practical guidance for non-ideal societies--that is, societies that do not have the endless amount of resources needed to satisfy every requirement of justice. PMID:15186680

  13. The Principles of Leadership.

    ERIC Educational Resources Information Center

    Burns, Gerald P.

    The primary but not exclusive concern in this monograph is the principles and qualities of dynamic leaders of people rather than of ideas or cultural and artistic pursuits. Theories of leadership in the past, present, and future are discussed, as are the principles, rewards, exercise, and philosophy of leadership. A bibliography is included. (MSE)

  14. Government Information Policy Principles.

    ERIC Educational Resources Information Center

    Hernon, Peter

    1991-01-01

    Analyzes the utility of policy principles advanced by professional associations for public access to government information. The National Commission on Libraries and Information Science (NCLIS), the Information Industry Association (IIA), and the Office of Technology Assessment (OTA) urge the adoption of principles for the dissemination of public…

  15. Principlism and communitarianism.

    PubMed

    Callahan, D

    2003-10-01

    The decline in the interest in ethical theory is first outlined, as a background to the author's discussion of principlism. The author's own stance, that of a communitarian philosopher, is then described, before the subject of principlism itself is addressed. Two problems stand in the way of the author's embracing principlism: its individualistic bias and its capacity to block substantive ethical inquiry. The author finds the more serious problem to be its blocking function. Discussing the four scenarios, the author finds that the utility of principlism is shown in the two scenarios about Jehovah's Witnesses, but that when it comes to selling kidneys for transplantation and germline enhancement, principlism is of little help. PMID:14519838

  16. The simplicity principle in perception and cognition.

    PubMed

    Feldman, Jacob

    2016-09-01

    The simplicity principle, traditionally referred to as Occam's razor, is the idea that simpler explanations of observations should be preferred to more complex ones. In recent decades the principle has been clarified via the incorporation of modern notions of computation and probability, allowing a more precise understanding of how exactly complexity minimization facilitates inference. The simplicity principle has found many applications in modern cognitive science, in contexts as diverse as perception, categorization, reasoning, and neuroscience. In all these areas, the common idea is that the mind seeks the simplest available interpretation of observations or, more precisely, that it balances a bias toward simplicity with a somewhat opposed constraint to choose models consistent with perceptual or cognitive observations. This brief tutorial surveys some of the uses of the simplicity principle across cognitive science, emphasizing how complexity minimization in a number of forms has been incorporated into probabilistic models of inference. WIREs Cogn Sci 2016, 7:330-340. doi: 10.1002/wcs.1406. For further resources related to this article, please visit the WIREs website. PMID:27470193
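
    One standard formalization of the balance described above is the two-part minimum description length (MDL) criterion: prefer the hypothesis that minimizes the summed code length of the model and of the data encoded with its help. Schematically (standard MDL notation, not a formula from this tutorial):

      \[
        H^{*} \;=\; \arg\min_{H}\ \bigl[\, L(H) \;+\; L(D \mid H) \,\bigr] ,
      \]

    where L(H) measures hypothesis complexity in bits and L(D|H) measures misfit to the observations D.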

  17. Challenging the principle of proportionality.

    PubMed

    Andersson, Anna-Karin Margareta

    2016-04-01

    The first objective of this article is to examine one aspect of the principle of proportionality (PP) as advanced by Alan Gewirth in his 1978 book Reason and Morality. Gewirth claims that being capable of exercising agency to some minimal degree is a property that justifies having at least prima facie rights not to get killed. However, according to the PP, before the being possesses the capacity for exercising agency to that minimal degree, the extent of her rights depends on to what extent she approaches possession of agential capacities. One interpretation of the PP holds that variations in degree of possession of the physical constitution necessary to exercise agency are morally relevant. The other interpretation holds that only variations in degree of actual mental capacity are morally relevant. The first of these interpretations is vastly more problematic than the other. The second objective is to argue that according to the most plausible interpretation of the PP, the fetus' level of development before at least the 20th week of pregnancy does not affect the fetus' moral rights status. I then suggest that my argument is not restricted to such fetuses, although extending my argument to more developed fetuses requires caution. PMID:26839114

  18. Astronomical reach of fundamental physics.

    PubMed

    Burrows, Adam S; Ostriker, Jeremiah P

    2014-02-18

    Using basic physical arguments, we derive by dimensional and physical analysis the characteristic masses and sizes of important objects in the universe in terms of just a few fundamental constants. This exercise illustrates the unifying power of physics and the profound connections between the small and the large in the cosmos we inhabit. We focus on the minimum and maximum masses of normal stars, the corresponding quantities for neutron stars, the maximum mass of a rocky planet, the maximum mass of a white dwarf, and the mass of a typical galaxy. To zeroth order, we show that all these masses can be expressed in terms of either the Planck mass or the Chandrasekhar mass, in combination with various dimensionless quantities. With these examples, we expose the deep interrelationships imposed by nature between disparate realms of the universe and the amazing consequences of the unifying character of physical law. PMID:24477692
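
    The two anchor masses mentioned above have simple closed forms in terms of fundamental constants; the standard expressions (stated here for orientation, up to factors of order unity, and not quoted from the paper itself) are

      \[
        m_{\mathrm{Pl}} \;=\; \sqrt{\frac{\hbar c}{G}} \;\approx\; 2.2\times 10^{-8}\ \mathrm{kg},
        \qquad
        M_{\mathrm{Ch}} \;\sim\; \frac{m_{\mathrm{Pl}}^{3}}{m_{p}^{2}} \;\approx\; 1.4\, M_{\odot} ,
      \]

    where m_p is the proton mass.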

  19. Fundamentals of air quality systems

    SciTech Connect

    Noll, K.E.

    1999-08-01

    The book uses numerous examples to demonstrate how basic design concepts can be applied to the control of air emissions from industrial sources. It focuses on the design of air pollution control devices for the removal of gases and particles from industrial sources, and provides detailed, specific design methods for each major air pollution control system. Individual chapters provide design methods that include both theory and practice with emphasis on the practical aspect by providing numerous examples that demonstrate how air pollution control devices are designed. Contents include air pollution laws, air pollution control devices; physical properties of air, gas laws, energy concepts, pressure; motion of airborne particles, filter and water drop collection efficiency; fundamentals of particulate emission control; cyclones; fabric filters; wet scrubbers; electrostatic precipitators; control of volatile organic compounds; adsorption; incineration; absorption; control of gaseous emissions from motor vehicles; practice problems (with solutions) for the P.E. examination in environmental engineering. Design applications are featured throughout.

  20. Understand vacuum-system fundamentals

    SciTech Connect

    Martin, G.R. ); Lines, J.R. ); Golden, S.W. )

    1994-10-01

    Crude vacuum unit heavy vacuum gas-oil (HVGO) yield is significantly impacted by ejector-system performance, especially at conditions below 20 mmHg absolute pressure. A deepcut vacuum unit, to reliably meet the yields, calls for proper design of all the major pieces of equipment. Ejector-system performance at deepcut vacuum column pressures may be independently or concurrently affected by: atmospheric column overflash, stripper performance or cutpoint; vacuum column top temperature and heat balance; light vacuum gas-oil (LVGO) pumparound entrainment to the ejector system; cooling-water temperature; motive steam pressure; non-condensible loading, either air leakage or cracked light-end hydrocarbons; condensible hydrocarbons; intercondenser or aftercondenser fouling; ejector internal erosion or product build-up; and system vent back pressure. The paper discusses gas-oil yields; ejector-system fundamentals; condensers; vacuum-system troubleshooting; process operations; and a case study of deepcut operations.

  1. Fundamental reaction pathways during coprocessing

    SciTech Connect

    Stock, L.M.; Gatsis, J.G.

    1992-12-01

    The objective of this research was to investigate the fundamental reaction pathways in coal petroleum residuum coprocessing. Once the reaction pathways are defined, further efforts can be directed at improving those aspects of the chemistry of coprocessing that are responsible for the desired results such as high oil yields, low dihydrogen consumption, and mild reaction conditions. We decided to carry out this investigation by looking at four basic aspects of coprocessing: (1) the effect of fossil fuel materials on promoting reactions essential to coprocessing such as hydrogen atom transfer, carbon-carbon bond scission, and hydrodemethylation; (2) the effect of varied mild conditions on the coprocessing reactions; (3) determination of dihydrogen uptake and utilization under severe conditions as a function of the coal or petroleum residuum employed; and (4) the effect of varied dihydrogen pressure, temperature, and residence time on the uptake and utilization of dihydrogen and on the distribution of the coprocessed products. Accomplishments are described.

  2. Astronomical reach of fundamental physics

    PubMed Central

    Burrows, Adam S.; Ostriker, Jeremiah P.

    2014-01-01

    Using basic physical arguments, we derive by dimensional and physical analysis the characteristic masses and sizes of important objects in the universe in terms of just a few fundamental constants. This exercise illustrates the unifying power of physics and the profound connections between the small and the large in the cosmos we inhabit. We focus on the minimum and maximum masses of normal stars, the corresponding quantities for neutron stars, the maximum mass of a rocky planet, the maximum mass of a white dwarf, and the mass of a typical galaxy. To zeroth order, we show that all these masses can be expressed in terms of either the Planck mass or the Chandrasekhar mass, in combination with various dimensionless quantities. With these examples, we expose the deep interrelationships imposed by nature between disparate realms of the universe and the amazing consequences of the unifying character of physical law. PMID:24477692

  3. [INFORMATION, A FUNDAMENTAL PATIENT RIGHT?].

    PubMed

    Mémeteau, Gérard

    2015-03-01

    Although expressed before the "Lambert" case, which has led us to think about refusal and assent in the context of internal rights and conventional rights--and in the context of the patient's bed!--these simple remarks present the patient's right to medical information as a so-called fundamental right. But it can only be understood with a view to a treatment or other medical act; otherwise it has no reason to exist and is only an academic exercise, however exciting, of little use by itself. What if we reversed the terms of the problem and considered the right of the doctor to information? (The fine thesis of Ph. Gaston, Paris 8, 2 December 2014.) PMID:26606765

  4. Fundamental Travel Demand Model Example

    NASA Technical Reports Server (NTRS)

    Hanssen, Joel

    2010-01-01

    Instances of transportation models are abundant and detailed "how to" instruction is available in the form of transportation software help documentation. The purpose of this paper is to look at the fundamental inputs required to build a transportation model by developing an example passenger travel demand model. The example model reduces the scale to a manageable size for the purpose of illustrating the data collection and analysis required before the first step of the model begins. This aspect of the model development would not reasonably be discussed in software help documentation (it is assumed the model developer comes prepared). Recommendations are derived from the example passenger travel demand model to suggest future work regarding the data collection and analysis required for a freight travel demand model.

  5. Holographic viscosity of fundamental matter.

    PubMed

    Mateos, David; Myers, Robert C; Thomson, Rowan M

    2007-03-01

    A holographic dual of a finite-temperature SU(Nc) gauge theory with a small number of flavors Nf ≪ Nc [...] η/s ≥ 1/4π. Given the known results for the entropy density, the contribution of the fundamental matter η_fund is therefore enhanced at strong 't Hooft coupling λ; for example, η_fund ∼ λ Nc Nf T³ in four dimensions. Other transport coefficients are analogously enhanced. These results hold with or without a baryon number chemical potential. PMID:17358523
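
    For context, the viscosity-to-entropy-density benchmark referred to above is the Kovtun-Son-Starinets bound, saturated by a wide class of strongly coupled gauge theories with gravity duals; restoring physical units it reads

      \[
        \frac{\eta}{s} \;\ge\; \frac{\hbar}{4\pi k_{B}} .
      \]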

  6. Cognition is … Fundamentally Cultural

    PubMed Central

    Bender, Andrea; Beller, Sieghard

    2013-01-01

    A prevailing concept of cognition in psychology is inspired by the computer metaphor. Its focus on mental states that are generated and altered by information input, processing, storage and transmission invites a disregard for the cultural dimension of cognition, based on three (implicit) assumptions: cognition is internal, processing can be distinguished from content, and processing is independent of cultural background. Arguing against each of these assumptions, we point out how culture may affect cognitive processes in various ways, drawing on instances from numerical cognition, ethnobiological reasoning, and theory of mind. Given the pervasive cultural modulation of cognition—on all of Marr’s levels of description—we conclude that cognition is indeed fundamentally cultural, and that consideration of its cultural dimension is essential for a comprehensive understanding. PMID:25379225

  7. Fundamental enabling issues in nanotechnology :

    SciTech Connect

    Floro, Jerrold Anthony; Foiles, Stephen Martin; Hearne, Sean Joseph; Hoyt, Jeffrey John; Seel, Steven Craig; Webb, Edmund Blackburn,; Morales, Alfredo Martin; Zimmerman, Jonathan A.

    2007-10-01

    To effectively integrate nanotechnology into functional devices, fundamental aspects of material behavior at the nanometer scale must be understood. Stresses generated during thin film growth strongly influence component lifetime and performance; stress has also been proposed as a mechanism for stabilizing supported nanoscale structures. Yet the intrinsic connections between the evolving morphology of supported nanostructures and stress generation are still a matter of debate. This report presents results from a combined experiment and modeling approach to study stress evolution during thin film growth. Fully atomistic simulations are presented predicting stress generation mechanisms and magnitudes during all growth stages, from island nucleation to coalescence and film thickening. Simulations are validated by electrodeposition growth experiments, which establish the dependence of microstructure and growth stresses on process conditions and deposition geometry. Sandia is one of the few facilities with the resources to combine experiments and modeling/theory in this close a fashion. Experiments revealed an ongoing coalescence process that generates significant tensile stress. Data from deposition experiments also support the existence of a kinetically limited compressive stress generation mechanism. Atomistic simulations explored island coalescence and deposition onto surfaces intersected by grain boundary structures to permit investigation of stress evolution during later growth stages, e.g. continual island coalescence and adatom incorporation into grain boundaries. The predictive capabilities of simulation permit direct determination of fundamental processes active in stress generation at the nanometer scale while connecting those processes, via new theory, to continuum models for much larger island and film structures. Our combined experiment and simulation results reveal the necessary materials science to tailor stress, and therefore performance, in

  8. Rare Isotopes and Fundamental Symmetries

    NASA Astrophysics Data System (ADS)

    Brown, B. Alex; Engel, Jonathan; Haxton, Wick; Ramsey-Musolf, Michael; Romalis, Michael; Savard, Guy

    2009-01-01

    Experiments searching for new interactions in nuclear beta decay / Klaus P. Jungmann -- The beta-neutrino correlation in sodium-21 and other nuclei / P. A. Vetter ... [et al.] -- Nuclear structure and fundamental symmetries / B. Alex Brown -- Schiff moments and nuclear structure / J. Engel -- Superallowed nuclear beta decay: recent results and their impact on V[symbol] / J. C. Hardy and I. S. Towner -- New calculation of the isospin-symmetry breaking correction to superallowed Fermi beta decay / I. S. Towner and J. C. Hardy -- Precise measurement of the [symbol]H to [symbol]He mass difference / D. E. Pinegar ... [et al.] -- Limits on scalar currents from the 0+ to 0+ decay of [symbol]Ar and isospin breaking in [symbol]Cl and [symbol]Cl / A. Garcia -- Nuclear constraints on the weak nucleon-nucleon interaction / W. C. Haxton -- Atomic PNC theory: current status and future prospects / M. S. Safronova -- Parity-violating nucleon-nucleon interactions: what can we learn from nuclear anapole moments? / B. Desplanques -- Proposed experiment for the measurement of the anapole moment in francium / A. Perez Galvan ... [et al.] -- The Radon-EDM experiment / Tim Chupp for the Radon-EDM collaboration -- The lead radius experiment (PREX) and parity violating measurements of neutron densities / C. J. Horowitz -- Nuclear structure aspects of Schiff moment and search for collective enhancements / Naftali Auerbach and Vladimir Zelevinsky -- The interpretation of atomic electric dipole moments: Schiff theorem and its corrections / C. -P. Liu -- T-violation and the search for a permanent electric dipole moment of the mercury atom / M. D. Swallows ... [et al.] -- The new concept for FRIB and its potential for fundamental interactions studies / Guy Savard -- Collinear laser spectroscopy and polarized exotic nuclei at NSCL / K. Minamisono -- Environmental dependence of masses and coupling constants / M. Pospelov.

  9. Minimizing waste in environmental restoration

    SciTech Connect

    Moos, L.; Thuot, J.R.

    1996-07-01

    Environmental restoration, decontamination and decommissioning, and facility dismantlement projects are not typically known for their waste minimization and pollution prevention efforts. Typical projects are driven by schedules and milestones, with little attention given to cost or waste minimization. Conventional wisdom in these projects is that the waste already exists and cannot be reduced or minimized. In fact, however, there are three significant areas where waste and cost can be reduced. Waste reduction can occur in three ways: beneficial reuse or recycling; segregation of waste types; and reducing the generation of secondary waste. This paper will discuss several examples of reuse, recycling, segregation, and secondary waste reduction at ANL restoration programs.

  10. [Essential genes, minimal genome and synthetic cell of bacteria: a review].

    PubMed

    Qiu, Dongru

    2012-05-01

    Single-cell prokaryotes represent a simple and primitive cellular life form. The identification of the essential genes of bacteria and the minimal genome for the free-living cellular life could provide insights into the origin, evolution, and essence of life forms. The principles, methodology, and recent progresses in the identification of essential genes and minimal genome and the creation of synthetic cells are reviewed and particularly the strategies for creating the minimal genome and the potential applications are introduced. PMID:22916492

  11. Irreversible information loss: Fundamental notions and entropy costs

    NASA Astrophysics Data System (ADS)

    Anderson, Neal G.

    2014-09-01

    Landauer's Principle (LP) associates an entropy increase with the irreversible loss of information from a physical system. Clear statement, unambiguous interpretation, and proper application of LP require precise, mutually consistent, and sufficiently general definitions for a set of interlocking fundamental notions and quantities (entropy, information, irreversibility, erasure). In this work, we critically assess some common definitions and quantities used or implied in statements of LP, and reconsider their definition within an alternative "referential" approach to physical information theory that embodies an overtly relational conception of physical information. We prove an inequality on the entropic cost of irreversible information loss within this context, as well as "referential analogs" of LP and its more general restatement by Bennett. Advantages of the referential approach for establishing fundamental limits on the physical costs of irreversible information loss in communication and computing systems are discussed throughout.
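
    The baseline statement that the "referential analogs" generalize is the standard Landauer bound: erasing one bit of information at temperature T dissipates at least

      \[
        Q_{\mathrm{diss}} \;\ge\; k_{B} T \ln 2 ,
        \qquad\text{equivalently}\qquad
        \Delta S_{\mathrm{env}} \;\ge\; k_{B} \ln 2 .
      \]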

  12. From fundamental fields to constituent quarks and nucleon form factors

    SciTech Connect

    Coester, F.

    1990-01-01

    Constituent-quark models formulated in the framework of nonrelativistic quantum mechanics have been successful in accounting for the mass spectra of mesons and baryons. Applications to elastic electron scattering require relativistic dynamics. Relativistic quantum mechanics of constituent quarks can be formulated by constructing a suitable unitary representation of the Poincare group on the three-quark Hilbert space. The mass and spin operators of this representation specify the relativistic model dynamics. The dynamics of fundamental quark fields, on the other hand, is specified by a Euclidean functional integral. In this paper I show how the dynamics of the fundamental fields can be related in principle to the Hamiltonian dynamics of quark particles through the properties of the Wightman functions. 14 refs.

  13. The thrust minimization problem and its applications

    NASA Astrophysics Data System (ADS)

    Ivanyukhin, A. V.; Petukhov, V. G.

    2015-07-01

    An indirect approach to the optimization of trajectories with finite thrust based on Pontryagin's maximum principle is discussed. The optimization is aimed at calculating the minimum thrust for a point-to-point flight completed within a given interval of time with a constant exhaust velocity and a constant power. This may help calculate the region of existence of the optimum trajectory with thrust switching: it is evident that the latter problem may be solved if minimum thrust is lower than or equal to the available thrust in the problem with switching. A technique for calculating the optimum trajectories with a finite thrust by solving the problem of minimization of the thrust acceleration with a subsequent numerical continuation with respect to the mass flow towards the thrust minimization problem is proposed. This technique offers an opportunity to detect degeneracies associated with the lack of thrust or specific impulse. In effect, it allows one to calculate the boundaries of the region of existence of trajectories with thrust switching and thus makes it possible to automate the process of solving the problem of optimization of trajectories with thrust switching.
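
    The continuation idea described above (solve an easy problem, then follow its solution as a parameter is swept toward the problem of interest) can be sketched generically. The toy example below applies the same pattern to a scalar root-finding problem; it is an illustration of the numerical technique only, not the authors' trajectory-optimization code:

      import numpy as np
      from scipy.optimize import fsolve

      # Follow a root of f(x; tau) = x**3 - x - tau as tau is swept from an
      # easy value (tau = 0, exact root x = 1) toward the target (tau = 2),
      # reusing each converged solution as the next initial guess.
      def residual(x, tau):
          return x**3 - x - tau

      x = np.array([1.0])                  # exact root at tau = 0
      for tau in np.linspace(0.0, 2.0, 21):
          x = fsolve(residual, x, args=(tau,))
      print(x)                             # ~1.5214, the root of x**3 - x = 2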

  14. Minimally invasive treatment options in fixed prosthodontics.

    PubMed

    Edelhoff, Daniel; Liebermann, Anja; Beuer, Florian; Stimmelmayr, Michael; Güth, Jan-Frederik

    2016-03-01

    Minimally invasive treatment options have become increasingly feasible in restorative dentistry, due to the introduction of the adhesive technique in combination with restorative materials featuring translucent properties similar to those of natural teeth. Mechanical anchoring of restorations via conventional cementation represents a predominantly subtractive treatment approach that is gradually being superseded by a primarily defect-oriented additive method in prosthodontics. Modifications of conventional treatment procedures have led to the development of an economical approach to the removal of healthy tooth structure. This is possible because the planned treatment outcome is defined in a wax-up before the treatment is commenced and this wax-up is subsequently used as a reference during tooth preparation. Similarly, resin-bonded FDPs and implants have made it possible to preserve the natural tooth structure of potential abutment teeth. This report describes a number of clinical cases to demonstrate the principles of modern prosthetic treatment strategies and discusses these approaches in the context of minimally invasive prosthetic dentistry. PMID:26925471

  15. [Minimally Invasive Open Surgery for Lung Cancer].

    PubMed

    Nakagawa, Kazuo; Watanabe, Shunichi

    2016-07-01

    Significant efforts have been made by surgeons over a long period to reduce the invasiveness of surgical procedures. Surgeons always keep in mind that the basic principle in performing less invasive surgical procedures for malignant tumors is to decrease the invasiveness for patients without compromising oncological curability and surgical safety. Video-assisted thoracic surgery (VATS) has been used increasingly as a minimally invasive approach to lung cancer surgery. However, whether VATS lobectomy is a less invasive procedure with an equivalent or better clinical effect compared with open lobectomy for patients with lung cancer remains controversial because of the absence of randomized prospective studies. The degree of difficulty of anatomical lung resection depends on the degree of fissure development, the mobility of hilar lymph nodes, and the degree of pleural adhesions. During pulmonary surgery, thoracic surgeons always have to deal with not only these difficulties but also unexpected events such as intraoperative bleeding. Recently, we have performed pulmonary resection for lung cancer with a minimally invasive open surgery (MIOS) approach. In this article, we introduce the surgical procedure of MIOS and demonstrate short-term results. Of course, the efficacy of MIOS needs to be further evaluated with long-term results. PMID:27440030

  16. Mitral valve surgery - minimally invasive

    MedlinePlus

    ... that does many of these procedures. Minimally invasive heart valve surgery has improved greatly in recent years. These ... WT, Mack MJ. Transcatheter cardiac valve interventions. Surg Clin North Am. 2009;89:951-66. ...

  17. Heart bypass surgery - minimally invasive

    MedlinePlus

    ... in 30-day outcomes in high-risk patients randomized to off-pump versus on-pump coronary bypass ... Thiele H, Neumann-Schniedewind P, Jacobs S, et al. Randomized comparison of minimally invasive direct coronary artery bypass ...

  18. Lanthanide upconversion luminescence at the nanoscale: fundamentals and optical properties.

    PubMed

    Nadort, Annemarie; Zhao, Jiangbo; Goldys, Ewa M

    2016-07-01

    Upconversion photoluminescence is a nonlinear effect where multiple lower energy excitation photons produce higher energy emission photons. This fundamentally interesting process has many applications in biomedical imaging, light source and display technology, and solar energy harvesting. In this review we discuss the underlying physical principles and their modelling using rate equations. We discuss how the understanding of photophysical processes enabled a strategic influence over the optical properties of upconversion especially in rationally designed materials. We subsequently present an overview of recent experimental strategies to control and optimize the optical properties of upconversion nanoparticles, focussing on their emission spectral properties and brightness. PMID:26986473
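
    Rate-equation modelling of the kind referred to above can be illustrated with a minimal three-level ladder (ground, intermediate, and emitting states) under continuous pumping; the level scheme and all rate constants below are illustrative assumptions, not parameters from the review:

      import numpy as np
      from scipy.integrate import solve_ivp

      P = 1.0e3             # pump (sequential absorption) rate, 1/s (assumed)
      w10, w21 = 1e4, 1e4   # decay rates of levels 1 and 2, 1/s (assumed)

      def rates(t, n):
          # Pump promotes N0 -> N1 -> N2; each excited level decays one step down.
          n0, n1, n2 = n
          return [-P*n0 + w10*n1,
                  P*n0 - (P + w10)*n1 + w21*n2,
                  P*n1 - w21*n2]

      sol = solve_ivp(rates, (0.0, 1e-2), [1.0, 0.0, 0.0])
      print(sol.y[2, -1])   # emitting-state population, ~P**2/(w10*w21) here
      # The ~P**2 scaling in the weak-pump limit is the hallmark nonlinearity
      # of two-photon upconversion.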

  19. Lanthanide upconversion luminescence at the nanoscale: fundamentals and optical properties

    NASA Astrophysics Data System (ADS)

    Nadort, Annemarie; Zhao, Jiangbo; Goldys, Ewa M.

    2016-07-01

    Upconversion photoluminescence is a nonlinear effect where multiple lower energy excitation photons produce higher energy emission photons. This fundamentally interesting process has many applications in biomedical imaging, light source and display technology, and solar energy harvesting. In this review we discuss the underlying physical principles and their modelling using rate equations. We discuss how the understanding of photophysical processes enabled a strategic influence over the optical properties of upconversion especially in rationally designed materials. We subsequently present an overview of recent experimental strategies to control and optimize the optical properties of upconversion nanoparticles, focussing on their emission spectral properties and brightness.

  20. The duality principle and inversion of Laplace-Stieltjes transforms

    NASA Astrophysics Data System (ADS)

    Pavelyev, A. G.

    2016-04-01

    The fundamental relation between the Laplace transform, the Stieltjes transform, and the generalized integral equation of refraction is revealed, and a duality principle is formulated for the solution of inverse problems of radio physics. New formulas for Laplace-transform inversion satisfying the duality principle are obtained. The relations found require no contour integration in the complex plane, which considerably simplifies the reconstruction of originals and makes it possible to control systematic errors in the experimental data.
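
    The transforms involved have the standard forms below; the classical identity that the Stieltjes transform is an iterated Laplace transform is presumably the fundamental relation being exploited:

      \[
        \mathcal{L}[f](s) = \int_{0}^{\infty} f(t)\, e^{-st}\, dt,
        \qquad
        \mathcal{S}[f](s) = \int_{0}^{\infty} \frac{f(t)}{s+t}\, dt,
        \qquad
        \mathcal{S}[f] = \mathcal{L}\bigl[\mathcal{L}[f]\bigr].
      \]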

  1. The SAMI Pilot Survey: the fundamental and mass planes in three low-redshift clusters

    NASA Astrophysics Data System (ADS)

    Scott, Nicholas; Fogarty, L. M. R.; Owers, Matt S.; Croom, Scott M.; Colless, Matthew; Davies, Roger L.; Brough, S.; Pracy, Michael B.; Bland-Hawthorn, Joss; Jones, D. Heath; Allen, J. T.; Bryant, Julia J.; Cortese, Luca; Goodwin, Michael; Green, Andrew W.; Konstantopoulos, Iraklis S.; Lawrence, J. S.; Richards, Samuel; Sharp, Rob

    2015-08-01

    Using new integral field observations of 106 galaxies in three nearby clusters, we investigate how the intrinsic scatter of the Fundamental Plane depends on the way in which the velocity dispersion and effective radius are measured. Our spatially resolved spectroscopy, combined with a cluster sample with negligible relative distance errors, allows us to derive a Fundamental Plane with minimal systematic uncertainties. Of the apertures we tested, we find that velocity dispersions measured within a circular aperture with radius equal to one effective radius minimize the intrinsic scatter of the Fundamental Plane. Using simple yet powerful Jeans dynamical models, we determine dynamical masses for our galaxies. Replacing luminosity in the Fundamental Plane with dynamical mass, we demonstrate that the resulting Mass Plane has further reduced scatter, consistent with zero intrinsic scatter. Using these dynamical models, we also find evidence for a possibly non-linear relationship between dynamical mass-to-light ratio and velocity dispersion.
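
    For reference, the Fundamental Plane discussed above is the empirical scaling between effective radius, velocity dispersion, and mean surface brightness, conventionally written in logarithmic form with coefficients fitted per sample:

      \[
        \log R_{e} \;=\; a\, \log \sigma_{e} \;+\; b\, \log \langle I \rangle_{e} \;+\; c .
      \]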

  2. Communication: Fitting potential energy surfaces with fundamental invariant neural network.

    PubMed

    Shao, Kejie; Chen, Jun; Zhao, Zhiqiang; Zhang, Dong H

    2016-08-21

    A more flexible neural network (NN) method using the fundamental invariants (FIs) as the input vector is proposed in the construction of potential energy surfaces for molecular systems involving identical atoms. Mathematically, FIs finitely generate the permutation invariant polynomial (PIP) ring. In combination with NN, fundamental invariant neural network (FI-NN) can approximate any function to arbitrary accuracy. Because FI-NN minimizes the size of input permutation invariant polynomials, it can efficiently reduce the evaluation time of potential energy, in particular for polyatomic systems. In this work, we provide the FIs for all possible molecular systems up to five atoms. Potential energy surfaces for OH3 and CH4 were constructed with FI-NN, with the accuracy confirmed by full-dimensional quantum dynamic scattering and bound state calculations. PMID:27544080
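
    To make the permutation-invariance requirement concrete, the sketch below builds an invariant input vector for three identical atoms from the elementary symmetric polynomials of Morse-transformed internuclear distances. This is a generic PIP-style illustration of the idea, not the fundamental-invariant generating sets tabulated by the authors:

      import numpy as np

      def invariant_input(r12, r13, r23, alpha=1.0):
          # Elementary symmetric polynomials of Morse variables y = exp(-r/alpha)
          # are unchanged by any relabeling of the three identical atoms.
          y = np.exp(-np.array([r12, r13, r23]) / alpha)
          e1 = y.sum()
          e2 = y[0]*y[1] + y[0]*y[2] + y[1]*y[2]
          e3 = y.prod()
          return np.array([e1, e2, e3])

      # Relabeling two atoms permutes the distances but leaves the input fixed:
      assert np.allclose(invariant_input(1.0, 1.2, 1.4),
                         invariant_input(1.2, 1.0, 1.4))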

  3. Minimizing pollutants with multimedia strategies

    SciTech Connect

    Phillips, J.B.; Hindawi, M.A.

    1997-01-01

    A multimedia approach to pollution prevention that focuses on minimizing or eliminating the production of pollutants is one of the most advantageous strategies to adopt in preparing an overall facility environmental plan. If processes are optimized to preclude or minimize the manufacture of streams containing pollutants, or to reduce the levels of pollutants in waste streams, then the task of multimedia pollution prevention becomes more manageable simply as a result of a smaller problem needing to be addressed. An orderly and systematic approach to waste minimization can result in a comprehensive strategy to reduce the production of waste streams and simultaneously improve the profitability of a process or industrial operation. There are a number of strategies for waste minimization that attack the problem via process chemistry or engineering. Examples include installation of low-NOx burners, selection of valves that minimize fugitive emissions, high-level switches on storage tanks, the use of in-plant stills for recycling and reusing solvents, and using water-based products instead of hydrocarbon-based products wherever possible. Other waste minimization countermeasures can focus on operations and maintenance (O&M) issues.

  4. Archimedes' Principle in Action

    ERIC Educational Resources Information Center

    Kires, Marian

    2007-01-01

    The conceptual understanding of Archimedes' principle can be verified in experimental procedures which determine mass and density using a floating object. This is demonstrated by simple experiments using graduated beakers. (Contains 5 figures.)

  5. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1972-01-01

    Collection of two short descriptions of chemical principles seen in life situations: the autocatalytic reaction seen in the bombardier beetle, and molecular potential energy used for quick roasting of beef. Brief reference is also made to methanol lighters. (PS)

  6. Global ethics and principlism.

    PubMed

    Gordon, John-Stewart

    2011-09-01

    This article examines the special relation between common morality and particular moralities in the four-principles approach and its use for global ethics. It is argued that the special dialectical relation between common morality and particular moralities is the key to bridging the gap between ethical universalism and relativism. The four-principles approach is a good model for a global bioethics by virtue of its ability to mediate successfully between universal demands and cultural diversity. The principle of autonomy (i.e., the idea of individual informed consent), however, does need to be revised so as to make it compatible with alternatives such as family- or community-informed consent. The upshot is that the contribution of the four-principles approach to global ethics lies in the so-called dialectical process and its power to deal with cross-cultural issues against the background of universal demands by joining them together. PMID:22073817

  7. Fundamentals of the DIGES code

    SciTech Connect

    Simos, N.; Philippacopoulos, A.J.

    1994-08-01

    Recently the authors have completed the development of the DIGES code (Direct GEneration of Spectra) for the US Nuclear Regulatory Commission. This paper presents the fundamental theoretical aspects of the code. The basic modeling involves a representation of typical building-foundation configurations as multi-degree-of-freedom dynamic systems which are subjected to dynamic inputs in the form of applied forces or pressures at the superstructure or in the form of ground motions. Both the deterministic and the probabilistic aspects of DIGES are described. Alternate ways of defining the seismic input for the estimation of in-structure spectra, and their consequences in terms of realistically appraising the variability of the structural response, are discussed in detail. These include definitions of the seismic input by ground acceleration time histories, ground response spectra, Fourier amplitude spectra or power spectral densities. Conversions of one of these forms to another, due to requirements imposed by certain analysis techniques, have been shown to lead, in certain cases, to controversial results. Further considerations include the definition of the seismic input as the excitation which is directly applied at the foundation of a structure or as the ground motion of the site of interest at a given point. In the latter case, issues related to transferring this motion to the foundation through convolution/deconvolution and generally through kinematic interaction approaches are considered.

  8. Fundamental studies of fusion plasmas

    SciTech Connect

    Aamodt, R.E.; Catto, P.J.; D'Ippolito, D.A.; Myra, J.R.; Russell, D.A.

    1992-05-26

    The major portion of this program is devoted to critical ICH phenomena. The topics include edge physics, fast wave propagation, ICH-induced high frequency instabilities, and a preliminary antenna design for Ignitor. This research was strongly coordinated with the world's experimental and design teams at JET, Culham, ORNL, and Ignitor. The results have been widely publicized at both general scientific meetings and topical workshops, including the specialty workshop on ICRF design and physics sponsored by Lodestar in April 1992. The combination of theory, empirical modeling, and engineering design in this program makes this research particularly important for the design of future devices and for the understanding and performance projections of present tokamak devices. Additionally, the development of a diagnostic of runaway electrons on TEXT has proven particularly useful for the fundamental understanding of energetic electron confinement. This work has led to a better quantitative basis for quasilinear theory and the role of magnetic vs. electrostatic field fluctuations in electron transport. An APS invited talk was given on this subject and collaboration with PPPL personnel was also initiated. Ongoing research on these topics will continue for the remainder of the contract period and the strong collaborations are expected to continue, enhancing both the relevance of the work and its immediate impact on areas needing critical understanding.

  9. Technological fundamentals of endoscopic haemostasis.

    PubMed

    Reidenbach, H D

    1992-01-01

    In order to perform endoscopic haemostasis there exist several different mechanical, biochemical and thermal methods, which may be applied together with rigid or fully flexible endoscopes in different situations. The technological fundamentals of convective, conductive and radiative heat transfer, the irradiation with coherent electromagnetic waves like microwaves and laser radiation and the resistive heating by RF-current are described. A review of the state of the art of haemostatic coagulation by laser radiation (photocoagulation) and radio-frequency currents (surgical diathermy, high-frequency coagulation) is given. The wavelength-dependent interactions of coherent light waves are compared especially for the three mainly different laser types, i.e., carbon-dioxide-, neodymium-YAG- and argon-ion-laser. The well-known disadvantages of the conventional RF-coagulation are overcome by the so-called electrohydrothermosation (EHT), i.e. the liquid-assisted application of resistive heating of biological tissues to perform haemostasis. Different technological solutions for bipolar RF-coagulation probes including ball-tips and forceps are shown and the first experimental results are discussed in comparison. PMID:1595405

  10. Do goldfish miss the fundamental?

    NASA Astrophysics Data System (ADS)

    Fay, Richard R.

    2003-10-01

    The perception of harmonic complexes was studied in goldfish using classical respiratory conditioning and a stimulus generalization paradigm. Groups of animals were initially conditioned to several harmonic complexes with a fundamental frequency (f0) of 100 Hz. In some cases the f0 component was present, and in other cases, the f0 component was absent. After conditioning, animals were tested for generalization to novel harmonic complexes having different f0's, some with f0 present and some with f0 absent. Generalization gradients always peaked at 100 Hz, indicating that the pitch value of the conditioning complexes was consistent with the f0, whether or not f0 was present in the conditioning or test complexes. Thus, goldfish do not miss the fundamental with respect to a pitch-like perceptual dimension. However, generalization gradients tended to have different skirt slopes for the f0-present and f0-absent conditioning and test stimuli. This suggests that goldfish distinguish between f0-present/absent stimuli, probably on the basis of a timbre-like perceptual dimension. These and other results demonstrate that goldfish respond to complex sounds as if they possessed perceptual dimensions similar to pitch and timbre as defined for human and other vertebrate listeners. [Work supported by NIH/NIDCD.]
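
    The stimulus manipulation described above is easy to reproduce. The sketch below synthesizes an f0-present and an f0-absent harmonic complex (sample rate, duration, and harmonic numbers are illustrative choices, not those of the experiment):

      import numpy as np

      fs = 44100                      # sample rate, Hz (assumed)
      t = np.arange(0, 0.5, 1/fs)     # 500 ms stimulus (assumed)
      f0 = 100.0                      # fundamental frequency, Hz

      def harmonic_complex(f0, harmonics, t):
          # Sum of equal-amplitude cosine harmonics of f0.
          return sum(np.cos(2*np.pi*f0*h*t) for h in harmonics)

      present = harmonic_complex(f0, [1, 2, 3, 4], t)  # f0 component present
      absent = harmonic_complex(f0, [2, 3, 4], t)      # f0 component absent
      # Both waveforms repeat every 1/f0 = 10 ms, so their periodicity pitch
      # corresponds to 100 Hz even when the 100 Hz component itself is absent.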

  11. Principles of Tendon Transfer.

    PubMed

    Wilbur, Danielle; Hammert, Warren C

    2016-08-01

    Tendon transfers provide a substitute, either temporary or permanent, when function is lost due to neurologic injury in stroke, cerebral palsy or central nervous system lesions, peripheral nerve injuries, or injuries to the musculotendinous unit itself. This article reviews the basic principles of tendon transfer, which are important when planning surgery and essential for an optimal outcome. In addition, concepts for coapting the tendons during surgery and general principles to be followed during the rehabilitation process are discussed. PMID:27387072

  12. BOOK REVIEWS: Quantum Mechanics: Fundamentals

    NASA Astrophysics Data System (ADS)

    Whitaker, A.

    2004-02-01

mechanics, which is assumed, but to examine whether it gives a consistent account of measurement. The conclusion is that after a measurement, interference terms are ‘effectively’ absent; the set of ‘one-to-one correlations between states of the apparatus and the object’ has the same form as that of everyday statistics and is thus a probability distribution. This probability distribution refers to potentialities, only one of which is actually realized in any one trial. Opinions may differ on whether their treatment is any less vulnerable to criticisms such as those of Bell. To sum up, Gottfried and Yan’s book contains a vast amount of knowledge and understanding. As well as explaining the way in which quantum theory works, it attempts to illuminate fundamental aspects of the theory. A typical example is the ‘fable’ elaborated in Gottfried’s article in Nature cited above, that if Newton were shown Maxwell’s equations and the Lorentz force law, he could deduce the meaning of E and B, but if Maxwell were shown Schrödinger’s equation, he could not deduce the meaning of Psi. For use with a well-constructed course (and, of course, this is the avowed purpose of the book; a useful range of problems is provided for each chapter), or for the relative expert getting to grips with particular aspects of the subject or aiming for a deeper understanding, the book is certainly ideal. It might be suggested, though, that, even compared to the first edition, the isolated learner might find the wide range of topics, and the very large number of mathematical and conceptual techniques, introduced in necessarily limited space, somewhat overwhelming. The second book under consideration, that of Schwabl, contains ‘Advanced’ elements of quantum theory; it is designed for a course following on from one for which Gottfried and Yan, or Schwabl’s own ‘Quantum Mechanics’, might be recommended. It is the second edition in English, and is a translation of the third German edition

  13. Communication: Fundamental measure theory for hard disks: fluid and solid.

    PubMed

    Roth, Roland; Mecke, Klaus; Oettel, Martin

    2012-02-28

    Two-dimensional hard-particle systems are rather easy to simulate but surprisingly difficult to treat by theory. Despite their importance from both theoretical and experimental points of view, theoretical approaches are usually qualitative or at best semi-quantitative. Here, we present a density functional theory based on the ideas of fundamental measure theory for two-dimensional hard-disk mixtures, which allows for the first time an accurate description of the structure of the dense fluid and the equation of state for the solid phase within the framework of density functional theory. The properties of the solid phase are obtained by freely minimizing the functional. PMID:22380024
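
    "Freely minimizing the functional" refers to the standard density-functional procedure: the equilibrium density profile is the one that minimizes the grand-potential functional. In standard DFT notation (not specific to this paper):

      \[
        \Omega[\rho] \;=\; \mathcal{F}[\rho] \;+\; \int d\mathbf{r}\; \rho(\mathbf{r})\, \bigl( V_{\mathrm{ext}}(\mathbf{r}) - \mu \bigr),
        \qquad
        \frac{\delta \Omega[\rho]}{\delta \rho(\mathbf{r})} \;=\; 0 ,
      \]

    where F[ρ] is the (here, fundamental-measure) intrinsic free-energy functional and μ is the chemical potential.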

  14. Improved fundamental frequency coding in cochlear implant signal processing.

    PubMed

    Milczynski, Matthias; Wouters, Jan; van Wieringen, Astrid

    2009-04-01

    A new signal processing algorithm for improved pitch perception in cochlear implants is proposed. The algorithm realizes fundamental frequency (F0) coding by explicitly modulating the amplitude of the electrical stimulus. The proposed processing scheme is compared with the standard advanced combination encoder strategy in psychophysical tasks related to music perception. Possible filter-bank and loudness cues between the strategies under study were minimized to focus predominantly on differences in temporal processing. The results demonstrate significant benefits provided by the new coding strategy for pitch ranking, melodic contour identification, and familiar melody identification. PMID:19354401
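
    Explicit F0 coding by amplitude modulation, as described above, amounts to imposing a periodic envelope at the estimated fundamental on each stimulation channel. The sketch below is a generic illustration of that idea (the function name, modulation shape, and depth parameter are assumptions, not the published strategy):

      import numpy as np

      def modulate_envelopes(envelopes, f0, fs, depth=1.0):
          # envelopes: (n_channels, n_samples) per-channel envelope array
          # f0: estimated fundamental frequency, Hz; fs: sample rate, Hz
          t = np.arange(envelopes.shape[1]) / fs
          carrier = 0.5 * (1.0 + np.cos(2*np.pi*f0*t))   # raised cosine at F0
          return envelopes * ((1.0 - depth) + depth * carrier)

      env = np.abs(np.random.default_rng(1).normal(size=(22, 8000)))
      out = modulate_envelopes(env, f0=150.0, fs=8000)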

  15. New approach to nonperturbative quantum mechanics with minimal length uncertainty

    NASA Astrophysics Data System (ADS)

    Pedram, Pouria

    2012-01-01

    The existence of a minimal measurable length is a common feature of various approaches to quantum gravity such as string theory, loop quantum gravity, and black-hole physics. In this scenario, all commutation relations are modified and the Heisenberg uncertainty principle is changed to the so-called Generalized (Gravitational) Uncertainty Principle (GUP). Here, we present a one-dimensional nonperturbative approach to quantum mechanics with a minimal length uncertainty relation which implies X = x to all orders and P = p + (β/3)p³ to first order in the GUP parameter β, where X and P are the generalized position and momentum operators and [x, p] = iℏ. We show that this formalism is an equivalent representation of the seminal proposal by Kempf, Mangano, and Mann and predicts the same physics. However, this proposal reveals many significant aspects of the generalized uncertainty principle in a simple and comprehensive form, and the existence of a maximal canonical momentum is manifest through this representation. The problems of the free particle and the harmonic oscillator are exactly solved in this GUP framework and the effects of GUP on the thermodynamics of these systems are also presented. Although X, P, and the Hamiltonian of the harmonic oscillator all are formally self-adjoint, the careful study of the domains of these operators shows that only the momentum operator remains self-adjoint in the presence of the minimal length uncertainty. We finally discuss the difficulties with the definition of potentials with infinitely sharp boundaries.
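
    For reference, the deformed algebra of the Kempf-Mangano-Mann proposal referred to above reads, to the order considered here,

      \[
        [X, P] \;=\; i\hbar\,\bigl(1 + \beta p^{2}\bigr),
        \qquad
        X = x,
        \qquad
        P = p\,\Bigl(1 + \tfrac{\beta}{3}\, p^{2}\Bigr),
      \]

    which reproduces the minimal-length commutator to first order in β.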

  16. Fundamental mechanisms of micromachine reliability

    SciTech Connect

    DE BOER,MAARTEN P.; SNIEGOWSKI,JEFFRY J.; KNAPP,JAMES A.; REDMOND,JAMES M.; MICHALSKE,TERRY A.; MAYER,THOMAS K.

    2000-01-01

    Due to extreme surface-to-volume ratios, adhesion and friction are critical properties for the reliability of Microelectromechanical Systems (MEMS), but are not well understood. In this LDRD the authors established test structures, metrology, and numerical modeling to conduct studies on adhesion and friction in MEMS. They then concentrated on measuring the effect of environment on MEMS adhesion. Polycrystalline silicon (polysilicon) is the primary material of interest in MEMS because of its integrated-circuit process compatibility, low stress, high strength, and conformal deposition nature. A plethora of useful micromachined device concepts have been demonstrated using Sandia National Laboratories' sophisticated in-house capabilities. One drawback to polysilicon is that in air the surface oxidizes, is high-energy, and is hydrophilic (i.e., it wets easily). This can lead to catastrophic failure because surface forces can cause MEMS parts that are brought into contact to adhere rather than perform their intended function. A fundamental concern is how environmental constituents such as water will affect adhesion energies in MEMS. The authors first demonstrated an accurate method to measure adhesion, as reported in Chapter 1. In Chapters 2 through 5, they then studied the effect of water on adhesion depending on the surface condition (hydrophilic or hydrophobic). As described in Chapter 2, they find that the adhesion energy of hydrophilic MEMS surfaces is high and increases exponentially with relative humidity (RH). Surface roughness is the controlling mechanism for this relationship. Adhesion can be reduced by several orders of magnitude by silane coupling agents applied via solution processing. They decrease the surface energy and render the surface hydrophobic (i.e., it does not wet easily). However, only a molecular monolayer coats the surface. In Chapters 3-5 the authors map out the extent to which the monolayer reduces adhesion versus RH. They find that adhesion is independent of

  17. Fundamental Mechanisms of Interface Roughness

    SciTech Connect

    Randall L. Headrick

    2009-01-06

    Publication-quality results were obtained for several experiments and materials systems, including: (i) Patterning and smoothening of sapphire surfaces by energetic Ar+ ions. Grazing Incidence Small Angle X-ray Scattering (GISAXS) experiments were performed in the system at the National Synchrotron Light Source (NSLS) X21 beamline. Ar+ ions in the energy range from 300 eV to 1000 eV were used to produce ripples on the surfaces of single-crystal sapphire. It was found that the ripple wavelength varies strongly with the angle of incidence of the ions, increasing significantly as the angle from normal is varied from 55° to 35°. A smooth region was found for ion incidence less than 35° away from normal incidence. In this region a strong smoothening mechanism, with strength proportional to the second derivative of the height of the surface, was found to be responsible for the effect. The discovery of this phase transition between stable and unstable regimes as the angle of incidence is varied has also stimulated new work by other groups in the field. (ii) Growth of Ge quantum dots on Si(100) and (111). We discovered the formation of quantum wires on 4° misoriented Si(111) using real-time GISAXS during the deposition of Ge. The results represent the first time-resolved GISAXS study of Ge quantum dot formation. (iii) Sputter deposition of amorphous thin films and multilayers composed of WSi2 and Si. Our in-situ GISAXS experiments reveal fundamental roughening and smoothing phenomena on surfaces during film deposition. The main result of this work is that the WSi2 layers actually become smoother during deposition, due to the smoothening effect of energetic particles in the sputter deposition process.

  18. Fundamental Studies of Recombinant Hydrogenases

    SciTech Connect

    Adams, Michael W

    2014-01-25

    This research addressed the long-term goals of understanding the assembly and organization of hydrogenase enzymes, of reducing them in size and complexity, of determining structure/function relationships, including energy conservation via charge separation across membranes, and of screening for novel H2 catalysts. A key overall goal of the proposed research was to define and characterize minimal hydrogenases that are produced in high yields and are oxygen-resistant. Remarkably, in spite of decades of research carried out on hydrogenases, it has not been possible to readily manipulate or design the enzyme using molecular biology approaches, since a recombinant form produced in a suitable host was not available. Such resources are essential if we are to understand what constitutes a “minimal” hydrogenase and to design such catalysts with certain properties, such as resistance to oxygen, extreme stability, and specificity for a given electron donor. The model system for our studies is Pyrococcus furiosus, a hyperthermophile that grows optimally at 100°C and contains three different nickel-iron ([NiFe]) hydrogenases. Hydrogenases I and II are cytoplasmic, while the other, MBH, is an integral membrane protein that functions both to evolve H2 and to pump protons. Three important breakthroughs were made during the funding period with P. furiosus soluble hydrogenase I (SHI). First, we produced an active recombinant form of SHI in E. coli by the co-expression of sixteen genes using anaerobically-induced promoters. Second, we genetically engineered P. furiosus to overexpress SHI by an order of magnitude compared to the wild-type strain. Third, we generated the first ‘minimal’ form of SHI, one that contained two rather than four subunits. This dimeric form was stable and active, and directly interacted with a pyruvate-oxidizing enzyme without any intermediate electron carrier. The research resulted in five peer-reviewed publications.

  19. Perfusion Magnetic Resonance Imaging: A Comprehensive Update on Principles and Techniques

    PubMed Central

    Li, Ka-Loh; Ostergaard, Leif; Calamante, Fernando

    2014-01-01

    Perfusion is a fundamental biological function that refers to the delivery of oxygen and nutrients to tissue by means of blood flow. Perfusion MRI is sensitive to the microvasculature and has been applied in a wide variety of clinical applications, including the classification of tumors, identification of stroke regions, and characterization of other diseases. Perfusion MRI techniques are classified according to whether or not an exogenous contrast agent is used. Bolus methods, with injection of a contrast agent, provide better sensitivity and higher spatial resolution, and are therefore more widely used in clinical applications. However, arterial spin-labeling methods provide a unique opportunity to measure cerebral blood flow without requiring an exogenous contrast agent and have better accuracy for quantification. Importantly, MRI-based perfusion measurements are minimally invasive and involve no ionizing radiation or radioisotopes. This review summarizes comprehensive, updated knowledge of the physical principles and techniques of perfusion MRI. PMID:25246817

  20. Structural principles for computational and de novo design of 4Fe-4S metalloproteins.

    PubMed

    Nanda, Vikas; Senn, Stefan; Pike, Douglas H; Rodriguez-Granillo, Agustina; Hansen, Will A; Khare, Sagar D; Noy, Dror

    2016-05-01

    Iron-sulfur centers in metalloproteins can access multiple oxidation states over a broad range of potentials, allowing them to participate in a variety of electron transfer reactions and to serve as catalysts for high-energy redox processes. The nitrogenase FeMoco cluster converts dinitrogen to ammonia in an eight-electron transfer process. The bacterial ferredoxin, containing two [4Fe-4S] clusters, is an evolutionarily ancient metalloprotein fold and is thought to be a primordial progenitor of extant oxidoreductases. Controlling the chemical transformations mediated by iron-sulfur centers, such as nitrogen fixation and hydrogen production, as well as the electron transfer reactions involved in photosynthesis, is of tremendous importance for sustainable chemistry and energy production initiatives. As such, there is significant interest in the design of iron-sulfur proteins as minimal models to gain fundamental understanding of complex natural systems and as lead molecules for industrial and energy applications. Herein, we discuss salient structural characteristics of natural iron-sulfur proteins and how they guide principles for design. Model structures of past designs are analyzed in the context of these principles, potential directions for enhanced designs are presented, and new areas of iron-sulfur protein design are proposed. This article is part of a Special Issue entitled Biodesign for Bioenergetics - the design and engineering of electronic transfer cofactors, protein networks, edited by Ronald L. Koder and J.L. Ross Anderson. PMID:26449207

  1. Minimally invasive video-assisted versus minimally invasive nonendoscopic thyroidectomy.

    PubMed

    Fík, Zdeněk; Astl, Jaromír; Zábrodský, Michal; Lukeš, Petr; Merunka, Ilja; Betka, Jan; Chovanec, Martin

    2014-01-01

    Minimally invasive video-assisted thyroidectomy (MIVAT) and minimally invasive nonendoscopic thyroidectomy (MINET) represent well-accepted and reproducible techniques developed with the main goal of improving cosmetic outcome, accelerating healing, and increasing patient comfort following thyroid surgery. Between 2007 and 2011, a prospective nonrandomized study of patients undergoing minimally invasive thyroid surgery was performed to compare the advantages and disadvantages of the two techniques. There were no significant differences in the length of incision needed to perform the surgical procedures. Mean duration of hemithyroidectomy was comparable in both groups, but total thyroidectomy was more time-consuming to perform by MIVAT. More patients underwent MIVAT procedures without active drainage in the postoperative course, and we also observed a trend toward less pain in the same group. This was paralleled by a statistically significant decrease in the administration of both opiate and nonopiate analgesics. We encountered two cases of recurrent laryngeal nerve palsy, in the MIVAT group only. MIVAT and MINET represent a safe and feasible alternative to conventional thyroid surgery in selected cases, and this prospective study has shown minimal differences between the two techniques. PMID:24800227

  2. The traveltime holographic principle

    NASA Astrophysics Data System (ADS)

    Huang, Yunsong; Schuster, Gerard T.

    2015-01-01

    Fermat's interferometric principle is used to compute interior transmission traveltimes τpq from exterior transmission traveltimes τsp and τsq. Here, the exterior traveltimes are computed for sources s on a boundary B that encloses a volume V of interior points p and q. Once the exterior traveltimes are computed, no further ray tracing is needed to calculate the interior times τpq. Therefore this interferometric approach can be more efficient than explicitly computing interior traveltimes τpq by ray tracing. Moreover, the memory requirement of the traveltimes is reduced by one dimension, because the boundary B is of one fewer dimension than the volume V. An application of this approach is demonstrated with interbed multiple (IM) elimination. Here, the IMs in the observed data are predicted from the migration image and are subsequently removed by adaptive subtraction. This prediction is enabled by the knowledge of interior transmission traveltimes τpq computed according to Fermat's interferometric principle. We denote this principle as the ‘traveltime holographic principle’, by analogy with the holographic principle in cosmology where information in a volume is encoded on the region's boundary.
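
    The paper's exact construction is not reproduced in this abstract; the sketch below (function and array names are ours) illustrates one plausible way a Fermat/triangle-inequality argument lets interior times be read off from exterior ones: τsq <= τsp + τpq for every boundary source s, with equality when the ray from s to q passes through p, so the tightest difference recovers τpq, assuming such a stationary source exists on B.

        import numpy as np

        def interior_traveltime(t_sp, t_sq):
            """Estimate tau_pq from exterior traveltimes (assumed max-difference form).

            t_sp, t_sq: traveltimes from each boundary source s on B to the
            interior points p and q, sampled over all sources.
            """
            t_sp, t_sq = np.asarray(t_sp), np.asarray(t_sq)
            return float(np.max(np.abs(t_sq - t_sp)))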

  3. Applying the four principles.

    PubMed

    Macklin, R

    2003-10-01

    Gillon is correct that the four principles provide a sound and useful way of analysing moral dilemmas. As he observes, the approach using these principles does not provide a unique solution to dilemmas. This can be illustrated by alternatives to Gillon's own analysis of the four case scenarios. In the first scenario, a different set of factual assumptions could yield a different conclusion about what is required by the principle of beneficence. In the second scenario, although Gillon's conclusion is correct, what is open to question is his claim that what society regards as the child's best interest determines what really is in the child's best interest. The third scenario shows how it may be reasonable for the principle of beneficence to take precedence over autonomy in certain circumstances, yet like the first scenario, the ethical conclusion relies on a set of empirical assumptions and predictions of what is likely to occur. The fourth scenario illustrates how one can draw different conclusions based on the importance given to the precautionary principle. PMID:14519836

  4. Astronomia Motivadora no Ensino Fundamental

    NASA Astrophysics Data System (ADS)

    Melo, J.; Voelzke, M. R.

    2008-09-01

    The main objective of this work is to develop students' interest in the sciences through Astronomy. A survey with questions about Astronomy was conducted with 161 students of Ensino Fundamental (elementary and middle school), with the aim of discovering the students' prior knowledge of the subject. It was found, for example, that 29.3% of 6th-grade students correctly answered what an eclipse is, 30.0% of the 8th grade knew what Astronomy studies, while 42.3% of 5th-grade students were able to define the Sun. The intention is to expand the participating classes and to work, mainly in a hands-on way, with: dimensions and scales in the Solar System, the construction of a small telescope, and questions such as day and night, the seasons of the year, and eclipses. The aim is also to address other Physics content, such as optics in the construction of the telescope, and mechanics in the work with scales and measurements and in the use of a lamp to represent the Sun when discussing eclipses, as well as other disciplines, such as Mathematics in unit conversions and the rule of three; Arts in modeling or drawing the planets; History itself in connection with the search for the origin of the universe; and Informatics, which enables faster searches for information as well as simulations and visualizations of important images. It is believed that Astronomy is important in the teaching-learning process, since it allows the discussion of intriguing topics such as the origin of the universe, space travel, and the existence or not of life on other planets, in addition to current topics such as new technologies.

  5. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    NASA Astrophysics Data System (ADS)

    Luo, Keqin

    1999-11-01

    The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in U.S. industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which costs the industry tremendously in waste treatment and disposal and hinders the further development of the industry. It has become, therefore, an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while production competitiveness is still maintained. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and
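
    The dissertation's models are not reproduced in this abstract; as a flavor of what a "first-principles-based process dynamic model" for the rinsing step can look like, a minimal ideally mixed rinse-tank mass balance is sketched below (all names and numbers are illustrative assumptions, not values from the work):

        import numpy as np

        # Ideal-mixing mass balance for a single rinse tank:
        #   V dC/dt = Q (C_in - C) + D C_drag
        # V: tank volume [L], Q: rinse-water flow [L/min],
        # D: drag-out rate [L/min], C_drag: drag-out concentration [g/L].
        def rinse_tank(C0, V, Q, D, C_drag, C_in=0.0, dt=0.1, t_end=60.0):
            """Integrate the rinse-tank concentration with forward Euler."""
            n = int(t_end / dt)
            C = np.empty(n + 1)
            C[0] = C0
            for k in range(n):
                C[k + 1] = C[k] + dt * (Q * (C_in - C[k]) + D * C_drag) / V
            return C

        # Example: 200 L tank, 10 L/min rinse water, 0.05 L/min drag-out at 80 g/L.
        profile = rinse_tank(C0=0.0, V=200.0, Q=10.0, D=0.05, C_drag=80.0)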

  6. Spaceborne receivers: Basic principles

    NASA Technical Reports Server (NTRS)

    Stacey, J. M.

    1984-01-01

    The underlying principles of operation of microwave receivers for space observations of planetary surfaces were examined. The design philosophy of the receiver as it is applied to operate functionally as an efficient receiving system, the principles of operation of the key components of the receiver, and the important differences among receiver types are explained. The operating performance and the sensitivity expectations for both the modulated and total power receiver configurations are outlined. The expressions are derived from first principles and are developed through the important intermediate stages to form practical and easily applied equations. The transfer of thermodynamic energy from point to point within the receiver is illustrated. The language of microwave receivers is that of applied statistics.
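
    The report's expressions are not quoted in this summary, but the textbook end results for an ideal total-power radiometer and for a Dicke-switched (modulated) receiver are (standard relations, not necessarily the report's notation)

        \Delta T_{\mathrm{rms}} = \frac{T_{\mathrm{sys}}}{\sqrt{B\tau}}
        \quad \text{(total power)},
        \qquad
        \Delta T_{\mathrm{rms}} = \frac{2\,T_{\mathrm{sys}}}{\sqrt{B\tau}}
        \quad \text{(Dicke-switched)},

    where T_sys is the system noise temperature, B the predetection bandwidth, and τ the integration time.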

  7. Bloch principle for elliptic differential operators with periodic coefficients

    NASA Astrophysics Data System (ADS)

    Zhikov, V. V.; Pastukhova, S. E.

    2016-04-01

    Differential operators corresponding to elliptic equations of divergent type with 1-periodic coefficients are considered. The equations are set in Sobolev spaces with an arbitrary 1-periodic Borel measure on the entire space R^d. In the study of the spectrum of operators of this kind, the Bloch principle is of fundamental importance. According to this principle, all points of the desired spectrum are obtained by studying the equation on the unit cube with quasiperiodic boundary conditions. The Bloch principle for problems in the above formulation is proved in several versions. Examples are given of the application of the principle to finding the spectrum of specific operators, for example the Laplacian in a weighted space or on a singular structure of lattice type.
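
    In its simplest form (notation ours), the principle states that the spectrum of the periodic operator A is exhausted by the spectra of cell problems with quasiperiodic (Bloch) boundary conditions:

        u(x + e_j) = e^{i k_j} u(x), \quad j = 1, \dots, d,
        \qquad
        \sigma(A) = \bigcup_{k \in [0, 2\pi)^{d}} \sigma(A_k),

    where A_k denotes the operator on the unit cube with quasimomentum k.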

  8. Application of principles of integrated agricultural systems: results from farmer panels

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An Integrated Agricultural Systems working group comprised of USDA-ARS scientists is examining different agricultural systems from various geographic regions of the United States to determine fundamental principles that underlie successful integrated agricultural systems. Our hypothesis is that prin...

  9. Teaching/learning principles

    NASA Technical Reports Server (NTRS)

    Hankins, D. B.; Wake, W. H.

    1981-01-01

    The potential remote sensing user community is enormous, and the teaching and training tasks are even larger; however, some underlying principles may be synthesized and applied at all levels from elementary school children to sophisticated and knowledgeable adults. The basic rules applying to each of the six major elements of any training course and the underlying principle involved in each rule are summarized. The six identified major elements are: (1) field sites for problems and practice; (2) lectures and inside study; (3) learning materials and resources (the kit); (4) the field experience; (5) laboratory sessions; and (6) testing and evaluation.

  10. In quest of constitutional principles of "neurolaw".

    PubMed

    Pizzetti, Federico Gustavo

    2011-01-01

    The growing use of brain imaging technology and the development of cognitive neuroscience pose unaccustomed challenges to legal systems. Until now, the fields of law most affected have been civil and criminal law and procedure, but the constitutional dimension of "neurolaw" should not be underestimated. As the capacity to investigate and trace brain mechanisms and functional neural activities increases, it becomes urgent to recognize and define the inalienable rights and fundamental values that must be protected and safeguarded at the "constitutional level" of norms in the face of this new techno-scientific power: human dignity, personal identity, authenticity, and the pursuit of individual "happiness". As with the laws regulating research and experimentation on the human genome adopted in past years, one may also ask whether the above-mentioned fundamental principles of "neurolaw" should be established and regulated at the European and international level as well. PMID:23057208

  11. Core Principles for Transforming Remedial Education: A Joint Statement

    ERIC Educational Resources Information Center

    Jobs for the Future, 2012

    2012-01-01

    As a result of new research and promising practice, we have more clarity than ever about how we can fundamentally transform our developmental education system to improve success for all students. To propel the movement forward, this statement offers a set of clear and actionable principles that, although not the final word on remedial education…

  12. Compression as a Universal Principle of Animal Behavior

    ERIC Educational Resources Information Center

    Ferrer-i-Cancho, Ramon; Hernández-Fernández, Antoni; Lusseau, David; Agoramoorthy, Govindasamy; Hsu, Minna J.; Semple, Stuart

    2013-01-01

    A key aim in biology and psychology is to identify fundamental principles underpinning the behavior of animals, including humans. Analyses of human language and the behavior of a range of non-human animal species have provided evidence for a common pattern underlying diverse behavioral phenomena: Words follow Zipf's law of brevity (the…

  13. The Didactic Principles and Their Applications in the Didactic Activity

    ERIC Educational Resources Information Center

    Marius-Costel, Esi

    2010-01-01

    The evaluation and reevaluation of the fundamental didactic principles presuppose the acceptance, at the level of instructive-educative activity, of a new educational paradigm. Thus, understanding that paradigm implies assuming, at a conceptual-theoretical level, approaches in which the didactic aspects find their usefulness by relating to value…

  14. Chemical principles of single-molecule electronics

    NASA Astrophysics Data System (ADS)

    Su, Timothy A.; Neupane, Madhav; Steigerwald, Michael L.; Venkataraman, Latha; Nuckolls, Colin

    2016-03-01

    The field of single-molecule electronics harnesses expertise from engineering, physics and chemistry to realize circuit elements at the limit of miniaturization; it is a subfield of nanoelectronics in which the electronic components are single molecules. In this Review, we survey the field from a chemical perspective and discuss the structure-property relationships of the three components that form a single-molecule junction: the anchor, the electrode and the molecular bridge. The spatial orientation and electronic coupling between each component profoundly affect the conductance properties and functions of the single-molecule device. We describe the design principles of the anchor group, the influence of the electronic configuration of the electrode and the effect of manipulating the structure of the molecular backbone and of its substituent groups. We discuss single-molecule conductance switches as well as the phenomenon of quantum interference and then trace their fundamental roots back to chemical principles.

  15. The Fundamental Values of Academic Integrity: Honesty, Trust, Respect, Fairness, Responsibility.

    ERIC Educational Resources Information Center

    Duke Univ., Durham, NC. Center for Academic Integrity.

    The Center for Academic Integrity defines academic integrity as a commitment, even in the face of adversity, to five fundamental values: honesty, trust, fairness, respect, and responsibility. From these values come principles of behavior that enable academic communities to translate ideals into action. This essay discusses each of these values and…

  16. FUNDAMENTAL MASS TRANSFER MODEL FOR INDOOR AIR EMISSION FROM SURFACE COATINGS

    EPA Science Inventory

    The paper, discusses the work of researchers at the U.S. EPA's Air and Energy Engineering Research Laboratory (Indoor Air Branch) who are evaluating mass transfer models based on fundamental principles to determine their effectiveness in predicting emissions from indoor architect...

  17. Evidence for the Fundamental Difference Hypothesis or Not?: Island Constraints Revisited

    ERIC Educational Resources Information Center

    Belikova, Alyona; White, Lydia

    2009-01-01

    This article examines how changes in linguistic theory affect the debate between the fundamental difference hypothesis and the access-to-Universal Grammar (UG) approach to SLA. With a focus on subjacency (Chomsky, 1973), a principle of UG that places constraints on "wh"-movement and that has frequently been taken as a test case for verifying…

  18. The uncertainty principle and quantum chaos

    NASA Technical Reports Server (NTRS)

    Chirikov, Boris V.

    1993-01-01

    The conception of quantum chaos is described in some detail. The most striking feature of this novel phenomenon is that all the properties of classical dynamical chaos persist here but, typically, only on finite and different time scales. The ultimate origin of such universal quantum stability lies in the fundamental uncertainty principle, which makes discrete the phase space and, hence, the spectrum of bounded quantum motion. Reformulation of the ergodic theory, as a part of the general theory of dynamical systems, is briefly discussed.

  19. Minimally invasive aortic valve surgery.

    PubMed

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-09-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold standard treatment for symptomatic patients. This treatment has demonstrably proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and report the results from different experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity, mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, either with a partial upper sternotomy or a right anterior minithoracotomy provides early- and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  20. LLNL Waste Minimization Program Plan

    SciTech Connect

    Not Available

    1990-02-14

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). The waste minimization policy field has undergone continuous change since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985. A series of informal revisions was made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance to hazardous waste generators on the elements of a Waste Minimization Program was issued. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction Reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes, final draft January 1990. This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes, and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs.

  1. WASTE MINIMIZATION OPPORTUNITY ASSESSMENT MANUAL

    EPA Science Inventory

    Waste minimization (WM) is a policy specifically mandated by the U.S. Congress in the 1984 Hazardous and Solid Wastes Amendments to the Resource Conservation and Recovery Act (RCRA). The RCRA regulations require that generators of hazardous waste have a program in place to reduce...

  2. Assembly of a minimal protocell

    NASA Astrophysics Data System (ADS)

    Rasmussen, Steen

    2007-03-01

    What is minimal life, how can we make it, and how can it be useful? We present experimental and computational results towards bridging nonliving and living matter, which results in life that is different and much simpler than contemporary life. A simple yet tightly coupled catalytic cooperation between genes, metabolism, and container forms the design underpinnings of our protocell, which is a minimal self-replicating molecular machine. Experimentally, we have recently demonstrated this coupling by having an informational molecule (8-oxoguanine) catalytically control the light-driven metabolic (Ru-bpy-based) production of container materials (fatty acids). This is a significant milestone towards assembling a minimal self-replicating molecular machine. Recent theoretical investigations indicate that coordinated exponential component growth should naturally emerge as a result of such a catalytic coupling between the main protocellular components. A 3-D dissipative particle dynamics (DPD) study of the full protocell life-cycle exposes a number of anticipated systemic issues associated with the remaining experimental challenges for the implementation of the minimal protocell. Finally, we outline how more general self-replicating materials could be useful.

  3. A Defense of Semantic Minimalism

    ERIC Educational Resources Information Center

    Kim, Su

    2012-01-01

    Semantic Minimalism is a position about the semantic content of declarative sentences, i.e., the content that is determined entirely by syntax. It is defined by the following two points: "Point 1": The semantic content is a complete/truth-conditional proposition. "Point 2": The semantic content is useful to a theory of…

  5. General Quantum Interference Principle and Duality Computer

    NASA Astrophysics Data System (ADS)

    Long, Gui-Lu

    2006-05-01

    In this article, we propose a general principle of quantum interference for quantum systems and, based on this, a new type of computing machine, the duality computer, that may in principle outperform both the classical computer and the quantum computer. According to the general principle of quantum interference, the very essence of quantum interference is the interference of the sub-waves of the quantum system itself. A quantum system considered here can be any quantum system: a single microscopic particle, a composite quantum system such as an atom or a molecule, or a loose collection of a few quantum objects such as two independent photons. In the duality computer, the wave of the duality computer is split into several sub-waves which pass through different routes, where different computing gate operations are performed. These sub-waves are then recombined and interfere to give the computational results. The quantum computer, by contrast, uses only the particle nature of the quantum object. In a duality computer, it may be possible to find a marked item from an unsorted database using only a single query, and all NP-complete problems may have polynomial algorithms. Two proof-of-principle designs of the duality computer are presented: the giant molecule scheme and the nonlinear quantum optics scheme. We also propose a thought experiment to check a related fundamental issue, the measurement efficiency of a partial wave function.

  6. Design principles underlying circadian clocks.

    PubMed Central

    Rand, D. A.; Shulgin, B. V.; Salazar, D.; Millar, A. J.

    2004-01-01

    A fundamental problem for regulatory networks is to understand the relation between form and function: to uncover the underlying design principles of the network. Circadian clocks present a particularly interesting instance, as recent work has shown that they have complex structures involving multiple interconnected feedback loops with both positive and negative feedback. While several authors have speculated on the reasons for this, a convincing explanation is still lacking. We analyse both the flexibility of clock networks and the relationships between various desirable properties such as robust entrainment, temperature compensation, and stability to environmental variations and parameter fluctuations. We use this to argue that the complexity provides the flexibility necessary to simultaneously attain multiple key properties of circadian clocks. As part of our analysis we show how to quantify the key evolutionary aims using infinitesimal response curves, a tool that we believe will be of general utility in the analysis of regulatory networks. Our results suggest that regulatory and signalling networks might be much less flexible and of lower dimension than their apparent complexity would suggest. PMID:16849158

  7. The Species Delimitation Uncertainty Principle

    PubMed Central

    Adams, Byron J.

    2001-01-01

    If, as Einstein said, "it is the theory which decides what we can observe," then "the species problem" could be solved by simply improving our theoretical definition of what a species is. However, because delimiting species entails predicting the historical fate of evolutionary lineages, species appear to behave according to the Heisenberg Uncertainty Principle, which states that the most philosophically satisfying definitions of species are the least operational, and as species concepts are modified to become more operational they tend to lose their philosophical integrity. Can species be delimited operationally without losing their philosophical rigor? To mitigate the contingent properties of species that tend to make them difficult for us to delimit, I advocate a set of operations that takes into account the prospective nature of delimiting species. Given the fundamental role of species in studies of evolution and biodiversity, I also suggest that species delimitation proceed within the context of explicit hypothesis testing, like other scientific endeavors. The real challenge is not so much the inherent fallibility of predicting the future but rather adequately sampling and interpreting the evidence available to us in the present. PMID:19265874

  8. Principles of Naval Engineering.

    ERIC Educational Resources Information Center

    Naval Personnel Program Support Activity, Washington, DC.

    Fundamentals of shipboard machinery, equipment, and engineering plants are presented in this text prepared for engineering officers. A general description is included of the development of naval ships, ship design and construction, stability and buoyancy, and damage and casualty control. Engineering theories are explained on the background of ship…

  9. Precautionary principles: a jurisdiction-free framework for decision-making under risk.

    PubMed

    Ricci, Paolo F; Cox, Louis A; MacDonald, Thomas R

    2004-12-01

    Fundamental principles of precaution are legal maxims that ask for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: How can society manage potentially severe, irreversible or serious environmental outcomes when variability, uncertainty, and limited causal knowledge characterize their decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes, new information, and are consistent and replicable. Rational choice of an action from among various alternatives--defined as a choice that makes preferred consequences more likely--requires accounting for costs, benefits and the change in risks associated with each candidate action. Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of expected value of information (VOI), to show the
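
    As a toy illustration of the value-of-information logic invoked here (entirely our construction, not the paper's example), consider a two-action choice under an uncertain hazard:

        # Two actions ("ban", "allow") under an uncertain binary state "harmful".
        p_harm = 0.3                                   # prior probability the hazard is real
        loss = {("ban", True): 1.0,  ("ban", False): 4.0,
                ("allow", True): 10.0, ("allow", False): 0.0}

        def expected_loss(action, p):
            return p * loss[(action, True)] + (1 - p) * loss[(action, False)]

        # Best achievable expected loss acting on the prior alone:
        prior_loss = min(expected_loss(a, p_harm) for a in ("ban", "allow"))

        # With perfect information we could pick the best action in each state:
        perfect_loss = (p_harm * min(loss[(a, True)] for a in ("ban", "allow"))
                        + (1 - p_harm) * min(loss[(a, False)] for a in ("ban", "allow")))

        evpi = prior_loss - perfect_loss               # expected value of perfect information
        print(f"prior-optimal loss {prior_loss:.2f}, EVPI {evpi:.2f}")  # 3.00, 2.70

    Any candidate study, or an interim precautionary measure taken while awaiting one, is then worth at most its expected reduction of this gap.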

  10. The Idiom Principle Revisited

    ERIC Educational Resources Information Center

    Siyanova-Chanturia, Anna; Martinez, Ron

    2015-01-01

    John Sinclair's Idiom Principle famously posited that most texts are largely composed of multi-word expressions that "constitute single choices" in the mental lexicon. At the time that assertion was made, little actual psycholinguistic evidence existed in support of that holistic, "single choice," view of formulaic language. In…

  11. Reprographic Principles Made Easy.

    ERIC Educational Resources Information Center

    Young, J. B.

    Means for reproducing graphic materials are explained. There are several types of processes: those using light sensitive material, those using heat sensitive material, those using photo conductive materials (electrophotography), and duplicating processes using ink. For each of these, the principles behind them are explained, the necessary…

  12. PRINCIPLES OF WATER FILTRATION

    EPA Science Inventory

    This paper reviews principles involved in the processes commonly used to filter drinking water for public water systems. The most common approach is to chemically pretreat water and filter it through a deep (2-1/2 to 3 ft) bed of granular media (coal or sand or combinations of th...

  13. Extended Mach Principle.

    ERIC Educational Resources Information Center

    Rosen, Joe

    1981-01-01

    Discusses the meaning of symmetry of the laws of physics and symmetry of the universe and the connection between symmetries and asymmetries of the laws of physics and those of the universe. An explanation of Hamilton's principle is offered. The material is suitable for informal discussions with students. (Author/SK)

  14. Matters of Principle.

    ERIC Educational Resources Information Center

    Martz, Carlton

    1999-01-01

    This issue of "Bill of Rights in Action" looks at individuals who have stood on principle against authority or popular opinion. The first article investigates John Adams and his defense of British soldiers at the Boston Massacre trials. The second article explores Archbishop Thomas Becket's fatal conflict with England's King Henry II. The final…

  15. Principles of Biomedical Ethics

    PubMed Central

    Athar, Shahid

    2012-01-01

    In this presentation, I will discuss the principles of biomedical and Islamic medical ethics and an interfaith perspective on end-of-life issues. I will also discuss three cases to exemplify some of the conflicts in ethical decision-making. PMID:23610498

  16. Fermat's Principle Revisited.

    ERIC Educational Resources Information Center

    Kamat, R. V.

    1991-01-01

    A principle is presented to show that, if the time of passage of light is expressible as a function of discrete variables, one may dispense with the more general method of the calculus of variations. The calculus of variations and the alternative are described. The phenomenon of mirage is discussed. (Author/KR)
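
    The classic refraction problem makes the point concrete (a standard textbook example, not taken from the article): with the crossing point x on the interface as the single variable,

        t(x) = \frac{\sqrt{a^{2} + x^{2}}}{v_{1}} + \frac{\sqrt{b^{2} + (d - x)^{2}}}{v_{2}},
        \qquad
        \frac{dt}{dx} = 0
        \;\Longrightarrow\;
        \frac{\sin\theta_{1}}{v_{1}} = \frac{\sin\theta_{2}}{v_{2}},

    so Snell's law follows by ordinary differentiation, with no calculus of variations required.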

  17. Radiological images on personal computers: introduction and fundamental principles of digital images.

    PubMed

    Gillespy, T; Rowberg, A H

    1993-05-01

    This series of articles will explore the issues related to displaying, manipulating, and analyzing radiological images on personal computers (PC). This first article discusses the digital image data file, standard PC graphic file formats, and various methods for importing radiological images into the PC. PMID:8334176

  18. Optical Second Harmonic Generation in Plasmonic Nanostructures: From Fundamental Principles to Advanced Applications.

    PubMed

    Butet, Jérémy; Brevet, Pierre-François; Martin, Olivier J F

    2015-11-24

    Plasmonics has emerged as an important research field in nanoscience and nanotechnology. Recently, significant attention has been devoted to the observation and the understanding of nonlinear optical processes in plasmonic nanostructures, giving rise to the new research field called nonlinear plasmonics. This review provides a comprehensive insight into the physical mechanisms of one of these nonlinear optical processes, namely, second harmonic generation (SHG), with an emphasis on the main differences with the linear response of plasmonic nanostructures. The main applications, ranging from the nonlinear optical characterization of nanostructure shapes to the optimization of laser beams at the nanoscale, are summarized and discussed. Future directions and developments, made possible by the unique combination of SHG surface sensitivity and field enhancements associated with surface plasmon resonances, are also addressed. PMID:26474346

  19. 75 FR 71317 - Fundamental Principles and Policymaking Criteria for Partnerships With Faith-Based and Other...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22


  20. 3 CFR 13559 - Executive Order 13559 of November 17, 2010. Fundamental Principles and Policymaking Criteria for...

    Code of Federal Regulations, 2011 CFR

    2011-01-01


  1. Nanotechnology in hyperthermia cancer therapy: From fundamental principles to advanced applications.

    PubMed

    Beik, Jaber; Abed, Ziaeddin; Ghoreishi, Fatemeh S; Hosseini-Nami, Samira; Mehrzadi, Saeed; Shakeri-Zadeh, Ali; Kamrava, S Kamran

    2016-08-10

    In this work, we present an in-depth review of recent breakthroughs in nanotechnology for hyperthermia cancer therapy. Conventional hyperthermia methods do not thermally discriminate between the target and the surrounding normal tissues, and this non-selective tissue heating can lead to serious side effects. Nanotechnology is expected to have great potential to revolutionize current hyperthermia methods. To find an appropriate place in cancer treatment, all nanotechnology-based hyperthermia methods and their risks/benefits must be thoroughly understood. In this review paper, we extensively examine and compare four modern nanotechnology-based hyperthermia methods. For each method, the possible physical mechanisms of heat generation and enhancement due to the presence of nanoparticles are explained, and recent in vitro and in vivo studies are reviewed and discussed. Nano-Photo-Thermal Therapy (NPTT) and Nano-Magnetic Hyperthermia (NMH) are reviewed as the two first exciting approaches for targeted hyperthermia. The third novel hyperthermia method, Nano-Radio-Frequency Ablation (NaRFA) is discussed together with the thermal effects of novel nanoparticles in the presence of radiofrequency waves. Finally, Nano-Ultrasound Hyperthermia (NUH) is described as the fourth modern method for cancer hyperthermia. PMID:27264551

  2. A New Big Five: Fundamental Principles for an Integrative Science of Personality

    ERIC Educational Resources Information Center

    McAdams, Dan P.; Pals, Jennifer L.

    2006-01-01

    Despite impressive advances in recent years with respect to theory and research, personality psychology has yet to articulate clearly a comprehensive framework for understanding the whole person. In an effort to achieve that aim, the current article draws on the most promising empirical and theoretical trends in personality psychology today to…

  3. Integrating Fundamental Principles Underlying Somatic Practices into the Dance Technique Class

    ERIC Educational Resources Information Center

    Brodie, Julie; Lobel, Elin

    2004-01-01

    Integrating somatic practices into the dance technique class by bringing awareness to the bodily processes of breathing, sensing, connecting, and initiating can help students reconnect the mind with the body within the context of the classroom environment. Dance educators do not always have the resources to implement separate somatics courses…

  4. Enhancing Student Learning in Marketing Courses: An Exploration of Fundamental Principles for Website Platforms

    ERIC Educational Resources Information Center

    Hollenbeck, Candice R.; Mason, Charlotte H.; Song, Ji Hee

    2011-01-01

    The design of a course has potential to help marketing students achieve their learning objectives. Marketing courses are increasingly turning to technology to facilitate teaching and learning, and pedagogical tools such as Blackboard, WebCT, and e-Learning Commons are essential to the design of a course. Here, the authors investigate the research…

  5. On a Possibly Fundamental Principle in Chemistry as Viewed in a Cosmogonic Context

    NASA Astrophysics Data System (ADS)

    Hoyle, F.; Wickramasinghe, N. C.

    1999-10-01

    Subject to the condition that atoms be conserved - i.e. without nuclear transmutations - we conjecture that it is impossible to synthesise organic materials in appreciable quantity from inorganic materials without the intervention of biological systems. The restriction is not a consequence of a mystic quality in the laws of physics and chemistry but of the practical disposition of materials on the Earth and in the cosmos generally. It is a corollary that biology is the means by which an approximation to thermodynamic equilibrium is maintained in materials at temperatures ~ 300 K.

  6. Two Essays on Learning Disabilities in the Application of Fundamental Financial Principles

    ERIC Educational Resources Information Center

    Auciello, Daria Joy

    2010-01-01

    This dissertation consists of two essays which examine the relationship between dyslexia and the application and acquisition of financial knowledge. Recent behavioral research has documented that factors such as representativeness, overconfidence, loss aversion, naivete, wealth, age and gender all impact a person's risk perception and asset…

“From Fundamental Motives to Rational Expectation Equilibrium [REE, henceforth] of Indeterminacy”

    NASA Astrophysics Data System (ADS)

    Maksoed, Ssi, Wh-

    The “Principle of Indeterminacy” from Heisenberg states that “one of the fundamental cornerstones of quantum mechanics is the Heisenberg uncertainty principle”, whereby canonically conjugate quantities can be determined simultaneously only with a characteristic indeterminacy [M. Arevalo Aguilar, et al.]. Accompanying Alfred North Whitehead's conclusion in “The Aims of Education” that mathematical symbols are artificial before new meanings are given to them, two kinds of fundamental motives, (i) expectation-expectation and (ii) expectation-certainty, inherently occur with the determinacy properties of rational expectation equilibrium (REE, henceforth) - Guido Ascari & Tiziano Ropele: “Trend inflation, Taylor principle & Indeterminacy”, Kiel Institute, June 2007. Further, the relative-price expression can be compared across its α and (1 - α) configurations in the expression of possible activity. Acknowledgment to Assoc. Prof. Dr. Bobby Eka Gunara for “made a rank through physics” denotes...

  8. Dilaton cosmology, noncommutativity, and generalized uncertainty principle

    SciTech Connect

    Vakili, Babak

    2008-02-15

    The effects of noncommutativity and of the existence of a minimal length on the phase space of a dilatonic cosmological model are investigated. The existence of a minimum length results in the generalized uncertainty principle (GUP), which is a deformed Heisenberg algebra between the minisuperspace variables and their momentum operators. I extend these deformed commutation relations to the corresponding deformed Poisson algebra. For an exponential dilaton potential, the exact classical and quantum solutions in the commutative and noncommutative cases, and some approximate analytical solutions in the case of GUP, are presented and compared.
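
    The deformed Poisson algebra referred to is the classical limit of the GUP commutator, obtained by the usual replacement [·,·]/(iℏ) → {·,·} (standard form; sign and factor conventions vary between papers):

        \{X, P\} = 1 + \beta P^{2},
        \qquad
        \{X, X\} = \{P, P\} = 0.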

  9. Magnetic driving principle of a swimming microrobot

    NASA Astrophysics Data System (ADS)

    Chen, Yong; Mei, Tao; Kong, De-Yi; Xiong, Xiao-Yi; Li, Ke

    2001-09-01

    A swimming microrobot driven by a magnetic field is presented. A new smart material, ferromagnetic polymer (FMP), was utilized as the actuation material. The microrobot has a pair of FMP fins, which are soft and driven symmetrically by the magnetic field. The principle of actuation is given. The size of the robot is 20 mm by 14 mm by 5 mm. The robot can move forward and backward depending on the magnetic flux density and the frequency. The robot has many possible applications, such as minimally invasive medical techniques.

  10. Cannulation Strategies and Pitfalls in Minimally Invasive Cardiac Surgery

    PubMed Central

    Ramchandani, Mahesh; Al Jabbari, Odeaa; Abu Saleh, Walid K.; Ramlawi, Basel

    2016-01-01

    For any given cardiac surgery, there are two invasive components: the surgical approach and the cardiopulmonary bypass circuit. The standard approach for cardiac surgery is the median sternotomy, which offers unrestricted access to the thoracic organs—the heart, lungs, and major vessels. However, it carries a long list of potential complications such as wound infection, brachial plexus palsies, respiratory dysfunction, and an unpleasant-looking scar. The cardiopulmonary bypass component also carries potential complications such as end-organ dysfunction, coagulopathy, hemodilution, bleeding, and blood transfusion requirements. Furthermore, the aortic manipulation during cannulation and cross-clamping increases the risk of dissection, arterial embolization, and stroke. Minimally invasive cardiac surgery is a milestone in the history of cardiothoracic medicine and has become a widely adopted approach, as it minimizes many of the unwelcome side effects associated with the median sternotomy and bypass circuit placement. This type of surgery requires the use of novel perfusion strategies, especially in patients who hold the highest potential for postoperative morbidity. Cannulation techniques are a fundamental element in minimally invasive cardiac surgery, and there are numerous cannulation procedures for each type of minimally invasive operation. In this review, we highlight the strategies and pitfalls associated with minimally invasive cannulation. PMID:27127556

  11. Universal Principles in the Repair of Communication Problems

    PubMed Central

    Dingemanse, Mark; Roberts, Seán G.; Baranova, Julija; Blythe, Joe; Drew, Paul; Floyd, Simeon; Gisladottir, Rosa S.; Kendrick, Kobin H.; Levinson, Stephen C.; Manrique, Elizabeth; Rossi, Giovanni; Enfield, N. J.

    2015-01-01

    There would be little adaptive value in a complex communication system like human language if there were no ways to detect and correct problems. A systematic comparison of conversation in a broad sample of the world’s languages reveals a universal system for the real-time resolution of frequent breakdowns in communication. In a sample of 12 languages of 8 language families of varied typological profiles we find a system of ‘other-initiated repair’, where the recipient of an unclear message can signal trouble and the sender can repair the original message. We find that this system is frequently used (on average about once per 1.4 minutes in any language), and that it has detailed common properties, contrary to assumptions of radical cultural variation. Unrelated languages share the same three functionally distinct types of repair initiator for signalling problems and use them in the same kinds of contexts. People prefer to choose the type that is the most specific possible, a principle that minimizes cost both for the sender being asked to fix the problem and for the dyad as a social unit. Disruption to the conversation is kept to a minimum, with the two-utterance repair sequence being on average no longer than the single utterance which is being fixed. The findings, controlled for historical relationships, situation types and other dependencies, reveal the fundamentally cooperative nature of human communication and offer support for the pragmatic universals hypothesis: while languages may vary in the organization of grammar and meaning, key systems of language use may be largely similar across cultural groups. They also provide a fresh perspective on controversies about the core properties of language, by revealing a common infrastructure for social interaction which may be the universal bedrock upon which linguistic diversity rests. PMID:26375483

  12. Minimally Informative Prior Distributions for PSA

    SciTech Connect

    Dana L. Kelly; Robert W. Youngblood; Kurt G. Vedros

    2010-06-01

    A salient feature of Bayesian inference is its ability to incorporate information from a variety of sources into the inference model, via the prior distribution (hereafter simply “the prior”). However, over-reliance on old information can lead to priors that dominate new data. Some analysts seek to avoid this by trying to work with a minimally informative prior distribution. Another reason for choosing a minimally informative prior is to avoid the often-voiced criticism of subjectivity in the choice of prior. Minimally informative priors fall into two broad classes: 1) so-called noninformative priors, which attempt to be completely objective, in that the posterior distribution is determined as completely as possible by the observed data, the most well known example in this class being the Jeffreys prior, and 2) priors that are diffuse over the region where the likelihood function is nonnegligible, but that incorporate some information about the parameters being estimated, such as a mean value. In this paper, we compare four approaches in the second class, with respect to their practical implications for Bayesian inference in Probabilistic Safety Assessment (PSA). The most commonly used such prior, the so-called constrained noninformative prior, is a special case of the maximum entropy prior. This is formulated as a conjugate distribution for the most commonly encountered aleatory models in PSA, and is correspondingly mathematically convenient; however, it has a relatively light tail and this can cause the posterior mean to be overly influenced by the prior in updates with sparse data. A more informative prior that is capable, in principle, of dealing more effectively with sparse data is a mixture of conjugate priors. A particular diffuse nonconjugate prior, the logistic-normal, is shown to behave similarly for some purposes. Finally, we review the so-called robust prior. Rather than relying on the mathematical abstraction of entropy, as does the constrained
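
    A minimal sketch may make the comparison concrete: the gamma distribution is conjugate to the Poisson likelihood, the most commonly encountered aleatory model in PSA, so updating reduces to adding event counts and exposure to the prior parameters. The numbers below are hypothetical illustrations, not values from the paper:

        # Gamma-Poisson conjugate updating of an event rate (sketch).
        # Prior shape/mean and the data are hypothetical.
        from scipy import stats

        def update(alpha0, beta0, events, exposure_hours):
            """Gamma(alpha0, beta0) prior + Poisson data -> gamma posterior."""
            return alpha0 + events, beta0 + exposure_hours

        # A light-tailed diffuse prior: shape 0.5 (Jeffreys-like), rate set
        # so the prior mean equals an assumed 1e-3 events per hour.
        alpha0 = 0.5
        beta0 = alpha0 / 1e-3

        # Sparse data: zero events in 2000 hours of operation.
        a, b = update(alpha0, beta0, events=0, exposure_hours=2000.0)
        print("posterior mean:", a / b)
        print("90% interval:", stats.gamma.ppf([0.05, 0.95], a, scale=1.0 / b))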

  13. Individual differences in fundamental social motives.

    PubMed

    Neel, Rebecca; Kenrick, Douglas T; White, Andrew Edward; Neuberg, Steven L

    2016-06-01

    Motivation has long been recognized as an important component of how people both differ from, and are similar to, each other. The current research applies the biologically grounded fundamental social motives framework, which assumes that human motivational systems are functionally shaped to manage the major costs and benefits of social life, to understand individual differences in social motives. Using the Fundamental Social Motives Inventory, we explore the relations among the different fundamental social motives of Self-Protection, Disease Avoidance, Affiliation, Status, Mate Seeking, Mate Retention, and Kin Care; the relationships of the fundamental social motives to other individual difference and personality measures including the Big Five personality traits; the extent to which fundamental social motives are linked to recent life experiences; and the extent to which life history variables (e.g., age, sex, childhood environment) predict individual differences in the fundamental social motives. Results suggest that the fundamental social motives are a powerful lens through which to examine individual differences: They are grounded in theory, have explanatory value beyond that of the Big Five personality traits, and vary meaningfully with a number of life history variables. A fundamental social motives approach provides a generative framework for considering the meaning and implications of individual differences in social motivation. (PsycINFO Database Record) PMID:26371400

  14. Investigating the Fundamental Theorem of Calculus

    ERIC Educational Resources Information Center

    Johnson, Heather L.

    2010-01-01

    The fundamental theorem of calculus, in its simplified complexity, connects differential and integral calculus. The power of the theorem comes not merely from recognizing it as a mathematical fact but from using it as a systematic tool. As a high school calculus teacher, the author developed and taught lessons on this fundamental theorem that were…
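
    For reference, the theorem itself (a standard statement, not quoted from the article): for f continuous on [a, b] and F any antiderivative of f,

        \frac{d}{dx}\int_a^x f(t)\,dt \;=\; f(x),
        \qquad
        \int_a^b f(x)\,dx \;=\; F(b) - F(a).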

  15. Minimal Doubling and Point Splitting

    SciTech Connect

    Creutz, M.

    2010-06-14

    Minimally-doubled chiral fermions have the unusual property of a single local field creating two fermionic species. Spreading the field over hypercubes allows construction of combinations that isolate specific modes. Combining these fields into bilinears produces meson fields of specific quantum numbers. Minimally-doubled fermion actions present the possibility of fast simulations while maintaining one exact chiral symmetry. They do, however, introduce some peculiar aspects. An explicit breaking of hyper-cubic symmetry allows additional counter-terms to appear in the renormalization. While a single field creates two different species, spreading this field over nearby sites allows isolation of specific states and the construction of physical meson operators. Finally, lattice artifacts break isospin and give two of the three pseudoscalar mesons an additional contribution to their mass. Depending on the sign of this mass splitting, one can either have a traditional Goldstone pseudoscalar meson or a parity breaking Aoki-like phase.

  16. Anaesthesia for minimally invasive surgery

    PubMed Central

    Dec, Marta

    2015-01-01

    Minimally invasive surgery (MIS) is rising in popularity. It offers well-known benefits to the patient. However, restricted access to the surgical site and gas insufflation into the body cavities may result in severe complications. From the anaesthetic point of view, MIS poses unique challenges associated with the creation of pneumoperitoneum, carbon dioxide absorption, specific positioning, and monitoring of a patient to whom the anaesthetist often has restricted access, in a poorly lit environment. Moreover, with refinement of surgical procedures and growing experience, the anaesthetist is presented with patients from high-risk groups (obese, elderly, with advanced cardiac and respiratory disease) who were once deemed unsuitable for the laparoscopic technique. Anaesthetic management is aimed at getting the patient safely through the procedure, minimizing the specific risks arising from laparoscopy and the patient's coexisting medical problems, ensuring quick recovery and a relatively pain-free postoperative course with early return to normal function. PMID:26865885

  17. Minimal universal quantum heat machine.

    PubMed

    Gelbwaser-Klimovsky, D; Alicki, R; Kurizki, G

    2013-01-01

    In traditional thermodynamics the Carnot cycle yields the ideal performance bound of heat engines and refrigerators. We propose and analyze a minimal model of a heat machine that can play a similar role in quantum regimes. The minimal model consists of a single two-level system with periodically modulated energy splitting that is permanently, weakly, coupled to two spectrally separated heat baths at different temperatures. The equation of motion allows us to compute the stationary power and heat currents in the machine consistent with the second law of thermodynamics. This dual-purpose machine can act as either an engine or a refrigerator (heat pump) depending on the modulation rate. In both modes of operation, the maximal Carnot efficiency is reached at zero power. We study the conditions for finite-time optimal performance for several variants of the model. Possible realizations of the model are discussed. PMID:23410316
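
    For orientation, the Carnot bounds that both modes of operation reach only at zero power are the textbook ones, with T_c and T_h the cold and hot bath temperatures:

        \eta \;\le\; 1 - \frac{T_c}{T_h} \quad \text{(engine)},
        \qquad
        \mathrm{COP} \;\le\; \frac{T_c}{T_h - T_c} \quad \text{(refrigerator)}.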

  18. Minimizing liability during internal investigations.

    PubMed

    Morris, Cole

    2010-01-01

    Today's security professional must appreciate the potential landmines in any investigative effort and work collaboratively with others to minimize liability risks, the author points out. In this article he examines six civil torts that commonly arise from unprofessionally planned or poorly executed internal investigations: defamation, false imprisonment, intentional infliction of emotional distress, assault and battery, invasion of privacy, and malicious prosecution and abuse of process. PMID:20873494

  19. Gauge theories under incorporation of a generalized uncertainty principle

    SciTech Connect

    Kober, Martin

    2010-10-15

    We consider an extension of gauge theories based on the assumption of a generalized uncertainty principle which implies a minimal length scale. A modification of the usual uncertainty principle implies an extended shape of matter field equations like the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, the gauge principle is a constitutive ingredient of the standard model, and even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be a very important consideration.
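
    For orientation, a commonly used deformation of this kind (one standard choice; the paper's exact form may differ) and the minimal length it implies are:

        [\hat{x}, \hat{p}] \;=\; i\hbar \left(1 + \beta \hat{p}^{2}\right)
        \quad \Longrightarrow \quad
        \Delta x \;\ge\; \frac{\hbar}{2} \left(\frac{1}{\Delta p} + \beta\, \Delta p\right)
        \quad \Longrightarrow \quad
        \Delta x_{\min} \;=\; \hbar \sqrt{\beta}.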

  20. Minimizing radiation exposure during percutaneous nephrolithotomy.

    PubMed

    Chen, T T; Preminger, G M; Lipkin, M E

    2015-12-01

    Given the recent trends of growing per capita radiation dose from medical sources, there have been increasing concerns over patient radiation exposure. Patients with kidney stones undergoing percutaneous nephrolithotomy (PNL) are at particular risk for high radiation exposure. Several risk factors for increased radiation exposure during PNL include high body mass index, multiple access tracts, and increased stone burden. We herein review recent trends in radiation exposure, radiation exposure during PNL to both patients and urologists, and various approaches to reduce radiation exposure. We discuss incorporating the principles of As Low As Reasonably Achievable (ALARA) into clinical practice and review imaging techniques such as ultrasound and air contrast to guide PNL access. Alternative surgical techniques and approaches to reducing radiation exposure, including retrograde intra-renal surgery, retrograde nephrostomy, endoscopic-guided PNL, and minimally invasive PNL, are also highlighted. It is important for urologists to be aware of these concepts and techniques when treating stone patients with PNL. The discussions outlined will assist urologists in providing patient counseling and a high quality of care. PMID:26354615

  1. Principles of Natural Photosynthesis.

    PubMed

    Krewald, Vera; Retegan, Marius; Pantazis, Dimitrios A

    2016-01-01

    Nature relies on a unique and intricate biochemical setup to achieve sunlight-driven water splitting. Combined experimental and computational efforts have produced significant insights into the structural and functional principles governing the operation of the water-oxidizing enzyme Photosystem II in general, and of the oxygen-evolving manganese-calcium cluster at its active site in particular. Here we review the most important aspects of biological water oxidation, emphasizing current knowledge on the organization of the enzyme, the geometric and electronic structure of the catalyst, and the role of calcium and chloride cofactors. The combination of recent experimental work on the identification of possible substrate sites with computational modeling has considerably limited the possible mechanistic pathways for the critical O-O bond formation step. Taken together, the key features and principles of natural photosynthesis may serve as inspiration for the design, development, and implementation of artificial systems. PMID:26099285

  2. AUTOMOTIVE DIESEL MAINTENANCE 2. UNIT XVI, LEARNING ABOUT AC GENERATOR (ALTERNATOR) PRINCIPLES (PART I).

    ERIC Educational Resources Information Center

    Human Engineering Inst., Cleveland, OH.

    THIS MODULE OF A 25-MODULE COURSE IS DESIGNED TO DEVELOP AN UNDERSTANDING OF THE OPERATING PRINCIPLES OF ALTERNATING CURRENT GENERATORS USED ON DIESEL POWERED EQUIPMENT. TOPICS ARE REVIEWING ELECTRICAL FUNDAMENTALS, AND OPERATING PRINCIPLES OF ALTERNATORS. THE MODULE CONSISTS OF A SELF-INSTRUCTIONAL PROGRAMED TRAINING FILM "AC GENERATORS…

  3. Precautionary Principles: General Definitions and Specific Applications to Genetically Modified Organisms

    ERIC Educational Resources Information Center

    Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.

    2002-01-01

    Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…

  4. Personality Theories Facilitate Integrating the Five Principles and Deducing Hypotheses for Testing

    ERIC Educational Resources Information Center

    Maddi, Salvatore R.

    2007-01-01

    Comments on the original article "A New Big Five: Fundamental Principles for an Integrative Science of Personality," by Dan P. McAdams and Jennifer L. Pals (see record 2006-03947-002). In presenting their view of personality science, McAdams and Pals (April 2006) elaborated the importance of five principles for building an integrated science of…

  5. Pauli Exclusion Principle

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    A principle of quantum theory, devised in 1925 by Wolfgang Pauli (1900-58), which states that no two fermions may exist in the same quantum state. The quantum state of a particle is defined by a set of numbers that describe quantities such as energy, angular momentum and spin. Fermions are particles such as quarks, protons, neutrons and electrons, that have spin = ½ (in units of h/2π, where h is ...

  6. Computational principles of memory.

    PubMed

    Chaudhuri, Rishidev; Fiete, Ila

    2016-03-01

    The ability to store and later use information is essential for a variety of adaptive behaviors, including integration, learning, generalization, prediction and inference. In this Review, we survey theoretical principles that can allow the brain to construct persistent states for memory. We identify requirements that a memory system must satisfy and analyze existing models and hypothesized biological substrates in light of these requirements. We also highlight open questions, theoretical puzzles and problems shared with computer science and information theory. PMID:26906506

  7. Principles of nuclear geology

    SciTech Connect

    Aswathanarayana, U.

    1985-01-01

    This book treats the basic principles of nuclear physics and the mineralogy, geochemistry, distribution and ore deposits of uranium and thorium. The application of nuclear methodology in radiogenic heat and thermal regime of the earth, radiometric prospecting, isotopic age dating, stable isotopes and cosmic-ray produced isotopes is covered. Geological processes, such as metamorphic chronology, petrogenesis, groundwater movement, and sedimentation rate are focussed on.

  8. Heisenberg's observability principle

    NASA Astrophysics Data System (ADS)

    Wolff, Johanna

    2014-02-01

    Werner Heisenberg's 1925 paper 'Quantum-theoretical re-interpretation of kinematic and mechanical relations' marks the beginning of quantum mechanics. Heisenberg famously claims that the paper is based on the idea that the new quantum mechanics should be 'founded exclusively upon relationships between quantities which in principle are observable'. My paper is an attempt to understand this observability principle, and to see whether its employment is philosophically defensible. Against interpretations of 'observability' along empiricist or positivist lines I argue that such readings are philosophically unsatisfying. Moreover, a careful comparison of Heisenberg's reinterpretation of classical kinematics with Einstein's argument against absolute simultaneity reveals that the positivist reading does not fit with Heisenberg's strategy in the paper. Instead the appeal to observability should be understood as a specific criticism of the causal inefficacy of orbital electron motion in Bohr's atomic model. I conclude that the tacit philosophical principle behind Heisenberg's argument is not a positivistic connection between observability and meaning, but the idea that a theory should not contain causally idle wheels.

  9. Grice's cooperative principle in the psychoanalytic setting.

    PubMed

    Ephratt, Michal

    2014-12-01

    Grice's "cooperative principle," including conversational implicatures and maxims, is commonplace in current pragmatics (a subfield of linguistics), and is generally applied in conversational analysis. The author examines the unique contribution of Grice's principle in considering the psychotherapeutic setting and its discourse. Such an investigation is called for chiefly because of the central role of speech in psychoanalytic practice (the "talking cure"). Symptoms and transference, which are characterized as forms of expression that are fundamentally deceptive, must be equivocal and indirect, and must breach all four of Grice's categories and maxims: truth (Quality), relation (Relevance), Manner (be clear), and Quantity. Therapeutic practice, according to Freud's "fundamental rule of psychoanalysis," encourages the parties (analysand and analyst) to breach each and every one of Grice's maxims. Using case reports drawn from the literature, the author shows that these breachings are essential for therapeutic progress. They serve as a unique and important ground for revealing inner (psychic) contents, and demarcating real self from illusive self, which in turn constitutes leverage for integrating these contents with the self. PMID:25490077

  10. Microrover Operates With Minimal Computation

    NASA Technical Reports Server (NTRS)

    Miller, David P.; Loch, John L.; Gat, Erann; Desai, Rajiv S.; Angle, Colin; Bickler, Donald B.

    1992-01-01

    Small, light, highly mobile robotic vehicles called "microrovers" use sensors and artificial intelligence to perform complicated tasks autonomously. Vehicle navigates, avoids obstacles, and picks up objects using reactive control scheme selected from among few preprogrammed behaviors to respond to environment while executing assigned task. Under development for exploration and mining of other planets. Also useful in firefighting, cleaning up chemical spills, and delivering materials in factories. Reactive control scheme and principle of behavior-description language useful in reducing computational loads in prosthetic limbs and automotive collision-avoidance systems.

  11. Rotor-Liquid-Fundament System's Oscillation

    NASA Astrophysics Data System (ADS)

    Kydyrbekuly, A.

    The work investigates the oscillations and the stability of steady rotation of a vertical, flexible, statically and dynamically unbalanced rotor with a cavity partly filled with liquid, mounted on an elastically supported fundament. Accounting for factors such as oscillation of the fundament, oscillation of the liquid, asymmetry of the rotor's mounting on the shaft, anisotropy of the shaft supports and fundament, static and dynamic unbalance of the rotor, external friction, and internal friction of the shaft allows the kinematic and dynamic characteristics of the system to be calculated more precisely.

  12. Modeling of fundamental phenomena in welds

    SciTech Connect

    Zacharia, T.; Vitek, J.M.; Goldak, J.A.; DebRoy, T.A.; Rappaz, M.; Bhadeshia, H.K.D.H.

    1993-12-31

    Recent advances in the mathematical modeling of fundamental phenomena in welds are summarized. State-of-the-art mathematical models, advances in computational techniques, emerging high-performance computers, and experimental validation techniques have provided significant insight into the fundamental factors that control the development of the weldment. The current status and scientific issues in the areas of heat and fluid flow in welds, heat source metal interaction, solidification microstructure, and phase transformations are assessed. Future research areas of major importance for understanding the fundamental phenomena in weld behavior are identified.

  13. Fundamentals of preparative and nonlinear chromatography

    SciTech Connect

    Guiochon, Georges A; Felinger, Attila; Katti, Anita; Shirazi, Dean G

    2006-02-01

    The second edition of Fundamentals of Preparative and Nonlinear Chromatography is devoted to the fundamentals of a process of purification or extraction of chemicals or proteins widely used in the pharmaceutical industry and in preparative chromatography. This process permits the preparation of extremely pure compounds satisfying the requirements of the US Food and Drug Administration. The book describes the fundamentals of thermodynamics, mass transfer kinetics, and flow through porous media that are relevant to chromatography. It presents the models used in chromatography and their solutions, describes the different processes used and their numerous applications, and discusses methods for optimizing the experimental conditions of the process.

  14. Risk minimization through portfolio replication

    NASA Astrophysics Data System (ADS)

    Ciliberti, S.; Mézard, M.

    2007-05-01

    We use a replica approach to deal with portfolio optimization problems. A given risk measure is minimized using empirical estimates of asset value correlations. We study the phase transition which happens when the time series is too short with respect to the size of the portfolio. We also study the noise sensitivity of portfolio allocation when this transition is approached. We consider explicitly the cases where the absolute deviation and the conditional value-at-risk are chosen as a risk measure. We show how the replica method can study a wide range of risk measures, and deal with various types of time series correlations, including realistic ones with volatility clustering.
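
    As a toy illustration of risk minimization from empirical estimates (the plain minimum-variance case, not the paper's replica calculation; absolute deviation and CVaR would require a linear program instead):

        # Global minimum-variance portfolio from an empirical covariance
        # estimate: w proportional to inv(C) @ 1, normalized to sum to one.
        # Returns are synthetic and hypothetical.
        import numpy as np

        rng = np.random.default_rng(0)
        T, N = 250, 10                     # observations vs. portfolio size
        returns = rng.normal(0.0, 0.01, size=(T, N))

        C = np.cov(returns, rowvar=False)  # empirical covariance estimate
        w = np.linalg.solve(C, np.ones(N))
        w /= w.sum()                       # fully invested portfolio
        print(w.round(3), "in-sample variance:", float(w @ C @ w))

    As T shrinks toward N the empirical covariance becomes ill-conditioned and the weights blow up, a toy version of the estimation instability behind the phase transition studied in the paper.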

  15. Diagnosis of minimal hepatic encephalopathy.

    PubMed

    Weissenborn, Karin

    2015-03-01

    Minimal hepatic encephalopathy (mHE) has significant impact upon a liver patient's daily living and health-related quality of life. Therefore a majority of clinicians agree that mHE should be diagnosed and treated. The optimal means for diagnosing mHE, however, is controversial. This paper describes the most frequently used methods (EEG, critical flicker frequency, the Continuous Reaction Time test, the Inhibitory Control Test, computerized test batteries such as the Cognitive Drug Research test battery, the psychometric hepatic encephalopathy score (PHES), and the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS)) and their pros and cons. PMID:26041959

  16. Minimizing medical litigation, part 2.

    PubMed

    Harold, Tan Keng Boon

    2006-01-01

    Provider-patient disputes are inevitable in the healthcare sector. Healthcare providers and regulators should recognize this and plan opportunities to introduce alternative dispute resolution (ADR) as early as possible in the care delivery process. Negotiation is often the main dispute resolution method used by local healthcare providers, failing which litigation would usually follow. The role of mediation in resolving malpractice disputes has been minimal. Healthcare providers, administrators, and regulators should therefore look toward a post-event communication-cum-mediation framework as the key national strategy for resolving malpractice disputes. PMID:16711089

  17. The minimal scenario of leptogenesis

    NASA Astrophysics Data System (ADS)

    Blanchet, Steve; Di Bari, Pasquale

    2012-12-01

    We review the main features and results of thermal leptogenesis within the type I seesaw mechanism, the minimal extension of the Standard Model explaining neutrino masses and mixing. After presenting the simplest approach, the vanilla scenario, we discuss various important developments of recent years, such as the inclusion of lepton and heavy neutrino flavour effects, a description beyond a hierarchical heavy neutrino mass spectrum and an improved kinetic description within the density matrix and the closed-time-path formalisms. We also discuss how leptogenesis can ultimately represent an important phenomenological tool to test the seesaw mechanism and the underlying model of new physics.

  18. Principle of Spacetime and Black Hole Equivalence

    NASA Astrophysics Data System (ADS)

    Zhang, Tianxi

    2016-06-01

    Modelling the universe without relying on a set of hypothetical entities (HEs) to explain observations and overcome problems and difficulties is essential to developing a physical cosmology. The well-known big bang cosmology, widely accepted as the standard model, stands on two fundamentals: Einstein’s general relativity (GR), which describes the effect of matter on spacetime, and the cosmological principle (CP) of spacetime isotropy and homogeneity. The field equation of GR, along with the Friedmann-Lemaitre-Robertson-Walker (FLRW) metric of spacetime derived from the CP, generates the Friedmann equation (FE) that governs the development and dynamics of the universe. The big bang theory has made impressive successes in explaining the universe, but it still has problems, and their solutions rely on an increasing number of HEs such as inflation, dark matter, dark energy, and so on. Recently, the author has developed a new cosmological model called the black hole universe which, instead of making many such hypotheses, adds only a single new postulate (or principle) to cosmology, the Principle of Spacetime and Black Hole Equivalence (SBHEP), to explain all the existing observations of the universe and overcome all the existing problems in conventional cosmologies. This study thoroughly demonstrates how this newly developed black hole universe model, which therefore stands on three fundamentals (GR, CP, and SBHEP), can fully explain the universe and easily overcome the existing difficulties in accordance with well-developed physics, thus neither needing any other hypotheses nor leaving any unsolved difficulties. This work was supported by NSF/REU (Grant #: PHY-1263253) at Alabama A & M University.
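
    For reference, the Friedmann equation that results from GR together with the FLRW metric is the standard

        \left(\frac{\dot{a}}{a}\right)^{2} \;=\; \frac{8\pi G}{3}\,\rho \;-\; \frac{k c^{2}}{a^{2}} \;+\; \frac{\Lambda c^{2}}{3},

    where a is the scale factor, ρ the energy density, k the spatial curvature, and Λ the cosmological constant.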

  19. On the Minimal Length Uncertainty Relation and the Foundations of String Theory

    DOE PAGESBeta

    Chang, Lay Nam; Lewis, Zachary; Minic, Djordje; Takeuchi, Tatsu

    2011-01-01

    We review our work on the minimal length uncertainty relation as suggested by perturbative string theory. We discuss simple phenomenological implications of the minimal length uncertainty relation and then argue that the combination of the principles of quantum theory and general relativity allow for a dynamical energy-momentum space. We discuss the implication of this for the problem of vacuum energy and the foundations of nonperturbative string theory.
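
    For orientation, the stringy minimal-length relation under review is usually quoted in the schematic form below (conventions for the Regge slope α' vary):

        \Delta x \;\gtrsim\; \frac{\hbar}{\Delta p} \;+\; \alpha' \,\frac{\Delta p}{\hbar}
        \quad \Longrightarrow \quad
        \Delta x_{\min} \;\sim\; 2\sqrt{\alpha'}.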

  20. Fundamental approaches in molecular biology for communication sciences and disorders

    PubMed Central

    Bartlett, Rebecca; Jetté, Marie E; King, Suzanne N.; Schaser, Allison; Thibeault, Susan L.

    2012-01-01

    Purpose This contemporary tutorial will introduce general principles of molecular biology, common DNA, RNA and protein assays, and their relevance in the field of communication sciences and disorders (CSD). Methods Over the past two decades, knowledge of the molecular pathophysiology of human disease has increased at a remarkable pace. Most of this progress can be attributed to concomitant advances in basic molecular biology and, specifically, the development of an ever-expanding armamentarium of technologies for analysis of DNA, RNA and protein structure and function. Details of these methodologies, their limitations and examples from the CSD literature are presented. Results/Conclusions The use of molecular biology techniques in the fields of speech, language and hearing sciences is increasing, creating a need for an understanding of molecular biology fundamentals and common experimental assays. PMID:22232415

  1. Instructor Special Report: RIF (Reading Is FUNdamental)

    ERIC Educational Resources Information Center

    Instructor, 1976

    1976-01-01

    At a time when innovative programs of the sixties are quickly falling out of the picture, Reading Is FUNdamental, after ten years and five million free paperbacks, continues to expand and show results. (Editor)

  2. A New Principle of Sound Frequency Analysis

    NASA Technical Reports Server (NTRS)

    Theodorsen, Theodore

    1932-01-01

    In connection with the study of aircraft and propeller noises, the National Advisory Committee for Aeronautics has developed an instrument for sound-frequency analysis which differs fundamentally from previous types, and which, owing to its simplicity of principle, construction, and operation, has proved to be of value in this investigation. The method is based on the well-known fact that the Ohmic loss in an electrical resistance is equal to the sum of the losses of the harmonic components of a complex wave, except for the case in which any two components approach or attain vectorial identity, in which case the Ohmic loss is increased by a definite amount. The principle of frequency analysis has been presented mathematically and a number of distinct advantages relative to previous methods have been pointed out. An automatic recording instrument embodying this principle is described in detail. It employs a beat-frequency oscillator as a source of variable frequency. A large number of experiments have verified the predicted superiority of the method. A number of representative records are presented.
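
    In modern notation, the orthogonality fact the instrument exploits is that cross terms between distinct harmonics of a complex wave average to zero in the dissipated power:

        I(t) \;=\; \sum_n a_n \cos(n\omega t + \varphi_n)
        \quad \Longrightarrow \quad
        \langle P \rangle \;=\; \frac{R}{2} \sum_n a_n^{2},

    with an additional contribution appearing only when an injected search tone coincides in frequency and phase with one component, the "vectorial identity" that the analyzer detects.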

  3. Principles of tendon transfers.

    PubMed

    Coulet, B

    2016-04-01

    Tendon transfers are carried out to restore functional deficits by rerouting the remaining intact muscles. Transfers are highly attractive in the context of hand surgery because of the possibility of restoring the patient's ability to grip. In palsy cases, tendon transfers are only used when a neurological procedure is contraindicated or has failed. The strategy used to restore function follows a common set of principles, no matter the nature of the deficit. The first step is to clearly distinguish between deficient muscles and muscles that could be transferred. Next, the type of palsy will dictate the scope of the program and the complexity of the gripping movements that can be restored. Based on this reasoning, a surgical strategy that matches the means (transferable muscles) with the objectives (functions to restore) will be established and clearly explained to the patient. Every paralyzed hand can be described using three parameters. 1) Deficient segments: wrist, thumb and long fingers; 2) mechanical performance of muscles groups being revived: high energy-wrist extension and finger flexion that require strong transfers with long excursion; low energy-wrist flexion and finger extension movements that are less demanding mechanically, because they can be accomplished through gravity alone in some cases; 3) condition of the two primary motors in the hand: extrinsics (flexors and extensors) and intrinsics (facilitator). No matter the type of palsy, the transfer surgery follows the same technical principles: exposure, release, fixation, tensioning and rehabilitation. By performing an in-depth analysis of each case and by following strict technical principles, tendon transfer surgery leads to reproducible results; this allows the surgeon to establish clear objectives for the patient preoperatively. PMID:27117119

  4. A minimal living system and the origin of a protocell

    NASA Technical Reports Server (NTRS)

    Oro, J.; Lazcano, A.

    1984-01-01

    The essential molecular attributes of a minimal living system are discussed, and the evolution of a protocell from such a system is considered. Present thought on the emergence and evolution of life is summarized, and the complexity of biological systems is reviewed. The five fundamental molecular attributes considered are: informational molecules, catalytic peptides, a decoding and translation system, protoribosomes, and protomembranes. Their functions in a primitive cell are discussed. Positive feedback interactions between proto-RNA, proto-AA-tRNA, and protoenzymes are identified as the three major steps toward the formation of a primitive living cell.

  5. Tensorial Minkowski functionals of triply periodic minimal surfaces

    PubMed Central

    Mickel, Walter; Schröder-Turk, Gerd E.; Mecke, Klaus

    2012-01-01

    A fundamental understanding of the formation and properties of a complex spatial structure relies on robust quantitative tools to characterize morphology. A systematic approach to the characterization of average properties of anisotropic complex interfacial geometries is provided by integral geometry which furnishes a family of morphological descriptors known as tensorial Minkowski functionals. These functionals are curvature-weighted integrals of tensor products of position vectors and surface normal vectors over the interfacial surface. We here demonstrate their use by application to non-cubic triply periodic minimal surface model geometries, whose Weierstrass parametrizations allow for accurate numerical computation of the Minkowski tensors. PMID:24098847

  6. Minimizing travel claims cost with minimal-spanning tree model

    NASA Astrophysics Data System (ADS)

    Jamalluddin, Mohd Helmi; Jaafar, Mohd Azrul; Amran, Mohd Iskandar; Ainul, Mohd Sharizal; Hamid, Aqmar; Mansor, Zafirah Mohd; Nopiah, Zulkifli Mohd

    2014-06-01

    Official travel necessitates substantial expenditure, as has been shown by the National Audit Department (NAD). Every year the auditing process is carried out throughout the country, involving official travel claims. This study focuses on the use of the spanning-tree model to determine the shortest paths and thereby minimize the cost of the NAD's official travel claims. The objective is to study the possibility of running a network based at the Kluang District Health Office to eight rural clinics in Johor state, applying the spanning-tree model to optimize travelling distances (as illustrated in the sketch below), and to make recommendations to the senior management of the Audit Department for analyzing travel details before an audit is conducted. Results of this study reveal savings of up to 47.4% of the original claims in travelled distance.
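
    As an illustration of the underlying computation (node names and road distances below are hypothetical, not the NAD data), Prim's algorithm yields the cheapest tree connecting the office to all clinics:

        # Minimal-spanning-tree sketch for the office/clinics network.
        import heapq

        def prim_mst(graph, start):
            """Prim's algorithm; graph maps node -> {neighbour: distance}."""
            visited, edges, total = {start}, [], 0.0
            heap = [(d, start, v) for v, d in graph[start].items()]
            heapq.heapify(heap)
            while heap and len(visited) < len(graph):
                d, u, v = heapq.heappop(heap)
                if v in visited:
                    continue
                visited.add(v)
                edges.append((u, v, d))
                total += d
                for w, dw in graph[v].items():
                    if w not in visited:
                        heapq.heappush(heap, (dw, v, w))
            return edges, total

        graph = {  # symmetric distances in km (hypothetical)
            "Office":  {"ClinicA": 30, "ClinicB": 45, "ClinicC": 60},
            "ClinicA": {"Office": 30, "ClinicB": 20, "ClinicC": 40},
            "ClinicB": {"Office": 45, "ClinicA": 20, "ClinicC": 25},
            "ClinicC": {"Office": 60, "ClinicA": 40, "ClinicB": 25},
        }
        edges, total = prim_mst(graph, "Office")
        print(edges, total)  # a 75 km tree versus 135 km of separate direct links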

  7. Annual Waste Minimization Summary Report

    SciTech Connect

    Alfred J. Karns

    2007-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC (NSTec), for the U. S. Department of Energy (DOE) National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during CY06. This report was developed in accordance with the requirements of the Nevada Test Site (NTS) Resource Conservation and Recovery Act (RCRA) Permit (No. NEV HW0021) and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the DOE, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by the NNSA/NSO and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by the NNSA/NSO.

  8. Less minimal supersymmetric standard model

    SciTech Connect

    de Gouvea, Andre; Friedland, Alexander; Murayama, Hitoshi

    1998-03-28

    Most of the phenomenological studies of supersymmetry have been carried out using the so-called minimal supergravity scenario, where one assumes a universal scalar mass, gaugino mass, and trilinear coupling at M_GUT. Even though this is a useful simplifying assumption for phenomenological analyses, it is rather too restrictive to accommodate a large variety of phenomenological possibilities. It predicts, among other things, that the lightest supersymmetric particle (LSP) is an almost pure B-ino, and that the μ-parameter is larger than the masses of the SU(2)_L and U(1)_Y gauginos. We extend the minimal supergravity framework by introducing one extra parameter: the Fayet-Iliopoulos D-term for the hypercharge U(1), D_Y. Allowing for this extra parameter, we find a much more diverse phenomenology, where the LSP is the tau sneutrino, the stau, or a neutralino with a large higgsino content. We discuss the relevance of the different possibilities to collider signatures. The same type of extension can be done to models with gauge mediation of supersymmetry breaking. We argue that it is not wise to impose cosmological constraints on the parameter space.

  9. Symmetry breaking for drag minimization

    NASA Astrophysics Data System (ADS)

    Roper, Marcus; Squires, Todd M.; Brenner, Michael P.

    2005-11-01

    For locomotion at high Reynolds numbers, drag minimization favors fore-aft asymmetric slender shapes with blunt noses and sharp trailing edges. On the other hand, in an inertialess fluid the drag experienced by a body is independent of whether it travels forward or backward through the fluid, so there is no advantage to having a single preferred swimming direction. In fact, numerically determined minimum-drag shapes are known to exhibit almost no fore-aft asymmetry even at moderate Re. We show that asymmetry persists, albeit extremely weakly, down to vanishingly small Re, scaling asymptotically as Re^3. The need to minimize drag to maximize speed for a given propulsive capacity gives one possible mechanism for the increasing asymmetry in the body plans seen in nature as organisms increase in size and swimming speed, from bacteria like E. coli up to pursuit-predator fish such as tuna. If it is the dominant mechanism, then this signature scaling will be observed in the shapes of motile micro-organisms.

  10. Dynamical minimalism: why less is more in psychology.

    PubMed

    Nowak, Andrzej

    2004-01-01

    The principle of parsimony, embraced in all areas of science, states that simple explanations are preferable to complex explanations in theory construction. Parsimony, however, can necessitate a trade-off with depth and richness in understanding. The approach of dynamical minimalism avoids this trade-off. The goal of this approach is to identify the simplest mechanisms and fewest variables capable of producing the phenomenon in question. A dynamical model in which change is produced by simple rules repetitively interacting with each other can exhibit unexpected and complex properties. It is thus possible to explain complex psychological and social phenomena with very simple models if these models are dynamic. In dynamical minimalist theories, then, the principle of parsimony can be followed without sacrificing depth in understanding. Computer simulations have proven especially useful for investigating the emergent properties of simple models. PMID:15223518
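
    A toy model in the spirit of dynamical minimalism (an illustration, not Nowak's actual model): one local imitation rule, iterated, turns a random field of binary opinions into large coherent clusters.

        # Majority-rule opinion dynamics on a periodic grid (sketch).
        import random

        N, STEPS = 40, 20000
        grid = [[random.choice([-1, 1]) for _ in range(N)] for _ in range(N)]

        for _ in range(STEPS):
            i, j = random.randrange(N), random.randrange(N)
            s = sum(grid[(i + di) % N][(j + dj) % N]
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
            if s != 0:               # adopt the local majority view
                grid[i][j] = 1 if s > 0 else -1

        # Net opinion drifts away from zero as clusters coarsen.
        print(sum(cell for row in grid for cell in row))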

  11. Protection - Principles and practice.

    NASA Technical Reports Server (NTRS)

    Graham, G. S.; Denning, P. J.

    1972-01-01

    The protection mechanisms of computer systems control the access to objects, especially information objects. The principles of protection system design are formalized as a model (theory) of protection. Each process has a unique identification number which is attached by the system to each access attempted by the process. Details of system implementation are discussed, taking into account the storing of the access matrix, aspects of efficiency, and the selection of subjects and objects. Two systems which have protection features incorporating all the elements of the model are described.
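
    The access-matrix idea can be sketched in a few lines; the subject and object names and rights below are hypothetical.

        # Access matrix as a map from (subject, object) to a set of rights.
        ACCESS = {
            ("proc_17", "payroll.db"): {"read"},
            ("proc_17", "tmp.log"):    {"read", "write"},
            ("proc_42", "tmp.log"):    {"read"},
        }

        def check(subject_id, obj, right):
            """The system attaches subject_id to every attempted access."""
            return right in ACCESS.get((subject_id, obj), set())

        assert check("proc_17", "tmp.log", "write")
        assert not check("proc_42", "payroll.db", "read")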

  12. Talus fractures: surgical principles.

    PubMed

    Rush, Shannon M; Jennings, Meagan; Hamilton, Graham A

    2009-01-01

    Surgical treatment of talus fractures can challenge even the most skilled foot and ankle surgeon. Complicated fracture patterns combined with joint dislocation of variable degrees require accurate assessment, sound understanding of principles of fracture care, and broad command of internal fixation techniques needed for successful surgical care. Elimination of unnecessary soft tissue dissection, a low threshold for surgical reduction, liberal use of malleolar osteotomy to expose body fracture, and detailed attention to fracture reduction and joint alignment are critical to the success of treatment. Even with the best surgical care complications are common and seem to correlate with injury severity and open injuries. PMID:19121756

  13. Nonequilibrium quantum Landauer principle.

    PubMed

    Goold, John; Paternostro, Mauro; Modi, Kavan

    2015-02-13

    Using the operational framework of completely positive, trace preserving operations and thermodynamic fluctuation relations, we derive a lower bound for the heat exchange in a Landauer erasure process on a quantum system. Our bound comes from a nonphenomenological derivation of the Landauer principle which holds for generic nonequilibrium dynamics. Furthermore, the bound depends on the nonunitality of dynamics, giving it a physical significance that differs from other derivations. We apply our framework to the model of a spin-1/2 system coupled to an interacting spin chain at finite temperature. PMID:25723198
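
    For orientation, the phenomenological bound that this derivation generalizes is Landauer's: erasing one bit with a bath at temperature T dissipates heat of at least

        \langle Q \rangle \;\ge\; k_B T \ln 2.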

  14. Principles of smile design

    PubMed Central

    Bhuvaneswaran, Mohan

    2010-01-01

    An organized and systematic approach is required to evaluate, diagnose and resolve esthetic problems predictably. It is of prime importance that the final result is not dependent only on the looks alone. Our ultimate goal as clinicians is to achieve pleasing composition in the smile by creating an arrangement of various esthetic elements. This article reviews the various principles that govern the art of smile designing. The literature search was done using PubMed search and Medline. This article will provide a basic knowledge to the reader to bring out a functional stable smile. PMID:21217950

  15. Precision laser spectroscopy in fundamental studies

    NASA Astrophysics Data System (ADS)

    Kolachevsky, N. N.; Khabarova, K. Yu

    2014-12-01

    The role of precision spectroscopic measurements in the development of fundamental theories is discussed, with particular emphasis on the hydrogen atom, the simplest stable atomic system amenable to the accurate calculation of energy levels from quantum electrodynamics. Research areas that greatly benefited from the participation of the Lebedev Physical Institute are reviewed, including the violation of fundamental symmetries, the stability of the fine-structure constant α, and sensitive tests of quantum electrodynamics.

  16. Next-to-minimal SOFTSUSY

    NASA Astrophysics Data System (ADS)

    Allanach, B. C.; Athron, P.; Tunstall, Lewis C.; Voigt, A.; Williams, A. G.

    2014-09-01

    We describe an extension to the SOFTSUSY program that provides for the calculation of the sparticle spectrum in the Next-to-Minimal Supersymmetric Standard Model (NMSSM), where a chiral superfield that is a singlet of the Standard Model gauge group is added to the Minimal Supersymmetric Standard Model (MSSM) fields. Often, a Z3 symmetry is imposed upon the model. SOFTSUSY can calculate the spectrum in this case as well as the case where general Z3-violating terms are added to the soft supersymmetry breaking terms and the superpotential. The user provides a theoretical boundary condition for the couplings and mass terms of the singlet. Radiative electroweak symmetry breaking data along with electroweak and CKM matrix data are used as weak-scale boundary conditions. The renormalisation group equations are solved numerically between the weak scale and a high energy scale using a nested iterative algorithm. This paper serves as a manual to the NMSSM mode of the program, detailing the approximations and conventions used. Catalogue identifier: ADPM_v4_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADPM_v4_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 154886 No. of bytes in distributed program, including test data, etc.: 1870890 Distribution format: tar.gz Programming language: C++, fortran. Computer: Personal computer. Operating system: Tested on Linux 3.x. Word size: 64 bits Classification: 11.1, 11.6. Does the new version supersede the previous version?: Yes Catalogue identifier of previous version: ADPM_v3_0 Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 785 Nature of problem: Calculating supersymmetric particle spectrum and mixing parameters in the next-to-minimal supersymmetric standard model. The solution to the

  17. Waste minimization plan construction and operation of the replacement cross-site transfer system, project W-058

    SciTech Connect

    Boucher, T.D.

    1996-04-01

    This report addresses the research and development of a waste minimization plan for the construction and operation of Project W-058, Replacement of the Cross-Site Transfer System, on the Hanford Site. The plan is based on Washington Administrative Code (WAC) 173-307, Plans. The waste minimization plan identifies areas where pollution prevention/waste minimization principles can be incorporated into the construction and operation of the cross-site transfer system.

  18. Academic Principles: A Brief Introduction

    ERIC Educational Resources Information Center

    Association of American Universities, 2013

    2013-01-01

    For many decades certain core principles have guided the conduct of teaching, research, and scholarship at American universities, as well as the ways in which these institutions are governed. There is ample evidence that these principles have strongly contributed to the quality of American universities. The principles have also made these…

  19. Archimedes' Principle in General Coordinates

    ERIC Educational Resources Information Center

    Ridgely, Charles T.

    2010-01-01

    Archimedes' principle is well known to state that a body submerged in a fluid is buoyed up by a force equal to the weight of the fluid displaced by the body. Herein, Archimedes' principle is derived from first principles by using conservation of the stress-energy-momentum tensor in general coordinates. The resulting expression for the force is…
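
    In flat space and a uniform gravitational field, the general-coordinates expression must reduce to the familiar magnitude

        F_b \;=\; \rho_{\mathrm{fluid}}\, g\, V_{\mathrm{displaced}},

    directed opposite to gravity.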

  20. Principles of climate service development

    NASA Astrophysics Data System (ADS)

    Buontempo, Carlo; Liggins, Felicity; Newton, Paula

    2015-04-01

    In November 2014, a group of 30 international experts in climate service development gathered in Honiton, UK, to discuss and identify the key principles that should be considered when developing new climate services by all the actors involved. Through an interactive and dynamic workshop the attendees identified seven principles. This contribution summarises these principles.

  1. THERMODYNAMIC FUNDAMENTALS USED IN HAZARDOUS WASTE INCINERATION

    EPA Science Inventory

    Thermodynamics is the basic foundation of many engineering practices. Environmental engineering is no exception; it uses thermodynamic principles in many applications. In particular, those who are involved in the incineration of various wastes such as hazardous and municipal wastes...

  2. Minimizing communication cost among distributed controllers in software defined networks

    NASA Astrophysics Data System (ADS)

    Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Software Defined Networking (SDN) is a new paradigm that increases the flexibility of today's networks by promising a programmable network. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and the data plane of network devices, and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure, and to increase scalability and flexibility during workload distribution. Even though controllers are flexible and scalable enough to accommodate more network switches, the intercommunication cost between distributed controllers is still a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism that minimizes intercommunication cost using a graph-partitioning algorithm, an NP-hard problem. The methodology proposed in this paper is the swapping of network elements between controller domains to minimize communication cost by calculating the communication gain, as sketched below. The swapping of elements minimizes inter- and intra-domain communication cost. We validate our work with the OMNeT++ simulation environment. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
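
    A minimal sketch of that gain computation (switch names and traffic weights are hypothetical; the paper's mechanism is evaluated in OMNeT++):

        # Move a switch between controller domains when doing so reduces
        # the traffic crossing domain boundaries (Kernighan-Lin-style gain).
        import itertools

        traffic = {  # symmetric inter-switch traffic (hypothetical units)
            ("s1", "s2"): 5, ("s1", "s3"): 1,
            ("s2", "s3"): 4, ("s2", "s4"): 2, ("s3", "s4"): 6,
        }

        def w(a, b):
            return traffic.get((a, b)) or traffic.get((b, a), 0)

        def cut_cost(domains):
            """Total traffic crossing domain boundaries."""
            return sum(w(a, b)
                       for da, db in itertools.combinations(domains, 2)
                       for a in da for b in db)

        def gain(node, src, dst):
            """Reduction in cut cost from moving node from src to dst."""
            return sum(w(node, b) for b in dst) - sum(w(node, b) for b in src if b != node)

        domains = [{"s1", "s2"}, {"s3", "s4"}]
        print(cut_cost(domains))                    # 7
        if gain("s2", domains[0], domains[1]) > 0:  # gain is 6 - 5 = 1
            domains[0].remove("s2"); domains[1].add("s2")
        print(cut_cost(domains))                    # 6 after the swap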

  3. Fundamental performance improvement to dispersive spectrograph based imaging technologies

    NASA Astrophysics Data System (ADS)

    Meade, Jeff T.; Behr, Bradford B.; Cenko, Andrew T.; Christensen, Peter; Hajian, Arsen R.; Hendrikse, Jan; Sweeney, Frederic D.

    2011-03-01

    Dispersive spectrometers may be characterized by their spectral resolving power and their throughput efficiency. A device known as a virtual slit is able to improve the resolving power by factors of several with minimal loss in throughput, thereby fundamentally improving the quality of the spectrometer. A virtual slit was built and incorporated into a low-performing spectrometer (R ~ 300) and was shown to increase the performance without a significant loss in signal. The operation and description of virtual slits are also given. High-performance, low-light, and high-speed imaging instruments based on dispersive spectrometers see the greatest impact from a virtual slit. A virtual slit is shown to substantially improve the image quality of spectral domain optical coherence tomography (SD-OCT).
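
    For reference, resolving power here is the standard figure of merit, so R ~ 300 means wavelength features separated by roughly λ/300 are just distinguishable:

        R \;\equiv\; \frac{\lambda}{\Delta\lambda}.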

  4. The Principle of Maximum Conformality

    SciTech Connect

    Brodsky, Stanley J.; Di Giustino, Leonardo; /SLAC

    2011-04-05

    A key problem in making precise perturbative QCD predictions is the uncertainty in determining the renormalization scale of the running coupling α_s(μ²). It is common practice to guess a physical scale μ = Q of the order of a typical momentum transfer Q in the process, and then to vary the scale over the range Q/2 to 2Q. This procedure is clearly problematic since the resulting fixed-order pQCD prediction will depend on the renormalization scheme, and it can even predict negative QCD cross sections at next-to-leading order. Other heuristic methods to set the renormalization scale, such as the 'principle of minimal sensitivity', give unphysical results for jet physics, sum physics into the running coupling not associated with renormalization, and violate the transitivity property of the renormalization group. Such scale-setting methods also give incorrect results when applied to Abelian QED. Note that the factorization scale in QCD is introduced to match nonperturbative and perturbative aspects of the parton distributions in hadrons; it is present even in conformal theory and is thus a completely separate issue from renormalization scale setting. The PMC provides a consistent method for determining the renormalization scale in pQCD. The PMC scale-fixed prediction is independent of the choice of renormalization scheme, a key requirement of renormalization group invariance. The results avoid renormalon resummation and agree with QED scale-setting in the Abelian limit. The PMC global scale can be derived efficiently at NLO from basic properties of the pQCD cross section. The elimination of the renormalization scheme ambiguity using the PMC will not only increase the precision of QCD tests, but will also increase the sensitivity of colliders to new physics beyond the Standard Model.

  5. Ergonomic T-Handle for Minimally Invasive Surgical Instruments

    PubMed Central

    Parekh, J; Shepherd, DET; Hukins, DWL; Maffulli, N

    2016-01-01

    A T-handle has been designed to be used for minimally invasive implantation of a dynamic hip screw to repair fractures of the proximal femur. It is capable of being used in two actions: (i) push and hold (while using an angle guide) and (ii) application of torque when using the insertion wrench and lag screw tap. The T-handle can be held in a power or precision grip. It is suitable for either single (sterilised by γ-irradiation) or multiple (sterilised by autoclaving) use. The principles developed here are applicable to handles for a wide range of surgical instruments. PMID:27326394

  6. Spinless Particle in a Magnetic Field Under Minimal Length Scenario

    NASA Astrophysics Data System (ADS)

    Amirfakhrian, S. M.

    2016-06-01

    In this article, we study the Klein-Gordon equation in a generalised uncertainty principle (GUP) framework which predicts a minimal uncertainty in position. We consider a spinless particle in this framework in the presence of a magnetic field, applied in the z-direction, which varies as 1/x^2. We find the energy eigenvalues of this system and also obtain the corresponding eigenfunctions numerically. When the GUP parameter tends to zero, our solutions are in agreement with those obtained in the absence of GUP.

  7. Probing minimal flavor violation at the CERN LHC

    SciTech Connect

    Grossman, Yuval; Nir, Yosef; Volansky, Tomer; Thaler, Jesse; Zupan, Jure

    2007-11-01

    If the LHC experiments discover new particles that couple to the standard model fermions, then measurements by ATLAS and CMS can contribute to our understanding of the flavor puzzles. We demonstrate this statement by investigating a scenario where extra SU(2)-singlet down-type quarks are within the LHC reach. By measuring masses, production cross sections, and relative decay rates, minimal flavor violation (MFV) can in principle be excluded. Conversely, these measurements can probe the way in which MFV applies to the new degrees of freedom. Many of our conclusions are valid in a much more general context than this specific extension of the standard model.

  8. On the quantum mechanical solutions with minimal length uncertainty

    NASA Astrophysics Data System (ADS)

    Shababi, Homa; Pedram, Pouria; Chung, Won Sang

    2016-06-01

    In this paper, we study two generalized uncertainty principles (GUPs), [X, P] = iℏ(1 + βP^{2j}) and [X, P] = iℏ(1 + βP^2 + kβ^2P^4), which imply minimal measurable lengths. Using two momentum representations, for the former GUP we find eigenvalues and eigenfunctions of the free particle and the harmonic oscillator in terms of generalized trigonometric functions. Also, for the latter GUP, we obtain quantum mechanical solutions of a particle in a box and the harmonic oscillator. Finally, we investigate the statistical properties of the harmonic oscillator, including the partition function, internal energy, and heat capacity, in the context of the first GUP.

  10. Fundamental limit on accuracy in interferometry.

    PubMed

    Kafri, O

    1989-07-01

    We derive a golden rule to evaluate the limit on fringe resolution. The rule, which is derived from basic principles, takes into account mechanical vibration. We show that conventional interferometer systems are limited by the available vibration-isolation tables to approximately 1/1000 of a fringe. In principle, moiré deflectometry can yield a much better signal-to-noise ratio. PMID:19752926
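
    To put the 1/1000-of-a-fringe figure in length units, a back-of-envelope sketch (assuming a HeNe wavelength and a double-pass Michelson geometry, neither of which is specified in the abstract):

        # Back-of-envelope: mirror stability needed for a 1/1000-fringe error.
        # Assumptions (not from the paper): HeNe laser, double-pass Michelson,
        # so a mirror displacement dx changes the optical path by 2*dx.
        wavelength_nm = 632.8
        fringe_fraction = 1e-3          # the resolution limit quoted above
        dx_nm = fringe_fraction * wavelength_nm / 2
        print(f"required mirror stability: {dx_nm:.2f} nm")   # ~0.32 nm

    Sub-nanometre mirror stability is at the edge of what passive vibration-isolation tables deliver, which is consistent with the quoted limit.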

  11. Minimal genome: Worthwhile or worthless efforts toward being smaller?

    PubMed

    Choe, Donghui; Cho, Suhyung; Kim, Sun Chang; Cho, Byung-Kwan

    2016-02-01

    Microbial cells are versatile hosts for the production of value-added products due to well-established background knowledge, a variety of genetic tools, and ease of manipulation. Despite those advantages, the efficiency of newly incorporated synthetic pathways in microbial cells is frequently limited by innate metabolism, product toxicity, and growth-mediated genetic instability. To overcome those obstacles, a minimal genome harboring only the essential set of genes was proposed, a fascinating concept with potential for use as a platform strain. Here, we review the currently available artificially reduced genomes and discuss the prospects for extending the use of genome-reduced strains as programmable chassis. The genome-reduced strains generally show growth comparable to, and productivity higher than, their ancestral strains. In Escherichia coli, about 300 genes are estimated to constitute the minimal gene set under laboratory conditions. However, recent advances have revealed that there are non-essential components within essential genes, suggesting that the design principles of minimal genomes should be reconsidered. Current technology is not efficient enough to remove large numbers of interspersed genomic regions or to synthesize whole genomes, and construction of minimal genomes has frequently failed due to a lack of genomic information. Technological breakthroughs and intensive, systematic studies of genomes remain outstanding tasks. PMID:26356135

  12. Principles of Safety Pharmacology

    PubMed Central

    Pugsley, M K; Authier, S; Curtis, M J

    2008-01-01

    Safety Pharmacology is a rapidly developing discipline that uses the basic principles of pharmacology in a regulatory-driven process to generate data to inform risk/benefit assessment. The aim of Safety Pharmacology is to characterize the pharmacokinetic/pharmacodynamic (PK/PD) relationship of a drug's adverse effects using continuously evolving methodology. Unlike toxicology, Safety Pharmacology includes within its remit a regulatory requirement to predict the risk of rare lethal events. This gives Safety Pharmacology its unique character. The key issues for Safety Pharmacology are detection of an adverse effect liability, projection of the data into a safety margin calculation and, finally, clinical safety monitoring. This article sets out to explain the drivers for Safety Pharmacology so that the wider pharmacology community is better placed to understand the discipline. It concludes with a summary of principles that may help inform future resolution of unmet needs (especially establishing model validation for accurate risk assessment). Subsequent articles in this issue of the journal address specific aspects of Safety Pharmacology to explore the issues of model choice and the burden of proof, and to highlight areas of intensive activity (such as testing for drug-induced rare event liability and the challenge of testing the safety of so-called biologics: antibodies, gene therapy and so on). PMID:18604233

  13. Principles of immunology.

    PubMed

    Lentz, Ashley K; Feezor, Robert J

    2003-12-01

    The immune system, composed of innate and acquired immunity, allows an organism to fight off foreign pathogens. Healthy immunity fulfils four essential principles: (1) the ability to detect and fight off infection; (2) the ability to recognize a host's own cells as "self," thereby protecting them from attack; (3) memory of previous foreign infections; and (4) the ability to limit the response after the pathogen has been removed. In an unaltered state, the intricate network of immunologic organs and cells creates an environment for proper host defense. Without adequate execution of immunologic mechanisms, a host is rendered defenseless against pathogens. Conversely, an unchecked immune response can be self-destructive. As a result of either of these untoward sequelae, immune dysfunction can elicit disease states in the host. The goal of this review is to elucidate the characteristics of a healthy immune system, focusing on the principles of immunity and the cells that participate in host protection. We also briefly discuss the clinical ramifications of immune dysfunction. PMID:16215081

  14. Fracture mechanics principles.

    PubMed

    Mecholsky, J J

    1995-03-01

    The principles of linear elastic fracture mechanics (LEFM) were developed in the 1950s by George Irwin (1957). This work was based on previous investigations by Griffith (1920) and Orowan (1944). Irwin (1957) demonstrated that a crack of a given shape and location with respect to the loading geometry has a stress intensity associated with it. He also demonstrated the equivalence between the stress intensity concept and the familiar Griffith criterion of failure. More importantly, he described the systematic and controlled evaluation of the toughness of a material. Toughness is defined as the resistance of a material to rapid crack propagation and can be characterized by a single parameter, K_Ic. In contrast, the strength of a material depends on the size of the initiating crack present in that particular sample or component. The fracture toughness of a material is generally independent of the size of the initiating crack. The strength of any product is limited by the size of the cracks or defects introduced during processing, production and handling. Thus, the application of fracture mechanics principles to dental biomaterials is invaluable in new material development, production control and failure analysis. This paper describes the most useful equations of fracture mechanics for the failure analysis of dental biomaterials. PMID:8621030
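
    The abstract's central distinction (toughness is a material constant, while strength depends on flaw size) is captured by the basic LEFM relation σ_f = K_Ic/(Y√(πa)). A minimal sketch with illustrative numbers; the toughness value and geometry factor are assumptions, not taken from the paper:

        import math

        def failure_stress(K_Ic, a, Y=1.12):
            """LEFM fast-fracture stress in MPa: sigma_f = K_Ic / (Y*sqrt(pi*a)).

            K_Ic in MPa*sqrt(m), crack depth a in m; Y is a dimensionless geometry
            factor (1.12 is the textbook value for a shallow surface crack).
            """
            return K_Ic / (Y * math.sqrt(math.pi * a))

        # Illustrative numbers only: toughness ~1 MPa*sqrt(m), in the range of
        # brittle dental ceramics, with surface flaws of 10 and 100 micrometres.
        for a in (10e-6, 100e-6):
            print(f"a = {a*1e6:.0f} um -> sigma_f = {failure_stress(1.0, a):.0f} MPa")

    The tenfold larger flaw cuts the failure stress by about a factor of three (√10), while K_Ic is unchanged, which is exactly why fracture toughness, not strength, characterizes the material itself.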

  15. Update on designing and building minimal cells

    PubMed Central

    Jewett, Michael C.; Forster, Anthony C.

    2010-01-01

    Summary Minimal cells comprise only the genes and biomolecular machinery necessary for basic life. Synthesizing minimal and minimized cells will improve understanding of core biology, enhance development of biotechnology strains of bacteria, and enable evolutionary optimization of natural and unnatural biopolymers. Design and construction of minimal cells is proceeding in two different directions: “top-down” reduction of bacterial genomes in vivo and “bottom-up” integration of DNA/RNA/protein/membrane syntheses in vitro. Major progress in the last 5 years has occurred in synthetic genomics, minimization of the Escherichia coli genome, sequencing of minimal bacterial endosymbionts, identification of essential genes, and integration of biochemical systems. PMID:20638265

  16. Principles of Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Landé, Alfred

    2013-10-01

    Preface; Introduction: 1. Observation and interpretation; 2. Difficulties of the classical theories; 3. The purpose of quantum theory; Part I. Elementary Theory of Observation (Principle of Complementarity): 4. Refraction in inhomogeneous media (force fields); 5. Scattering of charged rays; 6. Refraction and reflection at a plane; 7. Absolute values of momentum and wave length; 8. Double ray of matter diffracting light waves; 9. Double ray of matter diffracting photons; 10. Microscopic observation of ρ(x) and σ(p); 11. Complementarity; 12. Mathematical relation between ρ(x) and σ(p) for free particles; 13. General relation between ρ(q) and σ(p); 14. Crystals; 15. Transition density and transition probability; 16. Resultant values of physical functions; matrix elements; 17. Pulsating density; 18. General relation between ρ(t) and σ(ε); 19. Transition density; matrix elements; Part II. The Principle of Uncertainty: 20. Optical observation of density in matter packets; 21. Distribution of momenta in matter packets; 22. Mathematical relation between ρ and σ; 23. Causality; 24. Uncertainty; 25. Uncertainty due to optical observation; 26. Dissipation of matter packets; rays in Wilson Chamber; 27. Density maximum in time; 28. Uncertainty of energy and time; 29. Compton effect; 30. Bothe-Geiger and Compton-Simon experiments; 31. Doppler effect; Raman effect; 32. Elementary bundles of rays; 33. Jeans' number of degrees of freedom; 34. Uncertainty of electromagnetic field components; Part III. The Principle of Interference and Schrödinger's equation: 35. Physical functions; 36. Interference of probabilities for p and q; 37. General interference of probabilities; 38. Differential equations for Ψ_p(q) and X_q(p); 39. Differential equation for φ_β(q); 40. The general probability amplitude Φ_β'(Q); 41. Point transformations; 42. General theorem of interference; 43. Conjugate variables; 44. Schrödinger's equation for conservative systems; 45. Schr

  17. Fundamentals of microfluidic cell culture in controlled microenvironments

    PubMed Central

    Young, Edmond W. K.; Beebe, David J.

    2010-01-01

    Microfluidics has the potential to revolutionize the way we approach cell biology research. The dimensions of microfluidic channels are well suited to the physical scale of biological cells, and the many advantages of microfluidics make it an attractive platform for new techniques in biology. One of the key benefits of microfluidics for basic biology is the ability to control parameters of the cell microenvironment at relevant length and time scales. Considerable progress has been made in the design and use of novel microfluidic devices for culturing cells and for subsequent treatment and analysis. With the recent pace of scientific discovery, it is becoming increasingly important to evaluate existing tools and techniques, and to synthesize fundamental concepts that would further improve the efficiency of biological research at the microscale. This tutorial review integrates fundamental principles from cell biology and local microenvironments with cell culture techniques and concepts in microfluidics. Culturing cells in microscale environments requires knowledge of multiple disciplines including physics, biochemistry, and engineering. We discuss basic concepts related to the physical and biochemical microenvironments of the cell, physicochemical properties of that microenvironment, cell culture techniques, and practical knowledge of microfluidic device design and operation. We also discuss the most recent advances in microfluidic cell culture and their implications on the future of the field. The goal is to guide new and interested researchers to the important areas and challenges facing the scientific community as we strive toward full integration of microfluidics with biology. PMID:20179823
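
    As a concrete illustration of why microchannel length scales matter, the standard dimensionless groups can be evaluated for a typical channel. All values below are illustrative assumptions, not figures from the review:

        # Order-of-magnitude check of why microchannels give laminar,
        # diffusion-dominated transport. All values are illustrative assumptions.
        rho = 1000.0    # water density, kg/m^3
        mu = 1e-3       # water viscosity, Pa*s
        D = 1e-9        # small-molecule diffusivity in water, m^2/s
        L = 100e-6      # channel height, m
        v = 1e-3        # mean flow speed, m/s

        Re = rho * v * L / mu    # Reynolds number: inertial vs. viscous forces
        Pe = v * L / D           # Peclet number: convection vs. diffusion
        t_diff = L**2 / D        # time to diffuse across the channel

        print(f"Re = {Re:g} (deep in the laminar regime)")
        print(f"Pe = {Pe:g}")
        print(f"cross-channel diffusion time ~ {t_diff:g} s")

    With Re of order 0.1, flow is strictly laminar, and the ~10 s cross-channel diffusion time is commensurate with cellular signalling, which is the sense in which microfluidic length and time scales are "relevant" to the cell microenvironment.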

  18. Fundamental constants and cosmic vacuum: The micro and macro connection

    NASA Astrophysics Data System (ADS)

    Fritzsch, Harald; Solà, Joan

    2015-06-01

    The idea that the vacuum energy density ρΛ could be time-dependent is a most reasonable one in the expanding Universe; in fact, it is much more reasonable than a rigid cosmological constant for the entire cosmic history. A dynamical ρΛ = ρΛ(t) offers a possibility to tackle the cosmological constant problem in its various facets. Furthermore, for a long time (most prominently since Dirac's first proposal of a time-variable gravitational coupling) the possibility that the fundamental “constants” of Nature are slowly drifting with the cosmic expansion has been continuously investigated. In the last two decades, and especially in recent times, mounting experimental evidence suggests that this could be the case. In this paper, we consider the possibility that these two groups of facts are intimately connected, namely that the observed acceleration of the Universe and the possible time variation of the fundamental constants are two manifestations of the same underlying dynamics. We call this the “micro and macro connection”, and on its basis we expect that the cosmological term in Einstein's equations, Newton's coupling and the masses of all the particles in the Universe, both the dark matter (DM) particles and the ordinary baryons and leptons, should all drift with the cosmic expansion. Here, we discuss specific cosmological models that realize this possibility in a way that preserves the principle of covariance of general relativity (GR).

  19. Catalyst design for enhanced sustainability through fundamental surface chemistry.

    PubMed

    Personick, Michelle L; Montemore, Matthew M; Kaxiras, Efthimios; Madix, Robert J; Biener, Juergen; Friend, Cynthia M

    2016-02-28

    Decreasing energy consumption in the production of platform chemicals is necessary to improve the sustainability of the chemical industry, which is the largest consumer of delivered energy. The majority of industrial chemical transformations rely on catalysts, and therefore designing new materials that catalyse the production of important chemicals via more selective and energy-efficient processes is a promising pathway to reducing energy use by the chemical industry. Efficiently designing new catalysts benefits from an integrated approach involving fundamental experimental studies and theoretical modelling, in addition to evaluation of materials under working catalytic conditions. In this review, we outline this approach in the context of a particular catalyst, nanoporous gold (npAu), an unsupported, dilute AgAu alloy catalyst that is highly active for the selective oxidative transformation of alcohols. Fundamental surface science studies on Au single crystals and AgAu thin-film alloys, in combination with theoretical modelling, were used to identify the principles that define the reactivity of npAu and subsequently enabled prediction of new reactive pathways on this material. Specifically, weak van der Waals interactions are key to the selectivity of Au materials, including npAu. We also briefly describe other systems in which this integrated approach has been applied. PMID:26755756

  20. Fundamental role of bistability in optimal homeostatic control

    NASA Astrophysics Data System (ADS)

    Wang, Guanyu

    2013-03-01

    Bistability is a fundamental phenomenon in nature and has a number of fine properties. However, these properties are consequences of bistability at the physiological level, and they do not explain why it had to emerge during evolution. Using optimal homeostasis as the first principle and Pontryagin's Maximum Principle as the optimization approach, I find that bistability emerges as an indispensable control mechanism. Because the mathematical model is general and the result is independent of parameters, it is likely that most biological systems use bistability to control homeostasis. Glucose homeostasis is a good example. It turns out that bistability is the only solution to a dilemma in glucose homeostasis: high insulin efficiency is required for rapid plasma glucose clearance, whereas an insulin-sparing state is required to guarantee the brain's safety during fasting. This new perspective can illuminate studies on the twin epidemics of obesity and diabetes and the corresponding intervention strategies. For example, overnutrition and a sedentary lifestyle may represent sudden environmental changes that cause the loss of optimality, which may contribute to the marked rise of obesity and diabetes in our generation.
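
    To see why bistability acts as a memory-like control element, a minimal toy model (not the paper's glucose-insulin model, which is analyzed with Pontryagin's Maximum Principle) is the cubic ODE dx/dt = -x^3 + x + u, whose hysteresis under a swept input u can be checked numerically:

        import numpy as np

        # Toy bistable system: dx/dt = -x^3 + x + u has two stable fixed points
        # for |u| < 2/(3*sqrt(3)) ~ 0.385. Sweeping u up and then back down
        # shows hysteresis: the state switches at different thresholds depending
        # on its history, i.e. the system "remembers" which branch it was on.

        def relax(u, x, dt=0.01, steps=5000):
            for _ in range(steps):
                x += dt * (-x**3 + x + u)   # settle onto a stable fixed point
            return x

        x = -1.0                            # start on the lower branch
        sweep = np.concatenate([np.linspace(-0.6, 0.6, 13),
                                np.linspace(0.6, -0.6, 13)])
        for u in sweep:
            x = relax(u, x)
            print(f"u = {u:+.2f} -> x* = {x:+.3f}")

    The printout shows the state jumping from the lower to the upper branch only once u exceeds about +0.39 on the upward sweep, but not returning until u falls below about -0.39 on the downward sweep, the hysteresis that lets a bistable system hold a physiological state (e.g. an insulin-sparing fasting state) without continuous input.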