Science.gov

Sample records for minimization fundamental principles

  1. Principle of minimal work fluctuations

    NASA Astrophysics Data System (ADS)

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality 〈e^(-βW)〉 = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain where the classical work protocol is realizable by an adiabatic process, then the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014), 10.1103/PhysRevE.90.052132].
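
    The convergence behavior described in this abstract can be illustrated with a toy Monte Carlo estimate. The sketch below assumes a hypothetical Gaussian work distribution W ~ N(μ, σ²), for which 〈e^(-βW)〉 = e^(-β(μ - βσ²/2)) exactly; it is not the Landau-Zener model of the paper. Larger work fluctuations (larger σ) make the Jarzynski average converge more slowly.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0

def jarzynski_estimate(sigma, n_samples, mu=1.0):
    """Monte Carlo estimate of <exp(-beta*W)> from work samples W ~ N(mu, sigma^2)."""
    W = rng.normal(mu, sigma, n_samples)
    return np.exp(-beta * W).mean()

# For Gaussian work the exact free-energy difference is dF = mu - beta*sigma^2/2,
# so the estimate should converge to exp(-beta*dF).
for sigma in (0.5, 2.0):
    dF = 1.0 - beta * sigma**2 / 2
    print(sigma, jarzynski_estimate(sigma, 100_000), np.exp(-beta * dF))
```

    With σ = 0.5 the estimate lands within a fraction of a percent of the exact value; with σ = 2.0 the same number of samples leaves a visibly noisier estimate, which is exactly the convergence issue the principle of minimal work fluctuations addresses.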

  2. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality 〈e^(-βW)〉 = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain where the classical work protocol is realizable by an adiabatic process, then the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)].

  3. Fundamental base closure environmental principles

    SciTech Connect

    Yim, R.A.

    1994-12-31

    Military base closures present a paradox. The rate, scale and timing of military base closures is historically unique. However, each base itself typically does not present unique problems. Thus, the challenge is to design innovative solutions to base redevelopment and remediation issues, while simultaneously adopting common, streamlined or pre-approved strategies to shared problems. The author presents six environmental principles that are fundamental to base closure. They are: remediation not clean up; remediation will impact reuse; reuse will impact remediation; remediation and reuse must be coordinated; environmental contamination must be evaluated as any other initial physical constraint on development, not as an overlay after plans are created; and remediation will impact development, financing and marketability.

  4. Fundamental ethical principles in health care.

    PubMed

    Thompson, I E

    1987-12-01

    In an attempt to clarify which requirements of morality are logically primary to the ethics of health care, two questions are examined: is there sufficient common ground among the medical, nursing, paramedical, chaplaincy, and social work professions to justify looking for ethical principles common to health care? Do sufficient logical grounds or consensus among health workers and the public exist to speak of "fundamental ethical principles in health care"? While respect for persons, justice, and beneficence are fundamental principles in a formal sense, how we view these principles in practice will depend on our particular culture and experience and the kinds of metaethical criteria we use for applying these principles.

  5. Two Fundamental Principles of Nature's Interactions

    NASA Astrophysics Data System (ADS)

    Ma, Tian; Wang, Shouhong

    2014-03-01

    In this talk, we present two fundamental principles of nature's interactions, the principle of interaction dynamics (PID) and the principle of representation invariance (PRI). Intuitively, PID takes the variation of the action functional under energy-momentum conservation constraint. PID offers a completely different and natural way of introducing Higgs fields. PRI requires that physical laws be independent of representations of the gauge groups. These two principles give rise to a unified field model for four interactions, which can be naturally decoupled to study individual interactions. With these two principles, we are able to derive 1) a unified theory for dark matter and dark energy, 2) layered strong and weak interaction potentials, and 3) the energy levels of subatomic particles. Supported in part by NSF, ONR and Chinese NSF.

  6. 47 CFR 36.2 - Fundamental principles underlying procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-kilometers is the basic fundamental allocation factor for interexchange circuit plant and exchange trunk... Fundamental principles underlying procedures. (a) The following general principles underlie the...

  7. Stem cell bioprocessing: fundamentals and principles

    PubMed Central

    Placzek, Mark R.; Chung, I-Ming; Macedo, Hugo M.; Ismail, Siti; Mortera Blanco, Teresa; Lim, Mayasari; Min Cha, Jae; Fauzi, Iliana; Kang, Yunyi; Yeo, David C.L.; Yip Joan Ma, Chi; Polak, Julia M.; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2008-01-01

    In recent years, the potential of stem cell research for tissue engineering-based therapies and regenerative medicine clinical applications has become well established. In 2006, Chung pioneered the first entire organ transplant using adult stem cells and a scaffold for clinical evaluation. With this a new milestone was achieved, with seven patients with myelomeningocele receiving stem cell-derived bladder transplants resulting in substantial improvements in their quality of life. While a bladder is a relatively simple organ, the breakthrough highlights the incredible benefits that can be gained from the cross-disciplinary nature of tissue engineering and regenerative medicine (TERM) that encompasses stem cell research and stem cell bioprocessing. Unquestionably, the development of bioprocess technologies for the transfer of the current laboratory-based practice of stem cell tissue culture to the clinic as therapeutics necessitates the application of engineering principles and practices to achieve control, reproducibility, automation, validation and safety of the process and the product. The successful translation will require contributions from fundamental research (from developmental biology to the ‘omics’ technologies and advances in immunology) and from existing industrial practice (biologics), especially on automation, quality assurance and regulation. The timely development, integration and execution of various components will be critical—failures of the past (such as in the commercialization of skin equivalents) on marketing, pricing, production and advertising should not be repeated. This review aims to address the principles required for successful stem cell bioprocessing so that they can be applied deftly to clinical applications. PMID:19033137

  8. Stem cell bioprocessing: fundamentals and principles.

    PubMed

    Placzek, Mark R; Chung, I-Ming; Macedo, Hugo M; Ismail, Siti; Mortera Blanco, Teresa; Lim, Mayasari; Cha, Jae Min; Fauzi, Iliana; Kang, Yunyi; Yeo, David C L; Ma, Chi Yip Joan; Polak, Julia M; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2009-03-01

    In recent years, the potential of stem cell research for tissue engineering-based therapies and regenerative medicine clinical applications has become well established. In 2006, Chung pioneered the first entire organ transplant using adult stem cells and a scaffold for clinical evaluation. With this a new milestone was achieved, with seven patients with myelomeningocele receiving stem cell-derived bladder transplants resulting in substantial improvements in their quality of life. While a bladder is a relatively simple organ, the breakthrough highlights the incredible benefits that can be gained from the cross-disciplinary nature of tissue engineering and regenerative medicine (TERM) that encompasses stem cell research and stem cell bioprocessing. Unquestionably, the development of bioprocess technologies for the transfer of the current laboratory-based practice of stem cell tissue culture to the clinic as therapeutics necessitates the application of engineering principles and practices to achieve control, reproducibility, automation, validation and safety of the process and the product. The successful translation will require contributions from fundamental research (from developmental biology to the 'omics' technologies and advances in immunology) and from existing industrial practice (biologics), especially on automation, quality assurance and regulation. The timely development, integration and execution of various components will be critical-failures of the past (such as in the commercialization of skin equivalents) on marketing, pricing, production and advertising should not be repeated. This review aims to address the principles required for successful stem cell bioprocessing so that they can be applied deftly to clinical applications.

  9. Fundamental Principles of Proper Space Kinematics

    NASA Astrophysics Data System (ADS)

    Wade, Sean

    It is desirable to understand the movement of both matter and energy in the universe based upon fundamental principles of space and time. Time dilation and length contraction are features of Special Relativity derived from the observed constancy of the speed of light. Quantum Mechanics asserts that motion in the universe is probabilistic and not deterministic. While the practicality of these dissimilar theories is well established through widespread application, inconsistencies in their marriage persist, marring their utility and preventing their full expression. After identifying an error in perspective, the current theories are tested by modifying logical assumptions to eliminate paradoxical contradictions. Analysis of simultaneous frames of reference leads to a new formulation of space and time that predicts the motion of both kinds of particles. Proper Space is a real, three-dimensional space clocked by proper time that is undergoing a densification at the rate of c. Coordinate transformations to a familiar object space and a mathematical stationary space clarify the counterintuitive aspects of Special Relativity. These symmetries demonstrate that within the local universe stationary observers are a forbidden frame of reference; all is in motion. In lieu of Quantum Mechanics and Uncertainty, the use of the imaginary number i is restricted to the labeling of mass as either material or immaterial. This material phase difference accounts for both the perceived constant velocity of light and its apparent statistical nature. The application of Proper Space Kinematics will advance more accurate representations of microscopic, macroscopic, and cosmological processes and serve as a foundation for further study and reflection, thereafter leading to greater insight.

  10. The "Fundamental Pedagogical Principle" in Second Language Teaching.

    ERIC Educational Resources Information Center

    Krashen, Stephen D.

    1981-01-01

    A fundamental principle of second language acquisition is stated and applied to language teaching. The principle states that learners acquire a second language when they receive comprehensible input in situations where their affective filters are sufficiently low. The theoretical background of this principle consists of five hypotheses: the…

  11. Fundamental Ethical Principles in Sports Medicine.

    PubMed

    Devitt, Brian M

    2016-04-01

    In sports medicine, the practice of ethics presents many unique challenges because of the unusual clinical environment of caring for players within the context of a team whose primary goal is to win. Ethical issues frequently arise because a doctor-patient-team triad often replaces the traditional doctor-patient relationship. Conflict may exist when the team's priority clashes with or even replaces the doctor's obligation to player well-being. Customary ethical norms that govern most forms of clinical practice, such as autonomy and confidentiality, are not easily translated to sports medicine. Ethical principles and examples of how they relate to sports medicine are discussed. PMID:26832970

  12. Functional Neuroimaging: Fundamental Principles and Clinical Applications

    PubMed Central

    Altmeyer, Wilson; Zhuo, Jiachen; Steven, Andrew

    2015-01-01

    Functional imaging modalities, such as functional magnetic resonance imaging (fMRI) and diffusion tensor imaging (DTI), are rapidly changing the scope and practice of neuroradiology. While these modalities have long been used in research, they are increasingly being used in clinical practice to enable reliable identification of eloquent cortex and white matter tracts in order to guide treatment planning and to serve as a diagnostic supplement when traditional imaging fails. An understanding of the scientific principles underlying fMRI and DTI is necessary in current radiological practice. fMRI relies on a compensatory hemodynamic response seen in cortical activation and the intrinsic discrepant magnetic properties of deoxy- and oxyhemoglobin. Neuronal activity can be indirectly visualized based on a hemodynamic response, termed neurovascular coupling. fMRI demonstrates utility in identifying areas of cortical activation (i.e., task-based activation) and in discerning areas of neuronal connectivity when used during the resting state, termed resting state fMRI. While fMRI is limited to visualization of gray matter, DTI permits visualization of white matter tracts through diffusion restriction along different axes. We will discuss the physical, statistical and physiological principles underlying these functional imaging modalities and explore new promising clinical applications. PMID:25963153
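
    As a concrete illustration of the DTI side, fractional anisotropy (FA), the standard scalar index of diffusion directionality, can be computed from the eigenvalues of the diffusion tensor. A minimal sketch with illustrative tensors (not taken from the article):

```python
import numpy as np

def fractional_anisotropy(D):
    """Fractional anisotropy of a 3x3 symmetric diffusion tensor D.

    FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda||, ranging from
    0 (isotropic diffusion) to 1 (diffusion confined to a single axis).
    """
    lam = np.linalg.eigvalsh(D)
    return np.sqrt(1.5) * np.linalg.norm(lam - lam.mean()) / np.linalg.norm(lam)

# Isotropic diffusion (CSF-like): FA ~ 0.
print(fractional_anisotropy(np.eye(3) * 1e-3))
# Strongly directional diffusion (white-matter-tract-like): FA near 1.
print(fractional_anisotropy(np.diag([1.7e-3, 0.2e-3, 0.2e-3])))
```

    Tractography builds on exactly this kind of per-voxel eigen-analysis: the principal eigenvector gives the local fiber direction, and FA indicates how trustworthy that direction is.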

  13. [Fundamentals and principles of grafts and flaps].

    PubMed

    Cruz-Navarro, Natalio; León-Dueñas, Eduardo

    2014-01-01

    Reconstructive surgery of large urethral stenosis and the management of congenital anomalies such as hypospadias and epispadias require covering large cutaneous and mucosal defects with different techniques. The objective of this work is to define the main differences between tissues to be transferred and to study the principles that must govern the management of the various flaps and grafts used for these techniques. We analyze the anatomical and physiological features that may be key to understand the success and possible failures of these procedures, and we review technical details that must accompany in every case, not only during the operation, but also during the preoperative and postoperative period. We conclude stating that grafts (mainly oral and preputial mucosa) and flaps are increasingly used for the repair of urethral stenosis. Grafts must be prepared adequately in the back table and thinned to the maximum, and also be fixed properly, to guarantee their immobility until neovascularization is assured.

  14. [Development, terminology, principles, and controversies in minimally invasive knee arthroplasty].

    PubMed

    Hofmann, S; Pietsch, M

    2007-12-01

    Minimally invasive total knee arthroplasty is a logical further development of the good results achieved with minimally invasive unicondylar knee arthroplasty. The terminology for minimally invasive surgery (MIS) is confusing, and comparison of different techniques is therefore difficult. A simple distinction between less invasive and minimally invasive techniques is presented. Besides the approach, minimally invasive surgical principles are very important. MIS in total knee arthroplasty is currently a matter of considerable controversy. The preliminary results of these new techniques are very promising, but up to now there is much more feeling than knowing. Important questions (risk-benefit analysis, which technique for which patient and surgeon, education, and cost-effectiveness) must be addressed by the proponents of this MIS technique. Step-by-step learning of these new techniques (evolution instead of revolution) in specific education centres is strongly recommended. Ultimately, patients and surgeons will have to decide whether these new techniques are merely a modern trend or represent the future.

  15. Fundamental principles of energy consumption for gene expression.

    PubMed

    Huang, Lifang; Yuan, Zhanjiang; Yu, Jianshe; Zhou, Tianshou

    2015-12-01

    How energy is consumed in gene expression is largely unknown mainly due to complexity of non-equilibrium mechanisms affecting expression levels. Here, by analyzing a representative gene model that considers complexity of gene expression, we show that negative feedback increases energy consumption but positive feedback has an opposite effect; promoter leakage always reduces energy consumption; generating more bursts needs to consume more energy; and the speed of promoter switching is at the cost of energy consumption. We also find that the relationship between energy consumption and expression noise is multi-mode, depending on both the type of feedback and the speed of promoter switching. Altogether, these results constitute fundamental principles of energy consumption for gene expression, which lay a foundation for designing biologically reasonable gene modules. In addition, we discuss possible biological implications of these principles by combining experimental facts.
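
    The notions invoked here (promoter switching, bursting, super-Poissonian noise) are commonly formalized with the two-state "telegraph" gene model. The sketch below is a generic Gillespie simulation of that model with illustrative rate constants; it is not the authors' specific model or parameter set.

```python
import numpy as np

rng = np.random.default_rng(1)

def telegraph_ssa(k_on, k_off, k_tx, k_deg, t_end):
    """Gillespie simulation of a two-state promoter:
    OFF <-> ON (rates k_on/k_off), ON -> ON + mRNA (k_tx), mRNA -> 0 (k_deg).
    Slow switching combined with a high k_tx produces transcriptional bursts."""
    t, on, m = 0.0, 0, 0
    samples = []
    while t < t_end:
        rates = [k_off * on + k_on * (1 - on), k_tx * on, k_deg * m]
        total = sum(rates)
        t += rng.exponential(1 / total)
        r = rng.uniform(0, total)
        if r < rates[0]:
            on = 1 - on                  # promoter toggles OFF<->ON
        elif r < rates[0] + rates[1]:
            m += 1                       # transcription (only while ON)
        else:
            m -= 1                       # mRNA degradation
        samples.append(m)
    return np.array(samples)

m = telegraph_ssa(k_on=0.1, k_off=0.1, k_tx=10.0, k_deg=1.0, t_end=2000)
# A Fano factor (variance/mean) well above 1 signals bursty expression.
print(m.mean(), m.var() / m.mean())
```

    Speeding up k_on and k_off while keeping their ratio fixed drives the Fano factor back toward 1, which is the noise side of the switching-speed trade-off the abstract describes.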

  16. Rigorous force field optimization principles based on statistical distance minimization.

    PubMed

    Vlcek, Lukas; Chialvo, Ariel A

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model's static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length. PMID:26472366
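
    The statistical distance in question can be made concrete for discretized distributions, where it reduces to the Bhattacharyya angle arccos Σᵢ √(pᵢqᵢ). The sketch below fits a one-parameter harmonic model to an anharmonic target by minimizing this distance over a grid; the toy 1D potentials are illustrative assumptions, far simpler than the force-field setting of the paper.

```python
import numpy as np

x = np.linspace(-3, 3, 601)

def boltzmann(u, beta=1.0):
    """Discretized Boltzmann distribution for potential values u on the grid."""
    w = np.exp(-beta * u)
    return w / w.sum()

def statistical_distance(p, q):
    """Bhattacharyya angle: arccos of the overlap of sqrt-probabilities.
    It is zero iff the two distributions are indistinguishable."""
    return np.arccos(np.clip(np.sum(np.sqrt(p * q)), 0.0, 1.0))

# Target: anharmonic well. Model: harmonic well with adjustable stiffness k.
p_target = boltzmann(0.5 * x**2 + 0.1 * x**4)
ks = np.linspace(0.5, 3.0, 251)
d = [statistical_distance(p_target, boltzmann(0.5 * k * x**2)) for k in ks]
k_best = ks[int(np.argmin(d))]
print(k_best)  # effective stiffness > 1: the quartic term steepens the well
```

    The fitted k exceeds 1 because the quartic term narrows the target distribution; minimizing distinguishability automatically absorbs that effect into the effective parameter, which is the spirit of the coarse-graining approach.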

  17. Rigorous force field optimization principles based on statistical distance minimization

    SciTech Connect

    Vlcek, Lukas; Chialvo, Ariel A.

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.

  18. Negative-Refraction Metamaterials: Fundamental Principles and Applications

    NASA Astrophysics Data System (ADS)

    Eleftheriades, G. V.; Balmain, K. G.

    2005-06-01

    Learn about the revolutionary new technology of negative-refraction metamaterials. Negative-Refraction Metamaterials: Fundamental Principles and Applications introduces artificial materials that support the unusual electromagnetic property of negative refraction. Readers will discover several classes of negative-refraction materials along with their exciting, groundbreaking applications, such as lenses and antennas, imaging with super-resolution, microwave devices, dispersion-compensating interconnects, radar, and defense. The book begins with a chapter describing the fundamentals of isotropic metamaterials in which a negative index of refraction is defined. In the following chapters, the text builds on the fundamentals by describing a range of useful microwave devices and antennas. Next, a broad spectrum of exciting new research and emerging applications is examined, including:
    - Theory and experiments behind a super-resolving, negative-refractive-index transmission-line lens
    - 3-D transmission-line metamaterials with a negative refractive index
    - Numerical simulation studies of negative refraction of Gaussian beams and associated focusing phenomena
    - Unique advantages and theory of shaped lenses made of negative-refractive-index metamaterials
    - A new type of transmission-line metamaterial that is anisotropic and supports the formation of sharp steerable beams (resonance cones)
    - Implementations of negative-refraction metamaterials at optical frequencies
    - Unusual propagation phenomena in metallic waveguides partially filled with negative-refractive-index metamaterials
    - Metamaterials in which the refractive index and the underlying group velocity are both negative
    This work brings together the best minds in this cutting-edge field. It is fascinating reading for scientists, engineers, and graduate-level students in physics, chemistry, materials science, photonics, and electrical engineering.
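
    The defining feature of these materials, a negative index entering Snell's law, can be checked with one line of trigonometry. A minimal sketch of the textbook relation (not specific to the book's devices):

```python
import numpy as np

def refraction_angle(theta_i_deg, n1, n2):
    """Snell's law n1*sin(theta_i) = n2*sin(theta_t). A negative n2 yields a
    negative refraction angle: the ray emerges on the SAME side of the normal."""
    s = n1 * np.sin(np.radians(theta_i_deg)) / n2
    return np.degrees(np.arcsin(s))

print(refraction_angle(30, 1.0, 1.5))    # ordinary glass: about +19.5 degrees
print(refraction_angle(30, 1.0, -1.0))   # NRI metamaterial: -30 degrees
```

    The n2 = -1 case is the condition for Pendry's flat "perfect lens," which is why super-resolution imaging appears among the applications listed above.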

  19. Minimal self-models and the free energy principle.

    PubMed

    Limanowski, Jakub; Blankenburg, Felix

    2013-01-01

    The term "minimal phenomenal selfhood" (MPS) describes the basic, pre-reflective experience of being a self (Blanke and Metzinger, 2009). Theoretical accounts of the minimal self have long recognized the importance and the ambivalence of the body as both part of the physical world, and the enabling condition for being in this world (Gallagher, 2005a; Grafton, 2009). A recent account of MPS (Metzinger, 2004a) centers on the consideration that minimal selfhood emerges as the result of basic self-modeling mechanisms, thereby being founded on pre-reflective bodily processes. The free energy principle (FEP; Friston, 2010) is a novel unified theory of cortical function built upon the imperative that self-organizing systems entail hierarchical generative models of the causes of their sensory input, which are optimized by minimizing free energy as an approximation of the log-likelihood of the model. The implementation of the FEP via predictive coding mechanisms and in particular the active inference principle emphasizes the role of embodiment for predictive self-modeling, which has been appreciated in recent publications. In this review, we provide an overview of these conceptions and illustrate thereby the potential power of the FEP in explaining the mechanisms underlying minimal selfhood and its key constituents, multisensory integration, interoception, agency, perspective, and the experience of mineness. We conclude that the conceptualization of MPS can be well mapped onto a hierarchical generative model furnished by the FEP and may constitute the basis for higher-level, cognitive forms of self-referral, as well as the understanding of other minds.
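
    The free-energy minimization at the heart of the FEP can be sketched in its simplest Gaussian form: a single latent cause is inferred by gradient descent on precision-weighted prediction errors. The example below assumes a linear generative model g(v) = v and unit noise variances; it is a standard predictive-coding toy, not the review's own formulation.

```python
def free_energy(v, y, v_prior, s_y=1.0, s_p=1.0):
    """Gaussian free energy: sum of precision-weighted squared prediction errors
    (sensory error y - v, and prior error v - v_prior)."""
    return 0.5 * ((y - v) ** 2 / s_y + (v - v_prior) ** 2 / s_p)

def infer(y, v_prior, lr=0.1, steps=200):
    """Gradient descent on free energy with respect to the belief v."""
    v = v_prior
    for _ in range(steps):
        # dF/dv = -(y - v)/s_y + (v - v_prior)/s_p  (with s_y = s_p = 1)
        v -= lr * (-(y - v) + (v - v_prior))
    return v

# The belief settles between the prior (0) and the observation (2): v* = 1.
print(infer(y=2.0, v_prior=0.0))
```

    With unequal precisions the fixed point shifts toward the more precise source, which is the mechanism the FEP uses to weight interoceptive against exteroceptive evidence in self-modeling.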

  20. Classical Dynamics Based on the Minimal Length Uncertainty Principle

    NASA Astrophysics Data System (ADS)

    Chung, Won Sang

    2016-02-01

    In this paper we consider the quadratic modification of the Heisenberg algebra and its classical limit, which we call the β-deformed Poisson bracket for the corresponding classical variables. We use the β-deformed Poisson bracket to discuss some physical problems in β-deformed classical dynamics. Finally, we consider the (α, β)-deformed classical dynamics in which the minimal length uncertainty principle is given by [x̂, p̂] = iℏ(1 + αx̂² + βp̂²). For two small parameters α, β, we discuss the free fall of a particle and a composite system in a uniform gravitational field.
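
    A classical consequence of such a deformation can be sketched numerically. Assuming the deformed bracket takes the form {x, p} = 1 + βp² (so that ẋ = (1 + βp²)∂H/∂p and ṗ = -(1 + βp²)∂H/∂x), free fall under H = p²/2m + mgx integrates directly; the bracket form and parameter values here are illustrative assumptions, not taken from the paper.

```python
def free_fall(beta, g=9.81, m=1.0, t_end=1.0, dt=1e-4):
    """Euler integration of x' = (1 + beta*p^2) p/m, p' = -(1 + beta*p^2) m g,
    the equations of motion from the deformed bracket {x,p} = 1 + beta*p^2
    with Hamiltonian H = p^2/2m + m g x. Returns the position at t_end."""
    x, p = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        f = 1.0 + beta * p * p
        x += dt * f * p / m
        p -= dt * f * m * g
    return x

print(free_fall(0.0))      # beta = 0 recovers Newton: x ~ -g t^2 / 2 = -4.905
print(free_fall(1e-3))     # the deformation makes the fall slightly faster
```

    Because the factor (1 + βp²) exceeds 1 once the particle gains momentum, the deformed trajectory outruns the Newtonian one, a small but in principle observable correction.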

  1. The Principle of Minimal Resistance in Non-equilibrium Thermodynamics

    NASA Astrophysics Data System (ADS)

    Mauri, Roberto

    2016-04-01

    Analytical models describing the motion of colloidal particles in given force fields are presented. In addition to local approaches, leading to well-known master equations such as the Langevin and the Fokker-Planck equations, a global description based on path integration is reviewed. A new result is presented, showing that under very broad conditions, during its evolution a dissipative system tends to minimize its energy dissipation in such a way as to keep constant the Hamiltonian time rate, equal to the difference between the flux-based and the force-based Rayleigh dissipation functions. In fact, the Fokker-Planck equation can be interpreted as the Hamilton-Jacobi equation resulting from this minimum principle. At steady state, the Hamiltonian time rate is maximized, leading to a minimum resistance principle. In the unsteady case, we consider the relaxation to equilibrium of harmonic oscillators and the motion of a Brownian particle in shear flow, obtaining results that coincide with the solutions of the Fokker-Planck and the Langevin equations.
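
    The relaxation-to-equilibrium example mentioned at the end can be reproduced in miniature with an Euler-Maruyama integration of the overdamped Langevin equation for a harmonic oscillator; the parameter values are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(2)

def overdamped_langevin(k=1.0, gamma=1.0, kT=1.0, x0=3.0,
                        dt=1e-3, steps=5000, n_traj=2000):
    """Euler-Maruyama for dx = -(k/gamma) x dt + sqrt(2 kT/gamma) dW:
    an ensemble of overdamped harmonic oscillators relaxing to equilibrium."""
    x = np.full(n_traj, x0)
    for _ in range(steps):
        x += (-(k / gamma) * x * dt
              + np.sqrt(2 * kT * dt / gamma) * rng.standard_normal(n_traj))
    return x

x = overdamped_langevin()
# The mean decays as x0*exp(-k t/gamma); the variance relaxes to kT/k.
print(x.mean(), x.var())
```

    After t = 5 relaxation times the ensemble mean is essentially zero and the variance matches the equipartition value kT/k, the same stationary solution the Fokker-Planck equation yields.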

  2. Ultrafast mid-IR laser scalpel: protein signals of the fundamental limits to minimally invasive surgery.

    PubMed

    Amini-Nik, Saeid; Kraemer, Darren; Cowan, Michael L; Gunaratne, Keith; Nadesan, Puviindran; Alman, Benjamin A; Miller, R J Dwayne

    2010-09-28

    Lasers have in principle the capability to cut at the level of a single cell, the fundamental limit to minimally invasive procedures for restructuring biological tissues. To date, this limit has not been achieved, owing to macroscale collateral damage of surrounding tissue arising from thermal effects and shock waves. Here, we report on a novel concept using a specifically designed Picosecond IR Laser (PIRL) that selectively energizes water molecules in the tissue to drive the ablation or cutting process faster than thermal exchange of energy and shock wave propagation, without plasma formation or ionizing radiation effects. The targeted laser process imparts the least amount of energy to the remaining tissue, without any of the deleterious photochemical or photothermal effects that accompany other laser wavelengths and pulse parameters. Full-thickness incisional and excisional wounds were generated in CD1 mice using the Picosecond IR Laser, a conventional surgical laser (DELight Er:YAG) or mechanical surgical tools. Transmission and scanning electron microscopy showed that the PIRL laser produced minimal tissue ablation, with less damage of surrounding tissues than wounds formed using the other modalities. The width of scars formed by wounds made with the PIRL laser was half that of the scars produced using either a conventional surgical laser or a scalpel. Aniline blue staining showed higher levels of collagen in the early stage of the wounds produced using the PIRL laser, suggesting that these wounds mature faster. More viable cells were extracted from skin using the PIRL laser, suggesting less cellular damage. β-catenin and TGF-β signalling, which are activated during the proliferative phase of wound healing and whose levels of activation correlate with wound size, were lower in wounds generated by the PIRL system. Wounds created with the PIRL system also showed a lower rate of cell proliferation. Direct comparison of wound

  3. β-Alkyl Elimination: Fundamental Principles and Some Applications.

    PubMed

    O'Reilly, Matthew E; Dutta, Saikat; Veige, Adam S

    2016-07-27

    This review describes organometallic compounds and materials that are capable of mediating a rarely encountered but fundamentally important reaction: β-alkyl elimination at the metal-Cα-Cβ-R moiety, in which an alkyl group attached to the Cβ atom is transferred to the metal or to a coordinated substrate. The objectives of this review are to provide a cohesive fundamental understanding of β-alkyl-elimination reactions and to highlight its applications in olefin polymerization, alkane hydrogenolysis, depolymerization of branched polymers, ring-opening polymerization of cycloalkanes, and other useful organic reactions. To provide a coherent understanding of the β-alkyl elimination reaction, special attention is given to conditions and strategies used to facilitate β-alkyl-elimination/transfer events in metal-catalyzed olefin polymerization, which provide the well-studied examples.

  4. Nonthermal Plasma Synthesis of Nanocrystals: Fundamental Principles, Materials, and Applications.

    PubMed

    Kortshagen, Uwe R; Sankaran, R Mohan; Pereira, Rui N; Girshick, Steven L; Wu, Jeslin J; Aydil, Eray S

    2016-09-28

    Nonthermal plasmas have emerged as a viable synthesis technique for nanocrystal materials. Inherently solvent and ligand-free, nonthermal plasmas offer the ability to synthesize high purity nanocrystals of materials that require high synthesis temperatures. The nonequilibrium environment in nonthermal plasmas has a number of attractive attributes: energetic surface reactions selectively heat the nanoparticles to temperatures that can strongly exceed the gas temperature; charging of nanoparticles through plasma electrons reduces or eliminates nanoparticle agglomeration; and the large difference between the chemical potentials of the gaseous growth species and the species bound to the nanoparticle surfaces facilitates nanocrystal doping. This paper reviews the state of the art in nonthermal plasma synthesis of nanocrystals. It discusses the fundamentals of nanocrystal formation in plasmas, reviews practical implementations of plasma reactors, surveys the materials that have been produced with nonthermal plasmas and surface chemistries that have been developed, and provides an overview of applications of plasma-synthesized nanocrystals.

  6. Does quantity generate quality? Testing the fundamental principle of brainstorming.

    PubMed

    Muñoz Adánez, Alfredo

    2005-11-01

    The purpose of this work is to test the chief principle of brainstorming, formulated as "quantity generates quality." The study is included within a broad program whose goal is to detect the strong and weak points of creative techniques. In a sample of 69 groups, containing between 3 and 8 members, the concurrence of two commonly accepted criteria was established as a quality rule: originality and utility or value. The results fully support the quantity-quality relation (r = .893): the more ideas produced to solve a problem, the better the quality of those ideas. The importance of this finding, which supports Osborn's theory, is discussed, and the use of brainstorming is recommended to solve the many open problems faced by our society.

  7. Fundamental Principles of Classical Mechanics: A Geometrical Perspective

    NASA Astrophysics Data System (ADS)

    Lam, Kai S.

    2014-07-01

    Classical mechanics is the quantitative study of the laws of motion for macroscopic physical systems with mass. The fundamental laws of this subject, known as Newton's Laws of Motion, are expressed in terms of second-order differential equations governing the time evolution of vectors in a so-called configuration space of a system (see Chapter 12). In an elementary setting, these are usually vectors in 3-dimensional Euclidean space, such as position vectors of point particles; but typically they can be vectors in higher dimensional and more abstract spaces. A general knowledge of the mathematical properties of vectors, not only in their most intuitive incarnations as directed arrows in physical space but as elements of abstract linear vector spaces, and those of linear operators (transformations) on vector spaces as well, is then indispensable in laying the groundwork for both the physical and the more advanced mathematical - more precisely topological and geometrical - concepts that will prove to be vital in our subject. In this beginning chapter we will review these properties, and introduce the all-important related notions of dual spaces and tensor products of vector spaces. The notational convention for vectorial and tensorial indices used for the rest of this book (except when otherwise specified) will also be established...

  8. The Minimal Control Principle Predicts Strategy Shifts in the Abstract Decision Making Task

    ERIC Educational Resources Information Center

    Taatgen, Niels A.

    2011-01-01

    The minimal control principle (Taatgen, 2007) predicts that people strive for problem-solving strategies that require as few internal control states as possible. In an experiment with the Abstract Decision Making task (ADM task; Joslyn & Hunt, 1998) the reward structure was manipulated to make either a low-control strategy or a high-control strategy…

  9. Research on the fundamental principles of China's marine invasive species prevention legislation.

    PubMed

    Bai, Jiayu

    2014-12-15

    China's coastal area is severely damaged by marine invasive species. Traditional tort theory resolves issues relevant to property damage or personal injuries, through which plaintiffs cannot cope with the ecological damage caused by marine invasive species. Several defects exist within the current legal regimes, such as imperfect management systems, insufficient unified technical standards, and unsound legal responsibility systems. It is necessary to pass legislation to prevent the ecological damage caused by marine invasive species. This investigation probes the fundamental principles needed for the administration and legislation of an improved legal framework to combat the problem of invasive species within China's coastal waters.

  10. The uncertainty threshold principle - Fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1976-01-01

    The fundamental limitations of the optimal control of dynamic systems with random parameters are analyzed by studying a scalar linear-quadratic optimal control example. It is demonstrated that optimum long-range decision making is possible only if the dynamic uncertainty (quantified by the means and covariances of the random parameters) is below a certain threshold. If this threshold is exceeded, there do not exist optimum decision rules. This phenomenon is called the 'uncertainty threshold principle'. The implications of this phenomenon to the field of modelling, identification, and adaptive control are discussed.
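    The scalar example can be illustrated numerically (a sketch with invented numbers, not the paper's parameters). For x_{k+1} = a_k x_k + b u_k with i.i.d. random gain a_k of mean ā and variance σ², deterministic b, and stage cost q x² + r u², the backward cost-to-go recursion is K ← q + (ā² + σ²)K − (ā b K)²/(r + b²K); below the uncertainty threshold it settles to a finite limit, above it K grows without bound:

```python
def cost_to_go(a_mean, a_var, b, q, r, steps):
    """Backward cost-to-go recursion for a scalar LQ problem whose
    gain a is random (mean a_mean, variance a_var); b is deterministic."""
    K = 0.0
    for _ in range(steps):
        K = q + (a_mean**2 + a_var) * K - (a_mean * b * K)**2 / (r + b**2 * K)
    return K

# Illustrative numbers: with a_var = 0.5 the recursion converges to a
# finite cost; with a_var = 1.5 it diverges and no infinite-horizon
# optimal decision rule exists.
K_low = cost_to_go(a_mean=1.0, a_var=0.5, b=1.0, q=1.0, r=1.0, steps=500)
K_high = cost_to_go(a_mean=1.0, a_var=1.5, b=1.0, q=1.0, r=1.0, steps=500)
```

For these numbers the convergent case has the closed-form limit K = (3 + √17)/2, obtained by solving the fixed-point equation of the recursion.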

  11. A defense of fundamental principles and human rights: a reply to Robert Baker.

    PubMed

    Macklin, Ruth

    1998-12-01

    This article seeks to rebut Robert Baker's contention that attempts to ground international bioethics in fundamental principles cannot withstand the challenges posed by multiculturalism and postmodernism. First, several corrections are provided of Baker's account of the conclusions reached by the Advisory Committee on Human Radiation Experiments. Second, a rebuttal is offered to Baker's claim that an unbridgeable moral gap exists between Western individualism and non-Western communalism. In conclusion, this article argues that Baker's "nonnegotiable primary goods" cannot do the work of "classical human rights" and that the latter framework is preferable from both a practical and a theoretical standpoint.

  12. [Fundamental principles of social work--(also) a contribution to public health ethics].

    PubMed

    Lob-Hüdepohl, A

    2009-05-01

    Social work and public health are different but mutually connected. Both are professions with their own ethical foundations. Despite all differences, they have the same goal: to protect and to enhance the well-being of people. This is, in part, why the fundamental ethical principles of social work are salient for developing public health ethics. As a human rights profession, social work respects the personal autonomy of clients, supports solidarity-based relationships in families, groups or communities, and attempts to uphold social justice in society. Social workers need to adopt special professional attitudes: sensibility for the vulnerabilities of clients, care and attentiveness for their resources and strengths, assistance instead of paternalistic care and advocacy in decision making for clients' well-being when clients are not able to decide for themselves. These fundamental ethical principles are the basis for discussion of special topics of social work ethics as public health ethics, for example, in justifying intervention in individual lifestyles by public services without the participation or consent of the affected persons.

  13. Polynomial-time algorithms for the integer minimal principle for centrosymmetric structures.

    PubMed

    Vaia, Anastasia; Sahinidis, Nikolaos V

    2005-07-01

    The minimal principle for structure determination from single-crystal X-ray diffraction measurements has recently been formulated as an integer linear optimization model for the case of centrosymmetric structures. Solution of this model via established combinatorial branch-and-bound algorithms provides the true global minimum of the minimal principle while operating exclusively in reciprocal space. However, integer programming techniques may require an exponential number of iterations to exhaust the search space. In this paper, a new approach is developed to solve the integer minimal principle to global optimality without requiring the solution of an optimization problem. Instead, properties of the solution of the optimization problem, as observed in a large number of computational experiments, are exploited in order to reduce the optimization formulation to a system of linear equations over the field with two elements (F(2)). Two specialized Gaussian elimination algorithms are then developed to solve this system of equations in polynomial time in the number of atoms. Computational results on a collection of 38 structures demonstrate that the proposed approach provides very fast and accurate solutions to the phase problem for centrosymmetric structures. This approach also provided much better crystallographic R values than SHELXS for all 38 structures tested. PMID:15972998
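    In the two-element field, addition is XOR, so such systems can be solved by Gaussian elimination with bit operations. A generic GF(2) elimination sketch (illustrative only, not the authors' specialized algorithms):

```python
def solve_gf2(A, b):
    """Solve A x = b over GF(2) by Gaussian elimination.

    A: list of rows (lists of 0/1); b: list of 0/1 right-hand sides.
    Returns one solution (free variables set to 0), or None if inconsistent.
    """
    rows = [row[:] + [rhs] for row, rhs in zip(A, b)]  # augmented matrix
    n_vars = len(A[0])
    pivot_cols = []
    r = 0
    for c in range(n_vars):
        # find a row at or below r with a 1 in column c
        pivot = next((i for i in range(r, len(rows)) if rows[i][c]), None)
        if pivot is None:
            continue
        rows[r], rows[pivot] = rows[pivot], rows[r]
        # eliminate column c from every other row (XOR = addition mod 2)
        for i in range(len(rows)):
            if i != r and rows[i][c]:
                rows[i] = [x ^ y for x, y in zip(rows[i], rows[r])]
        pivot_cols.append(c)
        r += 1
    # a zero row with nonzero right-hand side means no solution exists
    for row in rows[r:]:
        if row[-1]:
            return None
    x = [0] * n_vars
    for i, c in enumerate(pivot_cols):
        x[c] = rows[i][-1]
    return x
```

Each elimination pass costs O(rows × columns), giving the polynomial overall running time that the abstract contrasts with branch-and-bound search.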

  14. Driving an Active Vibration Balancer to Minimize Vibrations at the Fundamental and Harmonic Frequencies

    NASA Technical Reports Server (NTRS)

    Holliday, Ezekiel S. (Inventor)

    2014-01-01

    Vibrations of a principal machine are reduced at the fundamental and harmonic frequencies by driving the drive motor of an active balancer with balancing signals at the fundamental and selected harmonics. Vibrations are sensed to provide a signal representing the mechanical vibrations. A balancing signal generator for the fundamental and for each selected harmonic processes the sensed vibration signal with adaptive filter algorithms of adaptive filters for each frequency to generate a balancing signal for each frequency. Reference inputs for each frequency are applied to the adaptive filter algorithms of each balancing signal generator at the frequency assigned to the generator. The harmonic balancing signals for all of the frequencies are summed and applied to drive the drive motor. The harmonic balancing signals drive the drive motor with a drive voltage component in opposition to the vibration at each frequency.
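    The patent does not spell out the adaptive filter algorithms; as a hedged illustration, a standard two-weight LMS tone canceller (quadrature reference inputs at one frequency, weights adapted from the sensed residual) captures the per-frequency idea:

```python
import math

def lms_tone_canceller(signal, freq, fs, mu=0.01):
    """Adapt two weights on quadrature references at a single frequency
    so their weighted sum tracks the sensed vibration component there;
    the residual e is what remains after the balancing signal is applied."""
    w_c = w_s = 0.0
    residual = []
    for n, d in enumerate(signal):
        t = n / fs
        ref_c = math.cos(2 * math.pi * freq * t)  # in-phase reference
        ref_s = math.sin(2 * math.pi * freq * t)  # quadrature reference
        y = w_c * ref_c + w_s * ref_s             # balancing estimate
        e = d - y                                  # residual vibration
        w_c += 2 * mu * e * ref_c                  # LMS weight updates
        w_s += 2 * mu * e * ref_s
        residual.append(e)
    return residual
```

Running one such canceller per selected harmonic and summing the outputs mirrors the summed harmonic balancing signals described above; on a pure tone at the reference frequency the residual decays toward zero.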

  15. Minimization principles for the coupled problem of Darcy-Biot-type fluid transport in porous media linked to phase field modeling of fracture

    NASA Astrophysics Data System (ADS)

    Miehe, Christian; Mauthe, Steffen; Teichtmeister, Stephan

    2015-09-01

    This work develops new minimization and saddle point principles for the coupled problem of Darcy-Biot-type fluid transport in porous media at fracture. It shows that the quasi-static problem of elastically deforming, fluid-saturated porous media is related to a minimization principle for the evolution problem. This two-field principle determines the rate of deformation and the fluid mass flux vector. It provides a canonically compact model structure, where the stress equilibrium and the inverse Darcy's law appear as the Euler equations of a variational statement. A Legendre transformation of the dissipation potential relates the minimization principle to a characteristic three field saddle point principle, whose Euler equations determine the evolutions of deformation and fluid content as well as Darcy's law. A further geometric assumption results in modified variational principles for a simplified theory, where the fluid content is linked to the volumetric deformation. The existence of these variational principles underlines inherent symmetries of Darcy-Biot theories of porous media. This can be exploited in the numerical implementation by the construction of time- and space-discrete variational principles, which fully determine the update problems of typical time stepping schemes. Here, the proposed minimization principle for the coupled problem is advantageous with regard to a new unconstrained stable finite element design, while space discretizations of the saddle point principles are constrained by the LBB condition. The variational principles developed provide the most fundamental approach to the discretization of nonlinear fluid-structure interactions, showing symmetric systems in algebraic update procedures. They also provide an excellent starting point for extensions towards more complex problems. This is demonstrated by developing a minimization principle for a phase field description of fracture in fluid-saturated porous media. It is designed for an

  16. Principles of minimal residual disease detection for hematopoietic neoplasms by flow cytometry.

    PubMed

    Wood, Brent L

    2016-01-01

    Flow cytometry has become an indispensable tool for the diagnosis and classification of hematopoietic neoplasms. The ability to rapidly distinguish cellular subpopulations via multiparametric assessment of quantitative differences in antigen expression on single cells and enumerate the relative sizes of the resulting subpopulations is a key feature of the technology. More recently, these capabilities have been expanded to include the identification and enumeration of rare subpopulations within complex cellular mixtures, for example, blood or bone marrow, leading to the application for post-therapeutic monitoring or minimal residual disease detection. This review will briefly present the principles to be considered in the construction and use of flow cytometric assays for minimal residual disease detection including the use of informative antibody combinations, the impact of immunophenotypic instability, enumeration, assay sensitivity, and reproducibility.

  17. Mobility analysis tool based on the fundamental principle of conservation of energy.

    SciTech Connect

    Spletzer, Barry Louis; Nho, Hyuchul C.; Salton, Jonathan Robert

    2007-08-01

    In the past decade, a great deal of effort has been focused on research and development of versatile robotic ground vehicles without understanding their performance in a particular operating environment. As the usage of robotic ground vehicles for intelligence applications increases, understanding mobility of the vehicles becomes critical to increase the probability of their successful operations. This paper describes a framework based on conservation of energy to predict the maximum mobility of robotic ground vehicles over general terrain. The basis of the prediction is the difference between traction capability and energy loss at the vehicle-terrain interface. The mission success of a robotic ground vehicle is primarily a function of mobility. Mobility of a vehicle is defined as the overall capability of a vehicle to move from place to place while retaining its ability to perform its primary mission. A mobility analysis tool based on the fundamental principle of conservation of energy is described in this document. The tool is a graphical user interface application. The mobility analysis tool has been developed at Sandia National Laboratories, Albuquerque, NM. The tool is at an initial stage of development. In the future, the tool will be expanded to include all vehicles and terrain types.

  18. Designing nanomaterials to maximize performance and minimize undesirable implications guided by the Principles of Green Chemistry.

    PubMed

    Gilbertson, Leanne M; Zimmerman, Julie B; Plata, Desiree L; Hutchison, James E; Anastas, Paul T

    2015-08-21

    The Twelve Principles of Green Chemistry were first published in 1998 and provide a framework that has been adopted not only by chemists, but also by design practitioners and decision-makers (e.g., materials scientists and regulators). The development of the Principles was initially motivated by the need to address decades of unintended environmental pollution and human health impacts from the production and use of hazardous chemicals. Yet, for over a decade now, the Principles have been applied to the synthesis and production of engineered nanomaterials (ENMs) and the products they enable. While the combined efforts of the global scientific community have led to promising advances in the field of nanotechnology, there remain significant research gaps and the opportunity to leverage the potential global economic, societal and environmental benefits of ENMs safely and sustainably. As such, this tutorial review benchmarks the successes to date and identifies critical research gaps to be considered as future opportunities for the community to address. A sustainable material design framework is proposed that emphasizes the importance of establishing structure-property-function (SPF) and structure-property-hazard (SPH) relationships to guide the rational design of ENMs. The goal is to achieve or exceed the functional performance of current materials and the technologies they enable, while minimizing inherent hazard to avoid risk to human health and the environment at all stages of the life cycle. PMID:25955514

  20. Simple approach to sediment provenance tracing using element analysis and fundamental principles

    NASA Astrophysics Data System (ADS)

    Matys Grygar, Tomas; Elznicova, Jitka; Popelka, Jan

    2016-04-01

    Common sediment fingerprinting techniques use either (1) extensive analytical datasets, sometimes nearly complete with respect to accessible characterization techniques; they are processed by multidimensional statistics based on certain statistical assumptions on distribution functions of analytical results and conservativeness/additivity of some components, or (2) analytically demanding characteristics such as isotope ratios assumed to be unequivocal "labels" on the parent material unaltered by any catchment process. The inherent problem of the approach ad (1) is that interpretation of statistical components ("sources") is done ex post and remains purely formal. The problem of the approach ad (2) is that catchment processes (weathering, transport, deposition) can modify most geochemical parameters of soils and sediments, in other words, that the idea that some geochemistry parameters are "conservative" may be idealistic. Grain-size effects and sediment provenance have a joint influence on chemical composition of fluvial sediments that is indeed not easy to distinguish. Attempts to separate those two main components using only statistics seem risky and equivocal, because grain-size dependence of element composition is nearly individual for each element and reflects sediment maturity and catchment-specific formation transport processes. We suppose that the use of less extensive datasets of analytical results and their interpretation respecting fundamental principles should be more robust than only statistic tools applied to overwhelming datasets. We examined sediment composition, both published by other researchers and gathered by us, and we found some general principles, which are in our opinion relevant for fingerprinting: (1) Concentrations of all elements are grain-size sensitive, i.e. there are no "conservative" elements in conventional sense of provenance- or transport-pathways tracing, (2) fractionation by catchment processes and fluvial transport changes

  1. Learning to minimize efforts versus maximizing rewards: computational principles and neural correlates.

    PubMed

    Skvortsova, Vasilisa; Palminteri, Stefano; Pessiglione, Mathias

    2014-11-19

    The mechanisms of reward maximization have been extensively studied at both the computational and neural levels. By contrast, little is known about how the brain learns to choose the options that minimize action cost. In principle, the brain could have evolved a general mechanism that applies the same learning rule to the different dimensions of choice options. To test this hypothesis, we scanned healthy human volunteers while they performed a probabilistic instrumental learning task that varied in both the physical effort and the monetary outcome associated with choice options. Behavioral data showed that the same computational rule, using prediction errors to update expectations, could account for both reward maximization and effort minimization. However, these learning-related variables were encoded in partially dissociable brain areas. In line with previous findings, the ventromedial prefrontal cortex was found to positively represent expected and actual rewards, regardless of effort. A separate network, encompassing the anterior insula, the dorsal anterior cingulate, and the posterior parietal cortex, correlated positively with expected and actual efforts. These findings suggest that the same computational rule is applied by distinct brain systems, depending on the choice dimension, cost or benefit, that has to be learned.
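    The shared computational rule (expectations moved toward outcomes by a fraction of the prediction error) is the classic delta rule. A schematic sketch with made-up outcome values, not the study's task parameters, applying the same update to a reward and an effort expectation:

```python
def delta_update(expectation, outcome, alpha=0.3):
    """Rescorla-Wagner-style update: move the expectation toward the
    observed outcome by a fraction alpha of the prediction error."""
    return expectation + alpha * (outcome - expectation)

# The same rule serves both dimensions: rewards (to be maximized)
# and efforts (a cost, to be minimized). Outcome pairs are invented.
expected_reward, expected_effort = 0.0, 0.0
for reward, effort in [(1.0, 0.2), (0.0, 0.8), (1.0, 0.2)]:
    expected_reward = delta_update(expected_reward, reward)
    expected_effort = delta_update(expected_effort, effort)
# A net option value could then combine them, e.g. reward minus effort.
```

The point of the sketch is that nothing in the update itself distinguishes cost from benefit; per the abstract, the dissociation arises in which brain network carries each expectation.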

  2. D^{1,2}(R^N) versus C(R^N) local minimizer and a Hopf-type maximum principle

    NASA Astrophysics Data System (ADS)

    Carl, Siegfried; Costa, David G.; Tehrani, Hossein

    2016-08-01

    We consider functionals of the form Φ(u) = (1/2)∫_{R^N} |∇u|^2 - ∫_{R^N} b(x)G(u) on D^{1,2}(R^N), N ≥ 3, whose critical points are the weak solutions of a corresponding elliptic equation in the whole of R^N. We present a Brezis-Nirenberg-type result and a Hopf-type maximum principle in the context of the space D^{1,2}(R^N). More precisely, we prove that a local minimizer of Φ in the topology of the subspace V must be a local minimizer of Φ in the D^{1,2}(R^N) topology, where V := {v ∈ D^{1,2}(R^N) : v ∈ C(R^N) with sup_{x∈R^N} (1 + |x|^{N-2})|v(x)| < ∞}. The Brezis-Nirenberg result is well known to be a strong tool in the study of multiple solutions for elliptic boundary value problems in bounded domains. We believe that the result obtained in this paper may play a similar role for elliptic problems in R^N.

  3. Prediction of Metabolic Flux Distribution from Gene Expression Data Based on the Flux Minimization Principle

    PubMed Central

    Song, Hyun-Seob; Reifman, Jaques; Wallqvist, Anders

    2014-01-01

    Prediction of possible flux distributions in a metabolic network provides detailed phenotypic information that links metabolism to cellular physiology. To estimate metabolic steady-state fluxes, the most common approach is to solve a set of macroscopic mass balance equations subjected to stoichiometric constraints while attempting to optimize an assumed optimal objective function. This assumption is justifiable in specific cases but may be invalid when tested across different conditions, cell populations, or other organisms. With an aim to providing a more consistent and reliable prediction of flux distributions over a wide range of conditions, in this article we propose a framework that uses the flux minimization principle to predict active metabolic pathways from mRNA expression data. The proposed algorithm minimizes a weighted sum of flux magnitudes, while biomass production can be bounded to fit an ample range from very low to very high values according to the analyzed context. We have formulated the flux weights as a function of the corresponding enzyme reaction's gene expression value, enabling the creation of context-specific fluxes based on a generic metabolic network. In case studies of wild-type Saccharomyces cerevisiae, and wild-type and mutant Escherichia coli strains, our method achieved high prediction accuracy, as gauged by correlation coefficients and sums of squared error, with respect to the experimentally measured values. In contrast to other approaches, our method was able to provide quantitative predictions for both model organisms under a variety of conditions. Our approach requires no prior knowledge or assumption of a context-specific metabolic functionality and does not require trial-and-error parameter adjustments. Thus, our framework is of general applicability for modeling the transcription-dependent metabolism of bacteria and yeasts. PMID:25397773
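    The optimization at the core of the method (minimize a weighted sum of flux magnitudes subject to steady-state stoichiometry and a biomass bound) is a linear program. A toy sketch on an invented three-reaction network; the reciprocal-expression weighting is a placeholder, not the paper's exact weight function:

```python
from scipy.optimize import linprog

# Invented toy network: uptake A -> B (v1), secretion B -> C (v2),
# biomass drain B -> biomass (v3). Steady state for the single
# internal metabolite B requires v1 - v2 - v3 = 0.
S = [[1.0, -1.0, -1.0]]                  # stoichiometric matrix (1 x 3)
expression = [5.0, 1.0, 1.0]             # made-up mRNA levels per reaction
weights = [1.0 / e for e in expression]  # higher expression -> cheaper flux
bounds = [(0, None), (0, None), (1.0, 1.0)]  # irreversible; biomass fixed at 1

res = linprog(c=weights, A_eq=S, b_eq=[0.0], bounds=bounds, method="highs")
fluxes = res.x  # minimal-weighted-flux distribution meeting biomass demand
```

With the biomass demand fixed, the minimizer routes exactly one flux unit through uptake and biomass and sends nothing to the overflow reaction, since any secretion only adds weighted flux without helping satisfy the constraints.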

  4. Emergent features and perceptual objects: re-examining fundamental principles in analogical display design.

    PubMed

    Holt, Jerred; Bennett, Kevin B; Flach, John M

    2015-01-01

    Two sets of design principles for analogical visual displays, based on the concepts of emergent features and perceptual objects, are described. An interpretation of previous empirical findings for three displays (bar graph, polar graphic, alphanumeric) is provided from both perspectives. A fourth display (configural coordinate) was designed using principles of ecological interface design (i.e. direct perception). An experiment was conducted to evaluate performance (accuracy and latency of state identification) with these four displays. Numerous significant effects were obtained and a clear rank ordering of performance emerged (from best to worst): configural coordinate, bar graph, alphanumeric and polar graphic. These findings are consistent with principles of design based on emergent features; they are inconsistent with principles based on perceptual objects. Some limitations of the configural coordinate display are discussed and a redesign is provided. Practitioner Summary: Principles of ecological interface design, which emphasise the quality of very specific mappings between domain, display and observer constraints, are described; these principles are applicable to the design of all analogical graphical displays. PMID:26218496

  6. Developing a Dynamics and Vibrations Course for Civil Engineering Students Based on Fundamental Principles

    ERIC Educational Resources Information Center

    Barroso, Luciana R.; Morgan, James R.

    2012-01-01

    This paper describes the creation and evolution of an undergraduate dynamics and vibrations course for civil engineering students. Incorporating vibrations into the course allows students to see and study "real" civil engineering applications of the course content. This connection of academic principles to real life situations is in…

  7. The uncertainty threshold principle - Some fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.

  9. Al-Air Batteries: Fundamental Thermodynamic Limitations from First Principles Theory

    NASA Astrophysics Data System (ADS)

    Chen, Leanne D.; Noerskov, Jens K.; Luntz, Alan C.

    2015-03-01

    The Al-air battery possesses high theoretical specific energy (4140 Wh/kg) and is therefore an attractive candidate for vehicle propulsion applications. However, the experimentally observed open-circuit potential is much lower than what thermodynamics predicts, and this potential loss is widely believed to be an effect of corrosion. We present a detailed study of the Al-air battery using density functional theory. The results suggest that the difference between bulk thermodynamic and surface potentials is due to both the effects of asymmetry in multi-electron transfer reactions that define the anodic dissolution of Al and, more importantly, a large chemical step inherent to the formation of bulk Al(OH)3 from surface intermediates. The former results in an energy loss of 3%, while the latter accounts for 14-29% of the total thermodynamic energy depending on the surface site where dissolution occurs. Therefore, the maximum open-circuit potential of the Al anode is only -1.87 V vs. SHE in the absence of thermal excitations, contrary to -2.34 V predicted by bulk thermodynamics at pH 14.6. This is a fundamental limitation of the system and governs the maximum output potential, which cannot be improved even if corrosion effects were completely suppressed. Supported by the Natural Sciences and Engineering Research Council of Canada and the ReLiable Project (#11-116792) funded by the Danish Council for Strategic Research.

  10. Al-Air Batteries: Fundamental Thermodynamic Limitations from First-Principles Theory.

    PubMed

    Chen, Leanne D; Nørskov, Jens K; Luntz, Alan C

    2015-01-01

    The Al-air battery possesses high theoretical specific energy (4140 W h/kg) and is therefore an attractive candidate for vehicle propulsion. However, the experimentally observed open-circuit potential is much lower than what bulk thermodynamics predicts, and this potential loss is typically attributed to corrosion. Similarly, large Tafel slopes associated with the battery are assumed to be due to film formation. We present a detailed thermodynamic study of the Al-air battery using density functional theory. The results suggest that the maximum open-circuit potential of the Al anode is only -1.87 V versus the standard hydrogen electrode at pH 14.6 instead of the traditionally assumed -2.34 V and that large Tafel slopes are inherent in the electrochemistry. These deviations from the bulk thermodynamics are intrinsic to the electrochemical surface processes that define Al anodic dissolution. This has contributions from both asymmetry in multielectron transfers and, more importantly, a large chemical stabilization inherent to the formation of bulk Al(OH)3 from surface intermediates. These are fundamental limitations that cannot be improved even if corrosion and film effects are completely suppressed. PMID:26263108

  11. First Principles Studies of Tapered Silicon Nanowires: Fundamental Insights and Practical Applications

    NASA Astrophysics Data System (ADS)

    Wu, Zhigang

    2008-03-01

    Nanowires (NWs) are often observed experimentally to be tapered rather than straight-edged, with diameters (d) shrinking by as much as 1 nm per 10 nm of vertical growth. Previous theoretical studies have examined the electronic properties of straight-edged nanowires (SNWs), although the effects of tapering on quantum confinement may be of both fundamental and practical importance. We have employed ab initio calculations to study the structural and electronic properties of tapered Si NWs. As one may expect, tapered nanowires (TNWs) possess axially-dependent electronic properties; their local energy gaps vary along the wire axis, with the largest gap occurring at the narrowest point of the wire. In contrast to SNWs, where confinement tends to shift valence bands more than conduction bands away from the bulk gap, the unoccupied states in TNWs are much more sensitive to d than the occupied states. In addition, tapering causes the band-edge states to be spatially separated along the wire axis, a consequence of the interplay between a strong variation in quantum confinement strength with diameter and the tapering-induced charge transfer. This property may be exploited in electronic and optical applications, for example, in photovoltaic devices where the separation of the valence and conduction band states could be used to transport excited charges during the thermalization process. In order to gain insight into TNW photovoltaic properties, we have also carried out calculations of the dipole matrix elements near the band edges as well as the role of metal contacts on TNW electronic properties. Finally, a combination of ab initio total energy calculations and classical molecular dynamics (MD) simulations are employed to suggest a new technique for bringing nanoscale objects together to form ordered, ultra high-aspect ratio nanowires. This work was supported in part by the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.

  12. A Greatly Under-Appreciated Fundamental Principle of Physical Organic Chemistry

    PubMed Central

    Cox, Robin A.

    2011-01-01

    If a species does not have a finite lifetime in the reaction medium, it cannot be a mechanistic intermediate. This principle was first enunciated by Jencks, as the concept of an enforced mechanism. For instance, neither primary nor secondary carbocations have long enough lifetimes to exist in an aqueous medium, so SN1 reactions involving these substrates are not possible, and an SN2 mechanism is enforced. Only tertiary carbocations and those stabilized by resonance (benzyl cations, acylium ions) are stable enough to be reaction intermediates. More importantly, it is now known that neither H3O+ nor HO− exist as such in dilute aqueous solution. Several recent high-level calculations on large proton clusters are unable to localize the positive charge; it is found to be simply “on the cluster” as a whole. The lifetime of any ionized water species is exceedingly short, a few molecular vibrations at most; the best experimental study, using modern IR instrumentation, has the most probable hydrated proton structure as H13O6+, but only an estimated quarter of the protons are present even in this form at any given instant. Thanks to the Grotthuss mechanism of chain transfer along hydrogen bonds, in reality a proton or a hydroxide ion is simply instantly available anywhere it is needed for reaction. Important mechanistic consequences result. Any charged oxygen species (e.g., a tetrahedral intermediate) is also not going to exist long enough to be a reaction intermediate, unless the charge is stabilized in some way, usually by resonance. General acid catalysis is the rule in reactions in concentrated aqueous acids. The Grotthuss mechanism also means that reactions involving neutral water are favored; the solvent is already highly structured, so the entropy involved in bringing several solvent molecules to the reaction center is unimportant. Examples are given. PMID:22272074

  13. Structural phase transitions and fundamental band gaps of MgxZn1-xO alloys from first principles

    SciTech Connect

    Maznichenko, I. V.; Ernst, Arthur; Bouhassoune, M.; Henk, J.; Daene, Markus W; Lueders, Martin; Bruno, Patrick; Wolfram, Hergert; Mertig, I.; Szotek, Zdzislawa; Temmerman, Walter M

    2009-01-01

    The structural phase transitions and the fundamental band gaps of MgxZn1-xO alloys are investigated by detailed first-principles calculations in the entire range of Mg concentrations x, applying a multiple-scattering theoretical approach (Korringa-Kohn-Rostoker method). Disordered alloys are treated within the coherent-potential approximation. The calculations for various crystal phases have given rise to a phase diagram in good agreement with experiments and other theoretical approaches. The phase transition from the wurtzite to the rock-salt structure is predicted at a Mg concentration of x=0.33, which is close to the experimental range of 0.33-0.40. The size of the fundamental band gap, typically underestimated by the local-density approximation, is considerably improved by the self-interaction correction. The increase in the gap upon alloying ZnO with Mg corroborates experimental trends. Our findings are relevant for applications in optical, electrical, and, in particular, magnetoelectric devices.
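
    The composition dependence of an alloy band gap like this is commonly summarized with a bowing parameter. The sketch below illustrates the standard quadratic interpolation; the end-point gaps and the bowing value are placeholders, not numbers from this paper.

```python
def band_gap(x, eg_zno=3.4, eg_mgo=7.8, b=2.0):
    """Quadratic alloy interpolation for Mg(x)Zn(1-x)O: a Vegard-like linear
    term minus a bowing correction b*x*(1-x). All values are placeholders."""
    return (1.0 - x) * eg_zno + x * eg_mgo - b * x * (1.0 - x)

# Gap rises monotonically with Mg content for a modest bowing parameter.
gaps = [band_gap(x) for x in (0.0, 0.25, 0.5, 0.75, 1.0)]
```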

  14. Adhesive restorations, centric relation, and the Dahl principle: minimally invasive approaches to localized anterior tooth erosion.

    PubMed

    Magne, Pascal; Magne, Michel; Belser, Urs C

    2007-01-01

    The purpose of this article is to review biomechanical and occlusal principles that could help optimize the conservative treatment of severely eroded and worn anterior dentition using adhesive restorations. It appears that enamel and dentin bonding, through the combined use of resin composites (on the palatal surface) and indirect porcelain veneers (on the facial/incisal surfaces) can lead to an optimal result from both esthetic and functional/biomechanical aspects. Cases of deep bite combined with palatal erosion and wear can be particularly challenging. A simplified approach is proposed through the use of an occlusal therapy combining centric relation and the Dahl principle to create anterior interocclusal space to reduce the need for more invasive palatal reduction. This approach allows the ultraconservative treatment of localized anterior tooth erosion and wear.

  15. Quantum statistical entropy and minimal length of 5D Ricci-flat black string with generalized uncertainty principle

    SciTech Connect

    Liu Molin; Gui Yuanxing; Liu Hongya

    2008-12-15

    In this paper, we study the quantum statistical entropy in a 5D Ricci-flat black string solution, which contains a 4D Schwarzschild-de Sitter black hole on the brane, by using the improved thin-layer method with the generalized uncertainty principle. The entropy is the linear sum of the areas of the event horizon and the cosmological horizon, without any cutoff or any constraint on the bulk's configuration, in contrast to the usual uncertainty principle. The system's density of states and free energy are convergent in the neighborhood of the horizons. The small-mass approximation is determined by the asymptotic behavior of the metric function near the horizons. Meanwhile, we obtain the minimal length of the position Δx, which is constrained by the surface gravities and the thickness of the layer near the horizons.

  16. WeBSurg: An innovative educational Web site in minimally invasive surgery--principles and results.

    PubMed

    Mutter, Didier; Vix, Michel; Dallemagne, Bernard; Perretta, Silvana; Leroy, Joël; Marescaux, Jacques

    2011-03-01

    Internet has dramatically changed clinical practice and information sharing among the surgical community and has revolutionized the access to surgical education. High-speed Internet broadcasting allows display of high-quality high-definition full-screen videos. Herein, Internet access to surgical procedures plays a major role in continuing medical education (CME). The WeBSurg Web site is a virtual surgical university dedicated to post-graduate education in minimally invasive surgery. Its results measured through its members, number of visitors coming from 213 different countries, as well as the amount of data transmitted through the provider LimeLight, confirm that WeBSurg appears as the first Web site in surgical CME. The Internet offers a tailored education for all levels of surgical expertise as well as for all types of Internet access. This represents a global multimedia solution at the cutting edge of technology and surgical evolution, which responds to the modern ethos of "always, anywhere, anytime."

  17. Headache in a high school student – a reminder of fundamental principles of clinical medicine and common pitfalls of cognition

    PubMed Central

    Afghan, Zakira; Hussain, Abid; Asim, Muhammad

    2015-01-01

    Primary headache disorders account for the majority of cases of headache. Nevertheless, the primary objective of a physician encountering a patient with headache is to rule out a secondary cause of the headache. This entails a search for specific associated red-flag symptoms or signs that may indicate a serious condition, as well as a heightened suspicion of, and evaluation for, a 'don't-miss' diagnosis. We present the case of a high-school student whose first manifestation of systemic lupus erythematosus (SLE) was a headache due to cerebral venous and sinus thrombosis, initially misdiagnosed as tension headache and ‘ophthalmoplegic migraine’ (now known as ‘recurrent painful ophthalmoplegic neuropathy’). The patient made a complete neurological and radiological recovery after systemic anticoagulation and treatment of SLE. An analysis of the clinical errors and cognitive biases leading to delayed referral to hospital is presented. We highlight the fact that adherence to the fundamental principles of clinical medicine and enhancement of cognitive awareness are required to reduce diagnostic errors. PMID:26835410

  18. The inactivation principle: mathematical solutions minimizing the absolute work and biological implications for the planning of arm movements.

    PubMed

    Berret, Bastien; Darlot, Christian; Jean, Frédéric; Pozzo, Thierry; Papaxanthis, Charalambos; Gauthier, Jean Paul

    2008-10-01

    An important question in the literature focusing on motor control is to determine which laws drive biological limb movements. This question has prompted numerous investigations analyzing arm movements in both humans and monkeys. Many theories assume that among all possible movements the one actually performed satisfies an optimality criterion. In the framework of optimal control theory, a first approach is to choose a cost function and test whether the proposed model fits with experimental data. A second approach (generally considered as the more difficult) is to infer the cost function from behavioral data. The cost proposed here includes a term called the absolute work of forces, reflecting the mechanical energy expenditure. Contrary to most investigations studying optimality principles of arm movements, this model has the particularity of using a cost function that is not smooth. First, a mathematical theory related to both direct and inverse optimal control approaches is presented. The first theoretical result is the Inactivation Principle, according to which minimizing a term similar to the absolute work implies simultaneous inactivation of agonistic and antagonistic muscles acting on a single joint, near the time of peak velocity. The second theoretical result is that, conversely, the presence of non-smoothness in the cost function is a necessary condition for the existence of such inactivation. Second, during an experimental study, participants were asked to perform fast vertical arm movements with one, two, and three degrees of freedom. Observed trajectories, velocity profiles, and final postures were accurately simulated by the model. In accordance, electromyographic signals showed brief simultaneous inactivation of opposing muscles during movements. Thus, assuming that human movements are optimal with respect to a certain integral cost, the minimization of an absolute-work-like cost is supported by experimental observations. Such types of optimality

  20. Dielectric properties of agricultural products – fundamental principles, influencing factors, and measurement techniques. Chapter 4. Electrotechnologies for Food Processing: Book Series. Volume 3. Radio-Frequency Heating

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this chapter, definitions of dielectric properties, or permittivity, of materials and a brief discussion of the fundamental principles governing their behavior with respect to influencing factors are presented. The basic physics of the influence of frequency of the electric fields and temperatur...

  1. Locomotor control of limb force switches from minimal intervention principle in early adaptation to noise reduction in late adaptation.

    PubMed

    Selgrade, Brian P; Chang, Young-Hui

    2015-03-01

    During movement, errors are typically corrected only if they hinder performance. Preferential correction of task-relevant deviations is described by the minimal intervention principle but has not been demonstrated in the joints during locomotor adaptation. We studied hopping as a tractable model of locomotor adaptation of the joints within the context of a limb-force-specific task space. Subjects hopped while adapting to shifted visual feedback that induced them to increase peak ground reaction force (GRF). We hypothesized subjects would preferentially reduce task-relevant joint torque deviations over task-irrelevant deviations to increase peak GRF. We employed a modified uncontrolled manifold analysis to quantify task-relevant and task-irrelevant joint torque deviations for each individual hop cycle. As would be expected by the explicit goal of the task, peak GRF errors decreased in early adaptation before reaching steady state during late adaptation. Interestingly, during the early adaptation performance improvement phase, subjects reduced GRF errors by decreasing only the task-relevant joint torque deviations. In contrast, during the late adaption performance maintenance phase, all torque deviations decreased in unison regardless of task relevance. In deadaptation, when the shift in visual feedback was removed, all torque deviations decreased in unison, possibly because performance improvement was too rapid to detect changes in only the task-relevant dimension. We conclude that limb force adaptation in hopping switches from a minimal intervention strategy during performance improvement to a noise reduction strategy during performance maintenance, which may represent a general control strategy for locomotor adaptation of limb force in other bouncing gaits, such as running.
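
    The task-relevant/task-irrelevant split used in an uncontrolled-manifold-style analysis is, at its core, a linear projection. The following sketch decomposes a joint-torque deviation into the component that changes the ground reaction force and the component that does not; the Jacobian and deviation values are made-up illustrations, not the paper's limb model.

```python
import numpy as np

# Hypothetical 1x3 Jacobian mapping joint-torque deviations to a peak-GRF
# deviation, and one hop's observed joint-torque deviation (illustrative).
J = np.array([[0.9, 0.6, 0.3]])
dev = np.array([0.20, -0.10, 0.05])

# Projection onto the row space of J gives the task-relevant part; the
# remainder lies in the null space of J (the "uncontrolled manifold") and
# leaves the ground reaction force unchanged.
P_task = J.T @ np.linalg.inv(J @ J.T) @ J
task_relevant = P_task @ dev
task_irrelevant = dev - task_relevant
```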

  2. Lessons that Bear Repeating and Repeating that Bears Lessons: An Interdisciplinary Unit on Principles of Minimalism in Modern Music, Art, and Poetry (Grades 4-8)

    ERIC Educational Resources Information Center

    Smigel, Eric; McDonald, Nan L.

    2012-01-01

    This theory-to-practice article focuses on interdisciplinary classroom activities based on principles of minimalism in modern music, art, and poetry. A lesson sequence was designed for an inner-city Grades 4 and 5 general classroom of English language learners, where the unit was taught, assessed, and documented by the authors. Included in the…

  3. [Human rights and genetics: the fundamental principles of the Universal Declaration on the Human Genome and Human Rights].

    PubMed

    Bergel, S D

    1998-01-01

    The Universal Declaration on the Human Genome and Human Rights sets out generally agreed criteria in response to the human rights challenges posed by advances in molecular biology and genetics. The lynchpin of these criteria is respect for human dignity, a premise from which other principles are derived. The author examines and gives the justification for these principles, and refers to another crucial bioethics text, the recent Council of Europe Convention on the protection of human rights and the dignity of the human person in regard to applications of biology and medicine.

  4. How to increase treatment effectiveness and efficiency in psychiatry: creative psychopharmacotherapy - part 1: definition, fundamental principles and higher effectiveness polypharmacy.

    PubMed

    Jakovljević, Miro

    2013-09-01

    Psychopharmacotherapy is a fascinating field that can be understood in many different ways. It is both a science and an art of communication, with a heavily subjective dimension. The advent of a significant number of effective and well-tolerated mental health medicines during and after the 1990s, the 'decade of the brain', has increased our possibilities to treat major mental disorders more successfully, with much better treatment outcomes including full recovery. However, there is a huge gap between our possibilities for achieving high treatment effectiveness and the unsatisfactory results of day-to-day clinical practice. A creative approach to psychopharmacotherapy could advance everyday clinical practice and bridge this gap. Creative psychopharmacotherapy is a concept that incorporates creativity as its fundamental tool. Creativity involves the intention and ability to transcend limiting traditional ideas, rules, patterns and relationships, and to create meaningful new ideas, interpretations, contexts and methods in clinical psychopharmacology.

  5. First-principle calculations of the fundamental properties of CuBrxI1-x ternary alloy

    NASA Astrophysics Data System (ADS)

    Touam, S.; Boukhtouta, M.; Hamioud, L.; Ghemid, S.; Meradji, H.; El Haj Hassan, F.

    2015-11-01

    The ab initio full-potential linearised augmented plane wave (FP-LAPW) method within density functional theory is applied to study the effect of composition on the structural, electronic and thermodynamic properties of the CuBrxI1-x ternary alloy. The structural properties at equilibrium are investigated by using the new form of generalised gradient approximation that is based on the optimisation of total energy. For band structure calculations, the Engel-Vosko and modified Becke-Johnson schemes for the exchange-correlation energy and potential, respectively, are used. Deviation of the lattice constants from Vegard's law and of the bulk modulus from linear concentration dependence is observed. The microscopic origins of the gap bowing are explained using the approach of Zunger and co-workers. The thermodynamic stability of this alloy is investigated by calculating the excess enthalpy of mixing ∆Hm as well as the phase diagram via the critical temperatures. Finally, numerical first-principles calculations of the elastic constants as a function of pressure are used to compute C11, C12 and C44.

  6. A review of the fundamentals of polymer-modified asphalts: Asphalt/polymer interactions and principles of compatibility.

    PubMed

    Polacco, Giovanni; Filippi, Sara; Merusi, Filippo; Stastna, George

    2015-10-01

    During the last decades, the number of vehicles per citizen as well as the traffic speed and load has dramatically increased. This sudden and somewhat unplanned overloading has strongly shortened the life of pavements and increased the cost of maintenance and risks to users. In order to limit the deterioration of road networks, it is necessary to improve the quality and performance of pavements, which has been achieved through the addition of a polymer to the bituminous binder. Since their introduction, polymer-modified asphalts have gained in importance during the second half of the twentieth century, and they now play a fundamental role in the field of road paving. During high-temperature and high-shear mixing with asphalt, the polymer incorporates asphalt molecules, thereby forming a swollen network that involves the entire binder and results in a significant improvement of the viscoelastic properties in comparison with those of the unmodified binder. Such a process encounters the well-known difficulties related to the poor solubility of polymers, which limits the number of macromolecules able not only to form such a structure but also to maintain it during high-temperature storage in static conditions, which may be necessary before laying the binder. Therefore, polymer-modified asphalts have been the subject of numerous studies aimed at understanding and optimizing their structure and storage stability, which gradually attracted polymer scientists into this field initially explored by civil engineers. The analytical techniques of polymer science have been applied to polymer-modified asphalts, which has resulted in a good understanding of their internal structure. Nevertheless, the complexity and variability of asphalt composition render it nearly impossible to generalize the results and univocally predict the properties of a given polymer/asphalt pair. The aim of this paper is to review these aspects of polymer-modified asphalts.
Together with a brief description of

  8. Fundamental principles of diaphragm meters

    SciTech Connect

    Thomson, J.

    1995-12-01

    A diaphragm meter is a positive displacement instrument used to measure the volume of gas that passes through it. This is accomplished through the known volume that is displaced by each stroke of the diaphragm. The diaphragm also provides the seal between the measuring chambers of the device. As such, the diaphragm meter has proven to be an accurate and reliable means of gas measurement for many years, especially at low flow rates, because of its positive displacement characteristics. This paper includes a brief history of diaphragm meters, an explanation of their operation, and a basic review of the function and design of the positive displacement meter; it also discusses meter ratings and capacity and introduces temperature compensation.

  9. Animal and robot experiments to discover principles behind the evolution of a minimal locomotor apparatus for robust legged locomotion

    NASA Astrophysics Data System (ADS)

    McInroe, Benjamin; Astley, Henry; Kawano, Sandy; Blob, Richard; Goldman, Daniel I.

    2015-03-01

    In the evolutionary transition from an aquatic to a terrestrial environment, early walkers adapted to the challenges of locomotion on complex, flowable substrates (e.g. sand and mud). Our previous biological and robotic studies have demonstrated that locomotion on such substrates is sensitive to both limb morphology and kinematics. Although reconstructions of early vertebrate skeletal morphologies exist, the kinematic strategies required for successful locomotion by these organisms have not yet been explored. To gain insight into how early walkers contended with complex substrates, we developed a robotic model with appendage morphology inspired by a model analog organism, the mudskipper. We tested mudskippers and the robot on different substrates, including rigid ground and dry granular media, varying incline angle. The mudskippers moved effectively on all level substrates using a fin-driven gait. But as incline angle increased, the animals used their tails in concert with their fins to generate propulsion. Adding an actuated tail to the robot improved robustness, making possible locomotion on otherwise inaccessible inclines. With these discoveries, we are elucidating a minimal template that may have allowed the early walkers to adapt to locomotion on land. This work was supported by NSF PoLS.

  10. How do we create a climate literate society? A review of Climate Literacy essential principles and fundamental concepts that ensure climate literate citizens and students

    NASA Astrophysics Data System (ADS)

    Niepold, F.

    2008-05-01

    Through a partnership between the National Oceanic and Atmospheric Administration (NOAA), AAAS Project 2061, and other partners, we have collaborated to define climate literacy and develop weather and climate benchmarks for science literacy. The newly developed and revised national weather and climate science education standards were published in March 2007 in the AAAS Project 2061 Atlas of Science Literacy, volume II. This session will present the results of these projects as well as the publication of "An Abbreviated Guide for Teaching Climate Change." The Climate Literacy effort was carried out in parallel with the Ocean Literacy effort and developed a Framework for Climate Literacy using the AAAS Project 2061 Atlas of Science Literacy. During the development of the climate literacy framework, federal science agencies, formal and informal educators, non-governmental organizations, and other institutions involved with climate research, education, and outreach collaborated to build on the science education benchmarks. This effort resulted in a framework that will be used to engage the broad community in developing a robust conceptual framework addressing the essential principles and fundamental concepts that climate-literate citizens and students should know. That document has been reviewed and commented on during several rounds and is available at http://www.climate.noaa.gov/education/

  11. Critical evaluation of energy intake data using fundamental principles of energy physiology: 1. Derivation of cut-off limits to identify under-recording.

    PubMed

    Goldberg, G R; Black, A E; Jebb, S A; Cole, T J; Murgatroyd, P R; Coward, W A; Prentice, A M

    1991-12-01

    This paper uses fundamental principles of energy physiology to define minimum cut-off limits for energy intake below which a person of a given sex, age and body weight could not live a normal life-style. These have been derived from whole-body calorimeter and doubly-labelled water measurements in a wide range of healthy adults after due statistical allowance for intra- and interindividual variance. The tabulated cut-off limits, which depend on sample size and duration of measurements, identify minimum plausible levels of energy expenditure expressed as a multiple of basal metabolic rate (BMR). CUT-OFF 1 tests whether reported energy intake measurements can be representative of long-term habitual intake. It is set at 1.35 x BMR for cases where BMR has been measured rather than predicted. CUT-OFF 2 tests whether reported energy intakes are a plausible measure of the food consumed during the actual measurement period, and is always more liberal than CUT-OFF 1 since it has to allow for the known measurement imprecision arising from the high level of day-to-day variability in food intake. The cut-off limits can be used to evaluate energy intake data. Results falling below these limits must be recognized as being incompatible with long-term maintenance of energy balance and therefore with long-term survival.
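
    The first cut-off can be applied directly to survey data. Here is a minimal sketch (the function name is mine, and the paper's tabulated sample-size and duration adjustments are not reproduced; only the headline 1.35 x BMR limit for a measured BMR is used):

```python
def passes_cutoff_1(reported_intake, bmr, limit=1.35):
    """CUT-OFF 1: a reported energy intake below limit*BMR cannot be
    representative of long-term habitual intake (BMR measured, same
    units as the reported intake)."""
    return reported_intake >= limit * bmr

flagged = not passes_cutoff_1(1800, 1500)  # ratio 1.2 < 1.35: under-recording
ok = passes_cutoff_1(2200, 1500)           # ratio ~1.47: plausible
```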

  12. Fundamental ecology is fundamental.

    PubMed

    Courchamp, Franck; Dunne, Jennifer A; Le Maho, Yvon; May, Robert M; Thébaud, Christophe; Hochberg, Michael E

    2015-01-01

    The primary reasons for conducting fundamental research are satisfying curiosity, acquiring knowledge, and achieving understanding. Here we develop why we believe it is essential to promote basic ecological research, despite increased impetus for ecologists to conduct and present their research in the light of potential applications. This includes the understanding of our environment, for intellectual, economical, social, and political reasons, and as a major source of innovation. We contend that we should focus less on short-term, objective-driven research and more on creativity and exploratory analyses, quantitatively estimate the benefits of fundamental research for society, and better explain the nature and importance of fundamental ecology to students, politicians, decision makers, and the general public. Our perspective and underlying arguments should also apply to evolutionary biology and to many of the other biological and physical sciences.

  13. Fundamentally updating fundamentals.

    PubMed

    Armstrong, Gail; Barton, Amy

    2013-01-01

    Recent educational research indicates that the six competencies of the Quality and Safety Education for Nurses initiative are best introduced in early prelicensure clinical courses. Content specific to quality and safety has traditionally been covered in senior level courses. This article illustrates an effective approach to using quality and safety as an organizing framework for any prelicensure fundamentals of nursing course. Providing prelicensure students a strong foundation in quality and safety in an introductory clinical course facilitates early adoption of quality and safety competencies as core practice values.

  14. Fundamentals of Diesel Engines.

    ERIC Educational Resources Information Center

    Marine Corps Inst., Washington, DC.

    This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the fundamentals of diesel engine mechanics. Addressed in the three individual units of the course are the following topics: basic principles of diesel mechanics; principles, mechanics, and…

  15. Buridan's Principle

    NASA Astrophysics Data System (ADS)

    Lamport, Leslie

    2012-08-01

    Buridan's principle asserts that a discrete decision based upon input having a continuous range of values cannot be made within a bounded length of time. It appears to be a fundamental law of nature. Engineers aware of it can design devices so that they have an infinitesimal probability of not making a decision quickly enough. Ignorance of the principle could have serious consequences.

  16. Variational Principle for Planetary Interiors

    NASA Astrophysics Data System (ADS)

    Zeng, Li; Jacobsen, Stein B.

    2016-09-01

    In the past few years, the number of confirmed planets has grown above 2000. It is clear that they represent a diversity of structures not seen in our own solar system. In addition to very detailed interior modeling, it is valuable to have a simple analytical framework for describing planetary structures. The variational principle is a fundamental principle in physics, entailing that a physical system follows the trajectory that minimizes its action. It is an alternative to the differential-equation formulation of a physical system. Applying the variational principle to the planetary interior beautifully summarizes the set of differential equations into one, which provides us some insight into the problem. From this principle, a universal mass-radius relation, an estimate of the error propagation from the equation of state to the mass-radius relation, and a form of the virial theorem applicable to planetary interiors are derived.

  17. Fundamentals of fluid lubrication

    NASA Technical Reports Server (NTRS)

    Hamrock, Bernard J.

    1991-01-01

    The aim is to coordinate the topics of design, engineering dynamics, and fluid dynamics in order to aid researchers in the area of fluid film lubrication. The lubrication principles that are covered can serve as a basis for the engineering design of machine elements. The fundamentals of fluid film lubrication are presented clearly so that students who use the book will have confidence in their ability to apply these principles to a wide range of lubrication situations. Some guidance on applying these fundamentals to the solution of engineering problems is also provided.

  18. Homeschooling and Religious Fundamentalism

    ERIC Educational Resources Information Center

    Kunzman, Robert

    2010-01-01

    This article considers the relationship between homeschooling and religious fundamentalism by focusing on their intersection in the philosophies and practices of conservative Christian homeschoolers in the United States. Homeschooling provides an ideal educational setting to support several core fundamentalist principles: resistance to…

  19. Tether fundamentals

    NASA Technical Reports Server (NTRS)

    Carroll, J. A.

    1986-01-01

    Some fundamental aspects of tethers are presented and briefly discussed. The effects of gravity gradients, dumbbell libration in circular orbits, tether control strategies, and impact hazards for tethers are among those fundamentals. Also considered are aerodynamic drag, constraints in momentum transfer applications, and constraints with permanently deployed tethers. The theoretical feasibility of these concepts is reviewed.

  20. Minimal cosmography

    NASA Astrophysics Data System (ADS)

    Piazza, Federico; Schücker, Thomas

    2016-04-01

    The minimal requirement for cosmography—a non-dynamical description of the universe—is a prescription for calculating null geodesics, and time-like geodesics as a function of their proper time. In this paper, we consider the most general linear connection compatible with homogeneity and isotropy, but not necessarily with a metric. A light-cone structure is assigned by choosing a set of geodesics representing light rays. This defines a "scale factor" and a local notion of distance, as that travelled by light in a given proper time interval. We find that the velocities and relativistic energies of free-falling bodies decrease in time as a consequence of cosmic expansion, but at a rate that can be different than that dictated by the usual metric framework. By extrapolating this behavior to photons' redshift, we find that the latter is in principle independent of the "scale factor". Interestingly, redshift-distance relations and other standard geometric observables are modified in this extended framework, in a way that could be experimentally tested. An extremely tight constraint on the model, however, is represented by the blackbody-ness of the cosmic microwave background. Finally, as a check, we also consider the effects of a non-metric connection in a different set-up, namely, that of a static, spherically symmetric spacetime.

  1. A proof-of-principle robot with potential for the development of a hand-held tactile instrument for minimally-invasive artery cross-clamping.

    PubMed

    Pahlavan, Pedram; Najarian, Siamak; Dargahi, Javad; Moini, Majid

    2014-08-01

    One of the most common diseases of the vascular system is abdominal aortic aneurysm (AAA), for which the most definitive treatment is surgery. Minimally invasive aorta surgery is a novel method of surgery performed through small incisions and offers significant advantages, including less pain, shorter hospital stay, faster patient recovery, and less possibility of infection. However, the lack of a sense of touch is the main drawback of this type of aorta surgery: it prevents the surgeon from exactly distinguishing the aorta from its surrounding tissues, which can cause various problems during the aorta cross-clamping process. One of the most important consequences is that it makes cross-clamping the most time-consuming step of aortic repair surgery. The artificial tactile sensing approach is a novel method that can be used in various fields of medicine and, more specifically, in minimally invasive surgeries, where using the tactile sense is not possible. In this paper, considering the present problems during aortic-repair laparoscopy and imitating the movement of surgeons' fingers during aorta cross-clamping, a novel tactile-based artery cross-clamping robot is introduced and its function is evaluated experimentally. It is shown that this new tactile-based artery cross-clamping robot is well capable of dissecting an artery from its adjacent tissues in a short time with acceptable accuracy.

  2. Monte Carlo fundamentals

    SciTech Connect

    Brown, F.B.; Sutton, T.M.

    1996-02-01

    This report is composed of the lecture notes from the first half of a 32-hour graduate-level course on Monte Carlo methods offered at KAPL. These notes, prepared by two of the principal developers of KAPL's RACER Monte Carlo code, cover the fundamental theory, concepts, and practices for Monte Carlo analysis. In particular, a thorough grounding in the basic fundamentals of Monte Carlo methods is presented, including random number generation, random sampling, the Monte Carlo approach to solving transport problems, computational geometry, collision physics, tallies, and eigenvalue calculations. Furthermore, modern computational algorithms for vector and parallel approaches to Monte Carlo calculations are covered in detail, including fundamental parallel and vector concepts, the event-based algorithm, master/slave schemes, parallel scaling laws, and portability issues.
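    The sampling-to-tally pipeline such a course covers can be illustrated with a toy problem (a purely absorbing slab, invented for this sketch and unrelated to the RACER code): sample exponential free-flight distances and tally the fraction of particles that cross without colliding.

```python
import math
import random

def slab_transmission(sigma_t, thickness, n_histories, seed=42):
    """Monte Carlo estimate of uncollided transmission through a purely
    absorbing slab. Analytic answer: exp(-sigma_t * thickness)."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_histories):
        # Sample a free-flight distance from p(s) = sigma_t * exp(-sigma_t * s)
        # via inverse transform; 1 - U keeps the argument of log in (0, 1].
        s = -math.log(1.0 - rng.random()) / sigma_t
        if s > thickness:      # particle crosses the slab without colliding
            transmitted += 1
    return transmitted / n_histories   # the tally: a binomial estimator

estimate = slab_transmission(sigma_t=1.0, thickness=2.0, n_histories=200_000)
exact = math.exp(-2.0)
```

With 200,000 histories the estimator's standard error is below 0.001, so the estimate agrees with the analytic answer to roughly two decimal places.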

  3. A Yoga Strengthening Program Designed to Minimize the Knee Adduction Moment for Women with Knee Osteoarthritis: A Proof-Of-Principle Cohort Study

    PubMed Central

    2015-01-01

    People with knee osteoarthritis may benefit from exercise prescriptions that minimize knee loads in the frontal plane. The primary objective of this study was to determine whether a novel 12-week strengthening program designed to minimize exposure to the knee adduction moment (KAM) could improve symptoms and knee strength in women with symptomatic knee osteoarthritis. A secondary objective was to determine whether the program could improve mobility and fitness, and decrease peak KAM during gait. The tertiary objective was to evaluate the biomechanical characteristics of this yoga program. In particular, we compared the peak KAM during gait with that during yoga postures at baseline. We also compared lower limb normalized mean electromyography (EMG) amplitudes during yoga postures between baseline and follow-up. Primary measures included self-reported pain and physical function (Knee injury and Osteoarthritis Outcome Score) and knee strength (extensor and flexor torques). Secondary measures included mobility (six-minute walk, 30-second chair stand, stair climbing), fitness (submaximal cycle ergometer test), and clinical gait analysis using motion capture synchronized with electromyography and force measurement. Also, KAM and normalized mean EMG amplitudes were collected during yoga postures. Forty-five women over age 50 with symptomatic knee osteoarthritis, consistent with the American College of Rheumatology criteria, enrolled in our 12-week (3 sessions per week) program. Data from 38 were analyzed (six drop-outs; one lost to co-intervention). Participants experienced reduced pain (mean improvement 10.1–20.1 normalized to 100; p<0.001), increased knee extensor strength (mean improvement 0.01 Nm/kg; p = 0.004), and increased flexor strength (mean improvement 0.01 Nm/kg; p = 0.001) at follow-up compared to baseline. Participants improved mobility on the six-minute walk (mean improvement 37.7 m; p<0.001) and 30-second chair stand (mean improvement 1.3; p = 0.006) at

  4. Marketing fundamentals.

    PubMed

    Redmond, W H

    2001-01-01

    This chapter outlines current marketing practice from a managerial perspective. The role of marketing within an organization is discussed in relation to efficiency and adaptation to changing environments. Fundamental terms and concepts are presented in an applied context. The implementation of marketing plans is organized around the four P's of marketing: product (or service), promotion (including advertising), place of delivery, and pricing. These are the tools with which marketers seek to better serve their clients and form the basis for competing with other organizations. Basic concepts of strategic relationship management are outlined. Lastly, alternate viewpoints on the role of advertising in healthcare markets are examined. PMID:11401791

  5. Fundamentals of fossil simulator instructor training

    SciTech Connect

    Not Available

    1984-01-01

    This single-volume, looseleaf text introduces the beginning instructor to fundamental instructor training principles, and then shows how to apply those principles to fossil simulator training. Topics include the fundamentals of classroom instruction, the learning process, course development, and the specifics of simulator training program development.

  6. Fundamentals of Geophysics

    NASA Astrophysics Data System (ADS)

    Lowrie, William

    1997-10-01

    This unique textbook presents a comprehensive overview of the fundamental principles of geophysics. Unlike most geophysics textbooks, it combines both the applied and theoretical aspects of the subject. The author explains complex geophysical concepts using abundant diagrams, a simplified mathematical treatment, and easy-to-follow equations. After placing the Earth in the context of the solar system, he describes each major branch of geophysics: gravitation, seismology, dating, thermal and electrical properties, geomagnetism, paleomagnetism and geodynamics. Each chapter begins with a summary of the basic physical principles, and a brief account of each topic's historical evolution. The book will satisfy the needs of intermediate-level earth science students from a variety of backgrounds, while at the same time preparing geophysics majors for continued study at a higher level.

  7. Healthcare fundamentals.

    PubMed

    Kauk, Justin; Hill, Austin D; Althausen, Peter L

    2014-07-01

    In order for a trauma surgeon to have an intelligent discussion with hospital administrators, healthcare plans, policymakers, or any other physicians, a basic understanding of the fundamentals of healthcare is paramount. It is truly shocking how many surgeons are unable to describe the difference between Medicare and Medicaid or describe how hospitals and physicians get paid. These topics may seem burdensome but they are vital to all business decision making in the healthcare field. The following chapter provides further insight about what we call "the basics" of providing medical care today. Most of the topics presented can be applied to all specialties of medicine. It is broken down into 5 sections. The first section is a brief overview of government programs, their influence on care delivery and reimbursement, and past and future legislation. Section 2 focuses on the compliance, care provision, and privacy statutes that regulate physicians who care for Medicare/Medicaid patient populations. With a better understanding of these obligations, section 3 discusses avenues by which physicians can stay informed of current and pending health policy and provides ways that they can become involved in shaping future legislation. The fourth section changes gears slightly by explaining how the concepts of trade restraint, libel, antitrust legislation, and indemnity relate to physician practice. The fifth, and final, section ties all of components together by describing how physician-hospital alignment can be mutually beneficial in providing patient care under current healthcare policy legislation.

  8. Minimal metabolic pathway structure is consistent with associated biomolecular interactions.

    PubMed

    Bordbar, Aarash; Nagarajan, Harish; Lewis, Nathan E; Latif, Haythem; Ebrahim, Ali; Federowicz, Stephen; Schellenberger, Jan; Palsson, Bernhard O

    2014-01-01

    Pathways are a universal paradigm for functionally describing cellular processes. Even though advances in high-throughput data generation have transformed biology, the core of our biological understanding, and hence data interpretation, is still predicated on human-defined pathways. Here, we introduce an unbiased pathway structure for genome-scale metabolic networks, defined according to principles of parsimony, that does not mimic canonical human-defined textbook pathways. Instead, these minimal pathways better describe multiple independent pathway-associated biomolecular interaction datasets, suggesting a functional organization for metabolism based on parsimonious use of cellular components. We use the inherent predictive capability of these pathways to experimentally discover novel transcriptional regulatory interactions in Escherichia coli metabolism for three transcription factors, effectively doubling the known regulatory roles for Nac and MntR. This study suggests an underlying and fundamental principle in the evolutionary selection of pathway structures; namely, that pathways may be minimal, independent, and segregated. PMID:24987116
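    The parsimony idea can be sketched as a toy flux-balance problem: minimize total flux through a small stoichiometric network subject to steady state and a fixed demand. This is an illustrative reduction, not the authors' genome-scale method; the network, reaction names, and bounds below are invented for the example.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network:  R1: -> A,  R2: A -> B,  R3: A -> B (isozyme),  R4: B -> (demand)
# Rows = metabolites (A, B), columns = reactions (R1..R4).
S = np.array([
    [1, -1, -1,  0],   # metabolite A
    [0,  1,  1, -1],   # metabolite B
])

# Steady state S v = 0, demand flux fixed at 1, all fluxes irreversible.
bounds = [(0, 10), (0, 10), (0, 10), (1, 1)]

# Parsimony: among all flux distributions meeting the demand, pick the one
# with minimal total flux (sum of v).
res = linprog(c=np.ones(4), A_eq=S, b_eq=np.zeros(2), bounds=bounds)

total_flux = res.fun   # R1 carries 1, R2+R3 together carry 1, R4 carries 1
```

Any split between the two parallel A-to-B reactions satisfies the constraints, but the minimal total flux of 3 is achieved only by routing the unit of demand through the network without wasted cycling, which is the intuition behind a parsimony-based pathway definition.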

  9. Minimal metabolic pathway structure is consistent with associated biomolecular interactions

    PubMed Central

    Bordbar, Aarash; Nagarajan, Harish; Lewis, Nathan E; Latif, Haythem; Ebrahim, Ali; Federowicz, Stephen; Schellenberger, Jan; Palsson, Bernhard O

    2014-01-01

    Pathways are a universal paradigm for functionally describing cellular processes. Even though advances in high-throughput data generation have transformed biology, the core of our biological understanding, and hence data interpretation, is still predicated on human-defined pathways. Here, we introduce an unbiased pathway structure for genome-scale metabolic networks, defined according to principles of parsimony, that does not mimic canonical human-defined textbook pathways. Instead, these minimal pathways better describe multiple independent pathway-associated biomolecular interaction datasets, suggesting a functional organization for metabolism based on parsimonious use of cellular components. We use the inherent predictive capability of these pathways to experimentally discover novel transcriptional regulatory interactions in Escherichia coli metabolism for three transcription factors, effectively doubling the known regulatory roles for Nac and MntR. This study suggests an underlying and fundamental principle in the evolutionary selection of pathway structures; namely, that pathways may be minimal, independent, and segregated. PMID:24987116

  10. Commentary: Minimizing Evaluation Misuse as Principled Practice

    ERIC Educational Resources Information Center

    Cousins, J. Bradley

    2004-01-01

    "Ethical Challenges," in my experience, is invariably interesting, often instructive and sometimes amusing. Some of the most engaging stimulus scenarios raise thorny evaluation practice issues that ultimately lead to disparate points of view about the nature of the issue and how to handle it (Datta, 2002; Smith, 2002). Despite my poor performance…

  11. Does osteoderm growth follow energy minimization principles?

    PubMed

    Sensale, Sebastián; Jones, Washington; Blanco, R Ernesto

    2014-08-01

    Although the growth and development of tissues and organs of extinct species cannot be directly observed, their fossils can record and preserve evidence of these mechanisms. It is generally accepted that bone architecture is the result of genetically based biomechanical constraints, but what about osteoderms? In this article, the influence of physical constraints on cranial osteoderm growth is assessed. Comparisons among lepidosaurs, synapsids, and archosaurs are performed; according to these analyses, lepidosaur osteoderm growth is predicted to be less energy demanding than that of synapsids and archosaurs. The results also show that, from an energetic viewpoint, ankylosaurid osteoderm growth more closely resembles that of mammals than that of reptiles, adding evidence to the debate over whether dinosaurs were hot- or cold-blooded. PMID:24634089

  12. Does osteoderm growth follow energy minimization principles?

    PubMed

    Sensale, Sebastián; Jones, Washington; Blanco, R Ernesto

    2014-08-01

    Although the growth and development of tissues and organs of extinct species cannot be directly observed, their fossils can record and preserve evidence of these mechanisms. It is generally accepted that bone architecture is the result of genetically based biomechanical constraints, but what about osteoderms? In this article, the influence of physical constraints on cranial osteoderm growth is assessed. Comparisons among lepidosaurs, synapsids, and archosaurs are performed; according to these analyses, lepidosaur osteoderm growth is predicted to be less energy demanding than that of synapsids and archosaurs. The results also show that, from an energetic viewpoint, ankylosaurid osteoderm growth more closely resembles that of mammals than that of reptiles, adding evidence to the debate over whether dinosaurs were hot- or cold-blooded.

  13. Fundamentals of Environmental Education. Report.

    ERIC Educational Resources Information Center

    1976

    An outline of fundamental definitions, relationships, and human responsibilities related to environment provides a basis from which a variety of materials, programs, and activities can be developed. The outline can be used in elementary, secondary, higher education, or adult education programs. The framework is based on principles of the science…

  14. Maximum Entropy Fundamentals

    NASA Astrophysics Data System (ADS)

    Harremoeës, P.; Topsøe, F.

    2001-09-01

    In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions, and it appears advantageous to bring information-theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second named author in the late seventies and is the view we will follow up on here. It leads to the consideration of a certain game, the Code Length Game, and, via standard game-theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of the fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model, which involves a moment constraint related to a given function, here taken to represent "energy". This type of application is very well known from the literature, with hundreds of applications pertaining to several different fields, and will also serve here as an important illustration of the theory. But our approach reaches further, especially regarding the study of continuity properties of the entropy function, and this leads to new results which allow a discussion of models with so-called entropy loss. These results have tempted us to speculate over the development of natural
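    The Mean Energy Model mentioned above can be sketched numerically: over finitely many states, the maximum-entropy distribution under a mean-energy constraint takes the Gibbs form p_i proportional to exp(-beta * E_i), with beta fixed by the constraint. The energies and target mean below are arbitrary illustrative values.

```python
import math

def maxent_distribution(energies, target_mean, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution over finite states with a mean-energy
    constraint, found by bisecting on the Lagrange multiplier beta."""
    def mean_energy(beta):
        e0 = min(energies)                       # shift for numerical stability
        w = [math.exp(-beta * (e - e0)) for e in energies]
        z = sum(w)
        return sum(wi * e for wi, e in zip(w, energies)) / z

    # mean_energy(beta) decreases monotonically in beta, so bisection works.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_energy(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    e0 = min(energies)
    w = [math.exp(-beta * (e - e0)) for e in energies]
    z = sum(w)
    return [wi / z for wi in w], beta

energies = [0.0, 1.0, 2.0]
p, beta = maxent_distribution(energies, target_mean=0.5)
```

Among all distributions on these three states with mean energy 0.5, the returned Gibbs distribution has the largest entropy; a positive beta tilts probability toward low-energy states, as expected when the target mean sits below the uniform average.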

  15. Minimal Reduplication

    ERIC Educational Resources Information Center

    Kirchner, Jesse Saba

    2010-01-01

    This dissertation introduces Minimal Reduplication, a new theory and framework within generative grammar for analyzing reduplication in human language. I argue that reduplication is an emergent property in multiple components of the grammar. In particular, reduplication occurs independently in the phonology and syntax components, and in both cases…

  16. The minimal work cost of information processing.

    PubMed

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-07

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
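    The abstract's formula (minimal work given by the entropy of the discarded information conditional on the computation's output, in units of kT ln 2) can be evaluated for a concrete logical process. The AND-gate example and the temperature below are our own illustrative choices, not taken from the paper.

```python
import math
from collections import defaultdict

def conditional_entropy_bits(joint):
    """H(X|Y) in bits for a joint distribution given as {(x, y): p}."""
    py = defaultdict(float)
    for (x, y), p in joint.items():
        py[y] += p
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            h -= p * math.log2(p / py[y])   # p(x,y) * log2 p(x|y)
    return h

# Discarding the inputs of an AND gate while keeping its output:
# uniform inputs (x1, x2), output y = x1 AND x2.
joint = {((x1, x2), x1 & x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

h_cond = conditional_entropy_bits(joint)   # = 0.75 * log2(3) bits
# Minimal work to discard the inputs, per the abstract's formula, at T = 300 K:
k_B = 1.380649e-23
w_min = k_B * 300.0 * math.log(2) * h_cond
```

When the output is 1 the inputs are fully determined, so only the three input patterns behind output 0 contribute uncertainty; the bound is correspondingly below the 2 bits that full erasure of both inputs would cost.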

  17. The minimal work cost of information processing

    PubMed Central

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-01-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional to the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics. PMID:26151678

  18. Fundamentals in Nuclear Physics

    NASA Astrophysics Data System (ADS)

    Basdevant, Jean-Louis, Rich, James, Spiro, Michael

    This course on nuclear physics leads the reader to the exploration of the field from nuclei to astrophysical issues. Much nuclear phenomenology can be understood from simple arguments such as those based on the Pauli principle and the Coulomb barrier. This book is concerned with extrapolating from such arguments and illustrating nuclear systematics with experimental data. Starting with the basic concepts in nuclear physics, nuclear models, and reactions, the book covers nuclear decays and the fundamental electro-weak interactions, radioactivity, and nuclear energy. After the discussions of fission and fusion leading into nuclear astrophysics, there is a presentation of the latest ideas about cosmology. As a primer this course will lay the foundations for more specialized subjects. This book emerged from a series of topical courses the authors delivered at the Ecole Polytechnique and will be useful for graduate students and for scientists in a variety of fields.

  19. Evolutionary principles and their practical application

    PubMed Central

    Hendry, Andrew P; Kinnison, Michael T; Heino, Mikko; Day, Troy; Smith, Thomas B; Fitt, Gary; Bergstrom, Carl T; Oakeshott, John; Jørgensen, Peter S; Zalucki, Myron P; Gilchrist, George; Southerton, Simon; Sih, Andrew; Strauss, Sharon; Denison, Robert F; Carroll, Scott P

    2011-01-01

    Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently in these different fields, even though the underlying fundamental concepts are the same. We explore these fundamental concepts under four main themes: variation, selection, connectivity, and eco-evolutionary dynamics. Within each theme, we present several key evolutionary principles and illustrate their use in addressing applied problems. We hope that the resulting primer of evolutionary concepts and their practical utility helps to advance a unified multidisciplinary field of applied evolutionary biology. PMID:25567966

  20. Evolutionary principles and their practical application.

    PubMed

    Hendry, Andrew P; Kinnison, Michael T; Heino, Mikko; Day, Troy; Smith, Thomas B; Fitt, Gary; Bergstrom, Carl T; Oakeshott, John; Jørgensen, Peter S; Zalucki, Myron P; Gilchrist, George; Southerton, Simon; Sih, Andrew; Strauss, Sharon; Denison, Robert F; Carroll, Scott P

    2011-03-01

    Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently in these different fields, even though the underlying fundamental concepts are the same. We explore these fundamental concepts under four main themes: variation, selection, connectivity, and eco-evolutionary dynamics. Within each theme, we present several key evolutionary principles and illustrate their use in addressing applied problems. We hope that the resulting primer of evolutionary concepts and their practical utility helps to advance a unified multidisciplinary field of applied evolutionary biology.

  1. Criticism of generally accepted fundamentals and methodologies of traffic and transportation theory: A brief review

    NASA Astrophysics Data System (ADS)

    Kerner, Boris S.

    2013-11-01

    It is explained why the set of the fundamental empirical features of traffic breakdown (a transition from free flow to congested traffic) should be the empirical basis for any traffic and transportation theory that can be reliably used for control and optimization in traffic networks. It is shown that the generally accepted fundamentals and methodologies of the traffic and transportation theory are not consistent with the set of the fundamental empirical features of traffic breakdown at a highway bottleneck. To these fundamentals and methodologies of the traffic and transportation theory belong (i) Lighthill-Whitham-Richards (LWR) theory, (ii) the General Motors (GM) model class (for example, Herman, Gazis et al. GM model, Gipps’s model, Payne’s model, Newell’s optimal velocity (OV) model, Wiedemann’s model, Bando et al. OV model, Treiber’s IDM, Krauß’s model), (iii) the understanding of highway capacity as a particular (fixed or stochastic) value, and (iv) principles for traffic and transportation network optimization and control (for example, Wardrop’s user equilibrium (UE) and system optimum (SO) principles). As an alternative to these generally accepted fundamentals and methodologies of the traffic and transportation theory, we discuss the three-phase traffic theory as the basis for traffic flow modeling as well as briefly consider the network breakdown minimization (BM) principle for the optimization of traffic and transportation networks with road bottlenecks.
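    The Wardrop UE and SO principles mentioned in item (iv) can be made concrete with Pigou's classic two-route example (an illustrative toy, unrelated to Kerner's BM principle).

```python
# Pigou's two-route example: one unit of demand is split as x on route 1
# (latency x, congestion-dependent) and 1 - x on route 2 (constant latency 1).

def total_cost(x):
    """System-wide travel cost for split x."""
    return x * x + (1 - x) * 1.0

# User equilibrium (UE): no driver can improve by switching, so the latencies
# of used routes are equal. Here l1(x) = x reaches l2 = 1 only at x = 1.
x_ue = 1.0

# System optimum (SO): minimize total cost. A grid search keeps the sketch
# dependency-free; calculus gives x = 0.5 exactly.
x_so = min((i / 10000 for i in range(10001)), key=total_cost)

cost_ue = total_cost(x_ue)   # 1.0
cost_so = total_cost(x_so)   # 0.75
```

The gap between the two costs (a price-of-anarchy ratio of 4/3 in this example) is exactly why UE-based and SO-based network optimization can prescribe different flow assignments.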

  2. Criticism of generally accepted fundamentals and methodologies of traffic and transportation theory

    SciTech Connect

    Kerner, Boris S.

    2015-03-10

    It is explained why the set of the fundamental empirical features of traffic breakdown (a transition from free flow to congested traffic) should be the empirical basis for any traffic and transportation theory that can be reliably used for control and optimization in traffic networks. It is shown that generally accepted fundamentals and methodologies of traffic and transportation theory are not consistent with the set of the fundamental empirical features of traffic breakdown at a highway bottleneck. To these fundamentals and methodologies of traffic and transportation theory belong (i) Lighthill-Whitham-Richards (LWR) theory, (ii) the General Motors (GM) model class (for example, Herman, Gazis et al. GM model, Gipps’s model, Payne’s model, Newell’s optimal velocity (OV) model, Wiedemann’s model, Bando et al. OV model, Treiber’s IDM, Krauß’s model), (iii) the understanding of highway capacity as a particular stochastic value, and (iv) principles for traffic and transportation network optimization and control (for example, Wardrop’s user equilibrium (UE) and system optimum (SO) principles). Alternatively to these generally accepted fundamentals and methodologies of traffic and transportation theory, we discuss three-phase traffic theory as the basis for traffic flow modeling as well as briefly consider the network breakdown minimization (BM) principle for the optimization of traffic and transportation networks with road bottlenecks.

  3. Testing Our Fundamental Assumptions

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests! Explaining Different Arrival Times: [Artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones] Suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. This difference in arrival times could be due to several different factors, depending on how deeply you want to question some of our fundamental assumptions about physics. (1) Intrinsic delay: the photons may simply have been emitted at two different times by the astrophysical source. (2) Delay due to Lorentz invariance violation: perhaps the assumption that all massless particles (even two photons with different energies) move at the exact same velocity in a vacuum is incorrect. (3) Special-relativistic delay: maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong. This, too, would cause photon velocities to be energy-dependent. (4) Delay due to gravitational potential: perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, also causing different flight times for photons of different energies. This would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect. If we now turn this problem around, then by measuring the arrival time delay between photons of different energies from various astrophysical sources (the further away, the better) we can provide constraints on these
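
    The special-relativistic delay mentioned above has a simple leading-order form: a photon with rest energy m*c^2 << E would travel at v(E) = c * sqrt(1 - (m*c^2/E)^2), so the lower-energy photon lags by roughly delta_t ~ (D / (2c)) * ((m*c^2/E1)^2 - (m*c^2/E2)^2). A minimal sketch with purely illustrative numbers (the distance, rest energy, and photon energies below are assumptions, not values from the cited studies):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def mass_delay(distance_m, rest_energy_ev, e1_ev, e2_ev):
    """Leading-order extra flight time (seconds) of the photon with energy e1
    relative to the one with energy e2, for a hypothetical photon rest energy
    m*c^2 << E:  delta_t ~ (D / 2c) * ((m c^2/E1)^2 - (m c^2/E2)^2).
    Only energy ratios enter, so any consistent energy unit works."""
    return distance_m / (2.0 * C) * ((rest_energy_ev / e1_ev) ** 2
                                     - (rest_energy_ev / e2_ev) ** 2)

D = 1e25                     # source distance, m (~320 Mpc, assumed)
M_C2 = 1e-18                 # hypothetical photon rest energy, eV (assumed)
E_RADIO, E_OPT = 1e-6, 1.0   # photon energies, eV (assumed)

lag = mass_delay(D, M_C2, E_RADIO, E_OPT)
print(lag)  # seconds; the low-energy photon arrives later
```

    The delay grows linearly with distance, which is why the most distant transients give the tightest constraints.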

  4. Generating minimal living systems from non-living materials and increasing their evolutionary abilities.

    PubMed

    Rasmussen, Steen; Constantinescu, Adi; Svaneborg, Carsten

    2016-08-19

    We review lessons learned about evolutionary transitions from a bottom-up construction of minimal life. We use a particular systemic protocell design process as a starting point for exploring two fundamental questions: (i) how may minimal living systems emerge from non-living materials? and (ii) how may minimal living systems support increasingly more evolutionary richness? Under (i), we present what has been accomplished so far and discuss the remaining open challenges and their possible solutions. Under (ii), we present a design principle we have used successfully both for our computational and experimental protocellular investigations, and we conjecture how this design principle can be extended for enhancing the evolutionary potential for a wide range of systems. This article is part of the themed issue 'The major synthetic evolutionary transitions'. PMID:27431518

  5. Periodic minimal surfaces

    NASA Astrophysics Data System (ADS)

    Mackay, Alan L.

    1985-04-01

    A minimal surface is one for which, like a soap film with the same pressure on each side, the mean curvature is zero and, thus, is one where the two principal curvatures are equal and opposite at every point. For every closed circuit in the surface, the area is a minimum. Schwarz [1] and Neovius [2] showed that elements of such surfaces could be put together to give surfaces periodic in three dimensions. These periodic minimal surfaces are geometrical invariants, as are the regular polyhedra, but the former are curved. Minimal surfaces are appropriate for the description of various structures where internal surfaces are prominent and seek to adopt a minimum area or a zero mean curvature subject to their topology; thus they merit more complete numerical characterization. There seem to be at least 18 such surfaces [3], with various symmetries and topologies, related to the crystallographic space groups. Recently, glyceryl mono-oleate (GMO) was shown by Longley and McIntosh [4] to take the shape of the F-surface. The structure postulated is shown here to be in good agreement with an analysis of the fundamental geometry of periodic minimal surfaces.
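
    The best-known of these triply periodic minimal surfaces, the Schwarz P surface, is commonly approximated by the nodal (level-set) surface cos x + cos y + cos z = 0. The sketch below only illustrates the triple periodicity of that approximation and locates one point on it; the nodal surface approximates, but is not identical to, the true minimal surface, and nothing here is taken from the article:

```python
import math

# Nodal approximation to the Schwarz P surface: the zero set of
#     f(x, y, z) = cos x + cos y + cos z.
def f(x, y, z):
    return math.cos(x) + math.cos(y) + math.cos(z)

P = 2.0 * math.pi  # lattice period in each of the three directions

# At y = z = pi/2 the surface crosses the x-axis where cos x = 0;
# locate that crossing by bisection on g(x) = f(x, pi/2, pi/2).
def g(x):
    return f(x, math.pi / 2.0, math.pi / 2.0)

lo, hi = 0.0, math.pi  # g(0) = 1 > 0 and g(pi) = -1 < 0 bracket a root
for _ in range(80):
    mid = 0.5 * (lo + hi)
    if g(lo) * g(mid) <= 0.0:
        hi = mid
    else:
        lo = mid
root = 0.5 * (lo + hi)

print(root)  # close to pi/2, a point on the (approximate) surface
```

    The same nodal-surface trick extends to other crystallographic surfaces (D, G, ...) by changing the trigonometric level-set function.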

  6. The ZOOM minimization package

    SciTech Connect

    Fischler, Mark S.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.
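
    The abstract does not show ZOOM's interface, so the following is only a hedged sketch of the kind of task such a package addresses: a minimal object-oriented, derivative-free minimizer (golden-section search in one dimension). The class and method names are hypothetical and are not ZOOM's actual API:

```python
import math

# Hypothetical minimal minimizer in the spirit of an extensible OO design;
# names are illustrative only, not ZOOM's API.
class GoldenSectionMinimizer:
    """Derivative-free 1-D minimizer over a bracketing interval [lo, hi]."""

    def __init__(self, tol=1e-10, max_iter=200):
        self.tol = tol
        self.max_iter = max_iter

    def minimize(self, func, lo, hi):
        invphi = (math.sqrt(5.0) - 1.0) / 2.0  # inverse golden ratio
        a, b = lo, hi
        c, d = b - invphi * (b - a), a + invphi * (b - a)
        for _ in range(self.max_iter):
            if abs(b - a) < self.tol:
                break
            if func(c) < func(d):   # minimum lies in [a, d]
                b, d = d, c
                c = b - invphi * (b - a)
            else:                   # minimum lies in [c, b]
                a, c = c, d
                d = a + invphi * (b - a)
        return 0.5 * (a + b)

# Minimize a shifted parabola; the minimum is at x = 2.
minimizer = GoldenSectionMinimizer()
x_min = minimizer.minimize(lambda x: (x - 2.0) ** 2 + 1.0, -10.0, 10.0)
print(x_min)  # close to 2.0
```

    Packaging the strategy as a class, as here, is what makes such a design extensible: alternative algorithms can be swapped in behind the same minimize() interface.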

  9. Fundamentals of Pharmacogenetics in Personalized, Precision Medicine.

    PubMed

    Valdes, Roland; Yin, DeLu Tyler

    2016-09-01

    This article introduces fundamental principles of pharmacogenetics as applied to personalized and precision medicine. Pharmacogenetics establishes relationships between pharmacology and genetics by connecting phenotypes and genotypes in predicting the response of therapeutics in individual patients. We describe differences between precision and personalized medicine and relate principles of pharmacokinetics and pharmacodynamics to applications in laboratory medicine. We also review basic principles of pharmacogenetics, including its evolution, how it enables the practice of personalized therapeutics, and the role of the clinical laboratory. These fundamentals are a segue for understanding specific clinical applications of pharmacogenetics described in subsequent articles in this issue. PMID:27514461

  10. Fundamentals of the Control of Gas-Turbine Power Plants for Aircraft. Part 2; Principles of Control Common to Jet, Turbine-Propeller Jet, and Ducted-Fan Jet Power Plants

    NASA Technical Reports Server (NTRS)

    Kuehl, H.

    1947-01-01

    After defining the aims and requirements to be set for a control system of gas-turbine power plants for aircraft, the report will deal with devices that prevent the quantity of fuel supplied per unit of time from exceeding the value permissible at a given moment. The general principles of the actuation of the adjustable parts of the power plant are also discussed.

  11. Fundamentals of Cryogenics

    NASA Technical Reports Server (NTRS)

    Johnson, Wesley; Tomsik, Thomas; Moder, Jeff

    2014-01-01

    Analysis of the extreme conditions encountered in cryogenic systems demands great effort from analysts and engineers. Due to the costs and complexity associated with the extremely cold temperatures involved, testing is sometimes minimized and extra analysis is often relied upon. This short course is designed as an introduction to cryogenic engineering and analysis; it is intended to introduce the basic concepts related to cryogenic analysis and testing, as well as to help the analyst understand the impacts of various requests on a test facility. Discussion will revolve around operational functions often found in cryogenic systems, hardware for both tests and facilities, and what design or modeling tools are available for performing the analysis. Emphasis will be placed on which hardware and analysis tools to use in which scenarios to get the desired results. The class will provide a review of first principles, engineering practices, and those relations directly applicable to this subject, including such topics as cryogenic fluids, thermodynamics and heat transfer, material properties at low temperature, insulation, cryogenic equipment, instrumentation, refrigeration, testing of cryogenic systems, cryogenic safety, and typical thermal and fluid analyses used by the engineer. The class will provide references for further learning on various topics in cryogenics for those who want to dive deeper into the subject or have encountered specific problems.

  12. Towards a Minimal System for Cell Division

    NASA Astrophysics Data System (ADS)

    Schwille, Petra

    We have entered the "omics" era of the life sciences, meaning that our general knowledge about biological systems has become vast, complex, and almost impossible to fully comprehend. Consequently, the challenge for quantitative biology and biophysics is to identify appropriate procedures and protocols that allow the researcher to strip down the complexity of a biological system to a level that can be reliably modeled but still retains the essential features of its "real" counterpart. The virtue of physics has always been the reductionist approach, which allowed scientists to identify the underlying basic principles of seemingly complex phenomena, and subject them to rigorous mathematical treatment. Biological systems are obviously among the most complex phenomena we can think of, and it is fair to state that our rapidly increasing knowledge does not make it easier to identify a small set of fundamental principles of the big concept of "life" that can be defined and quantitatively understood. Nevertheless, it is becoming evident that only by tight cooperation and interdisciplinary exchange between the life sciences and quantitative sciences, and by applying intelligent reductionist approaches also to biology, will we be able to meet the intellectual challenges of the twenty-first century. These include not only the collection and proper categorization of the data, but also their true understanding and harnessing such that we can solve important practical problems imposed by medicine or the worldwide need for new energy sources. Many of these approaches are reflected by the modern buzz word "synthetic biology", therefore I briefly discuss this term in the first section. Further, I outline some endeavors of our and other groups to model minimal biological systems, with particular focus on the possibility of generating a minimal system for cell division.

  13. Basic Comfort Heating Principles.

    ERIC Educational Resources Information Center

    Dempster, Chalmer T.

    The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background…

  14. Bernoulli's Principle

    ERIC Educational Resources Information Center

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  15. Plasma Modeling Enabled Technology Development Empowered by Fundamental Scattering Data

    NASA Astrophysics Data System (ADS)

    Kushner, Mark J.

    2016-05-01

    Technology development increasingly relies on modeling to speed the innovation cycle. This is particularly true for systems using low temperature plasmas (LTPs) and their role in enabling energy efficient processes with minimal environmental impact. In the innovation cycle, LTP modeling supports investigation of fundamental processes that seed the cycle, optimization of newly developed technologies, and prediction of performance of unbuilt systems for new applications. Although proof-of-principle modeling may be performed for idealized systems in simple gases, technology development must address physically complex systems that use complex gas mixtures that now may be multi-phase (e.g., in contact with liquids). The variety of fundamental electron and ion scattering, and radiation transport data (FSRD) required for this modeling increases as the innovation cycle progresses, while the accuracy required of that data depends on the intended outcome. In all cases, the fidelity, depth and impact of the modeling depends on the availability of FSRD. Modeling and technology development are, in fact, empowered by the availability and robustness of FSRD. In this talk, examples of the impact of and requirements for FSRD in the innovation cycle enabled by plasma modeling will be discussed using results from multidimensional and global models. Examples of fundamental studies and technology optimization will focus on microelectronics fabrication and on optically pumped lasers. Modeling of systems as yet unbuilt will address the interaction of atmospheric pressure plasmas with liquids. Work supported by DOE Office of Fusion Energy Science and the National Science Foundation.

  16. Toward systematic integration between self-determination theory and motivational interviewing as examples of top-down and bottom-up intervention development: autonomy or volition as a fundamental theoretical principle.

    PubMed

    Vansteenkiste, Maarten; Williams, Geoffrey C; Resnicow, Ken

    2012-03-02

    Clinical interventions can be developed through two distinct pathways. In the first, which we call top-down, a well-articulated theory drives the development of the intervention, whereas in a bottom-up approach, clinical experience, more so than a dedicated theoretical perspective, drives the intervention. Using this dialectic, this paper discusses Self-Determination Theory (SDT) [1,2] and Motivational Interviewing (MI) [3] as prototypical examples of top-down and bottom-up approaches, respectively. We sketch the different starting points, foci, and developmental processes of SDT and MI, but equally note the complementary character and the potential for systematic integration between both approaches. Nevertheless, for a deeper integration to take place, we contend that MI researchers might want to embrace autonomy as a fundamental basic process underlying therapeutic change, and we discuss the advantages of doing so.

  17. Combustion Fundamentals Research

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Increased emphasis is placed on fundamental and generic research at Lewis Research Center with less systems development efforts. This is especially true in combustion research, where the study of combustion fundamentals has grown significantly in order to better address the perceived long term technical needs of the aerospace industry. The main thrusts for this combustion fundamentals program area are as follows: analytical models of combustion processes, model verification experiments, fundamental combustion experiments, and advanced numeric techniques.

  18. Exchange Rates and Fundamentals.

    ERIC Educational Resources Information Center

    Engel, Charles; West, Kenneth D.

    2005-01-01

    We show analytically that in a rational expectations present-value model, an asset price manifests near-random walk behavior if fundamentals are I (1) and the factor for discounting future fundamentals is near one. We argue that this result helps explain the well-known puzzle that fundamental variables such as relative money supplies, outputs,…

  19. Reconstruction of fundamental SUSY parameters

    SciTech Connect

    P. M. Zerwas et al.

    2003-09-25

    We summarize methods and expected accuracies in determining the basic low-energy SUSY parameters from experiments at future e{sup +}e{sup -} linear colliders in the TeV energy range, combined with results from LHC. In a second step we demonstrate how, based on this set of parameters, the fundamental supersymmetric theory can be reconstructed at high scales near the grand unification or Planck scale. These analyses have been carried out for minimal supergravity [confronted with GMSB for comparison], and for a string effective theory.

  20. Radiometric calibration by rank minimization.

    PubMed

    Lee, Joon-Young; Matsushita, Yasuyuki; Shi, Boxin; Kweon, In So; Ikeuchi, Katsushi

    2013-01-01

    We present a robust radiometric calibration framework that capitalizes on the transform invariant low-rank structure in the various types of observations, such as sensor irradiances recorded from a static scene with different exposure times, or linear structure of irradiance color mixtures around edges. We show that various radiometric calibration problems can be treated in a principled framework that uses a rank minimization approach. This framework provides a principled way of solving radiometric calibration problems in various settings. The proposed approach is evaluated using both simulation and real-world datasets and shows superior performance to previous approaches.
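
    As a hedged illustration of the low-rank structure the paper exploits (a toy model, not the authors' algorithm): for a static scene, linearized sensor irradiance scales with exposure time, so the pixel-by-exposure observation matrix has rank 1, whereas a non-linear camera response inflates its rank. Note that a pure power-law (gamma) response would preserve the rank-1 structure, since (r*t)^gamma factorizes into r^gamma * t^gamma, so a saturating response is used below; all values are synthetic:

```python
# Toy model (assumed, not the paper's method): pixels of a static scene
# observed under several exposure times.
scene = [0.2, 0.5, 0.9]        # per-pixel scene radiances (synthetic)
exposures = [1.0, 2.0, 4.0]    # exposure times (synthetic)

# Linear sensor: observation m[i][j] = radiance_i * exposure_j -> rank 1.
linear = [[r * t for t in exposures] for r in scene]

# A saturating (non-power-law) response breaks the rank-1 structure.
def saturate(x):
    return x / (1.0 + x)

nonlinear = [[saturate(r * t) for t in exposures] for r in scene]

def max_minor(m):
    """Largest absolute 2x2 minor; it is zero iff the matrix has rank <= 1."""
    best = 0.0
    rows, cols = len(m), len(m[0])
    for i in range(rows):
        for j in range(i + 1, rows):
            for p in range(cols):
                for q in range(p + 1, cols):
                    best = max(best,
                               abs(m[i][p] * m[j][q] - m[i][q] * m[j][p]))
    return best

print(max_minor(linear), max_minor(nonlinear))  # ~0 vs clearly nonzero
```

    A rank-minimization approach, loosely speaking, searches for the inverse response that restores this low-rank structure in the observations.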

  1. Itch Management: General Principles.

    PubMed

    Misery, Laurent

    2016-01-01

    Like pain, itch is a challenging condition that needs to be managed. Within this setting, the first principle of itch management is to get an appropriate diagnosis to perform an etiology-oriented therapy. In several cases it is not possible to treat the cause, the etiology is undetermined, there are several causes, or the etiological treatment is not effective enough to alleviate itch completely. This is also why there is need for symptomatic treatment. In all patients, psychological support and associated pragmatic measures might be helpful. General principles and guidelines are required, yet patient-centered individual care remains fundamental. PMID:27578069

  3. A Fundamental Theorem on Particle Acceleration

    SciTech Connect

    Xie, Ming

    2003-05-01

    A fundamental theorem on particle acceleration is derived from the reciprocity principle of electromagnetism, and a rigorous proof of the theorem is presented. The theorem establishes a relation between acceleration and radiation, which is particularly useful for an insightful understanding of, and practical calculations concerning, first-order acceleration, in which the energy gain of the accelerated particle is linearly proportional to the accelerating field.

  4. Fundamental Physical Constants

    National Institute of Standards and Technology Data Gateway

    SRD 121 CODATA Fundamental Physical Constants (Web, free access). This site, developed in the Physics Laboratory at NIST, addresses three topics: fundamental physical constants; the International System of Units (SI), which is the modern metric system; and expressing the uncertainty of measurement results.

  5. Fundamentals of Physics

    NASA Astrophysics Data System (ADS)

    Halliday, David; Resnick, Robert; Walker, Jearl

    2003-01-01

    No other book on the market today can match the success of Halliday, Resnick and Walker's Fundamentals of Physics! In a breezy, easy-to-understand style the book offers a solid understanding of fundamental physics concepts, and helps readers apply this conceptual understanding to quantitative problem solving.

  6. A correspondence principle

    NASA Astrophysics Data System (ADS)

    Hughes, Barry D.; Ninham, Barry W.

    2016-02-01

    A single mathematical theme underpins disparate physical phenomena in classical, quantum and statistical mechanical contexts. This mathematical "correspondence principle", a kind of wave-particle duality with glorious realizations in classical and modern mathematical analysis, embodies fundamental geometrical and physical order, and yet in some sense sits on the edge of chaos. Illustrative cases discussed are drawn from classical and anomalous diffusion, quantum mechanics of single particles and ideal gases, quasicrystals and Casimir forces.

  7. Pattern formation in a minimal model of continuum dislocation plasticity

    NASA Astrophysics Data System (ADS)

    Sandfeld, Stefan; Zaiser, Michael

    2015-09-01

    The spontaneous emergence of heterogeneous dislocation patterns is a conspicuous feature of plastic deformation and strain hardening of crystalline solids. Despite long-standing efforts in the materials science and physics of defect communities, there is no general consensus regarding the physical mechanism which leads to the formation of dislocation patterns. In order to establish the fundamental mechanism, we formulate an extremely simplified, minimal model to investigate the formation of patterns based on the continuum theory of fluxes of curved dislocations. We demonstrate that strain hardening as embodied in a Taylor-type dislocation density dependence of the flow stress, in conjunction with the structure of the kinematic equations that govern dislocation motion under the action of external stresses, is already sufficient for the formation of dislocation patterns that are consistent with the principle of similitude.
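
    The Taylor-type dislocation density dependence of the flow stress invoked above is tau = alpha * G * b * sqrt(rho). A quick numerical sketch with typical textbook magnitudes (the alpha, G, and b values are assumptions for illustration, not the paper's):

```python
import math

# Taylor relation for the flow stress: tau = alpha * G * b * sqrt(rho).
# Parameter values are typical textbook magnitudes (assumed), not from
# the paper.
ALPHA = 0.3      # dimensionless Taylor coefficient (assumed)
G = 27e9         # shear modulus of aluminium, Pa (assumed)
B = 2.86e-10     # Burgers vector magnitude, m (assumed)

def flow_stress(rho):
    """Flow stress in Pa for dislocation density rho in m^-2."""
    return ALPHA * G * B * math.sqrt(rho)

# Strain hardening: raising rho from 1e12 to 1e14 m^-2 (a hundredfold)
# raises the flow stress tenfold, by the square-root dependence.
tau_low = flow_stress(1e12)
tau_high = flow_stress(1e14)
print(tau_low / 1e6, tau_high / 1e6)  # in MPa
```

    It is this square-root coupling between the local dislocation density and the stress driving dislocation motion that, per the abstract, already suffices to destabilize a homogeneous density field into patterns.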

  8. Esophagectomy - minimally invasive

    MedlinePlus

    Minimally invasive esophagectomy; Robotic esophagectomy; Removal of the esophagus - minimally invasive; Achalasia - esophagectomy; Barrett esophagus - esophagectomy; Esophageal cancer - esophagectomy - laparoscopic; Cancer of the ...

  9. Minimally invasive surgery. Future developments.

    PubMed

    Wickham, J E

    1994-01-15

    The rapid development of minimally invasive surgery means that there will be fundamental changes in interventional treatment. Technological advances will allow new minimally invasive procedures to be developed. Application of robotics will allow some procedures to be done automatically, and coupling of slave robotic instruments with virtual reality images will allow surgeons to perform operations by remote control. Miniature motors and instruments designed by microengineering could be introduced into body cavities to perform operations that are currently impossible. New materials will allow changes in instrument construction, such as use of memory metals to make heat activated scissors or forceps. With the reduced trauma associated with minimally invasive surgery, fewer operations will require long hospital stays. Traditional surgical wards will become largely redundant, and hospitals will need to cope with increased through-put of patients. Operating theatres will have to be equipped with complex high technology equipment, and hospital staff will need to be trained to manage it. Conventional nursing care will be carried out more in the community. Many traditional specialties will be merged, and surgical training will need fundamental revision to ensure that surgeons are competent to carry out the new procedures. PMID:8312776

  10. Principled Narrative

    ERIC Educational Resources Information Center

    MacBeath, John; Swaffield, Sue; Frost, David

    2009-01-01

    This article provides an overview of the "Carpe Vitam: Leadership for Learning" project, accounting for its provenance and purposes, before focusing on the principles for practice that constitute an important part of the project's legacy. These principles framed the dialogic process that was a dominant feature of the project and are presented,…

  11. Fundamental principles and applications of natural gas hydrates.

    PubMed

    Sloan, E Dendy

    2003-11-20

    Natural gas hydrates are solid, non-stoichiometric compounds of small gas molecules and water. They form when the constituents come into contact at low temperature and high pressure. The physical properties of these compounds, most notably that they are non-flowing crystalline solids that are denser than typical fluid hydrocarbons and that the gas molecules they contain are effectively compressed, give rise to numerous applications in the broad areas of energy and climate effects. In particular, they have an important bearing on flow assurance and safety issues in oil and gas pipelines, they offer a largely unexploited means of energy recovery and transportation, and they could play a significant role in past and future climate change.

  12. Fundamental principles and applications of natural gas hydrates

    NASA Astrophysics Data System (ADS)

    Sloan, E. Dendy

    2003-11-01

    Natural gas hydrates are solid, non-stoichiometric compounds of small gas molecules and water. They form when the constituents come into contact at low temperature and high pressure. The physical properties of these compounds, most notably that they are non-flowing crystalline solids that are denser than typical fluid hydrocarbons and that the gas molecules they contain are effectively compressed, give rise to numerous applications in the broad areas of energy and climate effects. In particular, they have an important bearing on flow assurance and safety issues in oil and gas pipelines, they offer a largely unexploited means of energy recovery and transportation, and they could play a significant role in past and future climate change.

  13. Governing during an Institutional Crisis: 10 Fundamental Principles

    ERIC Educational Resources Information Center

    White, Lawrence

    2012-01-01

    In today's world, managing a campus crisis poses special challenges for an institution's governing board, which may operate some distance removed from the immediate events giving rise to the crisis. In its most challenging form, a campus crisis--a shooting, a natural disaster, a fraternity hazing death, the arrest of a prominent campus…

  14. 47 CFR 36.2 - Fundamental principles underlying procedures.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... 36.2 Section 36.2 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER...” measurements, measurements of use are (i) determined for telecommunications plant or for work performed by...) Underlying the procedures included in this manual for the separation of plant costs is an over-all...

  15. 47 CFR 36.2 - Fundamental principles underlying procedures.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... 36.2 Section 36.2 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER...” measurements, measurements of use are (i) determined for telecommunications plant or for work performed by...) Underlying the procedures included in this manual for the separation of plant costs is an over-all...

  16. 47 CFR 36.2 - Fundamental principles underlying procedures.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    .... 36.2 Section 36.2 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER...” measurements, measurements of use are (i) determined for telecommunications plant or for work performed by...) Underlying the procedures included in this manual for the separation of plant costs is an over-all...

  17. 47 CFR 36.2 - Fundamental principles underlying procedures.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... 36.2 Section 36.2 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER...” measurements, measurements of use are (i) determined for telecommunications plant or for work performed by...) Underlying the procedures included in this manual for the separation of plant costs is an over-all...

  18. Levitated Optomechanics for Fundamental Physics

    NASA Astrophysics Data System (ADS)

    Rashid, Muddassar; Bateman, James; Vovrosh, Jamie; Hempston, David; Ulbricht, Hendrik

    2015-05-01

    Optomechanics with levitated nano- and microparticles is believed to form a platform for testing fundamental principles of quantum physics, as well as to find applications in sensing. We will report on a new scheme to trap nanoparticles, based on a parabolic mirror with a numerical aperture of 1. Combined with achromatic focussing, the setup is a cheap and straightforward solution to trapping nanoparticles for further study. Here, we report on the latest progress made in experimentation with levitated nanoparticles; this includes the trapping of 100 nm nanodiamonds (with NV-centres) down to 1 mbar, as well as the trapping of 50 nm silica spheres down to 10⁻⁴ mbar without any form of feedback cooling. We will also report on the progress to implement feedback stabilisation of the centre-of-mass motion of the trapped particle using digital electronics. Finally, we argue that such a stabilised particle trap can be the particle source for a nanoparticle matterwave interferometer. We will present our Talbot interferometer scheme, which holds promise to test the quantum superposition principle in the new mass range of 10⁶ amu. EPSRC, John Templeton Foundation.

  19. Variation of Fundamental Constants

    NASA Astrophysics Data System (ADS)

    Flambaum, V. V.

    2006-11-01

    Theories unifying gravity with other interactions suggest temporal and spatial variation of the fundamental ``constants'' in the expanding Universe. The spatial variation can explain a fine tuning of the fundamental constants which allows humans (and any life) to appear: we appeared in the area of the Universe where the values of the fundamental constants are consistent with our existence. We present a review of recent works devoted to the variation of the fine structure constant α, the strong interaction, and fundamental masses. There are some hints of variation in quasar absorption spectra, Big Bang nucleosynthesis, and Oklo natural nuclear reactor data. A very promising method to search for the variation of the fundamental constants consists in comparing different atomic clocks. Huge enhancement of the variation effects happens in transitions between accidentally degenerate atomic and molecular energy levels. A new idea is to build a ``nuclear'' clock based on the ultraviolet transition between a very low-lying excited state and the ground state of the thorium nucleus; this may allow the sensitivity to the variation to be improved by up to 10 orders of magnitude! Huge enhancement of the variation effects is also possible in cold atomic and molecular collisions near a Feshbach resonance.

  20. Arguing against fundamentality

    NASA Astrophysics Data System (ADS)

    McKenzie, Kerry

    This paper aims to open up discussion on the relationship between fundamentality and naturalism, and in particular on the question of whether fundamentality may be denied on naturalistic grounds. A historico-inductive argument for an anti-fundamentalist conclusion, prominent within the contemporary metaphysical literature, is examined; finding it wanting, an alternative 'internal' strategy is proposed. By means of an example from the history of modern physics - namely S-matrix theory - it is demonstrated that (1) this strategy can generate similar (though not identical) anti-fundamentalist conclusions on more defensible naturalistic grounds, and (2) that fundamentality questions can be empirical questions. Some implications and limitations of the proposed approach are discussed.

  1. The Subordination of Aesthetic Fundamentals in College Art Instruction

    ERIC Educational Resources Information Center

    Lavender, Randall

    2003-01-01

    Opportunities for college students of art and design to study fundamentals of visual aesthetics, integrity of form, and principles of composition are limited today by a number of factors. With the well-documented prominence of postmodern critical theory in the world of contemporary art, the study of aesthetic fundamentals is largely subordinated…

  2. Fundamentals of fluid sealing

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamentals of fluid sealing, including seal operating regimes, are discussed and the general fluid-flow equations for fluid sealing are developed. Seal performance parameters such as leakage and power loss are presented. Included in the discussion are the effects of geometry, surface deformations, rotation, and both laminar and turbulent flows. The concept of pressure balancing is presented, as are differences between liquid and gas sealing. Mechanisms of seal surface separation, fundamental friction and wear concepts applicable to seals, seal materials, and pressure-velocity (PV) criteria are discussed.

  3. Minimal change disease

    MedlinePlus

    ... seen under a very powerful microscope called an electron microscope. Minimal change disease is the most common ... biopsy and examination of the tissue with an electron microscope can show signs of minimal change disease.

  4. Reading Is Fundamental, 1977.

    ERIC Educational Resources Information Center

    Smithsonian Institution, Washington, DC. National Reading is Fun-damental Program.

    Reading Is Fundamental (RIF) is a national, nonprofit organization designed to motivate children to read by making a wide variety of inexpensive books available to them and allowing the children to choose and keep books that interest them. This annual report for 1977 contains the following information on the RIF project: an account of the…

  5. Fundamentals of Chemical Processes.

    ERIC Educational Resources Information Center

    Moser, William R.

    1985-01-01

    Describes a course that provides students with a fundamental understanding of the chemical, catalytic, and engineering sciences related to the chemical reactions taking place in a variety of reactors of different configurations. Also describes the eight major lecture topics, course examinations, and term papers. The course schedule is included.…

  6. Unification of Fundamental Forces

    NASA Astrophysics Data System (ADS)

    Salam, Abdus; Taylor, Foreword by John C.

    2005-10-01

    Foreword John C. Taylor; 1. Unification of fundamental forces Abdus Salam; 2. History unfolding: an introduction to the two 1968 lectures by W. Heisenberg and P. A. M. Dirac Abdus Salam; 3. Theory, criticism, and a philosophy Werner Heisenberg; 4. Methods in theoretical physics Paul Adrian Maurice Dirac.

  7. Fundamentals of Library Instruction

    ERIC Educational Resources Information Center

    McAdoo, Monty L.

    2012-01-01

    Being a great teacher is part and parcel of being a great librarian. In this book, veteran instruction services librarian McAdoo lays out the fundamentals of the discipline in easily accessible language. Succinctly covering the topic from top to bottom, he: (1) Offers an overview of the historical context of library instruction, drawing on recent…

  8. Food Service Fundamentals.

    ERIC Educational Resources Information Center

    Marine Corps Inst., Washington, DC.

    Developed as part of the Marine Corps Institute (MCI) correspondence training program, this course on food service fundamentals is designed to provide a general background in the basic aspects of the food service program in the Marine Corps; it is adaptable for nonmilitary instruction. Introductory materials include specific information for MCI…

  9. Laser Fundamentals and Experiments.

    ERIC Educational Resources Information Center

    Van Pelt, W. F.; And Others

    As a result of work performed at the Southwestern Radiological Health Laboratory with respect to lasers, this manual was prepared in response to the increasing use of lasers in high schools and colleges. It is directed primarily toward the high school instructor who may use the text for a short course in laser fundamentals. The definition of the…

  10. Quantum correlations are tightly bound by the exclusivity principle.

    PubMed

    Yan, Bin

    2013-06-28

    What principle limits the correlations predicted by our current description of nature, based on quantum mechanics, is a fundamental problem in physics. One possible explanation is the "global exclusivity" principle recently discussed in Phys. Rev. Lett. 110, 060402 (2013). In this work we show that this principle actually places a much stronger restriction on the probability distribution. We provide a tight constraint inequality imposed by this principle and prove that this principle singles out quantum correlations in scenarios represented by any graph. Our result implies that the exclusivity principle might be one of the fundamental principles of nature.

  11. Fundamentals of Refrigeration.

    ERIC Educational Resources Information Center

    Sutliff, Ronald D.; And Others

    This self-study course is designed to familiarize Marine enlisted personnel with the principles of the refrigeration process. The course contains five study units. Each study unit begins with a general objective, which is a statement of what the student should learn from the unit. The study units are divided into numbered work units, each…

  12. FUNDAMENTALS OF TELEVISION SYSTEMS.

    ERIC Educational Resources Information Center

    KESSLER, WILLIAM J.

    DESIGNED FOR A READER WITHOUT SPECIAL TECHNICAL KNOWLEDGE, THIS ILLUSTRATED RESOURCE PAPER EXPLAINS THE COMPONENTS OF A TELEVISION SYSTEM AND RELATES THEM TO THE COMPLETE SYSTEM. SUBJECTS DISCUSSED ARE THE FOLLOWING--STUDIO ORGANIZATION AND COMPATIBLE COLOR TELEVISION PRINCIPLES, WIRED AND RADIO TRANSMISSION SYSTEMS, DIRECT VIEW AND PROJECTION…

  13. Fundaments of plant cybernetics.

    PubMed

    Zucconi, F

    2001-01-01

    A systemic approach is proposed for analyzing plants' physiological organization and cybernesis. To this end, the plant is inspected as a system, starting from the integration of crown and root systems, and its impact on a number of basic epigenetic events. The approach proves to be axiomatic and facilitates the definition of the principles behind the plant's autonomous control of growth and reproduction.

  14. Common Principles and Multiculturalism

    PubMed Central

    Zahedi, Farzaneh; Larijani, Bagher

    2009-01-01

    Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have discussed the suitable tools for determining which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has resulted in a misconception that absolute Westernized principles would be appropriate tools for ethical decision making in different cultures. We discuss this issue by introducing a clinical case. Considering the various cultural beliefs around the world, though it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, fall in with this idea. PMID:23908720

  16. Common principles and multiculturalism.

    PubMed

    Zahedi, Farzaneh; Larijani, Bagher

    2009-01-01

    Judgment on the rightness and wrongness of beliefs and behaviors is a main issue in bioethics. Over the centuries, great philosophers and ethicists have discussed the suitable tools for determining which acts are morally sound and which are not. The emergence of contemporary bioethics in the West has resulted in a misconception that absolute Westernized principles would be appropriate tools for ethical decision making in different cultures. We discuss this issue by introducing a clinical case. Considering the various cultural beliefs around the world, though it is not logical to consider all of them ethically acceptable, we can agree on some general fundamental principles instead of going to the extremes of relativism and absolutism. Islamic teachings, according to the evidence presented in this paper, fall in with this idea. PMID:23908720

  17. Fundamentals of Polarized Light

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael

    2003-01-01

    The analytical and numerical basis for describing scattering properties of media composed of small discrete particles is formed by the classical electromagnetic theory. Although there are several excellent textbooks outlining the fundamentals of this theory, it is convenient for our purposes to begin with a summary of those concepts and equations that are central to the subject of this book and will be used extensively in the following chapters. We start by formulating Maxwell's equations and constitutive relations for time- harmonic macroscopic electromagnetic fields and derive the simplest plane-wave solution that underlies the basic optical idea of a monochromatic parallel beam of light. This solution naturally leads to the introduction of such fundamental quantities as the refractive index and the Stokes parameters. Finally, we define the concept of a quasi-monochromatic beam of light and discuss its implications.

  18. Fundamental studies in geodynamics

    NASA Technical Reports Server (NTRS)

    Anderson, D. L.; Hager, B. H.; Kanamori, H.

    1981-01-01

    Research in fundamental studies in geodynamics continued in a number of fields, including seismic observations and analysis, synthesis of geochemical data, theoretical investigation of geoid anomalies, extensive numerical experiments in a number of geodynamical contexts, and a new field, seismic volcanology. Summaries of work in progress or completed during this report period are given. Abstracts of publications submitted from work in progress during this report period are attached as an appendix.

  19. Free-energy minimization and the dark-room problem.

    PubMed

    Friston, Karl; Thornton, Christopher; Clark, Andy

    2012-01-01

    Recent years have seen the emergence of an important new fundamental theory of brain function. This theory brings information-theoretic, Bayesian, neuroscientific, and machine learning approaches into a single framework whose overarching principle is the minimization of surprise (or, equivalently, the maximization of expectation). The most comprehensive such treatment is the "free-energy minimization" formulation due to Karl Friston (see e.g., Friston and Stephan, 2007; Friston, 2010a,b - see also Fiorillo, 2010; Thornton, 2010). A recurrent puzzle raised by critics of these models is that biological systems do not seem to avoid surprises. We do not simply seek a dark, unchanging chamber, and stay there. This is the "Dark-Room Problem." Here, we describe the problem and further unpack the issues to which it speaks. Using the same format as the prolog of Eddington's Space, Time, and Gravitation (Eddington, 1920) we present our discussion as a conversation between: an information theorist (Thornton), a physicist (Friston), and a philosopher (Clark). PMID:22586414
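    The free-energy scheme summarized in this record can be illustrated with a toy numerical sketch (our illustrative assumption of a one-dimensional Gaussian likelihood and prior; the function names and values are ours, not Friston's): gradient descent on the free energy drives an internal estimate to the precision-weighted compromise between prior expectation and observation, which is how "surprise" is minimized without retreating to a dark room.

```python
# Toy sketch of free-energy (surprise) minimization, assuming a
# one-dimensional Gaussian generative model: an observation o is
# explained by a hidden cause mu with prior mean mu_p.
# All names and parameter values here are illustrative assumptions.

def free_energy(mu, o, mu_p, var_o=1.0, var_p=1.0):
    """Quadratic free energy: prediction error plus prior error."""
    return (o - mu) ** 2 / (2 * var_o) + (mu - mu_p) ** 2 / (2 * var_p)

def minimize_surprise(o, mu_p, var_o=1.0, var_p=1.0, lr=0.1, steps=200):
    """Gradient descent on the free energy with respect to mu."""
    mu = mu_p  # start from the prior expectation
    for _ in range(steps):
        grad = (mu - o) / var_o + (mu - mu_p) / var_p
        mu -= lr * grad
    return mu

# With equal precisions, the estimate settles halfway between the
# prior expectation (0.0) and the observation (2.0).
mu_star = minimize_surprise(o=2.0, mu_p=0.0)
```

In this sketch the minimum is analytic (the precision-weighted mean), so the descent merely illustrates the dynamics the theory attributes to neural processing.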

  1. Value of Fundamental Science

    NASA Astrophysics Data System (ADS)

    Burov, Alexey

    Fundamental science is a hard, long-term human adventure that has required high devotion and social support, especially significant in our epoch of Mega-science. The measure of this devotion and this support expresses the real value of fundamental science in public opinion. Why does fundamental science have value? What determines its strength and what endangers it? The dominant answer is that the value of science arises out of curiosity and is supported by technological progress. Is this really a good, astute answer? When trying to attract public support, we talk about the ``mystery of the universe''. Why do these words sound so attractive? What is implied by and what is incompatible with them? More than two centuries ago, Immanuel Kant asserted an inseparable entanglement between ethics and metaphysics. Thus, we may ask: which metaphysics supports the value of scientific cognition, and which does not? Should we continue to neglect the dependence of the value of pure science on metaphysics? If not, how can this issue be addressed in public outreach? Is the public alienated by one or another message coming from the face of science? What does it mean to be politically correct in this sort of discussion?

  2. Genetic principles.

    PubMed

    Abuelo, D

    1987-01-01

    The author discusses the basic principles of genetics, including the classification of genetic disorders and a consideration of the rules and mechanisms of inheritance. The most common pitfalls in clinical genetic diagnosis are described, with emphasis on the problem of the negative or misleading family history.

  3. The 4th Thermodynamic Principle?

    SciTech Connect

    Montero Garcia, Jose de la Luz; Novoa Blanco, Jesus Francisco

    2007-04-28

    It should be emphasized that the 4th Principle formulated above is a thermodynamic principle and, at the same time, quantum-mechanical and relativistic, as it inevitably should be; its absence has been one of the main theoretical limitations of physical theory until today. We show that, through the theoretical discovery of the Dimensional Primitive Octet of Matter, the 4th Thermodynamic Principle, the Quantum Hexet of Matter, the Global Hexagonal Subsystem of Fundamental Constants of Energy, and the Measurement or Connected Global Scale or Universal Existential Interval of Matter, it is possible to arrive at a global formulation of the four 'forces' or fundamental interactions of nature. Einstein's golden dream is possible.

  4. Ecological Principles and Guidelines for Managing the Use of Land

    SciTech Connect

    Dale, Virginia H; Brown, Sandra; Haeuber, R A; Hobbs, N T; Huntly, N; Naiman, R J; Riebsame, W E; Turner, M G; Valone, T J

    2014-01-01

    The many ways that people have used and managed land throughout history have emerged as a primary cause of land-cover change around the world. Thus, land use and land management increasingly represent a fundamental source of change in the global environment. Despite their global importance, however, many decisions about the management and use of land are made with scant attention to ecological impacts. Thus, ecologists' knowledge of the functioning of Earth's ecosystems is needed to broaden the scientific basis of decisions on land use and management. In response to this need, the Ecological Society of America established a committee to examine the ways that land-use decisions are made and the ways that ecologists could help inform those decisions. This paper reports the scientific findings of that committee. Five principles of ecological science have particular implications for land use and can assure that fundamental processes of Earth's ecosystems are sustained. These ecological principles deal with time, species, place, disturbance, and the landscape. The recognition that ecological processes occur within a temporal setting and change over time is fundamental to analyzing the effects of land use. In addition, individual species and networks of interacting species have strong and far-reaching effects on ecological processes. Furthermore, each site or region has a unique set of organisms and abiotic conditions influencing and constraining ecological processes. Disturbances are important and ubiquitous ecological events whose effects may strongly influence population, community, and ecosystem dynamics. Finally, the size, shape, and spatial relationships of habitat patches on the landscape affect the structure and function of ecosystems. The responses of the land to changes in use and management by people depend on expressions of these fundamental principles in nature. These principles dictate several guidelines for land use. The guidelines give practical

  5. Fundamental "Uncertainty" in Science

    NASA Astrophysics Data System (ADS)

    Reichl, Linda E.

    The conference on "Uncertainty and Surprise" was concerned with our fundamental inability to predict future events. How can we restructure organizations to function effectively in an uncertain environment? One concern is that many large, complex organizations are built on mechanical models, but mechanical models cannot always respond well to "surprises." An underlying assumption about mechanical models is that, if we give them enough information about the world, they will know the future accurately enough that there will be few or no surprises. The assumption is that the future is basically predictable and deterministic.

  6. Fundamental experiments in velocimetry

    SciTech Connect

    Briggs, Matthew Ellsworth; Hull, Larry; Shinas, Michael

    2009-01-01

    One can understand what velocimetry does and does not measure by understanding a few fundamental experiments. Photon Doppler Velocimetry (PDV) is an interferometer that will produce fringe shifts when the length of one of the legs changes, so we might expect the fringes to change whenever the distance from the probe to the target changes. However, by making PDV measurements of tilted moving surfaces, we have shown that fringe shifts from diffuse surfaces are actually measured only from the changes caused by the component of velocity along the beam. This is an important simplification in the interpretation of PDV results, arising because surface roughness randomizes the scattered phases.
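    The central simplification stated in this record, namely that a PDV fringe shift from a diffuse surface encodes only the velocity component along the beam, can be made concrete with the standard Doppler-beat relation f_b = 2 v / λ (a sketch; the telecom wavelength and numbers below are illustrative assumptions, not values from the report):

```python
import math

# Sketch of the standard PDV beat-frequency relation: a surface moving
# with line-of-sight velocity v produces a beat frequency f_b = 2*v/lam
# against the reference leg. Values below are illustrative assumptions.

LAMBDA = 1.55e-6  # typical telecom-laser wavelength, metres

def beat_frequency(v_los, lam=LAMBDA):
    """Beat frequency (Hz) for line-of-sight velocity v_los (m/s)."""
    return 2.0 * v_los / lam

def velocity_from_beat(f_beat, lam=LAMBDA):
    """Invert the relation: recover line-of-sight velocity from f_b."""
    return f_beat * lam / 2.0

def line_of_sight(v_total, tilt_deg):
    """Only the velocity component along the beam contributes for a
    tilted, diffusely scattering surface (the simplification above)."""
    return v_total * math.cos(math.radians(tilt_deg))
```

A surface moving at 775 m/s along a 1550 nm beam thus beats at about 1 GHz; tilting the motion 60 degrees off the beam halves the measured velocity.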

  7. Fluorescence lifetimes: fundamentals and interpretations.

    PubMed

    Noomnarm, Ulai; Clegg, Robert M

    2009-01-01

    Fluorescence measurements have been an established mainstay of photosynthesis experiments for many decades. Because in the photosynthesis literature the basics of excited states and their fates are not usually described, we have presented here an easily understandable text for biology students in the style of a chapter in a text book. In this review we give an educational overview of fundamental physical principles of fluorescence, with emphasis on the temporal response of emission. Escape from the excited state of a molecule is a dynamic event, and the fluorescence emission is in direct kinetic competition with several other pathways of de-excitation. It is essentially through a kinetic competition between all the pathways of de-excitation that we gain information about the fluorescent sample on the molecular scale. A simple probability allegory is presented that illustrates the basic ideas that are important for understanding and interpreting most fluorescence experiments. We also briefly point out challenges that confront the experimenter when interpreting time-resolved fluorescence responses.

  8. Microscopic Description of Le Chatelier's Principle

    ERIC Educational Resources Information Center

    Novak, Igor

    2005-01-01

    A simple approach that "demystifies" Le Chatelier's principle (LCP) and stimulates students to think about the fundamental physical background behind this well-known principle is presented. The approach uses microscopic descriptors of matter, such as energy levels and populations, and does not require any assumption about the fixed amount of substance being…
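    The microscopic reading of LCP described in this record can be sketched with Boltzmann populations of a two-level system (the level spacing and temperatures are our illustrative assumptions, not the article's): raising the temperature shifts population toward the upper level, so the system absorbs energy in the direction that opposes the disturbance.

```python
import math

# Sketch: a microscopic view of Le Chatelier's principle using energy
# levels and Boltzmann populations. Energies are in units where the
# Boltzmann constant is 1; the two-level spacing is an assumption.

def populations(energies, kT):
    """Boltzmann populations p_i proportional to exp(-E_i/kT), normalized."""
    weights = [math.exp(-e / kT) for e in energies]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

levels = [0.0, 1.0]               # ground and excited level
cold = populations(levels, kT=1.0)
hot = populations(levels, kT=2.0)
# Heating raises the upper-level population: the equilibrium shifts in
# the endothermic direction, which absorbs the added energy.
```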

  9. Fundamentals of satellite navigation

    NASA Astrophysics Data System (ADS)

    Stiller, A. H.

    The basic operating principles and capabilities of conventional and satellite-based navigation systems for air, sea, and land vehicles are reviewed and illustrated with diagrams. Consideration is given to autonomous onboard systems; systems based on visible or radio beacons; the Transit, Cicada, Navstar-GPS, and Glonass satellite systems; the physical laws and parameters of satellite motion; the definition of time in satellite systems; and the content of the demodulated GPS data signal. The GPS and Glonass data format frames are presented graphically, and tables listing the GPS and Glonass satellites, their technical characteristics, and the (past or scheduled) launch dates are provided.

  10. Radar principles

    NASA Technical Reports Server (NTRS)

    Sato, Toru

    1989-01-01

    Discussed here is a kind of radar called atmospheric radar, which has as its target clear air echoes from the earth's atmosphere produced by fluctuations of the atmospheric index of refraction. Topics reviewed include the vertical structure of the atmosphere, the radio refractive index and its fluctuations, the radar equation (a relation between transmitted and received power), radar equations for distributed targets and spectral echoes, near field correction, pulsed waveforms, the Doppler principle, and velocity field measurements.
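    The "radar equation" mentioned in this record, the relation between transmitted and received power, has a standard monostatic point-target form that can be sketched directly (atmospheric radars use the distributed-target variant the record discusses; all parameter values below are illustrative assumptions):

```python
import math

# Sketch of the standard point-target radar equation:
#   P_r = P_t * G^2 * lam^2 * sigma / ((4*pi)^3 * R^4)
# with antenna gain G, wavelength lam, radar cross-section sigma,
# and range R. This monostatic point-target version illustrates the
# transmitted-to-received power bookkeeping only.

def received_power(p_t, gain, lam, sigma, r):
    """Received power (W) for a point target at range r (m)."""
    return p_t * gain**2 * lam**2 * sigma / ((4 * math.pi) ** 3 * r**4)

# Doubling the range cuts the received power by a factor of 16 (R^-4),
# which is why radar links need large gains and transmit powers.
near = received_power(p_t=1e6, gain=1000.0, lam=0.1, sigma=1.0, r=10e3)
far = received_power(p_t=1e6, gain=1000.0, lam=0.1, sigma=1.0, r=20e3)
```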

  11. Fundamentals of electrokinetics

    NASA Astrophysics Data System (ADS)

    Kozak, M. W.

    The study of electrokinetics is a very mature field. Experimental studies date from the early 1800s, and acceptable theoretical analyses have existed since the early 1900s. The use of electrokinetics in practical field problems is more recent, but it is still quite mature. Most developments in the fundamental understanding of electrokinetics are in the colloid science literature. A significant and increasing divergence between the theoretical understanding of electrokinetics found in the colloid science literature and the theoretical analyses used in interpreting applied experimental studies in soil science and waste remediation has developed. The soil science literature has to date restricted itself to the use of very early theories, with their associated limitations. The purpose of this contribution is to review fundamental aspects of electrokinetic phenomena from a colloid science viewpoint. It is hoped that a bridge can be built between the two branches of the literature, from which both will benefit. Attention is paid to special topics such as the effects of overlapping double layers, applications in unsaturated soils, the influence of dispersivity, and the differences between electrokinetic theory and conductivity theory.

  12. Controlling molecular transport in minimal emulsions

    PubMed Central

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of ‘minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions. PMID:26797564

  13. Controlling molecular transport in minimal emulsions

    NASA Astrophysics Data System (ADS)

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of `minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions.

  14. Analysis of lipid flow on minimal surfaces

    NASA Astrophysics Data System (ADS)

    Bahmani, Fatemeh; Christenson, Joel; Rangamani, Padmini

    2016-03-01

    Interaction between the bilayer shape and surface flow is important for capturing the flow of lipids in many biological membranes. Recent microscopy evidence has shown that minimal surfaces (planes, catenoids, and helicoids) occur often in cellular membranes. In this study, we explore lipid flow in these geometries using a `stream function' formulation for viscoelastic lipid bilayers. Using this formulation, we derive two-dimensional lipid flow equations for the commonly occurring minimal surfaces in lipid bilayers. We show that for three minimal surfaces (planes, catenoids, and helicoids), the surface flow equations satisfy Stokes flow equations. In helicoids and catenoids, we show that the tangential velocity field is a Killing vector field. Thus, our analysis provides fundamental insight into the flow patterns of lipids on intracellular organelle membranes that are characterized by fixed shapes reminiscent of minimal surfaces.
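    For reference, the defining property invoked in this record, vanishing mean curvature, together with the standard parametrizations of the two curved examples, reads as follows (textbook differential-geometry background, not taken from the record itself):

```latex
% A surface is minimal when its mean curvature vanishes:
H \;=\; \tfrac{1}{2}\,(\kappa_1 + \kappa_2) \;=\; 0,
% where kappa_1, kappa_2 are the principal curvatures. The plane is
% trivially minimal; the two curved examples are
\text{catenoid: } \mathbf{r}(u,v) \;=\; \bigl(c\cosh\tfrac{v}{c}\cos u,\; c\cosh\tfrac{v}{c}\sin u,\; v\bigr),
\qquad
\text{helicoid: } \mathbf{r}(u,v) \;=\; (u\cos v,\; u\sin v,\; c\,v).
```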

  15. The Role of Fisher Information Theory in the Development of Fundamental Laws in Physical Chemistry

    ERIC Educational Resources Information Center

    Honig, J. M.

    2009-01-01

    The unifying principle that involves rendering the Fisher information measure an extremum is reviewed. It is shown that with this principle, in conjunction with appropriate constraints, a large number of fundamental laws can be derived from a common source in a unified manner. The resulting economy of thought pertaining to fundamental principles…
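    For a one-dimensional density p(x), the quantity rendered an extremum is the Fisher information; the constrained variational statement below is a standard textbook form of the principle, not quoted from the article.

```latex
I[p] \;=\; \int \frac{\bigl(p'(x)\bigr)^{2}}{p(x)}\,dx \;=\; 4\int \bigl(q'(x)\bigr)^{2}\,dx,
\qquad p = q^{2},
```

    and extremizing I subject to constraints of the form \(\int f_k(x)\,q^2\,dx = F_k\) gives, via Lagrange multipliers \(\lambda_k\),

```latex
q''(x) \;+\; \tfrac{1}{4}\sum_{k}\lambda_{k}\,f_{k}(x)\,q(x) \;=\; 0,
```

    a Schrödinger-like differential equation: one extremum principle, many laws, depending on the constraints imposed.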

  16. [Principle of least action, physiology of vision, and conditioned reflexes theory].

    PubMed

    Shelepin, Iu E; Krasil'nikov, N N

    2003-06-01

    The variational principles, such as Maupertuis's principle of least action (1740) and Fermat's principle (1660), are fundamental to physics. They make it possible to identify the property by which the actual state of a system differs from all of its possible states. The variational approach derives the equations of motion and equilibrium of a material system from a single common rule, which reduces to searching for the extrema of a function that describes this property of the system. For optical systems it is the time, not the length of the path, that is crucial: according to Fermat's principle, light "chooses", out of all possible paths connecting two points, the one that requires the least time. The generality of the variational principles is what guarantees the success of their application to investigations of brain function. Among the various attempts to apply variational principles to psychology and linguistics, Zipf's principle of least effort stands out; Zipf (1949) demonstrated that natural languages and some artificial codes satisfy such a least principle. In brain physiology, the classical theory of conditioned reflexes is an ideal area for applying variational principles: in this view, conditioning amounts to finding an extremum during the fixation of the temporal link. In vision, physiological investigations are difficult because the signal has many dimensions. For example, in perceiving the spatial properties of the surrounding world, the visual system minimizes (reduces) the spatial-frequency spectrum of the scene. The receptive fields provide optimal accumulation of the signal, and in ontogenesis the signal-to-noise ratio becomes optimal as the receptive fields minimize the internal noise spectrum. According to the theory of matched filtering, recognition in the visual system is carried out by minimizing the difference between the description of an image in the visual system and the template of that image stored in human memory. The variational principles help to discover the physical property of

  18. Quantum mechanics and the generalized uncertainty principle

    SciTech Connect

    Bang, Jang Young; Berger, Micheal S.

    2006-12-15

    The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.
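    A commonly quoted form of the generalized uncertainty principle makes the minimal length explicit (a textbook form with deformation parameter \(\beta\); the paper's discrete-position model need not use exactly this expression):

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right)
\quad\Longrightarrow\quad
\Delta x \;\ge\; \frac{\hbar}{2}\left(\frac{1}{\Delta p} + \beta\,\Delta p\right),
```

    and minimizing the right-hand side over \(\Delta p\) (the minimum is attained at \(\Delta p = 1/\sqrt{\beta}\)) gives a nonzero minimal position uncertainty

```latex
\Delta x_{\min} \;=\; \hbar\sqrt{\beta}\,.
```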

  19. Wall of fundamental constants

    SciTech Connect

    Olive, Keith A.; Peloso, Marco; Uzan, Jean-Philippe

    2011-02-15

    We consider the signatures of a domain wall produced in the spontaneous symmetry breaking involving a dilatonlike scalar field coupled to electromagnetism. Domains on either side of the wall exhibit slight differences in their respective values of the fine-structure constant, {alpha}. If such a wall is present within our Hubble volume, absorption spectra at large redshifts may or may not provide a variation in {alpha} relative to the terrestrial value, depending on our relative position with respect to the wall. This wall could resolve the contradiction between claims of a variation of {alpha} based on Keck/Hires data and of the constancy of {alpha} based on Very Large Telescope data. We derive the properties of the wall and the parameters of the underlying microscopic model required to reproduce the possible spatial variation of {alpha}. We discuss the constraints on the existence of the low-energy domain wall and describe its observational implications concerning the variation of the fundamental constants.

  20. Fundamentals of zoological scaling

    NASA Astrophysics Data System (ADS)

    Lin, Herbert

    1982-01-01

    Most introductory physics courses emphasize highly idealized problems with unique well-defined answers. Though many textbooks complement these problems with estimation problems, few books present anything more than an elementary discussion of scaling. This paper presents some fundamentals of scaling in the zoological domain—a domain complex by any standard, but one also well suited to illustrate the power of very simple physical ideas. We consider the following animal characteristics: skeletal weight, speed of running, height and range of jumping, food consumption, heart rate, lifetime, locomotive efficiency, frequency of wing flapping, and maximum sizes of animals that fly and hover. These relationships are compared to zoological data and everyday experience, and match reasonably well.
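    The flavor of such scaling arguments is easy to reproduce. The sketch below uses the classic quarter-power laws (heart rate ~ m**-1/4, lifetime ~ m**1/4); the exponents and reference values are textbook illustrations, not numbers from this paper.

```python
def allometric(y_ref, m_ref, m, k):
    """Scale a reference value y_ref at body mass m_ref to mass m, for a
    quantity obeying y ~ m**k."""
    return y_ref * (m / m_ref) ** k

def lifetime_beats(m, hr_ref=600.0, life_ref=3.0, m_ref=0.03):
    """Total heartbeats in a lifetime; hr_ref (beats/min), life_ref (years)
    and m_ref (kg) are rough mouse-like reference values (illustrative)."""
    heart_rate = allometric(hr_ref, m_ref, m, -0.25)   # beats per minute
    lifetime = allometric(life_ref, m_ref, m, 0.25)    # years
    return heart_rate * 60 * 24 * 365 * lifetime

# The -1/4 and +1/4 exponents cancel exactly, so a mouse and an elephant
# are predicted to get roughly the same number of heartbeats per lifetime.
```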

  1. Fundamentals of neurogastroenterology.

    PubMed

    Wood, J D; Alpers, D H; Andrews, P L

    1999-09-01

    Current concepts and basic principles of neurogastroenterology in relation to functional gastrointestinal disorders are reviewed. Neurogastroenterology is emphasized as a new and advancing subspecialty of clinical gastroenterology and digestive science. As such, it embraces the investigative sciences dealing with functions, malfunctions, and malformations in the brain and spinal cord, and the sympathetic, parasympathetic and enteric divisions of the autonomic innervation of the digestive tract. Somatomotor systems are included insofar as pharyngeal phases of swallowing and pelvic floor involvement in defecation, continence, and pelvic pain are concerned. Inclusion of basic physiology of smooth muscle, mucosal epithelium, and the enteric immune system in the neurogastroenterologic domain relates to requirements for compatibility with neural control mechanisms. Psychologic and psychiatric relations to functional gastrointestinal disorders are included because they are significant components of neurogastroenterology, especially in relation to projections of discomfort and pain to the digestive tract. PMID:10457039

  2. Chemostat cultures of yeasts, continuous culture fundamentals and simple unstructured mathematical models.

    PubMed

    von Stockar, U; Auberson, L C

    1992-01-01

    Fundamental aspects of chemostat cultures are reviewed. Using yeast cultures as examples, it is shown that steady states in chemostats may be predicted quantitatively by combining the correct number of unstructured kinetic models with expressions for existing stoichiometric constraints. The necessary number of such kinetic models corresponds to the number of limiting substrates and increases with the number of different metabolic pathways available to the strain. This is demonstrated by an experimental comparison of yeast growth limited by glucose alone for which metabolism is oxidative, and growth doubly limited by both glucose and oxygen, which occurs according to an oxido-reductive metabolism. The steady state data for such experiments can in principle be predicted based on a minimal amount of information by a simple stoichiometric model. It represents the overall stoichiometry of growth by a superposition of a fully oxidative and a fully reductive growth reaction and uses the concept of "aerobicity" to characterize the relative importance of the two reactions.
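    The simplest case, growth limited by a single substrate with Monod kinetics, has a closed-form steady state. The sketch below is that standard unstructured model, not the authors' oxido-reductive model, and the parameter values are illustrative.

```python
def chemostat_steady_state(D, mu_max, Ks, Y, S_in):
    """Steady state of a single-substrate Monod chemostat.
    D: dilution rate (1/h), mu_max and Ks: Monod parameters, Y: biomass
    yield, S_in: feed substrate concentration. Returns (S, X)."""
    if D >= mu_max * S_in / (Ks + S_in):   # growth cannot match dilution
        return S_in, 0.0                   # washout: no biomass at steady state
    S = Ks * D / (mu_max - D)              # from mu(S) = D at steady state
    X = Y * (S_in - S)                     # biomass from the substrate balance
    return S, X

S, X = chemostat_steady_state(D=0.25, mu_max=0.5, Ks=0.1, Y=0.5, S_in=10.0)
# S = 0.1 (residual substrate), X = 4.95 (biomass), in feed units
```

    Doubly limited growth, as in the glucose/oxygen experiments described above, requires one such kinetic expression per limiting substrate plus the stoichiometric constraints.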

  3. Minimally Invasive Valve Surgery

    PubMed Central

    Pope, Nicolas H.; Ailawadi, Gorav

    2014-01-01

    Cardiac valve surgery is life saving for many patients. The advent of minimally invasive surgical techniques has historically allowed for improvement in both post-operative convalescence and important clinical outcomes. The development of minimally invasive cardiac valve repair and replacement surgery over the past decade is poised to revolutionize the care of cardiac valve patients. Here, we present a review of the history and current trends in minimally invasive aortic and mitral valve repair and replacement, including the development of sutureless bioprosthetic valves. PMID:24797148

  4. [Minimal Change Esophagitis].

    PubMed

    Ryu, Han Seung; Choi, Suck Chei

    2016-01-25

    Gastroesophageal reflux disease (GERD) is defined as a condition which develops when the reflux of gastric contents causes troublesome symptoms and long-term complications. GERD can be divided into erosive reflux disease and non-erosive reflux disease based on endoscopic findings defined by the presence of mucosal break. The Los Angeles classification excludes minimal changes as an evidence of reflux esophagitis because of poor interobserver agreement. In the Asian literature, minimal changes are considered as one of the endoscopic findings of reflux esophagitis, but the clinical significance is still controversial. Minimal change esophagitis is recognized quite frequently among patients with GERD and many endoscopists recognize such findings in their clinical practice. This review is intended to clarify the definition of minimal change esophagitis and their histology, interobserver agreement, and symptom association with GERD.

  5. Minimizing Shortness of Breath

    MedlinePlus

  6. Achieving sustainable plant disease management through evolutionary principles.

    PubMed

    Zhan, Jiasui; Thrall, Peter H; Burdon, Jeremy J

    2014-09-01

    Plants and their pathogens are engaged in continuous evolutionary battles and sustainable disease management requires novel systems to create environments conducive to short-term and long-term disease control. In this opinion article, we argue that knowledge of the fundamental factors that drive host-pathogen coevolution in wild systems can provide new insights into disease development in agriculture. Such evolutionary principles can be used to guide the formulation of sustainable disease management strategies which can minimize disease epidemics while simultaneously reducing pressure on pathogens to evolve increased infectivity and aggressiveness. To ensure agricultural sustainability, disease management programs that reflect the dynamism of pathogen population structure are essential, and evolutionary biologists should play an increasing role in their design.

  7. Overlay accuracy fundamentals

    NASA Astrophysics Data System (ADS)

    Kandel, Daniel; Levinski, Vladimir; Sapiens, Noam; Cohen, Guy; Amit, Eran; Klein, Dana; Vakshtein, Irina

    2012-03-01

    Currently, the performance of overlay metrology is evaluated mainly on the basis of random error contributions such as precision and TIS variability. With the expected shrinkage of the overlay metrology budget to < 0.5 nm, it becomes crucial to also include the systematic error contributions which affect the accuracy of the metrology. Here we discuss fundamental aspects of overlay accuracy and a methodology for improving accuracy significantly. We identify overlay mark imperfections, and their interaction with the metrology technology, as the main source of overlay inaccuracy. The most important type of mark imperfection is mark asymmetry, which leads to a geometrical ambiguity in the definition of overlay of ~1 nm or less. It is shown theoretically and in simulations that the metrology may significantly enhance the effect of overlay mark asymmetry, leading to metrology inaccuracies of ~10 nm, much larger than the geometrical ambiguity. The analysis is carried out for two different overlay metrology technologies: imaging overlay and DBO (first-order diffraction-based overlay). It is demonstrated that DBO is more sensitive to overlay mark asymmetry than imaging overlay. Finally, we show that a recently developed measurement-quality metric serves as a valuable tool for improving overlay metrology accuracy. Simulation results demonstrate that the accuracy of imaging overlay can be improved significantly by a recipe setup optimized using this quality metric. We conclude that imaging overlay metrology, complemented by appropriate use of the measurement-quality metric, yields optimal overlay accuracy.

  8. Fundamentals of Atmospheric Radiation

    NASA Astrophysics Data System (ADS)

    Bohren, Craig F.; Clothiaux, Eugene E.

    2006-02-01

    This textbook fills a gap in the literature for teaching material suitable for students of atmospheric science and courses on atmospheric radiation. It covers the fundamentals of emission, absorption, and scattering of electromagnetic radiation from the ultraviolet to the infrared and beyond. Much of the book applies to planetary atmospheres in general. The authors are physicists who teach at the largest meteorology department in the US, at Penn State. Craig F. Bohren taught the atmospheric radiation course there for 20 years without a textbook; Eugene Clothiaux has since taken over the course and added to the notes. Problems given in the text come from students, colleagues, and correspondents. The figures, designed especially for this book, are meant to ease comprehension. Discussions follow a graded approach, treating subjects such as single scattering by particles at different levels of complexity. The treatment of multiple scattering begins with a pile of plates. This simple theory introduces the concepts used in more advanced theories, such as optical thickness, single-scattering albedo, and the asymmetry parameter. The more complicated two-stream theory then takes the reader beyond the pile-of-plates model. Ideal for advanced undergraduate and graduate students of atmospheric science.
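    The pile-of-plates model mentioned above is short enough to state in full: stacks are combined incoherently, with the geometric series of reflections between them summed. The sketch below is for identical non-absorbing plates (the closed form is Stokes'); it illustrates the idea and is not code from the book.

```python
def combine(R1, T1, R2, T2):
    """Combine two plate stacks incoherently; the multiple reflections
    between the stacks sum to the 1/(1 - R1*R2) factor. Assumes each stack
    reflects equally from both faces (true for these symmetric piles)."""
    denom = 1.0 - R1 * R2
    return R1 + T1 * T1 * R2 / denom, T1 * T2 / denom

def pile_of_plates(n, r):
    """Reflectance and transmittance of n identical non-absorbing plates,
    each with single-plate reflectance r, built up one plate at a time."""
    R, T = r, 1.0 - r
    for _ in range(n - 1):
        R, T = combine(R, T, r, 1.0 - r)
    return R, T

R, T = pile_of_plates(5, 0.2)
# Matches the closed form T = (1 - r)/(1 + (n - 1)*r), with R + T = 1
```

    Optical thickness and single-scattering albedo play exactly the roles of n and r once the plates become infinitesimal layers.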

  9. DOE Fundamentals Handbook: Instrumentation and Control, Volume 1

    SciTech Connect

    Not Available

    1992-06-01

    The Instrumentation and Control Fundamentals Handbook was developed to assist nuclear facility operating contractors provide operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of instrumentation and control systems. The handbook includes information on temperature, pressure, flow, and level detection systems; position indication systems; process control systems; and radiation detection principles. This information will provide personnel with an understanding of the basic operation of various types of DOE nuclear facility instrumentation and control systems.

  10. Fundamentals of neurogastroenterology

    PubMed Central

    Wood, J; Alpers, D; Andrews, P

    1999-01-01

    Current concepts and basic principles of neurogastroenterology in relation to functional gastrointestinal disorders are reviewed. Neurogastroenterology is emphasized as a new and advancing subspecialty of clinical gastroenterology and digestive science. As such, it embraces the investigative sciences dealing with functions, malfunctions, and malformations in the brain and spinal cord, and the sympathetic, parasympathetic and enteric divisions of the autonomic innervation of the digestive tract. Somatomotor systems are included insofar as pharyngeal phases of swallowing and pelvic floor involvement in defecation, continence, and pelvic pain are concerned. Inclusion of basic physiology of smooth muscle, mucosal epithelium, and the enteric immune system in the neurogastroenterologic domain relates to requirements for compatibility with neural control mechanisms. Psychologic and psychiatric relations to functional gastrointestinal disorders are included because they are significant components of neurogastroenterology, especially in relation to projections of discomfort and pain to the digestive tract.


Keywords: enteric nervous system; brain-gut axis; autonomic nervous system; nausea; gut motility; mast cells; gastrointestinal pain; Rome II PMID:10457039

  11. Role of Fundamental Physics in Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Turyshev, Slava

    2004-01-01

    This talk will discuss the critical role that fundamental physics research plays in human space exploration. In particular, currently available technologies can already provide significant radiation reduction, minimize bone loss, increase crew productivity, and thus uniquely contribute to overall mission success. I will discuss how fundamental physics research and emerging technologies may not only further reduce the risks of space travel but also increase crew mobility, enhance safety, and increase the value of space exploration in the near future.

  12. A systems approach to theoretical fluid mechanics: Fundamentals

    NASA Technical Reports Server (NTRS)

    Anyiwo, J. C.

    1978-01-01

    A preliminary application of the underlying principles of the investigator's general system theory to the description and analyses of the fluid flow system is presented. An attempt is made to establish practical models, or elements of the general fluid flow system from the point of view of the general system theory fundamental principles. Results obtained are applied to a simple experimental fluid flow system, as test case, with particular emphasis on the understanding of fluid flow instability, transition and turbulence.

  13. Fundamentals and Techniques of Nonimaging Optics

    SciTech Connect

    O'Gallagher, J. J.; Winston, R.

    2003-07-10

    This is the final report describing a long term basic research program in nonimaging optics that has led to major advances in important areas, including solar energy, fiber optics, illumination techniques, light detectors, and a great many other applications. The term ''nonimaging optics'' refers to the optics of extended sources in systems for which image forming is not important, but effective and efficient collection, concentration, transport, and distribution of light energy is. Although some of the most widely known developments of the early concepts have been in the field of solar energy, a broad variety of other uses have emerged. Most important, under the auspices of this program in fundamental research in nonimaging optics established at the University of Chicago with support from the Office of Basic Energy Sciences at the Department of Energy, the field has become very dynamic, with new ideas and concepts continuing to develop, while applications of the early concepts continue to be pursued. While the subject began as part of classical geometrical optics, it has been extended subsequently to the wave optics domain. Particularly relevant to potential new research directions are recent developments in the formalism of statistical and wave optics, which may be important in understanding energy transport on the nanoscale. Nonimaging optics permits the design of optical systems that achieve the maximum possible concentration allowed by physical conservation laws. The earliest designs were constructed by optimizing the collection of the extreme rays from a source to the desired target: the so-called ''edge-ray'' principle. Later, new concentrator types were generated by placing reflectors along the flow lines of the ''vector flux'' emanating from lambertian emitters in various geometries. A few years ago, a new development occurred with the discovery that making the design edge-ray a functional of some other system parameter permits the construction of whole
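    The maximum concentration alluded to above follows from conservation of etendue. A minimal sketch of the standard sine-law limits, assuming the exit medium has refractive index n_out:

```python
import math

def max_concentration(theta_deg, n_out=1.0, dim=3):
    """Thermodynamic limit on geometrical concentration for a concentrator
    with acceptance half-angle theta_deg: n/sin(theta) in 2D (troughs),
    (n/sin(theta))**2 in 3D (cones and dishes)."""
    s = n_out / math.sin(math.radians(theta_deg))
    return s if dim == 2 else s * s

# A concentrator accepting +/- 30 degrees can concentrate at most 4x in 3D:
C = max_concentration(30.0)
```

    Nonimaging designs such as the edge-ray concentrators described above approach these bounds, which imaging optics generally cannot.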

  14. How not to criticize the precautionary principle.

    PubMed

    Hughes, Jonathan

    2006-10-01

    The precautionary principle has its origins in debates about environmental policy, but is increasingly invoked in bioethical contexts. John Harris and Søren Holm argue that the principle should be rejected as incoherent, irrational, and representing a fundamental threat to scientific advance and technological progress. This article argues that while there are problems with standard formulations of the principle, Harris and Holm's rejection of all its forms is mistaken. In particular, they focus on strong versions of the principle and fail to recognize that weaker forms, which may escape their criticisms, are both possible and advocated in the literature.

  15. Minimally invasive procedures

    PubMed Central

    Baltayiannis, Nikolaos; Michail, Chandrinos; Lazaridis, George; Anagnostopoulos, Dimitrios; Baka, Sofia; Mpoukovinas, Ioannis; Karavasilis, Vasilis; Lampaki, Sofia; Papaiwannou, Antonis; Karavergou, Anastasia; Kioumis, Ioannis; Pitsiou, Georgia; Katsikogiannis, Nikolaos; Tsakiridis, Kosmas; Rapti, Aggeliki; Trakada, Georgia; Zissimopoulos, Athanasios; Zarogoulidis, Konstantinos

    2015-01-01

    Minimally invasive procedures, which include laparoscopic surgery, use state-of-the-art technology to reduce the damage to human tissue when performing surgery. Minimally invasive procedures require small “ports” through which the surgeon inserts thin tubes called trocars. Carbon dioxide gas may be used to inflate the area, creating a space between the internal organs and the skin. A miniature camera (usually a laparoscope or endoscope) is then placed through one of the trocars so the surgical team can view the procedure as a magnified image on video monitors in the operating room. Specialized equipment is inserted through the trocars based on the type of surgery. Some advanced minimally invasive surgical procedures can be performed almost exclusively through a single point of entry, meaning only one small incision, as in “uniport” video-assisted thoracoscopic surgery (VATS). Not only do these procedures usually provide outcomes equivalent to those of traditional “open” surgery (which sometimes requires a large incision), but minimally invasive procedures (using small incisions) may offer significant additional benefits: (I) faster recovery; (II) shorter hospital stays; (III) less scarring; and (IV) less pain. In this mini review we present the minimally invasive procedures used in thoracic surgery. PMID:25861610

  16. Complementary Huygens Principle for Geometrical and Nongeometrical Optics

    ERIC Educational Resources Information Center

    Luis, Alfredo

    2007-01-01

    We develop a fundamental principle depicting the generalized ray formulation of optics provided by the Wigner function. This principle is formally identical to the Huygens-Fresnel principle but in terms of opposite concepts, rays instead of waves, and incoherent superpositions instead of coherent ones. This ray picture naturally includes…

  17. Fundamentals of Space Medicine

    NASA Astrophysics Data System (ADS)

    Clément, Gilles

    2005-03-01

    A total of more than 240 human space flights have been completed to date, involving about 450 astronauts from various countries, for a combined total presence in space of more than 70 years. The seventh long-duration expedition crew is currently in residence aboard the International Space Station, continuing a permanent presence in space that began in October 2000. During that time, investigations have been conducted on both humans and animal models to study the bone demineralization and muscle deconditioning, space motion sickness, the causes and possible treatment of postflight orthostatic intolerance, the changes in immune function, crew and crew-ground interactions, and the medical issues of living in a space environment, such as the effects of radiation or the risk of developing kidney stones. Some results of these investigations have led to fundamental discoveries about the adaptation of the human body to the space environment. Gilles Clément has been active in this research. This readable text presents the findings from the life science experiments conducted during and after space missions. Topics discussed in this book include: adaptation of sensory-motor, cardio-vascular, bone, and muscle systems to the microgravity of spaceflight; psychological and sociological issues of living in a confined, isolated, and stressful environment; operational space medicine, such as crew selection, training and in-flight health monitoring, countermeasures and support; results of space biology experiments on individual cells, plants, and animal models; and the impact of long-duration missions such as the human mission to Mars. The author also provides a detailed description of how to fly a space experiment, based on his own experience with research projects conducted onboard Salyut-7, Mir, Spacelab, and the Space Shuttle. Now is the time to look at the future of human spaceflight and what comes next. 
The future human exploration of Mars captures the imagination of both the

  18. Fundamentals of Space Medicine

    NASA Astrophysics Data System (ADS)

    Clément, G.

    2003-10-01

    As of today, a total of more than 240 human space flights have been completed, involving about 450 astronauts from various countries, for a combined total presence in space of more than 70 years. The seventh long-duration expedition crew is currently in residence aboard the International Space Station, continuing a permanent presence in space that began in October 2000. During that time, investigations have been conducted on both humans and animal models to study the bone demineralization and muscle deconditioning, space motion sickness, the causes and possible treatment of postflight orthostatic intolerance, the changes in immune function, crew and crew-ground interactions, and the medical issues of living in a space environment, such as the effects of radiation or the risk of developing kidney stones. Some results of these investigations have led to fundamental discoveries about the adaptation of the human body to the space environment. Gilles Clément has been active in this research. This book presents in a readable text the findings from the life science experiments conducted during and after space missions. Topics discussed in this book include: adaptation of sensory-motor, cardiovascular, bone and muscle systems to the microgravity of spaceflight; psychological and sociological issues of living in a confined, isolated and stressful environment; operational space medicine, such as crew selection, training and in-flight health monitoring, countermeasures and support; results of space biology experiments on individual cells, plants, and animal models; and the impact of long-duration missions such as the human mission to Mars. The author also provides a detailed description of how to fly a space experiment, based on his own experience with research projects conducted onboard Salyut-7, Mir, Spacelab, and the Space Shuttle. Now is the time to look at the future of human spaceflight and what comes next. 
The future human exploration of Mars captures the imagination

  19. Fundamentals of phosphate transfer.

    PubMed

    Kirby, Anthony J; Nome, Faruk

    2015-07-21

    Historically, the chemistry of phosphate transfer, a class of reactions fundamental to the chemistry of Life, has been discussed almost exclusively in terms of the nucleophile and the leaving group. Reactivity always depends significantly on both factors; but recent results for reactions of phosphate triesters have shown that it can also depend strongly on the nature of the nonleaving or "spectator" groups. The extreme stabilities of fully ionised mono- and dialkyl phosphate esters can be seen as extensions of the same effect, with one or two triester OR groups replaced by O(-). Our chosen lead reaction is hydrolysis, that is, phosphate transfer to water: because water is the medium in which biological chemistry takes place; because the half-life of a system in water is an accepted basic index of stability; and because the typical mechanisms of hydrolysis, with solvent H2O providing specific molecules to act as nucleophiles and as general acids or bases, are models for reactions involving better nucleophiles and stronger general species catalysts. Not least those available in enzyme active sites. Alkyl monoester dianions compete with alkyl diester monoanions for the slowest estimated rates of spontaneous hydrolysis. High stability at physiological pH is a vital factor in the biological roles of organic phosphates, but a significant limitation for experimental investigations. Almost all kinetic measurements of phosphate transfer reactions involving mono- and diesters have been followed by UV-visible spectroscopy using activated systems, conveniently compounds with good leaving groups. (A "good leaving group" OR* is electron-withdrawing, and can be displaced to generate an anion R*O(-) in water near pH 7.) Reactivities at normal temperatures of P-O-alkyl derivatives, better models for typical biological substrates, have typically had to be estimated: by extended extrapolation from linear free energy relationships, or from rate measurements at high temperatures.
Calculation is free
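    The high-temperature extrapolation mentioned above usually assumes Arrhenius behaviour; the function below is a generic sketch of that step (the rate constant and activation energy shown are made-up illustrative numbers, not data from this review).

```python
import math

R_GAS = 8.314  # gas constant, J/(mol K)

def arrhenius_extrapolate(k_high, T_high, Ea, T_low):
    """Extrapolate a rate constant measured at T_high (K) down to T_low (K),
    assuming a temperature-independent activation energy Ea (J/mol)."""
    return k_high * math.exp(-(Ea / R_GAS) * (1.0 / T_low - 1.0 / T_high))

# e.g. a rate measured at 250 C (523 K) extrapolated to 25 C (298 K):
k25 = arrhenius_extrapolate(k_high=1e-6, T_high=523.0, Ea=1.2e5, T_low=298.0)
```

    The many orders of magnitude between the measured and extrapolated rates are exactly why spontaneous hydrolysis of stable phosphate esters is so hard to measure directly at normal temperatures.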

  20. Optimization principles of dendritic structure

    PubMed Central

    Cuntz, Hermann; Borst, Alexander; Segev, Idan

    2007-01-01

    Background Dendrites are the most conspicuous feature of neurons. However, the principles determining their structure are poorly understood. By employing cable theory and, for the first time, graph theory, we describe dendritic anatomy solely on the basis of optimizing synaptic efficacy with minimal resources. Results We show that dendritic branching topology can be well described by minimizing the path length from the neuron's dendritic root to each of its synaptic inputs while constraining the total length of wiring. Tapering of diameter toward the dendrite tip – a feature of many neurons – optimizes charge transfer from all dendritic synapses to the dendritic root while housekeeping the amount of dendrite volume. As an example, we show how dendrites of fly neurons can be closely reconstructed based on these two principles alone. PMID:17559645
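    The two competing costs, total wiring and root-to-synapse path length, can be balanced in a simple greedy construction. The sketch below (balancing factor bf; function and variable names are ours) illustrates the principle and is not the authors' exact algorithm.

```python
import math

def grow_tree(points, root=0, bf=0.5):
    """Greedily attach each unconnected point to the tree node that minimizes
    wire_cost + bf * resulting path length back to the root.
    bf = 0 favors short total wiring; larger bf favors short paths."""
    parent = {root: None}
    path = {root: 0.0}                 # path length from the root to each node
    while len(parent) < len(points):
        best = None
        for p in range(len(points)):
            if p in parent:
                continue
            for q in parent:
                wire = math.dist(points[p], points[q])
                cost = wire + bf * (path[q] + wire)
                if best is None or cost < best[0]:
                    best = (cost, p, q, wire)
        _, p, q, wire = best
        parent[p] = q
        path[p] = path[q] + wire
    return parent, path

parent, path = grow_tree([(0, 0), (1, 0), (2, 0), (1, 1)])
```

    Sweeping bf morphs the tree from a minimum-wiring topology toward a star of direct root connections, the trade-off the abstract describes.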

  1. Minimal perceptrons for memorizing complex patterns

    NASA Astrophysics Data System (ADS)

    Pastor, Marissa; Song, Juyong; Hoang, Danh-Tai; Jo, Junghyo

    2016-11-01

    Feedforward neural networks have been investigated to understand learning and memory, as well as applied to numerous practical problems in pattern classification. It is a rule of thumb that more complex tasks require larger networks. However, the design of optimal network architectures for specific tasks is still an unsolved fundamental problem. In this study, we consider three-layered neural networks for memorizing binary patterns. We developed a new complexity measure of binary patterns, and estimated the minimal network size for memorizing them as a function of their complexity. We formulated the minimal network size for regular, random, and complex patterns. In particular, the minimal size for complex patterns, which are neither ordered nor disordered, was predicted by measuring their Hamming distances from known ordered patterns. Our predictions agree with simulations based on the back-propagation algorithm.
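The idea of scoring a binary pattern by its Hamming distance from known ordered patterns can be sketched as follows. The reference set of "ordered" patterns here (constant and alternating strings) is an illustrative assumption, not the paper's exact complexity measure.

```python
def hamming(a, b):
    """Number of positions at which two equal-length binary patterns differ."""
    return sum(x != y for x, y in zip(a, b))

def complexity(pattern, ordered):
    """Distance of a binary pattern from the nearest 'ordered' reference."""
    return min(hamming(pattern, ref) for ref in ordered)

n = 8
ordered = [
    [0] * n,                          # all zeros
    [1] * n,                          # all ones
    [i % 2 for i in range(n)],        # alternating 0101...
    [(i + 1) % 2 for i in range(n)],  # alternating 1010...
]

print(complexity([0, 0, 0, 0, 1, 1, 1, 1], ordered))  # half-and-half: distance 4
print(complexity([0, 1, 0, 1, 0, 1, 0, 1], ordered))  # perfectly ordered: distance 0
```

Patterns far from every ordered reference are the "complex" ones for which, per the abstract, larger minimal networks are predicted.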

  2. System level electrochemical principles

    NASA Technical Reports Server (NTRS)

    Thaller, L. H.

    1985-01-01

    The traditional electrochemical storage concepts are difficult to translate into high-power, high-voltage multikilowatt storage systems. Battery technology has adopted increased use of electronics and of electrochemical couples that minimize the difficulties associated with corrective measures to reduce cell-to-cell capacity dispersion. Actively cooled bipolar concepts are described which represent attractive alternative system concepts. They are projected to have higher energy densities and lower volumes than current concepts. They should be easier to scale from one capacity to another and should maintain a closer cell-to-cell capacity balance. These newer storage system concepts are easier to manage since they are designed as fully integrated batteries. These ideas are referred to as system-level electrochemistry. The hydrogen-oxygen regenerative fuel cell (RFC) is probably the best example of the integrated use of these principles.

  3. Minimum Principles in Motor Control.

    PubMed

    Engelbrecht, Sascha E.

    2001-06-01

    Minimum (or minimal) principles are mathematical laws that were first used in physics: Hamilton's principle and Fermat's principle of least time are two famous examples. In the past decade, a number of motor control theories have been proposed that are formally of the same kind as the minimum principles of physics, and some of these have been quite successful at predicting motor performance in a variety of tasks. The present paper provides a comprehensive review of this work. Particular attention is given to the relation between minimum theories in motor control and those used in other disciplines. Other issues around which the review is organized include: (1) the relation between minimum principles and structural models of motor planning and motor control, (2) the empirically-driven development of minimum principles and the danger of circular theorizing, and (3) the design of critical tests for minimum theories. Some perspectives for future research are discussed in the concluding section of the paper. Copyright 2001 Academic Press. PMID:11401453

  4. Ways To Minimize Bullying.

    ERIC Educational Resources Information Center

    Mueller, Mary Ellen; Parisi, Mary Joy

    This report delineates a series of interventions aimed at minimizing incidences of bullying in a suburban elementary school. The social services staff was scheduled to initiate an anti-bullying incentive in fall 2001 due to the increased occurrences of bullying during the prior year. The target population consisted of third- and fourth-grade…

  5. Minimally invasive periodontal therapy.

    PubMed

    Dannan, Aous

    2011-10-01

    Minimally invasive dentistry is a concept that preserves dentition and supporting structures. In periodontal treatment, minimally invasive procedures fall within periodontal surgery and comprise alternative approaches developed to allow less extensive manipulation of surrounding tissues than conventional procedures, while accomplishing the same objectives. In this review, the concept of minimally invasive periodontal surgery (MIPS) is first explained. An electronic search for all studies regarding efficacy and effectiveness of MIPS between 2001 and 2009 was conducted. For this purpose, suitable key words from Medical Subject Headings on PubMed were used to extract the required studies. All studies are demonstrated and important results are concluded. Preliminary data from case cohorts and from many studies reveal that the microsurgical access flap, in terms of MIPS, has a high potential to seal the healing wound from the contaminated oral environment by achieving and maintaining primary closure. Soft tissues are mostly preserved and minimal gingival recession is observed, an important feature to meet the demands of the patient and the clinician in the esthetic zone. However, although the potential efficacy of MIPS in the treatment of deep intrabony defects has been proved, larger studies are required to confirm and extend the reported positive preliminary outcomes.

  6. Minimizing Promotion Trauma.

    ERIC Educational Resources Information Center

    Darling, LuAnn W.; McGrath, Loraine

    1983-01-01

    Nursing administrators can minimize promotion trauma and its unnecessary cost by building awareness of the transition process, clarifying roles and expectations, and attending to the promoted employee's needs. This article will help nursing administrators develop a concept of manager care combined with programs for orientation of new managers,…

  7. Minimally invasive pancreatic surgery.

    PubMed

    Yiannakopoulou, E

    2015-12-01

    Minimally invasive pancreatic surgery is feasible and safe. Laparoscopic distal pancreatectomy should be widely adopted for benign lesions of the pancreas. Laparoscopic pancreaticoduodenectomy, although technically demanding, has a number of advantages in the setting of pancreatic ductal adenocarcinoma, including shorter hospital stay and faster recovery, allowing patients to pursue adjuvant treatment options in a timelier manner. Furthermore, it seems that progression-free survival is longer in patients undergoing laparoscopic pancreaticoduodenectomy than in those undergoing open pancreaticoduodenectomy. Minimally invasive middle pancreatectomy seems appropriate for benign or borderline tumors of the neck of the pancreas. Technological advances, including intraoperative ultrasound and intraoperative fluorescence imaging systems, are expected to facilitate the wide adoption of minimally invasive pancreatic surgery. Although the oncological outcome seems similar to that of open surgery, there are still concerns, as the majority of relevant evidence comes from retrospective studies. Large multicenter randomized studies comparing laparoscopic with open pancreatectomy, as well as robotic-assisted approaches with both open and laparoscopic approaches, are needed. The robotic approach may prove less invasive than the conventional laparoscopic approach through less traumatic intra-abdominal handling of tissues. In addition, the robotic approach could enable wide adoption of the technique by surgeons less trained in advanced laparoscopic surgery. A putative clinical benefit of minimally invasive pancreatic surgery could be an attenuated surgical stress response, leading to reduced morbidity and mortality, as well as a lack of detrimental immunosuppressive effects, especially for oncological patients. PMID:26530291

  8. Free-Energy Minimization and the Dark-Room Problem

    PubMed Central

    Friston, Karl; Thornton, Christopher; Clark, Andy

    2012-01-01

    Recent years have seen the emergence of an important new fundamental theory of brain function. This theory brings information-theoretic, Bayesian, neuroscientific, and machine learning approaches into a single framework whose overarching principle is the minimization of surprise (or, equivalently, the maximization of expectation). The most comprehensive such treatment is the “free-energy minimization” formulation due to Karl Friston (see e.g., Friston and Stephan, 2007; Friston, 2010a,b – see also Fiorillo, 2010; Thornton, 2010). A recurrent puzzle raised by critics of these models is that biological systems do not seem to avoid surprises. We do not simply seek a dark, unchanging chamber, and stay there. This is the “Dark-Room Problem.” Here, we describe the problem and further unpack the issues to which it speaks. Using the same format as the prolog of Eddington’s Space, Time, and Gravitation (Eddington, 1920) we present our discussion as a conversation between: an information theorist (Thornton), a physicist (Friston), and a philosopher (Clark). PMID:22586414

  9. 48 CFR 9904.405-40 - Fundamental requirement.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... 9904.405-40 Section 9904.405-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.405-40 Fundamental requirement. (a) Costs expressly... be subject to the same cost accounting principles governing cost allocability as allowable costs....

  10. 48 CFR 9904.405-40 - Fundamental requirement.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... 9904.405-40 Section 9904.405-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.405-40 Fundamental requirement. (a) Costs expressly... be subject to the same cost accounting principles governing cost allocability as allowable costs....

  11. 48 CFR 9904.405-40 - Fundamental requirement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... 9904.405-40 Section 9904.405-40 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.405-40 Fundamental requirement. (a) Costs expressly... be subject to the same cost accounting principles governing cost allocability as allowable costs....

  12. Religious Fundamentalism among Young Muslims in Egypt and Saudi Arabia

    ERIC Educational Resources Information Center

    Moaddel, Mansoor; Karabenick, Stuart A.

    2008-01-01

    Religious fundamentalism is conceived as a distinctive set of beliefs and attitudes toward one's religion, including obedience to religious norms, belief in the universality and immutability of its principles, the validity of its claims, and its indispensability for human happiness. Surveys of Egyptian and Saudi youth, ages 18-25, reveal that…

  13. Effects of Phonetic Context on Relative Fundamental Frequency

    ERIC Educational Resources Information Center

    Lien, Yu-An S.; Gattuccio, Caitlin I.; Stepp, Cara E.

    2014-01-01

    Purpose: The effect of phonetic context on relative fundamental frequency (RFF) was examined, in order to develop stimuli sets with minimal within-speaker variability that can be implemented in future clinical protocols. Method: Sixteen speakers with healthy voices produced RFF stimuli. Uniform utterances consisted of 3 repetitions of the same…

  14. Discrete Minimal Surface Algebras

    NASA Astrophysics Data System (ADS)

    Arnlind, Joakim; Hoppe, Jens

    2010-05-01

    We consider discrete minimal surface algebras (DMSA) as generalized noncommutative analogues of minimal surfaces in higher dimensional spheres. These algebras appear naturally in membrane theory, where sequences of their representations are used as a regularization. After showing that the defining relations of the algebra are consistent, and that one can compute a basis of the enveloping algebra, we give several explicit examples of DMSAs in terms of subsets of sln (any semi-simple Lie algebra providing a trivial example by itself). A special class of DMSAs are Yang-Mills algebras. The representation graph is introduced to study representations of DMSAs of dimension d ≤ 4, and properties of representations are related to properties of graphs. The representation graph of a tensor product is (generically) the Cartesian product of the corresponding graphs. We provide explicit examples of irreducible representations and, for coinciding eigenvalues, classify all the unitary representations of the corresponding algebras.

  15. [The anthropic principle in biology and radiobiology].

    PubMed

    Akif'ev, A P; Degtiarev, S V

    1999-01-01

    In accordance with the anthropic principle of the Universe, the physical constants of the fundamental particles of matter and the laws of their interaction are such that the appearance of man and mind becomes possible and necessary. It is suggested that some biological constants be added to the set of fundamental constants. Using the repair of DNA as an example, it is shown how a cell controls certain parameters of the Watson-Crick double helix. It is argued that the concept of the anthropic principle of the Universe in its full form, including biological constants, is a key to developing a unified theory of evolution of the Universe within the limits of scientific creationism.

  16. Fundamental symmetry tests with antihydrogen

    SciTech Connect

    Hughes, R.J.

    1992-12-31

    The prospects for testing CPT invariance and the weak equivalence principle (WEP) for antimatter with spectroscopic measurements on antihydrogen are discussed. The potential precisions of these tests are compared with those from other measurements. The arguments involving energy conservation, the behavior of neutral kaons in a gravitational field and the equivalence principle for antiparticles are reviewed in detail.

  18. A Matter of Principle: The Principles of Quantum Theory, Dirac's Equation, and Quantum Information

    NASA Astrophysics Data System (ADS)

    Plotnitsky, Arkady

    2015-10-01

    This article is concerned with the role of fundamental principles in theoretical physics, especially quantum theory. The fundamental principles of relativity will be addressed as well, in view of their role in quantum electrodynamics and quantum field theory, specifically in Dirac's work; Dirac's derivation of his relativistic equation of the electron from the principles of relativity and quantum theory is the main focus of this article. I shall also consider Heisenberg's earlier work leading him to the discovery of quantum mechanics, which inspired Dirac's work. I argue that Heisenberg's and Dirac's work was guided by their adherence to and their confidence in the fundamental principles of quantum theory. The final section of the article discusses the recent work by D'Ariano and coworkers on the principles of quantum information theory, which extends quantum theory and its principles in a new direction. This extension enabled them to offer a new derivation of Dirac's equation from these principles alone, without using the principles of relativity.

  19. Against Explanatory Minimalism in Psychiatry.

    PubMed

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell's criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein's Zettel. But attention to the context of Wittgenstein's remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation.

  1. Equivalence principles and electromagnetism

    NASA Technical Reports Server (NTRS)

    Ni, W.-T.

    1977-01-01

    The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

  2. Minimally invasive mediastinal surgery

    PubMed Central

    Melfi, Franca M. A.; Mussi, Alfredo

    2016-01-01

    In the past, mediastinal surgery was associated with the necessity of a maximum exposure, which was accomplished through various approaches. In the early 1990s, many surgical fields, including thoracic surgery, observed the development of minimally invasive techniques. These included video-assisted thoracic surgery (VATS), which confers clear advantages over an open approach, such as less trauma, short hospital stay, improved cosmetic results and preservation of lung function. However, VATS is associated with several disadvantages. For this reason, it is not routinely performed for resection of mediastinal mass lesions, especially those located in the anterior mediastinum, a tiny and remote space that contains vital structures at risk of injury. Robotic systems can overcome the limits of VATS, offering three-dimensional (3D) vision and wristed instrumentations, and are being increasingly used. With regard to thymectomy for myasthenia gravis (MG), unilateral and bilateral VATS approaches have demonstrated good long-term neurologic results with low complication rates. Nevertheless, some authors still advocate the necessity of maximum exposure, especially when considering the distribution of normal and ectopic thymic tissue. In recent studies, the robotic approach has been shown to provide similar neurological outcomes when compared to transsternal and VATS approaches, and is associated with low morbidity. Importantly, through a unilateral robotic technique, it is possible to dissect and remove at least the same amount of mediastinal fat tissue. Preliminary results on early-stage thymomatous disease indicated that minimally invasive approaches are safe and feasible, with a low rate of pleural recurrence, underlining the necessity of a “no-touch” technique. However, especially for thymomatous disease characterized by an indolent nature, further studies with long follow-up period are necessary in order to assess oncologic and neurologic results through minimally

  3. Minimally refined biomass fuel

    DOEpatents

    Pearson, Richard K.; Hirschfeld, Tomas B.

    1984-01-01

    A minimally refined fluid composition, suitable as a fuel mixture and derived from biomass material, is comprised of one or more water-soluble carbohydrates such as sucrose, one or more alcohols having less than four carbons, and water. The carbohydrate provides the fuel source; water solubilizes the carbohydrates; and the alcohol aids in the combustion of the carbohydrate and reduces the viscosity of the carbohydrate/water solution. Because less energy is required to obtain the carbohydrate from the raw biomass than to obtain alcohol, an overall energy savings is realized compared to fuels employing alcohol as the primary fuel.

  4. Minimal E6 unification

    NASA Astrophysics Data System (ADS)

    Susič, Vasja

    2016-06-01

    A realistic model in the class of renormalizable supersymmetric E6 Grand Unified Theories is constructed. Its matter sector consists of 3 × 27 representations, while the Higgs sector is 27 + \overline{27} + 351' + \overline{351'} + 78. An analytic solution for a Standard Model vacuum is found and the Yukawa sector analyzed. It is argued that if one considers the increased predictability due to only two symmetric Yukawa matrices in this model, it can be considered a minimal SUSY E6 model with this type of matter sector. This contribution is based on Ref. [1].

  5. Fundamentals of natural computing: an overview

    NASA Astrophysics Data System (ADS)

    de Castro, Leandro Nunes

    2007-03-01

    Natural computing is a terminology introduced to encompass three classes of methods: (1) those that take inspiration from nature for the development of novel problem-solving techniques; (2) those that are based on the use of computers to synthesize natural phenomena; and (3) those that employ natural materials (e.g., molecules) to compute. The main fields of research that compose these three branches are the artificial neural networks, evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, and quantum computing, among others. This paper provides an overview of the fundamentals of natural computing, particularly the fields listed above, emphasizing the biological motivation, some design principles, their scope of applications, current research trends and open problems. The presentation is concluded with a discussion about natural computing, and when it should be used.

  6. Fundamentals of bipolar high-frequency surgery.

    PubMed

    Reidenbach, H D

    1993-04-01

    In endoscopic surgery a very precise surgical dissection technique and an efficient hemostasis are of decisive importance. The bipolar technique may be regarded as a method which satisfies both requirements, especially regarding a high safety standard in application. In this context the biophysical and technical fundamentals of this method, which have been known in principle for a long time, are described with regard to the special demands of a newly developed field of modern surgery. After classification of this method into a general and a quasi-bipolar mode, various technological solutions of specific bipolar probes, in a strict and in a generalized sense, are characterized in terms of indication. Experimental results obtained with different bipolar instruments and probes are given. The application of modern microprocessor-controlled high-frequency surgery equipment and, wherever necessary, the integration of additional ancillary technology into the specialized bipolar instruments may result in most useful and efficient tools of a key technology in endoscopic surgery.

  7. Design of the fundamental power coupler and photocathode inserts for the 112MHz superconducting electron gun

    SciTech Connect

    Xin, T.; Ben-Zvi, I.; Belomestnykh, S.; Chang, X.; Rao, T.; Skaritka, J.; Wu, Q.; Wang, E.; Liang, X.

    2011-07-25

    A 112 MHz superconducting quarter-wave resonator electron gun will be used as the injector of the Coherent Electron Cooling (CEC) proof-of-principle experiment at BNL. Furthermore, this electron gun can serve as a testing cavity for various photocathodes. In this paper, we present the design of the cathode stalks and a Fundamental Power Coupler (FPC) designed for the future experiments. Two types of cathode stalks are discussed. A specially shaped stalk is used in order to minimize the RF power loss. The location of the cathode plane is also optimized to enable the extraction of a low-emittance beam. The coaxial-waveguide FPC offers a tunable coupling factor and little interference with the electron beam output. The optimization of the coupling factor and the location of the FPC are discussed in detail. Based on transmission-line theory, we designed a half-wavelength cathode stalk which significantly reduces the voltage drop between the cavity and the stalk, from more than 5.6 kV to 0.1 kV. The transverse field distribution on the cathode has been optimized by carefully choosing the position of the cathode stalk inside the cavity. Moreover, in order to decrease the RF power loss, a variable-diameter cathode stalk has been adopted; compared to a stalk of uniform diameter, this design gives much smaller power losses at critical locations. We also propose a fundamental power coupler based on the designed beam parameters for the future proof-of-principle CEC experiment. This FPC should provide strong enough coupling, with an external Q ranging from 1.5 × 10^7 to 2.6 × 10^8.
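As a rough sanity check on the half-wavelength stalk, the free-space half wavelength at 112 MHz is easy to compute. The actual electrical length of the shaped, dielectrically loaded stalk will differ from this vacuum value.

```python
c = 299_792_458.0   # speed of light in vacuum, m/s
f = 112e6           # cavity resonant frequency, Hz

wavelength = c / f            # free-space wavelength, ~2.68 m
print(wavelength / 2)         # half-wave length, ~1.34 m
```

Tapering the stalk diameter changes its local characteristic impedance, which is how the design shifts voltage nodes and reduces losses at critical locations without changing the resonant condition.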

  8. Logarithmic superconformal minimal models

    NASA Astrophysics Data System (ADS)

    Pearce, Paul A.; Rasmussen, Jørgen; Tartaglia, Elena

    2014-05-01

    The higher fusion level logarithmic minimal models {\cal LM}(P,P';n) have recently been constructed as the diagonal GKO cosets (A_1^{(1)})_k \oplus (A_1^{(1)})_n / (A_1^{(1)})_{k+n} where n ≥ 1 is an integer fusion level and k = nP/(P'-P) - 2 is a fractional level. For n = 1, these are the well-studied logarithmic minimal models {\cal LM}(P,P') ≡ {\cal LM}(P,P';1). For n ≥ 2, we argue that these critical theories are realized on the lattice by n × n fusion of the n = 1 models. We study the critical fused lattice models {\cal LM}(p,p')_{n×n} within a lattice approach and focus our study on the n = 2 models. We call these logarithmic superconformal minimal models {\cal LSM}(p,p') ≡ {\cal LM}(P,P';2) where P = |2p - p'|, P' = p' and p, p' are coprime. These models share the central charges c = c^{P,P';2} = \frac{3}{2}\big(1 - 2(P'-P)^2/(PP')\big) of the rational superconformal minimal models {\cal SM}(P,P'). Lattice realizations of these theories are constructed by fusing 2 × 2 blocks of the elementary face operators of the n = 1 logarithmic minimal models {\cal LM}(p,p'). Algebraically, this entails the fused planar Temperley-Lieb algebra, which is a spin-1 Birman-Murakami-Wenzl tangle algebra with loop fugacity β_2 = [3]_x = x^2 + 1 + x^{-2} and twist ω = x^4 where x = e^{iλ} and λ = (p'-p)π/p'. The first two members of this n = 2 series are superconformal dense polymers {\cal LSM}(2,3) with c = -\frac{5}{2}, β_2 = 0 and superconformal percolation {\cal LSM}(3,4) with c = 0, β_2 = 1. We calculate the bulk and boundary free energies analytically. By numerically studying finite-size conformal spectra on the strip with appropriate boundary conditions, we argue that, in the continuum scaling limit, these lattice models are associated with the logarithmic superconformal models {\cal LM}(P,P';2). For system size N, we propose finitized Kac character formulae of the form q^{-c^{P,P';2}/24 + \Delta^{P,P';2}_{r
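The two central charges quoted in the abstract follow directly from the stated formula c = (3/2)(1 - 2(P'-P)^2/(PP')) together with P = |2p - p'|, P' = p'. A quick exact-arithmetic check:

```python
from fractions import Fraction

def central_charge(P, Pp):
    """c = (3/2) * (1 - 2 (P' - P)^2 / (P P')) for the n = 2 superconformal series."""
    return Fraction(3, 2) * (1 - Fraction(2 * (Pp - P) ** 2, P * Pp))

# LSM(2,3): p = 2, p' = 3, so P = |2*2 - 3| = 1, P' = 3 (superconformal dense polymers)
print(central_charge(1, 3))  # -5/2

# LSM(3,4): p = 3, p' = 4, so P = |2*3 - 4| = 2, P' = 4 (superconformal percolation)
print(central_charge(2, 4))  # 0
```

Using `Fraction` keeps the arithmetic exact, so the values -5/2 and 0 come out as rationals rather than floating-point approximations.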

  9. Fundamentals of freeze-drying.

    PubMed

    Nail, Steven L; Jiang, Shan; Chongprasert, Suchart; Knopp, Shawn A

    2002-01-01

    Given the increasing importance of reducing development time for new pharmaceutical products, formulation and process development scientists must continually look for ways to "work smarter, not harder." Within the product development arena, this means reducing the amount of trial and error empiricism in arriving at a formulation and identification of processing conditions which will result in a quality final dosage form. Characterization of the freezing behavior of the intended formulation is necessary for developing processing conditions which will result in the shortest drying time while maintaining all critical quality attributes of the freeze-dried product. Analysis of frozen systems was discussed in detail, particularly with respect to the glass transition as the physical event underlying collapse during freeze-drying, eutectic mixture formation, and crystallization events upon warming of frozen systems. Experiments to determine how freezing and freeze-drying behavior is affected by changes in the composition of the formulation are often useful in establishing the "robustness" of a formulation. It is not uncommon for seemingly subtle changes in composition of the formulation, such as a change in formulation pH, buffer salt, drug concentration, or an additional excipient, to result in striking differences in freezing and freeze-drying behavior. With regard to selecting a formulation, it is wise to keep the formulation as simple as possible. If a buffer is needed, a minimum concentration should be used. The same principle applies to added salts: If used at all, the concentration should be kept to a minimum. For many proteins a combination of an amorphous excipient, such as a disaccharide, and a crystallizing excipient, such as glycine, will result in a suitable combination of chemical stability and physical stability of the freeze-dried solid. Concepts of heat and mass transfer are valuable in rational design of processing conditions. 

  10. Minimal length in quantum gravity and gravitational measurements

    NASA Astrophysics Data System (ADS)

    Farag Ali, Ahmed; Khalil, Mohammed M.; Vagenas, Elias C.

    2015-10-01

    The existence of a minimal length is a common prediction of various theories of quantum gravity. This minimal length leads to a modification of the Heisenberg uncertainty principle to a Generalized Uncertainty Principle (GUP). Various studies showed that a GUP modifies the Hawking radiation of black holes. In this paper, we propose a modification of the Schwarzschild metric based on the modified Hawking temperature derived from the GUP. Based on this modified metric, we calculate corrections to the deflection of light, time delay of light, perihelion precession, and gravitational redshift. We compare our results with gravitational measurements to set an upper bound on the GUP parameter.
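A minimal numerical sketch of how a GUP yields a minimal length, assuming the commonly used form Δx ≥ (ħ/2Δp)(1 + β Δp²); this specific form and the value of β below are illustrative assumptions, not the paper's parametrization or bound.

```python
import math

hbar = 1.0545718e-34  # reduced Planck constant, J s

def delta_x_bound(dp, beta):
    """Lower bound on position uncertainty for a common GUP form:
    dx >= (hbar / (2 dp)) * (1 + beta * dp**2)."""
    return (hbar / (2 * dp)) * (1 + beta * dp ** 2)

# The bound is minimized at dp = 1/sqrt(beta), where both terms are equal,
# giving the minimal length dx_min = hbar * sqrt(beta).
beta = 1e60           # illustrative GUP parameter, units of momentum^-2
dp_star = 1 / math.sqrt(beta)
print(delta_x_bound(dp_star, beta), hbar * math.sqrt(beta))
```

Unlike the ordinary Heisenberg bound, which can be made arbitrarily small by letting Δp grow, here the β Δp² term turns the bound around at large Δp, so Δx can never fall below ħ√β.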

  11. Water Balance Covers For Waste Containment: Principles and Practice

    EPA Science Inventory

    Water Balance Covers for Waste Containment: Principles and Practices introduces water balance covers and compares them with conventional approaches to waste containment. The authors provide a detailed analysis of the fundamentals of soil physics and design issues, introduce appl...

  12. A minimal fate-selection switch.

    PubMed

    Weinberger, Leor S

    2015-12-01

    To preserve fitness in unpredictable, fluctuating environments, a range of biological systems probabilistically generate variant phenotypes--a process often referred to as 'bet-hedging', after the financial practice of diversifying assets to minimize risk in volatile markets. The molecular mechanisms enabling bet-hedging have remained elusive. Here, we review how HIV makes a bet-hedging decision between active replication and proviral latency, a long-lived dormant state that is the chief barrier to an HIV cure. The discovery of a virus-encoded bet-hedging circuit in HIV revealed an ancient evolutionary role for latency and identified core regulatory principles, such as feedback and stochastic 'noise', that enable cell-fate decisions. These core principles were later extended to fate selection in stem cells and cancer, exposed new therapeutic targets for HIV, and led to a potentially broad strategy of using 'noise modulation' to redirect cell fate. PMID:26611210
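
    The advantage of bet-hedging described above comes from long-run (geometric-mean) fitness in a fluctuating environment. A toy model illustrates the point; the environment probabilities, growth factors, and dormant fraction below are invented for illustration and are not taken from the reviewed HIV work:

```python
import math

def long_run_growth(h, p_good=0.7, g_good=2.0, g_bad=0.05):
    """Expected log growth rate (log geometric-mean fitness) for a population
    keeping a fraction h of individuals dormant (growth factor 1.0) while the
    rest replicate actively: factor g_good in good years, g_bad in bad years.
    All parameter values are illustrative toy numbers."""
    grow_good = (1 - h) * g_good + h * 1.0   # population growth factor, good year
    grow_bad = (1 - h) * g_bad + h * 1.0     # population growth factor, bad year
    return p_good * math.log(grow_good) + (1 - p_good) * math.log(grow_bad)

# An all-active population is wiped out by rare bad years (negative log rate),
# while hedging 30% into dormancy yields positive long-term growth.
print(long_run_growth(0.0), long_run_growth(0.3))
```

    With these numbers the deterministic "all-in" strategy shrinks on average while the hedged strategy grows, even though hedging lowers the arithmetic-mean growth in good years.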

  13. Development of Canonical Transformations from Hamilton's Principle.

    ERIC Educational Resources Information Center

    Quade, C. Richard

    1979-01-01

    The theory of canonical transformations and its development are discussed with regard to its application to Hamilton's principle. Included are the derivation of the equations of motion and a lack of symmetry in the formulation with respect to the Lagrangian and the fundamental commutator relations of quantum mechanics. (Author/SA)

  14. Principles of Guided Missiles and Nuclear Weapons.

    ERIC Educational Resources Information Center

    Naval Personnel Program Support Activity, Washington, DC.

    Fundamentals of missile and nuclear weapons systems are presented in this book which is primarily prepared as the second text of a three-volume series for students of the Navy Reserve Officers' Training Corps and the Officer Candidate School. Following an introduction to guided missiles and nuclear physics, basic principles and theories are…

  15. The minimal length and quantum partition functions

    NASA Astrophysics Data System (ADS)

    Abbasiyan-Motlaq, M.; Pedram, P.

    2014-08-01

    We study the thermodynamics of various physical systems in the framework of the generalized uncertainty principle that implies a minimal length uncertainty proportional to the Planck length. We present a general scheme to analytically calculate the quantum partition function of the physical systems to first order of the deformation parameter based on the behavior of the modified energy spectrum and compare our results with the classical approach. Also, we find the modified internal energy and heat capacity of the systems for the anti-Snyder framework.
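
    The first-order scheme described above amounts to expanding Z = Σₙ exp(−Eₙ/kT) with Eₙ = Eₙ⁽⁰⁾ + δEₙ and keeping terms linear in the deformation: Z ≈ Z₀ − (1/kT) Σₙ δEₙ exp(−Eₙ⁽⁰⁾/kT). A minimal numerical sketch for a harmonic oscillator, using a hypothetical correction δEₙ = α(n+1/2)² of the quadratic form often quoted for GUP-deformed oscillators (units with ħω = k_B = 1; α is an illustrative small parameter, not this paper's value):

```python
import math

def partition_first_order(T, alpha, nmax=2000):
    """Partition function for E_n = (n + 1/2) + alpha*(n + 1/2)**2,
    expanded to first order in the deformation parameter alpha.
    Units: hbar*omega = k_B = 1; the correction form is illustrative."""
    beta = 1.0 / T
    z0 = corr = 0.0
    for n in range(nmax):
        e0 = n + 0.5                    # unperturbed spectrum
        w = math.exp(-beta * e0)        # Boltzmann weight
        z0 += w
        corr += (e0 ** 2) * w           # sum of delta-E_n times the weight
    return z0 - beta * alpha * corr     # Z ~ Z0 - beta * sum(dE * exp(-beta*E0))

# A positive deformation raises every level, so Z decreases relative to Z0.
print(partition_first_order(1.0, 0.0), partition_first_order(1.0, 1e-3))
```

    At α = 0 the sum reproduces the exact oscillator result Z₀ = 1/(2 sinh(1/2T)), a useful consistency check before adding the correction.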

  16. Minimizing fan energy costs

    SciTech Connect

    Monroe, R.C.

    1985-05-27

    Minimizing fan energy costs and maximizing fan efficiency is the subject of this paper. Blade design itself can cause poor flow distribution and inefficiency. A basic design criterion is that a blade should produce uniform flow over the entire plane of the fan. Also an inherent problem with the axial fan is swirl -- the tangential deflection of exit-flow caused by the effect of torque. Swirl can be prevented with an inexpensive hub component. Basic efficiency can be checked by means of the fan's performance curve. Generally, fewer blades translate into higher axial-fan efficiency. A crowded inboard area creates hub turbulence which lessens efficiency. Whether the pitch of fan blades is fixed or variable also affects energy consumption. Power savings of 50% per year or more can be realized by replacing fixed-pitch, continuously operating fans with fans whose blade pitch or speed is automatically varied.
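
    The large savings quoted for variable-pitch or variable-speed operation follow from the fan affinity laws, under which flow scales with speed, pressure with speed squared, and shaft power with speed cubed (an idealized relation; real installations deviate somewhat):

```python
def power_ratio(speed_fraction: float) -> float:
    """Fan affinity laws: flow ~ N, pressure ~ N^2, power ~ N^3.
    Returns shaft power as a fraction of full-speed power."""
    return speed_fraction ** 3

# A fan slowed to 80% speed draws only about half the power,
# which is the mechanism behind the savings figures cited above.
saving = 1.0 - power_ratio(0.8)
print(f"{saving:.0%}")
```

    This cube-law sensitivity is why trimming speed or pitch during low-demand periods beats throttling a continuously running fixed-pitch fan.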

  17. Transanal Minimally Invasive Surgery

    PubMed Central

    deBeche-Adams, Teresa; Nassif, George

    2015-01-01

    Transanal minimally invasive surgery (TAMIS) was first described in 2010 as a crossover between single-incision laparoscopic surgery and transanal endoscopic microsurgery (TEM) to allow access to the proximal and mid-rectum for resection of benign and early-stage malignant rectal lesions. The TAMIS technique can also be used for noncurative intent surgery of more advanced lesions in patients who are not candidates for radical surgery. Proper workup and staging should be done before surgical decision-making. In addition to the TAMIS port, instrumentation and set up include readily available equipment found in most operating suites. TAMIS has proven its usefulness in a wide range of applications outside of local excision, including repair of rectourethral fistula, removal of rectal foreign body, control of rectal hemorrhage, and as an adjunct in total mesorectal excision for rectal cancer. TAMIS is an easily accessible, technically feasible, and cost-effective alternative to TEM. PMID:26491410

  18. [Minimal invasive implantology].

    PubMed

    Bruck, N; Zagury, A; Nahlieli, O

    2015-07-01

    Endoscopic surgery has changed the philosophy and practice of modern surgery in all aspects of medicine. It gave rise to minimally invasive surgical procedures based on the ability to visualize and to operate via small channels. In maxillofacial surgery, the ability to see the surgical field clearly opened an entirely new world of exploration, as conditions that were once almost impossible to control, and whose outcome was uncertain, can now be predictably managed. In this article we describe the advantages of using the oral endoscope during dental implantology procedures, and we describe a unique implant which, in combination with the oral endoscope, enables a maxillary sinus lift without the need for major surgery with all of its risks and complications.

  19. [Minimally invasive breast surgery].

    PubMed

    Mátrai, Zoltán; Gulyás, Gusztáv; Kunos, Csaba; Sávolt, Akos; Farkas, Emil; Szollár, András; Kásler, Miklós

    2014-02-01

    Due to developments in medical science and industrial technology, minimally invasive procedures have appeared in the surgery of benign and malignant breast diseases. In general, such interventions result in significantly reduced breast and chest wall scars, shorter hospitalization and less pain, but they require specific, expensive devices and longer surgical time compared to open surgery. Furthermore, indications and oncological safety have not been established yet. It is quite likely that minimally invasive surgical procedures with high-tech devices - similar to other surgical subspecialties - will gradually become popular and may even form part of routine breast surgery. Vacuum-assisted core biopsy with a therapeutic indication is suitable for the removal of benign fibroadenomas leaving behind an almost invisible scar, while endoscopically assisted skin-sparing and nipple-sparing mastectomy, axillary staging and reconstruction with latissimus dorsi muscle flap are all feasible through the same short axillary incision. Endoscopic techniques are also suitable for the diagnostics and treatment of intracapsular complications of implant-based breast reconstructions (intracapsular fluid, implant rupture, capsular contracture) and for the biopsy of intracapsular lesions with uncertain pathology. Perception of the role of radiofrequency ablation of breast tumors requires further hands-on experience, but it is likely that it can serve as a replacement of surgical removal in a portion of primary tumors in the future due to the development in functional imaging and anticancer drugs. With the reduction of the price of ductoscopes, routine examination of the ductal branch system, guided microdochectomy and targeted surgical removal of terminal ducto-lobular units or a "sick lobe" as an anatomical unit may become feasible. The paper presents the experience of the authors and provides a literature review, for the first time in Hungarian, on the subject. Orv. Hetil.

  20. Minimally invasive parathyroid surgery

    PubMed Central

    Noureldine, Salem I.; Gooi, Zhen

    2015-01-01

    Traditionally, bilateral cervical exploration for localization of all four parathyroid glands and removal of any that are grossly enlarged has been the standard surgical treatment for primary hyperparathyroidism (PHPT). With the advances in preoperative localization studies and greater public demand for less invasive procedures, novel targeted, minimally invasive techniques to the parathyroid glands have been described and practiced over the past 2 decades. Minimally invasive parathyroidectomy (MIP) can be done either through the standard Kocher incision, a smaller midline incision, with video assistance (purely endoscopic and video-assisted techniques), or through an ectopically placed, extracervical, incision. In current practice, once PHPT is diagnosed, preoperative evaluation using high-resolution radiographic imaging to localize the offending parathyroid gland is essential if MIP is to be considered. The imaging study results suggest where the surgeon should begin the focused procedure and serve as a road map to allow tailoring of an efficient, imaging-guided dissection while eliminating the unnecessary dissection of multiple glands or a bilateral exploration. Intraoperative parathyroid hormone (IOPTH) levels may be measured during the procedure, or a gamma probe used during radioguided parathyroidectomy, to ascertain that the correct gland has been excised and that no other hyperfunctional tissue is present. MIP has many advantages over the traditional bilateral, four-gland exploration. MIP can be performed using local anesthesia, requires less operative time, results in fewer complications, and offers an improved cosmetic result and greater patient satisfaction. Additional advantages of MIP are earlier hospital discharge and decreased overall associated costs. This article aims to address the considerations for accomplishing MIP, including the role of preoperative imaging studies, intraoperative adjuncts, and surgical techniques. PMID:26425454

  1. Principles of nanoscience: an overview.

    PubMed

    Behari, Jitendra

    2010-10-01

    The scientific basis of nanotechnology as envisaged from first principles is compared to bulk behavior. The development of nanoparticles having controllable physical and electronic properties has opened up the possibility of designing artificial solids. Top-down and bottom-up approaches are emphasized. The role of nanoparticle (quantum dot) applications in nanophotonics (photovoltaic cells) and as drug delivery vehicles is discussed. Fundamentals of DNA structure as the prime site in bionanotechnological manipulations are also discussed. A summary of presently available devices and applications is presented. PMID:21299044

  2. Equivalence principle in Chameleon models .

    NASA Astrophysics Data System (ADS)

    Kraiselburd, L.; Landau, S.; Salgado, M.; Sudarsky, D.

    Most theories that predict time and/or space variation of fundamental constants also predict violations of the Weak Equivalence Principle (WEP). Khoury and Weltman proposed the chameleon model in 2004 and claimed that it avoids experimental bounds on the WEP. We present a contrasting view based on an approximate calculation of the two-body problem for the chameleon field and show that the force depends on the test body composition. Furthermore, we compare the predicted force on a test body with Eötvös-type experiments and find that the chameleon field effect cannot account for current bounds.

  3. Magnetism: Principles and Applications

    NASA Astrophysics Data System (ADS)

    Craik, Derek J.

    2003-09-01

    If you are studying physics, chemistry, materials science, electrical engineering, information technology or medicine, then you'll know that understanding magnetism is fundamental to success in your studies, and here is the key to unlocking its mysteries. You can: obtain a simple overview of magnetism, including the roles of B and H, resonances and special techniques; take full advantage of modern magnets with a wealth of expressions for fields and forces; develop realistic general design programmes using isoparametric finite elements; study the subtleties of the general theory of magnetic moments and their dynamics; follow the development of outstanding materials; appreciate how magnetism encompasses topics as diverse as rock magnetism, chemical reaction rates, biological compasses, medical therapies, superconductivity and levitation; understand the basis and remarkable achievements of magnetic resonance imaging. In his new book, Magnetism, Derek Craik throws light on the principles and applications of this fascinating subject. From formulae for calculating fields to quantum theory, the secrets of magnetism are exposed, ensuring that whether you are a chemist or engineer, physicist, medic or materials scientist, Magnetism is the book for your course.

  4. A Colorful Demonstration of Le Châtelier's Principle.

    ERIC Educational Resources Information Center

    Last, Arthur M.; Slade, Peter W.

    1997-01-01

    Le Châtelier's Principle states that, when a system at equilibrium is subjected to stress, the system will respond in such a way as to minimize the effect of the stress. Describes a lecture demonstration that illustrates shifts in the position of equilibrium caused by a variety of factors. The equilibrium mixture contains iron (III) and…

  5. The minimal autopoietic unit.

    PubMed

    Luisi, Pier Luigi

    2014-12-01

    It is argued that closed, cell-like compartments may have existed in prebiotic times, showing a simplified metabolism which brought about a primitive form of stationary state - a kind of homeostasis. The autopoietic primitive cell can be taken as an example, and there are preliminary experimental data supporting the possible existence of this primitive form of cell activity. The genetic code permits, among other things, the continuous self-reproduction of proteins; enzymic proteins permit the synthesis of nucleic acids, and in this way there is a perfect recycling between the two most important classes of biopolymers in our life. On the other hand, the genetic code is a complex machinery, which cannot be posited at the very early time of the origin of life. The question then arises whether some form of alternative beginning, prior to the genetic code, would have been possible: this is the core of the question asked. Is something with the flavor of early life conceivable, prior to the genetic code? My answer is positive, although I am well aware that the term "conceivable" does not mean that this something is easily performed experimentally. To illustrate my answer, I would first go back to the operational description of cellular life as given by the theory of autopoiesis. Accordingly, a living cell is an open system capable of self-maintenance, due to a process of internal self-regeneration of the components, all within a boundary which is itself produced from within. This is a universal code, valid not only for a cell, but for any living macroscopic entity, as no living system exists on Earth which does not obey this principle. In this definition (or better, operational description) there is no mention of DNA or the genetic code.
I added in that definition the term "open system" - which is not present in the primary literature (Varela et al., 1974) - to make clear that every living system is indeed an open system; without this addition, it may seem that

  6. Gauge unification of fundamental forces

    NASA Astrophysics Data System (ADS)

    Salam, Abdus

    The following sections are included: * I. Fundamental Particles, Fundamental Forces, and Gauge Unification * II. The Emergence of Spontaneously Broken SU(2)×U(1) Gauge Theory * III. The Present and Its Problems * IV. Direct Extrapolation from the Electroweak to the Electronuclear * A. The three ideas * B. Tests of electronuclear grand unification * V. Elementarity: Unification with Gravity and Nature of Charge * A. The quest for elementarity, prequarks (preons and pre-preons) * B. Post-Planck physics, supergravity, and Einstein's dreams * C. Extended supergravity, SU(8) preons, and composite gauge fields * Appendix A: Examples of Grand Unifying Groups * Appendix B: Does the Grand Plateau really exist? * References

  7. Superpower nuclear minimalism

    SciTech Connect

    Graben, E.K.

    1992-01-01

    During the Cold War, the United States and the Soviet Union competed in building weapons -- now it seems as if America and Russia are competing to get rid of them the fastest. The lengthy process of formal arms control has been replaced by exchanges of unilateral force reductions and proposals for reciprocal reductions not necessarily codified by treaty. Should superpower nuclear strategies change along with force postures? President Bush has yet to make a formal pronouncement on post-Cold War American nuclear strategy, and it is uncertain whether the Soviet/Russian doctrine of reasonable sufficiency formulated in the Gorbachev era actually heralds a change in strategy. Some of the provisions in the most recent round of unilateral proposals put forth by Presidents Bush and Yeltsin in January 1992 are compatible with a change in strategy. Whether such a change has actually occurred remains to be seen. With the end of the Cold War and the breakup of the Soviet Union, the strategic environment has fundamentally changed, so it would seem logical to reexamine strategy as well. There are two main schools of nuclear strategic thought: a maximalist school, mutual assured destruction (MAD), which emphasizes counterforce superiority and nuclear war-fighting capability, and a MAD-plus school, which emphasizes survivability of an assured destruction capability along with the ability to deliver small, limited nuclear attacks in the event that conflict occurs. The MAD-plus strategy is based on an attempt to conventionalize nuclear weapons, which is unrealistic.

  8. Minimally legally invasive dentistry.

    PubMed

    Lam, R

    2014-12-01

    One disadvantage of the rapid advances in modern dentistry is that treatment options have never been more varied or confusing. Compounded by a more educated population greatly assisted by online information in an increasingly litigious society, a major concern in recent times is increased litigation against health practitioners. The manner in which courts handle disputes is ambiguous and what is considered fair or just may not be reflected in the judicial process. Although legal decisions in Australia follow a doctrine of precedent, the law is not static and is often reflected by community sentiment. In medical litigation, this has seen the rejection of the Bolam principle with a preference towards greater patient rights. Recent court decisions may change the practice of dentistry and it is important that the clinician is not caught unaware. The aim of this article is to discuss legal issues that are pertinent to the practice of modern dentistry through an analysis of legal cases that have shaped health law. Through these discussions, the importance of continuing professional development, professional association and informed consent will be realized as a means to limit the legal complications of dental practice.

  9. Minimizing Accidents and Risks in High Adventure Outdoor Pursuits.

    ERIC Educational Resources Information Center

    Meier, Joel

    The fundamental dilemma in adventure programming is eliminating unreasonable risks to participants without also reducing levels of excitement, challenge, and stress. Most accidents are caused by a combination of unsafe conditions, unsafe acts, and errors in judgment. The best and only way to minimize critical human error in adventure programs is…

  10. A minimal lentivirus Tat.

    PubMed Central

    Derse, D; Carvalho, M; Carroll, R; Peterlin, B M

    1991-01-01

    Transcriptional regulatory mechanisms found in lentiviruses employ RNA enhancer elements called trans-activation responsive (TAR) elements. These nascent RNA stem-loops are cis-acting targets of virally encoded Tat effectors. Interactions between Tat and TAR increase the processivity of transcription complexes and lead to efficient copying of viral genomes. To study essential elements of this trans activation, peptide motifs from Tats of two distantly related lentiviruses, equine infectious anemia virus (EIAV) and human immunodeficiency virus type 1 (HIV-1), were fused to the coat protein of bacteriophage R17 and tested on the long terminal repeat of EIAV, where TAR was replaced by the R17 operator, the target of the coat protein. This independent RNA-tethering mechanism mapped activation domains of Tats from HIV-1 and EIAV to 47 and 15 amino acids and RNA-binding domains to 10 and 26 amino acids, respectively. Thus, a minimal lentivirus Tat consists of 25 amino acids, of which 15 modify viral transcription and 10 bind to the target RNA stem-loop. Images PMID:1658392

  11. The Elements and Principles of Design: A Baseline Study

    ERIC Educational Resources Information Center

    Adams, Erin

    2013-01-01

    Critical to the discipline, both professionally and academically, are the fundamentals of interior design. These fundamentals include the elements and principles of interior design: the commonly accepted tools and vocabulary used to create and communicate successful interior environments. Research indicates a lack of consistency in both the…

  12. A review of the generalized uncertainty principle.

    PubMed

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed. PMID:26512022

  13. A review of the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Nasser Tawfik, Abdel; Magied Diab, Abdel

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some ‘thought’ experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed.

  15. Status of Fundamental Physics Program

    NASA Technical Reports Server (NTRS)

    Lee, Mark C.

    2003-01-01

    Update of the Fundamental Physics Program. JEM/EF slip: 2-year delay. Reduced budget. Community support and advocacy led by Professor Nick Bigelow. Reprogramming led by the Fred O'Callaghan/JPL team. LTMPF M1 mission (DYNAMX and SUMO). PARCS. Carrier re-baselined on JEM/EF.

  16. Fundamental Practices of Curriculum Development.

    ERIC Educational Resources Information Center

    Usova, George M.; Gibson, Marcia

    Designed to give guidance to those involved in the curriculum development process within the Shipyard Training Modernization Program (STMP), this guide provides an understanding of the fundamental practices followed in the curriculum development process. It also demonstrates incorrect and correct approaches to the development of the curriculum…

  17. Light as a Fundamental Particle

    ERIC Educational Resources Information Center

    Weinberg, Steven

    1975-01-01

    Presents two arguments concerning the role of the photon. One states that the photon is just another particle distinguished by a particular value of charge, spin, mass, lifetime, and interaction properties. The second states that the photon plays a fundamental role with a deep relation to ultimate formulas of physics. (GS)

  18. Fundamentals of Microelectronics Processing (VLSI).

    ERIC Educational Resources Information Center

    Takoudis, Christos G.

    1987-01-01

    Describes a 15-week course in the fundamentals of microelectronics processing in chemical engineering, which emphasizes the use of very large scale integration (VLSI). Provides a listing of the topics covered in the course outline, along with a sample of some of the final projects done by students. (TW)

  19. Fundamentals of the Slide Library.

    ERIC Educational Resources Information Center

    Boerner, Susan Zee

    This paper is an introduction to the fundamentals of the art (including architecture) slide library, with some emphasis on basic procedures of the science slide library. Information in this paper is particularly relevant to the college, university, and museum slide library. Topics addressed include: (1) history of the slide library; (2) duties of…

  20. Chronometric cosmology and fundamental fermions

    PubMed Central

    Segal, I. E.

    1982-01-01

    It is proposed that the fundamental fermions of nature are modeled by fields on the chronometric cosmos that are not precisely spinors but become such only in the nonchronometric limit. The imbedding of the scale-extended Poincaré group in the linearizer of the Minkowskian conformal group defines such fields, by induction. PMID:16593266

  1. Museum Techniques in Fundamental Education.

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Paris (France).

    Some museum techniques and methods can be used in fundamental educational programs without elaborate buildings or equipment; exhibitions should be based on valid presumptions and should take into account the "common sense" beliefs of people for whom the exhibit is designed. They can be used profitably in the economic development of local cultural…

  2. Brake Fundamentals. Automotive Articulation Project.

    ERIC Educational Resources Information Center

    Cunningham, Larry; And Others

    Designed for secondary and postsecondary auto mechanics programs, this curriculum guide contains learning exercises in seven areas: (1) brake fundamentals; (2) brake lines, fluid, and hoses; (3) drum brakes; (4) disc brake system and service; (5) master cylinder, power boost, and control valves; (6) parking brakes; and (7) trouble shooting. Each…

  3. Fundamentals of Welding. Teacher Edition.

    ERIC Educational Resources Information Center

    Fortney, Clarence; And Others

    These instructional materials assist teachers in improving instruction on the fundamentals of welding. The following introductory information is included: use of this publication; competency profile; instructional/task analysis; related academic and workplace skills list; tools, materials, and equipment list; and 27 references. Seven units of…

  4. Environmental Law: Fundamentals for Schools.

    ERIC Educational Resources Information Center

    Day, David R.

    This booklet outlines the environmental problems most likely to arise in schools. An overview provides a fundamental analysis of environmental issues rather than comprehensive analysis and advice. The text examines the concerns that surround superfund cleanups, focusing on the legal framework, and furnishes some practical pointers, such as what to…

  5. Integration of Social Studies Principles in the Home Economics Curriculum.

    ERIC Educational Resources Information Center

    Texas Tech Univ., Lubbock. Home Economics Curriculum Center.

    This document is intended to help secondary home economics teachers incorporate social studies principles into their curriculum. After an introduction, the document is divided into three sections. The first section identifies and explains fundamental principles within social studies and covers the history and current state of the social studies…

  7. Control principles of complex systems

    NASA Astrophysics Data System (ADS)

    Liu, Yang-Yu; Barabási, Albert-László

    2016-07-01

A reflection of our ultimate understanding of a complex system is our ability to control its behavior. Typically, control has multiple prerequisites: it requires an accurate map of the network that governs the interactions between the system's components, a quantitative description of the dynamical laws that govern the temporal behavior of each component, and an ability to influence the state and temporal behavior of a selected subset of the components. With deep roots in dynamical systems and control theory, notions of control and controllability have recently taken on a new life in the study of complex networks, inspiring several fundamental questions: What are the control principles of complex systems? How do networks organize themselves to balance control with functionality? To address these questions, recent advances on the controllability and the control of complex networks are reviewed here, exploring the intricate interplay between network topology and dynamical laws. The pertinent mathematical results are matched with empirical findings and applications. Uncovering the control principles of complex systems can help us explore and ultimately understand the fundamental laws that govern their behavior.
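    The classical criterion underlying much of this work is the Kalman rank condition for linear dynamics dx/dt = Ax + Bu. The following is an illustrative sketch only (the function names and the toy three-node chain are this editor's, not the review's; the review's structural-controllability machinery goes well beyond this):

```python
from fractions import Fraction

def mat_mul(A, B):
    """Multiply two matrices represented as lists of rows."""
    n, k, m = len(A), len(B), len(B[0])
    return [[sum(A[i][t] * B[t][j] for t in range(k)) for j in range(m)]
            for i in range(n)]

def rank(M):
    """Matrix rank via exact Gaussian elimination over the rationals."""
    M = [[Fraction(x) for x in row] for row in M]
    r = 0
    for c in range(len(M[0])):
        pivot = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def controllable(A, B):
    """Kalman rank condition: (A, B) is controllable iff
    rank [B, AB, A^2 B, ..., A^(n-1) B] equals n."""
    n = len(A)
    blocks, cur = [B], B
    for _ in range(n - 1):
        cur = mat_mul(A, cur)
        blocks.append(cur)
    # Stack the blocks side by side into the controllability matrix.
    C = [[x for blk in blocks for x in blk[i]] for i in range(n)]
    return rank(C) == n

# Toy example: directed chain 1 -> 2 -> 3 with a single "driver node".
A = [[0, 0, 0],
     [1, 0, 0],
     [0, 1, 0]]
print(controllable(A, [[1], [0], [0]]))  # driving the head node: True
print(controllable(A, [[0], [0], [1]]))  # driving the tail node: False
```

    Driving the head of the chain reaches every downstream node, while driving the tail influences nothing upstream, which is the intuition behind identifying driver nodes in larger networks.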

  8. Minimal Length, Maximal Momentum and the Entropic Force Law

    NASA Astrophysics Data System (ADS)

    Nozari, Kourosh; Pedram, Pouria; Molkara, M.

    2012-04-01

Different candidate quantum gravity proposals, such as string theory, noncommutative geometry, loop quantum gravity, and doubly special relativity, all predict the existence of a minimum observable length and/or a maximal momentum, which modify the standard Heisenberg uncertainty principle. In this paper, we study the effects of a minimal length and a maximal momentum on the entropic force law formulated recently by E. Verlinde.
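    A commonly quoted deformation that encodes both features is the Ali-Das-Vagenas form of the generalized uncertainty principle (shown here as a sketch; the precise deformation adopted by these authors may differ):

```latex
% GUP with minimal length and maximal momentum (Ali-Das-Vagenas form)
[x, p] = i\hbar \left( 1 - \alpha p + 2\alpha^{2} p^{2} \right),
\qquad
\Delta x\, \Delta p \;\gtrsim\; \frac{\hbar}{2}
  \left[ 1 - 2\alpha \langle p \rangle + 4\alpha^{2} \langle p^{2} \rangle \right]
```

    with $\alpha = \alpha_0/(M_{\mathrm{Pl}} c)$, which implies a minimal length $\Delta x_{\min} \approx \alpha_0\, \ell_{\mathrm{Pl}}$ and a maximal momentum $p_{\max} \approx M_{\mathrm{Pl}} c/\alpha_0$.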

  9. Fundamentals of Aqueous Microwave Chemistry

    EPA Science Inventory

    The first chemical revolution changed modern life with a host of excellent amenities and services, but created serious problems related to environmental pollution. After 150 years of current chemistry principles and practices, we need a radical change to a new type of chemistry k...

  10. Diesel Fundamentals. Teacher Edition (Revised).

    ERIC Educational Resources Information Center

    Clark, Elton; And Others

    This module is one of a series of teaching guides that cover diesel mechanics. The module contains 4 sections and 19 units. Section A--Orientation includes the following units: introduction to diesel mechanics and shop safety; basic shop tools; test equipment and service tools; fasteners; bearings; and seals. Section B--Engine Principles and…

  11. Fundamentals of freeze-drying.

    PubMed

    Nail, Steven L; Jiang, Shan; Chongprasert, Suchart; Knopp, Shawn A

    2002-01-01

    Given the increasing importance of reducing development time for new pharmaceutical products, formulation and process development scientists must continually look for ways to "work smarter, not harder." Within the product development arena, this means reducing the amount of trial and error empiricism in arriving at a formulation and identification of processing conditions which will result in a quality final dosage form. Characterization of the freezing behavior of the intended formulation is necessary for developing processing conditions which will result in the shortest drying time while maintaining all critical quality attributes of the freeze-dried product. Analysis of frozen systems was discussed in detail, particularly with respect to the glass transition as the physical event underlying collapse during freeze-drying, eutectic mixture formation, and crystallization events upon warming of frozen systems. Experiments to determine how freezing and freeze-drying behavior is affected by changes in the composition of the formulation are often useful in establishing the "robustness" of a formulation. It is not uncommon for seemingly subtle changes in composition of the formulation, such as a change in formulation pH, buffer salt, drug concentration, or an additional excipient, to result in striking differences in freezing and freeze-drying behavior. With regard to selecting a formulation, it is wise to keep the formulation as simple as possible. If a buffer is needed, a minimum concentration should be used. The same principle applies to added salts: If used at all, the concentration should be kept to a minimum. For many proteins a combination of an amorphous excipient, such as a disaccharide, and a crystallizing excipient, such as glycine, will result in a suitable combination of chemical stability and physical stability of the freeze-dried solid. Concepts of heat and mass transfer are valuable in rational design of processing conditions. 
Heat transfer by conduction
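    As a minimal illustration of the conduction term in such heat-transfer calculations, steady-state one-dimensional Fourier conduction from shelf to product can be sketched as follows (all numerical values are hypothetical placeholders, not taken from the chapter):

```python
def conduction_heat_flow(k, area, delta_t, thickness):
    """Steady-state 1-D Fourier conduction: Q = k * A * dT / d, in watts."""
    return k * area * delta_t / thickness

# Hypothetical numbers for shelf-to-product heat transfer through a vial bottom:
k = 1.0             # W/(m*K), effective conductivity of the glass/gas gap (assumed)
area = 3.0e-4       # m^2, vial bottom contact area (assumed)
delta_t = 10.0      # K, shelf-to-product temperature difference (assumed)
thickness = 1.0e-3  # m, effective gap thickness (assumed)

print(conduction_heat_flow(k, area, delta_t, thickness))  # -> 3.0 W per vial
```

    In practice the effective conductivity depends strongly on chamber pressure, which is one reason pressure is a key process variable in freeze-drying.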

  12. What Metadata Principles Apply to Scientific Data?

    NASA Astrophysics Data System (ADS)

    Mayernik, M. S.

    2014-12-01

    Information researchers and professionals based in the library and information science fields often approach their work through developing and applying defined sets of principles. For example, for over 100 years, the evolution of library cataloging practice has largely been driven by debates (which are still ongoing) about the fundamental principles of cataloging and how those principles should manifest in rules for cataloging. Similarly, the development of archival research and practices over the past century has proceeded hand-in-hand with the emergence of principles of archival arrangement and description, such as maintaining the original order of records and documenting provenance. This project examines principles related to the creation of metadata for scientific data. The presentation will outline: 1) how understandings and implementations of metadata can range broadly depending on the institutional context, and 2) how metadata principles developed by the library and information science community might apply to metadata developments for scientific data. The development and formalization of such principles would contribute to the development of metadata practices and standards in a wide range of institutions, including data repositories, libraries, and research centers. Shared metadata principles would potentially be useful in streamlining data discovery and integration, and would also benefit the growing efforts to formalize data curation education.

  13. On nonholonomic systems and variational principles

    NASA Astrophysics Data System (ADS)

    Cronström, Christofer; Raita, Tommi

    2009-04-01

    We consider the compatibility of the equations of motion which follow from d'Alembert's principle in the case of a general autonomous nonholonomic mechanical system in N dimensions with those equations which follow for the same system by assuming the validity of a specific variational action principle, in which the nonholonomic conditions are implemented by means of the multiplication rule in the calculus of variations. The equations of motion which follow from the principle of d'Alembert are not identical to the equations which follow from the variational action principle. We give a proof that the solutions to the equations of motion which follow from d'Alembert's principle do not in general satisfy the equations which follow from the action principle with nonholonomic constraints. Thus the principle of d'Alembert and the minimal action principle involving the multiplication rule are not compatible in the case of systems with nonholonomic constraints. For simplicity the proof is given for autonomous systems only, with one general nonholonomic constraint, which is linear in the generalized velocities of the system.
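    The two sets of equations being compared can be sketched in standard notation (the symbols here are generic, not necessarily those of the paper). For a Lagrangian $L(q,\dot q)$ and one constraint $f = a_k(q)\,\dot q_k = 0$:

```latex
% d'Alembert's principle (nonholonomic mechanics):
\frac{d}{dt}\frac{\partial L}{\partial \dot q_k}
  - \frac{\partial L}{\partial q_k} = \lambda\, a_k(q),
\qquad a_k(q)\,\dot q_k = 0,
% versus the variational principle applied to L + \lambda f (multiplication rule):
\frac{d}{dt}\frac{\partial L}{\partial \dot q_k}
  - \frac{\partial L}{\partial q_k}
  = -\dot\lambda\, a_k
    - \lambda \left( \frac{\partial a_k}{\partial q_j}
                   - \frac{\partial a_j}{\partial q_k} \right) \dot q_j .
```

    The variational route produces extra $\dot\lambda$ and curl-type terms on the right-hand side, which in general cannot be absorbed into the single multiplier term of the d'Alembert equations unless the constraint is integrable (holonomic); this is the sense in which the two principles are incompatible.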

  14. Slow magic angle sample spinning: a non- or minimally invasive method for high-resolution 1H nuclear magnetic resonance (NMR) metabolic profiling.

    PubMed

    Hu, Jian Zhi

    2011-01-01

High-resolution (1)H magic angle spinning nuclear magnetic resonance (NMR), using a sample spinning rate of several kilohertz or more (i.e., high-resolution magic angle spinning (hr-MAS)), is a well-established method for metabolic profiling in intact tissues without the need for sample extraction. The only shortcoming with hr-MAS is that it is invasive and is thus unusable for non-destructive detection. Recently, a method called slow MAS, using the concept of two-dimensional NMR spectroscopy, has emerged as an alternative method for non- or minimally invasive metabolomics in intact tissues, including live animals, due to the slow or ultra-slow sample spinning used. Although slow MAS is a powerful method, its applications are hindered by experimental challenges. Correctly designing the experiment and choosing the appropriate slow MAS method both require a fundamental understanding of the operation principles, in particular the details of line narrowing due to the presence of molecular diffusion. However, these fundamental principles have not yet been fully disclosed in previous publications. The goal of this chapter is to provide an in-depth evaluation of the principles associated with slow MAS techniques by emphasizing the challenges associated with a phantom sample consisting of glass beads and H(2)O, where an unusually large magnetic susceptibility field gradient is obtained.

  15. Slow Magic Angle Sample Spinning: A Non- or Minimally Invasive Method for High- Resolution 1H Nuclear Magnetic Resonance (NMR) Metabolic Profiling

    SciTech Connect

    Hu, Jian Z.

    2011-05-01

High-resolution 1H magic angle spinning nuclear magnetic resonance (NMR), using a sample spinning rate of several kHz or more (i.e., high-resolution magic angle spinning (hr-MAS)), is a well-established method for metabolic profiling in intact tissues without the need for sample extraction. The only shortcoming with hr-MAS is that it is invasive and is thus unusable for non-destructive detection. Recently, a method called slow-MAS, using the concept of two-dimensional NMR spectroscopy, has emerged as an alternative method for non- or minimally invasive metabolomics in intact tissues, including live animals, due to the slow or ultra-slow sample spinning used. Although slow-MAS is a powerful method, its applications are hindered by experimental challenges. Correctly designing the experiment and choosing the appropriate slow-MAS method both require a fundamental understanding of the operation principles, in particular the details of line narrowing due to the presence of molecular diffusion. However, these fundamental principles have not yet been fully disclosed in previous publications. The goal of this chapter is to provide an in-depth evaluation of the principles associated with slow-MAS techniques by emphasizing the challenges associated with a phantom sample consisting of glass beads and H2O, where an unusually large magnetic susceptibility field gradient is obtained.

  16. Cosmic polarization rotation: An astrophysical test of fundamental physics

    NASA Astrophysics Data System (ADS)

    di Serego Alighieri, Sperello

    2015-02-01

Possible violations of fundamental physical principles, e.g. the Einstein equivalence principle on which all metric theories of gravity are based, including general relativity (GR), would lead to a rotation of the plane of polarization for linearly polarized radiation traveling over cosmological distances, the so-called cosmic polarization rotation (CPR). We review here the astrophysical tests which have been carried out so far to check whether CPR exists. These tests use the radio and ultraviolet polarization of radio galaxies and the polarization of the cosmic microwave background (both E-mode and B-mode). So far the tests have been negative, leading to upper limits of the order of one degree on any CPR angle, thereby increasing our confidence in those physical principles, including GR. We also discuss future prospects for detecting CPR or improving the constraints on it.

  17. Fundamental neutron physics at LANSCE

    SciTech Connect

    Greene, G.

    1995-10-01

    Modern neutron sources and science share a common origin in mid-20th-century scientific investigations concerned with the study of the fundamental interactions between elementary particles. Since the time of that common origin, neutron science and the study of elementary particles have evolved into quite disparate disciplines. The neutron became recognized as a powerful tool for studying condensed matter with modern neutron sources being primarily used (and justified) as tools for neutron scattering and materials science research. The study of elementary particles has, of course, led to the development of rather different tools and is now dominated by activities performed at extremely high energies. Notwithstanding this trend, the study of fundamental interactions using neutrons has continued and remains a vigorous activity at many contemporary neutron sources. This research, like neutron scattering research, has benefited enormously by the development of modern high-flux neutron facilities. Future sources, particularly high-power spallation sources, offer exciting possibilities for continuing this research.

  18. DOE Fundamentals Handbook: Classical Physics

    SciTech Connect

    Not Available

    1992-06-01

The Classical Physics Fundamentals Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of physical forces and their properties. The handbook includes information on the units used to measure physical properties; vectors, and how they are used to show the net effect of various forces; Newton's Laws of motion, and how to use these laws in force and motion applications; and the concepts of energy, work, and power, and how to measure and calculate the energy involved in various applications. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility systems and equipment.

  19. Variation of fundamental constants: theory

    NASA Astrophysics Data System (ADS)

    Flambaum, Victor

    2008-05-01

Theories unifying gravity with other interactions suggest temporal and spatial variation of the fundamental "constants" in an expanding Universe. There are some hints of variation of different fundamental constants in quasar absorption spectra and Big Bang nucleosynthesis data. A large number of publications (including atomic clock studies) report limits on the variations. We want to study the variation of the main dimensionless parameters of the Standard Model: 1. The fine structure constant alpha (a combination of the speed of light, the electron charge, and the Planck constant). 2. The ratio of the strong interaction scale (LambdaQCD) to a fundamental mass such as the electron or quark mass, which are proportional to the Higgs vacuum expectation value. The proton mass is proportional to LambdaQCD; therefore, the proton-to-electron mass ratio falls into this second category. We performed the atomic, nuclear, and QCD calculations needed to study variation of the fundamental constants using Big Bang nucleosynthesis, quasar spectra, Oklo natural nuclear reactor, and atomic clock data. The relative effects of the variation may be enhanced in transitions between narrow, close-lying levels in atoms, molecules, and nuclei. If one studies an enhanced effect, the relative size of systematic effects (which are not enhanced) may be much smaller. Note also that the absolute magnitude of the variation effects in nuclei (e.g. in the very narrow 7 eV transition in 229Th) may be 5 orders of magnitude larger than in atoms. A different possibility of enhancement comes from inversion transitions in molecules, where the splitting between levels is due to a quantum tunneling amplitude that depends strongly (exponentially) on the electron-to-proton mass ratio. Our study of NH3 quasar spectra has already given the best limit on the variation of the electron-to-proton mass ratio.
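    The dimensionless combination in question can be checked directly from the CODATA values of the constants. A minimal sketch:

```python
import math

# 2018 CODATA values (e and c are exact by SI definition)
e    = 1.602176634e-19    # C, elementary charge
hbar = 1.054571817e-34    # J*s, reduced Planck constant
c    = 299792458.0        # m/s, speed of light
eps0 = 8.8541878128e-12   # F/m, vacuum permittivity

# Fine structure constant: alpha = e^2 / (4*pi*eps0*hbar*c), dimensionless
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(1 / alpha)  # ~137.036
```

    Because alpha is dimensionless, a claimed variation of it is meaningful independent of the choice of units, which is why searches focus on it rather than on dimensionful constants individually.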

  20. Principles of project management

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.

  1. Intuitions, principles and consequences.

    PubMed

    Shaw, A B

    2001-02-01

    Some approaches to the assessment of moral intuitions are discussed. The controlled ethical trial isolates a moral issue from confounding factors and thereby clarifies what a person's intuition actually is. Casuistic reasoning from situations, where intuitions are clear, suggests or modifies principles, which can then help to make decisions in situations where intuitions are unclear. When intuitions are defended by a supporting principle, that principle can be tested by finding extreme cases, in which it is counterintuitive to follow the principle. An approach to the resolution of conflict between valid moral principles, specifically the utilitarian and justice principles, is considered. It is argued that even those who justify intuitions by a priori principles are often obliged to modify or support their principles by resort to the consideration of consequences.

2. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1973-01-01

    Two topics are discussed: (1) Stomach Upset Caused by Aspirin, illustrating principles of acid-base equilibrium and solubility; (2) Physical Chemistry of the Drinking Duck, illustrating principles of phase equilibria and thermodynamics. (DF)
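    The acid-base equilibrium behind the first anecdote can be sketched with the Henderson-Hasselbalch relation (illustrative only; the pKa below is an approximate literature value, not taken from the article):

```python
def neutral_fraction(ph, pka):
    """Fraction of a weak acid HA remaining un-ionized at a given pH,
    from Henderson-Hasselbalch: [A-]/[HA] = 10**(pH - pKa)."""
    return 1.0 / (1.0 + 10.0 ** (ph - pka))

PKA_ASPIRIN = 3.5  # approximate pKa of acetylsalicylic acid

# In the acidic stomach (pH ~1.5), aspirin stays mostly neutral, so it can
# cross the lipid mucosa, which is part of why it irritates the stomach lining.
print(neutral_fraction(1.5, PKA_ASPIRIN))  # ~0.99
# At blood pH (~7.4) it is almost fully ionized.
print(neutral_fraction(7.4, PKA_ASPIRIN))  # ~1e-4
```
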

  3. Systems Biology Perspectives on Minimal and Simpler Cells

    PubMed Central

    Xavier, Joana C.; Patil, Kiran Raosaheb

    2014-01-01

    SUMMARY The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. PMID:25184563

  4. Principles of Drosophila Eye Differentiation

    PubMed Central

    Cagan, Ross

    2010-01-01

    The Drosophila eye is one of nature's most beautiful structures and one of its most useful. It has emerged as a favored model for understanding the processes that direct cell fate specification, patterning, and morphogenesis. Though composed of thousands of cells, each fly eye is a simple repeating pattern of perhaps a dozen cell types arranged in a hexagonal array that optimizes coverage of the visual field. This simple structure combined with powerful genetic tools make the fly eye an ideal model to explore the relationships between local cell fate specification and global tissue patterning. In this chapter, I discuss the basic principles that have emerged from three decades of close study. We now understand at a useful level some of the basic principles of cell fate selection and the importance of local cell–cell communication. We understand less of the processes by which signaling combines with morphogenesis and basic cell biology to create a correctly patterned neuroepithelium. Progress is being made on these fundamental issues, and in this chapter I discuss some of the principles that are beginning to emerge. PMID:19737644

  5. Principles of Modern Soccer.

    ERIC Educational Resources Information Center

    Beim, George

    This book is written to give a better understanding of the principles of modern soccer to coaches and players. In nine chapters the following elements of the game are covered: (1) the development of systems; (2) the principles of attack; (3) the principles of defense; (4) training games; (5) strategies employed in restarts; (6) physical fitness…

  6. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1970-01-01

This is the first of a new series of brief anecdotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…

  7. The mechanical exfoliation mechanism of black phosphorus to phosphorene: A first-principles study

    NASA Astrophysics Data System (ADS)

    Mu, Yunsheng; Si, M. S.

    2015-11-01

Today, the renaissance of black phosphorus largely depends on the mechanical exfoliation method, which makes it possible to produce few-layer forms from the bulk counterpart. However, a deep understanding of the exfoliation mechanism has been missing. To this end, we address this issue by simulating the sliding processes of bilayer phosphorene based on first-principles calculations. It is found that the interlayer Coulomb interactions dictate the optimal sliding pathway, leading to a minimal energy barrier as low as ~60 meV, yielding a surface energy comparable to the experimental value of ~59 mJ/m². This means that black phosphorus can be exfoliated by the sliding approach. In addition, considerable bandgap modulations along these sliding pathways are obtained. Studies like ours build up a fundamental understanding of how black phosphorus is exfoliated into few-layer forms, providing a good guide for experimental research.

  8. Guidelines for mixed waste minimization

    SciTech Connect

    Owens, C.

    1992-02-01

Currently, there is no commercial mixed waste disposal available in the United States. Storage and treatment capacity for commercial mixed waste is limited. Host state and compact region officials are encouraging their mixed waste generators to minimize their mixed wastes because of these management limitations. This document provides a guide to mixed waste minimization.

  9. Influenza SIRS with Minimal Pneumonitis

    PubMed Central

    Erramilli, Shruti; Mannam, Praveen; Manthous, Constantine A.

    2016-01-01

    Although systemic inflammatory response syndrome (SIRS) is a known complication of severe influenza pneumonia, it has been reported very rarely in patients with minimal parenchymal lung disease. We here report a case of severe SIRS, anasarca, and marked vascular phenomena with minimal or no pneumonitis. This case highlights that viruses, including influenza, may cause vascular dysregulation causing SIRS, even without substantial visceral organ involvement.

  10. Fundamental Limits to Cellular Sensing

    NASA Astrophysics Data System (ADS)

    ten Wolde, Pieter Rein; Becker, Nils B.; Ouldridge, Thomas E.; Mugler, Andrew

    2016-03-01

    In recent years experiments have demonstrated that living cells can measure low chemical concentrations with high precision, and much progress has been made in understanding what sets the fundamental limit to the precision of chemical sensing. Chemical concentration measurements start with the binding of ligand molecules to receptor proteins, which is an inherently noisy process, especially at low concentrations. The signaling networks that transmit the information on the ligand concentration from the receptors into the cell have to filter this receptor input noise as much as possible. These networks, however, are also intrinsically stochastic in nature, which means that they will also add noise to the transmitted signal. In this review, we will first discuss how the diffusive transport and binding of ligand to the receptor sets the receptor correlation time, which is the timescale over which fluctuations in the state of the receptor, arising from the stochastic receptor-ligand binding, decay. We then describe how downstream signaling pathways integrate these receptor-state fluctuations, and how the number of receptors, the receptor correlation time, and the effective integration time set by the downstream network, together impose a fundamental limit on the precision of sensing. We then discuss how cells can remove the receptor input noise while simultaneously suppressing the intrinsic noise in the signaling network. We describe why this mechanism of time integration requires three classes (groups) of resources—receptors and their integration time, readout molecules, energy—and how each resource class sets a fundamental sensing limit. We also briefly discuss the scheme of maximum-likelihood estimation, the role of receptor cooperativity, and how cellular copy protocols differ from canonical copy protocols typically considered in the computational literature, explaining why cellular sensing systems can never reach the Landauer limit on the optimal trade
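    The scaling of the fundamental sensing limit set by diffusive ligand arrival goes back to Berg and Purcell; a minimal sketch follows (order-one prefactors are dropped, and all numerical values are hypothetical placeholders, not taken from the review):

```python
import math

def berg_purcell_error(D, a, c, tau):
    """Berg-Purcell scaling for the fractional error of a concentration
    measurement by an idealized spherical sensor of radius a:
        delta_c / c  ~  1 / sqrt(D * a * c * tau)
    D: ligand diffusion constant, c: ambient concentration,
    tau: integration time. Order-one prefactors are dropped."""
    return 1.0 / math.sqrt(D * a * c * tau)

# Hypothetical numbers, in micrometer/second units:
D   = 100.0  # um^2/s, small-molecule diffusion constant (assumed)
a   = 1.0    # um, receptor-cluster size (assumed)
c   = 0.6    # molecules/um^3, roughly 1 nM (assumed)
tau = 1.0    # s, downstream integration time (assumed)

print(berg_purcell_error(D, a, c, tau))        # ~0.13, i.e. ~13% error
print(berg_purcell_error(D, a, c, 100 * tau))  # 10x more precise with 100x the time
```

    The square-root dependence on tau is the point emphasized in the review: longer time integration buys precision, but only at the cost of the readout molecules and energy needed to sustain it.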

  11. Frontiers of Fundamental Physics 14

    NASA Astrophysics Data System (ADS)

The 14th annual international symposium "Frontiers of Fundamental Physics" (FFP14) was organized by the OCEVU Labex. It was held in Marseille, on the Saint-Charles Campus of Aix Marseille University (AMU), and had over 280 participants coming from all over the world. The FFP Symposium began in India in 1997 and became itinerant in 2004, moving through Europe, Canada and Australia. It covers topics in fundamental physics with the objective of enabling scholars working in related areas to meet on a single platform and exchange ideas. In addition to highlighting the progress in these areas, the symposium invites top researchers to reflect on the educational aspects of our discipline. Moreover, the scientific concepts are also discussed through philosophical and epistemological viewpoints. Several eminent scientists, such as the laureates of prestigious awards (Nobel Prize, Fields Medal,…), have already participated in these meetings. The FFP14 Symposium developed around seven main themes, namely: Astroparticle Physics, Cosmology, High Energy Physics, Quantum Gravity, Mathematical Physics, Physics Education, Epistemology and Philosophy. The morning was devoted to the plenary session, with talks for a broad audience of physicists in its first half (9:00-10:30), and more specialized talks in its second half (11:00-12:30); this part was held in three amphitheaters. The parallel session of the Symposium took place during the afternoon (14:30-18:30) with seven thematic conferences and an additional conference on open topics named "Frontiers of Fundamental Physics". These eight conferences were organized around the contributions of participants, in addition to those of invited speakers. Altogether, there were some 250 contributions to the symposium (talks and posters). The plenary talks were webcast live and recorded. The slides of the talks and the videos of the plenary talks are available from the Symposium web site: http://ffp14.cpt.univ-mrs.fr/

  12. Solid Lubrication Fundamentals and Applications

    NASA Technical Reports Server (NTRS)

    Miyoshi, Kazuhisa

    2001-01-01

Solid Lubrication Fundamentals and Applications provides a description of the adhesion, friction, abrasion, and wear behavior of solid film lubricants and related tribological materials, including diamond and diamond-like solid films. The book details the properties of solid surfaces, clean surfaces, and contaminated surfaces, as well as discussing the structures and mechanical properties of natural and synthetic diamonds; chemical-vapor-deposited diamond film; and surface design and engineering toward wear-resistant, self-lubricating diamond films and coatings. The author provides selection and design criteria as well as applications for synthetic and natural coatings in the commercial, industrial and aerospace industries.

  13. Waste minimization handbook, Volume 1

    SciTech Connect

    Boing, L.E.; Coffey, M.J.

    1995-12-01

This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  14. The Physics Handbook: Fundamentals and Key Equations

    NASA Astrophysics Data System (ADS)

    Poole, Charles P., Jr.

    1999-03-01

A comprehensive quick reference to basic topics and equations in physics. This compendium of physics covers the key equations and fundamental principles that are taught in graduate programs. It offers a succinct yet systematic treatment of all areas of physics, including mathematical physics, solid state, particle physics, statistical mechanics, and optics. In one complete, self-contained volume, author Charles P. Poole, Jr. provides both review material for students preparing for PhD qualifying examinations and a quick reference for physicists who need to brush up on basic topics or delve into areas outside their expertise. Poole devotes two chapters to regularly needed information such as trigonometric and vector identities and special functions. The remaining chapters incorporate less frequently summoned concepts, including Lagrangians, parity, dispersion relations, chaos, free energies, statistical mechanical ensembles, elementary particle classification, and so forth. An indispensable resource for graduate students and physicists in industry and academia, The Physics Handbook: * Puts key information at the reader's fingertips * Incorporates essential material previously scattered through many different texts * Features 150 illustrations * Addresses theoretical as well as practical issues * Includes an extensive bibliography pointing to more thorough texts for individual subject areas

  15. Fundamental Scientific Problems in Magnetic Recording

    SciTech Connect

    Schulthess, T.C.; Miller, M.K.

    2007-06-27

    Magnetic data storage technology is presently leading the high-tech industry in advancing device integration, doubling the storage density every 12 months. To continue these advancements and to achieve terabit-per-square-inch recording densities, new approaches to store and access data will be needed in about 3-5 years. In this project, a collaboration between Oak Ridge National Laboratory (ORNL), the Center for Materials for Information Technology (MINT) at the University of Alabama (UA), Imago Scientific Instruments, and Seagate Technologies was undertaken to address the fundamental scientific problems confronting the industry in meeting the upcoming challenges. The foci of this study were to: (1) develop atom probe tomography for atomic-scale imaging of magnetic heterostructures used in magnetic data storage technology; (2) develop first-principles-based tools for the study of exchange bias, aimed at finding new antiferromagnetic materials to reduce the thickness of the pinning layer in the read head; and (3) develop high-moment magnetic materials and tools to study magnetic switching in nanostructures, aimed at developing improved writers for high-anisotropy magnetic storage media.

  16. Nanoionic Memristive Switches -- From Fundamentals to Applications

    NASA Astrophysics Data System (ADS)

    Waser, Rainer

    2013-03-01

    A potential leap beyond the limits of Flash (with respect to write speed, write energies) and DRAM (with respect to scalability, retention times) emerges from nanoionic redox-based switching effects encountered in metal oxides (ReRAM). A range of systems exist in which highly complex ionic transport and redox reactions on the nanoscale provide the essential mechanisms for memristive switching. One class relies on mobile cations which are easily created by electrochemical oxidation of the corresponding electrode metal, transported in the insulating layer, and reduced at the inert counterelectrode (so-called electrochemical metallization memories, ECM, also called CBRAM). Another important class operates through the migration of anions, typically oxygen ions, towards the anode, and the reduction of the cation valences in the cation sublattice locally providing metallic or semiconducting phases (so-called valence change memories, VCM). The electrochemical nature of these memristive effects triggers a bipolar memory operation. In yet another class, the thermochemical effects dominate over the electrochemical effects in metal oxides (so-called thermochemical memories, TCM) which leads to a unipolar switching as known from the phase-change memories. In all systems, the defect structure turned out to be crucial for the switching process. The presentation will cover fundamental principles in terms of microscopic processes, switching kinetics and retention times, and device reliability of bipolar ReRAM variants. Passive memory arrays of ReRAM cells open up the paths towards ultradense and 3-D stackable memory and logic gate arrays.

  17. Fundamental physics in space: The French contribution

    NASA Astrophysics Data System (ADS)

    Léon-Hirtz, Sylvie

    2003-08-01

    This paper outlines the space Fundamental Physics projects developed under CNES responsibility together with the French scientific community, either within the national French programme or as part of the French contribution to the ESA programme, mainly: - the MICROSCOPE project, which aims at testing the Equivalence Principle between inertial mass and gravitational mass at a high level of precision, on a microsatellite of the MYRIADE series developed by CNES; - the PHARAO cold-atom clock, part of ESA's ACES project, located on an external pallet of the International Space Station together with a Swiss H-MASER and a microwave link for comparison with ground clocks, aimed at relativistic tests and measurements of universal constants; - the T2L2 optical link, allowing comparison of ultra-stable and ultra-precise clocks; - a contribution to the AMS spectrometer, which searches for cosmic antimatter on the exterior of the International Space Station; - a contribution to ESA's LISA mission for direct detection and measurement of gravitational waves by interferometry; - ground-based studies on cold-atom interferometers, which could become part of the HYPER project submitted to ESA.

  18. An entropy method for induced drag minimization

    NASA Technical Reports Server (NTRS)

    Greene, George C.

    1989-01-01

    A fundamentally new approach to the aircraft minimum induced drag problem is presented. The method, a 'viscous lifting line', is based on the minimum entropy production principle and does not require the planar wake assumption. An approximate, closed form solution is obtained for several wing configurations including a comparison of wing extension, winglets, and in-plane wing sweep, with and without a constraint on wing-root bending moment. Like the classical lifting-line theory, this theory predicts that induced drag is proportional to the square of the lift coefficient and inversely proportional to the wing aspect ratio. Unlike the classical theory, it predicts that induced drag is Reynolds number dependent and that the optimum spanwise circulation distribution is non-elliptic.
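
    For reference, the classical lifting-line relation that these predictions parallel is the textbook induced-drag formula (not taken from the paper; e is the span-efficiency factor, with e = 1 for the elliptic circulation distribution):

```latex
C_{D,i} = \frac{C_L^2}{\pi \, e \, AR}
```

    The entropy-based theory above recovers the C_L^2 and 1/AR dependence but adds a Reynolds-number dependence that the inviscid classical result cannot capture.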

  19. Fundamentals of Clinical Outcomes Assessment for Spinal Disorders: Clinical Outcome Instruments and Applications

    PubMed Central

    Vavken, Patrick; Ganal-Antonio, Anne Kathleen B.; Quidde, Julia; Shen, Francis H.; Chapman, Jens R.; Samartzis, Dino

    2015-01-01

    Study Design A broad narrative review. Objectives Outcome assessment in spinal disorders is imperative to help monitor the safety and efficacy of treatment in an effort to change clinical practice and improve patient outcomes. The following article, part two of a two-part series, discusses the various outcome tools and instruments utilized to address spinal disorders and their management. Methods A thorough review of the peer-reviewed literature was performed, irrespective of language, addressing outcome research, instruments and tools, and applications. Results Numerous articles addressing the development and implementation of health-related quality-of-life, neck and low back pain, overall pain, spinal deformity, and other condition-specific outcome instruments have been reported. Their applications in the context of clinical trials, economic analyses, and overall evidence-based orthopedics are noted. Additional issues regarding the problems and potential sources of bias in utilizing outcome scales and the concept of the minimal clinically important difference are discussed. Conclusion Continuing research needs to assess the outcome instruments and tools used in clinical outcome assessment for spinal disorders. Understanding the fundamental principles of spinal outcome assessment may also advance the field of “personalized spine care.” PMID:26225283

  20. Quantum correlations require multipartite information principles.

    PubMed

    Gallego, Rodrigo; Würflinger, Lars Erik; Acín, Antonio; Navascués, Miguel

    2011-11-18

    Identifying which correlations among distant observers are possible within our current description of nature, based on quantum mechanics, is a fundamental problem in physics. Recently, information concepts have been proposed as the key ingredient to characterize the set of quantum correlations. Novel information principles, such as information causality or nontrivial communication complexity, have been introduced in this context and successfully applied to some concrete scenarios. We show in this work a fundamental limitation of this approach: no principle based on bipartite information concepts is able to single out the set of quantum correlations for an arbitrary number of parties. Our results reflect the intricate structure of quantum correlations and imply that new and intrinsically multipartite information concepts are needed for their full understanding.

  1. [The anthropic principle in biology and radiobiology].

    PubMed

    Akif'ev, A P; Degtiarev, S V

    1999-01-01

    In accordance with the anthropic principle of the Universe, the physical constants of the fundamental particles of matter and the laws of their interaction are such that the appearance of man and mind becomes possible and necessary. It is suggested that certain biological constants be added to the set of fundamental constants. Using DNA repair as an example, it is shown how a cell controls certain parameters of the Watson-Crick double helix. It is argued that the concept of the anthropic principle of the Universe in its full form, including biological constants, is a key to developing a unified theory of the evolution of the Universe within the limits of scientific creationism. PMID:10347592

  2. Hamiltonian formalism of minimal massive gravity

    NASA Astrophysics Data System (ADS)

    Mahdavian Yekta, Davood

    2015-09-01

    In this paper, we study the three-dimensional minimal massive gravity (MMG) in the Hamiltonian formalism. First, we define the canonical gauge generators as the building blocks of this formalism and then derive the canonical expressions for the asymptotic conserved charges. The construction of a consistent asymptotic structure of MMG requires introducing suitable boundary conditions. In the second step, we show that the Poisson bracket algebra of the improved canonical gauge generators produces an asymptotic gauge group which includes two separable copies of the Virasoro algebra. As an example, we study the Banados-Teitelboim-Zanelli (BTZ) black hole as a solution of the MMG field equations; the conserved charges give the energy and angular momentum of the BTZ black hole. Finally, we compute the black hole entropy from the Cardy formula in the dual conformal field theory and show that our result is consistent with the value obtained by using the Smarr formula from the holographic principle.
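
    The Cardy-formula step mentioned in the abstract conventionally takes the following form for a dual CFT with two Virasoro sectors (the generic textbook expression; the MMG-specific central charges are not reproduced here):

```latex
S = 2\pi\sqrt{\frac{c^{-} L_{0}^{-}}{6}} + 2\pi\sqrt{\frac{c^{+} L_{0}^{+}}{6}}
```

    Here c^± are the central charges of the two Virasoro algebras and L_0^± the corresponding zero-mode charges evaluated on the BTZ solution.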

  3. The minimal nanowire: Mechanical properties of carbyne

    NASA Astrophysics Data System (ADS)

    Nair, A. K.; Cranford, S. W.; Buehler, M. J.

    2011-07-01

    Advances in molecular assembly are converging on the ultimate in atomistic precision: nanostructures built from single atoms. Recent experimental studies confirm that single chains of carbon atoms, carbyne, exist in stable polyyne structures and can be synthesized, representing the minimal possible nanowire. Here we report the mechanical properties of carbyne obtained by first-principles-based ReaxFF molecular simulation. A peak Young's modulus of 288 GPa is found, with linear stiffnesses ranging from 64.6 down to 5 N/m for lengths of 5 to 64 Å. We identify a size-dependent strength that ranges from 11 GPa (1.3 nN) for the shortest chains to a constant 8 GPa (0.9 nN) for longer carbyne chains. We demonstrate that carbyne chains exhibit extremely high vibrational frequencies, close to 6 THz for the shortest chains, which are found to be highly length-dependent.
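
    As a quick consistency check on the numbers quoted above, the paired stress and force values imply an effective cross-sectional area for the single-atom chain (a back-of-the-envelope sketch; treating F = sigma * A, and indeed the very notion of an "area" for a monatomic chain, are conventions, not claims from the paper):

```python
# Recover the effective cross-sectional area implied by the abstract's
# paired strength values, assuming the simple relation F = sigma * A.
GPA = 1e9   # Pa per GPa
NN = 1e-9   # N per nN

# (stress in GPa, force in nN) pairs quoted in the abstract
pairs = [(11.0, 1.3), (8.0, 0.9)]

# A = F / sigma, converted to nm^2 (1 m^2 = 1e18 nm^2)
areas_nm2 = [(f * NN) / (s * GPA) * 1e18 for s, f in pairs]
```

    Both pairs give roughly 0.11-0.12 nm^2, i.e. the two quoted strength figures are mutually consistent under a single assumed chain cross-section.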

  4. The principles of teratology: are they still true?

    PubMed

    Friedman, Jan M

    2010-10-01

    James Wilson originally proposed a set of "Principles of Teratology" in 1959, the year before he helped to found the Teratology Society. By 1977, when these Principles were presented in a more definitive form in Wilson and Fraser's Handbook of Teratology, they had become a standard formulation of the basic tenets of the field. Wilson's Principles have continued to guide scientific research in teratology, and they are widely used in teaching. Recent advances in our knowledge of the molecular and cellular bases of embryogenesis serve only to provide a deeper understanding of the fundamental developmental mechanisms that underlie Wilson's Principles of Teratology.

  5. Principles of thermoacoustic energy harvesting

    NASA Astrophysics Data System (ADS)

    Avent, A. W.; Bowen, C. R.

    2015-11-01

    Thermoacoustic devices exploit a temperature gradient to produce powerful acoustic pressure waves, and the technology has a key role to play in energy harvesting systems. A time-line in the development of thermoacoustics is presented, from its earliest recorded example in glass blowing, through the development of the Sondhauss and Rijke tubes, to Stirling engines and pulse-tube cryocooling. The review sets the current literature in context, identifies key publications and promising areas of research. The fundamental principles of thermoacoustic phenomena are explained; design challenges and factors influencing efficiency are explored. Thermoacoustic processes involve complex multi-physical coupling and transient, highly non-linear relationships which are computationally expensive to model; appropriate numerical modelling techniques and options for analyses are presented. Potential methods of harvesting the energy in the acoustic waves are also examined.

  6. Dynamical principles of two-component genetic oscillators.

    PubMed

    Guantes, Raúl; Poyatos, Juan F

    2006-03-01

    Genetic oscillators based on the interaction of a small set of molecular components have been shown to be involved in the regulation of the cell cycle, the circadian rhythms, or the response of several signaling pathways. Uncovering the functional properties of such oscillators then becomes important for the understanding of these cellular processes and for the characterization of fundamental properties of more complex clocks. Here, we show how the dynamics of a minimal two-component oscillator is drastically affected by its genetic implementation. We consider a repressor and activator element combined in a simple logical motif. While activation is always exerted at the transcriptional level, repression is alternatively operating at the transcriptional (Design I) or post-translational (Design II) level. These designs display differences on basic oscillatory features and on their behavior with respect to molecular noise or entrainment by periodic signals. In particular, Design I induces oscillations with large activator amplitudes and arbitrarily small frequencies, and acts as an "integrator" of external stimuli, while Design II shows emergence of oscillations with finite, and less variable, frequencies and smaller amplitudes, and detects better frequency-encoded signals ("resonator"). Similar types of stimulus response are observed in neurons, and thus this work enables us to connect very different biological contexts. These dynamical principles are relevant for the characterization of the physiological roles of simple oscillator motifs, the understanding of core machineries of complex clocks, and the bio-engineering of synthetic oscillatory circuits.
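
    As an illustration of the kind of two-component motif discussed, here is a minimal forward-Euler sketch of an activator-repressor loop in the spirit of Design I (transcription-level repression). All equations and rate constants are hypothetical stand-ins chosen for illustration, not the authors' model:

```python
# Minimal activator-repressor motif: A activates itself and its repressor R;
# R represses the production of A. Hill functions model the regulation.
def simulate(steps=20000, dt=0.01):
    """Integrate dA/dt and dR/dt by forward Euler; returns both trajectories."""
    a, r = 0.1, 0.1
    traj_a, traj_r = [], []
    for _ in range(steps):
        act = a**2 / (0.5**2 + a**2)      # Hill-type self-activation by A
        rep = 1.0 / (1.0 + (r / 1.0)**2)  # transcriptional repression by R
        da = 5.0 * act * rep - 1.0 * a    # production minus degradation of A
        dr = 1.0 * act - 0.2 * r          # R driven by A, degrades slowly
        a += dt * da
        r += dt * dr
        traj_a.append(a)
        traj_r.append(r)
    return traj_a, traj_r

A, R = simulate()
```

    Whether such a loop settles to a fixed point or oscillates depends on the rate constants (in particular the separation between activator and repressor time scales), which is exactly the kind of design-dependent behavior the abstract contrasts between Designs I and II.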

  7. Fundamental Travel Demand Model Example

    NASA Technical Reports Server (NTRS)

    Hanssen, Joel

    2010-01-01

    Instances of transportation models are abundant and detailed "how to" instruction is available in the form of transportation software help documentation. The purpose of this paper is to look at the fundamental inputs required to build a transportation model by developing an example passenger travel demand model. The example model reduces the scale to a manageable size for the purpose of illustrating the data collection and analysis required before the first step of the model begins. This aspect of the model development would not reasonably be discussed in software help documentation (it is assumed the model developer comes prepared). Recommendations are derived from the example passenger travel demand model to suggest future work regarding the data collection and analysis required for a freight travel demand model.
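
    To make the "fundamental inputs" concrete, here is a minimal sketch of one classical building block of passenger travel demand models, the gravity model for trip distribution. The zone productions, attractions, costs, and deterrence exponent are hypothetical, not taken from the paper:

```python
# Singly constrained gravity model: distribute each zone's produced trips
# among destinations in proportion to attraction / cost**beta.
def gravity_model(productions, attractions, cost, beta=1.0):
    """Return trip matrix T[i][j] ~ P_i * A_j * c_ij**-beta, row-normalized
    so that each origin's trips sum to its productions (singly constrained)."""
    n = len(productions)
    trips = []
    for i in range(n):
        weights = [attractions[j] * (cost[i][j] ** -beta) for j in range(n)]
        total = sum(weights)
        trips.append([productions[i] * w / total for w in weights])
    return trips

P = [100.0, 200.0]            # trips produced per zone (hypothetical)
A_zones = [150.0, 150.0]      # trips attracted per zone (hypothetical)
C = [[1.0, 2.0], [2.0, 1.0]]  # interzonal travel cost (hypothetical)
trips = gravity_model(P, A_zones, C)
```

    Even this toy version shows why the paper stresses up-front data collection: the productions, attractions, and cost matrix must all be estimated before the first model step can run.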

  8. Fundamental reaction pathways during coprocessing

    SciTech Connect

    Stock, L.M.; Gatsis, J.G.

    1992-12-01

    The objective of this research was to investigate the fundamental reaction pathways in coal petroleum residuum coprocessing. Once the reaction pathways are defined, further efforts can be directed at improving those aspects of the chemistry of coprocessing that are responsible for the desired results such as high oil yields, low dihydrogen consumption, and mild reaction conditions. We decided to carry out this investigation by looking at four basic aspects of coprocessing: (1) the effect of fossil fuel materials on promoting reactions essential to coprocessing such as hydrogen atom transfer, carbon-carbon bond scission, and hydrodemethylation; (2) the effect of varied mild conditions on the coprocessing reactions; (3) determination of dihydrogen uptake and utilization under severe conditions as a function of the coal or petroleum residuum employed; and (4) the effect of varied dihydrogen pressure, temperature, and residence time on the uptake and utilization of dihydrogen and on the distribution of the coprocessed products. Accomplishments are described.

  9. Astronomical reach of fundamental physics.

    PubMed

    Burrows, Adam S; Ostriker, Jeremiah P

    2014-02-18

    Using basic physical arguments, we derive by dimensional and physical analysis the characteristic masses and sizes of important objects in the universe in terms of just a few fundamental constants. This exercise illustrates the unifying power of physics and the profound connections between the small and the large in the cosmos we inhabit. We focus on the minimum and maximum masses of normal stars, the corresponding quantities for neutron stars, the maximum mass of a rocky planet, the maximum mass of a white dwarf, and the mass of a typical galaxy. To zeroth order, we show that all these masses can be expressed in terms of either the Planck mass or the Chandrasekhar mass, in combination with various dimensionless quantities. With these examples, we expose the deep interrelationships imposed by nature between disparate realms of the universe and the amazing consequences of the unifying character of physical law. PMID:24477692
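
    The abstract's central claim, that characteristic masses reduce to combinations of a few constants, can be checked to order of magnitude: the Chandrasekhar scale is the Planck mass cubed divided by the proton mass squared. A sketch using rounded CODATA-style constant values, with all dimensionless prefactors omitted:

```python
import math

# Fundamental constants (SI, rounded)
HBAR = 1.0546e-34     # reduced Planck constant, J s
C = 2.998e8           # speed of light, m/s
G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
M_PROTON = 1.673e-27  # proton mass, kg
M_SUN = 1.989e30      # solar mass, kg

m_planck = math.sqrt(HBAR * C / G)           # ~2.2e-8 kg
m_chandra_scale = m_planck**3 / M_PROTON**2  # Chandrasekhar mass scale, kg
ratio = m_chandra_scale / M_SUN              # order unity, as the abstract implies
```

    The ratio comes out of order one, confirming that a stellar mass scale really does follow from ħ, c, G, and the proton mass alone.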

  10. Holographic viscosity of fundamental matter.

    PubMed

    Mateos, David; Myers, Robert C; Thomson, Rowan M

    2007-03-01

    A holographic dual of a finite-temperature SU(Nc) gauge theory with a small number of flavors Nf << Nc contains D-brane probes in a black hole background; the shear viscosity to entropy density ratio of such theories saturates the conjectured bound eta/s > or = 1/4pi. Given the known results for the entropy density, the contribution of the fundamental matter eta fund is therefore enhanced at strong 't Hooft coupling lambda; for example, eta fund approximately lambda NcNfT3 in four dimensions. Other transport coefficients are analogously enhanced. These results hold with or without a baryon number chemical potential. PMID:17358523

  11. [INFORMATION, A FUNDAMENTAL PATIENT RIGHT?].

    PubMed

    Mémeteau, Gérard

    2015-03-01

    Although written before the "Lambert" case, which has prompted reflection on refusal and consent in the context of both domestic and treaty-based rights--and at the patient's bedside!--these brief remarks present the patient's right to medical information as a so-called fundamental right. Yet this right can only be understood in relation to a treatment or other medical act; otherwise it has no purpose and remains a purely academic exercise, however stimulating, of little use in itself. What if we reversed the terms of the problem: what of the doctor's right to information? (See the fine thesis of Ph. Gaston, Paris 8, 2 December 2014.)

  12. Fundamental studies of polymer filtration

    SciTech Connect

    Smith, B.F.; Lu, M.T.; Robison, T.W.; Rogers, Y.C.; Wilson, K.V.

    1998-12-31

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). The objectives of this project were (1) to develop an enhanced fundamental understanding of the coordination chemistry of hazardous-metal-ion complexation with water-soluble metal-binding polymers, and (2) to exploit this knowledge to develop improved separations for analytical methods, metals processing, and waste treatment. We investigated features of water-soluble metal-binding polymers that affect their binding constants and selectivity for selected transition metal ions. We evaluated backbone polymers using light scattering and ultrafiltration techniques to determine the effect of pH and ionic strength on the molecular volume of the polymers. The backbone polymers were incrementally functionalized with a metal-binding ligand. A procedure and analytical method to determine the absolute level of functionalization was developed and the results correlated with the elemental analysis, viscosity, and molecular size.

  15. Cognition is … Fundamentally Cultural

    PubMed Central

    Bender, Andrea; Beller, Sieghard

    2013-01-01

    A prevailing concept of cognition in psychology is inspired by the computer metaphor. Its focus on mental states that are generated and altered by information input, processing, storage and transmission invites a disregard for the cultural dimension of cognition, based on three (implicit) assumptions: cognition is internal, processing can be distinguished from content, and processing is independent of cultural background. Arguing against each of these assumptions, we point out how culture may affect cognitive processes in various ways, drawing on instances from numerical cognition, ethnobiological reasoning, and theory of mind. Given the pervasive cultural modulation of cognition—on all of Marr’s levels of description—we conclude that cognition is indeed fundamentally cultural, and that consideration of its cultural dimension is essential for a comprehensive understanding. PMID:25379225

  16. Fundamental issues in questionnaire design.

    PubMed

    Murray, P

    1999-07-01

    The questionnaire is probably the most common data collection tool used in nursing research. There is a misconception that anyone with a clear grasp of English and a modicum of common sense can design an effective questionnaire. Contrary to this common belief, this article demonstrates that questionnaire design is a complex and time-consuming process, but a necessary labour to ensure that valid and reliable data are collected. In addition, meticulous construction is more likely to yield data that can be utilized in the pursuit of objective, quantitative and generalizable truths, upon which practice and policy decisions can be formulated. This article examines a myriad of fundamental issues surrounding questionnaire design, encompassing question wording, question order, presentation, administration and data collection, among other issues.

  17. Fundamentals of air quality systems

    SciTech Connect

    Noll, K.E.

    1999-08-01

    The book uses numerous examples to demonstrate how basic design concepts can be applied to the control of air emissions from industrial sources. It focuses on the design of air pollution control devices for the removal of gases and particles from industrial sources, and provides detailed, specific design methods for each major air pollution control system. Individual chapters provide design methods that include both theory and practice with emphasis on the practical aspect by providing numerous examples that demonstrate how air pollution control devices are designed. Contents include air pollution laws, air pollution control devices; physical properties of air, gas laws, energy concepts, pressure; motion of airborne particles, filter and water drop collection efficiency; fundamentals of particulate emission control; cyclones; fabric filters; wet scrubbers; electrostatic precipitators; control of volatile organic compounds; adsorption; incineration; absorption; control of gaseous emissions from motor vehicles; practice problems (with solutions) for the P.E. examination in environmental engineering. Design applications are featured throughout.

  18. Rare Isotopes and Fundamental Symmetries

    NASA Astrophysics Data System (ADS)

    Brown, B. Alex; Engel, Jonathan; Haxton, Wick; Ramsey-Musolf, Michael; Romalis, Michael; Savard, Guy

    2009-01-01

    Experiments searching for new interactions in nuclear beta decay / Klaus P. Jungmann -- The beta-neutrino correlation in sodium-21 and other nuclei / P. A. Vetter ... [et al.] -- Nuclear structure and fundamental symmetries/ B. Alex Brown -- Schiff moments and nuclear structure / J. Engel -- Superallowed nuclear beta decay: recent results and their impact on V[symbol] / J. C. Hardy and I. S. Towner -- New calculation of the isospin-symmetry breaking correction to superallowed Fermi beta decay / I. S. Towner and J. C. Hardy -- Precise measurement of the [symbol]H to [symbol]He mass difference / D. E. Pinegar ... [et al.] -- Limits on scalar currents from the 0+ to 0+ decay of [symbol]Ar and isospin breaking in [symbol]Cl and [symbol]Cl / A. Garcia -- Nuclear constraints on the weak nucleon-nucleon interaction / W. C. Haxton -- Atomic PNC theory: current status and future prospects / M. S. Safronova -- Parity-violating nucleon-nucleon interactions: what can we learn from nuclear anapole moments? / B. Desplanques -- Proposed experiment for the measurement of the anapole moment in francium / A. Perez Galvan ... [et al.] -- The Radon-EDM experiment / Tim Chupp for the Radon-EDM collaboration -- The lead radius experiment (PREX) and parity-violating measurements of neutron densities / C. J. Horowitz -- Nuclear structure aspects of Schiff moment and search for collective enhancements / Naftali Auerbach and Vladimir Zelevinsky -- The interpretation of atomic electric dipole moments: Schiff theorem and its corrections / C. -P. Liu -- T-violation and the search for a permanent electric dipole moment of the mercury atom / M. D. Swallows ... [et al.] -- The new concept for FRIB and its potential for fundamental interactions studies / Guy Savard -- Collinear laser spectroscopy and polarized exotic nuclei at NSCL / K. Minamisono -- Environmental dependence of masses and coupling constants / M. Pospelov.

  19. Fundamental enabling issues in nanotechnology :

    SciTech Connect

    Floro, Jerrold Anthony; Foiles, Stephen Martin; Hearne, Sean Joseph; Hoyt, Jeffrey John; Seel, Steven Craig; Webb, Edmund Blackburn,; Morales, Alfredo Martin; Zimmerman, Jonathan A.

    2007-10-01

    To effectively integrate nanotechnology into functional devices, fundamental aspects of material behavior at the nanometer scale must be understood. Stresses generated during thin film growth strongly influence component lifetime and performance; stress has also been proposed as a mechanism for stabilizing supported nanoscale structures. Yet the intrinsic connections between the evolving morphology of supported nanostructures and stress generation are still a matter of debate. This report presents results from a combined experiment and modeling approach to study stress evolution during thin film growth. Fully atomistic simulations are presented predicting stress generation mechanisms and magnitudes during all growth stages, from island nucleation to coalescence and film thickening. Simulations are validated by electrodeposition growth experiments, which establish the dependence of microstructure and growth stresses on process conditions and deposition geometry. Sandia is one of the few facilities with the resources to combine experiments and modeling/theory in this close a fashion. Experiments revealed an ongoing coalescence process that generates significant tensile stress. Data from deposition experiments also support the existence of a kinetically limited compressive stress generation mechanism. Atomistic simulations explored island coalescence and deposition onto surfaces intersected by grain boundary structures to permit investigation of stress evolution during later growth stages, e.g., continual island coalescence and adatom incorporation into grain boundaries. The predictive capabilities of simulation permit direct determination of fundamental processes active in stress generation at the nanometer scale while connecting those processes, via new theory, to continuum models for much larger island and film structures. Our combined experiment and simulation results reveal the materials science necessary to tailor stress, and therefore performance, in nanoscale structures.

  20. Physical principles of hearing

    NASA Astrophysics Data System (ADS)

    Martin, Pascal

    2015-10-01

    The following sections are included: * Psychophysical properties of hearing * The cochlear amplifier * Mechanosensory hair cells * The "critical" oscillator as a general principle of auditory detection * Bibliography

  1. Time-Varying Fundamental Constants

    NASA Astrophysics Data System (ADS)

    Olive, Keith

    2003-04-01

    Recent data from quasar absorption systems can be interpreted as arising from a time variation in the fine-structure constant. However, there are numerous cosmological, astrophysical, and terrestrial bounds on any such variation. These include bounds from Big Bang Nucleosynthesis (from the ^4He abundance), the Oklo reactor (from the resonant neutron capture cross-section of Sm), and from meteoritic lifetimes of heavy radioactive isotopes. The bounds on the variation of the fine-structure constant are significantly strengthened in models where all gauge and Yukawa couplings vary in a dependent manner, as would be expected in unified theories. Models which are consistent with all data are severely challenged when Equivalence Principle constraints are imposed.

  2. Influenza SIRS with Minimal Pneumonitis

    PubMed Central

    Erramilli, Shruti; Mannam, Praveen; Manthous, Constantine A.

    2016-01-01

    Although systemic inflammatory response syndrome (SIRS) is a known complication of severe influenza pneumonia, it has been reported very rarely in patients with minimal parenchymal lung disease. We here report a case of severe SIRS, anasarca, and marked vascular phenomena with minimal or no pneumonitis. This case highlights that viruses, including influenza, may cause vascular dysregulation causing SIRS, even without substantial visceral organ involvement. PMID:27630988

  3. Influenza SIRS with Minimal Pneumonitis.

    PubMed

    Erramilli, Shruti; Mannam, Praveen; Manthous, Constantine A

    2016-01-01

    Although systemic inflammatory response syndrome (SIRS) is a known complication of severe influenza pneumonia, it has been reported very rarely in patients with minimal parenchymal lung disease. We here report a case of severe SIRS, anasarca, and marked vascular phenomena with minimal or no pneumonitis. This case highlights that viruses, including influenza, may cause vascular dysregulation causing SIRS, even without substantial visceral organ involvement. PMID:27630988

  4. Minimally invasive dentistry and the dental enterprise.

    PubMed

    Rossomando, Edward F

    2007-03-01

    Improvements in understanding the process of remineralization have resulted in a reappraisal of repair of damaged tooth structure and call into question the principles of cavity preparation of GV Black and his principle of "extension for prevention." From this reappraisal has emerged the idea of minimally invasive dentistry (MID). The goal of MID is to remove as little of the sound tooth structure during the restoration phase as possible. This goal is within our reach in part because of the availability of products that promote mineralization and of dental excavation instruments, like the dental laser, that can be managed to remove only damaged tooth structure. It is critical that the leaders of the dental enterprise endorse MID. Delay could allow new products to move from the dental profession to other health care providers. For example, a caries vaccine will soon enter the marketplace. Will dentists expand the scope of their practices to include the application of this vaccine, or will they ignore this new product and allow the new technology to enter the scope of practice of other health providers?

  5. Postoperative infections: general principles and considerations.

    PubMed

    Downey, M S; Lamy, C J

    1990-07-01

    Every surgeon should have a thorough knowledge and awareness of the general principles of postoperative infections. The key to postoperative infections is in their prevention. Even with the most prudent and ardent regimens, however, postoperative wound infections will occasionally occur. Thus, the aforementioned knowledge will allow improved clinical acumen and permit the early diagnosis of postoperative infection. Early and vigorous local wound care combined with systemic antibiotics is necessary to minimize the potentially debilitating sequelae of the postoperative wound infection.

  6. PRINCIPLE OF INTERACTION REGION LOCAL CORRECTION

    SciTech Connect

    WEI,J.

    1999-09-07

    For hadron storage rings like the Relativistic Heavy Ion Collider (RHIC) and the Large Hadron Collider (LHC), the machine performance at collision is usually limited by the field quality of the interaction region (IR) magnets. A robust local correction for the IR region is valuable in improving the dynamic aperture with practically achievable magnet field quality. The authors present in this paper the action-angle kick minimization principle on which the local IR correction for both RHIC and the LHC is based.

  7. Minimally invasive surgery for gastric cancer.

    PubMed

    Güner, Ali; Hyung, Woo Jin

    2014-01-01

    The interest in minimally invasive surgery (MIS) has rapidly increased in recent decades and surgeons have adopted minimally invasive techniques due to its reduced invasiveness and numerous advantages for patients. With increased surgical experience and newly developed surgical instruments, MIS has become the preferred approach not only for benign disease but also for oncologic surgery. Recently, robotic systems have been developed to overcome difficulties of standard laparoscopic instruments during complex procedures. Its advantages including three-dimensional images, tremor filtering, motion scaling, articulated instruments, and stable retraction have created the opportunity to use robotic technology in many procedures including cancer surgery. Gastric cancer is one of the most common causes of cancer-related deaths worldwide. While its overall incidence has decreased worldwide, the proportion of early gastric cancer has increased mainly in eastern countries following mass screening programs. The shift in the paradigm of gastric cancer treatment is toward less invasive approaches in order to improve the patient's quality of life while adhering to oncological principles. In this review, we aimed to summarize the operative strategy and current literature in laparoscopic and robotic surgery for gastric cancer.

  8. [Minimally Invasive Open Surgery for Lung Cancer].

    PubMed

    Nakagawa, Kazuo; Watanabe, Shunichi

    2016-07-01

    Surgeons have long made significant efforts to reduce the invasiveness of surgical procedures. Surgeons always keep in mind that the basic principle in performing less invasive surgical procedures for malignant tumors is to decrease invasiveness for patients without compromising oncological curability and surgical safety. Video-assisted thoracic surgery (VATS) has been used increasingly as a minimally invasive approach to lung cancer surgery. However, whether VATS lobectomy is a less invasive procedure with equivalent or better clinical effect compared with open lobectomy for patients with lung cancer remains controversial because of the absence of randomized prospective studies. The degree of difficulty of anatomical lung resection depends on the degree of fissure development, the mobility of hilar lymph nodes, and the degree of pleural adhesions. During pulmonary surgery, thoracic surgeons always have to deal with not only these difficulties but also unexpected events such as intraoperative bleeding. Recently, we have performed pulmonary resection for lung cancer with a minimally invasive open surgery (MIOS) approach. In this article, we introduce the surgical procedure of MIOS and demonstrate short-term results. Of course, the efficacy of MIOS needs to be further evaluated with long-term results. PMID:27440030

  9. Minimally invasive treatment options in fixed prosthodontics.

    PubMed

    Edelhoff, Daniel; Liebermann, Anja; Beuer, Florian; Stimmelmayr, Michael; Güth, Jan-Frederik

    2016-03-01

    Minimally invasive treatment options have become increasingly feasible in restorative dentistry, due to the introduction of the adhesive technique in combination with restorative materials featuring translucent properties similar to those of natural teeth. Mechanical anchoring of restorations via conventional cementation represents a predominantly subtractive treatment approach that is gradually being superseded by a primarily defect-oriented additive method in prosthodontics. Modifications of conventional treatment procedures have led to the development of an economical approach to the removal of healthy tooth structure. This is possible because the planned treatment outcome is defined in a wax-up before the treatment is commenced and this wax-up is subsequently used as a reference during tooth preparation. Similarly, resin-bonded FDPs and implants have made it possible to preserve the natural tooth structure of potential abutment teeth. This report describes a number of clinical cases to demonstrate the principles of modern prosthetic treatment strategies and discusses these approaches in the context of minimally invasive prosthetic dentistry.

  10. Minimally invasive treatment options in fixed prosthodontics.

    PubMed

    Edelhoff, Daniel; Liebermann, Anja; Beuer, Florian; Stimmelmayr, Michael; Güth, Jan-Frederik

    2016-03-01

    Minimally invasive treatment options have become increasingly feasible in restorative dentistry, due to the introduction of the adhesive technique in combination with restorative materials featuring translucent properties similar to those of natural teeth. Mechanical anchoring of restorations via conventional cementation represents a predominantly subtractive treatment approach that is gradually being superseded by a primarily defect-oriented additive method in prosthodontics. Modifications of conventional treatment procedures have led to the development of an economical approach to the removal of healthy tooth structure. This is possible because the planned treatment outcome is defined in a wax-up before the treatment is commenced and this wax-up is subsequently used as a reference during tooth preparation. Similarly, resin-bonded FDPs and implants have made it possible to preserve the natural tooth structure of potential abutment teeth. This report describes a number of clinical cases to demonstrate the principles of modern prosthetic treatment strategies and discusses these approaches in the context of minimally invasive prosthetic dentistry. PMID:26925471

  11. Fundamental Physics in Space: the French Contribution

    NASA Astrophysics Data System (ADS)

    Leon-Hirtz, S.

    2002-01-01

    Relativity and quantum physics provide the framework for contemporary physics in which the relations between matter, space and time have been radically rethought during the past century. Physicists however cannot be satisfied with these two distinct theories and they are seeking to unify them and thereby quantify the gravitational field. The key of this research lies in the highly precise study of the gravitational laws. Space environment, allowing large distance experiments and isolation from terrestrial noise, is the ideal place for carrying out very precise experiments on gravitation and is highly suitable for seeking new interactions that could show up in low-energy conditions. Since 1993, when the scientific community gave its first recommendations, CNES has been working with French research laboratories on a variety of advanced technical instrumentation needed to fulfill such space experiments, especially in the fields of electrostatic microaccelerometers, cold atom clocks and cold atom inertial sensors, optical time stamping, optical interferometry and drag-free control.
A number of Fundamental Physics projects are now in progress, in the frame of the national programme and of participation in the ESA programme, such as: -the MICROSCOPE microsatellite project, aimed at testing the Equivalence Principle between inertial mass and gravitational mass at a high level of precision, which is the fourth CNES scientific project based on the MYRIADE microsatellite series, -the PHARAO cold-atom clock, which is the heart of the ACES (Atomic Clock Ensemble in Space) European project located on an external pallet of the International Space Station, together with a Swiss H-maser and a microwave link for comparison with ground clocks, aimed at relativistic tests and measurement of universal constants, -the T2L2 optical link allowing comparison of ultra-stable and ultra-precise clocks, -contribution to the AMS spectrometer aimed at the search for cosmic antimatter, on

  12. Minimally invasive dentistry: paradigm shifts in preparation design.

    PubMed

    LeSage, Brian P

    2009-01-01

    While the concept of minimally invasive dentistry has long been considered a rational, viable approach to restorative care, preparation design, material science, and long-term evidentiary support have only recently begun to provide the foundation necessary to support such treatment in the everyday practice. This article reviews the fundamental paradigm shift evidenced in contemporary prosthodontics as required to facilitate the emerging interest in delivering conservative restorative alternatives.

  13. Principles of learning.

    PubMed

    Voith, V L

    1986-12-01

    This article discusses some general principles of learning as well as possible constraints and how such principles can apply to horses. A brief review is presented of experiments that were designed to assess learning in horses. The use of behavior modification techniques to treat behavior problems in horses is discussed and several examples of the use of these techniques are provided. PMID:3492241

  14. Hamilton's Principle for Beginners

    ERIC Educational Resources Information Center

    Brun, J. L.

    2007-01-01

    I find that students have difficulty with Hamilton's principle, at least the first time they come into contact with it, and therefore it is worth designing some examples to help students grasp its complex meaning. This paper supplies the simplest example to consolidate the learning of the quoted principle: that of a free particle moving along a…
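    The free-particle case the article uses can be checked numerically: among all paths with the same fixed endpoints, the straight line yields the smallest discretized action. The sketch below is illustrative only (not from the article); the discretization, step count, and perturbation are assumptions.

```python
import math

def action(xs, dt, m=1.0):
    """Discretized free-particle action: S = sum of (1/2) m v^2 dt per segment."""
    return sum(0.5 * m * ((xs[i + 1] - xs[i]) / dt) ** 2 * dt
               for i in range(len(xs) - 1))

N, T = 200, 1.0                     # time steps and total time
dt = T / N
# Straight-line path from x=0 to x=1: the classical trajectory
straight = [i / N for i in range(N + 1)]
# Same endpoints, plus a sinusoidal wiggle that vanishes at both ends
wiggled = [i / N + 0.1 * math.sin(math.pi * i / N) for i in range(N + 1)]

S_straight = action(straight, dt)   # equals m/(2T) = 0.5 for unit displacement
S_wiggled = action(wiggled, dt)
print(S_straight < S_wiggled)       # the straight path minimizes the action
```

Any endpoint-preserving perturbation only adds kinetic energy, so the comparison holds for other wiggle shapes as well.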

  15. From fundamental fields to constituent quarks and nucleon form factors

    SciTech Connect

    Coester, F.

    1990-01-01

    Constituent-quark models formulated in the framework of nonrelativistic quantum mechanics have been successful in accounting for the mass spectra of mesons and baryons. Applications to elastic electron scattering require relativistic dynamics. Relativistic quantum mechanics of constituent quarks can be formulated by constructing a suitable unitary representation of the Poincare group on the three-quark Hilbert space. The mass and spin operators of this representation specify the relativistic model dynamics. The dynamics of fundamental quark fields, on the other hand, is specified by a Euclidean functional integral. In this paper I show how the dynamics of the fundamental fields can be related in principle to the Hamiltonian dynamics of quark particles through the properties of the Wightman functions. 14 refs.

  16. A critique of principlism.

    PubMed

    Clouser, K D; Gert, B

    1990-04-01

    The authors use the term "principlism" to refer to the practice of using "principles" to replace both moral theory and particular moral rules and ideals in dealing with the moral problems that arise in medical practice. The authors argue that these "principles" do not function as claimed, and that their use is misleading both practically and theoretically. The "principles" are in fact not guides to action, but rather they are merely names for a collection of sometimes superficially related matters for consideration when dealing with a moral problem. The "principles" lack any systematic relationship to each other, and they often conflict with each other. These conflicts are unresolvable, since there is no unified moral theory from which they are all derived. For comparison the authors sketch the advantages of using a unified moral theory. PMID:2351895

  17. Challenging the principle of proportionality.

    PubMed

    Andersson, Anna-Karin Margareta

    2016-04-01

    The first objective of this article is to examine one aspect of the principle of proportionality (PP) as advanced by Alan Gewirth in his 1978 book Reason and Morality. Gewirth claims that being capable of exercising agency to some minimal degree is a property that justifies having at least prima facie rights not to get killed. However, according to the PP, before the being possesses the capacity for exercising agency to that minimal degree, the extent of her rights depends on to what extent she approaches possession of agential capacities. One interpretation of PP holds that variations in degree of possession of the physical constitution necessary to exercise agency are morally relevant. The other interpretation holds that only variations in degree of actual mental capacity are morally relevant. The first of these interpretations is vastly more problematic than the other. The second objective is to argue that according to the most plausible interpretation of the PP, the fetus' level of development before at least the 20th week of pregnancy does not affect the fetus' moral rights status. I then suggest that my argument is not restricted to such fetuses, although extending my argument to more developed fetuses requires caution. PMID:26839114

  18. The simplicity principle in perception and cognition.

    PubMed

    Feldman, Jacob

    2016-09-01

    The simplicity principle, traditionally referred to as Occam's razor, is the idea that simpler explanations of observations should be preferred to more complex ones. In recent decades the principle has been clarified via the incorporation of modern notions of computation and probability, allowing a more precise understanding of how exactly complexity minimization facilitates inference. The simplicity principle has found many applications in modern cognitive science, in contexts as diverse as perception, categorization, reasoning, and neuroscience. In all these areas, the common idea is that the mind seeks the simplest available interpretation of observations, or, more precisely, that it balances a bias toward simplicity with a somewhat opposed constraint to choose models consistent with perceptual or cognitive observations. This brief tutorial surveys some of the uses of the simplicity principle across cognitive science, emphasizing how complexity minimization in a number of forms has been incorporated into probabilistic models of inference. WIREs Cogn Sci 2016, 7:330-340. doi: 10.1002/wcs.1406 For further resources related to this article, please visit the WIREs website. PMID:27470193
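    The balance between a bias toward simplicity and fidelity to the observations is often formalized as two-part description length: total cost = cost of encoding the model + cost of encoding the data given the model. The toy sketch below is illustrative only (not from the tutorial); the BIC-style parameter penalty and the generated data are assumptions.

```python
import math

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def description_length(sse, n, k):
    """Two-part code (in nats): data cost under a Gaussian residual model
    plus a BIC-style penalty of (1/2) log n per each of the k parameters."""
    return 0.5 * n * math.log(sse / n + 1e-12) + 0.5 * k * math.log(n)

xs = [i / 10 for i in range(50)]
ys = [2.0 + 3.0 * x + 0.05 * math.sin(7 * x) for x in xs]  # near-linear data

# Model 1: constant (k = 1 parameter)
mean_y = sum(ys) / len(ys)
sse_const = sum((y - mean_y) ** 2 for y in ys)
# Model 2: line (k = 2 parameters)
a, b = fit_line(xs, ys)
sse_line = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

dl_const = description_length(sse_const, len(ys), 1)
dl_line = description_length(sse_line, len(ys), 2)
print(dl_line < dl_const)  # the line is the simplest adequate model here
```

The extra parameter of the line is "paid for" by its far smaller residual cost, which is exactly the trade-off complexity-minimization accounts describe.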

  19. Communication: Fitting potential energy surfaces with fundamental invariant neural network

    NASA Astrophysics Data System (ADS)

    Shao, Kejie; Chen, Jun; Zhao, Zhiqiang; Zhang, Dong H.

    2016-08-01

    A more flexible neural network (NN) method using the fundamental invariants (FIs) as the input vector is proposed in the construction of potential energy surfaces for molecular systems involving identical atoms. Mathematically, FIs finitely generate the permutation invariant polynomial (PIP) ring. In combination with NN, fundamental invariant neural network (FI-NN) can approximate any function to arbitrary accuracy. Because FI-NN minimizes the size of input permutation invariant polynomials, it can efficiently reduce the evaluation time of potential energy, in particular for polyatomic systems. In this work, we provide the FIs for all possible molecular systems up to five atoms. Potential energy surfaces for OH3 and CH4 were constructed with FI-NN, with the accuracy confirmed by full-dimensional quantum dynamic scattering and bound state calculations.
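    The key property of FI inputs, invariance under permutations of identical atoms, can be illustrated without any NN machinery. The sketch below is an illustration, not the paper's code: it uses the standard A2B example (two identical atoms plus one distinct atom), where the invariant ring in the interatomic distances is generated by r12, r13 + r23, and r13 * r23, and checks that relabeling the identical atoms leaves the NN input vector unchanged.

```python
import math

def distances(coords):
    """Pairwise distances (r12, r13, r23) for three atoms (A1, A2, B)."""
    a1, a2, b = coords
    return math.dist(a1, a2), math.dist(a1, b), math.dist(a2, b)

def fundamental_invariants(coords):
    """For an A2B molecule the only nontrivial permutation swaps the two
    identical A atoms, which exchanges r13 and r23; these three polynomials
    generate the ring of permutation-invariant polynomials in the distances."""
    r12, r13, r23 = distances(coords)
    return (r12, r13 + r23, r13 * r23)

a1 = (0.0, 0.0, 0.0)
a2 = (1.0, 0.0, 0.0)
b = (0.3, 0.9, 0.0)

fi_original = fundamental_invariants((a1, a2, b))
fi_swapped = fundamental_invariants((a2, a1, b))   # relabel the identical atoms
print(fi_original == fi_swapped)                   # invariant input vector
```

Feeding such invariants to an NN guarantees the fitted surface respects the permutation symmetry exactly, rather than approximately through data augmentation.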

  20. Communication: Fitting potential energy surfaces with fundamental invariant neural network.

    PubMed

    Shao, Kejie; Chen, Jun; Zhao, Zhiqiang; Zhang, Dong H

    2016-08-21

    A more flexible neural network (NN) method using the fundamental invariants (FIs) as the input vector is proposed in the construction of potential energy surfaces for molecular systems involving identical atoms. Mathematically, FIs finitely generate the permutation invariant polynomial (PIP) ring. In combination with NN, fundamental invariant neural network (FI-NN) can approximate any function to arbitrary accuracy. Because FI-NN minimizes the size of input permutation invariant polynomials, it can efficiently reduce the evaluation time of potential energy, in particular for polyatomic systems. In this work, we provide the FIs for all possible molecular systems up to five atoms. Potential energy surfaces for OH3 and CH4 were constructed with FI-NN, with the accuracy confirmed by full-dimensional quantum dynamic scattering and bound state calculations.

  1. Communication: Fitting potential energy surfaces with fundamental invariant neural network.

    PubMed

    Shao, Kejie; Chen, Jun; Zhao, Zhiqiang; Zhang, Dong H

    2016-08-21

    A more flexible neural network (NN) method using the fundamental invariants (FIs) as the input vector is proposed in the construction of potential energy surfaces for molecular systems involving identical atoms. Mathematically, FIs finitely generate the permutation invariant polynomial (PIP) ring. In combination with NN, fundamental invariant neural network (FI-NN) can approximate any function to arbitrary accuracy. Because FI-NN minimizes the size of input permutation invariant polynomials, it can efficiently reduce the evaluation time of potential energy, in particular for polyatomic systems. In this work, we provide the FIs for all possible molecular systems up to five atoms. Potential energy surfaces for OH3 and CH4 were constructed with FI-NN, with the accuracy confirmed by full-dimensional quantum dynamic scattering and bound state calculations. PMID:27544080

  2. Minimal but non-minimal inflation and electroweak symmetry breaking

    NASA Astrophysics Data System (ADS)

    Marzola, Luca; Racioppi, Antonio

    2016-10-01

    We consider the most minimal scale invariant extension of the standard model that allows for successful radiative electroweak symmetry breaking and inflation. The framework involves an extra scalar singlet, that plays the role of the inflaton, and is compatible with current experimental bounds owing to the non-minimal coupling of the latter to gravity. This inflationary scenario predicts a very low tensor-to-scalar ratio r ≈ 10⁻³, typical of Higgs-inflation models, but in contrast yields a scalar spectral index ns ≃ 0.97 which departs from the Starobinsky limit. We briefly discuss the collider phenomenology of the framework.

  3. Do goldfish miss the fundamental?

    NASA Astrophysics Data System (ADS)

    Fay, Richard R.

    2003-10-01

    The perception of harmonic complexes was studied in goldfish using classical respiratory conditioning and a stimulus generalization paradigm. Groups of animals were initially conditioned to several harmonic complexes with a fundamental frequency (f0) of 100 Hz. In some cases the f0 component was present, and in other cases, the f0 component was absent. After conditioning, animals were tested for generalization to novel harmonic complexes having different f0's, some with f0 present and some with f0 absent. Generalization gradients always peaked at 100 Hz, indicating that the pitch value of the conditioning complexes was consistent with the f0, whether or not f0 was present in the conditioning or test complexes. Thus, goldfish do not miss the fundamental with respect to a pitch-like perceptual dimension. However, generalization gradients tended to have different skirt slopes for the f0-present and f0-absent conditioning and test stimuli. This suggests that goldfish distinguish between f0 present/absent stimuli, probably on the basis of a timbre-like perceptual dimension. These and other results demonstrate that goldfish respond to complex sounds as if they possessed perceptual dimensions similar to pitch and timbre as defined for human and other vertebrate listeners. [Work supported by NIH/NIDCD.]
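    The "missing fundamental" property of such stimuli is easy to verify numerically: a complex built only from harmonics 2f0-4f0 still shows a strong autocorrelation peak at the 1/f0 period, a standard correlate of low pitch. The sketch below is illustrative (sample rate, duration, and harmonic choices are assumptions, not the study's stimuli).

```python
import math

fs, f0, dur = 8000, 100.0, 0.2      # sample rate (Hz), fundamental (Hz), seconds
n = int(fs * dur)

def harmonic_complex(harmonics):
    """Equal-amplitude cosine partials at the given harmonic numbers of f0."""
    return [sum(math.cos(2 * math.pi * h * f0 * t / fs) for h in harmonics)
            for t in range(n)]

def autocorr(x, lag):
    return sum(x[i] * x[i + lag] for i in range(len(x) - lag))

with_f0 = harmonic_complex([1, 2, 3, 4])
without_f0 = harmonic_complex([2, 3, 4])    # "missing fundamental" stimulus

period = int(fs / f0)                       # 80 samples = 10 ms, the f0 period
for x in (with_f0, without_f0):
    # the autocorrelation at one f0 period dominates a competing lag,
    # with or without energy at f0 itself
    print(autocorr(x, period) > autocorr(x, period // 2))
```

Both stimuli share the 10 ms periodicity, consistent with the f0-matched pitch generalization; the differing harmonic content is what a timbre-like dimension could pick up.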

  4. Fundamental studies of fusion plasmas

    SciTech Connect

    Aamodt, R.E.; Catto, P.J.; D'Ippolito, D.A.; Myra, J.R.; Russell, D.A.

    1992-05-26

    The major portion of this program is devoted to critical ICH phenomena. The topics include edge physics, fast wave propagation, ICH induced high frequency instabilities, and a preliminary antenna design for Ignitor. This research was strongly coordinated with the world's experimental and design teams at JET, Culham, ORNL, and Ignitor. The results have been widely publicized at both general scientific meetings and topical workshops, including the specialty workshop on ICRF design and physics sponsored by Lodestar in April 1992. The combination of theory, empirical modeling, and engineering design in this program makes this research particularly important for the design of future devices and for the understanding and performance projections of present tokamak devices. Additionally, the development of a diagnostic of runaway electrons on TEXT has proven particularly useful for the fundamental understanding of energetic electron confinement. This work has led to a better quantitative basis for quasilinear theory and the role of magnetic vs. electrostatic field fluctuations on electron transport. An APS invited talk was given on this subject and collaboration with PPPL personnel was also initiated. Ongoing research on these topics will continue for the remainder of the contract period and the strong collaborations are expected to continue, enhancing both the relevance of the work and its immediate impact on areas needing critical understanding.

  5. Fundamental Solutions and Optimal Control of Neutral Systems

    NASA Astrophysics Data System (ADS)

    Liu, Kai

    In this work, we shall consider standard optimal control problems for a class of neutral functional differential equations in Banach spaces. As the basis of a systematic theory of neutral models, the fundamental solution is constructed and a variation of constants formula of mild solutions is established. Necessary conditions in terms of the solutions of neutral adjoint systems are established to deal with the fixed time integral convex cost problem of optimality. Based on optimality conditions, the maximum principle for time varying control domain is presented.

  6. Lanthanide upconversion luminescence at the nanoscale: fundamentals and optical properties

    NASA Astrophysics Data System (ADS)

    Nadort, Annemarie; Zhao, Jiangbo; Goldys, Ewa M.

    2016-07-01

    Upconversion photoluminescence is a nonlinear effect where multiple lower energy excitation photons produce higher energy emission photons. This fundamentally interesting process has many applications in biomedical imaging, light source and display technology, and solar energy harvesting. In this review we discuss the underlying physical principles and their modelling using rate equations. We discuss how the understanding of photophysical processes enabled a strategic influence over the optical properties of upconversion especially in rationally designed materials. We subsequently present an overview of recent experimental strategies to control and optimize the optical properties of upconversion nanoparticles, focussing on their emission spectral properties and brightness.
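    The rate-equation modelling mentioned above can be sketched with a minimal two-step energy-transfer model: a pump populates an intermediate level, and pairwise energy transfer promotes population to the emitting level. The model form and all parameters below are illustrative assumptions, not taken from the review; the point is the well-known low-excitation limit in which the upconverted population scales roughly quadratically with pump rate.

```python
def steady_state_populations(pump, w_etu=1e-2, tau1=1.0, tau2=0.5,
                             dt=1e-3, steps=100000):
    """Euler-integrate a minimal two-step upconversion rate model:
       dn1/dt = pump - n1/tau1 - 2*w_etu*n1**2   (intermediate level)
       dn2/dt = w_etu*n1**2   - n2/tau2          (upper, emitting level)
    Arbitrary units; parameters are illustrative, not fitted to any material."""
    n1 = n2 = 0.0
    for _ in range(steps):
        dn1 = pump - n1 / tau1 - 2.0 * w_etu * n1 * n1
        dn2 = w_etu * n1 * n1 - n2 / tau2
        n1 += dt * dn1
        n2 += dt * dn2
    return n1, n2

# Low-excitation limit: the upconverted population responds quadratically,
# so doubling the pump rate roughly quadruples n2.
_, n2_low = steady_state_populations(pump=0.01)
_, n2_2x = steady_state_populations(pump=0.02)
print(n2_2x / n2_low)   # close to 4
```

At high pump rates the energy-transfer term depletes the intermediate level and the slope saturates toward linear, which is one of the handles such models give for interpreting power-dependent emission measurements.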

  7. BOOK REVIEWS: Quantum Mechanics: Fundamentals

    NASA Astrophysics Data System (ADS)

    Whitaker, A.

    2004-02-01

    mechanics, which is assumed, but to examine whether it gives a consistent account of measurement. The conclusion is that after a measurement, interference terms are ‘effectively’ absent; the set of ‘one-to-one correlations between states of the apparatus and the object’ has the same form as that of everyday statistics and is thus a probability distribution. This probability distribution refers to potentialities, only one of which is actually realized in any one trial. Opinions may differ on whether their treatment is any less vulnerable to criticisms such as those of Bell. To sum up, Gottfried and Yan’s book contains a vast amount of knowledge and understanding. As well as explaining the way in which quantum theory works, it attempts to illuminate fundamental aspects of the theory. A typical example is the ‘fable’ elaborated in Gottfried’s article in Nature cited above, that if Newton were shown Maxwell’s equations and the Lorentz force law, he could deduce the meaning of E and B, but if Maxwell were shown Schrödinger’s equation, he could not deduce the meaning of Psi. For use with a well-constructed course (and, of course, this is the avowed purpose of the book; a useful range of problems is provided for each chapter), or for the relative expert getting to grips with particular aspects of the subject or aiming for a deeper understanding, the book is certainly ideal. It might be suggested, though, that, even compared to the first edition, the isolated learner might find the wide range of topics, and the very large number of mathematical and conceptual techniques, introduced in necessarily limited space, somewhat overwhelming. The second book under consideration, that of Schwabl, contains ‘Advanced’ elements of quantum theory; it is designed for a course following on from one for which Gottfried and Yan, or Schwabl’s own `Quantum Mechanics' might be recommended. It is the second edition in English, and is a translation of the third German edition

  8. Improved fundamental frequency coding in cochlear implant signal processing.

    PubMed

    Milczynski, Matthias; Wouters, Jan; van Wieringen, Astrid

    2009-04-01

    A new signal processing algorithm for improved pitch perception in cochlear implants is proposed. The algorithm realizes fundamental frequency (F0) coding by explicitly modulating the amplitude of the electrical stimulus. The proposed processing scheme is compared with the standard advanced combination encoder strategy in psychophysical music perception related tasks. Possible filter-bank and loudness cues between the strategies under study were minimized to predominantly focus on differences in temporal processing. The results demonstrate significant benefits provided by the new coding strategy for pitch ranking, melodic contour identification, and familiar melody identification. PMID:19354401
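    The core idea, explicitly amplitude-modulating channel envelopes at the fundamental frequency, can be sketched generically. This is not the published algorithm; the envelope rate, modulation depth, and function names are assumptions for illustration only.

```python
import math

def modulate_envelope(envelope, f0, depth, fs):
    """Impose explicit F0-rate amplitude modulation on one channel envelope.
    Generic sketch of explicit F0 coding, not the published strategy:
    the gain swings between (1 - depth) and 1 at a rate of f0 Hz."""
    out = []
    for t, e in enumerate(envelope):
        gain = (1.0 - depth) + depth * 0.5 * (1.0 + math.cos(2 * math.pi * f0 * t / fs))
        out.append(e * gain)
    return out

fs, f0, depth = 1000.0, 125.0, 0.8   # illustrative envelope rate, F0, depth
flat = [1.0] * 80                    # a constant channel envelope
mod = modulate_envelope(flat, f0, depth, fs)

period = int(fs / f0)                # 8 samples per F0 cycle
print(abs(mod[0] - mod[period]) < 1e-9)   # envelope now periodic at 1/f0
```

Making the temporal-envelope periodicity explicit in this way is what lets listeners extract F0 from stimulation timing even when filter-bank and loudness cues are equalized across strategies.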

  9. Fundamental mechanisms of micromachine reliability

    SciTech Connect

    DE BOER,MAARTEN P.; SNIEGOWSKI,JEFFRY J.; KNAPP,JAMES A.; REDMOND,JAMES M.; MICHALSKE,TERRY A.; MAYER,THOMAS K.

    2000-01-01

    Due to extreme surface to volume ratios, adhesion and friction are critical properties for reliability of Microelectromechanical Systems (MEMS), but are not well understood. In this LDRD the authors established test structures, metrology and numerical modeling to conduct studies on adhesion and friction in MEMS. They then concentrated on measuring the effect of environment on MEMS adhesion. Polycrystalline silicon (polysilicon) is the primary material of interest in MEMS because of its integrated circuit process compatibility, low stress, high strength and conformal deposition nature. A plethora of useful micromachined device concepts have been demonstrated using Sandia National Laboratories' sophisticated in-house capabilities. One drawback to polysilicon is that in air the surface oxidizes, is high energy and is hydrophilic (i.e., it wets easily). This can lead to catastrophic failure because surface forces can cause MEMS parts that are brought into contact to adhere rather than perform their intended function. A fundamental concern is how environmental constituents such as water will affect adhesion energies in MEMS. The authors first demonstrated an accurate method to measure adhesion as reported in Chapter 1. In Chapters 2 through 5, they then studied the effect of water on adhesion depending on the surface condition (hydrophilic or hydrophobic). As described in Chapter 2, they find that adhesion energy of hydrophilic MEMS surfaces is high and increases exponentially with relative humidity (RH). Surface roughness is the controlling mechanism for this relationship. Adhesion can be reduced by several orders of magnitude by silane coupling agents applied via solution processing. They decrease the surface energy and render the surface hydrophobic (i.e. does not wet easily). However, only a molecular monolayer coats the surface. In Chapters 3-5 the authors map out the extent to which the monolayer reduces adhesion versus RH. 
They find that adhesion is independent of

  10. Fundamental Mechanisms of Interface Roughness

    SciTech Connect

    Randall L. Headrick

    2009-01-06

    Publication quality results were obtained for several experiments and materials systems including: (i) Patterning and smoothening of sapphire surfaces by energetic Ar+ ions. Grazing Incidence Small Angle X-ray Scattering (GISAXS) experiments were performed in the system at the National Synchrotron Light Source (NSLS) X21 beamline. Ar+ ions in the energy range from 300 eV to 1000 eV were used to produce ripples on the surfaces of single-crystal sapphire. It was found that the ripple wavelength varies strongly with the angle of incidence of the ions, increasing significantly as the angle from normal is varied from 55° to 35°. A smooth region was found for ion incidence less than 35° away from normal incidence. In this region a strong smoothening mechanism with strength proportional to the second derivative of the height of the surface was found to be responsible for the effect. The discovery of this phase transition between stable and unstable regimes as the angle of incidence is varied has also stimulated new work by other groups in the field. (ii) Growth of Ge quantum dots on Si(100) and (111). We discovered the formation of quantum wires on 4° misoriented Si(111) using real-time GISAXS during the deposition of Ge. The results represent the first time-resolved GISAXS study of Ge quantum dot formation. (iii) Sputter deposition of amorphous thin films and multilayers composed of WSi2 and Si. Our in-situ GISAXS experiments reveal fundamental roughening and smoothing phenomena on surfaces during film deposition. The main result of this work is that the WSi2 layers actually become smoother during deposition due to the smoothening effect of energetic particles in the sputter deposition process.

  11. Fundamental Studies of Recombinant Hydrogenases

    SciTech Connect

    Adams, Michael W

    2014-01-25

    This research addressed the long-term goals of understanding the assembly and organization of hydrogenase enzymes, of reducing them in size and complexity, of determining structure/function relationships, including energy conservation via charge separation across membranes, and of screening for novel H2 catalysts. A key overall goal of the proposed research was to define and characterize minimal hydrogenases that are produced in high yields and are oxygen-resistant. Remarkably, in spite of decades of research carried out on hydrogenases, it has not been possible to readily manipulate or design the enzyme using molecular biology approaches, since a recombinant form produced in a suitable host has not been available. Such resources are essential if we are to understand what constitutes a "minimal" hydrogenase and design such catalysts with certain properties, such as resistance to oxygen, extreme stability and specificity for a given electron donor. The model system for our studies is Pyrococcus furiosus, a hyperthermophile that grows optimally at 100°C and contains three different nickel-iron [NiFe]-containing hydrogenases. Hydrogenases I and II are cytoplasmic, while the third, MBH, is an integral membrane protein that functions both to evolve H2 and to pump protons. Three important breakthroughs were made during the funding period with P. furiosus soluble hydrogenase I (SHI). First, we produced an active recombinant form of SHI in E. coli by the co-expression of sixteen genes using anaerobically-induced promoters. Second, we genetically engineered P. furiosus to overexpress SHI by an order of magnitude compared to the wild-type strain. Third, we generated the first 'minimal' form of SHI, one that contained two rather than four subunits. This dimeric form was stable and active, and directly interacted with a pyruvate-oxidizing enzyme without any intermediate electron carrier. The research resulted in five peer-reviewed publications.

  12. Astronomia Motivadora no Ensino Fundamental

    NASA Astrophysics Data System (ADS)

    Melo, J.; Voelzke, M. R.

    2008-09-01

    The main objective of this work is to develop students' interest in the sciences through Astronomy. A survey with questions about Astronomy was conducted with 161 elementary school (Ensino Fundamental) students in order to discover their prior knowledge of the subject. It was found, for example, that 29.3% of 6th-grade students correctly answered what an eclipse is, 30.0% of 8th-grade students knew what Astronomy studies, while 42.3% of 5th-grade students could define the Sun. The aim is to expand the participating classes and to work, mainly in a practical way, with: dimensions and scales in the Solar System, the construction of a small telescope, and questions such as day and night, the seasons of the year, and eclipses. The work also seeks to address other Physics content, such as optics in the construction of the telescope, and mechanics in working with scales and measurements and in using a lamp to represent the Sun in the eclipse activity; as well as other subjects, such as Mathematics in unit conversions and rule-of-three problems; Arts in modeling or drawing the planets; History in connection with the search for the origin of the universe; and Informatics, which enables faster searches for information and allows simulations and the visualization of important images. Astronomy is believed to be important in the teaching-learning process because it allows the discussion of intriguing topics such as the origin of the universe, space travel, and the existence (or not) of life on other planets, as well as current topics such as new technologies.

  13. Archimedes' Principle in Action

    ERIC Educational Resources Information Center

    Kires, Marian

    2007-01-01

    The conceptual understanding of Archimedes' principle can be verified in experimental procedures which determine mass and density using a floating object. This is demonstrated by simple experiments using graduated beakers. (Contains 5 figures.)

  14. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1972-01-01

    Collection of two short descriptions of chemical principles seen in life situations: the autocatalytic reaction seen in the bombardier beetle, and molecular potential energy used for quick roasting of beef. Brief reference is also made to methanol lighters. (PS)

  15. Archimedes' principle in action

    NASA Astrophysics Data System (ADS)

    Kireš, Marián

    2007-09-01

    The conceptual understanding of Archimedes' principle can be verified in experimental procedures which determine mass and density using a floating object. This is demonstrated by simple experiments using graduated beakers.
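The floating-object method described in this abstract is a direct application of Archimedes' principle: a floating body displaces fluid whose weight equals its own, so its density follows from the submerged volume fraction read off a graduated beaker. The sketch below illustrates the calculation; the beaker readings are invented for the example, not data from the article.

```python
# Archimedes' principle for a floating object:
# weight = buoyant force  =>  rho_obj * V_total * g = rho_fluid * V_submerged * g,
# so the object's density follows from the submerged volume fraction.

def floating_density(rho_fluid, v_total, v_submerged):
    """Density of a floating object from its submerged volume (consistent units)."""
    if v_submerged > v_total:
        raise ValueError("object would be fully submerged (it sinks)")
    return rho_fluid * v_submerged / v_total

# Illustrative beaker reading: a 50 cm^3 block of wood floating in water
# (rho_water = 1.0 g/cm^3) displaces 30 cm^3.
rho = floating_density(1.0, 50.0, 30.0)
mass = rho * 50.0  # mass = density * total volume
print(rho, mass)   # 0.6 g/cm^3, 30.0 g
```

The same relation run backwards gives the mass: once the density is known, mass is just density times total volume, which is why a single beaker reading determines both quantities.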

  16. LLNL Waste Minimization Program Plan

    SciTech Connect

    Not Available

    1990-02-14

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). The Waste Minimization Policy field has undergone continuous change since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985. A series of informal revisions was made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally harmful pollutants that may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance to hazardous waste generators on the elements of a Waste Minimization Program was issued. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction Reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes, final draft January 1990. This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs.

  17. Minimally invasive aortic valve surgery

    PubMed Central

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-01-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold standard treatment for symptomatic patients, and it has proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and report the results from different experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity, mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, with either a partial upper sternotomy or a right anterior minithoracotomy, provides early- and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  18. Minimally invasive aortic valve surgery.

    PubMed

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-09-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold standard treatment for symptomatic patients, and it has proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and report the results from different experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity, mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, with either a partial upper sternotomy or a right anterior minithoracotomy, provides early- and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  19. What is minimally invasive dentistry?

    PubMed

    Ericson, Dan

    2004-01-01

    Minimally Invasive Dentistry is the application of "a systematic respect for the original tissue." This implies that the dental profession recognizes that an artifact is of less biological value than the original healthy tissue. Minimally invasive dentistry is a concept that can embrace all aspects of the profession. The common denominator is tissue preservation, preferably by preventing disease from occurring and intercepting its progress, but also by removing and replacing with as little tissue loss as possible. It does not suggest that we make small fillings to restore incipient lesions or surgically remove impacted third molars without symptoms as routine procedures. The introduction of predictable adhesive technologies has led to a giant leap in interest in minimally invasive dentistry. The concept bridges the traditional gap between prevention and surgical procedures, which is just what dentistry needs today. The evidence base for survival of restorations clearly indicates that restoring teeth is a temporary palliative measure that is doomed to fail if the disease that caused the condition is not addressed properly. Today, the means, motives and opportunities for minimally invasive dentistry are at hand, but incentives are definitely lacking. Patients and third parties seem to be convinced that the only things that count are replacements: they are prepared to pay for a filling but not for a procedure that can help avoid having one.

  20. A Defense of Semantic Minimalism

    ERIC Educational Resources Information Center

    Kim, Su

    2012-01-01

    Semantic Minimalism is a position about the semantic content of declarative sentences, i.e., the content that is determined entirely by syntax. It is defined by the following two points: "Point 1": The semantic content is a complete/truth-conditional proposition. "Point 2": The semantic content is useful to a theory of…

  1. Assembly of a minimal protocell

    NASA Astrophysics Data System (ADS)

    Rasmussen, Steen

    2007-03-01

    What is minimal life, how can we make it, and how can it be useful? We present experimental and computational results towards bridging nonliving and living matter, which results in life that is different and much simpler than contemporary life. A simple yet tightly coupled catalytic cooperation between genes, metabolism, and container forms the design underpinnings of our protocell, which is a minimal self-replicating molecular machine. Experimentally, we have recently demonstrated this coupling by having an informational molecule (8-oxoguanine) catalytically control the light driven metabolic (Ru-bpy based) production of container materials (fatty acids). This is a significant milestone towards assembling a minimal self-replicating molecular machine. Recent theoretical investigations indicate that coordinated exponential component growth should naturally emerge as a result from such a catalytic coupling between the main protocellular components. A 3-D dissipative particle simulation (DPD) study of the full protocell life-cycle exposes a number of anticipated systemic issues associated with the remaining experimental challenges for the implementation of the minimal protocell. Finally we outline how more general self-replicating materials could be useful.

  2. Principles of Tendon Transfer.

    PubMed

    Wilbur, Danielle; Hammert, Warren C

    2016-08-01

    Tendon transfers provide a substitute, either temporary or permanent, when function is lost due to neurologic injury in stroke, cerebral palsy or central nervous system lesions, peripheral nerve injuries, or injuries to the musculotendinous unit itself. This article reviews the basic principles of tendon transfer, which are important when planning surgery and essential for an optimal outcome. In addition, concepts for coapting the tendons during surgery and general principles to be followed during the rehabilitation process are discussed. PMID:27387072

  3. The fundamental physics explorer: An ESA technology reference study

    NASA Astrophysics Data System (ADS)

    Binns, D. A.; Rando, N.; Cacciapuoti, L.

    2009-04-01

    ESA technology reference studies are used as a process to identify key technologies and technical challenges of potential future missions not yet in the science programme. This paper reports on the study of the Fundamental Physics Explorer (FPE), a re-usable platform targeted to small missions testing fundamental laws of physics in space. The study addresses three specific areas of interest: special and general relativity tests based on atomic clocks, experiments on the Weak Equivalence Principle (WEP), and studies of Bose-Einstein condensates under microgravity conditions. Starting from preliminary science objectives and payload requirements, three reference missions in the small/medium class range are discussed, based on a re-adaptation of the LISA Pathfinder spacecraft. A 700/3600 km elliptic orbit has been selected to conduct clock tests of special and general relativity, a 700 km circular orbit to perform experiments on the Weak Equivalence Principle and to study Bose-Einstein condensates, each mission being based on a three-axis stabilised spacecraft. It was determined that adaptation of LISA Pathfinder would be required in order to meet the demands of the FPE missions. Moreover it was established that specific payload and spacecraft technology development would be required to realise such a programme.

  4. Structural principles for computational and de novo design of 4Fe-4S metalloproteins.

    PubMed

    Nanda, Vikas; Senn, Stefan; Pike, Douglas H; Rodriguez-Granillo, Agustina; Hansen, Will A; Khare, Sagar D; Noy, Dror

    2016-05-01

    Iron-sulfur centers in metalloproteins can access multiple oxidation states over a broad range of potentials, allowing them to participate in a variety of electron transfer reactions and serve as catalysts for high-energy redox processes. The nitrogenase FeMoCo cluster converts dinitrogen to ammonia in an eight-electron transfer process. The 2(Fe4S4)-containing bacterial ferredoxin is an evolutionarily ancient metalloprotein fold and is thought to be a primordial progenitor of extant oxidoreductases. Controlling chemical transformations mediated by iron-sulfur centers, such as nitrogen fixation and hydrogen production as well as electron transfer reactions involved in photosynthesis, is of tremendous importance for sustainable chemistry and energy production initiatives. As such, there is significant interest in the design of iron-sulfur proteins as minimal models to gain fundamental understanding of complex natural systems and as lead molecules for industrial and energy applications. Herein, we discuss salient structural characteristics of natural iron-sulfur proteins and how they guide principles for design. Model structures of past designs are analyzed in the context of these principles, potential directions for enhanced designs are presented, and new areas of iron-sulfur protein design are proposed. This article is part of a Special Issue entitled Biodesign for Bioenergetics--the design and engineering of electronic transfer cofactors, protein networks, edited by Ronald L. Koder and J.L. Ross Anderson.

  5. Structural principles for computational and de novo design of 4Fe-4S metalloproteins.

    PubMed

    Nanda, Vikas; Senn, Stefan; Pike, Douglas H; Rodriguez-Granillo, Agustina; Hansen, Will A; Khare, Sagar D; Noy, Dror

    2016-05-01

    Iron-sulfur centers in metalloproteins can access multiple oxidation states over a broad range of potentials, allowing them to participate in a variety of electron transfer reactions and serve as catalysts for high-energy redox processes. The nitrogenase FeMoCo cluster converts dinitrogen to ammonia in an eight-electron transfer process. The 2(Fe4S4)-containing bacterial ferredoxin is an evolutionarily ancient metalloprotein fold and is thought to be a primordial progenitor of extant oxidoreductases. Controlling chemical transformations mediated by iron-sulfur centers, such as nitrogen fixation and hydrogen production as well as electron transfer reactions involved in photosynthesis, is of tremendous importance for sustainable chemistry and energy production initiatives. As such, there is significant interest in the design of iron-sulfur proteins as minimal models to gain fundamental understanding of complex natural systems and as lead molecules for industrial and energy applications. Herein, we discuss salient structural characteristics of natural iron-sulfur proteins and how they guide principles for design. Model structures of past designs are analyzed in the context of these principles, potential directions for enhanced designs are presented, and new areas of iron-sulfur protein design are proposed. This article is part of a Special Issue entitled Biodesign for Bioenergetics--the design and engineering of electronic transfer cofactors, protein networks, edited by Ronald L. Koder and J.L. Ross Anderson. PMID:26449207

  6. Perfusion Magnetic Resonance Imaging: A Comprehensive Update on Principles and Techniques

    PubMed Central

    Li, Ka-Loh; Ostergaard, Leif; Calamante, Fernando

    2014-01-01

    Perfusion is a fundamental biological function that refers to the delivery of oxygen and nutrients to tissue by means of blood flow. Perfusion MRI is sensitive to microvasculature and has been applied in a wide variety of clinical applications, including the classification of tumors, identification of stroke regions, and characterization of other diseases. Perfusion MRI techniques are classified with or without using an exogenous contrast agent. Bolus methods, with injections of a contrast agent, provide better sensitivity with higher spatial resolution, and are therefore more widely used in clinical applications. However, arterial spin-labeling methods provide a unique opportunity to measure cerebral blood flow without requiring an exogenous contrast agent and have better accuracy for quantification. Importantly, MRI-based perfusion measurements are minimally invasive overall, and do not use any radiation and radioisotopes. In this review, we describe the principles and techniques of perfusion MRI. This review summarizes comprehensive updated knowledge on the physical principles and techniques of perfusion MRI. PMID:25246817
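Bolus-tracking quantification of the kind summarized above ultimately rests on the central volume principle, CBF = CBV / MTT, a standard relation in perfusion imaging (it is not spelled out in this abstract). A minimal sketch with illustrative gray-matter values:

```python
def cbf_from_central_volume(cbv_ml_per_100g, mtt_seconds):
    """Cerebral blood flow (ml/100g/min) via the central volume principle:
    CBF = CBV / MTT, with MTT converted from seconds to minutes."""
    return cbv_ml_per_100g * 60.0 / mtt_seconds

# Illustrative gray-matter values (typical order of magnitude, not from the review):
# CBV ~ 4 ml/100g, MTT ~ 4 s.
print(cbf_from_central_volume(4.0, 4.0))  # 60.0 ml/100g/min
```

In practice CBV and MTT are themselves estimated from the tissue and arterial concentration-time curves (e.g., by deconvolution), which is where the sensitivity differences between bolus and arterial spin-labeling methods arise.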

  7. The uncertainty principle determines the nonlocality of quantum mechanics.

    PubMed

    Oppenheim, Jonathan; Wehner, Stephanie

    2010-11-19

    Two central concepts of quantum mechanics are Heisenberg's uncertainty principle and a subtle form of nonlocality that Einstein famously called "spooky action at a distance." These two fundamental features have thus far been distinct concepts. We show that they are inextricably and quantitatively linked: Quantum mechanics cannot be more nonlocal with measurements that respect the uncertainty principle. In fact, the link between uncertainty and nonlocality holds for all physical theories. More specifically, the degree of nonlocality of any theory is determined by two factors: the strength of the uncertainty principle and the strength of a property called "steering," which determines which states can be prepared at one location given a measurement at another.
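The "degree of nonlocality" the authors quantify is conventionally measured by the CHSH correlation: local hidden-variable theories are bounded by 2, quantum mechanics by Tsirelson's bound 2√2, and the algebraic maximum is 4. As a minimal illustration (not code from the article), the sketch below evaluates the CHSH combination for singlet-state correlations E(a, b) = -cos(a - b) at the optimal measurement angles:

```python
import math

def E(a, b):
    # Singlet-state correlation for spin measurements along angles a and b.
    return -math.cos(a - b)

# Standard optimal CHSH settings for Alice (a, a2) and Bob (b, b2).
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, -math.pi / 4

S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
print(abs(S))  # 2.828... = 2*sqrt(2), Tsirelson's bound
```

The result's gap between 2√2 and the algebraic maximum of 4 is exactly what the paper explains: the uncertainty principle (together with steering) forbids quantum theory from being any more nonlocal than this.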

  8. Cannulation Strategies and Pitfalls in Minimally Invasive Cardiac Surgery.

    PubMed

    Ramchandani, Mahesh; Al Jabbari, Odeaa; Abu Saleh, Walid K; Ramlawi, Basel

    2016-01-01

    For any given cardiac surgery, there are two invasive components: the surgical approach and the cardiopulmonary bypass circuit. The standard approach for cardiac surgery is the median sternotomy, which offers unrestricted access to the thoracic organs: the heart, lung, and major vessels. However, it carries a long list of potential complications such as wound infection, brachial plexus palsies, respiratory dysfunction, and an unpleasant-looking scar. The cardiopulmonary bypass component also carries potential complications such as end-organ dysfunction, coagulopathy, hemodilution, bleeding, and blood transfusion requirement. Furthermore, the aortic manipulation during cannulation and cross clamping increases the risk of dissection, arterial embolization, and stroke. Minimally invasive cardiac surgery is an iconic development in the history of cardiothoracic medicine and has become a widely adopted approach, as it minimizes many of the inconvenient side effects associated with the median sternotomy and bypass circuit placement. This type of surgery requires the use of novel perfusion strategies, especially in patients who hold the highest potential for postoperative morbidity. Cannulation techniques are a fundamental element in minimally invasive cardiac surgery, and there are numerous cannulation procedures for each type of minimally invasive operation. In this review, we will highlight the strategies and pitfalls associated with a minimally invasive cannulation. PMID:27127556

  9. Cannulation Strategies and Pitfalls in Minimally Invasive Cardiac Surgery

    PubMed Central

    Ramchandani, Mahesh; Al Jabbari, Odeaa; Abu Saleh, Walid K.; Ramlawi, Basel

    2016-01-01

    For any given cardiac surgery, there are two invasive components: the surgical approach and the cardiopulmonary bypass circuit. The standard approach for cardiac surgery is the median sternotomy, which offers unrestricted access to the thoracic organs: the heart, lung, and major vessels. However, it carries a long list of potential complications such as wound infection, brachial plexus palsies, respiratory dysfunction, and an unpleasant-looking scar. The cardiopulmonary bypass component also carries potential complications such as end-organ dysfunction, coagulopathy, hemodilution, bleeding, and blood transfusion requirement. Furthermore, the aortic manipulation during cannulation and cross clamping increases the risk of dissection, arterial embolization, and stroke. Minimally invasive cardiac surgery is an iconic development in the history of cardiothoracic medicine and has become a widely adopted approach, as it minimizes many of the inconvenient side effects associated with the median sternotomy and bypass circuit placement. This type of surgery requires the use of novel perfusion strategies, especially in patients who hold the highest potential for postoperative morbidity. Cannulation techniques are a fundamental element in minimally invasive cardiac surgery, and there are numerous cannulation procedures for each type of minimally invasive operation. In this review, we will highlight the strategies and pitfalls associated with a minimally invasive cannulation. PMID:27127556

  10. Anaesthesia for minimally invasive surgery

    PubMed Central

    Dec, Marta

    2015-01-01

    Minimally invasive surgery (MIS) is rising in popularity. It offers well-known benefits to the patient. However, restricted access to the surgical site and gas insufflation into the body cavities may result in severe complications. From the anaesthetic point of view MIS poses unique challenges associated with creation of pneumoperitoneum, carbon dioxide absorption, specific positioning and monitoring a patient to whom the anaesthetist has often restricted access, in a poorly lit environment. Moreover, with refinement of surgical procedures and growing experience the anaesthetist is presented with patients from high-risk groups (obese, elderly, with advanced cardiac and respiratory disease) who once were deemed unsuitable for the laparoscopic technique. Anaesthetic management is aimed at getting the patient safely through the procedure, minimizing the specific risks arising from laparoscopy and the patient's coexisting medical problems, ensuring quick recovery and a relatively pain-free postoperative course with early return to normal function. PMID:26865885

  11. Minimal universal quantum heat machine.

    PubMed

    Gelbwaser-Klimovsky, D; Alicki, R; Kurizki, G

    2013-01-01

    In traditional thermodynamics the Carnot cycle yields the ideal performance bound of heat engines and refrigerators. We propose and analyze a minimal model of a heat machine that can play a similar role in quantum regimes. The minimal model consists of a single two-level system with periodically modulated energy splitting that is permanently and weakly coupled to two spectrally separated heat baths at different temperatures. The equation of motion allows us to compute the stationary power and heat currents in the machine, consistent with the second law of thermodynamics. This dual-purpose machine can act as either an engine or a refrigerator (heat pump) depending on the modulation rate. In both modes of operation, the maximal Carnot efficiency is reached at zero power. We study the conditions for finite-time optimal performance for several variants of the model. Possible realizations of the model are discussed.
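The Carnot bounds referred to above can be stated compactly: an engine's efficiency is capped at η = 1 - Tc/Th, and a refrigerator's coefficient of performance at COP = Tc/(Th - Tc). A minimal sketch with illustrative reservoir temperatures (not values from the article):

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum engine efficiency between reservoirs at t_hot > t_cold (kelvin)."""
    return 1.0 - t_cold / t_hot

def carnot_cop(t_hot, t_cold):
    """Maximum refrigerator coefficient of performance between the same reservoirs."""
    return t_cold / (t_hot - t_cold)

# Illustrative reservoirs: 400 K hot bath, 300 K cold bath.
print(carnot_efficiency(400.0, 300.0))  # 0.25
print(carnot_cop(400.0, 300.0))         # 3.0
```

These are the bounds the two-level machine saturates only at zero power, which is why the paper's finite-time analysis matters for practical operation.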

  12. Fundamentals of Management, 14-1. Military Curriculum Materials for Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This military-developed text consists of six lessons designed to give students an understanding of the fundamentals of management. Covered in the individual lessons are the following topics: the nature of management (leadership and the functions of management); principles and policies of management (management policies, characteristics of good…

  13. Evidence for the Fundamental Difference Hypothesis or Not?: Island Constraints Revisited

    ERIC Educational Resources Information Center

    Belikova, Alyona; White, Lydia

    2009-01-01

    This article examines how changes in linguistic theory affect the debate between the fundamental difference hypothesis and the access-to-Universal Grammar (UG) approach to SLA. With a focus on subjacency (Chomsky, 1973), a principle of UG that places constraints on "wh"-movement and that has frequently been taken as a test case for verifying…

  14. The Fundamental Values of Academic Integrity: Honesty, Trust, Respect, Fairness, Responsibility.

    ERIC Educational Resources Information Center

    Duke Univ., Durham, NC. Center for Academic Integrity.

    The Center for Academic Integrity defines academic integrity as a commitment, even in the face of adversity, to five fundamental values: honesty, trust, fairness, respect, and responsibility. From these values come principles of behavior that enable academic communities to translate ideals into action. This essay discusses each of these values and…

  15. U.S. Geological Survey Fundamental Science Practices

    USGS Publications Warehouse

    ,

    2011-01-01

    The USGS has a long and proud tradition of objective, unbiased science in service to the Nation. A reputation for impartiality and excellence is one of our most important assets. To help preserve this vital asset, in 2004 the Executive Leadership Team (ELT) of the USGS was charged by the Director to develop a set of fundamental science practices, philosophical premises, and operational principles as the foundation for all USGS research and monitoring activities. In a concept document, 'Fundamental Science Practices of the U.S. Geological Survey', the ELT proposed 'a set of fundamental principles to underlie USGS science practices.' The document noted that protecting the reputation of USGS science for quality and objectivity requires the following key elements: - Clearly articulated, Bureau-wide fundamental science practices. - A shared understanding at all levels of the organization that the health and future of the USGS depend on following these practices. - The investment of budget, time, and people to ensure that the USGS reputation and high-quality standards are maintained. The USGS Fundamental Science Practices (FSP) encompass all elements of research investigations, including data collection, experimentation, analysis, writing results, peer review, management review, and Bureau approval and publication of information products. The focus of FSP is on how science is carried out and how products are produced and disseminated. FSP is not designed to address the question of what work the USGS should do; that is addressed in USGS science planning handbooks and other documents. Building from longstanding existing USGS policies and the ELT concept document, in May 2006, FSP policies were developed with input from all parts of the organization and were subsequently incorporated into the Bureau's Survey Manual. In developing an implementation plan for FSP policy, the intent was to recognize and incorporate the best of USGS current practices to obtain the optimum

  16. Atomically Precise Colloidal Metal Nanoclusters and Nanoparticles: Fundamentals and Opportunities.

    PubMed

    Jin, Rongchao; Zeng, Chenjie; Zhou, Meng; Chen, Yuxiang

    2016-09-28

    Colloidal nanoparticles are being intensely pursued in current nanoscience research. Nanochemists are often frustrated by the well-known fact that no two nanoparticles are the same, which precludes the deep understanding of many fundamental properties of colloidal nanoparticles in which the total structures (core plus surface) must be known. Therefore, controlling nanoparticles with atomic precision and solving their total structures have long been major dreams for nanochemists. Recently, these goals are partially fulfilled in the case of gold nanoparticles, at least in the ultrasmall size regime (1-3 nm in diameter, often called nanoclusters). This review summarizes the major progress in the field, including the principles that permit atomically precise synthesis, new types of atomic structures, and unique physical and chemical properties of atomically precise nanoparticles, as well as exciting opportunities for nanochemists to understand very fundamental science of colloidal nanoparticles (such as the stability, metal-ligand interfacial bonding, ligand assembly on particle surfaces, aesthetic structural patterns, periodicities, and emergence of the metallic state) and to develop a range of potential applications such as in catalysis, biomedicine, sensing, imaging, optics, and energy conversion. Although most of the research activity currently focuses on thiolate-protected gold nanoclusters, important progress has also been achieved in other ligand-protected gold, silver, and bimetal (or alloy) nanoclusters. All of these types of unique nanoparticles will bring unprecedented opportunities, not only in understanding the fundamental questions of nanoparticles but also in opening up new horizons for scientific studies of nanoparticles. PMID:27585252

  18. The traveltime holographic principle

    NASA Astrophysics Data System (ADS)

    Huang, Yunsong; Schuster, Gerard T.

    2015-01-01

    Fermat's interferometric principle is used to compute interior transmission traveltimes τpq from exterior transmission traveltimes τsp and τsq. Here, the exterior traveltimes are computed for sources s on a boundary B that encloses a volume V of interior points p and q. Once the exterior traveltimes are computed, no further ray tracing is needed to calculate the interior times τpq. Therefore this interferometric approach can be more efficient than explicitly computing interior traveltimes τpq by ray tracing. Moreover, the memory requirement of the traveltimes is reduced by one dimension, because the boundary B is of one fewer dimension than the volume V. An application of this approach is demonstrated with interbed multiple (IM) elimination. Here, the IMs in the observed data are predicted from the migration image and are subsequently removed by adaptive subtraction. This prediction is enabled by the knowledge of interior transmission traveltimes τpq computed according to Fermat's interferometric principle. We denote this principle as the `traveltime holographic principle', by analogy with the holographic principle in cosmology where information in a volume is encoded on the region's boundary.
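
    The interferometric computation can be sketched numerically. The following is a minimal illustration, assuming a homogeneous medium (straight rays, traveltime = distance/velocity) and the reverse-triangle form of Fermat's interferometric principle, τpq = max over s of (τsq − τsp); the circular boundary geometry and all numerical values are invented for the example.

```python
import math

def exterior_times(sources, pts, v=1.0):
    """Traveltimes from boundary sources to interior points
    (homogeneous medium: traveltime = distance / velocity)."""
    return {(s, p): math.dist(s, p) / v for s in sources for p in pts}

def interior_time(tau, sources, p, q):
    """Interferometric estimate of tau_pq from exterior times only:
    tau_pq = max over boundary sources s of (tau_sq - tau_sp)."""
    return max(tau[(s, q)] - tau[(s, p)] for s in sources)

# Boundary B: a circle of radius 10 enclosing interior points p and q.
n = 720
boundary = [(10 * math.cos(2 * math.pi * k / n),
             10 * math.sin(2 * math.pi * k / n)) for k in range(n)]
p, q = (1.0, 0.0), (3.0, 0.0)
tau = exterior_times(boundary, [p, q])
est = interior_time(tau, boundary, p, q)
print(est)  # close to the true straight-ray time |pq|/v = 2.0
```

    With sufficiently dense boundary sampling the estimate converges to the true two-point time, and no interior ray tracing is ever performed, which is the efficiency argument made in the abstract.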

  19. Spaceborne receivers: Basic principles

    NASA Technical Reports Server (NTRS)

    Stacey, J. M.

    1984-01-01

    The underlying principles of operation of microwave receivers for space observations of planetary surfaces were examined. The design philosophy of the receiver as it is applied to operate functionally as an efficient receiving system, the principle of operation of the key components of the receiver, and the important differences among receiver types are explained. The operating performance and the sensitivity expectations for both the modulated and total power receiver configurations are outlined. The expressions are derived from first principles and are developed through the important intermediate stages to form practical and easily applied equations. The transfer of thermodynamic energy from point to point within the receiver is illustrated. The language of microwave receivers is applied statistics.
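
    The sensitivity expectations mentioned above are conventionally captured by the radiometer equation. A minimal sketch, assuming the textbook forms ΔT = T_sys/√(Bτ) for a total-power receiver and twice that for a modulated (Dicke) receiver; the numerical values are illustrative assumptions, not figures from the cited report.

```python
import math

def total_power_sensitivity(t_sys, bandwidth_hz, tau_s):
    """Textbook total-power radiometer equation: dT = T_sys / sqrt(B * tau)."""
    return t_sys / math.sqrt(bandwidth_hz * tau_s)

def dicke_sensitivity(t_sys, bandwidth_hz, tau_s):
    """A Dicke (modulated) receiver spends half its time on the reference
    load, which doubles the radiometric uncertainty."""
    return 2.0 * total_power_sensitivity(t_sys, bandwidth_hz, tau_s)

# Illustrative numbers (assumed): 500 K system temperature,
# 100 MHz predetection bandwidth, 1 s integration time.
dt_total = total_power_sensitivity(500.0, 100e6, 1.0)  # 0.05 K
dt_dicke = dicke_sensitivity(500.0, 100e6, 1.0)        # 0.10 K
print(dt_total, dt_dicke)
```

    The factor-of-two penalty is the classic trade made by the modulated configuration in exchange for immunity to gain fluctuations.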

  20. In quest of constitutional principles of "neurolaw".

    PubMed

    Pizzetti, Federico Gustavo

    2011-01-01

    The growing use of brain imaging technology and the development of cognitive neuroscience pose unaccustomed challenges to legal systems. Until now, the fields of law most affected have been civil and criminal law and procedure, but the constitutional dimension of "neurolaw" cannot be underestimated. As the capacity to investigate and trace brain mechanisms and functional neural activity increases, it becomes urgent to recognize and define the inalienable rights and fundamental values that must be protected and safeguarded, at the "constitutional level" of norms, against this new techno-scientific power: human dignity, personal identity, authenticity, and the pursuit of individual "happiness". As with the laws regulating research and experimentation on the human genome adopted in past years, one may also ask whether these fundamental principles of "neurolaw" should be fixed and regulated at the European and international level as well. PMID:23057208

  2. Radiological images on personal computers: introduction and fundamental principles of digital images.

    PubMed

    Gillespy, T; Rowberg, A H

    1993-05-01

    This series of articles will explore the issues related to displaying, manipulating, and analyzing radiological images on personal computers (PCs). This first article discusses the digital image data file, standard PC graphic file formats, and various methods for importing radiological images into the PC. PMID:8334176

  3. Two Essays on Learning Disabilities in the Application of Fundamental Financial Principles

    ERIC Educational Resources Information Center

    Auciello, Daria Joy

    2010-01-01

    This dissertation consists of two essays which examine the relationship between dyslexia and the application and acquisition of financial knowledge. Recent behavioral research has documented that factors such as representativeness, overconfidence, loss aversion, naivete, wealth, age and gender all impact a person's risk perception and asset…

  4. Enhancing Student Learning in Marketing Courses: An Exploration of Fundamental Principles for Website Platforms

    ERIC Educational Resources Information Center

    Hollenbeck, Candice R.; Mason, Charlotte H.; Song, Ji Hee

    2011-01-01

    The design of a course has potential to help marketing students achieve their learning objectives. Marketing courses are increasingly turning to technology to facilitate teaching and learning, and pedagogical tools such as Blackboard, WebCT, and e-Learning Commons are essential to the design of a course. Here, the authors investigate the research…

  5. Integrating Fundamental Principles Underlying Somatic Practices into the Dance Technique Class

    ERIC Educational Resources Information Center

    Brodie, Julie; Lobel, Elin

    2004-01-01

    Integrating somatic practices into the dance technique class by bringing awareness to the bodily processes of breathing, sensing, connecting, and initiating can help students reconnect the mind with the body within the context of the classroom environment. Dance educators do not always have the resources to implement separate somatics courses…

  6. 75 FR 71317 - Fundamental Principles and Policymaking Criteria for Partnerships With Faith-Based and Other...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... neighborhood organizations, agencies that administer social service programs or that support (including through prime awards or sub-awards) social service programs with Federal financial assistance shall, to the... assistance for social service programs should be distributed in the most effective and efficient...

  7. 3 CFR 13559 - Executive Order 13559 of November 17, 2010. Fundamental Principles and Policymaking Criteria for...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...: “(e) ‘Specified agency heads’ means: (i) the Attorney General; (ii) the Secretary of Agriculture; (iii...; (vi) the Secretary of Housing and Urban Development; (vii) the Secretary of Education; (viii) the... Interior; (iv) the Department of Agriculture; (v) the Department of Commerce; (vi) the Department of...

  8. Nanotechnology in hyperthermia cancer therapy: From fundamental principles to advanced applications.

    PubMed

    Beik, Jaber; Abed, Ziaeddin; Ghoreishi, Fatemeh S; Hosseini-Nami, Samira; Mehrzadi, Saeed; Shakeri-Zadeh, Ali; Kamrava, S Kamran

    2016-08-10

    In this work, we present an in-depth review of recent breakthroughs in nanotechnology for hyperthermia cancer therapy. Conventional hyperthermia methods do not thermally discriminate between the target and the surrounding normal tissues, and this non-selective tissue heating can lead to serious side effects. Nanotechnology is expected to have great potential to revolutionize current hyperthermia methods. To find an appropriate place in cancer treatment, all nanotechnology-based hyperthermia methods and their risks/benefits must be thoroughly understood. In this review paper, we extensively examine and compare four modern nanotechnology-based hyperthermia methods. For each method, the possible physical mechanisms of heat generation and enhancement due to the presence of nanoparticles are explained, and recent in vitro and in vivo studies are reviewed and discussed. Nano-Photo-Thermal Therapy (NPTT) and Nano-Magnetic Hyperthermia (NMH) are reviewed as the two first exciting approaches for targeted hyperthermia. The third novel hyperthermia method, Nano-Radio-Frequency Ablation (NaRFA) is discussed together with the thermal effects of novel nanoparticles in the presence of radiofrequency waves. Finally, Nano-Ultrasound Hyperthermia (NUH) is described as the fourth modern method for cancer hyperthermia. PMID:27264551

  9. A primer on the fundamental principles of light microscopy: Optimizing magnification, resolution, and contrast.

    PubMed

    Goodwin, Paul C

    2015-01-01

    The light microscope is an indispensable tool in the study of living organisms. Most biologists are familiar with microscopes, perhaps being first introduced to the wonders of the world of small things at a very early age. Yet, few fully comprehend the nature of microscopy and the basis of its utility. This review (re)-introduces the concepts of magnification, resolution, and contrast, and explores how they are intimately related and necessary for effective microscopy.

  10. A New Big Five: Fundamental Principles for an Integrative Science of Personality

    ERIC Educational Resources Information Center

    McAdams, Dan P.; Pals, Jennifer L.

    2006-01-01

    Despite impressive advances in recent years with respect to theory and research, personality psychology has yet to articulate clearly a comprehensive framework for understanding the whole person. In an effort to achieve that aim, the current article draws on the most promising empirical and theoretical trends in personality psychology today to…

  11. Minimizing radiation exposure during percutaneous nephrolithotomy.

    PubMed

    Chen, T T; Preminger, G M; Lipkin, M E

    2015-12-01

    Given the recent trends in growing per capita radiation dose from medical sources, there have been increasing concerns over patient radiation exposure. Patients with kidney stones undergoing percutaneous nephrolithotomy (PNL) are at particular risk for high radiation exposure. There exist several risk factors for increased radiation exposure during PNL which include high Body Mass Index, multiple access tracts, and increased stone burden. We herein review recent trends in radiation exposure, radiation exposure during PNL to both patients and urologists, and various approaches to reduce radiation exposure. We discuss incorporating the principles of As Low As reasonably Achievable (ALARA) into clinical practice and review imaging techniques such as ultrasound and air contrast to guide PNL access. Alternative surgical techniques and approaches to reducing radiation exposure, including retrograde intra-renal surgery, retrograde nephrostomy, endoscopic-guided PNL, and minimally invasive PNL, are also highlighted. It is important for urologists to be aware of these concepts and techniques when treating stone patients with PNL. The discussions outlined will assist urologists in providing patient counseling and high quality of care.

  13. Compression as a Universal Principle of Animal Behavior

    ERIC Educational Resources Information Center

    Ferrer-i-Cancho, Ramon; Hernández-Fernández, Antoni; Lusseau, David; Agoramoorthy, Govindasamy; Hsu, Minna J.; Semple, Stuart

    2013-01-01

    A key aim in biology and psychology is to identify fundamental principles underpinning the behavior of animals, including humans. Analyses of human language and the behavior of a range of non-human animal species have provided evidence for a common pattern underlying diverse behavioral phenomena: Words follow Zipf's law of brevity (the…
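
    Zipf's law of brevity — more frequent word types tend to be shorter — can be illustrated on a toy corpus. A minimal sketch; the corpus and the use of a Pearson correlation between type frequency and word length are assumptions made for illustration, not the authors' method or data.

```python
from collections import Counter

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical corpus: short function words recur, long words are rare.
corpus = ("the cat sat on the mat and the dog lay by the door while "
          "an extraordinarily melancholic philosopher contemplated "
          "the unquestionably fundamental compression hypothesis").split()
freq = Counter(corpus)
types = sorted(freq)
r = pearson([freq[w] for w in types], [len(w) for w in types])
print(round(r, 3))  # negative: frequent types are shorter
```

    A negative correlation between frequency and length is the brevity pattern; in real analyses it is measured over large corpora and across species' behavioral repertoires, not a single sentence.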

  14. The Didactic Principles and Their Applications in the Didactic Activity

    ERIC Educational Resources Information Center

    Marius-Costel, Esi

    2010-01-01

    The evaluation and reevaluation of the fundamental didactic principles suppose the acceptance at the level of an instructive-educative activity of a new educational paradigm. Thus, its understanding implies an assumption at a conceptual-theoretical level of some approaches where the didactic aspects find their usefulness by relating to value…

  15. The maximum principle for the Navier-Stokes equations

    NASA Astrophysics Data System (ADS)

    Akysh, Abdigali Sh.

    2016-08-01

    New connections were established between extreme values of the velocity, the density of kinetic energy (in particular, local maxima) and the pressure of the Navier-Stokes equations. Using these connections, the validity of the maximum principle was shown for the nonlinear Navier-Stokes equations, which is of fundamental importance from the mathematical point of view.

  16. Core Principles for Transforming Remedial Education: A Joint Statement

    ERIC Educational Resources Information Center

    Jobs for the Future, 2012

    2012-01-01

    As a result of new research and promising practice, we have more clarity than ever about how we can fundamentally transform our developmental education system to improve success for all students. To propel the movement forward, this statement offers a set of clear and actionable principles that, although not the final word on remedial education…

  17. Teaching/learning principles

    NASA Technical Reports Server (NTRS)

    Hankins, D. B.; Wake, W. H.

    1981-01-01

    The potential remote sensing user community is enormous, and the teaching and training tasks are even larger; however, some underlying principles may be synthesized and applied at all levels from elementary school children to sophisticated and knowledgeable adults. The basic rules applying to each of the six major elements of any training course and the underlying principle involved in each rule are summarized. The six identified major elements are: (1) field sites for problems and practice; (2) lectures and inside study; (3) learning materials and resources (the kit); (4) the field experience; (5) laboratory sessions; and (6) testing and evaluation.

  18. Investigating the Fundamental Theorem of Calculus

    ERIC Educational Resources Information Center

    Johnson, Heather L.

    2010-01-01

    The fundamental theorem of calculus, in its simplified complexity, connects differential and integral calculus. The power of the theorem comes not merely from recognizing it as a mathematical fact but from using it as a systematic tool. As a high school calculus teacher, the author developed and taught lessons on this fundamental theorem that were…

  19. Individual differences in fundamental social motives.

    PubMed

    Neel, Rebecca; Kenrick, Douglas T; White, Andrew Edward; Neuberg, Steven L

    2016-06-01

    Motivation has long been recognized as an important component of how people both differ from, and are similar to, each other. The current research applies the biologically grounded fundamental social motives framework, which assumes that human motivational systems are functionally shaped to manage the major costs and benefits of social life, to understand individual differences in social motives. Using the Fundamental Social Motives Inventory, we explore the relations among the different fundamental social motives of Self-Protection, Disease Avoidance, Affiliation, Status, Mate Seeking, Mate Retention, and Kin Care; the relationships of the fundamental social motives to other individual difference and personality measures including the Big Five personality traits; the extent to which fundamental social motives are linked to recent life experiences; and the extent to which life history variables (e.g., age, sex, childhood environment) predict individual differences in the fundamental social motives. Results suggest that the fundamental social motives are a powerful lens through which to examine individual differences: They are grounded in theory, have explanatory value beyond that of the Big Five personality traits, and vary meaningfully with a number of life history variables. A fundamental social motives approach provides a generative framework for considering the meaning and implications of individual differences in social motivation.

  20. Fundamentals of Physics, Problem Supplement No. 1

    NASA Astrophysics Data System (ADS)

    Halliday, David; Resnick, Robert; Walker, Jearl

    2000-05-01

    No other book on the market today can match the success of Halliday, Resnick and Walker's Fundamentals of Physics! In a breezy, easy-to-understand style the book offers a solid understanding of fundamental physics concepts, and helps readers apply this conceptual understanding to quantitative problem solving.

  1. Fundamentals of Physics, 7th Edition

    NASA Astrophysics Data System (ADS)

    Halliday, David; Resnick, Robert; Walker, Jearl

    2004-05-01

    No other book on the market today can match the 30-year success of Halliday, Resnick and Walker's Fundamentals of Physics! In a breezy, easy-to-understand style the book offers a solid understanding of fundamental physics concepts, and helps readers apply this conceptual understanding to quantitative problem solving. This book offers a unique combination of authoritative content and stimulating applications.

  2. Fundamentals of Physics, Student's Solutions Manual

    NASA Astrophysics Data System (ADS)

    Halliday, David; Resnick, Robert; Walker, Jearl

    2000-07-01

    No other book on the market today can match the success of Halliday, Resnick and Walker's Fundamentals of Physics! In a breezy, easy-to-understand style the book offers a solid understanding of fundamental physics concepts, and helps readers apply this conceptual understanding to quantitative problem solving.

  3. Chemical principles of single-molecule electronics

    NASA Astrophysics Data System (ADS)

    Su, Timothy A.; Neupane, Madhav; Steigerwald, Michael L.; Venkataraman, Latha; Nuckolls, Colin

    2016-03-01

    The field of single-molecule electronics harnesses expertise from engineering, physics and chemistry to realize circuit elements at the limit of miniaturization; it is a subfield of nanoelectronics in which the electronic components are single molecules. In this Review, we survey the field from a chemical perspective and discuss the structure-property relationships of the three components that form a single-molecule junction: the anchor, the electrode and the molecular bridge. The spatial orientation and electronic coupling between each component profoundly affect the conductance properties and functions of the single-molecule device. We describe the design principles of the anchor group, the influence of the electronic configuration of the electrode and the effect of manipulating the structure of the molecular backbone and of its substituent groups. We discuss single-molecule conductance switches as well as the phenomenon of quantum interference and then trace their fundamental roots back to chemical principles.

  4. Making theoretical principles for new Chinese medicine.

    PubMed

    Chang, Rhonda

    2014-01-01

    It is commonly assumed that contemporary Chinese medicine has an ancient lineage and that its practice can be related in a straightforward way to medicine practiced in China for thousands of years. In this article, I argue that this impression is mistaken. What we currently call traditional Chinese medicine is only sixty years old, and it does not share the same theoretical principles with the ancient medicine of China (referred to as yi). Both yi and contemporary Chinese medicine use herbs and acupuncture, but yi is based on the principles of yinyang and wuxing, whereas contemporary Chinese medicine is fundamentally based on Western anatomical understandings of the body and disease; notably, the two practices produce different healing outcomes.

  5. ``From Fundamental Motives to Rational Expectation Equilibrium[REE, henceworth] of Indeterminacy''

    NASA Astrophysics Data System (ADS)

    Maksoed, Ssi, Wh-

    For the ``Principle of Indeterminacy'', Heisenberg states: ``one of the fundamental cornerstones of quantum mechanics is the Heisenberg uncertainty principle'', whereby canonically conjugate quantities can be determined simultaneously only with a characteristic indeterminacy [M. Arevalo Aguilar, et al.]. Accompanying Alfred North Whitehead's conclusion in ``The Aims of Education'' that mathematical symbols are artificial before new meanings are given, two kinds of fundamental motives: (i) expectation-expectation, (ii) expectation-certainty, inherently occur with the determinacy properties of rational expectation equilibrium (REE, henceforth) - Guido Ascari & Tiziano Ropele: ``Trend Inflation, Taylor Principle & Indeterminacy'', Kiel Institute, June 2007. Further, the relative price expression can be compared in its α and (1 - α) configurations in the expression of possible activity. Acknowledgment to Prof[asc]. Dr. Bobby Eka Gunara for ``made a rank through physics'' denotes...

  6. The uncertainty principle and quantum chaos

    NASA Technical Reports Server (NTRS)

    Chirikov, Boris V.

    1993-01-01

    The conception of quantum chaos is described in some detail. The most striking feature of this novel phenomenon is that all the properties of classical dynamical chaos persist here but, typically, on the finite and different time scales only. The ultimate origin of such a universal quantum stability is in the fundamental uncertainty principle which makes discrete the phase space and, hence, the spectrum of bounded quantum motion. Reformulation of the ergodic theory, as a part of the general theory of dynamical systems, is briefly discussed.

  7. Principles of Naval Engineering.

    ERIC Educational Resources Information Center

    Naval Personnel Program Support Activity, Washington, DC.

    Fundamentals of shipboard machinery, equipment, and engineering plants are presented in this text prepared for engineering officers. A general description is included of the development of naval ships, ship design and construction, stability and buoyancy, and damage and casualty control. Engineering theories are explained on the background of ship…

  8. The Species Delimitation Uncertainty Principle

    PubMed Central

    Adams, Byron J.

    2001-01-01

    If, as Einstein said, "it is the theory which decides what we can observe," then "the species problem" could be solved by simply improving our theoretical definition of what a species is. However, because delimiting species entails predicting the historical fate of evolutionary lineages, species appear to behave according to the Heisenberg Uncertainty Principle, which states that the most philosophically satisfying definitions of species are the least operational, and as species concepts are modified to become more operational they tend to lose their philosophical integrity. Can species be delimited operationally without losing their philosophical rigor? To mitigate the contingent properties of species that tend to make them difficult for us to delimit, I advocate a set of operations that takes into account the prospective nature of delimiting species. Given the fundamental role of species in studies of evolution and biodiversity, I also suggest that species delimitation proceed within the context of explicit hypothesis testing, like other scientific endeavors. The real challenge is not so much the inherent fallibility of predicting the future but rather adequately sampling and interpreting the evidence available to us in the present. PMID:19265874

  9. Design principles underlying circadian clocks.

    PubMed Central

    Rand, D. A.; Shulgin, B. V.; Salazar, D.; Millar, A. J.

    2004-01-01

    A fundamental problem for regulatory networks is to understand the relation between form and function: to uncover the underlying design principles of the network. Circadian clocks present a particularly interesting instance, as recent work has shown that they have complex structures involving multiple interconnected feedback loops with both positive and negative feedback. While several authors have speculated on the reasons for this, a convincing explanation is still lacking. We analyse both the flexibility of clock networks and the relationships between various desirable properties such as robust entrainment, temperature compensation, and stability to environmental variations and parameter fluctuations. We use this to argue that the complexity provides the flexibility necessary to simultaneously attain multiple key properties of circadian clocks. As part of our analysis we show how to quantify the key evolutionary aims using infinitesimal response curves, a tool that we believe will be of general utility in the analysis of regulatory networks. Our results suggest that regulatory and signalling networks might be much less flexible and of lower dimension than their apparent complexity would suggest. PMID:16849158

  10. General Quantum Interference Principle and Duality Computer

    NASA Astrophysics Data System (ADS)

    Long, Gui-Lu

    2006-05-01

    In this article, we propose a general principle of quantum interference for quantum systems, and based on this we propose a new type of computing machine, the duality computer, that may in principle outperform both the classical computer and the quantum computer. According to the general principle of quantum interference, the very essence of quantum interference is the interference of the sub-waves of the quantum system itself. A quantum system considered here can be any quantum system: a single microscopic particle, a composite quantum system such as an atom or a molecule, or a loose collection of a few quantum objects such as two independent photons. In the duality computer, the wave of the duality computer is split into several sub-waves and they pass through different routes, where different computing gate operations are performed. These sub-waves are then re-combined to interfere to give the computational results. The quantum computer, however, has only used the particle nature of quantum objects. In a duality computer, it may be possible to find a marked item from an unsorted database using only a single query, and all NP-complete problems may have polynomial algorithms. Two proof-of-principle designs of the duality computer are presented: the giant molecule scheme and the nonlinear quantum optics scheme. We also propose a thought experiment to check the related fundamental issues, such as the measurement efficiency of a partial wave function.

  11. Precautionary principles: a jurisdiction-free framework for decision-making under risk.

    PubMed

    Ricci, Paolo F; Cox, Louis A; MacDonald, Thomas R

    2004-12-01

    Fundamental principles of precaution are legal maxims that ask for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: How can society manage potentially severe, irreversible or serious environmental outcomes when variability, uncertainty, and limited causal knowledge characterize their decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes, new information, and are consistent and replicable. Rational choice of an action from among various alternatives--defined as a choice that makes preferred consequences more likely--requires accounting for costs, benefits and the change in risks associated with each candidate action. Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of expected value of information (VOI), to show the
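
    The decision-analytic link to the expected value of information can be made concrete with a toy example. A minimal sketch of the expected value of perfect information (EVPI) for a two-action precautionary decision, EVPI = E_state[max_a u(a, s)] − max_a E_state[u(a, s)]; the payoffs and probabilities are invented for illustration, not drawn from the paper.

```python
# Toy decision: take a costly precaution vs. wait, under two equally
# likely states of the world ("harmful", "benign"). Payoffs are
# illustrative assumptions only.
p_harm = 0.5
utility = {
    "precaution": {"harmful": -10, "benign": -10},   # fixed cost either way
    "no_action":  {"harmful": -100, "benign": 0},    # cheap unless harm occurs
}

def expected(action):
    """Expected utility of an action over the two states."""
    u = utility[action]
    return p_harm * u["harmful"] + (1 - p_harm) * u["benign"]

# Best we can do acting now, before the uncertainty resolves.
best_without_info = max(expected(a) for a in utility)
# With perfect information we pick the best action in each state first.
best_with_info = (p_harm * max(u["harmful"] for u in utility.values())
                  + (1 - p_harm) * max(u["benign"] for u in utility.values()))
evpi = best_with_info - best_without_info
print(evpi)
```

    A positive EVPI quantifies how much gathering further evidence is worth before committing; this is the formal sense in which precautionary interim measures and the value of future information are linked.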

  12. Precautionary principles: a jurisdiction-free framework for decision-making under risk.

    PubMed

    Ricci, Paolo F; Cox, Louis A; MacDonald, Thomas R

    2004-12-01

    Fundamental principles of precaution are legal maxims that ask for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: How can society manage potentially severe, irreversible or serious environmental outcomes when variability, uncertainty, and limited causal knowledge characterize their decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes, new information, and are consistent and replicable. Rational choice of an action from among various alternatives--defined as a choice that makes preferred consequences more likely--requires accounting for costs, benefits and the change in risks associated with each candidate action. Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of expected value of information (VOI), to show the
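The expected-value-of-information (VOI) idea invoked in this abstract can be made concrete with a toy decision problem. Everything below — the action names, probabilities, and payoffs — is invented for illustration; it is a sketch of a Bayesian VOI calculation, not the paper's model:

```python
# Toy value-of-information calculation: two actions (regulate / wait) and
# two states of the world (hazard real / not real). All numbers are
# hypothetical illustrations, not taken from the paper.

p_hazard = 0.3                       # prior probability that the hazard is real
payoff = {                           # net benefit of each (action, state) pair
    ("regulate", True): -10, ("regulate", False): -10,
    ("wait", True): -100,    ("wait", False): 0,
}
actions = ("regulate", "wait")

def expected_payoff(action, p):
    """Expected payoff of an action under belief p that the hazard is real."""
    return p * payoff[(action, True)] + (1 - p) * payoff[(action, False)]

# Best we can do acting now, under current (prior) information:
best_now = max(expected_payoff(a, p_hazard) for a in actions)

# With perfect information we would pick the best action in each state:
best_perfect = (p_hazard * max(payoff[(a, True)] for a in actions)
                + (1 - p_hazard) * max(payoff[(a, False)] for a in actions))

# EVPI: the most a decision-maker should rationally pay to resolve the
# uncertainty before acting.
evpi = best_perfect - best_now
print(best_now, best_perfect, evpi)
```

Here waiting is catastrophic if the hazard turns out to be real, so regulating is optimal under the prior, and resolving the uncertainty is worth a positive amount (EVPI > 0) — the formal link between precautionary action and VOI that the abstract points to.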

  13. Next step in minimally invasive surgery: hybrid image-guided surgery.

    PubMed

    Marescaux, Jacques; Diana, Michele

    2015-01-01

    Surgery, interventional radiology, and advanced endoscopy have all developed minimally invasive techniques to effectively treat a variety of diseases with positive impact on patients' postoperative outcomes. However, those techniques are challenging and require extensive training. Robotics and computer sciences can help facilitate minimally invasive approaches. Furthermore, surgery, advanced endoscopy, and interventional radiology could converge towards a new hybrid specialty, hybrid image-guided minimally invasive therapies, in which the three fundamental disciplines could complement one another to maximize the positive effects and reduce the iatrogenic footprint on patients. The present manuscript describes the fundamental steps of this new paradigm shift in surgical therapies that, in our opinion, will be the next revolutionary step in minimally invasive approaches. PMID:25598089

  14. Microrover Operates With Minimal Computation

    NASA Technical Reports Server (NTRS)

    Miller, David P.; Loch, John L.; Gat, Erann; Desai, Rajiv S.; Angle, Colin; Bickler, Donald B.

    1992-01-01

    Small, light, highly mobile robotic vehicles called "microrovers" use sensors and artificial intelligence to perform complicated tasks autonomously. Vehicle navigates, avoids obstacles, and picks up objects using reactive control scheme selected from among few preprogrammed behaviors to respond to environment while executing assigned task. Under development for exploration and mining of other planets. Also useful in firefighting, cleaning up chemical spills, and delivering materials in factories. Reactive control scheme and principle of behavior-description language useful in reducing computational loads in prosthetic limbs and automotive collision-avoidance systems.

  15. Principles of sound ecotoxicology.

    PubMed

    Harris, Catherine A; Scott, Alexander P; Johnson, Andrew C; Panter, Grace H; Sheahan, Dave; Roberts, Mike; Sumpter, John P

    2014-03-18

    We have become progressively more concerned about the quality of some published ecotoxicology research. Others have also expressed concern. It is not uncommon for basic, but extremely important, factors to apparently be ignored. For example, exposure concentrations in laboratory experiments are sometimes not measured, and hence there is no evidence that the test organisms were actually exposed to the test substance, let alone at the stated concentrations. To try to improve the quality of ecotoxicology research, we suggest 12 basic principles that should be considered, not at the point of publication of the results, but during the experimental design. These principles range from carefully considering essential aspects of experimental design through to accurately defining the exposure, as well as unbiased analysis and reporting of the results. Although not all principles will apply to all studies, we offer these principles in the hope that they will improve the quality of the science that is available to regulators. Science is an evidence-based discipline and it is important that we and the regulators can trust the evidence presented to us. Significant resources often have to be devoted to refuting the results of poor research when those resources could be utilized more effectively.

  16. Fermat's Principle Revisited.

    ERIC Educational Resources Information Center

    Kamat, R. V.

    1991-01-01

    A principle is presented to show that, if the time of passage of light is expressible as a function of discrete variables, one may dispense with the more general method of the calculus of variations. The calculus of variations and the alternative are described. The phenomenon of mirage is discussed. (Author/KR)

  17. Extended Mach Principle.

    ERIC Educational Resources Information Center

    Rosen, Joe

    1981-01-01

    Discusses the meaning of symmetry of the laws of physics and symmetry of the universe and the connection between symmetries and asymmetries of the laws of physics and those of the universe. An explanation of Hamilton's principle is offered. The material is suitable for informal discussions with students. (Author/SK)

  18. Matters of Principle.

    ERIC Educational Resources Information Center

    Martz, Carlton

    1999-01-01

    This issue of "Bill of Rights in Action" looks at individuals who have stood on principle against authority or popular opinion. The first article investigates John Adams and his defense of British soldiers at the Boston Massacre trials. The second article explores Archbishop Thomas Becket's fatal conflict with England's King Henry II. The final…

  19. The Denver principles.

    PubMed

    2000-01-01

    The Denver principles articulate the self empowerment movement of People With AIDS (PWA). The statements, written in 1983 by the Advisory Committee of the People With AIDS, include recommendations on how to support those with disease. It also includes suggestions for people who have AIDS. It concludes by listing the "rights of people with AIDS."

  20. Reprographic Principles Made Easy.

    ERIC Educational Resources Information Center

    Young, J. B.

    Means for reproducing graphic materials are explained. There are several types of processes: those using light sensitive material, those using heat sensitive material, those using photo conductive materials (electrophotography), and duplicating processes using ink. For each of these, the principles behind them are explained, the necessary…

  1. The Idiom Principle Revisited

    ERIC Educational Resources Information Center

    Siyanova-Chanturia, Anna; Martinez, Ron

    2015-01-01

    John Sinclair's Idiom Principle famously posited that most texts are largely composed of multi-word expressions that "constitute single choices" in the mental lexicon. At the time that assertion was made, little actual psycholinguistic evidence existed in support of that holistic, "single choice," view of formulaic language. In…

  2. Principles of Biomedical Ethics

    PubMed Central

    Athar, Shahid

    2012-01-01

    In this presentation, I will discuss the principles of biomedical and Islamic medical ethics and an interfaith perspective on end-of-life issues. I will also discuss three cases to exemplify some of the conflicts in ethical decision-making. PMID:23610498

  3. Principles of Teaching. Module.

    ERIC Educational Resources Information Center

    Rhoades, Joseph W.

    This module on principles of teaching is 1 in a series of 10 modules written for vocational education teacher education programs. It is designed to enable the teacher to do the following: (1) identify subject matter and integrate that subject matter with thought-provoking questions; (2) organize and demonstrate good questioning techniques; and (3)…

  4. Hydrogen evolution: Guiding principles

    NASA Astrophysics Data System (ADS)

    Xia, Zhenhai

    2016-10-01

    Lower-cost alternatives to platinum electrocatalysts are being explored for the sustainable production of hydrogen, but often trial-and-error approaches are used for their development. Now, principles are elucidated that suggest pathways to rationally design efficient metal-free electrocatalysts based on doped graphene.

  5. Minimal model for Brownian vortexes.

    PubMed

    Sun, Bo; Grier, David G; Grosberg, Alexander Y

    2010-08-01

    A Brownian vortex is a noise-driven machine that uses thermal fluctuations to extract a steady-state flow of work from a static force field. Its operation is characterized by loops in a probability current whose topology and direction can change with changes in temperature. We present discrete three- and four-state minimal models for Brownian vortexes that can be solved exactly with a master-equation formalism. These models elucidate conditions required for flux reversal in Brownian vortexes and provide insights into their thermodynamic efficiency through the rate of entropy production. PMID:20866791
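As a hedged sketch of the master-equation formalism mentioned above (the rates here are arbitrary illustrative numbers, not the paper's model), one can solve a three-state cycle for its steady state and read off the circulating probability current:

```python
# Three-state master equation dp/dt = W p with asymmetric cyclic rates.
# A nonzero steady-state current around the loop is the discrete analogue
# of the circulating flux in a Brownian vortex. Rates are illustrative.
import numpy as np

# k[i, j] is the transition rate from state j to state i (deliberately
# asymmetric, so detailed balance is broken and a net current survives).
k = np.array([[0.0, 2.0, 1.0],
              [1.0, 0.0, 2.0],
              [2.0, 1.0, 0.0]])

W = k - np.diag(k.sum(axis=0))        # generator matrix: columns sum to zero

# Steady state = null vector of W, normalised to a probability distribution.
eigvals, eigvecs = np.linalg.eig(W)
p = np.real(eigvecs[:, np.argmin(np.abs(eigvals))])
p = p / p.sum()

# Net probability current on the 0 -> 1 link of the cycle.
J = k[1, 0] * p[0] - k[0, 1] * p[1]
print(p, J)
```

Transposing `k` (reversing the rate asymmetry) flips the sign of `J` — a minimal analogue of the flux reversal the abstract discusses.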

  6. [Minimally invasive iridocorneal angle surgery].

    PubMed

    Jordan, J F

    2012-07-01

    Classical filtration surgery with trabeculectomy, or drainage of chamber fluid with episcleral implants, is the most effective method for permanently reducing intraocular pressure to low and normal levels. Even though both operative procedures are well established, the high efficacy of these methods comes with potentially dangerous intraoperative as well as postoperative complications at a frequency that cannot be ignored. In the past this led to a search, which still continues, for low-complication alternatives such as non-penetrating glaucoma surgery (NPGS). Trabecular meshwork surgery in particular, with the continuous development of new operation techniques, has shifted the focus towards low-complication, minimally invasive gonioscopic glaucoma surgery.

  7. The minimal scenario of leptogenesis

    NASA Astrophysics Data System (ADS)

    Blanchet, Steve; Di Bari, Pasquale

    2012-12-01

    We review the main features and results of thermal leptogenesis within the type I seesaw mechanism, the minimal extension of the Standard Model explaining neutrino masses and mixing. After presenting the simplest approach, the vanilla scenario, we discuss various important developments of recent years, such as the inclusion of lepton and heavy neutrino flavour effects, a description beyond a hierarchical heavy neutrino mass spectrum and an improved kinetic description within the density matrix and the closed-time-path formalisms. We also discuss how leptogenesis can ultimately represent an important phenomenological tool to test the seesaw mechanism and the underlying model of new physics.

  8. Minimizing medical litigation, part 2.

    PubMed

    Harold, Tan Keng Boon

    2006-01-01

    Provider-patient disputes are inevitable in the healthcare sector. Healthcare providers and regulators should recognize this and plan opportunities to enforce alternative dispute resolution (ADR) as early as possible in the care delivery process. Negotiation is often the main dispute resolution method used by local healthcare providers, failing which litigation would usually follow. The role of mediation in resolving malpractice disputes has been minimal. Healthcare providers, administrators, and regulators should therefore look toward a post-event communication-cum-mediation framework as the key national strategy to resolving malpractice disputes. PMID:16711089

  9. Dilaton cosmology, noncommutativity, and generalized uncertainty principle

    SciTech Connect

    Vakili, Babak

    2008-02-15

    The effects of noncommutativity and of the existence of a minimal length on the phase space of a dilatonic cosmological model are investigated. The existence of a minimum length results in the generalized uncertainty principle (GUP), which is a deformed Heisenberg algebra between the minisuperspace variables and their momentum operators. I extend these deformed commutation relations to the corresponding deformed Poisson algebra. For an exponential dilaton potential, the exact classical and quantum solutions in the commutative and noncommutative cases, and some approximate analytical solutions in the case of GUP, are presented and compared.
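For reference, the deformed Heisenberg algebra that defines a GUP of this kind is usually written in the Kempf-Mangano-Mann form (a standard result in the GUP literature, stated here generically rather than in the paper's minisuperspace variables):

```latex
[\hat{x},\hat{p}] \;=\; i\hbar\left(1+\beta\hat{p}^{2}\right)
\quad\Longrightarrow\quad
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left(1+\beta(\Delta p)^{2}+\beta\langle\hat{p}\rangle^{2}\right),
```

Minimising the right-hand side over \(\Delta p\) (at \(\langle\hat{p}\rangle = 0\)) gives a minimal position uncertainty \(\Delta x_{\min} = \hbar\sqrt{\beta}\), which is the minimal length the abstract refers to.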

  10. Universal Principles in the Repair of Communication Problems

    PubMed Central

    Dingemanse, Mark; Roberts, Seán G.; Baranova, Julija; Blythe, Joe; Drew, Paul; Floyd, Simeon; Gisladottir, Rosa S.; Kendrick, Kobin H.; Levinson, Stephen C.; Manrique, Elizabeth; Rossi, Giovanni; Enfield, N. J.

    2015-01-01

    There would be little adaptive value in a complex communication system like human language if there were no ways to detect and correct problems. A systematic comparison of conversation in a broad sample of the world’s languages reveals a universal system for the real-time resolution of frequent breakdowns in communication. In a sample of 12 languages of 8 language families of varied typological profiles we find a system of ‘other-initiated repair’, where the recipient of an unclear message can signal trouble and the sender can repair the original message. We find that this system is frequently used (on average about once per 1.4 minutes in any language), and that it has detailed common properties, contrary to assumptions of radical cultural variation. Unrelated languages share the same three functionally distinct types of repair initiator for signalling problems and use them in the same kinds of contexts. People prefer to choose the type that is the most specific possible, a principle that minimizes cost both for the sender being asked to fix the problem and for the dyad as a social unit. Disruption to the conversation is kept to a minimum, with the two-utterance repair sequence being on average no longer than the single utterance which is being fixed. The findings, controlled for historical relationships, situation types and other dependencies, reveal the fundamentally cooperative nature of human communication and offer support for the pragmatic universals hypothesis: while languages may vary in the organization of grammar and meaning, key systems of language use may be largely similar across cultural groups. They also provide a fresh perspective on controversies about the core properties of language, by revealing a common infrastructure for social interaction which may be the universal bedrock upon which linguistic diversity rests. PMID:26375483

  11. Minimizing travel claims cost with minimal-spanning tree model

    NASA Astrophysics Data System (ADS)

    Jamalluddin, Mohd Helmi; Jaafar, Mohd Azrul; Amran, Mohd Iskandar; Ainul, Mohd Sharizal; Hamid, Aqmar; Mansor, Zafirah Mohd; Nopiah, Zulkifli Mohd

    2014-06-01

    Travel demand necessitates a big expenditure in spending, as has been proven by the National Audit Department (NAD). Every year the auditing process is carried out throughout the country involving official travel claims. This study focuses on the use of the Spanning Tree model to determine the shortest path to minimize the cost of the NAD's official travel claims. The objective is to study the possibility of running a network based in the Kluang District Health Office to eight Rural Clinics in Johor state using the Spanning Tree model applications for optimizing travelling distances and make recommendations to the senior management of the Audit Department to analyze travelling details before an audit is conducted. Result of this study reveals that there were claims of savings of up to 47.4% of the original claims, over the course of the travel distance.
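The minimal-spanning-tree computation described above can be sketched in a few lines. The sites and distances below are hypothetical placeholders (the paper's actual Kluang District network data are not reproduced here); the algorithm shown is Kruskal's, one standard way to build an MST:

```python
# Minimal sketch of the spanning-tree idea: given travel distances between
# an office and several clinics (hypothetical data), Kruskal's algorithm
# keeps the cheapest edges that connect every site without forming a cycle.

def kruskal(nodes, edges):
    """Return (total_weight, chosen_edges) of a minimum spanning tree."""
    parent = {n: n for n in nodes}

    def find(n):                      # union-find with path compression
        while parent[n] != n:
            parent[n] = parent[parent[n]]
            n = parent[n]
        return n

    total, chosen = 0, []
    for w, a, b in sorted(edges):     # consider edges cheapest-first
        ra, rb = find(a), find(b)
        if ra != rb:                  # keep the edge only if it joins two components
            parent[ra] = rb
            total += w
            chosen.append((a, b, w))
    return total, chosen

# Hypothetical distances (km) between a district office and three clinics.
nodes = ["office", "clinic_A", "clinic_B", "clinic_C"]
edges = [
    (12, "office", "clinic_A"),
    (18, "office", "clinic_B"),
    (25, "office", "clinic_C"),
    (7,  "clinic_A", "clinic_B"),
    (9,  "clinic_B", "clinic_C"),
]

total, tree = kruskal(nodes, edges)
print(total)  # 28 = 12 + 7 + 9
```

Only distances that connect previously disconnected sites are kept, so the chosen edges always form a tree touching every site at minimum total distance.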

  12. Annual Waste Minimization Summary Report

    SciTech Connect

    Alfred J. Karns

    2007-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC (NSTec), for the U. S. Department of Energy (DOE) National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during CY06. This report was developed in accordance with the requirements of the Nevada Test Site (NTS) Resource Conservation and Recovery Act (RCRA) Permit (No. NEV HW0021) and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the DOE, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by the NNSA/NSO and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by the NNSA/NSO.

  13. Less minimal supersymmetric standard model

    SciTech Connect

    de Gouvea, Andre; Friedland, Alexander; Murayama, Hitoshi

    1998-03-28

    Most of the phenomenological studies of supersymmetry have been carried out using the so-called minimal supergravity scenario, where one assumes a universal scalar mass, gaugino mass, and trilinear coupling at M_GUT. Even though this is a useful simplifying assumption for phenomenological analyses, it is rather too restrictive to accommodate a large variety of phenomenological possibilities. It predicts, among other things, that the lightest supersymmetric particle (LSP) is an almost pure B-ino, and that the μ-parameter is larger than the masses of the SU(2)_L and U(1)_Y gauginos. We extend the minimal supergravity framework by introducing one extra parameter: the Fayet-Iliopoulos D-term for the hypercharge U(1), D_Y. Allowing for this extra parameter, we find a much more diverse phenomenology, where the LSP is the tau sneutrino, the stau, or a neutralino with a large higgsino content. We discuss the relevance of the different possibilities to collider signatures. The same type of extension can be done to models with the gauge mediation of supersymmetry breaking. We argue that it is not wise to impose cosmological constraints on the parameter space.

  14. Next-to-minimal SOFTSUSY

    NASA Astrophysics Data System (ADS)

    Allanach, B. C.; Athron, P.; Tunstall, Lewis C.; Voigt, A.; Williams, A. G.

    2014-09-01

    We describe an extension to the SOFTSUSY program that provides for the calculation of the sparticle spectrum in the Next-to-Minimal Supersymmetric Standard Model (NMSSM), where a chiral superfield that is a singlet of the Standard Model gauge group is added to the Minimal Supersymmetric Standard Model (MSSM) fields. Often, a Z3 symmetry is imposed upon the model. SOFTSUSY can calculate the spectrum in this case as well as the case where general Z3-violating terms are added to the soft supersymmetry breaking terms and the superpotential. The user provides a theoretical boundary condition for the couplings and mass terms of the singlet. Radiative electroweak symmetry breaking data along with electroweak and CKM matrix data are used as weak-scale boundary conditions. The renormalisation group equations are solved numerically between the weak scale and a high energy scale using a nested iterative algorithm. This paper serves as a manual to the NMSSM mode of the program, detailing the approximations and conventions used. Catalogue identifier: ADPM_v4_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADPM_v4_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 154886 No. of bytes in distributed program, including test data, etc.: 1870890 Distribution format: tar.gz Programming language: C++, fortran. Computer: Personal computer. Operating system: Tested on Linux 3.x. Word size: 64 bits Classification: 11.1, 11.6. Does the new version supersede the previous version?: Yes Catalogue identifier of previous version: ADPM_v3_0 Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 785 Nature of problem: Calculating supersymmetric particle spectrum and mixing parameters in the next-to-minimal supersymmetric standard model. The solution to the

  15. Rotor-Liquid-Fundament System's Oscillation

    NASA Astrophysics Data System (ADS)

    Kydyrbekuly, A.

    This work investigates the oscillations and the stability of stationary rotation of a vertical, flexible, statically and dynamically unbalanced rotor with a cavity partly filled with liquid, mounted on an elastically supported foundation. Accounting for factors such as oscillation of the foundation, oscillation of the liquid, asymmetry of the rotor's mounting on the shaft, anisotropy of the shaft supports and foundation, static and dynamic unbalance of the rotor, external friction, and internal friction of the shaft allows the kinematic and dynamic characteristics of the system to be calculated more precisely.

  16. Tensorial Minkowski functionals of triply periodic minimal surfaces

    PubMed Central

    Mickel, Walter; Schröder-Turk, Gerd E.; Mecke, Klaus

    2012-01-01

    A fundamental understanding of the formation and properties of a complex spatial structure relies on robust quantitative tools to characterize morphology. A systematic approach to the characterization of average properties of anisotropic complex interfacial geometries is provided by integral geometry which furnishes a family of morphological descriptors known as tensorial Minkowski functionals. These functionals are curvature-weighted integrals of tensor products of position vectors and surface normal vectors over the interfacial surface. We here demonstrate their use by application to non-cubic triply periodic minimal surface model geometries, whose Weierstrass parametrizations allow for accurate numerical computation of the Minkowski tensors. PMID:24098847

  17. A minimal living system and the origin of a protocell

    NASA Technical Reports Server (NTRS)

    Oro, J.; Lazcano, A.

    1984-01-01

    The essential molecular attributes of a minimal living system are discussed, and the evolution of a protocell from such a system is considered. Present thought on the emergence and evolution of life is summarized, and the complexity of biological systems is reviewed. The five fundamental molecular attributes considered are: informational molecules, catalytic peptides, a decoding and translation system, protoribosomes, and protomembranes. Their functions in a primitive cell are discussed. Positive feedback interaction between proto-RNA, proto-AA-tRNA, and protoenzyme are identified as the three major steps to the formation of a primitive living cell.

  18. Gauge theories under incorporation of a generalized uncertainty principle

    SciTech Connect

    Kober, Martin

    2010-10-15

    An extension of gauge theories is considered under the assumption of a generalized uncertainty principle implying a minimal length scale. A modification of the usual uncertainty principle implies an extended form of matter field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, since the gauge principle is a constitutive ingredient of the standard model, and since even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be a very important consideration.

  19. On the Minimal Length Uncertainty Relation and the Foundations of String Theory

    DOE PAGES

    Chang, Lay Nam; Lewis, Zachary; Minic, Djordje; Takeuchi, Tatsu

    2011-01-01

    We review our work on the minimal length uncertainty relation as suggested by perturbative string theory. We discuss simple phenomenological implications of the minimal length uncertainty relation and then argue that the combination of the principles of quantum theory and general relativity allow for a dynamical energy-momentum space. We discuss the implication of this for the problem of vacuum energy and the foundations of nonperturbative string theory.

  20. Minimizing communication cost among distributed controllers in software defined networks

    NASA Astrophysics Data System (ADS)

    Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Software Defined Networking (SDN) is a new paradigm that increases the flexibility of today's networks by promising a programmable network. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and data plane of the network devices, and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure, and to increase scalability and flexibility during workload distribution. Even though controllers are flexible and scalable enough to accommodate more network switches, the intercommunication cost between distributed controllers is still a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism that minimizes intercommunication cost with a graph partitioning algorithm, an NP-hard problem. The methodology proposed in this paper is the swapping of network elements between controller domains to minimize communication cost by calculating the communication gain. The swapping of elements minimizes inter- and intra-domain communication cost among network domains. We validate our work with the OMNeT++ simulation environment tool. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
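A minimal sketch of the swap-gain idea, in the spirit of Kernighan-Lin graph partitioning (the switch names and traffic loads below are invented, and this is not the paper's exact formulation): the gain of moving a switch to the other controller domain is its external traffic minus its internal traffic.

```python
# Gain of moving one switch between two controller domains: links to the
# other domain (external) become cheap intra-domain links after the move,
# while links inside its current domain (internal) become inter-domain.
# All names and traffic values are illustrative placeholders.

def move_gain(switch, domain_of, traffic):
    """Reduction in inter-domain cost from moving `switch` to the other
    domain of a two-domain partition (positive gain = worthwhile move)."""
    external = internal = 0
    for (a, b), load in traffic.items():
        if switch not in (a, b):
            continue
        other = b if a == switch else a
        if domain_of[other] == domain_of[switch]:
            internal += load   # becomes inter-domain after the move
        else:
            external += load   # becomes intra-domain after the move
    return external - internal

domain_of = {"s1": 0, "s2": 0, "s3": 1, "s4": 1}
traffic = {("s1", "s2"): 1, ("s1", "s3"): 5, ("s1", "s4"): 2, ("s2", "s3"): 1}

print(move_gain("s1", domain_of, traffic))  # (5 + 2) - 1 = 6
```

A greedy partitioner would repeatedly apply the move (or swap) with the largest positive gain until no improving move remains.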

  1. Waste Minimization via Radiological Hazard Reduction

    SciTech Connect

    Stone, K.A.; Coffield, T.; Hooker, K.L.

    1998-03-01

    The Savannah River Site (SRS), an 803 km² U.S. Department of Energy (DOE) facility in south-western South Carolina, incorporates pollution prevention as a fundamental component of its Environmental Management System. A comprehensive pollution prevention program was implemented as part of an overall business strategy to reduce waste generation and pollution releases, minimize environmental impacts, and to reduce future waste management and pollution control costs. In fiscal years 1995 through 1997, the Site focused on implementing specific waste reduction initiatives identified while benchmarking industry best practices. These efforts resulted in greater than $25 million in documented cost avoidance. While these results have been dramatic to date, the Site is further challenged to maximize resource utilization and deploy new technologies and practices to achieve further waste reductions. The Site has elected to target a site-wide reduction of contaminated work spaces in fiscal year 1998 as the primary source reduction initiative. Over 120,900 m² of radiologically contaminated work areas (approximately 600 separate inside areas) exist at SRS. Reduction of these areas reduces future waste generation, minimizes worker exposure, and reduces surveillance and maintenance costs. This is a major focus of the Site's As Low As Reasonably Achievable (ALARA) program by reducing sources of worker exposure. The basis for this approach was demonstrated during 1997 as part of a successful Enhanced Work Planning pilot conducted at several specific contamination areas at SRS. An economic-based prioritization process was utilized to develop a model for prioritizing areas to reclaim. In the H-Canyon Separation facility, over 3,900 m² of potentially contaminated area was rolled back to a Radiation Buffer Area. The facility estimated nearly 420 m³ of low level radioactive waste will be avoided each year, and overall cost savings and productivity gains will reach

  2. Precautionary Principles: General Definitions and Specific Applications to Genetically Modified Organisms

    ERIC Educational Resources Information Center

    Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.

    2002-01-01

    Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…

  3. Personality Theories Facilitate Integrating the Five Principles and Deducing Hypotheses for Testing

    ERIC Educational Resources Information Center

    Maddi, Salvatore R.

    2007-01-01

    Comments on the original article "A New Big Five: Fundamental Principles for an Integrative Science of Personality," by Dan P. McAdams and Jennifer L. Pals (see record 2006-03947-002). In presenting their view of personality science, McAdams and Pals (April 2006) elaborated the importance of five principles for building an integrated science of…

  4. AUTOMOTIVE DIESEL MAINTENANCE 2. UNIT XVI, LEARNING ABOUT AC GENERATOR (ALTERNATOR) PRINCIPLES (PART I).

    ERIC Educational Resources Information Center

    Human Engineering Inst., Cleveland, OH.

    THIS MODULE OF A 25-MODULE COURSE IS DESIGNED TO DEVELOP AN UNDERSTANDING OF THE OPERATING PRINCIPLES OF ALTERNATING CURRENT GENERATORS USED ON DIESEL POWERED EQUIPMENT. TOPICS ARE REVIEWING ELECTRICAL FUNDAMENTALS, AND OPERATING PRINCIPLES OF ALTERNATORS. THE MODULE CONSISTS OF A SELF-INSTRUCTIONAL PROGRAMED TRAINING FILM "AC GENERATORS…

  5. Principles of Natural Photosynthesis.

    PubMed

    Krewald, Vera; Retegan, Marius; Pantazis, Dimitrios A

    2016-01-01

    Nature relies on a unique and intricate biochemical setup to achieve sunlight-driven water splitting. Combined experimental and computational efforts have produced significant insights into the structural and functional principles governing the operation of the water-oxidizing enzyme Photosystem II in general, and of the oxygen-evolving manganese-calcium cluster at its active site in particular. Here we review the most important aspects of biological water oxidation, emphasizing current knowledge on the organization of the enzyme, the geometric and electronic structure of the catalyst, and the role of calcium and chloride cofactors. The combination of recent experimental work on the identification of possible substrate sites with computational modeling have considerably limited the possible mechanistic pathways for the critical O-O bond formation step. Taken together, the key features and principles of natural photosynthesis may serve as inspiration for the design, development, and implementation of artificial systems. PMID:26099285

  6. Principles of Glacier Mechanics

    NASA Astrophysics Data System (ADS)

    Waddington, Edwin D.

    Glaciers are awesome in size and move at a majestic pace, and they frequently occupy spectacular mountainous terrain. Naturally, many Earth scientists are attracted to glaciers. Some of us are even fortunate enough to make a career of studying glacier flow. Many others work on the large, flat polar ice sheets where there is no scenery. As a leader of one of the foremost research projects now studying the flow of mountain glaciers (Storglaciaren, Sweden), Roger Hooke is well qualified to describe the principles of glacier mechanics. Principles of Glacier Mechanics is written for upper-level undergraduate students and graduate students with an interest in glaciers and the landforms that glaciers produce. While most of the examples in the text are drawn from valley glacier studies, much of the material is also relevant to “glacier flatland” on the polar ice sheets.

  7. Pauli Exclusion Principle

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    A principle of quantum theory, devised in 1925 by Wolfgang Pauli (1900-58), which states that no two fermions may exist in the same quantum state. The quantum state of a particle is defined by a set of numbers that describe quantities such as energy, angular momentum and spin. Fermions are particles such as quarks, protons, neutrons and electrons, that have spin = ½ (in units of h/2π, where h is ...

  8. Principles of nuclear geology

    SciTech Connect

    Aswathanarayana, U.

    1985-01-01

    This book treats the basic principles of nuclear physics and the mineralogy, geochemistry, distribution and ore deposits of uranium and thorium. The application of nuclear methodology in radiogenic heat and thermal regime of the earth, radiometric prospecting, isotopic age dating, stable isotopes and cosmic-ray produced isotopes is covered. Geological processes, such as metamorphic chronology, petrogenesis, groundwater movement, and sedimentation rate are focussed on.

  9. Principles of lake sedimentology

    SciTech Connect

    Janasson, L.

    1983-01-01

    This book presents a comprehensive outline of the basic sedimentological principles for lakes, and focuses on environmental aspects and matters related to lake management and control: on lake ecology rather than lake geology. This is a guide for those who plan, perform and evaluate lake sedimentological investigations. Contents abridged: Lake types and sediment types. Sedimentation in lakes and water dynamics. Lake bottom dynamics. Sediment dynamics and sediment age. Sediments in aquatic pollution control programmes. Subject index.

  10. Computational principles of memory.

    PubMed

    Chaudhuri, Rishidev; Fiete, Ila

    2016-03-01

    The ability to store and later use information is essential for a variety of adaptive behaviors, including integration, learning, generalization, prediction and inference. In this Review, we survey theoretical principles that can allow the brain to construct persistent states for memory. We identify requirements that a memory system must satisfy and analyze existing models and hypothesized biological substrates in light of these requirements. We also highlight open questions, theoretical puzzles and problems shared with computer science and information theory. PMID:26906506

  11. On the quantum mechanical solutions with minimal length uncertainty

    NASA Astrophysics Data System (ADS)

    Shababi, Homa; Pedram, Pouria; Chung, Won Sang

    2016-06-01

    In this paper, we study two generalized uncertainty principles (GUPs), [X,P] = iℏ(1 + βP^(2j)) and [X,P] = iℏ(1 + βP^2 + kβ^2P^4), which imply minimal measurable lengths. Using two momentum representations, for the former GUP, we find eigenvalues and eigenfunctions of the free particle and the harmonic oscillator in terms of generalized trigonometric functions. Also, for the latter GUP, we obtain quantum mechanical solutions of a particle in a box and harmonic oscillator. Finally we investigate the statistical properties of the harmonic oscillator including partition function, internal energy, and heat capacity in the context of the first GUP.
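    As an illustrative aside (not a computation from the paper): for the first GUP with j = 1, the deformed commutator [X,P] = iℏ(1 + βP^2) gives the uncertainty bound Δx ≥ (ℏ/2)(1/Δp + βΔp), whose minimum over Δp is the standard minimal measurable length Δx_min = ℏ√β. A short numerical sketch confirms this; ℏ = 1 and β = 0.01 are arbitrary illustrative values.

    ```python
    import numpy as np

    # GUP-deformed uncertainty relation for [X,P] = i*hbar*(1 + beta*P^2):
    #   dx >= (hbar/2) * (1/dp + beta*dp)
    # Minimizing the right-hand side over dp gives dx_min = hbar*sqrt(beta).
    hbar = 1.0   # illustrative units
    beta = 0.01  # illustrative GUP parameter

    dp = np.linspace(0.1, 100.0, 200001)            # trial momentum uncertainties
    dx_bound = 0.5 * hbar * (1.0 / dp + beta * dp)  # lower bound on dx for each dp

    dx_min_numeric = dx_bound.min()
    dx_min_closed = hbar * np.sqrt(beta)            # closed-form minimum

    print(dx_min_numeric, dx_min_closed)            # both ~0.1
    ```

    The brute-force grid minimum agrees with the closed form, which is why no quantum state can be localized below ℏ√β in this framework.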

  12. Spinless Particle in a Magnetic Field Under Minimal Length Scenario

    NASA Astrophysics Data System (ADS)

    Amirfakhrian, S. M.

    2016-06-01

    In this article, we studied the Klein-Gordon equation in a generalised uncertainty principle (GUP) framework which predicts a minimal uncertainty in position. We considered a spinless particle in this framework in the presence of a magnetic field, applied in the z-direction, which varies as 1/x^2. We found the energy eigenvalues of this system and also obtained the corresponding eigenfunctions using a numerical method. When the GUP parameter tends to zero, our solutions were in agreement with those obtained in the absence of GUP.

  13. Ergonomic T-Handle for Minimally Invasive Surgical Instruments

    PubMed Central

    Parekh, J; Shepherd, DET; Hukins, DWL; Maffulli, N

    2016-01-01

    A T-handle has been designed to be used for minimally invasive implantation of a dynamic hip screw to repair fractures of the proximal femur. It is capable of being used in two actions: (i) push and hold (while using an angle guide) and (ii) application of torque when using the insertion wrench and lag screw tap. The T-handle can be held in a power or precision grip. It is suitable for either single (sterilised by γ-irradiation) or multiple (sterilised by autoclaving) use. The principles developed here are applicable to handles for a wide range of surgical instruments. PMID:27326394

  14. Ergonomic T-Handle for Minimally Invasive Surgical Instruments.

    PubMed

    Parekh, J; Shepherd, Det; Hukins, Dwl; Maffulli, N

    2016-05-01

    A T-handle has been designed to be used for minimally invasive implantation of a dynamic hip screw to repair fractures of the proximal femur. It is capable of being used in two actions: (i) push and hold (while using an angle guide) and (ii) application of torque when using the insertion wrench and lag screw tap. The T-handle can be held in a power or precision grip. It is suitable for either single (sterilised by γ-irradiation) or multiple (sterilised by autoclaving) use. The principles developed here are applicable to handles for a wide range of surgical instruments.

  15. [MINIMALLY INVASIVE AORTIC VALVE REPLACEMENT].

    PubMed

    Tabata, Minoru

    2016-03-01

    Minimally invasive aortic valve replacement (MIAVR) is defined as aortic valve replacement avoiding full sternotomy. Common approaches include a partial sternotomy, a right thoracotomy, and a parasternal approach. MIAVR has been shown to have advantages over conventional AVR, such as shorter length of stay, smaller amount of blood transfusion, and better cosmesis. However, it is also known to have disadvantages, such as longer cardiopulmonary bypass and aortic cross-clamp times and potential complications related to peripheral cannulation. Appropriate patient selection is very important. Since the procedure is more complex than conventional AVR, more intensive teamwork in the operating room is essential. Additionally, a team approach during postoperative management is critical to maximize the benefits of MIAVR.

  16. Minimal unitary (covariant) scattering theory

    SciTech Connect

    Lindesay, J.V.; Markevich, A.

    1983-06-01

    In the minimal three particle equations developed by Lindesay, the two body input amplitude was an on shell relativistic generalization of the non-relativistic scattering model characterized by a single mass parameter μ which in the two body (m + m) system looks like an s-channel bound state (μ < 2m) or virtual state (μ > 2m). Using this driving term in covariant Faddeev equations generates a rich covariant and unitary three particle dynamics. However, the simplest way of writing the relativistic generalization of the Faddeev equations can take the on shell Mandelstam parameter s = 4(q² + m²), in terms of which the two particle input is expressed, to negative values in the range of integration required by the dynamics. This problem was met in the original treatment by multiplying the two particle input amplitude by Θ(s). This paper provides what we hope to be a more direct way of meeting the problem.
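    The Θ(s) regularization described above can be sketched numerically; the pole-type two-body amplitude t(s) below is hypothetical, chosen only to show how the step function removes the region where s = 4(q² + m²) turns negative.

    ```python
    import numpy as np

    m = 1.0   # two-body mass parameter (illustrative units)
    mu = 1.5  # bound-state-like mass parameter, mu < 2m

    def t(s):
        """Hypothetical two-body input amplitude: a simple pole below threshold."""
        return 1.0 / (s - mu**2)

    def t_regularized(s):
        """Original treatment: multiply the input by THETA(s) so it vanishes for s < 0."""
        return np.heaviside(s, 0.0) * t(s)

    # Off-shell momenta reached in the integration; q2 < -m^2 drives s negative.
    q2 = np.array([-2.0, -0.5, 0.5])
    s = 4.0 * (q2 + m**2)   # Mandelstam parameter: [-4., 2., 6.]

    print(s)
    print(t_regularized(s))  # amplitude forced to zero where s < 0
    ```

    The step function crudely discards the problematic region rather than continuing the amplitude there, which is exactly the limitation the paper aims to address more directly.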

  17. A minimally invasive smile enhancement.

    PubMed

    Peck, Fred H

    2014-01-01

    Minimally invasive dentistry refers to a wide variety of dental treatments. On the restorative aspect of dental procedures, direct resin bonding can be a very conservative treatment option for the patient. When tooth structure does not need to be removed, the patient benefits. Proper treatment planning is essential to determine how conservative the restorative treatment will be. This article describes the diagnosis, treatment options, and procedural techniques in the restoration of 4 maxillary anterior teeth with direct composite resin. The procedural steps are reviewed with regard to placing the composite and the variety of colors needed to ensure a natural result. Finishing and polishing of the composite are critical to ending with a natural looking dentition that the patient will be pleased with for many years.

  18. Strategies to Minimize Antibiotic Resistance

    PubMed Central

    Lee, Chang-Ro; Cho, Ill Hwan; Jeong, Byeong Chul; Lee, Sang Hee

    2013-01-01

    Antibiotic resistance can be reduced by using antibiotics prudently based on guidelines of antimicrobial stewardship programs (ASPs) and various data such as pharmacokinetic (PK) and pharmacodynamic (PD) properties of antibiotics, diagnostic testing, antimicrobial susceptibility testing (AST), clinical response, and effects on the microbiota, as well as by new antibiotic developments. The controlled use of antibiotics in food animals is another cornerstone among efforts to reduce antibiotic resistance. All major resistance-control strategies recommend education for patients, children (e.g., through schools and day care), the public, and relevant healthcare professionals (e.g., primary-care physicians, pharmacists, and medical students) regarding unique features of bacterial infections and antibiotics, prudent antibiotic prescribing as a positive construct, and personal hygiene (e.g., handwashing). The problem of antibiotic resistance can be minimized only by the concerted efforts of all members of society to ensure the continued efficiency of antibiotics. PMID:24036486

  19. Minimally packed phases in holography

    NASA Astrophysics Data System (ADS)

    Donos, Aristomenis; Gauntlett, Jerome P.

    2016-03-01

    We numerically construct asymptotically AdS black brane solutions of D = 4 Einstein-Maxwell theory coupled to a pseudoscalar. The solutions are holographically dual to d = 3 CFTs at finite chemical potential and in a constant magnetic field, which spontaneously break translation invariance leading to the spontaneous formation of abelian and momentum magnetisation currents flowing around the plaquettes of a periodic Bravais lattice. We analyse the three-dimensional moduli space of lattice solutions, which are generically oblique, and show, for a specific value of the magnetic field, that the free energy is minimised by the triangular lattice, associated with minimal packing of circles in the plane. We show that the average stress tensor for the thermodynamically preferred phase is that of a perfect fluid and that this result applies more generally to spontaneously generated periodic phases. The triangular structure persists at low temperatures indicating the existence of novel crystalline ground states.
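    For context, the triangular (hexagonal) lattice mentioned above is the densest packing of equal circles in the plane; the quick check below evaluates the standard packing fractions (a textbook geometric fact, not a computation from the paper) and compares the triangular lattice with the square lattice.

    ```python
    import math

    # Packing fraction = (circle area per lattice cell) / (cell area),
    # for unit-radius circles centered on each lattice site.
    # Square lattice: one circle per 2x2 cell.
    square_density = math.pi / (2 * 2)
    # Triangular lattice: the unit cell is a rhombus of side 2 with a
    # 60-degree angle, area 2*2*sin(60deg) = 2*sqrt(3), holding one circle.
    triangular_density = math.pi / (2 * math.sqrt(3))

    print(round(square_density, 4), round(triangular_density, 4))  # 0.7854 0.9069
    ```

    The triangular lattice's larger packing fraction (about 0.907 versus 0.785) is the sense in which it is the optimal circle arrangement associated with the thermodynamically preferred phase.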

  20. Waste minimization in chrome plating

    SciTech Connect

    Scheuer, J.; Walter, K.; Nastasi, M.

    1996-09-01

    This is the final report of a one-year laboratory directed research and development project at the Los Alamos National Laboratory (LANL). Traditional wet chemical electroplating techniques utilize toxic materials and pose environmental hazards in the disposal of primary baths and waste waters. Pollutants include metals and nonmetals, such as oil, grease, phosphates, and toxic and organic compounds. This project is focused on development of plasma source ion implantation (PSII), a novel and cost-effective surface modification technique, to minimize and ultimately eliminate waste generated in chrome plating. We are collaborating with an industrial partner to design material systems, utilize the PSII processes in existing Los Alamos experimental facilities, and analyze both material and performance characteristics.

  1. Non-minimal Inflationary Attractors

    SciTech Connect

    Kallosh, Renata; Linde, Andrei E-mail: alinde@stanford.edu

    2013-10-01

    Recently we identified a new class of (super)conformally invariant theories which allow inflation even if the scalar potential is very steep in terms of the original conformal variables. Observational predictions of a broad class of such theories are nearly model-independent. In this paper we consider generalized versions of these models where the inflaton has a non-minimal coupling to gravity with a negative parameter ξ different from its conformal value -1/6. We show that these models exhibit attractor behavior. With even a slight increase of |ξ| from |ξ| = 0, predictions of these models for n_s and r rapidly converge to their universal model-independent values corresponding to conformal coupling ξ = −1/6. These values of n_s and r practically coincide with the corresponding values in the limit ξ → −∞.

  2. Grice's cooperative principle in the psychoanalytic setting.

    PubMed

    Ephratt, Michal

    2014-12-01

    Grice's "cooperative principle," including conversational implicatures and maxims, is commonplace in current pragmatics (a subfield of linguistics), and is generally applied in conversational analysis. The author examines the unique contribution of Grice's principle in considering the psychotherapeutic setting and its discourse. Such an investigation is called for chiefly because of the central role of speech in psychoanalytic practice (the "talking cure"). Symptoms and transference, which are characterized as forms of expression that are fundamentally deceptive, must be equivocal and indirect, and must breach all four of Grice's categories and maxims: Quality (truth), Relevance (relation), Manner (be clear), and Quantity. Therapeutic practice, according to Freud's "fundamental rule of psychoanalysis," encourages the parties (analysand and analyst) to breach each and every one of Grice's maxims. Using case reports drawn from the literature, the author shows that these breaches are essential for therapeutic progress. They serve as a unique and important ground for revealing inner (psychic) contents, and demarcating the real self from the illusive self, which in turn constitutes leverage for integrating these contents with the self.

  3. Minimally invasive spine stabilisation with long implants

    PubMed Central

    Logroscino, Carlo Ambrogio; Proietti, Luca

    2009-01-01

    Originally aimed at treating degenerative syndromes of the lumbar spine, percutaneous minimally invasive posterior fixation is nowadays even more frequently used to treat some thoracolumbar fractures. According to the modern principles of saving motion segments, a short implant (one level above and one level below the injured vertebra) is generally used to stabilise the injured spine. Although the authors generally use a short percutaneous fixation in treating thoracolumbar fractures with good results, they observed some cases in which the high fragmentation of the vertebral body and the presence of other associated diseases (co-morbidities) did not recommend the use of a short construct. The authors identified nine cases in which a long implant (two levels above and two levels below the injured vertebra) was performed by a percutaneous minimally invasive approach. Seven patients (five males/two females) were affected by thoracolumbar fractures. The T12 vertebra was involved in three cases, L1 in two cases, and T10 and L2 in one case each. Two fractures were classified as type A 3.1, two as A 3.2, two as A 3.3 and one as B 2.3, according to Magerl. In the present series, there were also two patients affected by a severe osteolysis of the spine (T9 and T12) due to tumoral localisation. All patients operated on with long instrumentation had a good outcome with prompt and uneventful clinical recovery. At the 1-year follow-up, all patients except one, who died 11 months after the operation, did not show any radiologic signs of mobilisation or failure of the implant. Based on the results of the present series, long percutaneous fixation seems to represent an effective and safe system to treat particular cases of vertebral lesions. In conclusion, the authors believe that a long implant might be an alternative surgical method compared to more aggressive or demanding procedures, which in a few patients could represent an overtreatment. PMID:19399530

  4. The Minimal Cost of Life in Space

    NASA Astrophysics Data System (ADS)

    Drysdale, A.; Rutkze, C.; Albright, L.; Ladue, R.

    Life in space requires protection from the external environment, provision of a suitable internal environment, provision of consumables to maintain life, and removal of wastes. Protection from the external environment will mainly require shielding from radiation and meteoroids. Provision of a suitable environment inside the spacecraft will require provision of suitable air pressure and composition, temperature, and protection from environmental toxins (trace contaminants) and pathogenic micro-organisms. Gravity may be needed for longer missions to avoid excessive changes such as decalcification and muscle degeneration. Similarly, the volume required per crewmember will increase as the mission duration increases. Consumables required include oxygen, food, and water. Nitrogen might be required, depending on the total pressure and non-metabolic losses. We normally provide these consumables from the Earth, with a greater or lesser degree of regeneration. In principle, all consumables can be regenerated. Water and air are easiest to regenerate. At the present time, food can only be regenerated by using plants, and higher plants at that. Waste must be removed, including carbon dioxide and other metabolic waste as well as trash such as food packaging, filters, and expended spare parts. This can be done by dumping or regeneration. The minimal cost of life in space would be achieved by using a synthesis process or system to regenerate all consumables from wastes. As the efficiency of the various processes rises, the minimal cost of life support will fall. However, real-world regeneration requires significant equipment, power, and crew time. Make-up will be required for those items that cannot be economically regenerated. For very inefficient processes, it might be cheaper to ship all or part of the consumables. We are currently far down the development curve, and for short missions it is cheaper to ship consumables. For longer duration missions, greater closure is cost-effective.

  5. Fundamental approaches in molecular biology for communication sciences and disorders

    PubMed Central

    Bartlett, Rebecca; Jetté, Marie E; King, Suzanne N.; Schaser, Allison; Thibeault, Susan L.

    2012-01-01

    Purpose This contemporary tutorial will introduce general principles of molecular biology, common DNA, RNA and protein assays, and their relevance in the field of communication sciences and disorders (CSD). Methods Over the past two decades, knowledge of the molecular pathophysiology of human disease has increased at a remarkable pace. Most of this progress can be attributed to concomitant advances in basic molecular biology and, specifically, the development of an ever-expanding armamentarium of technologies for analysis of DNA, RNA and protein structure and function. Details of these methodologies, their limitations and examples from the CSD literature are presented. Results/Conclusions The use of molecular biology techniques in the fields of speech, language and hearing sciences is increasing, creating a need for an understanding of molecular biology fundamentals and common experimental assays. PMID:22232415

  6. Some Fundamental Molecular Mechanisms of Contractility in Fibrous Macromolecules

    PubMed Central

    Mandelkern, L.

    1967-01-01

    The fundamental molecular mechanisms of contractility and tension development in fibrous macromolecules are developed from the point of view of the principles of polymer physical chemistry. The problem is treated in a general manner to encompass the behavior of all macromolecular systems irrespective of their detailed chemical structure and particular function, if any. Primary attention is given to the contractile process which accompanies the crystal-liquid transition in axially oriented macromolecular systems. The theoretical nature of the process is discussed, and many experimental examples are given from the literature which demonstrate the expected behavior. Experimental attention is focused on the contraction of fibrous proteins, and the same underlying molecular mechanism is shown to be operative for a variety of different systems. PMID:6050598

  7. The role of orbital mechanics in fundamental physics

    NASA Astrophysics Data System (ADS)

    Exertier, Pierre; Metris, Gilles

    The contribution of space techniques to fundamental physics operates at two levels. First, very interesting results have been obtained using precise tracking and orbitography of natural bodies or space probes not initially designed for this aim; this is the case, for example, of the precise estimation of the GM gravitational constant and of some PPN parameters, of the confirmation of the Lense-Thirring effect, and of the test of the strong Equivalence Principle. Second, dedicated missions have been established to perform in space experiments which cannot be realized on the ground, at least at the same level of precision; this is in particular the case of the time transfer experiment T2L2 and of the MicroSCOPE mission for the test of the weak EP.

  8. Minimal genome: Worthwhile or worthless efforts toward being smaller?

    PubMed

    Choe, Donghui; Cho, Suhyung; Kim, Sun Chang; Cho, Byung-Kwan

    2016-02-01

    Microbial cells are versatile hosts for the production of value-added products due to the well-established background knowledge, various genetic tools, and ease of manipulation. Despite those advantages, the efficiency of newly incorporated synthetic pathways in microbial cells is frequently limited by innate metabolism, product toxicity, and growth-mediated genetic instability. To overcome those obstacles, a minimal genome harboring only the essential set of genes was proposed, which is a fascinating concept with potential for use as a platform strain. Here, we review the currently available artificial reduced genomes and discuss the prospects for extending use of the genome-reduced strains as programmable chasses. The genome-reduced strains generally showed comparable growth to and higher productivity than their ancestral strains. In Escherichia coli, about 300 genes are estimated as the minimal number of genes under laboratory conditions. However, recent advances revealed that there are non-essential components in essential genes, suggesting that the design principle of minimal genomes should be reconstructed. Current technology is not efficient enough to reduce large amounts of interspersed genomic regions or to synthesize whole genomes. Furthermore, construction of minimal genomes has frequently failed due to lack of genomic information. Technological breakthroughs and intense systematic studies on genomes remain necessary tasks.

  9. Waste minimization in analytical methods

    SciTech Connect

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-05-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision.
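    One way to picture adding waste minimization to the selection factors listed above is a weighted decision matrix; the methods, weights, and scores below are entirely hypothetical, illustrating only how secondary-waste volume can enter as one more criterion alongside detection limit, accuracy, turnaround time, cost, and precision.

    ```python
    # Hypothetical weighted decision matrix for choosing an analytical method.
    # Criteria follow the abstract, with secondary-waste volume added.
    weights = {"detection": 0.2, "accuracy": 0.2, "turnaround": 0.15,
               "cost": 0.15, "precision": 0.15, "waste": 0.15}

    # Scores 1-10 (higher is better); methods and values are invented.
    methods = {
        "method_A": {"detection": 9, "accuracy": 9, "turnaround": 4,
                     "cost": 3, "precision": 8, "waste": 3},
        "method_B": {"detection": 7, "accuracy": 7, "turnaround": 8,
                     "cost": 8, "precision": 7, "waste": 9},
    }

    def score(method):
        """Weighted sum of criterion scores for one method."""
        return sum(weights[c] * methods[method][c] for c in weights)

    best = max(methods, key=score)
    print(best, round(score(best), 2))  # method_B 7.6
    ```

    With these invented numbers, the cheaper, lower-waste method wins despite its lower accuracy, mirroring the tradeoffs the abstract describes.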

  10. Heisenberg's observability principle

    NASA Astrophysics Data System (ADS)

    Wolff, Johanna

    2014-02-01

    Werner Heisenberg's 1925 paper 'Quantum-theoretical re-interpretation of kinematic and mechanical relations' marks the beginning of quantum mechanics. Heisenberg famously claims that the paper is based on the idea that the new quantum mechanics should be 'founded exclusively upon relationships between quantities which in principle are observable'. My paper is an attempt to understand this observability principle, and to see whether its employment is philosophically defensible. Against interpretations of 'observability' along empiricist or positivist lines I argue that such readings are philosophically unsatisfying. Moreover, a careful comparison of Heisenberg's reinterpretation of classical kinematics with Einstein's argument against absolute simultaneity reveals that the positivist reading does not fit with Heisenberg's strategy in the paper. Instead the appeal to observability should be understood as a specific criticism of the causal inefficacy of orbital electron motion in Bohr's atomic model. I conclude that the tacit philosophical principle behind Heisenberg's argument is not a positivistic connection between observability and meaning, but the idea that a theory should not contain causally idle wheels.

  11. Teaching professionalism: general principles.

    PubMed

    Cruess, Richard L; Cruess, Sylvia R

    2006-05-01

    There are educational principles that apply to the teaching of professionalism during undergraduate education and postgraduate training. It is axiomatic that there is a single cognitive base that applies with increasing moral force as students enter medical school, progress to residency or registrar training, and enter practice. While parts of this body of knowledge are easier to teach and learn at different stages of an individual's career, it remains a definable whole at all times and should be taught as such. While self-reflection on theoretical and real issues encountered in the life of a student, resident or practitioner remains essential to the acquisition of experiential learning and the incorporation of the values and behaviors of the professional, the opportunities to provide situations where this can take place will change as an individual progresses through the system, as will the sophistication of the level of learning. Teaching the cognitive base of professionalism and providing opportunities for the internalization of its values and behaviors are the cornerstones of the organization of the teaching of professionalism at all levels. Situated learning theory appears to provide practical guidance as to how this may be implemented. While the application of this theory will vary with the type of curriculum, the institutional culture and the resources available, the principles outlined should remain constant.

  12. Waste minimization plan construction and operation of the replacement cross-site transfer system, project W-058

    SciTech Connect

    Boucher, T.D.

    1996-04-01

    This report addresses the research and development of a waste minimization plan for the construction and operation of Project W-058, Replacement of the Cross-Site Transfer System, on the Hanford Site. The plan is based on Washington Administrative Code (WAC) 173-307, Plans. The waste minimization plan identifies areas where pollution prevention/waste minimization principles can be incorporated into the construction and operation of the cross-site transfer system.

  13. Context Effects in Western Herbal Medicine: Fundamental to Effectiveness?

    PubMed

    Snow, James

    2016-01-01

    Western herbal medicine (WHM) is a complex healthcare system that uses traditional plant-based medicines in patient care. Typical preparations are individualized polyherbal formulae that, unlike herbal pills, retain the odor and taste of whole herbs. Qualitative studies in WHM show patient-practitioner relationships to be collaborative. Health narratives are co-constructed, leading to assessments, and treatments with personal significance for participants. It is hypothesized that the distinct characteristics of traditional herbal preparations and patient-herbalist interactions, in conjunction with the WHM physical healthcare environment, evoke context (placebo) effects that are fundamental to the overall effectiveness of herbal treatment. These context effects may need to be minimized to demonstrate pharmacological efficacy of herbal formulae in randomized, placebo-controlled trials, optimized to demonstrate effectiveness of WHM in pragmatic trials, and consciously harnessed to enhance outcomes in clinical practice. PMID:26613625

  14. Context Effects in Western Herbal Medicine: Fundamental to Effectiveness?

    PubMed

    Snow, James

    2016-01-01

    Western herbal medicine (WHM) is a complex healthcare system that uses traditional plant-based medicines in patient care. Typical preparations are individualized polyherbal formulae that, unlike herbal pills, retain the odor and taste of whole herbs. Qualitative studies in WHM show patient-practitioner relationships to be collaborative. Health narratives are co-constructed, leading to assessments, and treatments with personal significance for participants. It is hypothesized that the distinct characteristics of traditional herbal preparations and patient-herbalist interactions, in conjunction with the WHM physical healthcare environment, evoke context (placebo) effects that are fundamental to the overall effectiveness of herbal treatment. These context effects may need to be minimized to demonstrate pharmacological efficacy of herbal formulae in randomized, placebo-controlled trials, optimized to demonstrate effectiveness of WHM in pragmatic trials, and consciously harnessed to enhance outcomes in clinical practice.

  15. Fundamental performance improvement to dispersive spectrograph based imaging technologies

    NASA Astrophysics Data System (ADS)

    Meade, Jeff T.; Behr, Bradford B.; Cenko, Andrew T.; Christensen, Peter; Hajian, Arsen R.; Hendrikse, Jan; Sweeney, Frederic D.

    2011-03-01

    Dispersive-based spectrometers may be qualified by their spectral resolving power and their throughput efficiency. A device known as a virtual slit is able to improve the resolving power by a factor of several with a minimal loss in throughput, thereby fundamentally improving the quality of the spectrometer. A virtual slit was built and incorporated into a low-performing spectrometer (R ~ 300) and was shown to increase the performance without a significant loss in signal. A description of the operation of virtual slits is also given. High-performance, low-light, and high-speed imaging instruments based on a dispersive-type spectrometer see the greatest impact from a virtual slit. A virtual slit is shown to substantially improve imaging quality in spectral domain optical coherence tomography (SD-OCT).

  16. Principle of Spacetime and Black Hole Equivalence

    NASA Astrophysics Data System (ADS)

    Zhang, Tianxi

    2016-06-01

    Modelling the universe without relying on a set of hypothetical entities (HEs) to explain observations and overcome problems and difficulties is essential to developing a physical cosmology. The well-known big bang cosmology, widely accepted as the standard model, stands on two fundamentals: Einstein’s general relativity (GR), which describes the effect of matter on spacetime, and the cosmological principle (CP) of spacetime isotropy and homogeneity. The field equation of GR, along with the Friedmann-Lemaître-Robertson-Walker (FLRW) metric of spacetime derived from CP, generates the Friedmann equation (FE) that governs the development and dynamics of the universe. The big bang theory has made impressive successes in explaining the universe, but it still has problems, and the solutions to them rely on an increasing number of HEs such as inflation, dark matter, dark energy, and so on. Recently, the author has developed a new cosmological model called the black hole universe, which, instead of making many such hypotheses, adds only a single new postulate (or principle) to cosmology - the Principle of Spacetime and Black Hole Equivalence (SBHEP) - to explain all the existing observations of the universe and overcome all the existing problems in conventional cosmologies. This study thoroughly demonstrates how this newly developed black hole universe model, which thus stands on three fundamentals (GR, CP, and SBHEP), can fully explain the universe and readily overcome the existing difficulties in accordance with well-developed physics, thereby neither needing any other hypotheses nor leaving any unsolved difficulties. This work was supported by NSF/REU (Grant #: PHY-1263253) at Alabama A & M University.

  17. Fundamentals of microfluidic cell culture in controlled microenvironments†

    PubMed Central

    Young, Edmond W. K.; Beebe, David J.

    2010-01-01

    Microfluidics has the potential to revolutionize the way we approach cell biology research. The dimensions of microfluidic channels are well suited to the physical scale of biological cells, and the many advantages of microfluidics make it an attractive platform for new techniques in biology. One of the key benefits of microfluidics for basic biology is the ability to control parameters of the cell microenvironment at relevant length and time scales. Considerable progress has been made in the design and use of novel microfluidic devices for culturing cells and for subsequent treatment and analysis. With the recent pace of scientific discovery, it is becoming increasingly important to evaluate existing tools and techniques, and to synthesize fundamental concepts that would further improve the efficiency of biological research at the microscale. This tutorial review integrates fundamental principles from cell biology and local microenvironments with cell culture techniques and concepts in microfluidics. Culturing cells in microscale environments requires knowledge of multiple disciplines including physics, biochemistry, and engineering. We discuss basic concepts related to the physical and biochemical microenvironments of the cell, physicochemical properties of that microenvironment, cell culture techniques, and practical knowledge of microfluidic device design and operation. We also discuss the most recent advances in microfluidic cell culture and their implications on the future of the field. The goal is to guide new and interested researchers to the important areas and challenges facing the scientific community as we strive toward full integration of microfluidics with biology. PMID:20179823

  18. Maximum Principles and Boundary Value Problems for FDEs

    NASA Astrophysics Data System (ADS)

    Domoshnitsky, Alexander

    2009-05-01

    The maximum principles form one of the classical parts of the qualitative theory of ordinary and partial differential equations. Although assertions about maximum principles for functional differential equations can be interpreted, in a corresponding sense, as analogs of the classical ones for ordinary differential equations, they do not imply the important corollaries reached on the basis of finite-dimensional fundamental systems. For example, results associated with the maximum principles, in contrast with the cases of ordinary and even partial differential equations, do not add as much in problems of existence and uniqueness. In this paper we obtain maximum principles for functional differential equations, and on this basis new results on the existence and uniqueness of solutions for various boundary value problems are proposed.

  19. Mini-Med School Planning Guide

    ERIC Educational Resources Information Center

    National Institutes of Health, Office of Science Education, 2008

    2008-01-01

    Mini-Med Schools are public education programs now offered by more than 70 medical schools, universities, research institutions, and hospitals across the nation. There are even Mini-Med Schools in Ireland, Malta, and Canada! The program is typically a lecture series that meets once a week and provides "mini-med students" information on some of the…

  20. Closed locally minimal nets on tetrahedra

    SciTech Connect

    Strelkova, Nataliya P

    2011-01-31

    Closed locally minimal networks are in a sense a generalization of closed geodesics. A complete classification is known of closed locally minimal networks on regular (and generally any equihedral) tetrahedra. In the present paper certain necessary and certain sufficient conditions are given for at least one closed locally minimal network to exist on a given non-equihedral tetrahedron. Bibliography: 6 titles.

  1. Minimally Invasive Mitral Valve Surgery II

    PubMed Central

    Wolfe, J. Alan; Malaisrie, S. Chris; Farivar, R. Saeid; Khan, Junaid H.; Hargrove, W. Clark; Moront, Michael G.; Ryan, William H.; Ailawadi, Gorav; Agnihotri, Arvind K.; Hummel, Brian W.; Fayers, Trevor M.; Grossi, Eugene A.; Guy, T. Sloane; Lehr, Eric J.; Mehall, John R.; Murphy, Douglas A.; Rodriguez, Evelio; Salemi, Arash; Segurola, Romualdo J.; Shemin, Richard J.; Smith, J. Michael; Smith, Robert L.; Weldner, Paul W.; Lewis, Clifton T. P.; Barnhart, Glenn R.; Goldman, Scott M.

    2016-01-01

    Abstract Techniques for minimally invasive mitral valve repair and replacement continue to evolve. This expert opinion, the second of a 3-part series, outlines current best practices for nonrobotic, minimally invasive mitral valve procedures, and for postoperative care after minimally invasive mitral valve surgery. PMID:27654406

  2. Fundamentals of green chemistry: efficiency in reaction design.

    PubMed

    Sheldon, Roger A

    2012-02-21

    In this tutorial review, the fundamental concepts underlying the principles of green and sustainable chemistry--atom and step economy and the E factor--are presented, within the general context of efficiency in organic synthesis. The importance of waste minimisation through the widespread application of catalysis in all its forms--homogeneous, heterogeneous, organocatalysis and biocatalysis--is discussed. These general principles are illustrated with simple practical examples, such as alcohol oxidation and carbonylation and the asymmetric reduction of ketones. The latter reaction is exemplified by a three enzyme process for the production of a key intermediate in the synthesis of the cholesterol lowering agent, atorvastatin. The immobilisation of enzymes as cross-linked enzyme aggregates (CLEAs) as a means of optimizing operational performance is presented. The use of immobilised enzymes in catalytic cascade processes is illustrated with a trienzymatic process for the conversion of benzaldehyde to (S)-mandelic acid using a combi-CLEA containing three enzymes. Finally, the transition from fossil-based chemicals manufacture to a more sustainable biomass-based production is discussed. PMID:22033698
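    The two metrics named in this abstract are simple mass ratios and can be sketched directly. The Python snippet below is an illustration, not from the review itself; the benzyl alcohol numbers are a hypothetical instance of the alcohol-oxidation example. Atom economy is the fraction of total reactant mass retained in the product, and the E factor is the mass of waste generated per mass of product:

```python
def atom_economy(product_mw, reactant_mws):
    """Percent of the summed reactant molecular weight retained in the product."""
    return 100.0 * product_mw / sum(reactant_mws)

def e_factor(total_waste_kg, product_kg):
    """Kilograms of waste generated per kilogram of product (lower is greener)."""
    return total_waste_kg / product_kg

# Aerobic oxidation: benzyl alcohol (MW 108.14) + 1/2 O2 (16.00)
#                    -> benzaldehyde (MW 106.12) + H2O
ae = atom_economy(106.12, [108.14, 16.00])  # ~85.5%, water the only by-product
```

    By contrast, a classical stoichiometric oxidant would leave most of the reactant mass as waste, which is the review's motivation for emphasizing catalysis.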

  3. Fundamental role of bistability in optimal homeostatic control

    NASA Astrophysics Data System (ADS)

    Wang, Guanyu

    2013-03-01

    Bistability is a fundamental phenomenon in nature and has a number of fine properties. However, these properties are consequences of bistability at the physiological level, which do not explain why it had to emerge during evolution. Using optimal homeostasis as the first principle and Pontryagin's Maximum Principle as the optimization approach, I find that bistability emerges as an indispensable control mechanism. Because the mathematical model is general and the result is independent of parameters, it is likely that most biological systems use bistability to control homeostasis. Glucose homeostasis represents a good example. It turns out that bistability is the only solution to a dilemma in glucose homeostasis: high insulin efficiency is required for rapid plasma glucose clearance, whereas an insulin-sparing state is required to guarantee the brain's safety during fasting. This new perspective can illuminate studies on the twin epidemics of obesity and diabetes and the corresponding intervening strategies. For example, overnutrition and a sedentary lifestyle may represent sudden environmental changes that cause the loss of optimality, which may contribute to the marked rise of obesity and diabetes in our generation.

  5. Principles of tendon transfers.

    PubMed

    Coulet, B

    2016-04-01

    Tendon transfers are carried out to restore functional deficits by rerouting the remaining intact muscles. Transfers are highly attractive in the context of hand surgery because of the possibility of restoring the patient's ability to grip. In palsy cases, tendon transfers are only used when a neurological procedure is contraindicated or has failed. The strategy used to restore function follows a common set of principles, no matter the nature of the deficit. The first step is to clearly distinguish between deficient muscles and muscles that could be transferred. Next, the type of palsy will dictate the scope of the program and the complexity of the gripping movements that can be restored. Based on this reasoning, a surgical strategy that matches the means (transferable muscles) with the objectives (functions to restore) will be established and clearly explained to the patient. Every paralyzed hand can be described using three parameters. 1) Deficient segments: wrist, thumb and long fingers. 2) Mechanical performance of the muscle groups being revived: high-energy movements (wrist extension and finger flexion) require strong transfers with long excursion, whereas low-energy movements (wrist flexion and finger extension) are less demanding mechanically, because in some cases they can be accomplished through gravity alone. 3) Condition of the two primary motors in the hand: extrinsics (flexors and extensors) and intrinsics (facilitators). No matter the type of palsy, the transfer surgery follows the same technical principles: exposure, release, fixation, tensioning and rehabilitation. By performing an in-depth analysis of each case and by following strict technical principles, tendon transfer surgery leads to reproducible results; this allows the surgeon to establish clear objectives for the patient preoperatively. PMID:27117119

  6. Representations in Dynamical Embodied Agents: Re-Analyzing a Minimally Cognitive Model Agent

    ERIC Educational Resources Information Center

    Mirolli, Marco

    2012-01-01

    Understanding the role of "representations" in cognitive science is a fundamental problem facing the emerging framework of embodied, situated, dynamical cognition. To make progress, I follow the approach proposed by an influential representational skeptic, Randall Beer: building artificial agents capable of minimally cognitive behaviors and…

  7. Optical chiral metamaterials: a review of the fundamentals, fabrication methods and applications.

    PubMed

    Wang, Zuojia; Cheng, Feng; Winsor, Thomas; Liu, Yongmin

    2016-10-14

    Optical chiral metamaterials have recently attracted considerable attention because they offer new and exciting opportunities for fundamental research and practical applications. Through pragmatic designs, the chiroptical response of chiral metamaterials can be several orders of magnitude higher than that of natural chiral materials. Meanwhile, the local chiral fields can be enhanced by plasmonic resonances to drive a wide range of physical and chemical processes in both linear and nonlinear regimes. In this review, we will discuss the fundamental principles of chiral metamaterials, various optical chiral metamaterials realized by different nanofabrication approaches, and the applications and future prospects of this emerging field.

  8. Optical chiral metamaterials: a review of the fundamentals, fabrication methods and applications

    NASA Astrophysics Data System (ADS)

    Wang, Zuojia; Cheng, Feng; Winsor, Thomas; Liu, Yongmin

    2016-10-01

    Optical chiral metamaterials have recently attracted considerable attention because they offer new and exciting opportunities for fundamental research and practical applications. Through pragmatic designs, the chiroptical response of chiral metamaterials can be several orders of magnitude higher than that of natural chiral materials. Meanwhile, the local chiral fields can be enhanced by plasmonic resonances to drive a wide range of physical and chemical processes in both linear and nonlinear regimes. In this review, we will discuss the fundamental principles of chiral metamaterials, various optical chiral metamaterials realized by different nanofabrication approaches, and the applications and future prospects of this emerging field.

  9. Supramolecular chemistry and chemical warfare agents: from fundamentals of recognition to catalysis and sensing.

    PubMed

    Sambrook, M R; Notman, S

    2013-12-21

    Supramolecular chemistry presents many possible avenues for the mitigation of the effects of chemical warfare agents (CWAs), including sensing, catalysis and sequestration. To date, efforts in this field both to study fundamental interactions between CWAs and to design and exploit host systems remain sporadic. In this tutorial review the non-covalent recognition of CWAs is considered from first principles, including taking inspiration from enzymatic systems, and gaps in fundamental knowledge are indicated. Examples of synthetic systems developed for the recognition of CWAs are discussed with a focus on the supramolecular complexation behaviour and non-covalent approaches rather than on the proposed applications.

  11. The average enzyme principle.

    PubMed

    Reznik, Ed; Chaudhary, Osman; Segrè, Daniel

    2013-09-01

    The Michaelis-Menten equation for an irreversible enzymatic reaction depends linearly on the enzyme concentration. Even if the enzyme concentration changes in time, this linearity implies that the amount of substrate depleted during a given time interval depends only on the average enzyme concentration. Here, we use a time re-scaling approach to generalize this result to a broad category of multi-reaction systems, whose constituent enzymes have the same dependence on time, e.g. they belong to the same regulon. This "average enzyme principle" provides a natural methodology for jointly studying metabolism and its regulation.

  12. Equivalence Principle in Cosmology

    NASA Astrophysics Data System (ADS)

    Kopeikin, Sergei

    2014-01-01

    We analyse the Einstein equivalence principle (EEP) for a Hubble observer in Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime. We show that the affine structure of the light cone in the FLRW spacetime should be treated locally in terms of the optical metric g_αβ, which is not reduced to the Minkowski metric f_αβ due to the nonuniform parametrization of the local equations of light propagation with the proper time of the observer's clock. The physical consequence of this difference is that the Doppler shift of radio waves measured locally is affected by the Hubble expansion.

  13. Talus fractures: surgical principles.

    PubMed

    Rush, Shannon M; Jennings, Meagan; Hamilton, Graham A

    2009-01-01

    Surgical treatment of talus fractures can challenge even the most skilled foot and ankle surgeon. Complicated fracture patterns combined with joint dislocation of variable degrees require accurate assessment, sound understanding of principles of fracture care, and broad command of internal fixation techniques needed for successful surgical care. Elimination of unnecessary soft tissue dissection, a low threshold for surgical reduction, liberal use of malleolar osteotomy to expose body fracture, and detailed attention to fracture reduction and joint alignment are critical to the success of treatment. Even with the best surgical care complications are common and seem to correlate with injury severity and open injuries. PMID:19121756

  14. Principles of smile design

    PubMed Central

    Bhuvaneswaran, Mohan

    2010-01-01

    An organized and systematic approach is required to evaluate, diagnose and resolve esthetic problems predictably. It is of prime importance that the final result does not depend on looks alone. Our ultimate goal as clinicians is to achieve a pleasing composition in the smile by creating an arrangement of various esthetic elements. This article reviews the various principles that govern the art of smile design. The literature search was done using PubMed and Medline. This article will provide basic knowledge to help the reader bring out a functional, stable smile. PMID:21217950

  15. Principles of smile design.

    PubMed

    Bhuvaneswaran, Mohan

    2010-10-01

    An organized and systematic approach is required to evaluate, diagnose and resolve esthetic problems predictably. It is of prime importance that the final result does not depend on looks alone. Our ultimate goal as clinicians is to achieve a pleasing composition in the smile by creating an arrangement of various esthetic elements. This article reviews the various principles that govern the art of smile design. The literature search was done using PubMed and Medline. This article will provide basic knowledge to help the reader bring out a functional, stable smile.

  16. Recursively minimally-deformed oscillators

    NASA Astrophysics Data System (ADS)

    Katriel, J.; Quesne, C.

    1996-04-01

    A recursive deformation of the boson commutation relation is introduced. Each step consists of a minimal deformation of a commutator [a, a†] = f_k(...; n̂) into [a, a†]_{q_{k+1}} = f_k(...; n̂), where ... stands for the set of deformation parameters that f_k depends on, followed by a transformation into the commutator [a, a†] = f_{k+1}(..., q_{k+1}; n̂) to which the deformed commutator is equivalent within the Fock space. Starting from the harmonic oscillator commutation relation [a, a†] = 1 we obtain the Arik-Coon and Macfarlane-Biedenharn oscillators at the first and second steps, respectively, followed by a sequence of multiparameter generalizations. Several other types of deformed commutation relations related to the treatment of integrable models and to parastatistics are also obtained. The "generic" form consists of a linear combination of exponentials of the number operator, and the various recursive families can be classified according to the number of free linear parameters involved, which depends on the form of the initial commutator.
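    As a sketch of the first recursion step (my reconstruction from the abstract, not the paper's own notation): starting from the undeformed commutator f_0 = 1, the minimal q-deformation gives the Arik-Coon oscillator, and within the Fock space the deformed bracket is equivalent to an undeformed bracket whose right-hand side is an exponential of the number operator:

```latex
[a, a^{\dagger}] = f_0 = 1
\;\longrightarrow\;
[a, a^{\dagger}]_{q_1} \equiv a a^{\dagger} - q_1\, a^{\dagger} a = 1
\quad \text{(Arik--Coon)},
\qquad
a^{\dagger} a = [\hat{n}]_{q_1}, \quad [n]_q = \frac{q^{n} - 1}{q - 1},
\qquad
[a, a^{\dagger}] = [\hat{n}+1]_{q_1} - [\hat{n}]_{q_1}
  = q_1^{\hat{n}} \equiv f_1(q_1; \hat{n}).
```

    The result q_1^{n̂} is consistent with the "generic" form noted in the abstract: a linear combination of exponentials of the number operator.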

  17. Differentially Private Empirical Risk Minimization

    PubMed Central

    Chaudhuri, Kamalika; Monteleoni, Claire; Sarwate, Anand D.

    2011-01-01

    Privacy-preserving machine learning algorithms are crucial for the increasingly common setting in which personal data, such as medical or financial records, are analyzed. We provide general techniques to produce privacy-preserving approximations of classifiers learned via (regularized) empirical risk minimization (ERM). These algorithms are private under the ε-differential privacy definition due to Dwork et al. (2006). First we apply the output perturbation ideas of Dwork et al. (2006) to ERM classification. Then we propose a new method, objective perturbation, for privacy-preserving machine learning algorithm design. This method entails perturbing the objective function before optimizing over classifiers. If the loss and regularizer satisfy certain convexity and differentiability criteria, we prove theoretical results showing that our algorithms preserve privacy, and provide generalization bounds for linear and nonlinear kernels. We further present a privacy-preserving technique for tuning the parameters in general machine learning algorithms, thereby providing end-to-end privacy guarantees for the training process. We apply these results to produce privacy-preserving analogues of regularized logistic regression and support vector machines. We obtain encouraging results from evaluating their performance on real demographic and benchmark data sets. Our results show that both theoretically and empirically, objective perturbation is superior to the previous state-of-the-art, output perturbation, in managing the inherent tradeoff between privacy and learning performance. PMID:21892342
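    The output-perturbation step described above can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: it assumes a 1-Lipschitz loss, for which the L2 sensitivity of the regularized minimizer is 2/(n·λ), and adds noise with a uniformly random direction and a Gamma(d, 2/(n·λ·ε))-distributed norm:

```python
import math
import random

def train_logreg(data, lam, lr=0.1, steps=500):
    """L2-regularized logistic regression fit by plain gradient descent.
    data: list of (x, y) pairs with x a feature list and y in {-1, +1}."""
    d, n = len(data[0][0]), len(data)
    w = [0.0] * d
    for _ in range(steps):
        grad = [lam * wi for wi in w]              # gradient of the regularizer
        for x, y in data:
            margin = y * sum(wi * xi for wi, xi in zip(w, x))
            coef = -y / (1.0 + math.exp(margin))   # derivative of the logistic loss
            for j in range(d):
                grad[j] += coef * x[j] / n
        w = [wi - lr * gi for wi, gi in zip(w, grad)]
    return w

def output_perturbation(w, n, lam, eps, rng=random):
    """Privatize a trained weight vector: noise direction uniform on the sphere,
    noise norm ~ Gamma(d, 2/(n*lam*eps)), matching the 2/(n*lam) sensitivity."""
    d = len(w)
    g = [rng.gauss(0.0, 1.0) for _ in range(d)]
    norm = math.sqrt(sum(gi * gi for gi in g))
    radius = rng.gammavariate(d, 2.0 / (n * lam * eps))
    return [wi + radius * gi / norm for wi, gi in zip(w, g)]
```

    A larger ε (weaker privacy) or a larger data set shrinks the noise. Objective perturbation, the paper's second method, instead adds a random linear term to the training objective before optimizing.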

  18. LESSons in minimally invasive urology.

    PubMed

    Dev, Harveer; Sooriakumaran, Prasanna; Tewari, Ashutosh; Rane, Abhay

    2011-05-01

    Since the introduction of laparoscopic surgery, the promise of lower postoperative morbidity and improved cosmesis has been achieved. LaparoEndoscopic Single Site (LESS) surgery potentially takes this further. Following the first human urological LESS report in 2007, numerous case series have emerged, as well as studies comparing LESS with standard laparoscopy. Technological developments in instrumentation, access and optics devices are overcoming some of the challenges that are raised when operating through a single site. Further advances in the technique have included the incorporation of robotics (R-LESS), which exploits the ergonomic benefits of ex vivo robotic platforms in an attempt to further improve the implementation of LESS procedures. In the future, urologists may be able to benefit from in vivo micro-robots that will allow the manipulation of tissue from internal repositionable platforms. The use of magnetic anchoring and guidance systems (MAGS) might allow the external manoeuvring of intra-corporeal instruments to reduce clashing and facilitate triangulation. However, the final promise in minimally invasive surgery is natural orifice transluminal endoscopic surgery (NOTES), with its scarless technique. It remains to be seen whether NOTES, LESS, or any of these future developments will prove their clinical utility over standard laparoscopic methods.

  19. Medical waste: a minimal hazard.

    PubMed

    Keene, J H

    1991-11-01

    Medical waste is a subset of municipal waste, and regulated medical waste comprises less than 1% of the total municipal waste volume in the United States. As part of the overall waste stream, medical waste does contribute in a relative way to the aesthetic damage of the environment. Likewise, some small portion of the total release of hazardous chemicals and radioactive materials is derived from medical wastes. These comments can be made about any generated waste, regulated or unregulated. Healthcare professionals, including infection control personnel, microbiologists, public health officials, and others, have unsuccessfully argued that there is no evidence that past methods of treatment and disposal of regulated medical waste constitute any public health hazard. Historically, discovery of environmental contamination by toxic chemical disposal has followed assurances that the material was being disposed of in a safe manner. Therefore, a cynical public and its elected officials have demanded proof that the treatment and disposal of medical waste (i.e., infectious waste) do not constitute a public health hazard. Existent studies on municipal waste provide that proof. In order to argue that the results of these municipal waste studies are demonstrative of the minimal potential infectious environmental impact and lack of public health hazard associated with medical waste, we must accept the following: that the pathogens are the same whether they come from the hospital or the community, and that the municipal waste studied contained waste materials we now define as regulated medical waste.(ABSTRACT TRUNCATED AT 250 WORDS)

  20. The Principle of Maximum Conformality

    SciTech Connect

    Brodsky, Stanley J.; Di Giustino, Leonardo; /SLAC

    2011-04-05

    A key problem in making precise perturbative QCD predictions is the uncertainty in determining the renormalization scale of the running coupling α_s(μ²). It is common practice to guess a physical scale μ = Q which is of the order of a typical momentum transfer Q in the process, and then vary the scale over a range Q/2 to 2Q. This procedure is clearly problematic since the resulting fixed-order pQCD prediction will depend on the renormalization scheme, and it can even predict negative QCD cross sections at next-to-leading order. Other heuristic methods to set the renormalization scale, such as the 'principle of minimal sensitivity', give unphysical results for jet physics, sum physics not associated with renormalization into the running coupling, and violate the transitivity property of the renormalization group. Such scale-setting methods also give incorrect results when applied to Abelian QED. Note that the factorization scale in QCD is introduced to match nonperturbative and perturbative aspects of the parton distributions in hadrons; it is present even in conformal theory and thus is a completely separate issue from renormalization scale setting. The PMC provides a consistent method for determining the renormalization scale in pQCD. The PMC scale-fixed prediction is independent of the choice of renormalization scheme, a key requirement of renormalization group invariance. The results avoid renormalon resummation and agree with QED scale setting in the Abelian limit. The PMC global scale can be derived efficiently at NLO from basic properties of the pQCD cross section. The elimination of the renormalization scheme ambiguity using the PMC will not only increase the precision of QCD tests, but will also increase the sensitivity of colliders to new physics beyond the Standard Model.