Science.gov

Sample records for minimization fundamental principles

  1. Fundamental principles of particle detectors

    SciTech Connect

    Fernow, R.C.

    1988-01-01

    This paper reviews the fundamental physics of particle-matter interactions necessary for detecting these particles with detectors. A listing of 41 concepts and detector principles is given. 14 refs., 11 figs.

  2. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^(−βW)⟩ = e^(−βΔF), a change in the fluctuations of e^(−βW) may impact how rapidly the statistical average of e^(−βW) converges towards the theoretical value e^(−βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(−βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(−βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, if the work protocol is realizable by an adiabatic process, then the classical adiabatic process also yields the minimal fluctuations in e^(−βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)].
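
    As a toy illustration of the convergence issue described above (a sketch, not the authors' Landau-Zener model; all parameter values are hypothetical), the snippet below draws work values from a Gaussian distribution, for which the Jarzynski equality fixes the mean once the variance is chosen, and shows that larger fluctuations in e^(−βW) degrade the free-energy estimate at a fixed sample size.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    beta, dF = 1.0, 1.0  # inverse temperature and target free-energy difference

    # For Gaussian work W ~ N(mu, sigma^2), the Jarzynski equality
    # <exp(-beta*W)> = exp(-beta*dF) forces mu = dF + beta*sigma**2/2.
    for sigma in (0.5, 2.0):  # small vs. large work fluctuations
        mu = dF + beta * sigma**2 / 2
        W = rng.normal(mu, sigma, size=100_000)
        x = np.exp(-beta * W)
        dF_est = -np.log(x.mean()) / beta
        print(f"sigma={sigma}: var(exp(-bW))={x.var():.3f}, dF estimate={dF_est:.3f}")
    ```

    The estimate with the larger work variance converges far more slowly, which is the behavior the principle of minimal work fluctuations addresses.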

  3. Principle of minimal work fluctuations

    NASA Astrophysics Data System (ADS)

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^(−βW)⟩ = e^(−βΔF), a change in the fluctuations of e^(−βW) may impact how rapidly the statistical average of e^(−βW) converges towards the theoretical value e^(−βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(−βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(−βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, if the work protocol is realizable by an adiabatic process, then the classical adiabatic process also yields the minimal fluctuations in e^(−βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014), 10.1103/PhysRevE.90.052132].

  4. Gas cell neutralizers (Fundamental principles)

    SciTech Connect

    Fuehrer, B.

    1985-06-01

    Neutralizing an ion beam of the size and energy levels involved in the neutral-particle-beam program represents a considerable extension of the state of the art of neutralizer technology. Many different media (e.g., solid, liquid, gas, plasma, photons) can be used to strip the hydrogen ion of its extra electron. A large, multidisciplinary R and D effort will no doubt be required to sort out all of the "pros and cons" of these various techniques. The purpose of this particular presentation is to discuss some basic configurations and fundamental principles of the gas type of neutralizer cell. Particular emphasis is placed on the "Gasdynamic Free-Jet" neutralizer, since this configuration has the potential of being much shorter than other types of gas cells (in the beam direction) and it could operate in a nearly continuous mode (CW) if necessary. These were important considerations in the ATSU design, which is discussed in some detail in the second presentation entitled "ATSU Point Design".

  5. Diesel fundamentals: Principles and service

    SciTech Connect

    Thiessen, F.J.; Dales, D.N.

    1986-01-01

    The contents of this book are: Tools and fasteners; Engine performance factors; Camshafts and camshaft drives; Cylinder heads and valves; Cylinder head and valve service; Diesel fuel injection; Cummins PT fuel injection system; Caterpillar fuel injection systems; Electrical principles; AC charging systems; and Cranking systems.

  6. Fundamental Principles of Proper Space Kinematics

    NASA Astrophysics Data System (ADS)

    Wade, Sean

    It is desirable to understand the movement of both matter and energy in the universe based upon fundamental principles of space and time. Time dilation and length contraction are features of Special Relativity derived from the observed constancy of the speed of light. Quantum Mechanics asserts that motion in the universe is probabilistic and not deterministic. While the practicality of these dissimilar theories is well established through widespread application, inconsistencies in their marriage persist, marring their utility and preventing their full expression. After identifying an error in perspective, the current theories are tested by modifying logical assumptions to eliminate paradoxical contradictions. Analysis of simultaneous frames of reference leads to a new formulation of space and time that predicts the motion of both kinds of particles. Proper Space is a real, three-dimensional space clocked by proper time that is undergoing a densification at the rate of c. Coordinate transformations to a familiar object space and a mathematical stationary space clarify the counterintuitive aspects of Special Relativity. These symmetries demonstrate that within the local universe stationary observers are a forbidden frame of reference; all is in motion. In lieu of Quantum Mechanics and Uncertainty, the use of the imaginary number i is restricted to the labeling of mass as either material or immaterial. This material phase difference accounts for both the perceived constant velocity of light and its apparent statistical nature. The application of Proper Space Kinematics will advance more accurate representations of microscopic, macroscopic, and cosmological processes and serve as a foundation for further study and reflection, thereafter leading to greater insight.

  7. The "Fundamental Pedogagical Principle" in Second Language Teaching.

    ERIC Educational Resources Information Center

    Krashen, Stephen D.

    1981-01-01

    A fundamental principle of second language acquisition is stated and applied to language teaching. The principle states that learners acquire a second language when they receive comprehensible input in situations where their affective filters are sufficiently low. The theoretical background of this principle consists of five hypotheses: the…

  8. Fundamental Ethical Principles in Sports Medicine.

    PubMed

    Devitt, Brian M

    2016-04-01

    In sports medicine, the practice of ethics presents many unique challenges because of the unusual clinical environment of caring for players within the context of a team whose primary goal is to win. Ethical issues frequently arise because a doctor-patient-team triad often replaces the traditional doctor-patient relationship. Conflict may exist when the team's priority clashes with or even replaces the doctor's obligation to player well-being. Customary ethical norms that govern most forms of clinical practice, such as autonomy and confidentiality, are not easily translated to sports medicine. Ethical principles and examples of how they relate to sports medicine are discussed.

  9. Functional Neuroimaging: Fundamental Principles and Clinical Applications.

    PubMed

    Khanna, Nishanth; Altmeyer, Wilson; Zhuo, Jiachen; Steven, Andrew

    2015-04-01

    Functional imaging modalities, such as functional magnetic resonance imaging (fMRI) and diffusion tensor imaging (DTI), are rapidly changing the scope and practice of neuroradiology. While these modalities have long been used in research, they are increasingly being used in clinical practice to enable reliable identification of eloquent cortex and white matter tracts in order to guide treatment planning and to serve as a diagnostic supplement when traditional imaging fails. An understanding of the scientific principles underlying fMRI and DTI is necessary in current radiological practice. fMRI relies on a compensatory hemodynamic response seen in cortical activation and the intrinsic discrepant magnetic properties of deoxy- and oxyhemoglobin. Neuronal activity can be indirectly visualized based on a hemodynamic response, termed neurovascular coupling. fMRI demonstrates utility in identifying areas of cortical activation (i.e., task-based activation) and in discerning areas of neuronal connectivity when used during the resting state, termed resting state fMRI. While fMRI is limited to visualization of gray matter, DTI permits visualization of white matter tracts through diffusion restriction along different axes. We will discuss the physical, statistical and physiological principles underlying these functional imaging modalities and explore new promising clinical applications.

  10. [Fundamentals and principles of grafts and flaps].

    PubMed

    Cruz-Navarro, Natalio; León-Dueñas, Eduardo

    2014-01-01

    Reconstructive surgery of large urethral stenosis and the management of congenital anomalies such as hypospadias and epispadias require covering large cutaneous and mucosal defects with different techniques. The objective of this work is to define the main differences between the tissues to be transferred and to study the principles that must govern the management of the various flaps and grafts used in these techniques. We analyze the anatomical and physiological features that may be key to understanding the success and possible failures of these procedures, and we review the technical details that must accompany every case, not only during the operation but also during the preoperative and postoperative periods. We conclude by stating that grafts (mainly oral and preputial mucosa) and flaps are increasingly used for the repair of urethral stenosis. Grafts must be prepared adequately on the back table and thinned as much as possible, and also fixed properly, to guarantee their immobility until neovascularization is assured.

  11. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    PubMed

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are all based on either a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potential regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality either with Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
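
    The two proportionalities quoted above can be collected into a single scaling relation. The snippet below is only a sketch of that scaling; the prefactor k is a placeholder, not a value derived in the paper.

    ```python
    import numpy as np

    def velocity_uncertainty_limit(v, P_scattered, k=1.0):
        """Scaling of the fundamental velocity-uncertainty limit quoted above:
        sigma_v ~ |v|**(3/2) / sqrt(P_scattered).
        k is a hypothetical prefactor (set by wavelength, geometry, etc.)."""
        return k * np.abs(v) ** 1.5 / np.sqrt(P_scattered)

    # Doubling the scattered light power lowers the limit by sqrt(2);
    # doubling the velocity raises it by a factor of 2**1.5.
    print(velocity_uncertainty_limit(10.0, 1.0))
    print(velocity_uncertainty_limit(10.0, 2.0))
    print(velocity_uncertainty_limit(20.0, 1.0))
    ```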

  12. Fundamental Principles of Coherent-Feedback Quantum Control

    DTIC Science & Technology

    2014-12-08

    AFRL-OSR-VA-TR-2015-0009 FUNDAMENTAL PRINCIPLES OF COHERENT-FEEDBACK QUANTUM CONTROL Hideo Mabuchi LELAND STANFORD JUNIOR UNIV CA Final Report 12/08 ... foundations and potential applications of coherent-feedback quantum control. We have focused on potential applications in quantum-enhanced metrology and ... picture of how coherent feedback can provide a kind of circuit/network theory for quantum engineering, enabling rigorous analysis and numerical simulation

  13. Applying Minimal Manual Principles for Documentation of Graphical User Interfaces.

    ERIC Educational Resources Information Center

    Nowaczyk, Ronald H.; James, E. Christopher

    1993-01-01

    Investigates the need to include computer screens in documentation for software using a graphical user interface. Describes the uses and purposes of "minimal manuals" and their principles. Studies student reaction to their use of one of three on-screen manuals: screens, icon, and button. Finds some benefit for including icon and button…

  14. Fundamental principles of energy consumption for gene expression

    NASA Astrophysics Data System (ADS)

    Huang, Lifang; Yuan, Zhanjiang; Yu, Jianshe; Zhou, Tianshou

    2015-12-01

    How energy is consumed in gene expression is largely unknown, mainly due to the complexity of the non-equilibrium mechanisms affecting expression levels. Here, by analyzing a representative gene model that accounts for this complexity, we show that negative feedback increases energy consumption but positive feedback has the opposite effect; promoter leakage always reduces energy consumption; generating more bursts consumes more energy; and faster promoter switching comes at the cost of higher energy consumption. We also find that the relationship between energy consumption and expression noise is multi-mode, depending on both the type of feedback and the speed of promoter switching. Altogether, these results constitute fundamental principles of energy consumption for gene expression, which lay a foundation for designing biologically reasonable gene modules. In addition, we discuss possible biological implications of these principles by combining experimental facts.

  15. Rigorous force field optimization principles based on statistical distance minimization

    SciTech Connect

    Vlcek, Lukas; Chialvo, Ariel A.

    2015-10-14

    We use the concept of statistical distance to define a measure of distinguishability between a pair of statistical mechanical systems, i.e., a model and its target, and show that its minimization leads to general convergence of the model’s static measurable properties to those of the target. We exploit this feature to define a rigorous basis for the development of accurate and robust effective molecular force fields that are inherently compatible with coarse-grained experimental data. The new model optimization principles and their efficient implementation are illustrated through selected examples, whose outcome demonstrates the higher robustness and predictive accuracy of the approach compared to other currently used methods, such as force matching and relative entropy minimization. We also discuss relations between the newly developed principles and established thermodynamic concepts, which include the Gibbs-Bogoliubov inequality and the thermodynamic length.
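
    A minimal sketch of the fitting idea, assuming the Bhattacharyya-angle form of the statistical distance between sampled histograms; the one-parameter Gaussian "model" and "target" below are invented for illustration and are not the force-field systems studied in the paper.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(1)
    bins = np.linspace(-5, 5, 41)
    base = rng.normal(0.0, 1.0, 50_000)  # reusable standard-normal draws
    target = np.histogram(rng.normal(0.7, 1.2, 50_000), bins=bins)[0]
    target = target / target.sum()

    def statistical_distance(width):
        """Bhattacharyya angle between the one-parameter model and the target."""
        model = np.histogram(0.7 + width * base, bins=bins)[0]
        model = model / model.sum()
        overlap = np.sum(np.sqrt(model * target))
        return np.arccos(np.clip(overlap, 0.0, 1.0))

    res = minimize_scalar(statistical_distance, bounds=(0.5, 3.0), method="bounded")
    print(f"fitted model width ~ {res.x:.2f} (target width 1.2)")
    ```

    Driving the distinguishability of the two histograms to its minimum recovers the target parameter, which is the convergence property the abstract describes for measurable properties in general.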

  16. Waste heat boiler optimization by entropy minimization principle

    SciTech Connect

    Reddy, B.V.; Murali, J.; Satheesh, V.S.; Nag, P.K.

    1996-12-31

    A second law analysis has been undertaken for a waste heat boiler having an economizer, evaporator and superheater. Following the principle of minimization of entropy generation, a general equation for the entropy generation number is derived, which incorporates all the operating variables. By differentiating the entropy generation number equation with respect to the operating parameters, various optimization parameters can be obtained. A few illustrations are given to show the effect of various parameters on the entropy generation number.
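
    As a hedged illustration of the differentiation step (not the paper's boiler model), the sketch below minimizes total entropy generation over one design parameter, assuming small temperature differences so that each boiler section contributes roughly Sgen_i ≈ Q_i²/(UA_i·T_i²); all numbers are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Fixed heat duties picked up in two boiler sections (economizer, evaporator)
    Q = np.array([40e3, 60e3])    # W, hypothetical
    T = np.array([450.0, 550.0])  # K, mean section temperatures, hypothetical
    UA_total = 5000.0             # W/K of heat-exchanger conductance to allocate

    def S_gen(ua1):
        """Total entropy generation for a given conductance split."""
        ua = np.array([ua1, UA_total - ua1])
        # Small-dT approximation: Sgen ~ Q*dT/T^2 with dT = Q/UA per section
        return np.sum(Q**2 / (ua * T**2))

    res = minimize_scalar(S_gen, bounds=(100.0, UA_total - 100.0), method="bounded")
    print(f"entropy-optimal economizer conductance ~ {res.x:.0f} W/K of {UA_total:.0f}")
    ```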

  17. Negative-Refraction Metamaterials: Fundamental Principles and Applications

    NASA Astrophysics Data System (ADS)

    Eleftheriades, G. V.; Balmain, K. G.

    2005-06-01

    Learn about the revolutionary new technology of negative-refraction metamaterials. Negative-Refraction Metamaterials: Fundamental Principles and Applications introduces artificial materials that support the unusual electromagnetic property of negative refraction. Readers will discover several classes of negative-refraction materials along with their exciting, groundbreaking applications, such as lenses and antennas, imaging with super-resolution, microwave devices, dispersion-compensating interconnects, radar, and defense. The book begins with a chapter describing the fundamentals of isotropic metamaterials in which a negative index of refraction is defined. In the following chapters, the text builds on the fundamentals by describing a range of useful microwave devices and antennas. Next, a broad spectrum of exciting new research and emerging applications is examined, including:
    - Theory and experiments behind a super-resolving, negative-refractive-index transmission-line lens
    - 3-D transmission-line metamaterials with a negative refractive index
    - Numerical simulation studies of negative refraction of Gaussian beams and associated focusing phenomena
    - Unique advantages and theory of shaped lenses made of negative-refractive-index metamaterials
    - A new type of transmission-line metamaterial that is anisotropic and supports the formation of sharp steerable beams (resonance cones)
    - Implementations of negative-refraction metamaterials at optical frequencies
    - Unusual propagation phenomena in metallic waveguides partially filled with negative-refractive-index metamaterials
    - Metamaterials in which the refractive index and the underlying group velocity are both negative
    This work brings together the best minds in this cutting-edge field. It is fascinating reading for scientists, engineers, and graduate-level students in physics, chemistry, materials science, photonics, and electrical engineering.

  18. Randomness Amplification under Minimal Fundamental Assumptions on the Devices

    NASA Astrophysics Data System (ADS)

    Ramanathan, Ravishankar; Brandão, Fernando G. S. L.; Horodecki, Karol; Horodecki, Michał; Horodecki, Paweł; Wojewódka, Hanna

    2016-12-01

    Recently, a physically realistic protocol amplifying the randomness of Santha-Vazirani sources to produce cryptographically secure random bits was proposed; however, for reasons of practical relevance, the crucial question remained open as to whether this can be accomplished under the minimal conditions necessary for the task. Namely, is it possible to achieve randomness amplification using only two no-signaling components and in a situation where the violation of a Bell inequality only guarantees that some outcomes of the device for specific inputs exhibit randomness? Here, we solve this question and present a device-independent protocol for randomness amplification of Santha-Vazirani sources using a device consisting of two no-signaling components. We show that the protocol can amplify any such source that is not fully deterministic into a fully random source while tolerating a constant noise rate, and we prove the composable security of the protocol against general no-signaling adversaries. Our main innovation is the proof that even the partial randomness certified by the two-party Bell test [a single input-output pair (u*, x*) for which the conditional probability P(x*|u*) is bounded away from 1 for all no-signaling strategies that optimally violate the Bell inequality] can be used for amplification. We introduce the methodology of a partial tomographic procedure on the empirical statistics obtained in the Bell test that ensures that the outputs constitute a linear min-entropy source of randomness. As a technical novelty that may be of independent interest, we prove that the Santha-Vazirani source satisfies an exponential concentration property given by a recently discovered generalized Chernoff bound.

  19. Molecular diagnosis of B- and T-cell lymphomas: fundamental principles and clinical applications.

    PubMed

    Rezuke, W N; Abernathy, E C; Tsongalis, G J

    1997-10-01

    Molecular diagnostic assays have become routine in the evaluation of lymphoid malignancies. Both Southern transfer and polymerase chain reaction (PCR) technologies are used to assess for B- and T-cell clonality, the presence of rearrangements involving protooncogenes such as bcl-1 and bcl-2, and the monitoring of minimal residual disease. We review the fundamentals of B- and T-cell ontogeny as well as the basic principles of the Southern transfer and PCR assays and their applications to the diagnosis of lymphoid malignancies.

  20. Holographic fluctuations and the principle of minimal complexity

    NASA Astrophysics Data System (ADS)

    Chemissany, Wissam; Osborne, Tobias J.

    2016-12-01

    We discuss, from a quantum information perspective, recent proposals of Maldacena, Ryu, Takayanagi, van Raamsdonk, Swingle, and Susskind that spacetime is an emergent property of the quantum entanglement of an associated boundary quantum system. We review the idea that the informational principle of minimal complexity determines a dual holographic bulk spacetime from a minimal quantum circuit U preparing a given boundary state from a trivial reference state. We describe how this idea may be extended to determine the relationship between the fluctuations of the bulk holographic geometry and the fluctuations of the boundary low-energy subspace. In this way we obtain, for every quantum system, an Einstein-like equation of motion for what might be interpreted as a bulk gravity theory dual to the boundary system.

  1. Application of trajectory optimization principles to minimize aircraft operating costs

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.; Morello, S. A.; Erzberger, H.

    1979-01-01

    This paper summarizes various applications of trajectory optimization principles that have been or are being devised by both government and industrial researchers to minimize aircraft direct operating costs (DOC). These costs (time and fuel) are computed for aircraft constrained to fly over a fixed range. Optimization theory is briefly outlined, and specific algorithms which have resulted from application of this theory are described. Typical results that demonstrate the use of these algorithms and the potential savings they can produce are given. Finally, the need for further trajectory optimization research is presented.
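
    A minimal sketch of the underlying trade-off, assuming a hypothetical fuel-flow model: over a fixed range, time-related cost falls with cruise speed while fuel cost rises, so direct operating cost has an interior minimum. None of the constants below come from the paper.

    ```python
    from scipy.optimize import minimize_scalar

    # Toy direct-operating-cost model for a fixed-range cruise segment.
    R = 2000.0       # km, fixed range
    c_time = 3000.0  # $/h, time-related cost (crew, maintenance) - hypothetical
    c_fuel = 0.9     # $/kg of fuel - hypothetical

    def fuel_flow(v):
        """kg/h; hypothetical model: one term falls with speed, one rises."""
        return 4e8 / v**2 + 0.01 * v**2

    def doc(v):
        t = R / v  # block time, h
        return c_time * t + c_fuel * fuel_flow(v) * t

    res = minimize_scalar(doc, bounds=(500.0, 950.0), method="bounded")
    print(f"cost-optimal cruise speed ~ {res.x:.0f} km/h, DOC ~ ${doc(res.x):,.0f}")
    ```

    Varying the ratio of time cost to fuel cost shifts the optimum speed, which is the cost-index idea the trajectory optimization algorithms exploit.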

  2. Fundamental limits on transparency: first-principles calculations of absorption

    NASA Astrophysics Data System (ADS)

    Peelaers, Hartwin

    2013-03-01

    Transparent conducting oxides (TCOs) are a technologically important class of materials with applications ranging from solar cells, displays, smart windows, and touch screens to light-emitting diodes. TCOs combine high conductivity, provided by a high concentration of electrons in the conduction band, with transparency in the visible region of the spectrum. The requirement of transparency is usually tied to the band gap being sufficiently large to prevent absorption of visible photons. This is a necessary but not sufficient condition: indeed, the high concentration of free carriers can also lead to optical absorption by excitation of electrons to higher conduction-band states. A fundamental understanding of the factors that limit transparency in TCOs is essential for further progress in materials and applications. The Drude theory is widely used, but it is phenomenological in nature and tends to work poorly at shorter wavelengths, where band-structure effects are important. First-principles calculations have been performed, but were limited to direct transitions; as we show in the present work, indirect transitions assisted by phonons or defects actually dominate. Our calculations are the first to address indirect free-carrier absorption in a TCO completely from first principles. We present results for SnO2, but the methodology is general and is also being applied to ZnO and In2O3. The calculations provide not just quantitative results but also deeper insights in the mechanisms that govern absorption processes in different wavelength regimes, which is essential for engineering improved materials to be used in more efficient devices. For SnO2, we find that absorption is modest in the visible, and much stronger in the ultraviolet and infrared. Work performed in collaboration with E. Kioupakis and C.G. Van de Walle, and supported by DOE, NSF, and BAEF.

  3. Ultrafast mid-IR laser scalpel: protein signals of the fundamental limits to minimally invasive surgery.

    PubMed

    Amini-Nik, Saeid; Kraemer, Darren; Cowan, Michael L; Gunaratne, Keith; Nadesan, Puviindran; Alman, Benjamin A; Miller, R J Dwayne

    2010-09-28

    Lasers have in principle the capability to cut at the level of a single cell, the fundamental limit to minimally invasive procedures and restructuring biological tissues. To date, this limit has not been achieved due to collateral damage on the macroscale that arises from thermal and shock-wave-induced damage of surrounding tissue. Here, we report on a novel concept using a specifically designed Picosecond IR Laser (PIRL) that selectively energizes water molecules in the tissue to drive the ablation or cutting process faster than the thermal exchange of energy and shock wave propagation, without plasma formation or ionizing radiation effects. The targeted laser process imparts the least amount of energy in the remaining tissue without any of the deleterious photochemical or photothermal effects that accompany other laser wavelengths and pulse parameters. Full-thickness incisional and excisional wounds were generated in CD1 mice using the Picosecond IR Laser, a conventional surgical laser (DELight Er:YAG), or mechanical surgical tools. Transmission and scanning electron microscopy showed that the PIRL produced minimal tissue ablation with less damage to surrounding tissues than wounds formed using the other modalities. The width of scars formed by wounds made with the PIRL was half that of the scars produced using either a conventional surgical laser or a scalpel. Aniline blue staining showed higher levels of collagen in the early stage of the wounds produced using the PIRL, suggesting that these wounds mature faster. There were more viable cells extracted from skin using the PIRL, suggesting less cellular damage. β-catenin and TGF-β signalling, which are activated during the proliferative phase of wound healing and whose levels of activation correlate with the size of wounds, were lower in wounds generated by the PIRL system. Wounds created with the PIRL system also showed a lower rate of cell proliferation. Direct comparison of wound

  4. The Virtual Exploratorium: Connecting Student-constructed Visualization to Fundamental Geophysical Principles

    NASA Astrophysics Data System (ADS)

    Pandya, R.; Bramer, D.; Elliott, D.; Hay, K.; Marlino, M.; Middleton, D.; Ramamurthy, M.; Scheiltin, T.; Wilhelmson, R.

    2001-05-01

    The Virtual Exploratorium (VE) is a web-based inquiry environment in which learners use authentic data sets and scientific tools to build their own visualizations of geophysical phenomena. Learner-constructed visualizations not only give the learner a more robust conception than passive visualization; they can also guide inquiry toward fundamental physical principles. In the VE, learners will explore these fundamental physical principles using concept models. Concept models are Java-based applications for discovering fundamental principles in simplified and idealized settings. Learners will use probes to connect the principles they discover to the complex geophysical phenomenon they visualized. Probes are similar to the concept models, except that they respond to the environment of the student-constructed visualization rather than simplified idealized environments. The probes help learners contextualize the fundamental principles and bridge the gap between complex realistic visualization with multiple processes and the handful of underlying physical principles common across the geosciences.

  5. Fundamental concepts of effective troponin use: important principles for internists.

    PubMed

    Sara, Jaskanwal D S; Holmes, David R; Jaffe, Allan S

    2015-02-01

    Troponin testing is an essential component of our diagnostic approach to patients in acute medical care settings. With the advent of high-sensitivity troponin assays, its importance will extend to patients in chronic disease settings. Although elevated troponin levels provide diagnostic information, inform treatment decisions, and influence patient prognosis, proper interpretation of the values is essential. This requires an understanding of the operating characteristics of troponin testing: the likelihood ratios associated with a positive/negative test result and the pre- and post-test probabilities related to individual clinical settings. These principles will become more important as high-sensitivity assays are introduced over the coming years in the United States. This article reviews the important principles of troponin testing, focusing in particular on acute settings, and is aimed at internal medicine and hospital specialists.
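
    The likelihood-ratio arithmetic mentioned above is easy to make concrete. The sketch below applies Bayes' theorem in odds form; the pre-test probabilities and the likelihood ratio of 10 are hypothetical round numbers, not values from the article.

    ```python
    def post_test_probability(pre_test_p, likelihood_ratio):
        """Convert pre-test probability -> odds, apply the LR, convert back."""
        pre_odds = pre_test_p / (1 - pre_test_p)
        post_odds = pre_odds * likelihood_ratio
        return post_odds / (1 + post_odds)

    # The same positive result (LR+ = 10) means very different things
    # in low- vs. high-pre-test-probability settings.
    for pre in (0.02, 0.30, 0.70):
        print(f"pre-test {pre:.0%} -> post-test {post_test_probability(pre, 10):.0%}")
    ```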

  6. Fundamental Principles of Classical Mechanics: A Geometrical Perspective

    NASA Astrophysics Data System (ADS)

    Lam, Kai S.

    2014-07-01

    Classical mechanics is the quantitative study of the laws of motion for macroscopic physical systems with mass. The fundamental laws of this subject, known as Newton's Laws of Motion, are expressed in terms of second-order differential equations governing the time evolution of vectors in a so-called configuration space of a system (see Chapter 12). In an elementary setting, these are usually vectors in 3-dimensional Euclidean space, such as position vectors of point particles; but typically they can be vectors in higher dimensional and more abstract spaces. A general knowledge of the mathematical properties of vectors, not only in their most intuitive incarnations as directed arrows in physical space but as elements of abstract linear vector spaces, and those of linear operators (transformations) on vector spaces as well, is then indispensable in laying the groundwork for both the physical and the more advanced mathematical, more precisely topological and geometrical, concepts that will prove to be vital in our subject. In this beginning chapter we will review these properties, and introduce the all-important related notions of dual spaces and tensor products of vector spaces. The notational convention for vectorial and tensorial indices used for the rest of this book (except when otherwise specified) will also be established...

  7. Fundamental Principles of Network Formation among Preschool Children

    PubMed Central

    Schaefer, David R.; Light, John M.; Fabes, Richard A.; Hanish, Laura D.; Martin, Carol Lynn

    2009-01-01

    The goal of this research was to investigate the origins of social networks by examining the formation of children’s peer relationships in 11 preschool classes throughout the school year. We investigated whether several fundamental processes of relationship formation were evident at this age, including reciprocity, popularity, and triadic closure effects. We expected these mechanisms to change in importance over time as the network crystallizes, allowing more complex structures to evolve from simpler ones in a process we refer to as structural cascading. We analyzed intensive longitudinal observational data of children’s interactions using the SIENA actor-based model. We found evidence that reciprocity, popularity, and triadic closure all shaped the formation of preschool children’s networks. The influence of reciprocity remained consistent, whereas popularity and triadic closure became increasingly important over the course of the school year. Interactions between age and endogenous network effects were nonsignificant, suggesting that these network formation processes were not moderated by age in this sample of young children. We discuss the implications of our longitudinal network approach and findings for the study of early network developmental processes. PMID:20161606
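
    As an illustration of the network indices discussed above, the sketch below computes reciprocity, triadic closure, and an in-degree dispersion (a crude popularity signal) directly from a random directed adjacency matrix; this is descriptive bookkeeping only, not the SIENA actor-based estimation used in the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 20
    A = (rng.random((n, n)) < 0.15).astype(int)  # directed ties, density ~0.15
    np.fill_diagonal(A, 0)

    edges = A.sum()
    reciprocity = (A * A.T).sum() / edges           # fraction of ties returned

    paths = (A @ A) * (1 - np.eye(n))               # two-step paths i->j->k, i != k
    closure = (paths * A).sum() / paths.sum()       # fraction closed by a direct tie

    indeg = A.sum(axis=0)                           # popularity: spread of in-degrees
    dispersion = indeg.var() / indeg.mean()

    print(f"reciprocity ~ {reciprocity:.2f}, triadic closure ~ {closure:.2f}, "
          f"in-degree dispersion ~ {dispersion:.2f}")
    ```

    Tracking how such indices drift over repeated observations is the intuition behind the structural cascading the authors describe, although the actor-based model estimates the underlying mechanisms rather than the raw indices.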

  8. The Minimal Control Principle Predicts Strategy Shifts in the Abstract Decision Making Task

    ERIC Educational Resources Information Center

    Taatgen, Niels A.

    2011-01-01

    The minimal control principle (Taatgen, 2007) predicts that people strive for problem-solving strategies that require as few internal control states as possible. In an experiment with the Abstract Decision Making task (ADM task; Joslyn & Hunt, 1998), the reward structure was manipulated to make either a low-control strategy or a high-control strategy…

  9. Real processing (RP) I: The principle of minimal entropy production (PME) of irreversible thermodynamics and the principle of minimal deformation (PMD) of hydrodynamics, their dependence and applications

    NASA Astrophysics Data System (ADS)

    Reiser, Bernhard

    1996-02-01

    The principle of minimal entropy production (PME) of irreversible thermodynamics is generalized to determine process parameters in process engineering with well-known mathematical methods. This useful instrument, applied to industrial processes, is called real processing (RP). A special form of the PME is the principle of minimal deformation (PMD), which allows for applications in hydrodynamics (HD). A second and independent derivation of the PMD proceeds along lines similar to the derivation of the statement of Helmholtz and Rayleigh (SHR); the generalization of the SHR then leads to the PMD derived within HD. In a similar way, starting directly from the Navier-Stokes equation (NSE) leads to the PMD in a third way. Several applications of the PMD are given: an analytical and numerical application of the PMD is given for the entrance flow of tubes. Physical and analytical applications are the Crocco-Vazsonyi-type equations, which open new possibilities for the analytical treatment of process engineering problems. Process engineering models may be replaced by applications of the PMD; in particular, turbulence may be treated as nature's answer to the PMD. A basic mathematical treatment of these subjects is possible by the gradient field theory (GFT), a particular method of vector analysis stemming from the Clebsch Ansatz for vector fields, which can be ordered from the author together with an advanced, detailed mathematical treatment of these subjects.

  10. SU(2) gauge theory with two fundamental flavors: A minimal template for model building

    NASA Astrophysics Data System (ADS)

    Arthur, Rudy; Drach, Vincent; Hansen, Martin; Hietanen, Ari; Pica, Claudio; Sannino, Francesco

    2016-11-01

    We investigate the continuum spectrum of the SU(2) gauge theory with N_f = 2 flavors of fermions in the fundamental representation. This model provides a minimal template which is ideal for a wide class of Standard Model extensions featuring novel strong dynamics that range from composite (Goldstone) Higgs theories to several intriguing types of dark matter candidates, such as the strongly interacting massive particles (SIMPs). We improve our previous lattice analysis [1] by adding more data at light quark masses, at two additional lattice spacings, by determining the lattice cutoff via a Wilson flow measure of the w_0 parameter, and by measuring the relevant renormalization constants nonperturbatively in the regularization-invariant momentum (RI'-MOM) scheme. Our result for the lightest isovector state in the vector channel, in units of the pseudoscalar decay constant, is m_V/F_PS ≈ 13.1(2.2) (combining statistical and systematic errors). For the axial channel our result is m_A/F_PS ≈ 14.5(3.6), which however does include a similarly sized additional systematic error due to residual excited-states contamination. In the context of the composite (Goldstone) Higgs models, our results for the spin-one resonances are m_V > 3.2(5) TeV and m_A > 3.6(9) TeV, which are above the current LHC constraints. In the context of dark matter models, for the SIMP case our results indicate the occurrence of a compressed spectrum at the required large dark pion mass, which implies the need to include the effects of spin-one resonances in phenomenological estimates.

  11. The uncertainty threshold principle - Fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1976-01-01

    The fundamental limitations of the optimal control of dynamic systems with random parameters are analyzed by studying a scalar linear-quadratic optimal control example. It is demonstrated that optimum long-range decision making is possible only if the dynamic uncertainty (quantified by the means and covariances of the random parameters) is below a certain threshold. If this threshold is exceeded, there do not exist optimum decision rules. This phenomenon is called the 'uncertainty threshold principle'. The implications of this phenomenon to the field of modelling, identification, and adaptive control are discussed.
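
    A toy scalar version of the threshold phenomenon can be seen by iterating a stochastic Riccati-type recursion for the expected cost-to-go coefficient; the recursion form below is a sketch under the stated scalar linear-quadratic assumptions, and the moment values and weights are hypothetical. The blow-up when E[a²] − E[ab]²/E[b²] exceeds 1 mirrors the uncertainty threshold principle.

    ```python
    def cost_to_go_coeff(Ea2, Eab, Eb2, Q=1.0, R=1.0, steps=60):
        """Iterate a scalar stochastic Riccati-type recursion backwards:
        K <- Q + E[a^2]*K - (E[ab]*K)**2 / (R + E[b^2]*K).
        For x' = a*x + b*u with random (a, b), K is the expected
        cost-to-go coefficient under the optimal decision rule."""
        K = Q
        for _ in range(steps):
            K = Q + Ea2 * K - (Eab * K) ** 2 / (R + Eb2 * K)
        return K

    # With E[ab] = E[b^2] = 1, the threshold quantity E[a^2] - E[ab]^2/E[b^2]
    # equals E[a^2] - 1; the cost stays bounded only while it is below 1.
    for Ea2 in (1.5, 3.0):
        K = cost_to_go_coeff(Ea2, 1.0, 1.0)
        print(f"E[a^2]={Ea2}: threshold qty={Ea2 - 1:.1f}, K after 60 steps={K:.3e}")
    ```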

  12. Research on the fundamental principles of China's marine invasive species prevention legislation.

    PubMed

    Bai, Jiayu

    2014-12-15

    China's coastal area is severely damaged by marine invasive species. Traditional tort theory resolves issues relevant to property damage or personal injuries, through which plaintiffs cannot cope with the ecological damage caused by marine invasive species. Several defects exist within the current legal regimes, such as imperfect management systems, insufficient unified technical standards, and unsound legal responsibility systems. It is necessary to pass legislation to prevent the ecological damage caused by marine invasive species. This investigation probes the fundamental principles needed for the administration and legislation of an improved legal framework to combat the problem of invasive species within China's coastal waters.

  13. Astronomical Tests of Relativity: Beyond Parameterized Post-Newtonian Formalism (PPN), to Testing Fundamental Principles

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik

    2009-05-01

    By the early 1970s, the improved accuracy of astrometric and time measurements enabled researchers not only to experimentally compare relativistic gravity with the Newtonian predictions, but also to compare different relativistic gravitational theories (e.g., the Brans-Dicke Scalar-Tensor Theory of Gravitation). For this comparison, Kip Thorne and others developed the Parameterized Post-Newtonian Formalism (PPN), and derived the dependence of different astronomically observable effects on the values of the corresponding parameters. Since then, all the observations have confirmed General Relativity. In other words, the question of which relativistic gravitation theory is in the best accordance with the experiments has been largely settled. This does not mean that General Relativity is the final theory of gravitation: it needs to be reconciled with quantum physics (into quantum gravity), it may also need to be reconciled with numerous surprising cosmological observations, etc. It is therefore reasonable to prepare an extended version of the PPN formalism, that will enable us to test possible quantum-related modifications of General Relativity. In particular, we need to include the possibility of violating fundamental principles that underlie the PPN formalism but that may be violated in quantum physics, such as scale-invariance, T-invariance, P-invariance, energy conservation, spatial isotropy violations, etc. In this talk, we present the first attempt to design the corresponding extended PPN formalism, with the (partial) analysis of the relation between the corresponding fundamental physical principles.

  14. Driving an Active Vibration Balancer to Minimize Vibrations at the Fundamental and Harmonic Frequencies

    NASA Technical Reports Server (NTRS)

    Holliday, Ezekiel S. (Inventor)

    2014-01-01

    Vibrations of a principal machine are reduced at the fundamental and harmonic frequencies by driving the drive motor of an active balancer with balancing signals at the fundamental and selected harmonics. Vibrations are sensed to provide a signal representing the mechanical vibrations. A balancing signal generator for the fundamental and for each selected harmonic processes the sensed vibration signal with adaptive filter algorithms of adaptive filters for each frequency to generate a balancing signal for each frequency. Reference inputs for each frequency are applied to the adaptive filter algorithms of each balancing signal generator at the frequency assigned to the generator. The harmonic balancing signals for all of the frequencies are summed and applied to drive the drive motor. The harmonic balancing signals drive the drive motor with a drive voltage component in opposition to the vibration at each frequency.
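
    A minimal sketch of the adaptive scheme described above, assuming simple LMS updates on sine/cosine reference pairs at the fundamental and two harmonics; frequencies, amplitudes, and the step size are hypothetical, and the actuator dynamics of a real balancer are omitted.

    ```python
    import numpy as np

    fs, f0 = 2000, 25                 # sample rate (Hz), fundamental (Hz)
    t = np.arange(0, 10, 1 / fs)
    # "Sensed" vibration: fundamental plus 2nd and 3rd harmonics.
    d = (1.0 * np.sin(2 * np.pi * f0 * t)
         + 0.5 * np.sin(2 * np.pi * 2 * f0 * t + 0.4)
         + 0.3 * np.sin(2 * np.pi * 3 * f0 * t))

    mu = 0.01
    weights = {k: np.zeros(2) for k in (1, 2, 3)}  # sin/cos weights per harmonic
    residual = np.empty_like(d)

    for n, tn in enumerate(t):
        refs = {k: np.array([np.sin(2 * np.pi * k * f0 * tn),
                             np.cos(2 * np.pi * k * f0 * tn)]) for k in weights}
        y = sum(weights[k] @ refs[k] for k in weights)  # summed balancing signal
        e = d[n] - y                                    # what the sensor still sees
        residual[n] = e
        for k in weights:                               # LMS update per frequency
            weights[k] += 2 * mu * e * refs[k]

    print(f"RMS residual, first second: {np.sqrt(np.mean(residual[:fs]**2)):.3f}")
    print(f"RMS residual, last second:  {np.sqrt(np.mean(residual[-fs:]**2)):.3f}")
    ```

    Each frequency gets its own pair of adapted weights, echoing the patent's one-balancing-signal-generator-per-harmonic structure; the summed output drives the residual vibration toward zero.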

  15. Minimization principles for the coupled problem of Darcy-Biot-type fluid transport in porous media linked to phase field modeling of fracture

    NASA Astrophysics Data System (ADS)

    Miehe, Christian; Mauthe, Steffen; Teichtmeister, Stephan

    2015-09-01

    This work develops new minimization and saddle point principles for the coupled problem of Darcy-Biot-type fluid transport in porous media at fracture. It shows that the quasi-static problem of elastically deforming, fluid-saturated porous media is related to a minimization principle for the evolution problem. This two-field principle determines the rate of deformation and the fluid mass flux vector. It provides a canonically compact model structure, where the stress equilibrium and the inverse Darcy's law appear as the Euler equations of a variational statement. A Legendre transformation of the dissipation potential relates the minimization principle to a characteristic three field saddle point principle, whose Euler equations determine the evolutions of deformation and fluid content as well as Darcy's law. A further geometric assumption results in modified variational principles for a simplified theory, where the fluid content is linked to the volumetric deformation. The existence of these variational principles underlines inherent symmetries of Darcy-Biot theories of porous media. This can be exploited in the numerical implementation by the construction of time- and space-discrete variational principles, which fully determine the update problems of typical time stepping schemes. Here, the proposed minimization principle for the coupled problem is advantageous with regard to a new unconstrained stable finite element design, while space discretizations of the saddle point principles are constrained by the LBB condition. The variational principles developed provide the most fundamental approach to the discretization of nonlinear fluid-structure interactions, showing symmetric systems in algebraic update procedures. They also provide an excellent starting point for extensions towards more complex problems. This is demonstrated by developing a minimization principle for a phase field description of fracture in fluid-saturated porous media. It is designed for an

  16. Contemporary extracorporeal membrane oxygenation therapy in adults: Fundamental principles and systematic review of the evidence.

    PubMed

    Squiers, John J; Lima, Brian; DiMaio, J Michael

    2016-07-01

    Extracorporeal membrane oxygenation (ECMO) provides days to weeks of support for patients with respiratory, cardiac, or combined cardiopulmonary failure. Since ECMO was first reported in 1974, nearly 70,000 runs of ECMO have been implemented, and the use of ECMO in adults increased by more than 400% from 2006 to 2011 in the United States. A variety of factors, including the 2009 influenza A epidemic, results from recent clinical trials, and improvements in ECMO technology, have motivated this increased use in adults. Because ECMO is increasingly becoming available to a diverse population of critically ill patients, we provide an overview of its fundamental principles and a systematic review of the evidence basis of this treatment modality for a variety of indications in adults.

  17. Position-sensitive detection of slow neutrons: Survey of fundamental principles

    SciTech Connect

    Crawford, R.K.

    1992-01-01

    This paper sets forth the fundamental principles governing the development of position-sensitive detection systems for slow neutrons. Since neutrons interact only weakly with most materials, it is not generally practical to detect slow neutrons directly. Therefore, all practical slow neutron detection mechanisms depend on the use of nuclear reactions to "convert" the neutron to one or more charged particles, followed by the subsequent detection of the charged particles. The different conversion reactions which can be used are discussed, along with the relative merits of each. This is followed by a discussion of the various methods of charged particle detection, how these lend themselves to position-sensitive encoding, and the means of position encoding which can be applied in each case. Detector performance characteristics which may be of importance to the end user are discussed and related to these various detection and position-encoding mechanisms.

  18. Position-sensitive detection of slow neutrons: Survey of fundamental principles

    SciTech Connect

    Crawford, R.K.

    1992-07-01

    This paper sets forth the fundamental principles governing the development of position-sensitive detection systems for slow neutrons. Since neutrons interact only weakly with most materials, it is not generally practical to detect slow neutrons directly. Therefore, all practical slow neutron detection mechanisms depend on the use of nuclear reactions to "convert" the neutron to one or more charged particles, followed by the subsequent detection of the charged particles. The different conversion reactions which can be used are discussed, along with the relative merits of each. This is followed by a discussion of the various methods of charged particle detection, how these lend themselves to position-sensitive encoding, and the means of position encoding which can be applied in each case. Detector performance characteristics which may be of importance to the end user are discussed and related to these various detection and position-encoding mechanisms.

  19. The fundamental operating principles of electronic root canal length measurement devices.

    PubMed

    Nekoofar, M H; Ghandi, M M; Hayes, S J; Dummer, P M H

    2006-08-01

    It is generally accepted that root canal treatment procedures should be confined within the root canal system. To achieve this objective the canal terminus must be detected accurately during canal preparation and precise control of working length during the process must be maintained. Several techniques have been used for determining the apical canal terminus including electronic methods. However, the fundamental electronic operating principles and classification of the electronic devices used in this method are often unknown and a matter of controversy. The basic assumption with all electronic length measuring devices is that human tissues have certain characteristics that can be modelled by a combination of electrical components. Therefore, by measuring the electrical properties of the model, such as resistance and impedance, it should be possible to detect the canal terminus. The root canal system is surrounded by dentine and cementum that are insulators to electrical current. At the minor apical foramen, however, there is a small hole in which conductive materials within the canal space (tissue, fluid) are electrically connected to the periodontal ligament that is itself a conductor of electric current. Thus, dentine, along with tissue and fluid inside the canal, forms a resistor, the value of which depends on their dimensions, and their inherent resistivity. When an endodontic file penetrates inside the canal and approaches the minor apical foramen, the resistance between the endodontic file and the foramen decreases, because the effective length of the resistive material (dentine, tissue, fluid) decreases. As well as resistive properties, the structure of the tooth root has capacitive characteristics. Therefore, various electronic methods have been developed that use a variety of other principles to detect the canal terminus. Whilst the simplest devices measure resistance, other devices measure impedance using either high frequency, two frequencies, or
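
    The resistive part of the model described above reduces to R = ρL/A for the conductive column between the file tip and the minor apical foramen. The sketch below is a toy calculation with hypothetical values for the resistivity, cross-section, and working length, showing the resistance falling as the file advances.

    ```python
    # Toy version of the resistive model described above: the conductive
    # column ahead of the file tip shrinks as the file advances apically.
    RHO = 70.0    # ohm*cm, effective resistivity of canal contents (hypothetical)
    AREA = 0.002  # cm^2, effective canal cross-section (hypothetical)
    CANAL = 2.1   # cm, length from reference point to the minor apical foramen

    for depth in (1.0, 1.5, 2.0, 2.09):
        remaining = CANAL - depth      # conductive column still ahead of the tip
        R = RHO * remaining / AREA     # R = rho * L / A
        print(f"file at {depth:.2f} cm -> R ~ {R:,.0f} ohm")
    ```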

  20. Unleashing the Power of Microcanonical Inflection-Point Analysis: The Principle of Minimal Sensitivity

    NASA Astrophysics Data System (ADS)

    Qi, Kai; Bachmann, Michael

    2015-03-01

    In analogy to the principle of minimal sensitivity proposed by Stevenson for perturbative approaches in quantum field theory, we generalize microcanonical inflection-point analysis by probing higher-order derivatives of the inverse temperature β(E) for signals of transitions in finite complex systems. To illustrate the power of this analysis, we investigate adsorption properties of a simple-cubic lattice polymer model. The pseudophase diagram based on microcanonical inflection-point analysis is constructed. This example confirms the general potential of microcanonical statistical analysis for studies of pseudophase transitions for systems of finite size. Supported by NSF Grant DMR-1207437.
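
    As a toy illustration of the analysis (not the lattice polymer model of the abstract), the sketch below builds a smooth microcanonical entropy with a transition signature, computes β(E) = dS/dE numerically, and locates inflection points of β through sign changes of its second derivative; all parameters are invented.

    ```python
    import numpy as np

    # Toy microcanonical entropy with a smooth transition signature near E0.
    E = np.linspace(1.0, 10.0, 2000)
    E0, a, w = 5.0, 0.8, 0.6                  # hypothetical transition parameters
    S = np.sqrt(E) + a * np.tanh((E - E0) / w)

    beta = np.gradient(S, E)                  # beta(E) = dS/dE
    gamma = np.gradient(beta, E)              # first derivative of beta
    delta = np.gradient(gamma, E)             # second derivative of beta

    # Inflection points of beta(E) are extrema of gamma(E), i.e. zeros of delta.
    flips = np.where(np.diff(np.sign(delta)) != 0)[0]
    print("candidate transition energies:", np.round(E[flips], 2))
    ```

    Probing gamma and higher derivatives in this way is the "higher-order" generalization the abstract refers to; in a real application S(E) comes from the measured density of states rather than a closed form.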

  1. Designing nanomaterials to maximize performance and minimize undesirable implications guided by the Principles of Green Chemistry.

    PubMed

    Gilbertson, Leanne M; Zimmerman, Julie B; Plata, Desiree L; Hutchison, James E; Anastas, Paul T

    2015-08-21

    The Twelve Principles of Green Chemistry were first published in 1998 and provide a framework that has been adopted not only by chemists, but also by design practitioners and decision-makers (e.g., materials scientists and regulators). The development of the Principles was initially motivated by the need to address decades of unintended environmental pollution and human health impacts from the production and use of hazardous chemicals. Yet, for over a decade now, the Principles have been applied to the synthesis and production of engineered nanomaterials (ENMs) and the products they enable. While the combined efforts of the global scientific community have led to promising advances in the field of nanotechnology, there remain significant research gaps and the opportunity to leverage the potential global economic, societal and environmental benefits of ENMs safely and sustainably. As such, this tutorial review benchmarks the successes to date and identifies critical research gaps to be considered as future opportunities for the community to address. A sustainable material design framework is proposed that emphasizes the importance of establishing structure-property-function (SPF) and structure-property-hazard (SPH) relationships to guide the rational design of ENMs. The goal is to achieve or exceed the functional performance of current materials and the technologies they enable, while minimizing inherent hazard to avoid risk to human health and the environment at all stages of the life cycle.

  2. Ultra-high resolution flat-panel volume CT: fundamental principles, design architecture, and system characterization.

    PubMed

    Gupta, Rajiv; Grasruck, Michael; Suess, Christoph; Bartling, Soenke H; Schmidt, Bernhard; Stierstorfer, Karl; Popescu, Stefan; Brady, Tom; Flohr, Thomas

    2006-06-01

    Digital flat-panel-based volume CT (VCT) represents a unique design capable of ultra-high spatial resolution, direct volumetric imaging, and dynamic CT scanning. This innovation, when fully developed, has the promise of opening a unique window on human anatomy and physiology. For example, the volumetric coverage offered by this technology enables us to observe the perfusion of an entire organ, such as the brain, liver, or kidney, tomographically (e.g., after a transplant or ischemic event). By virtue of its higher resolution, one can directly visualize the trabecular structure of bone. This paper describes the basic design architecture of VCT. Three key technical challenges, viz., scatter correction, dynamic range extension, and temporal resolution improvement, must be addressed for successful implementation of a VCT scanner. How these issues are solved in a VCT prototype and the modifications necessary to enable ultra-high resolution volumetric scanning are described. The fundamental principles of scatter correction and dose reduction are illustrated with the help of an actual prototype. The image quality metrics of this prototype are characterized and compared with a multi-detector CT (MDCT).

  3. Mobility analysis tool based on the fundamental principle of conservation of energy.

    SciTech Connect

    Spletzer, Barry Louis; Nho, Hyuchul C.; Salton, Jonathan Robert

    2007-08-01

    In the past decade, a great deal of effort has been focused on the research and development of versatile robotic ground vehicles without understanding their performance in a particular operating environment. As the usage of robotic ground vehicles for intelligence applications increases, understanding the mobility of the vehicles becomes critical to increasing the probability of their successful operations. This paper describes a framework based on conservation of energy to predict the maximum mobility of robotic ground vehicles over general terrain. The basis of the prediction is the difference between traction capability and energy loss at the vehicle-terrain interface. The mission success of a robotic ground vehicle is primarily a function of mobility. Mobility of a vehicle is defined as the overall capability of a vehicle to move from place to place while retaining its ability to perform its primary mission. A mobility analysis tool based on the fundamental principle of conservation of energy, implemented as a graphical user interface application, has been developed at Sandia National Laboratories, Albuquerque, NM. The tool is at an initial stage of development. In the future, the tool will be expanded to include all vehicles and terrain types.
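
    A minimal sketch of the energy-balance criterion described above, assuming a rigid wheeled vehicle on a uniform slope: mobility requires the available traction work to cover the gravity component plus rolling losses per unit distance. The mass and coefficients are hypothetical, not values from the Sandia tool.

    ```python
    import numpy as np

    # Go/no-go slope check from a per-metre energy balance:
    # traction work available vs. gravity work + rolling-resistance loss.
    m, g = 80.0, 9.81   # robot mass (kg) and gravity (m/s^2) - hypothetical
    mu_traction = 0.6   # peak traction coefficient at the soil interface
    c_rr = 0.15         # rolling-resistance coefficient

    def max_negotiable_slope():
        """Largest slope (deg) where traction >= gravity demand + rolling loss."""
        for theta in np.linspace(0.0, 45.0, 4501):
            th = np.radians(theta)
            traction = mu_traction * m * g * np.cos(th)
            demand = m * g * np.sin(th) + c_rr * m * g * np.cos(th)
            if traction < demand:
                return theta
        return 45.0

    print(f"max negotiable slope ~ {max_negotiable_slope():.1f} deg")
    ```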

  4. [Fundamental ethical principles in the European framework programmes for research and development].

    PubMed

    Hirsch, François; Karatzas, Isidoros; Zilgalvis, Pēteris

    2009-01-01

    The European Commission is one of the most important international funding bodies for research conducted in Europe and beyond, including developing countries and countries in transition. Through its framework programmes for research and development, the European Union finances a vast array of projects concerning fields affecting citizens' health, as well as researchers' mobility, the development of new technologies, and the safeguarding of the environment. With the agreement of the European Parliament and of the Council of Ministers, the two decisional authorities of the European Union, the 7th framework programme was started in December 2006. This programme has a budget of 54 billion Euros to be distributed over a 7-year period. The European Union thereby aims to fully address the challenge stated by the European Council of Lisbon (March 2000), which set the goal of devoting 3% of the GDP of all the Member States to research and development. One of the important conditions stated by the Members of the European Parliament for allocating this financing is to ensure that "the funding research activities respect the fundamental ethical principles". In this article, we will approach this aspect of the evaluation.

  5. Simple approach to sediment provenance tracing using element analysis and fundamental principles

    NASA Astrophysics Data System (ADS)

    Matys Grygar, Tomas; Elznicova, Jitka; Popelka, Jan

    2016-04-01

    Common sediment fingerprinting techniques use either (1) extensive analytical datasets, sometimes nearly complete with respect to accessible characterization techniques, processed by multidimensional statistics that rest on assumptions about the distribution functions of the analytical results and the conservativeness/additivity of some components, or (2) analytically demanding characteristics such as isotope ratios, assumed to be unequivocal "labels" of the parent material unaltered by any catchment process. The inherent problem of approach (1) is that the interpretation of statistical components ("sources") is done ex post and remains purely formal. The problem of approach (2) is that catchment processes (weathering, transport, deposition) can modify most geochemical parameters of soils and sediments; in other words, the idea that some geochemical parameters are "conservative" may be idealistic. Grain-size effects and sediment provenance jointly influence the chemical composition of fluvial sediments and are not easy to disentangle. Attempts to separate these two main components using statistics alone seem risky and equivocal, because the grain-size dependence of element composition is nearly individual to each element and reflects sediment maturity and catchment-specific formation and transport processes. We suppose that the use of less extensive analytical datasets, interpreted with respect for fundamental principles, should be more robust than purely statistical tools applied to overwhelming datasets. We examined sediment composition, both published by other researchers and gathered by us, and found some general principles that are in our opinion relevant for fingerprinting: (1) concentrations of all elements are grain-size sensitive, i.e. there are no "conservative" elements in the conventional sense of provenance or transport-pathway tracing; (2) fractionation by catchment processes and fluvial transport changes

  6. Fundamental Assumptions and Aims Underlying the Principles and Policies of Federal Financial Aid to Students. Research Report.

    ERIC Educational Resources Information Center

    Johnstone, D. Bruce

    As background to the National Dialogue on Student Financial Aid, this essay discusses the fundamental assumptions and aims that underlie the principles and policies of federal financial aid to students. These eight assumptions and aims are explored: (1) higher education is the province of states, and not of the federal government; (2) the costs of…

  7. D^{1,2}(R^N) versus C(R^N) local minimizer and a Hopf-type maximum principle

    NASA Astrophysics Data System (ADS)

    Carl, Siegfried; Costa, David G.; Tehrani, Hossein

    2016-08-01

    We consider functionals of the form Φ(u) = (1/2)∫_{R^N} |∇u|^2 - ∫_{R^N} b(x) G(u) on D^{1,2}(R^N), N ≥ 3, whose critical points are the weak solutions of a corresponding elliptic equation in the whole R^N. We present a Brezis-Nirenberg type result and a Hopf-type maximum principle in the context of the space D^{1,2}(R^N). More precisely, we prove that a local minimizer of Φ in the topology of the subspace V must be a local minimizer of Φ in the D^{1,2}(R^N)-topology, where V := { v ∈ D^{1,2}(R^N) : v ∈ C(R^N) with sup_{x∈R^N} (1 + |x|^{N-2}) |v(x)| < ∞ }. It is well known that the Brezis-Nirenberg result has proved to be a strong tool in the study of multiple solutions for elliptic boundary value problems in bounded domains. We believe that the result obtained in this paper may play a similar role for elliptic problems in R^N.

  8. Prediction of metabolic flux distribution from gene expression data based on the flux minimization principle.

    PubMed

    Song, Hyun-Seob; Reifman, Jaques; Wallqvist, Anders

    2014-01-01

    Prediction of possible flux distributions in a metabolic network provides detailed phenotypic information that links metabolism to cellular physiology. To estimate metabolic steady-state fluxes, the most common approach is to solve a set of macroscopic mass balance equations subject to stoichiometric constraints while attempting to optimize an assumed optimal objective function. This assumption is justifiable in specific cases but may be invalid when tested across different conditions, cell populations, or other organisms. With the aim of providing a more consistent and reliable prediction of flux distributions over a wide range of conditions, in this article we propose a framework that uses the flux minimization principle to predict active metabolic pathways from mRNA expression data. The proposed algorithm minimizes a weighted sum of flux magnitudes, while biomass production can be bounded to fit an ample range from very low to very high values according to the analyzed context. We have formulated the flux weights as a function of the corresponding enzymatic reaction's gene expression value, enabling the creation of context-specific fluxes based on a generic metabolic network. In case studies of wild-type Saccharomyces cerevisiae, and wild-type and mutant Escherichia coli strains, our method achieved high prediction accuracy, as gauged by correlation coefficients and sums of squared error, with respect to experimentally measured values. In contrast to other approaches, our method was able to provide quantitative predictions for both model organisms under a variety of conditions. Our approach requires no prior knowledge or assumption of context-specific metabolic functionality and does not require trial-and-error parameter adjustments. Thus, our framework is of general applicability for modeling the transcription-dependent metabolism of bacteria and yeasts.
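
    The core optimization is a linear program once each flux is split into nonnegative forward and reverse parts, so that absolute values become linear. The sketch below illustrates that formulation on an invented three-metabolite toy network; the weights stand in for expression-derived values, and none of it comes from the paper itself.

```python
# Minimal sketch of expression-weighted flux minimization (not the authors' code).
# Toy network: uptake -> A; A -> B via reactions r2 or r3; B -> biomass.
import numpy as np
from scipy.optimize import linprog

S = np.array([[ 1, -1, -1,  0],   # metabolite A balance
              [ 0,  1,  1, -1]])  # metabolite B balance
# Hypothetical expression-derived weights: highly expressed enzymes get
# small weights, so flux is routed preferentially through them.
w = np.array([1.0, 0.5, 2.0, 1.0])

n = S.shape[1]
# Split v = v_plus - v_minus (both >= 0) so that sum(w * |v|) is linear.
c = np.concatenate([w, w])
A_eq = np.hstack([S, -S])          # steady state: S v = 0
b_eq = np.zeros(S.shape[0])
bounds = [(0, None)] * (2 * n)
bounds[3] = (1.0, 1.0)             # pin biomass production (reaction 4) to one unit

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
v = res.x[:n] - res.x[n:]
print("fluxes:", np.round(v, 3))   # expect the r2 route (lower weight) to carry flux
```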

  9. Majorana-Time-Reversal Symmetries: A Fundamental Principle for Sign-Problem-Free Quantum Monte Carlo Simulations

    NASA Astrophysics Data System (ADS)

    Li, Zi-Xiang; Jiang, Yi-Fan; Yao, Hong

    2016-12-01

    A fundamental open issue in physics is whether and how the fermion sign problem in quantum Monte Carlo (QMC) simulations can be solved generically. Here, we show that Majorana-time-reversal (MTR) symmetries can provide a unifying principle to solve the fermion sign problem in interacting fermionic models. By systematically classifying Majorana-bilinear operators according to the anticommuting MTR symmetries they respect, we rigorously prove that there are two and only two fundamental symmetry classes which are sign-problem-free and which we call the "Majorana class" and "Kramers class," respectively. Novel sign-problem-free models in the Majorana class include interacting topological superconductors and interacting models of charge-4e superconductors. We believe that our MTR unifying principle could shed new light on sign-problem-free QMC simulation of strongly correlated systems and interacting topological matter.

  10. Developing a Dynamics and Vibrations Course for Civil Engineering Students Based on Fundamental-Principles

    ERIC Educational Resources Information Center

    Barroso, Luciana R.; Morgan, James R.

    2012-01-01

    This paper describes the creation and evolution of an undergraduate dynamics and vibrations course for civil engineering students. Incorporating vibrations into the course allows students to see and study "real" civil engineering applications of the course content. This connection of academic principles to real life situations is in…

  11. Education in Vietnam: Fundamental Principles and Curricula. General Information Series, No. 3. Indochinese Refugee Education Guides.

    ERIC Educational Resources Information Center

    Center for Applied Linguistics, Arlington, VA.

    This guide reconstructs the curricula taught in Vietnam at the elementary level. It includes the underlying educational principles and lists the subjects along with the number of hours they are taught. The curriculum for each of the first five compulsory grades is presented separately, and four charts give overall statistics. The intent of the…

  12. A new Big Five: fundamental principles for an integrative science of personality.

    PubMed

    McAdams, Dan P; Pals, Jennifer L

    2006-04-01

    Despite impressive advances in recent years with respect to theory and research, personality psychology has yet to articulate clearly a comprehensive framework for understanding the whole person. In an effort to achieve that aim, the current article draws on the most promising empirical and theoretical trends in personality psychology today to articulate 5 big principles for an integrative science of the whole person. Personality is conceived as (a) an individual's unique variation on the general evolutionary design for human nature, expressed as a developing pattern of (b) dispositional traits, (c) characteristic adaptations, and (d) self-defining life narratives, complexly and differentially situated (e) in culture and social context. The 5 principles suggest a framework for integrating the Big Five model of personality traits with those self-defining features of psychological individuality constructed in response to situated social tasks and the human need to make meaning in culture.

  13. The Uncertainty Threshold Principle: Some Fundamental Limitations of Optimal Decision Making Under Dynamic Uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.
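
    For the scalar system x_{t+1} = a_t x_t + b_t u_t with independently distributed random gains, the threshold is commonly quoted as follows (reconstructed from the literature, so treat the exact form as an assumption): the infinite-horizon solution exists only when

```latex
m_a^{2} + \sigma_a^{2} \;-\; \frac{m_a^{2}\, m_b^{2}}{m_b^{2} + \sigma_b^{2}} \;<\; 1,
```

    where m and σ² denote the mean and variance of the corresponding random gain.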

  14. The uncertainty threshold principle - Some fundamental limitations of optimal decision making under dynamic uncertainty

    NASA Technical Reports Server (NTRS)

    Athans, M.; Ku, R.; Gershwin, S. B.

    1977-01-01

    This note shows that the optimal control of dynamic systems with uncertain parameters has certain limitations. In particular, by means of a simple scalar linear-quadratic optimal control example, it is shown that the infinite horizon solution does not exist if the parameter uncertainty exceeds a certain quantifiable threshold; we call this the uncertainty threshold principle. The philosophical and design implications of this result are discussed.

  15. Atomic force microscopy: Unraveling the fundamental principles governing secretion and membrane fusion in cells.

    PubMed

    Jena, Bhanu P

    2009-07-01

    The story of cell secretion and membrane fusion is as old as life itself. Without these fundamental cellular processes, known to occur from yeast to humans, life would cease to exist. In the last 15 years, primarily using the atomic force microscope, a detailed understanding of the molecular process, machinery, and mechanism of secretion and membrane fusion in cells has come to light. This has led to a paradigm shift in our understanding of the underlying mechanism of cell secretion. The journey leading to the discovery of a new cellular structure, the 'porosome', the universal secretory machinery in cells, and the contributions of the AFM to our understanding of the general molecular machinery and mechanism of cell secretion and membrane fusion, are briefly discussed in this article.

  16. A review of fundamental principles for animal models of DOHaD research: an Australian perspective.

    PubMed

    Dickinson, H; Moss, T J; Gatford, K L; Moritz, K M; Akison, L; Fullston, T; Hryciw, D H; Maloney, C A; Morris, M J; Wooldridge, A L; Schjenken, J E; Robertson, S A; Waddell, B J; Mark, P J; Wyrwoll, C S; Ellery, S J; Thornburg, K L; Muhlhausler, B S; Morrison, J L

    2016-10-01

    Epidemiology formed the basis of 'the Barker hypothesis', the concept of 'developmental programming' and today's discipline of the Developmental Origins of Health and Disease (DOHaD). Animal experimentation provided proof of the underlying concepts and continues to generate knowledge of underlying mechanisms. Interventions in humans, based on DOHaD principles, will be informed by experiments in animals. As knowledge in this discipline has accumulated from studies of humans and other animals, the complexity of interactions between genome, environment and epigenetics has been revealed. The vast nature of programming stimuli and the breadth of their effects are becoming known. As a result of our accumulating knowledge, we now appreciate the impact of many variables that contribute to programmed outcomes. To guide further animal research in this field, the Australia and New Zealand DOHaD society (ANZ DOHaD) Animal Models of DOHaD Research Working Group convened at the 2nd Annual ANZ DOHaD Congress in Melbourne, Australia in April 2015. This review summarizes the contributions of animal research to the understanding of DOHaD and makes recommendations for the design and conduct of animal experiments to maximize relevance, reproducibility and translation of knowledge into improving health and well-being.

  17. A covariant action principle for dissipative fluid dynamics: from formalism to fundamental physics

    NASA Astrophysics Data System (ADS)

    Andersson, N.; Comer, G. L.

    2015-04-01

    We present a new variational framework for dissipative general relativistic fluid dynamics. The model extends the convective variational principle for multi-fluid systems to account for a range of dissipation channels. The key ingredients in the construction are (i) the use of a lower dimensional matter space for each fluid component, and (ii) an extended functional dependence for the associated volume forms. In an effort to make the concepts clear, the formalism is developed step-by-step with model examples considered at each level. Thus we consider a model for heat flow, derive the relativistic Navier-Stokes equations and discuss why the individual dissipative stress tensors need not be spacetime symmetric. We argue that the new formalism, which notably does not involve an expansion away from an assumed equilibrium state, provides a conceptual breakthrough in this area of research. We also provide an ambitious list of directions in which one may want to extend it in the future. This involves an exciting set of problems, relating to both applications and foundational issues.

  18. Prediction of Metabolic Flux Distribution from Gene Expression Data Based on the Flux Minimization Principle

    DTIC Science & Technology

    2014-11-14

    … expression data. The proposed algorithm minimizes a weighted sum of flux magnitudes, while biomass production can be bounded to fit an ample range from very low to very high values according to the analyzed context. … approach to investigate metabolism and metabolic processes is to analyze the flow of material and energy through a metabolic network. In particular, the … maximizing a certain fitness function (typically, biomass production) and estimates the flux distribution by solving a linear programming (LP) …

  19. A minimization principle for the description of modes associated with finite-time instabilities

    PubMed Central

    Babaee, H.

    2016-01-01

    We introduce a minimization formulation for the determination of a finite-dimensional, time-dependent, orthonormal basis that captures directions of the phase space associated with transient instabilities. While these instabilities have finite lifetime, they can play a crucial role either by altering the system dynamics through the activation of other instabilities or by creating sudden nonlinear energy transfers that lead to extreme responses. However, their essentially transient character makes their description a particularly challenging task. We develop a minimization framework that focuses on the optimal approximation of the system dynamics in the neighbourhood of the system state. This minimization formulation results in differential equations that evolve a time-dependent basis so that it optimally approximates the most unstable directions. We demonstrate the capability of the method for two families of problems: (i) linear systems, including the advection–diffusion operator in a strongly non-normal regime as well as the Orr–Sommerfeld/Squire operator, and (ii) nonlinear problems, including a low-dimensional system with transient instabilities and the vertical jet in cross-flow. We demonstrate that the time-dependent subspace captures the strongly transient non-normal energy growth (in the short-time regime), while for longer times the modes capture the expected asymptotic behaviour. PMID:27118900
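
    One way to see why such finite-time instabilities demand more than eigenvalue analysis: instantaneous energy growth is governed by the symmetric part of the linearized operator rather than by its spectrum. The snippet below is a generic non-normal-growth illustration, not the paper's time-dependent-basis algorithm.

```python
# Non-normal transient growth: eigenvalues predict decay, yet energy can
# grow transiently. Generic illustration, not the paper's minimization scheme.
import numpy as np

A = np.array([[-0.1, 5.0],
              [ 0.0, -0.2]])        # strongly non-normal, stable spectrum

eigs = np.linalg.eigvals(A)
sym = 0.5 * (A + A.T)               # symmetric part governs d||x||^2/dt
growth = np.linalg.eigvalsh(sym)

print("eigenvalues of A         :", eigs)         # all negative -> asymptotically stable
print("numerical abscissa       :", growth.max()) # positive -> transient growth possible

# The most rapidly growing initial direction is the top eigenvector of sym:
direction = np.linalg.eigh(sym)[1][:, -1]
print("optimal initial direction:", direction)
```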

  20. Quantum statistical entropy and minimal length of 5D Ricci-flat black string with generalized uncertainty principle

    SciTech Connect

    Liu Molin; Gui Yuanxing; Liu Hongya

    2008-12-15

    In this paper, we study the quantum statistical entropy in a 5D Ricci-flat black string solution, which contains a 4D Schwarzschild-de Sitter black hole on the brane, by using the improved thin-layer method with the generalized uncertainty principle. The entropy is the linear sum of the areas of the event horizon and the cosmological horizon without any cutoff or any constraint on the bulk's configuration, in contrast with the usual uncertainty principle. The system's density of states and free energy are convergent in the neighborhood of the horizons. The small-mass approximation is determined by the asymptotic behavior of the metric function near the horizons. Meanwhile, we obtain the minimal position uncertainty Δx, which is constrained by the surface gravities and the thickness of the layer near the horizons.

  1. WeBSurg: An innovative educational Web site in minimally invasive surgery--principles and results.

    PubMed

    Mutter, Didier; Vix, Michel; Dallemagne, Bernard; Perretta, Silvana; Leroy, Joël; Marescaux, Jacques

    2011-03-01

    The Internet has dramatically changed clinical practice and information sharing among the surgical community and has revolutionized access to surgical education. High-speed Internet broadcasting allows the display of high-quality, high-definition, full-screen videos, and Internet access to surgical procedures therefore plays a major role in continuing medical education (CME). The WeBSurg Web site is a virtual surgical university dedicated to postgraduate education in minimally invasive surgery. Its results, measured through its membership, visitors from 213 different countries, and the amount of data transmitted through the provider LimeLight, confirm WeBSurg's position as the leading Web site in surgical CME. The Internet offers education tailored to all levels of surgical expertise as well as to all types of Internet access. This represents a global multimedia solution at the cutting edge of technology and surgical evolution, which responds to the modern ethos of "always, anywhere, anytime."

  2. Minimal metabolic engineering of Saccharomyces cerevisiae for efficient anaerobic xylose fermentation: a proof of principle.

    PubMed

    Kuyper, Marko; Winkler, Aaron A; van Dijken, Johannes P; Pronk, Jack T

    2004-03-01

    When xylose metabolism in yeasts proceeds exclusively via NADPH-specific xylose reductase and NAD-specific xylitol dehydrogenase, anaerobic conversion of the pentose to ethanol is intrinsically impossible. When xylose reductase has a dual specificity for both NADPH and NADH, anaerobic alcoholic fermentation is feasible but requires the formation of large amounts of polyols (e.g., xylitol) to maintain a closed redox balance. As a result, the ethanol yield on xylose will be sub-optimal. This paper demonstrates that anaerobic conversion of xylose to ethanol, without substantial by-product formation, is possible in Saccharomyces cerevisiae when a heterologous xylose isomerase (EC 5.3.1.5) is functionally expressed. Transformants expressing the XylA gene from the anaerobic fungus Piromyces sp. E2 (ATCC 76762) grew in synthetic medium in shake-flask cultures on xylose with a specific growth rate of 0.005 h⁻¹. After prolonged cultivation on xylose, a mutant strain was obtained that grew aerobically and anaerobically on xylose, at specific growth rates of 0.18 and 0.03 h⁻¹, respectively. The anaerobic ethanol yield was 0.42 g ethanol per g xylose, and by-product formation was comparable to that of glucose-grown anaerobic cultures. These results illustrate that only minimal genetic engineering is required to recruit a functional xylose metabolic pathway in Saccharomyces cerevisiae. Activities and/or regulatory properties of native S. cerevisiae gene products can subsequently be optimised via evolutionary engineering. These results provide a gateway towards commercially viable ethanol production from xylose with S. cerevisiae.

  3. Unravelling the fundamentals of thermal and chemical expansion of BaCeO3 from first principles phonon calculations.

    PubMed

    Løken, Andreas; Haugsrud, Reidar; Bjørheim, Tor S

    2016-11-16

    Differentiating chemical and thermal expansion is virtually impossible experimentally. While thermal expansion stems from a softening of the phonon spectra, chemical expansion depends on the chemical composition of the material. In the present contribution we, for the first time, completely decouple thermal and chemical expansion through first-principles phonon calculations on BaCeO3, providing new fundamental insights into lattice expansion. We assess the influence of defects on thermal expansion, and how this in turn affects the interpretation of chemical expansion and defect thermodynamics. The calculations reveal that the linear thermal expansion coefficient is lowered by the introduction of oxygen vacancies, being 10.6 × 10⁻⁶ K⁻¹ at 300 K relative to 12.2 × 10⁻⁶ K⁻¹ for both the protonated and defect-free bulk lattice. We further demonstrate that the chemical expansion coefficient upon hydration varies with temperature, ranging from 0.070 to 0.115 per mole of oxygen vacancies. Ultimately, we find that, owing to differences in the thermal expansion coefficients under dry and wet conditions, chemical expansion coefficients determined experimentally are grossly underestimated: around 55% lower in the case of 10 mol% acceptor-doped BaCeO3. Lastly, we evaluate the effect of these volume changes on the vibrational thermodynamics.

  4. Headache in a high school student - a reminder of fundamental principles of clinical medicine and common pitfalls of cognition.

    PubMed

    Afghan, Zakira; Hussain, Abid; Asim, Muhammad

    2015-01-01

    Primary headache disorders account for the majority of cases of headache. Nevertheless, the primary objective of a physician encountering a patient with headache is to rule out a secondary cause of the headache. This entails a search for specific associated red-flag symptoms or signs that may indicate a serious condition, as well as a heightened suspicion of, and evaluation for, a 'don't miss' diagnosis. We present the case of a high-school student whose first manifestation of systemic lupus erythematosus (SLE) was a headache due to cerebral venous and sinus thrombosis, initially misdiagnosed as tension headache and 'ophthalmoplegic migraine' (now known as 'recurrent painful ophthalmoplegic neuropathy'). The patient made a complete neurological and radiological recovery after systemic anticoagulation and treatment of SLE. An analysis of the clinical errors and cognitive biases leading to the delayed referral to hospital is presented. We highlight that adherence to the fundamental principles of clinical medicine and enhanced cognitive awareness are required to reduce diagnostic errors.

  5. Dielectric properties of agricultural products – fundamental principles, influencing factors, and measurement techniques. Chapter 4. Electrotechnologies for Food Processing: Book Series. Volume 3. Radio-Frequency Heating

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In this chapter, definitions of the dielectric properties, or permittivity, of materials and a brief discussion of the fundamental principles governing their behavior with respect to influencing factors are presented. The basic physics of the influence of the frequency of the electric fields and temperature…

  6. Fundamentals of the route theory for satellite constellation design for Earth discontinuous coverage. Part 3: Low-cost Earth observation with minimal satellite swath

    NASA Astrophysics Data System (ADS)

    Razoumny, Yury N.

    2016-12-01

    Continuing the series of papers describing the fundamentals of the Route Theory for satellite constellation design, a general method is presented for minimizing the satellite swath width required under a given constraint on the maximum revisit time (MRT), the main quality characteristic of discontinuous satellite coverage. The interrelation between MRT and the multiplicity of periodic coverage (the minimum number of observation sessions realized for points of the observation region during the satellite tracks' repetition period) is revealed and described. In particular, it is shown that a change of MRT can occur only at points where the coverage multiplicity changes. Basic elements of multifold Earth coverage theory are presented and used to obtain analytical relations for the minimum swath width providing a given multifold coverage. A satellite swath width calculation procedure for multifold coverage of the rotating Earth, using iterations on the sphere of stationary coverage, is developed. Numerical results for discontinuous coverage with minimal satellite swath, including comparison with some known particular cases and implementations of the method, are presented.

  7. Lessons that Bear Repeating and Repeating that Bears Lessons: An Interdisciplinary Unit on Principles of Minimalism in Modern Music, Art, and Poetry (Grades 4-8)

    ERIC Educational Resources Information Center

    Smigel, Eric; McDonald, Nan L.

    2012-01-01

    This theory-to-practice article focuses on interdisciplinary classroom activities based on principles of minimalism in modern music, art, and poetry. A lesson sequence was designed for an inner-city Grades 4 and 5 general classroom of English language learners, where the unit was taught, assessed, and documented by the authors. Included in the…

  8. [Human rights and genetics: the fundamental principles of the Universal Declaration on the Human Genome and Human Rights].

    PubMed

    Bergel, S D

    1998-01-01

    The Universal Declaration on the Human Genome and Human Rights sets out generally agreed criteria in response to the human rights challenges posed by advances in molecular biology and genetics. The lynchpin of these criteria is respect for human dignity, a premise from which other principles are derived. The author examines and gives the justification for these principles, and refers to another crucial bioethics text, the recent Council of Europe Convention on the protection of human rights and the dignity of the human person in regard to applications of biology and medicine.

  9. Breakdown minimization principle versus Wardrop's equilibria for dynamic traffic assignment and control in traffic and transportation networks: A critical mini-review

    NASA Astrophysics Data System (ADS)

    Kerner, Boris S.

    2017-01-01

    We review physical results of applying the breakdown minimization (BM) principle versus the classical Wardrop equilibria (Wardrop's user equilibrium (UE) and system optimum (SO)) for dynamic traffic assignment and control in traffic and transportation networks. It is shown that, depending on the total network inflow rate, there are two different applications of the BM principle: (i) network throughput maximization, which maximizes throughput while ensuring free flow conditions in the network, and (ii) minimization of the network breakdown probability at relatively large network inflow rates. Probabilistic features of the application of the BM principle are studied. We find that even when the application of the BM principle cannot prevent traffic breakdown in the network, a combination of the BM principle with dynamic control of traffic breakdown at network bottlenecks can lead to the dissolution of traffic congestion. We show that applications of the classical Wardrop equilibria for dynamic traffic assignment fundamentally degrade the traffic system in networks.
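
    For readers unfamiliar with the Wardrop UE referenced here: at user equilibrium no driver can reduce travel time by unilaterally switching routes, so all used routes have equal travel times. The two-route example below is a textbook illustration with invented cost functions, not Kerner's BM-principle model.

```python
# Textbook two-route Wardrop user equilibrium, solved by bisection.
# Link travel times grow linearly with flow; parameters are invented.
def t1(f):  # route 1 travel time (min) at flow f (veh/h)
    return 10.0 + f / 100.0

def t2(f):  # route 2 travel time (min)
    return 15.0 + f / 50.0

demand = 1000.0  # total flow to split across the two routes

lo, hi = 0.0, demand
for _ in range(60):              # bisect on the flow assigned to route 1
    f1 = 0.5 * (lo + hi)
    if t1(f1) > t2(demand - f1):
        hi = f1                  # route 1 too slow -> shift flow away from it
    else:
        lo = f1
f1 = 0.5 * (lo + hi)
print(f"UE split: f1={f1:.0f}, f2={demand - f1:.0f} veh/h, "
      f"times {t1(f1):.2f} vs {t2(demand - f1):.2f} min")  # equal at equilibrium
```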

  10. The biological default state of cell proliferation with variation and motility, a fundamental principle for a theory of organisms.

    PubMed

    Soto, Ana M; Longo, Giuseppe; Montévil, Maël; Sonnenschein, Carlos

    2016-10-01

    The principle of inertia is central to the modern scientific revolution. By postulating this principle Galileo at once identified a pertinent physical observable (momentum) and a conservation law (momentum conservation). He then could scientifically analyze what modifies inertial movement: gravitation and friction. Inertia, the default state in mechanics, represented a major theoretical commitment: there is no need to explain uniform rectilinear motion, rather, there is a need to explain departures from it. By analogy, we propose a biological default state of proliferation with variation and motility. From this theoretical commitment, what requires explanation is proliferative quiescence, lack of variation, lack of movement. That proliferation is the default state is axiomatic for biologists studying unicellular organisms. Moreover, it is implied in Darwin's "descent with modification". Although a "default state" is a theoretical construct and a limit case that does not need to be instantiated, conditions that closely resemble unrestrained cell proliferation are readily obtained experimentally. We will illustrate theoretical and experimental consequences of applying and of ignoring this principle.

  11. How to increase treatment effectiveness and efficiency in psychiatry: creative psychopharmacotherapy - part 1: definition, fundamental principles and higher effectiveness polypharmacy.

    PubMed

    Jakovljević, Miro

    2013-09-01

    Psychopharmacotherapy is a fascinating field that can be understood in many different ways. It is both a science and an art of communication, with a heavily subjective dimension. The advent of a significant number of effective and well-tolerated mental health medicines during and after the 1990s, the 'decade of the brain', has increased our ability to treat major mental disorders more successfully, with much better treatment outcomes including full recovery. However, there is a huge gap between the high treatment effectiveness that is achievable and the unsatisfying results of day-to-day clinical practice. A creative approach to psychopharmacotherapy could advance everyday clinical practice and bridge this gap. Creative psychopharmacotherapy is a concept that incorporates creativity as its fundamental tool. Creativity involves the intention and ability to transcend limiting traditional ideas, rules, patterns and relationships and to create meaningful new ideas, interpretations, contexts and methods in clinical psychopharmacology.

  12. First-principle calculations of the fundamental properties of CuBrxI1-x ternary alloy

    NASA Astrophysics Data System (ADS)

    Touam, S.; Boukhtouta, M.; Hamioud, L.; Ghemid, S.; Meradji, H.; El Haj Hassan, F.

    2015-11-01

    The ab initio full-potential linearised augmented plane wave (FP-LAPW) method within density functional theory is applied to study the effect of composition on the structural, electronic and thermodynamic properties of the CuBrxI1-x ternary alloy. The structural properties at equilibrium are investigated using the new form of generalised gradient approximation based on the optimisation of total energy. For band structure calculations, the Engel-Vosko and modified Becke-Johnson schemes for the exchange-correlation energy and potential, respectively, are used. Deviation of the lattice constants from Vegard's law and of the bulk modulus from linear concentration dependence is observed. The microscopic origins of the gap bowing are explained using the approach of Zunger and co-workers. The thermodynamic stability of the alloy is investigated by calculating the excess enthalpy of mixing ΔHm as well as the phase diagram via the critical temperatures. Finally, numerical first-principles calculations of the elastic constants as a function of pressure are used to obtain C11, C12 and C44.

  13. Encapsulation of C60 fullerenes into single-walled carbon nanotubes: Fundamental mechanical principles and conventional applied mathematical modeling

    NASA Astrophysics Data System (ADS)

    Baowan, Duangkamon; Thamwattana, Ngamta; Hill, James M.

    2007-10-01

    A well-known self-assembled hybrid carbon nanostructure is a nanopeapod which may be regarded as the prototype nanocarrier for drug delivery. While the investigation of the packing of C60 molecules inside a carbon nanotube is usually achieved through either experimentation or large scale computation, this paper adopts elementary mechanical principles and classical applied mathematical modeling techniques to formulate explicit analytical criteria and ideal model behavior for such encapsulation. In particular, we employ the Lennard-Jones potential and the continuum approximation to determine three encapsulation mechanisms for a C60 fullerene entering a tube: (i) through the tube open end (head-on), (ii) around the edge of the tube open end, and (iii) through a defect opening on the tube wall. These three encapsulation mechanisms are undertaken for each of the three specific carbon nanotubes (10,10), (16,16), and (20,20). We assume that all configurations are in vacuum and the C60 fullerene is initially at rest. Double integrals are performed to determine the energy of the system and analytical expressions are obtained in terms of hypergeometric functions. Our results suggest that the C60 fullerene is most likely to be encapsulated by head-on through the open tube end and that encapsulation around the tube edge is least likely to occur because of the large van der Waals energy barriers which exist at the tube ends.
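
    In the continuum approximation used here, discrete atoms are replaced by a uniform surface density σ, and the interaction energy becomes a surface integral of the Lennard-Jones potential. The sketch below discretizes that integral for a point particle (a crude stand-in for C60) moving along the axis of a finite open tube; the constants are assumed typical literature values for carbon-carbon interactions, not fitted ones.

```python
# Continuum-approximation sketch: integrated Lennard-Jones energy of a point
# particle on the axis of a finite open (10,10)-like tube, versus axial
# position. Constants are assumed typical values, not from the paper.
import numpy as np

A_att = 17.4    # eV * Angstrom^6  (attractive LJ constant, assumed)
B_rep = 29e3    # eV * Angstrom^12 (repulsive LJ constant, assumed)
sigma = 0.3812  # atoms per Angstrom^2, graphene surface density
R, L = 6.784, 100.0               # tube radius and length (Angstrom)

zt = np.linspace(0.0, L, 2000)    # axial positions of surface rings
ring_area = 2.0 * np.pi * R * (L / zt.size)

def energy(z):
    """Integrated LJ energy (eV) for a particle at (0, 0, z) on the axis."""
    rho2 = R**2 + (zt - z)**2     # squared distance to each surface ring
    return sigma * ring_area * np.sum(-A_att / rho2**3 + B_rep / rho2**6)

for z in (-10.0, -5.0, 0.0, 5.0, 20.0):   # approach, tube mouth, interior
    print(f"z = {z:6.1f} A : E = {energy(z):8.4f} eV")
```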

  14. A review of the fundamentals of polymer-modified asphalts: Asphalt/polymer interactions and principles of compatibility.

    PubMed

    Polacco, Giovanni; Filippi, Sara; Merusi, Filippo; Stastna, George

    2015-10-01

    During the last decades, the number of vehicles per citizen as well as traffic speed and load have dramatically increased. This sudden and somewhat unplanned overloading has strongly shortened the life of pavements and increased maintenance costs and risks to users. Limiting the deterioration of road networks requires improving the quality and performance of pavements, which has been achieved through the addition of a polymer to the bituminous binder. Since their introduction in the second half of the twentieth century, polymer-modified asphalts have gained in importance and now play a fundamental role in the field of road paving. During high-temperature, high-shear mixing with asphalt, the polymer incorporates asphalt molecules, thereby forming a swollen network that involves the entire binder and results in a significant improvement of the viscoelastic properties in comparison with those of the unmodified binder. Such a process encounters the well-known difficulties related to the poor solubility of polymers, which limits the number of macromolecules able not only to form such a structure but also to maintain it during high-temperature storage in static conditions, which may be necessary before laying the binder. Therefore, polymer-modified asphalts have been the subject of numerous studies aimed at understanding and optimizing their structure and storage stability, a topic that gradually attracted polymer scientists into a field initially explored by civil engineers. The analytical techniques of polymer science have been applied to polymer-modified asphalts, resulting in a good understanding of their internal structure. Nevertheless, the complexity and variability of asphalt composition make it nearly impossible to generalize the results and univocally predict the properties of a given polymer/asphalt pair. The aim of this paper is to review these aspects of polymer-modified asphalts. Together with a brief description of

  15. Fundamental ecology is fundamental.

    PubMed

    Courchamp, Franck; Dunne, Jennifer A; Le Maho, Yvon; May, Robert M; Thébaud, Christophe; Hochberg, Michael E

    2015-01-01

    The primary reasons for conducting fundamental research are satisfying curiosity, acquiring knowledge, and achieving understanding. Here we develop why we believe it is essential to promote basic ecological research, despite increased impetus for ecologists to conduct and present their research in the light of potential applications. This includes the understanding of our environment, for intellectual, economical, social, and political reasons, and as a major source of innovation. We contend that we should focus less on short-term, objective-driven research and more on creativity and exploratory analyses, quantitatively estimate the benefits of fundamental research for society, and better explain the nature and importance of fundamental ecology to students, politicians, decision makers, and the general public. Our perspective and underlying arguments should also apply to evolutionary biology and to many of the other biological and physical sciences.

  16. Fundamentals of Diesel Engines.

    ERIC Educational Resources Information Center

    Marine Corps Inst., Washington, DC.

    This student guide, one of a series of correspondence training courses designed to improve the job performance of members of the Marine Corps, deals with the fundamentals of diesel engine mechanics. Addressed in the three individual units of the course are the following topics: basic principles of diesel mechanics; principles, mechanics, and…

  17. Buridan's Principle

    NASA Astrophysics Data System (ADS)

    Lamport, Leslie

    2012-08-01

    Buridan's principle asserts that a discrete decision based upon input having a continuous range of values cannot be made within a bounded length of time. It appears to be a fundamental law of nature. Engineers aware of it can design devices so they have an infinitesimal probability of not making a decision quickly enough. Ignorance of the principle could have serious consequences.

  18. Understanding the Fundamental Principles Underlying the Survival and Efficient Recovery of Multi-Scale Techno-Social Networks Following a WMD Event (A)

    DTIC Science & Technology

    2016-07-01

    This model has been used to provide estimates for the growth of the epidemic in West Africa, the probability and expected number of Ebola virus … vol. 114, no. 15, p. 158701, 2015. Abstract: Based on Jaynes's maximum entropy principle, exponential random graphs provide a family of principled … curtailing the outbreak. Results: We model the short-term growth rate of the disease in the affected West African countries and estimate the basic …

  19. Use of minimal invasive extracorporeal circulation in cardiac surgery: principles, definitions and potential benefits. A position paper from the Minimal invasive Extra-Corporeal Technologies international Society (MiECTiS).

    PubMed

    Anastasiadis, Kyriakos; Murkin, John; Antonitsis, Polychronis; Bauer, Adrian; Ranucci, Marco; Gygax, Erich; Schaarschmidt, Jan; Fromes, Yves; Philipp, Alois; Eberle, Balthasar; Punjabi, Prakash; Argiriadou, Helena; Kadner, Alexander; Jenni, Hansjoerg; Albrecht, Guenter; van Boven, Wim; Liebold, Andreas; de Somer, Fillip; Hausmann, Harald; Deliopoulos, Apostolos; El-Essawi, Aschraf; Mazzei, Valerio; Biancari, Fausto; Fernandez, Adam; Weerwind, Patrick; Puehler, Thomas; Serrick, Cyril; Waanders, Frans; Gunaydin, Serdar; Ohri, Sunil; Gummert, Jan; Angelini, Gianni; Falk, Volkmar; Carrel, Thierry

    2016-05-01

    Minimal invasive extracorporeal circulation (MiECC) systems have initiated important efforts within science and technology to further improve the biocompatibility of cardiopulmonary bypass components, to minimize adverse effects and improve end-organ protection. The Minimal invasive Extra-Corporeal Technologies international Society was founded to create an international forum for the exchange of ideas on the clinical application and research of minimal invasive extracorporeal circulation technology. The present work is a consensus document developed to standardize the terminology and definition of minimal invasive extracorporeal circulation technology and to provide recommendations for clinical practice. The goal of this manuscript is to promote the adoption of MiECC systems in clinical practice as a multidisciplinary strategy involving cardiac surgeons, anaesthesiologists and perfusionists.

  20. Minimal Pairs: Minimal Importance?

    ERIC Educational Resources Information Center

    Brown, Adam

    1995-01-01

    This article argues that minimal pairs do not merit as much attention as they receive in pronunciation instruction. There are other aspects of pronunciation that are of greater importance, and there are other ways of teaching vowel and consonant pronunciation. (13 references) (VWL)

  1. Fundamentals of fluid lubrication

    NASA Technical Reports Server (NTRS)

    Hamrock, Bernard J.

    1991-01-01

    The aim is to coordinate the topics of design, engineering dynamics, and fluid dynamics in order to aid researchers in the area of fluid film lubrication. The lubrication principles covered can serve as a basis for the engineering design of machine elements. The fundamentals of fluid film lubrication are presented clearly so that students who use the book will have confidence in their ability to apply these principles to a wide range of lubrication situations. Some guidance on applying these fundamentals to the solution of engineering problems is also provided.

  2. Homeschooling and Religious Fundamentalism

    ERIC Educational Resources Information Center

    Kunzman, Robert

    2010-01-01

    This article considers the relationship between homeschooling and religious fundamentalism by focusing on their intersection in the philosophies and practices of conservative Christian homeschoolers in the United States. Homeschooling provides an ideal educational setting to support several core fundamentalist principles: resistance to…

  3. Fundamentals of Electromagnetic Phenomena

    NASA Astrophysics Data System (ADS)

    Lorrain, Paul; Corson, Dale R.; Lorrain, Francois

    Based on the classic Electromagnetic Fields and Waves by the same authors, Fundamentals of Electromagnetic Phenomena capitalizes on the older text's traditional strengths--solid physics, inventive problems, and an experimental approach--while offering a briefer, more accessible introduction to the basic principles of electromagnetism.

  4. Minimal Length Scale Scenarios for Quantum Gravity.

    PubMed

    Hossenfelder, Sabine

    2013-01-01

    We review the question of whether the fundamental laws of nature limit our ability to probe arbitrarily short distances. First, we examine what insights can be gained from thought experiments for probes of shortest distances, and summarize what can be learned from different approaches to a theory of quantum gravity. Then we discuss some models that have been developed to implement a minimal length scale in quantum mechanics and quantum field theory. These models have entered the literature as the generalized uncertainty principle or the modified dispersion relation, and have allowed the study of the effects of a minimal length scale in quantum mechanics, quantum electrodynamics, thermodynamics, black-hole physics and cosmology. Finally, we touch upon the question of ways to circumvent the manifestation of a minimal length scale in short-distance physics.
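
    For reference, the generalized uncertainty principle mentioned here is most commonly written with a single deformation parameter β (notation varies across the literature, so take this form as an assumption); minimizing the right-hand side over Δp gives the smallest resolvable length directly:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^{2}\right)
\quad\Longrightarrow\quad
\Delta x_{\min} \;=\; \hbar\sqrt{\beta},
```

    with β usually taken to be of the order of the squared Planck length divided by ħ².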

  5. A Reassessment of George Pierce Baker's "The Principles of Argumentation": Minimizing the Use of Formal Logic in Favor of Practical Approaches

    ERIC Educational Resources Information Center

    Bordelon, Suzanne

    2006-01-01

    In this article, the author demonstrated how recent histories relied primarily on previous accounts and one textbook to characterize George Pierce Baker's work. This narrow assessment of "The Principles of Argumentation" limits one's understanding of his contribution to argumentation theory and pedagogy. Similarly, one has seen the need for care…

  6. Tether fundamentals

    NASA Technical Reports Server (NTRS)

    Carroll, J. A.

    1986-01-01

    Some fundamental aspects of tethers are presented and briefly discussed. The effects of gravity gradients, dumbbell libration in circular orbits, tether control strategies, and impact hazards for tethers are among these fundamentals. Also considered are aerodynamic drag, constraints in momentum transfer applications, and constraints with permanently deployed tethers. The theoretical feasibility of these concepts is reviewed.

  7. Fundamental Reaction Pathway for Peptide Metabolism by Proteasome: Insights from First-principles Quantum Mechanical/Molecular Mechanical Free Energy Calculations

    PubMed Central

    Wei, Donghui; Fang, Lei; Tang, Mingsheng; Zhan, Chang-Guo

    2013-01-01

    Proteasome is the major component of the crucial nonlysosomal protein degradation pathway in the cells, but the detailed reaction pathway is unclear. In this study, first-principles quantum mechanical/molecular mechanical free energy calculations have been performed to explore, for the first time, possible reaction pathways for proteasomal proteolysis/hydrolysis of a representative peptide, succinyl-leucyl-leucyl-valyl-tyrosyl-7-amino-4-methylcoumarin (Suc-LLVY-AMC). The computational results reveal that the most favorable reaction pathway consists of six steps. The first is a water-assisted proton transfer within proteasome, activating Thr1-Oγ. The second is a nucleophilic attack on the carbonyl carbon of a Tyr residue of substrate by the negatively charged Thr1-Oγ, followed by the dissociation of the amine AMC (third step). The fourth step is a nucleophilic attack on the carbonyl carbon of the Tyr residue of substrate by a water molecule, accompanied by a proton transfer from the water molecule to Thr1-Nz. Then, Suc-LLVY is dissociated (fifth step), and Thr1 is regenerated via a direct proton transfer from Thr1-Nz to Thr1-Oγ. According to the calculated energetic results, the overall reaction energy barrier of the proteasomal hydrolysis is associated with the transition state (TS3b) for the third step involving a water-assisted proton transfer. The determined most favorable reaction pathway and the rate-determining step have provided a reasonable interpretation of the reported experimental observations concerning the substituent and isotopic effects on the kinetics. The calculated overall free energy barrier of 18.2 kcal/mol is close to the experimentally-derived activation free energy of ~18.3–19.4 kcal/mol, suggesting that the computational results are reasonable. PMID:24111489
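
    To relate the quoted barriers to observable rates, one can apply the standard Eyring (transition-state theory) expression; this conversion is a textbook step, not part of the paper's calculations.

```python
# Eyring equation: convert a free energy barrier into a first-order rate constant.
import math

kB = 1.380649e-23    # J/K, Boltzmann constant
h  = 6.62607015e-34  # J*s, Planck constant
R  = 1.987204e-3     # kcal/(mol*K), gas constant
T  = 298.15          # K

def eyring_rate(dg_kcal_per_mol, temp=T):
    """Rate constant (1/s) from a free energy barrier via k = (kB*T/h) exp(-dG/RT)."""
    return (kB * temp / h) * math.exp(-dg_kcal_per_mol / (R * temp))

for dg in (18.2, 18.3, 19.4):   # computed barrier and the experimental range
    print(f"dG = {dg:4.1f} kcal/mol -> k = {eyring_rate(dg):8.2e} s^-1")
```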

  8. Minimal cosmography

    NASA Astrophysics Data System (ADS)

    Piazza, Federico; Schücker, Thomas

    2016-04-01

    The minimal requirement for cosmography—a non-dynamical description of the universe—is a prescription for calculating null geodesics, and time-like geodesics as a function of their proper time. In this paper, we consider the most general linear connection compatible with homogeneity and isotropy, but not necessarily with a metric. A light-cone structure is assigned by choosing a set of geodesics representing light rays. This defines a "scale factor" and a local notion of distance, as that travelled by light in a given proper time interval. We find that the velocities and relativistic energies of free-falling bodies decrease in time as a consequence of cosmic expansion, but at a rate that can be different than that dictated by the usual metric framework. By extrapolating this behavior to photons' redshift, we find that the latter is in principle independent of the "scale factor". Interestingly, redshift-distance relations and other standard geometric observables are modified in this extended framework, in a way that could be experimentally tested. An extremely tight constraint on the model, however, is represented by the blackbody-ness of the cosmic microwave background. Finally, as a check, we also consider the effects of a non-metric connection in a different set-up, namely, that of a static, spherically symmetric spacetime.

  9. Fundamentals of petroleum maps

    SciTech Connect

    Mc Elroy, D.P.

    1986-01-01

    It's a complete guide to the fundamentals of reading, using, and making petroleum maps. The topics covered are well spotting, lease posting, contouring, hanging cross sections, and ink drafting. This book not only tells the how of petroleum mapping, but also the why, to foster a better understanding of the principles and techniques. The book does not teach "drafting," but it does describe the proper care and use of drafting equipment for those who are totally new to the task.

  10. Monte Carlo fundamentals

    SciTech Connect

    Brown, F.B.; Sutton, T.M.

    1996-02-01

    This report is composed of the lecture notes from the first half of a 32-hour graduate-level course on Monte Carlo methods offered at KAPL. These notes, prepared by two of the principal developers of KAPL's RACER Monte Carlo code, cover the fundamental theory, concepts, and practices for Monte Carlo analysis. In particular, a thorough grounding in the basic fundamentals of Monte Carlo methods is presented, including random number generation, random sampling, the Monte Carlo approach to solving transport problems, computational geometry, collision physics, tallies, and eigenvalue calculations. Furthermore, modern computational algorithms for vector and parallel approaches to Monte Carlo calculations are covered in detail, including fundamental parallel and vector concepts, the event-based algorithm, master/slave schemes, parallel scaling laws, and portability issues.
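
    As a flavor of the random-sampling fundamentals such notes cover, the snippet below draws free-flight distances between collisions by inverting the exponential CDF, a standard Monte Carlo transport technique; it is a generic illustration, not an excerpt from RACER.

```python
# Random sampling fundamentals: sample particle free-flight distances
# from p(x) = Sigma_t * exp(-Sigma_t * x) by inverting the CDF.
import math
import random

def sample_path_length(sigma_t, rng=random):
    """Distance (cm) to the next collision for total cross-section sigma_t (1/cm)."""
    xi = rng.random()                    # uniform on [0, 1)
    return -math.log(1.0 - xi) / sigma_t # inverse-CDF (direct inversion) sampling

sigma_t = 0.5                            # 1/cm, illustrative value
samples = [sample_path_length(sigma_t) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(f"sample mean free path: {mean:.3f} cm (theory: {1.0 / sigma_t:.3f} cm)")
```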

  11. Predicting impurity gases and phases during hydrogen evolution from complex metal hydrides using free energy minimization enabled by first-principles calculations.

    PubMed

    Kim, Ki Chul; Allendorf, Mark D; Stavila, Vitalie; Sholl, David S

    2010-09-07

    First-principles calculations represent a potent tool for screening metal hydride mixtures that can reversibly store hydrogen. A number of promising new hydride systems with high hydrogen capacity and favorable thermodynamics have been predicted this way. An important limitation of these studies, however, is the assumption that H2 is the only gas-phase product of the reaction, which is not always the case. This paper summarizes new theoretical and numerical approaches that can be used to predict thermodynamic equilibria in complex metal hydride systems with competing reaction pathways. We report thermochemical equilibrium calculations using data obtained from density functional theory (DFT) computations to describe the possible occurrence of gas-phase products other than H2 in three complex hydrides, LiNH2, LiBH4, and Mg(BH4)2, and mixtures of these with the destabilizing compounds LiH, MgH2, and C. The systems under investigation contain N, C, and/or B and thus have the potential to evolve N2, NH3, hydrocarbons, and/or boranes as well as H2. Equilibria as a function of both temperature and total pressure are predicted. The results indicate that significant amounts of these species can form under some conditions. In particular, the thermodynamic model predicts formation of N2 and NH3 as products of LiNH2 decomposition. Comparison with published experimental data indicates that N2 formation must be kinetically limited. Our examination of C-containing systems indicates that methane is the stable gas-phase species at low temperatures, not H2. On the other hand, very low amounts of boranes (primarily BH3) are predicted to form in B-containing systems.
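
    For a single decomposition reaction releasing one mole of gas, the equilibrium condition underlying such free energy minimization reduces to the van 't Hoff relation. The sketch below uses invented enthalpy and entropy values, not numbers from the paper.

```python
# Van 't Hoff sketch: equilibrium H2 pressure for MH2 <-> M + H2(g),
# given a DFT-style reaction enthalpy and entropy. Placeholder numbers.
import math

R = 8.314462        # J/(mol*K), gas constant
dH = 65.0e3         # J/mol H2  (assumed reaction enthalpy)
dS = 130.0          # J/(mol*K) (roughly the standard entropy of H2 gas)

def p_eq_bar(T):
    """Equilibrium pressure (bar, relative to a 1 bar standard state)."""
    dG = dH - T * dS
    return math.exp(-dG / (R * T))

for T in (300.0, 400.0, 500.0):
    print(f"T = {T:5.1f} K : p_eq = {p_eq_bar(T):9.4f} bar")
```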

  12. A Yoga Strengthening Program Designed to Minimize the Knee Adduction Moment for Women with Knee Osteoarthritis: A Proof-Of-Principle Cohort Study

    PubMed Central

    2015-01-01

    People with knee osteoarthritis may benefit from exercise prescriptions that minimize knee loads in the frontal plane. The primary objective of this study was to determine whether a novel 12-week strengthening program designed to minimize exposure to the knee adduction moment (KAM) could improve symptoms and knee strength in women with symptomatic knee osteoarthritis. A secondary objective was to determine whether the program could improve mobility and fitness, and decrease peak KAM during gait. The tertiary objective was to evaluate the biomechanical characteristics of this yoga program. In particular, we compared the peak KAM during gait with that during yoga postures at baseline. We also compared lower limb normalized mean electromyography (EMG) amplitudes during yoga postures between baseline and follow-up. Primary measures included self-reported pain and physical function (Knee injury and Osteoarthritis Outcome Score) and knee strength (extensor and flexor torques). Secondary measures included mobility (six-minute walk, 30-second chair stand, stair climbing), fitness (submaximal cycle ergometer test), and clinical gait analysis using motion capture synchronized with electromyography and force measurement. Also, KAM and normalized mean EMG amplitudes were collected during yoga postures. Forty-five women over age 50 with symptomatic knee osteoarthritis, consistent with the American College of Rheumatology criteria, enrolled in our 12-week (3 sessions per week) program. Data from 38 were analyzed (six drop-outs; one lost to co-intervention). Participants experienced reduced pain (mean improvement 10.1–20.1 normalized to 100; p<0.001), increased knee extensor strength (mean improvement 0.01 Nm/kg; p = 0.004), and increased flexor strength (mean improvement 0.01 Nm/kg; p = 0.001) at follow-up compared to baseline. Participants improved mobility on the six-minute walk (mean improvement 37.7 m; p<0.001) and 30-second chair stand (mean improvement 1.3; p = 0.006) at follow-up compared to baseline.
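
    To first approximation, the KAM targeted by the program is the frontal-plane moment of the ground reaction force about the knee: the vertical force times its mediolateral lever arm. The sketch below illustrates that simplified estimate with invented numbers; the study itself used full 3D gait analysis from motion capture and force plates.

```python
# Simplified frontal-plane estimate of the knee adduction moment (KAM):
# vertical ground reaction force times its mediolateral lever arm to the
# knee joint centre. Illustrative only; real gait analysis uses 3D
# inverse dynamics.
def kam_frontal(grf_vertical_n, lever_arm_m, body_mass_kg, height_m):
    """Return KAM normalized to %BW*Ht, a common reporting convention."""
    moment = grf_vertical_n * lever_arm_m     # frontal-plane moment, N*m
    bw_ht = body_mass_kg * 9.81 * height_m    # body weight * height, N*m
    return 100.0 * moment / bw_ht

# Invented example: 70 kg, 1.65 m tall, GRF ~1.1 body weight, 4 cm lever arm.
print(f"peak KAM ~ {kam_frontal(1.1 * 70 * 9.81, 0.04, 70, 1.65):.2f} %BW*Ht")
```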

  13. Marketing fundamentals.

    PubMed

    Redmond, W H

    2001-01-01

    This chapter outlines current marketing practice from a managerial perspective. The role of marketing within an organization is discussed in relation to efficiency and adaptation to changing environments. Fundamental terms and concepts are presented in an applied context. The implementation of marketing plans is organized around the four P's of marketing: product (or service), promotion (including advertising), place of delivery, and pricing. These are the tools with which marketers seek to better serve their clients and form the basis for competing with other organizations. Basic concepts of strategic relationship management are outlined. Lastly, alternate viewpoints on the role of advertising in healthcare markets are examined.

  14. Does osteoderm growth follow energy minimization principles?

    PubMed

    Sensale, Sebastián; Jones, Washington; Blanco, R Ernesto

    2014-08-01

    Although the growth and development of tissues and organs of extinct species cannot be directly observed, their fossils can record and preserve evidence of these mechanisms. It is generally accepted that bone architecture is the result of genetically based biomechanical constraints, but what about osteoderms? In this article, the influence of physical constraints on cranial osteoderm growth is assessed. Comparisons among lepidosaurs, synapsids, and archosaurs are performed; according to these analyses, lepidosaur osteoderm growth is predicted to be less energy demanding than that of synapsids and archosaurs. The results also show that, from an energetic viewpoint, ankylosaurid osteoderm growth resembles that of mammals more than that of reptilians, adding evidence to the debate over whether dinosaurs were hot or cold blooded.

  15. Commentary: Minimizing Evaluation Misuse as Principled Practice

    ERIC Educational Resources Information Center

    Cousins, J. Bradley

    2004-01-01

    "Ethical Challenges," in my experience, is invariably interesting, often instructive and sometimes amusing. Some of the most engaging stimulus scenarios raise thorny evaluation practice issues that ultimately lead to disparate points of view about the nature of the issue and how to handle it (Datta, 2002; Smith, 2002). Despite my poor performance…

  16. Field Theory of Fundamental Interactions

    NASA Astrophysics Data System (ADS)

    Wang, Shouhong; Ma, Tian

    2017-01-01

    First, we present two basic principles: the principle of interaction dynamics (PID) and the principle of representation invariance (PRI). Intuitively, PID takes the variation of the action under an energy-momentum conservation constraint. We show that PID is required by the presence of dark matter and dark energy, the Higgs field, and quark confinement. PRI requires that an SU(N) gauge theory be independent of the choice of representations of SU(N); it is clearly a logical requirement of any gauge theory. With PRI, we demonstrate that the coupling constants for the strong and the weak interactions are the main sources of these two interactions, reminiscent of the electric charge. Second, we emphasize that symmetry principles (the principle of general relativity, Lorentz invariance, and gauge invariance), together with the simplicity of the laws of nature, dictate the actions for the four fundamental interactions. Finally, we show that PID and PRI, together with these symmetry principles, give rise to a unified field model for the fundamental interactions, one that is consistent with current experimental observations and offers some new physical predictions. The research is supported in part by the National Science Foundation (NSF) grant DMS-1515024, and by the Office of Naval Research (ONR) grant N00014-15-1-2662.

  17. Fundamentals of Environmental Education. Report.

    ERIC Educational Resources Information Center

    1976

    An outline of fundamental definitions, relationships, and human responsibilities related to environment provides a basis from which a variety of materials, programs, and activities can be developed. The outline can be used in elementary, secondary, higher education, or adult education programs. The framework is based on principles of the science…

  18. Minimal Reduplication

    ERIC Educational Resources Information Center

    Kirchner, Jesse Saba

    2010-01-01

    This dissertation introduces Minimal Reduplication, a new theory and framework within generative grammar for analyzing reduplication in human language. I argue that reduplication is an emergent property in multiple components of the grammar. In particular, reduplication occurs independently in the phonology and syntax components, and in both cases…

  19. The minimal work cost of information processing

    PubMed Central

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-01-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamic work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as energy dissipation increasingly limits the performance of computing devices. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditioned on the output of the computation. Our formula precisely accounts for the statistically fluctuating work requirement of the logical process. It enables explicit calculation in practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics. PMID:26151678
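
    In the idealized asymptotic (i.i.d.) limit, the bound described above reduces to a Landauer-type expression; the notation below is an illustrative assumption, not the paper's single-shot formulation:

    $$ W_{\min} \;=\; k_B T \ln 2 \cdot H(X \mid Y), $$

    where X is the discarded information, Y is the output of the computation, and H(X|Y) is the conditional Shannon entropy in bits. Erasing one bit that is uncorrelated with the output (H(X|Y) = 1) recovers Landauer's familiar bound of k_B T ln 2 per bit.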

  20. Evolutionary principles and their practical application

    PubMed Central

    Hendry, Andrew P; Kinnison, Michael T; Heino, Mikko; Day, Troy; Smith, Thomas B; Fitt, Gary; Bergstrom, Carl T; Oakeshott, John; Jørgensen, Peter S; Zalucki, Myron P; Gilchrist, George; Southerton, Simon; Sih, Andrew; Strauss, Sharon; Denison, Robert F; Carroll, Scott P

    2011-01-01

    Evolutionary principles are now routinely incorporated into medicine and agriculture. Examples include the design of treatments that slow the evolution of resistance by weeds, pests, and pathogens, and the design of breeding programs that maximize crop yield or quality. Evolutionary principles are also increasingly incorporated into conservation biology, natural resource management, and environmental science. Examples include the protection of small and isolated populations from inbreeding depression, the identification of key traits involved in adaptation to climate change, the design of harvesting regimes that minimize unwanted life-history evolution, and the setting of conservation priorities based on populations, species, or communities that harbor the greatest evolutionary diversity and potential. The adoption of evolutionary principles has proceeded somewhat independently in these different fields, even though the underlying fundamental concepts are the same. We explore these fundamental concepts under four main themes: variation, selection, connectivity, and eco-evolutionary dynamics. Within each theme, we present several key evolutionary principles and illustrate their use in addressing applied problems. We hope that the resulting primer of evolutionary concepts and their practical utility helps to advance a unified multidisciplinary field of applied evolutionary biology. PMID:25567966

  1. Dynamic sealing principles

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamental principles governing dynamic sealing operation are discussed, and different seals are described in terms of these principles. Despite the large variety of detailed constructions, there appear to be some basic principles, or combinations of basic principles, by which all seals function; these are presented and discussed. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin-film seals, which enable leakage calculations to be made, are also presented. The concept of flow functions, the application of the Reynolds lubrication equation, flow outside the lubrication-equation regime, friction and wear, and seal lubrication regimes are explained.
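
    As an illustration of the kind of leakage calculation such thin-film equations enable, here is a minimal sketch using the standard parallel-plate Poiseuille result for incompressible laminar flow (a generic textbook relation, not the report's own derivation; all numerical values are assumed):

```python
def laminar_film_leakage(dp, h, w, L, mu):
    """Volumetric leakage rate (m^3/s) of an incompressible liquid through a
    thin rectangular film, from the parallel-plate Poiseuille solution:
        Q = dp * h**3 * w / (12 * mu * L)
    dp: pressure drop across the seal (Pa)
    h:  film thickness (m)
    w:  film width, i.e. breadth of the leakage path (m)
    L:  flow-path length (m)
    mu: dynamic viscosity (Pa*s)
    """
    return dp * h**3 * w / (12.0 * mu * L)

# Example: water-like fluid (mu ~ 1e-3 Pa*s) across a 5-micron film
print(laminar_film_leakage(dp=1.0e5, h=5.0e-6, w=0.1, L=2.0e-3, mu=1.0e-3))
```

    The cubic dependence on film thickness h is the practical takeaway: halving the film gap reduces laminar leakage roughly eightfold.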

  2. Criticism of generally accepted fundamentals and methodologies of traffic and transportation theory

    SciTech Connect

    Kerner, Boris S.

    2015-03-10

    It is explained why the set of the fundamental empirical features of traffic breakdown (a transition from free flow to congested traffic) should be the empirical basis for any traffic and transportation theory that can be reliably used for control and optimization in traffic networks. It is shown that generally accepted fundamentals and methodologies of traffic and transportation theory are not consistent with the set of the fundamental empirical features of traffic breakdown at a highway bottleneck. These fundamentals and methodologies include (i) the Lighthill-Whitham-Richards (LWR) theory, (ii) the General Motors (GM) model class (for example, the Herman, Gazis et al. GM model, Gipps's model, Payne's model, Newell's optimal velocity (OV) model, Wiedemann's model, the Bando et al. OV model, Treiber's IDM, and Krauß's model), (iii) the understanding of highway capacity as a particular stochastic value, and (iv) principles for traffic and transportation network optimization and control (for example, Wardrop's user equilibrium (UE) and system optimum (SO) principles). As an alternative to these generally accepted fundamentals and methodologies, we discuss three-phase traffic theory as the basis for traffic flow modeling, and briefly consider the network breakdown minimization (BM) principle for the optimization of traffic and transportation networks with road bottlenecks.
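
    For concreteness, here is a minimal sketch of one member of the GM/OV model class named above, the Bando et al. optimal-velocity car-following model, simulated on a ring road (the parameter values and the classic OV function are illustrative assumptions, not taken from this paper):

```python
import numpy as np

def simulate_ov_ring(n_cars=20, road_length=40.0, a=1.0, dt=0.05, steps=2000):
    """Euler integration of the Bando optimal-velocity (OV) car-following model
    on a ring road: dv_i/dt = a * (V(h_i) - v_i), with the classic OV function
    V(h) = tanh(h - 2) + tanh(2). With uniform headway h = 2 (the inflection
    point of V), small perturbations grow into stop-and-go waves."""
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, road_length, n_cars, endpoint=False)
    x += 0.05 * rng.standard_normal(n_cars)        # perturb uniform spacing
    v = np.full(n_cars, np.tanh(2.0))              # optimal speed at h = 2
    for _ in range(steps):
        headway = (np.roll(x, -1) - x) % road_length
        v_opt = np.tanh(headway - 2.0) + np.tanh(2.0)
        v += a * (v_opt - v) * dt
        x = (x + v * dt) % road_length
    return x, v

positions, velocities = simulate_ov_ring()
print("velocity spread (stop-and-go signature):", velocities.max() - velocities.min())
```

    The instability of such car-following models near uniform flow is one of the empirical adequacy questions at issue in the criticism above.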

  3. Criticism of generally accepted fundamentals and methodologies of traffic and transportation theory: A brief review

    NASA Astrophysics Data System (ADS)

    Kerner, Boris S.

    2013-11-01

    It is explained why the set of the fundamental empirical features of traffic breakdown (a transition from free flow to congested traffic) should be the empirical basis for any traffic and transportation theory that can be reliably used for control and optimization in traffic networks. It is shown that the generally accepted fundamentals and methodologies of traffic and transportation theory are not consistent with the set of the fundamental empirical features of traffic breakdown at a highway bottleneck. These fundamentals and methodologies include (i) the Lighthill-Whitham-Richards (LWR) theory, (ii) the General Motors (GM) model class (for example, the Herman, Gazis et al. GM model, Gipps's model, Payne's model, Newell's optimal velocity (OV) model, Wiedemann's model, the Bando et al. OV model, Treiber's IDM, and Krauß's model), (iii) the understanding of highway capacity as a particular (fixed or stochastic) value, and (iv) principles for traffic and transportation network optimization and control (for example, Wardrop's user equilibrium (UE) and system optimum (SO) principles). As an alternative to these generally accepted fundamentals and methodologies, we discuss the three-phase traffic theory as the basis for traffic flow modeling, and briefly consider the network breakdown minimization (BM) principle for the optimization of traffic and transportation networks with road bottlenecks.

  4. Testing Our Fundamental Assumptions

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-06-01

    Science is all about testing the things we take for granted, including some of the most fundamental aspects of how we understand our universe. Is the speed of light in a vacuum the same for all photons regardless of their energy? Is the rest mass of a photon actually zero? A series of recent studies explore the possibility of using transient astrophysical sources for tests! Explaining different arrival times: suppose you observe a distant transient astrophysical source, like a gamma-ray burst or a flare from an active nucleus, and two photons of different energies arrive at your telescope at different times. [Figure: artist's illustration of a gamma-ray burst, another extragalactic transient, in a star-forming region. NASA/Swift/Mary Pat Hrybyk-Keith and John Jones] This difference in arrival times could be due to several factors, depending on how deeply you want to question our fundamental assumptions about physics: an intrinsic delay (the photons may simply have been emitted at two different times by the astrophysical source); a delay due to Lorentz invariance violation (perhaps the assumption that all massless particles, even two photons with different energies, move at exactly the same velocity in a vacuum is incorrect); a special-relativistic delay (maybe there is a universal speed for massless particles, but the assumption that photons have zero rest mass is wrong; this, too, would make photon velocities energy-dependent); or a delay due to the gravitational potential (perhaps our understanding of the gravitational potential that the photons experience as they travel is incorrect, which would mean that Einstein's equivalence principle, a fundamental tenet of general relativity (GR), is incorrect). If we now turn this problem around, then by measuring the arrival-time delay between photons of different energies from various astrophysical sources (the farther away, the better) we can place constraints on these fundamental assumptions.
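
    The special-relativistic case can be made quantitative. If photons had a tiny rest mass m_γ, then for energies E ≫ m_γc² the arrival-time difference over a (non-cosmological) source distance D between photon energies E₁ < E₂ would be approximately (a standard kinematic estimate, not a result quoted from these studies):

    $$ \Delta t \;\simeq\; \frac{D}{2c}\,\bigl(m_\gamma c^2\bigr)^2 \left(\frac{1}{E_1^2} - \frac{1}{E_2^2}\right). $$

    Because the delay scales as 1/E², the lowest-energy photons dominate the constraint, and larger D tightens it, which is why distant transients are such attractive test beds.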

  5. Periodic minimal surfaces

    NASA Astrophysics Data System (ADS)

    Mackay, Alan L.

    1985-04-01

    A minimal surface is one for which, like a soap film with the same pressure on each side, the mean curvature is zero and, thus, is one where the two principal curvatures are equal and opposite at every point. For every closed circuit in the surface, the area is a minimum. Schwarz [1] and Neovius [2] showed that elements of such surfaces could be put together to give surfaces periodic in three dimensions. These periodic minimal surfaces are geometrical invariants, as are the regular polyhedra, but the former are curved. Minimal surfaces are appropriate for the description of various structures where internal surfaces are prominent and seek to adopt a minimum area or a zero mean curvature subject to their topology; thus they merit more complete numerical characterization. There seem to be at least 18 such surfaces [3], with various symmetries and topologies, related to the crystallographic space groups. Recently, glyceryl mono-oleate (GMO) was shown by Longley and McIntosh [4] to take the shape of the F-surface. The structure postulated is shown here to be in good agreement with an analysis of the fundamental geometry of periodic minimal surfaces.
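
    For numerical work of this kind, a convenient starting point is the well-known nodal (level-set) approximation of the Schwarz P surface, the zero set of cos x + cos y + cos z. The sketch below samples it on one cubic unit cell (the trigonometric approximation is standard in the literature but is not an exact minimal surface, and the tolerance band is an arbitrary choice):

```python
import numpy as np

def schwarz_p(x, y, z):
    """Nodal approximation of the Schwarz P periodic minimal surface:
    the zero level set of cos x + cos y + cos z."""
    return np.cos(x) + np.cos(y) + np.cos(z)

# Sample one 2*pi-periodic unit cell and count grid points near the surface
grid = np.linspace(0.0, 2.0 * np.pi, 60)
X, Y, Z = np.meshgrid(grid, grid, grid, indexing="ij")
F = schwarz_p(X, Y, Z)
near_surface = np.abs(F) < 0.05            # tolerance band around the level set
print("grid points near the P surface:", int(near_surface.sum()))
```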

  6. The ZOOM minimization package

    SciTech Connect

    Fischler, Mark S.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.

  7. Contact mechanisms and design principles for Schottky contacts to group-III nitrides

    NASA Astrophysics Data System (ADS)

    Mohammad, S. Noor

    2005-03-01

    Contact mechanisms and design principles for Schottky contacts to group-III nitrides have been studied. These contacts, generally made using simple principles and past experience, suffer from serious drawbacks. The importance to these contacts of various parameters, such as surface morphology, surface treatment, metal/semiconductor interactions at the interface, thermal stability, minimization of doping caused by metal deposition and etching, and elimination of the edge electric field, has been thoroughly investigated. Several design principles have been proposed. Both theoretical and experimental data are presented to justify the validity of the proposed contact mechanisms and design principles. While the theoretical calculations provide the fundamental physics underlying heavy doping, leakage, etc., the experimental data provide verification of the contact mechanisms and design principles. The proposed principles are general enough to be applicable to most, if not all, Schottky contacts.

  8. Fundamentals of the Control of Gas-Turbine Power Plants for Aircraft. Part 2; Principles of Control Common to Jet, Turbine-Propeller Jet, and Ducted-Fan Jet Power Plants

    NASA Technical Reports Server (NTRS)

    Kuehl, H.

    1947-01-01

    After defining the aims and requirements to be set for a control system of gas-turbine power plants for aircraft, the report will deal with devices that prevent the quantity of fuel supplied per unit of time from exceeding the value permissible at a given moment. The general principles of the actuation of the adjustable parts of the power plant are also discussed.

  9. Towards a Minimal System for Cell Division

    NASA Astrophysics Data System (ADS)

    Schwille, Petra

    We have entered the "omics" era of the life sciences, meaning that our general knowledge about biological systems has become vast, complex, and almost impossible to fully comprehend. Consequently, the challenge for quantitative biology and biophysics is to identify appropriate procedures and protocols that allow the researcher to strip down the complexity of a biological system to a level that can be reliably modeled but still retains the essential features of its "real" counterpart. The virtue of physics has always been the reductionist approach, which allowed scientists to identify the underlying basic principles of seemingly complex phenomena, and subject them to rigorous mathematical treatment. Biological systems are obviously among the most complex phenomena we can think of, and it is fair to state that our rapidly increasing knowledge does not make it easier to identify a small set of fundamental principles of the big concept of "life" that can be defined and quantitatively understood. Nevertheless, it is becoming evident that only by tight cooperation and interdisciplinary exchange between the life sciences and quantitative sciences, and by applying intelligent reductionist approaches also to biology, will we be able to meet the intellectual challenges of the twenty-first century. These include not only the collection and proper categorization of the data, but also their true understanding and harnessing such that we can solve important practical problems imposed by medicine or the worldwide need for new energy sources. Many of these approaches are reflected by the modern buzzword "synthetic biology"; therefore, I briefly discuss this term in the first section. Further, I outline some endeavors of our and other groups to model minimal biological systems, with particular focus on the possibility of generating a minimal system for cell division.

  10. GRBs and Fundamental Physics

    NASA Astrophysics Data System (ADS)

    Petitjean, Patrick; Wang, F. Y.; Wu, X. F.; Wei, J. J.

    2016-12-01

    Gamma-ray bursts (GRBs) are short, intense flashes at cosmological distances and the most luminous explosions in the Universe. The high luminosities of GRBs make them detectable out to the edge of the visible universe, so they are unique tools for probing the properties of the high-redshift universe, including the cosmic expansion and dark energy, the star formation rate, the reionization epoch, and the metal evolution of the Universe. First, they can be used to constrain the history of cosmic acceleration and the evolution of dark energy in a redshift range hardly achievable by other cosmological probes. Second, long GRBs are believed to be formed by the collapse of massive stars, so they can be used to derive the high-redshift star formation rate, which cannot be probed by current observations. Moreover, the use of GRBs as cosmological tools could unveil the reionization history and metal evolution of the Universe, the properties of the intergalactic medium (IGM), and the nature of the first stars in the early universe. Beyond that, GRB high-energy photons can be used to constrain Lorentz invariance violation (LIV) and to test Einstein's Equivalence Principle (EEP). In this paper, we review progress on GRB cosmology and fundamental physics probed by GRBs.

  11. Fundamentals of Cryogenics

    NASA Technical Reports Server (NTRS)

    Johnson, Wesley; Tomsik, Thomas; Moder, Jeff

    2014-01-01

    Analysis of the extreme conditions encountered in cryogenic systems demands great effort from analysts and engineers. Due to the costs and complexity associated with the extremely cold temperatures involved, testing is sometimes minimized and extra analysis is often relied upon. This short course is designed as an introduction to cryogenic engineering and analysis; it is intended to introduce the basic concepts related to cryogenic analysis and testing, and to help the analyst understand the impacts of various requests on a test facility. Discussion will revolve around operational functions often found in cryogenic systems, hardware for both tests and facilities, and the design and modeling tools available for performing the analysis. Emphasis will be placed on which hardware and analysis tools to use in which scenarios to obtain the desired results. The class will provide a review of first principles, engineering practices, and relations directly applicable to this subject, including such topics as cryogenic fluids, thermodynamics and heat transfer, material properties at low temperature, insulation, cryogenic equipment, instrumentation, refrigeration, testing of cryogenic systems, cryogenic safety, and the typical thermal and fluid analyses used by the engineer. The class will provide references for further learning on various topics in cryogenics for those who want to dive deeper into the subject or have encountered specific problems.
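
    As a flavor of the first-principles heat-transfer relations such a course covers, here is a minimal sketch of a steady conductive heat-leak estimate through a support member via Fourier's law (the geometry and mean-conductivity value are illustrative assumptions, not numbers from the course):

```python
def conductive_heat_leak(k_mean, area, t_hot, t_cold, length):
    """Steady conductive heat leak (W) through a member of constant cross
    section, Q = k_mean * A * (T_hot - T_cold) / L, using a thermal
    conductivity averaged over the temperature range."""
    return k_mean * area * (t_hot - t_cold) / length

# Example: stainless-steel support from 300 K to 77 K (k_mean ~ 12 W/m-K assumed)
q = conductive_heat_leak(k_mean=12.0, area=1.0e-4, t_hot=300.0, t_cold=77.0, length=0.25)
print(f"heat leak: {q:.2f} W")
```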

  12. Basic Comfort Heating Principles.

    ERIC Educational Resources Information Center

    Dempster, Chalmer T.

    The material in this beginning book for vocational students presents fundamental principles needed to understand the heating aspect of the sheet metal trade and supplies practical experience to the student so that he may become familiar with the process of determining heat loss for average structures. Six areas covered are: (1) Background…

  13. Bernoulli's Principle

    ERIC Educational Resources Information Center

    Hewitt, Paul G.

    2004-01-01

    Some teachers have difficulty understanding Bernoulli's principle particularly when the principle is applied to the aerodynamic lift. Some teachers favor using Newton's laws instead of Bernoulli's principle to explain the physics behind lift. Some also consider Bernoulli's principle too difficult to explain to students and avoid teaching it…

  14. Plasma Modeling Enabled Technology Development Empowered by Fundamental Scattering Data

    NASA Astrophysics Data System (ADS)

    Kushner, Mark J.

    2016-05-01

    Technology development increasingly relies on modeling to speed the innovation cycle. This is particularly true for systems using low temperature plasmas (LTPs) and their role in enabling energy efficient processes with minimal environmental impact. In the innovation cycle, LTP modeling supports investigation of fundamental processes that seed the cycle, optimization of newly developed technologies, and prediction of performance of unbuilt systems for new applications. Although proof-of-principle modeling may be performed for idealized systems in simple gases, technology development must address physically complex systems that use complex gas mixtures that now may be multi-phase (e.g., in contact with liquids). The variety of fundamental electron and ion scattering, and radiation transport data (FSRD) required for this modeling increases as the innovation cycle progresses, while the accuracy required of that data depends on the intended outcome. In all cases, the fidelity, depth, and impact of the modeling depend on the availability of FSRD. Modeling and technology development are, in fact, empowered by the availability and robustness of FSRD. In this talk, examples of the impact of and requirements for FSRD in the innovation cycle enabled by plasma modeling will be discussed using results from multidimensional and global models. Examples of fundamental studies and technology optimization will focus on microelectronics fabrication and on optically pumped lasers. Modeling of systems as yet unbuilt will address the interaction of atmospheric pressure plasmas with liquids. Work supported by DOE Office of Fusion Energy Science and the National Science Foundation.

  15. Combustion Fundamentals Research

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Increased emphasis is placed on fundamental and generic research at Lewis Research Center, with less effort on systems development. This is especially true in combustion research, where the study of combustion fundamentals has grown significantly in order to better address the perceived long-term technical needs of the aerospace industry. The main thrusts for this combustion fundamentals program are as follows: analytical models of combustion processes, model verification experiments, fundamental combustion experiments, and advanced numerical techniques.

  16. Exchange Rates and Fundamentals.

    ERIC Educational Resources Information Center

    Engel, Charles; West, Kenneth D.

    2005-01-01

    We show analytically that in a rational expectations present-value model, an asset price manifests near-random walk behavior if fundamentals are I(1) and the factor for discounting future fundamentals is near one. We argue that this result helps explain the well-known puzzle that fundamental variables such as relative money supplies, outputs,…
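
    A standard way to write the present-value relation underlying this kind of result (a generic form given for illustration; the paper's exact specification may differ) is:

    $$ p_t \;=\; (1-b)\sum_{j=0}^{\infty} b^{\,j}\,\mathbb{E}_t\, f_{t+j}, $$

    where p_t is the asset price, f_t the fundamentals, and b the discount factor. If f_t is I(1) and b is near one, the weight on expected far-future fundamentals dominates, and changes in p_t become nearly unpredictable; that is, the price approximates a random walk.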

  17. Reconstruction of fundamental SUSY parameters

    SciTech Connect

    P. M. Zerwas et al.

    2003-09-25

    We summarize methods and expected accuracies in determining the basic low-energy SUSY parameters from experiments at future e+e- linear colliders in the TeV energy range, combined with results from LHC. In a second step we demonstrate how, based on this set of parameters, the fundamental supersymmetric theory can be reconstructed at high scales near the grand unification or Planck scale. These analyses have been carried out for minimal supergravity [confronted with GMSB for comparison], and for a string effective theory.

  18. The Fundamental Manifold of Spheroids

    NASA Astrophysics Data System (ADS)

    Zaritsky, Dennis; Gonzalez, Anthony H.; Zabludoff, Ann I.

    2006-02-01

    We present a unifying empirical description of the structural and kinematic properties of all spheroids embedded in dark matter halos. We find that the intracluster stellar spheroidal components of galaxy clusters, which we call cluster spheroids (CSphs) and which are typically 100 times the size of normal elliptical galaxies, lie on a "fundamental plane" as tight as that defined by elliptical galaxies (rms in effective radius of ~0.07) but having a different slope. The slope, as measured by the coefficient of the log σ term, declines significantly and systematically between the fundamental planes of ellipticals, brightest cluster galaxies (BCGs), and CSphs. We attribute this decline primarily to a continuous change in M_e/L_e, the mass-to-light ratio within the effective radius r_e, with spheroid scale. The magnitude of the slope change requires that it arise principally from differences in the relative distributions of luminous and dark matter, rather than from stellar population differences such as in age and metallicity. By expressing the M_e/L_e term as a function of σ in the simple derivation of the fundamental plane and requiring the behavior of that term to mimic the observed nonlinear relationship between log M_e/L_e and log σ, we simultaneously fit a two-dimensional manifold to the measured properties of dwarf elliptical and elliptical galaxies, BCGs, and CSphs. The combined data have an rms scatter in log r_e of 0.114 (0.099 for the combination of ellipticals, BCGs, and CSphs), which is modestly larger than each fundamental plane has alone, but which includes the scatter introduced by merging different studies done in different filters by different investigators. This "fundamental manifold" fits the structural and kinematic properties of spheroids that span a factor of 100 in σ and 1000 in r_e. While our mathematical form is neither unique nor derived from physical principles, the tightness of the fit leaves little room for improvement by other unification schemes.
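
    For reference, the fundamental plane referred to here is conventionally written as a scaling relation of the following form (this generic expression is standard in the literature and is given for illustration; the paper's manifold generalizes the M_e/L_e term rather than using fixed coefficients):

    $$ \log r_e \;=\; a\,\log\sigma \;+\; b\,\log\langle I \rangle_e \;+\; c, $$

    where r_e is the effective radius, σ the velocity dispersion, and ⟨I⟩_e the mean surface brightness within r_e. The abstract's point is that the coefficient a is not universal: it declines from ellipticals to BCGs to CSphs, which the authors trace to the scale dependence of M_e/L_e.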

  19. The New Minimal Standard Model

    SciTech Connect

    Davoudiasl, Hooman; Kitano, Ryuichiro; Li, Tianjun; Murayama, Hitoshi

    2005-01-13

    We construct the New Minimal Standard Model that incorporates the new discoveries of physics beyond the Minimal Standard Model (MSM): Dark Energy, non-baryonic Dark Matter, neutrino masses, as well as baryon asymmetry and cosmic inflation, adopting the principle of minimal particle content and the most general renormalizable Lagrangian. We base the model purely on empirical facts rather than aesthetics. We need only six new degrees of freedom beyond the MSM. It is free from excessive flavor-changing effects, CP violation, too-rapid proton decay, problems with electroweak precision data, and unwanted cosmological relics. Any model of physics beyond the MSM should be measured against the phenomenological success of this model.

  20. A Fundamental Theorem on Particle Acceleration

    SciTech Connect

    Xie, Ming

    2003-05-01

    A fundamental theorem on particle acceleration is derived from the reciprocity principle of electromagnetism, and a rigorous proof of the theorem is presented. The theorem establishes a relation between acceleration and radiation, which is particularly useful for insightful understanding of, and practical calculation for, first-order acceleration, in which the energy gain of the accelerated particle is linearly proportional to the accelerating field.

  1. Fundamental Ideas: Rethinking Computer Science Education.

    ERIC Educational Resources Information Center

    Schwill, Andreas

    1997-01-01

    Describes a way to teach computer science based on J.S. Bruner's psychological framework. This educational philosophy has been integrated into two German federal state schools. One way to cope with the rapid developments in computer science is to teach the fundamental ideas, principles, methods, and ways of thinking to K-12 students. (PEN)

  2. Discrete minimal flavor violation

    SciTech Connect

    Zwicky, Roman; Fischbacher, Thomas

    2009-10-01

    We investigate the consequences of replacing the global flavor symmetry of minimal flavor violation (MFV), SU(3)_Q × SU(3)_U × SU(3)_D × ···, by a discrete D_Q × D_U × D_D × ··· symmetry. Goldstone bosons resulting from the breaking of the flavor symmetry generically lead to bounds on new flavor structure many orders of magnitude above the TeV scale. The absence of Goldstone bosons for discrete symmetries constitutes the primary motivation of our work. Less symmetry implies further invariants, renders the mass-flavor basis transformation observable in principle, and calls for a hierarchy in the Yukawa matrix expansion. We show, through the dimension of the representations, that the (discrete) symmetry in principle does allow for additional ΔF=2 operators. If, though, the ΔF=2 transitions are generated by two subsequent ΔF=1 processes, as, for example, in the standard model, then the four crystal-like groups Σ(168) ≅ PSL(2,F_7), Σ(72φ), Σ(216φ), and especially Σ(360φ) do provide enough protection for a TeV-scale discrete MFV scenario. Models where this is not the case have to be investigated case by case. Interestingly, Σ(216φ) has a (nonfaithful) representation corresponding to an A_4 symmetry. Moreover, we argue that the, apparently often omitted, (D) groups are subgroups of an appropriate Δ(6g²). We would like to stress that we do not provide an actual model that realizes the MFV scenario, nor any other theory of flavor.

  3. Esophagectomy - minimally invasive

    MedlinePlus

    Minimally invasive esophagectomy; Robotic esophagectomy; Removal of the esophagus - minimally invasive; Achalasia - esophagectomy; Barrett esophagus - esophagectomy; Esophageal cancer - esophagectomy - laparoscopic; Cancer of the ...

  4. Sensors, Volume 1, Fundamentals and General Aspects

    NASA Astrophysics Data System (ADS)

    Grandke, Thomas; Ko, Wen H.

    1996-12-01

    'Sensors' is the first self-contained series to deal with the whole area of sensors. It describes general aspects, technical and physical fundamentals, construction, function, applications and developments of the various types of sensors. This volume deals with the fundamentals and common principles of sensors and covers the wide areas of principles, technologies, signal processing, and applications. Contents include: Sensor Fundamentals, e.g. Sensor Parameters, Modeling, Design and Packaging; Basic Sensor Technologies, e.g. Thin and Thick Films, Integrated Magnetic Sensors, Optical Fibres and Integrated Optics, Ceramics and Oxides; Sensor Interfaces, e.g. Signal Processing, Multisensor Signal Processing, Smart Sensors, Interface Systems; Sensor Applications, e.g. Automotive: On-board Sensors, Traffic Surveillance and Control, Home Appliances, Environmental Monitoring, etc. This volume is an indispensable reference work and textbook for both specialists and newcomers, researchers and developers.

  5. A correspondence principle

    NASA Astrophysics Data System (ADS)

    Hughes, Barry D.; Ninham, Barry W.

    2016-02-01

    A single mathematical theme underpins disparate physical phenomena in classical, quantum and statistical mechanical contexts. This mathematical "correspondence principle", a kind of wave-particle duality with glorious realizations in classical and modern mathematical analysis, embodies fundamental geometrical and physical order, and yet in some sense sits on the edge of chaos. Illustrative cases discussed are drawn from classical and anomalous diffusion, quantum mechanics of single particles and ideal gases, quasicrystals and Casimir forces.

  6. Fundamental Physical Constants

    National Institute of Standards and Technology Data Gateway

    SRD 121 CODATA Fundamental Physical Constants (Web, free access)   This site, developed in the Physics Laboratory at NIST, addresses three topics: fundamental physical constants, the International System of Units (SI), which is the modern metric system, and expressing the uncertainty of measurement results.

  7. Fundamentals of Physics

    NASA Astrophysics Data System (ADS)

    Halliday, David; Resnick, Robert; Walker, Jearl

    2003-01-01

    No other book on the market today can match the success of Halliday, Resnick and Walker's Fundamentals of Physics! In a breezy, easy-to-understand style the book offers a solid understanding of fundamental physics concepts, and helps readers apply this conceptual understanding to quantitative problem solving.

  8. Software Engineering Principles.

    DTIC Science & Technology

    1980-07-01

    ... but many differences as well. Project goal: develop a family of military message systems using software engineering principles and provide a useful product to ... The hard copy could then be manually scanned, distributed, and logged. SMP would be useful in developing and testing MP. It would provide minimal ... design decisions. C. Alternative ways to develop the program: 1. Start from scratch. 2. Start with Stage 3; scan line by line and make required changes. 3. ...

  9. Principled Narrative

    ERIC Educational Resources Information Center

    MacBeath, John; Swaffield, Sue; Frost, David

    2009-01-01

    This article provides an overview of the "Carpe Vitam: Leadership for Learning" project, accounting for its provenance and purposes, before focusing on the principles for practice that constitute an important part of the project's legacy. These principles framed the dialogic process that was a dominant feature of the project and are presented,…

  10. Minimal Cells-Real and Imagined.

    PubMed

    Glass, John I; Merryman, Chuck; Wise, Kim S; Hutchison, Clyde A; Smith, Hamilton O

    2017-03-27

    A minimal cell is one whose genome only encodes the minimal set of genes necessary for the cell to survive. Scientific reductionism postulates the best way to learn the first principles of cellular biology would be to use a minimal cell in which the functions of all genes and components are understood. The genes in a minimal cell are, by definition, essential. In 2016, synthesis of a genome comprised of only the set of essential and quasi-essential genes encoded by the bacterium Mycoplasma mycoides created a near-minimal bacterial cell. This organism performs the cellular functions common to all organisms. It replicates DNA, transcribes RNA, translates proteins, undergoes cell division, and little else. In this review, we examine this organism and contrast it with other bacteria that have been used as surrogates for a minimal cell.

  11. Minimal covering problem and PLA minimization

    SciTech Connect

    Young, M.H.; Muroga, S.

    1985-12-01

    Solving the minimal covering problem by an implicit enumeration method is discussed. The implicit enumeration method in this paper is a modification of the Quine-McCluskey method tailored to computer processing and also its extension, utilizing some new properties of the minimal covering problem for speedup. A heuristic algorithm is also presented to solve large-scale problems. Its application to the minimization of programmable logic arrays (i.e., PLAs) is shown as an example. Computational experiences are presented to confirm the improvements by the implicit enumeration method discussed.
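
    To make the covering problem concrete, here is a minimal sketch of the classic greedy heuristic for set covering (a generic illustration of the problem the paper attacks; the paper itself uses implicit enumeration, i.e. branch-and-bound, not this heuristic):

```python
def greedy_cover(universe, subsets):
    """Greedy heuristic for the minimal covering problem: repeatedly pick the
    subset covering the most still-uncovered elements. In PLA minimization the
    'elements' are minterms and the 'subsets' are prime implicants."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(subsets, key=lambda s: len(uncovered & subsets[s]))
        if not uncovered & subsets[best]:
            raise ValueError("universe cannot be covered")
        chosen.append(best)
        uncovered -= subsets[best]
    return chosen

# Toy instance: minterms 0..5, candidate implicants A..D
implicants = {"A": {0, 1, 2}, "B": {2, 3}, "C": {3, 4, 5}, "D": {1, 4}}
print(greedy_cover(range(6), implicants))   # ['A', 'C']
```

    Exact methods such as the implicit enumeration described in the paper trade this heuristic's speed for a guaranteed minimal cover.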

  12. Prepulse minimization in KALI-5000.

    PubMed

    Kumar, D Durga Praveen; Mitra, S; Senthil, K; Sharma, Vishnu K; Singh, S K; Roy, A; Sharma, Archana; Nagesh, K V; Chakravarthy, D P

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given.

  13. Prepulse minimization in KALI-5000

    NASA Astrophysics Data System (ADS)

    Kumar, D. Durga Praveen; Mitra, S.; Senthil, K.; Sharma, Vishnu K.; Singh, S. K.; Roy, A.; Sharma, Archana; Nagesh, K. V.; Chakravarthy, D. P.

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given.

  14. Fundamental principles of writing a successful grant proposal.

    PubMed

    Chung, Kevin C; Shauver, Melissa J

    2008-04-01

    It is important for the field of hand surgery to develop a new generation of surgeon-scientists who can produce high-impact studies to raise the profile of this specialty. To this end, organizations such as the American Society for Surgery of the Hand have initiated programs to promote multicenter clinical research that can be competitive for fiscal support from the National Institutes of Health and other funding agencies. Crafting a well-structured grant proposal is critical to securing adequate funding to investigate the many clinical and basic science questions in hand surgery. In this article, we present the key elements of a successful grant proposal to help potential applicants to navigate the complex pathways in the grant application process.

  15. Flat-panel volume CT: fundamental principles, technology, and applications.

    PubMed

    Gupta, Rajiv; Cheung, Arnold C; Bartling, Soenke H; Lisauskas, Jennifer; Grasruck, Michael; Leidecker, Christianne; Schmidt, Bernhard; Flohr, Thomas; Brady, Thomas J

    2008-01-01

    Flat-panel volume computed tomography (CT) systems have an innovative design that allows coverage of a large volume per rotation, fluoroscopic and dynamic imaging, and high spatial resolution that permits visualization of complex human anatomy such as fine temporal bone structures and trabecular bone architecture. In simple terms, flat-panel volume CT scanners can be thought of as conventional multidetector CT scanners in which the detector rows have been replaced by an area detector. The flat-panel detector has wide z-axis coverage that enables imaging of entire organs in one axial acquisition. Its fluoroscopic and angiographic capabilities are useful for intraoperative and vascular applications. Furthermore, the high-volume coverage and continuous rotation of the detector may enable depiction of dynamic processes such as coronary blood flow and whole-brain perfusion. Other applications in which flat-panel volume CT may play a role include small-animal imaging, nondestructive testing in animal survival surgeries, and tissue-engineering experiments. Such versatility has led some to predict that flat-panel volume CT will gain importance in interventional and intraoperative applications, especially in specialties such as cardiac imaging, interventional neuroradiology, orthopedics, and otolaryngology. However, the contrast resolution of flat-panel volume CT is slightly inferior to that of multidetector CT, a higher radiation dose is needed to achieve a comparable signal-to-noise ratio, and a slower scintillator results in a longer scanning time.

  16. Governing during an Institutional Crisis: 10 Fundamental Principles

    ERIC Educational Resources Information Center

    White, Lawrence

    2012-01-01

    In today's world, managing a campus crisis poses special challenges for an institution's governing board, which may operate some distance removed from the immediate events giving rise to the crisis. In its most challenging form, a campus crisis--a shooting, a natural disaster, a fraternity hazing death, the arrest of a prominent campus…

  17. Painting Victory: A Discussion of Leadership and Its Fundamental Principles.

    DTIC Science & Technology

    2007-11-02

    ... accordingly is instrumental to the art of leading. Consider how Leonardo da Vinci and Pablo Picasso perceive the human form in their own different and ... In contrast, a leader who is less confident with details may not create accomplishments as vivid. Leonardo da Vinci's vivid style and realism ...

  18. 47 CFR 36.2 - Fundamental principles underlying procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... PROPERTY COSTS, REVENUES, EXPENSES, TAXES AND RESERVES FOR TELECOMMUNICATIONS COMPANIES 1 General § 36.2... outlined in this part: (1) Separations are intended to apportion costs among categories or jurisdictions by... consideration to relative occupancy and relative time measurements. (3) In the development of “actual...

  19. 47 CFR 36.2 - Fundamental principles underlying procedures.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... seconds per call) in studies of traffic handled or work performed during a representative period for all... following treatment is applied: (1) In the case of property rented to affiliates, the property and related... case of property rented from affiliates, the property and related expenses are included with, and...

  20. 47 CFR 36.2 - Fundamental principles underlying procedures.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... seconds per call) in studies of traffic handled or work performed during a representative period for all... following treatment is applied: (1) In the case of property rented to affiliates, the property and related... case of property rented from affiliates, the property and related expenses are included with, and...

  1. About the ZOOM minimization package

    SciTech Connect

    Fischler, M.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.

  2. Levitated Optomechanics for Fundamental Physics

    NASA Astrophysics Data System (ADS)

    Rashid, Muddassar; Bateman, James; Vovrosh, Jamie; Hempston, David; Ulbricht, Hendrik

    2015-05-01

    Optomechanics with levitated nano- and microparticles is believed to form a platform for testing fundamental principles of quantum physics, as well as to find applications in sensing. We will report on a new scheme to trap nanoparticles, based on a parabolic mirror with a numerical aperture of 1. Combined with achromatic focussing, the setup is a cheap and straightforward solution for trapping nanoparticles for further study. Here, we report on the latest progress in experimentation with levitated nanoparticles, including the trapping of 100 nm nanodiamonds (with NV-centres) down to 1 mbar, as well as the trapping of 50 nm silica spheres down to 10^-4 mbar without any form of feedback cooling. We will also report on progress in implementing feedback stabilisation of the centre-of-mass motion of the trapped particle using digital electronics. Finally, we argue that such a stabilised particle trap can serve as the particle source for a nanoparticle matterwave interferometer. We will present our Talbot interferometer scheme, which holds promise to test the quantum superposition principle in the new mass range of 10^6 amu. EPSRC, John Templeton Foundation.

  3. Compression as a universal principle of animal behavior.

    PubMed

    Ferrer-i-Cancho, Ramon; Hernández-Fernández, Antoni; Lusseau, David; Agoramoorthy, Govindasamy; Hsu, Minna J; Semple, Stuart

    2013-01-01

    A key aim in biology and psychology is to identify fundamental principles underpinning the behavior of animals, including humans. Analyses of human language and the behavior of a range of non-human animal species have provided evidence for a common pattern underlying diverse behavioral phenomena: Words follow Zipf's law of brevity (the tendency of more frequently used words to be shorter), and conformity to this general pattern has been seen in the behavior of a number of other animals. It has been argued that the presence of this law is a sign of efficient coding in the information theoretic sense. However, no strong direct connection has been demonstrated between the law and compression, the information theoretic principle of minimizing the expected length of a code. Here, we show that minimizing the expected code length implies that the length of a word cannot increase as its frequency increases. Furthermore, we show that the mean code length or duration is significantly small in human language, and also in the behavior of other species in all cases where agreement with the law of brevity has been found. We argue that compression is a general principle of animal behavior that reflects selection for efficiency of coding.
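
    As a toy illustration of the argument that minimizing expected code length forces word length to be non-increasing in frequency, the sketch below checks every assignment of lengths to words for a small frequency distribution (the frequencies and lengths are invented for the example):

```python
from itertools import permutations

# Invented word frequencies (most to least frequent) and available code lengths
freqs = [0.5, 0.3, 0.15, 0.05]
lengths = [1, 2, 3, 4]

def expected_length(assignment):
    """Mean code length under a given frequency-to-length assignment."""
    return sum(f * l for f, l in zip(freqs, assignment))

best = min(permutations(lengths), key=expected_length)
print("optimal assignment:", best)   # (1, 2, 3, 4): shortest code, most frequent word
assert all(best[i] <= best[i + 1] for i in range(len(best) - 1))
```

    The optimum always pairs the shortest code with the most frequent word, which is the rearrangement-inequality core of the compression argument.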

  4. Minimal change disease

    MedlinePlus

    ... seen under a very powerful microscope called an electron microscope. Minimal change disease is the most common ... biopsy and examination of the tissue with an electron microscope can show signs of minimal change disease.

  5. Better Hyper-minimization

    NASA Astrophysics Data System (ADS)

    Maletti, Andreas

    Hyper-minimization aims to compute a minimal deterministic finite automaton (dfa) that recognizes the same language as a given dfa up to a finite number of errors. Algorithms for hyper-minimization that run in time O(n log n), where n is the number of states of the given dfa, have been reported recently in [Gawrychowski and Jeż: Hyper-minimisation made efficient. Proc. MFCS, LNCS 5734, 2009] and [Holzer and Maletti: An n log n algorithm for hyper-minimizing a (minimized) deterministic automaton. Theor. Comput. Sci. 411, 2010]. These algorithms are improved to return a hyper-minimal dfa that commits the least number of errors. This closes another open problem of [Badr, Geffert, and Shipman: Hyper-minimizing minimized deterministic finite state automata. RAIRO Theor. Inf. Appl. 43, 2009]. Unfortunately, the time complexity for the obtained algorithm increases to O(n²).
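
    For context, classical dfa minimization, which hyper-minimization refines by additionally merging states whose languages differ on only finitely many words, can be sketched with Moore's partition-refinement algorithm. This is a generic illustration, not the O(n log n) hyper-minimization algorithms cited above:

```python
def minimize_dfa(states, alphabet, delta, accepting):
    """Moore's partition refinement: start from the accepting/rejecting split
    and refine until no two states in a block disagree on any transition.
    delta[s][a] gives the successor of state s on symbol a."""
    block = {s: (s in accepting) for s in states}
    while True:
        sig = {s: (block[s], tuple(block[delta[s][a]] for a in alphabet))
               for s in states}
        groups = {}
        for s in states:
            groups.setdefault(sig[s], []).append(s)
        new_block = {s: i for i, ss in enumerate(groups.values()) for s in ss}
        if len(set(new_block.values())) == len(set(block.values())):
            return new_block                  # state -> equivalence class id
        block = new_block

# Toy dfa over {a}: states 1 and 2 are equivalent and get merged
delta = {0: {"a": 1}, 1: {"a": 2}, 2: {"a": 1}}
print(minimize_dfa([0, 1, 2], ["a"], delta, accepting={1, 2}))
```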

  6. The Fundamentals of an African American Value System.

    ERIC Educational Resources Information Center

    Alexander, E. Curtis

    The Nguzo Saba or "Seven Principles of Blackness" provide the fundamental basis for the development of an African American value system that is based on the cultural and historical particularisms of being Black in an American society that devalues Black efficacy and Black people. The fundamentals of this value system, foundational to the Kwanzaa…

  7. The Subordination of Aesthetic Fundamentals in College Art Instruction

    ERIC Educational Resources Information Center

    Lavender, Randall

    2003-01-01

    Opportunities for college students of art and design to study fundamentals of visual aesthetics, integrity of form, and principles of composition are limited today by a number of factors. With the well-documented prominence of postmodern critical theory in the world of contemporary art, the study of aesthetic fundamentals is largely subordinated…

  8. Increasingly minimal bias routing

    DOEpatents

    Bataineh, Abdulla; Court, Thomas; Roweth, Duncan

    2017-02-21

    A system and algorithm configured to generate diversity at the traffic source so that packets are uniformly distributed over all of the available paths, but to increase the likelihood of taking a minimal path with each hop the packet takes. This is achieved by configuring routing biases so as to prefer non-minimal paths at the injection point, but increasingly prefer minimal paths as the packet proceeds, referred to herein as Increasing Minimal Bias (IMB).
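
    The description above translates directly into a per-hop biased choice. Here is a minimal sketch of that idea (the linear bias schedule and the port model are assumptions for illustration, not the patented algorithm):

```python
import random

def choose_next_port(minimal_ports, non_minimal_ports, hops_taken, max_hops):
    """Pick an output port with a bias toward minimal paths that grows per hop:
    at injection (hops_taken == 0) non-minimal ports are preferred, spreading
    traffic over the available paths; as the packet proceeds, the probability
    of taking a minimal port rises toward 1."""
    if not non_minimal_ports:
        return random.choice(minimal_ports)
    minimal_bias = min(1.0, hops_taken / max_hops)   # grows from 0 toward 1
    if random.random() < minimal_bias:
        return random.choice(minimal_ports)
    return random.choice(non_minimal_ports)

# Example: by hop 3 of 4, the packet takes a minimal port 75% of the time
print(choose_next_port(["p0", "p1"], ["p2", "p3"], hops_taken=3, max_hops=4))
```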

  9. Arguing against fundamentality

    NASA Astrophysics Data System (ADS)

    McKenzie, Kerry

    This paper aims to open up discussion on the relationship between fundamentality and naturalism, and in particular on the question of whether fundamentality may be denied on naturalistic grounds. A historico-inductive argument for an anti-fundamentalist conclusion, prominent within the contemporary metaphysical literature, is examined; finding it wanting, an alternative 'internal' strategy is proposed. By means of an example from the history of modern physics - namely S-matrix theory - it is demonstrated that (1) this strategy can generate similar (though not identical) anti-fundamentalist conclusions on more defensible naturalistic grounds, and (2) that fundamentality questions can be empirical questions. Some implications and limitations of the proposed approach are discussed.

  10. Fundamentals of fluid sealing

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamentals of fluid sealing, including seal operating regimes, are discussed and the general fluid-flow equations for fluid sealing are developed. Seal performance parameters such as leakage and power loss are presented. Included in the discussion are the effects of geometry, surface deformations, rotation, and both laminar and turbulent flows. The concept of pressure balancing is presented, as are differences between liquid and gas sealing. Mechanisms of seal surface separation, fundamental friction and wear concepts applicable to seals, seal materials, and pressure-velocity (PV) criteria are discussed.

  11. Quantum correlations are tightly bound by the exclusivity principle.

    PubMed

    Yan, Bin

    2013-06-28

    It is a fundamental problem in physics to determine what principle limits the correlations predicted by our current description of nature, based on quantum mechanics. One possible explanation is the "global exclusivity" principle recently discussed in Phys. Rev. Lett. 110, 060402 (2013). In this work we show that this principle actually places a much stronger restriction on the probability distribution. We provide a tight constraint inequality imposed by this principle and prove that this principle singles out quantum correlations in scenarios represented by any graph. Our result implies that the exclusivity principle might be one of the fundamental principles of nature.

  12. Counting Attribute Blocks: Constructing Meaning for the Multiplication Principle.

    ERIC Educational Resources Information Center

    Bird, Elliott

    2000-01-01

    Presents an activity in which attribute blocks help middle school students understand the fundamental counting principle and the multiplication rule. Demonstrates how these materials can aid students in building a conceptual understanding of multiplication and the counting principle. (ASK)
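
    Since the multiplication principle is exactly what the activity targets, a short sketch may help: enumerate a hypothetical attribute-block set and confirm that the count equals the product of the attribute sizes (the particular shapes, colors, sizes, and thicknesses below are assumptions for illustration):

```python
from itertools import product

# Hypothetical attribute-block set; the attribute values are illustrative
shapes = ["circle", "square", "triangle", "rectangle", "hexagon"]
colors = ["red", "yellow", "blue"]
sizes = ["large", "small"]
thicknesses = ["thick", "thin"]

blocks = list(product(shapes, colors, sizes, thicknesses))

# Multiplication principle: total count is the product of the choices per attribute
assert len(blocks) == len(shapes) * len(colors) * len(sizes) * len(thicknesses)
print(len(blocks))  # 60 distinct blocks
```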

  13. Food Service Fundamentals.

    ERIC Educational Resources Information Center

    Marine Corps Inst., Washington, DC.

    Developed as part of the Marine Corps Institute (MCI) correspondence training program, this course on food service fundamentals is designed to provide a general background in the basic aspects of the food service program in the Marine Corps; it is adaptable for nonmilitary instruction. Introductory materials include specific information for MCI…

  14. Unification of Fundamental Forces

    NASA Astrophysics Data System (ADS)

    Salam, Abdus; Taylor, Foreword by John C.

    2005-10-01

    Foreword John C. Taylor; 1. Unification of fundamental forces Abdus Salam; 2. History unfolding: an introduction to the two 1968 lectures by W. Heisenberg and P. A. M. Dirac Abdus Salam; 3. Theory, criticism, and a philosophy Werner Heisenberg; 4. Methods in theoretical physics Paul Adrian Maurice Dirac.

  15. Basic Publication Fundamentals.

    ERIC Educational Resources Information Center

    Savedge, Charles E., Ed.

    Designed for students who produce newspapers and newsmagazines in junior high, middle, and elementary schools, this booklet is both a scorebook and a fundamentals text. The scorebook provides realistic criteria for judging publication excellence at these educational levels. All the basics for good publications are included in the text of the…

  16. Reading Is Fundamental, 1977.

    ERIC Educational Resources Information Center

    Smithsonian Institution, Washington, DC. National Reading is Fun-damental Program.

    Reading Is Fundamental (RIF) is a national, nonprofit organization designed to motivate children to read by making a wide variety of inexpensive books available to them and allowing the children to choose and keep books that interest them. This annual report for 1977 contains the following information on the RIF project: an account of the…

  17. Laser Fundamentals and Experiments.

    ERIC Educational Resources Information Center

    Van Pelt, W. F.; And Others

    As a result of work performed at the Southwestern Radiological Health Laboratory with respect to lasers, this manual was prepared in response to the increasing use of lasers in high schools and colleges. It is directed primarily toward the high school instructor who may use the text for a short course in laser fundamentals. The definition of the…

  18. The Fundamental Property Relation.

    ERIC Educational Resources Information Center

    Martin, Joseph J.

    1983-01-01

    Discusses a basic equation in thermodynamics (the fundamental property relation), focusing on a logical approach to the development of the relation where effects other than thermal, compression, and exchange of matter with the surroundings are considered. Also demonstrates erroneous treatments of the relation in three well-known textbooks. (JN)

  19. Fundamentals of Library Instruction

    ERIC Educational Resources Information Center

    McAdoo, Monty L.

    2012-01-01

    Being a great teacher is part and parcel of being a great librarian. In this book, veteran instruction services librarian McAdoo lays out the fundamentals of the discipline in easily accessible language. Succinctly covering the topic from top to bottom, he: (1) Offers an overview of the historical context of library instruction, drawing on recent…

  20. Fundamentals of soil science

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study guide provides comments and references for professional soil scientists who are studying for the soil science fundamentals exam needed as the first step for certification. The performance objectives were determined by the Soil Science Society of America's Council of Soil Science Examiners...

  1. Fundamentals of Solid Lubrication

    DTIC Science & Technology

    2012-03-01

    During this program, we have worked to develop a fundamental understanding of the chemical and tribological issues related to...approach, tribological measurements performed over a range of length scales, and the correlation of the two classes of information. Research activities...correlated measurements of surface composition and environmentally specific tribological performance of thin film solid lubricants. • Correlate shear

  2. Fundamentals of Refrigeration.

    ERIC Educational Resources Information Center

    Sutliff, Ronald D.; And Others

    This self-study course is designed to familiarize Marine enlisted personnel with the principles of the refrigeration process. The course contains five study units. Each study unit begins with a general objective, which is a statement of what the student should learn from the unit. The study units are divided into numbered work units, each…

  3. Fundaments of plant cybernetics.

    PubMed

    Zucconi, F

    2001-01-01

    A systemic approach is proposed for analyzing plants' physiological organization and cybernesis. To this end, the plant is inspected as a system, starting from the integration of crown and root systems, and its impact on a number of basic epigenetic events. The approach proves to be axiomatic and facilitates the definition of the principles behind the plant's autonomous control of growth and reproduction.

  4. FUNDAMENTALS OF TELEVISION SYSTEMS.

    ERIC Educational Resources Information Center

    Kessler, William J.

    Designed for a reader without special technical knowledge, this illustrated resource paper explains the components of a television system and relates them to the complete system. Subjects discussed are the following: studio organization and compatible color television principles, wired and radio transmission systems, direct view and projection…

  5. Free-energy minimization and the dark-room problem.

    PubMed

    Friston, Karl; Thornton, Christopher; Clark, Andy

    2012-01-01

    Recent years have seen the emergence of an important new fundamental theory of brain function. This theory brings information-theoretic, Bayesian, neuroscientific, and machine learning approaches into a single framework whose overarching principle is the minimization of surprise (or, equivalently, the maximization of expectation). The most comprehensive such treatment is the "free-energy minimization" formulation due to Karl Friston (see e.g., Friston and Stephan, 2007; Friston, 2010a,b - see also Fiorillo, 2010; Thornton, 2010). A recurrent puzzle raised by critics of these models is that biological systems do not seem to avoid surprises. We do not simply seek a dark, unchanging chamber, and stay there. This is the "Dark-Room Problem." Here, we describe the problem and further unpack the issues to which it speaks. Using the same format as the prolog of Eddington's Space, Time, and Gravitation (Eddington, 1920) we present our discussion as a conversation between: an information theorist (Thornton), a physicist (Friston), and a philosopher (Clark).

  6. Principles of electromagnetic theory

    SciTech Connect

    Kovetz, A.H.

    1990-01-01

    This book emphasizes the fundamental understanding of the laws governing the behavior of charge and current carrying bodies. Electromagnetism is presented as a classical theory, based-like mechanics-on principles that are independent of the atomic constitution of matter. This book is unique among electromagnetic texts in its treatment of the precise manner in which electromagnetism is linked to mechanics and thermodynamics. Applications include electrostriction, piezoelectricity, ferromagnetism, superconductivity, thermoelectricity, magnetohydrodynamics, radiation from charged particles, electromagnetic wave propagation and guided waves. There are many worked examples of dynamical and thermal effects of electromagnetic fields, and of effects resulting from the motion of bodies.

  7. Fundamental properties of resonances.

    PubMed

    Ceci, S; Hadžimehmedović, M; Osmanović, H; Percan, A; Zauner, B

    2017-03-27

    All resonances, from hydrogen nuclei excited by high-energy gamma rays in deep space to newly discovered particles produced in the Large Hadron Collider, should be described by the same fundamental physical quantities. However, two distinct sets of properties are used to describe resonances: the pole parameters (complex pole position and residue) and the Breit-Wigner parameters (mass, width, and branching fractions). There is an ongoing, decades-old debate about which of them should be abandoned. In this study of nucleon resonances appearing in elastic pion-nucleon scattering, we discover an intricate interplay of the parameters from both sets and realize that neither set is completely independent or fundamental on its own.

  8. Fundamental properties of resonances

    PubMed Central

    Ceci, S.; Hadžimehmedović, M.; Osmanović, H.; Percan, A.; Zauner, B.

    2017-01-01

    All resonances, from hydrogen nuclei excited by high-energy gamma rays in deep space to newly discovered particles produced in the Large Hadron Collider, should be described by the same fundamental physical quantities. However, two distinct sets of properties are used to describe resonances: the pole parameters (complex pole position and residue) and the Breit-Wigner parameters (mass, width, and branching fractions). There is an ongoing, decades-old debate about which of them should be abandoned. In this study of nucleon resonances appearing in elastic pion-nucleon scattering, we discover an intricate interplay of the parameters from both sets and realize that neither set is completely independent or fundamental on its own. PMID:28345595

  9. Fundamentals of Polarized Light

    NASA Technical Reports Server (NTRS)

    Mishchenko, Michael

    2003-01-01

    The analytical and numerical basis for describing scattering properties of media composed of small discrete particles is formed by the classical electromagnetic theory. Although there are several excellent textbooks outlining the fundamentals of this theory, it is convenient for our purposes to begin with a summary of those concepts and equations that are central to the subject of this book and will be used extensively in the following chapters. We start by formulating Maxwell's equations and constitutive relations for time-harmonic macroscopic electromagnetic fields and derive the simplest plane-wave solution that underlies the basic optical idea of a monochromatic parallel beam of light. This solution naturally leads to the introduction of such fundamental quantities as the refractive index and the Stokes parameters. Finally, we define the concept of a quasi-monochromatic beam of light and discuss its implications.
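
    For reference (one common convention; the sign convention for V varies), the Stokes parameters of a transverse field with components E_x and E_y are:

        I = \langle |E_x|^2 \rangle + \langle |E_y|^2 \rangle, \quad
        Q = \langle |E_x|^2 \rangle - \langle |E_y|^2 \rangle, \quad
        U = 2\,\mathrm{Re}\,\langle E_x E_y^* \rangle, \quad
        V = -2\,\mathrm{Im}\,\langle E_x E_y^* \rangle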

  10. Fundamentals of neurobiology.

    PubMed

    Greg Hall, D

    2011-01-01

    Session 1 of the 2010 STP/IFSTP Joint Symposium on Toxicologic Neuropathology, titled "Fundamentals of Neurobiology," was organized to provide a foundation for subsequent sessions by presenting essential elements of neuroanatomy and nervous system function. A brief introduction to the session titled "Introduction to Correlative Neurobiology" was provided by Dr. Greg Hall (Eli Lilly and Company, Indianapolis, IN). Correlative neurobiology refers to considerations of the relationships between the highly organized and compartmentalized structure of nervous tissues and the functioning within this system.

  11. Fundamental studies in geodynamics

    NASA Technical Reports Server (NTRS)

    Anderson, D. L.; Hager, B. H.; Kanamori, H.

    1981-01-01

    Research in fundamental studies in geodynamics continued in a number of fields, including seismic observations and analysis, synthesis of geochemical data, theoretical investigation of geoid anomalies, extensive numerical experiments in a number of geodynamical contexts, and a new field, seismic volcanology. Summaries of work in progress or completed during this report period are given. Abstracts of publications submitted from work in progress during this report period are attached as an appendix.

  12. Fundamentals of Monte Carlo

    SciTech Connect

    Wollaber, Allan Benton

    2016-06-16

    This is a PowerPoint presentation which serves as lecture material for the Parallel Computing summer school. It covers the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background; a simple example: estimating π), Why does this even work? (the Law of Large Numbers, the Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
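
    A minimal sketch of the lecture's opening example, estimating π by uniform sampling; the implementation details here are illustrative, not taken from the slides:

        # Monte Carlo estimate of pi: sample points uniformly in the unit
        # square and count the fraction inside the quarter circle of radius 1.
        import random

        def estimate_pi(n_samples: int) -> float:
            inside = sum(1 for _ in range(n_samples)
                         if random.random() ** 2 + random.random() ** 2 <= 1.0)
            return 4.0 * inside / n_samples

        print(estimate_pi(1_000_000))  # ~3.14; converges by the Law of Large Numbers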

  13. The 4th Thermodynamic Principle?

    SciTech Connect

    Montero Garcia, Jose de la Luz; Novoa Blanco, Jesus Francisco

    2007-04-28

    It should be emphasized that the 4th Principle formulated above is a thermodynamic principle and, at the same time, quantum-mechanical and relativistic, as it inevitably must be; its absence has been one of the main theoretical limitations of physical theory until today. We show that, from the theoretical discovery of the Dimensional Primitive Octet of Matter, the 4th Thermodynamic Principle, the Quantum Hexet of Matter, the Global Hexagonal Subsystem of Fundamental Constants of Energy, and the Measurement or Connected Global Scale or Universal Existential Interval of Matter, it is possible to arrive at a global formulation of the four 'forces' or fundamental interactions of nature. Einstein's golden dream is possible.

  14. Structure and Dynamics of ER: Minimal Networks and Biophysical Constraints

    PubMed Central

    Lin, Congping; Zhang, Yiwei; Sparkes, Imogen; Ashwin, Peter

    2014-01-01

    The endoplasmic reticulum (ER) in live cells is a highly mobile network whose structure dynamically changes on a number of timescales. The role of such drastic changes in any system is unclear, although there are correlations with ER function. A better understanding of the fundamental biophysical constraints on the system will allow biologists to determine the effects of molecular factors on ER dynamics. Previous studies have identified potential static elements that the ER may remodel around. Here, we use these structural elements to assess biophysical principles behind the network dynamics. By analyzing imaging data of tobacco leaf epidermal cells under two different conditions, i.e., native state (control) and latrunculin B (treated), we show that the geometric structure and dynamics of ER networks can be understood in terms of minimal networks. Our results show that the ER network is well modeled as a locally minimal-length network between the static elements that potentially anchor the ER to the cell cortex over longer timescales; this network is perturbed by a mixture of random and deterministic forces. The network need not have globally minimum length; we observe cases where the local topology may change dynamically between different Euclidean Steiner network topologies. The networks in the treated cells are easier to quantify, because they are less dynamic (the treatment suppresses actin dynamics), but the same general features are found in control cells. Using a Langevin approach, we model the dynamics of the nonpersistent nodes and use this to show that the images can be used to estimate both local viscoelastic behavior of the cytoplasm and filament tension in the ER network. This means we can explain several aspects of the ER geometry in terms of biophysical principles. PMID:25099815
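
    A minimal overdamped Langevin sketch in the spirit of the node model described above; the drag, tension, and noise values are illustrative placeholders, not the authors' fitted parameters:

        # Overdamped Langevin step for a non-persistent node pulled by
        # constant filament tension toward fixed anchors, plus thermal noise:
        #   gamma * dx/dt = F_tension + noise
        import numpy as np

        rng = np.random.default_rng(0)

        def langevin_step(node, anchors, tension=1.0, gamma=1.0, kT=0.1, dt=1e-3):
            force = np.zeros(2)
            for a in anchors:
                d = a - node
                force += tension * d / np.linalg.norm(d)  # constant-tension pull
            noise = np.sqrt(2 * kT * gamma / dt) * rng.standard_normal(2)
            return node + dt / gamma * (force + noise)

        node = np.array([0.0, 0.0])
        anchors = [np.array([1.0, 0.0]), np.array([-0.5, 0.8]), np.array([-0.5, -0.8])]
        for _ in range(1000):
            node = langevin_step(node, anchors)  # fluctuates about the force balance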

  15. Neutrons and Fundamental Symmetries

    SciTech Connect

    Plaster, Bradley

    2016-01-11

    The research supported by this project addressed fundamental open physics questions via experiments with subatomic particles. In particular, neutrons constitute an especially ideal “laboratory” for fundamental physics tests, as their sensitivities to the four known forces of nature permit a broad range of tests of the so-called “Standard Model”, our current best physics model for the interactions of subatomic particles. Although the Standard Model has been a triumphant success for physics, it does not provide satisfactory answers to some of the most fundamental open questions in physics, such as: are there additional forces of nature beyond the gravitational, electromagnetic, weak nuclear, and strong nuclear forces?, or why does our universe consist of more matter than anti-matter? This project also contributed significantly to the training of the next generation of scientists, of considerable value to the public. Young scientists, ranging from undergraduate students to graduate students to post-doctoral researchers, made significant contributions to the work carried out under this project.

  16. Value of Fundamental Science

    NASA Astrophysics Data System (ADS)

    Burov, Alexey

    Fundamental science is a hard, long-term human adventure that has required high devotion and social support, especially significant in our epoch of Mega-science. The measure of this devotion and this support expresses the real value of the fundamental science in public opinion. Why does fundamental science have value? What determines its strength and what endangers it? The dominant answer is that the value of science arises out of curiosity and is supported by the technological progress. Is this really a good, astute answer? When trying to attract public support, we talk about the "mystery of the universe". Why do these words sound so attractive? What is implied by and what is incompatible with them? More than two centuries ago, Immanuel Kant asserted an inseparable entanglement between ethics and metaphysics. Thus, we may ask: which metaphysics supports the value of scientific cognition, and which does not? Should we continue to neglect the dependence of value of pure science on metaphysics? If not, how can this issue be addressed in the public outreach? Is the public alienated by one or another message coming from the face of science? What does it mean to be politically correct in this sort of discussion?

  17. Fundamentals of Shiftwork Scheduling

    DTIC Science & Technology

    2006-04-01

    ...works non-standard shifts that include evenings, nights, and/or weekends. Sleep hygiene: healthful time management with respect to sleep quality. This report is designed for use by managers, supervisors, shiftwork schedulers, and employees. It defines the principles and components of a method of...

  18. Ecological Principles and Guidelines for Managing the Use of Land

    SciTech Connect

    Dale, Virginia H; Brown, Sandra; Haeuber, R A; Hobbs, N T; Huntly, N; Naiman, R J; Riebsame, W E; Turner, M G; Valone, T J

    2014-01-01

    The many ways that people have used and managed land throughout history have emerged as a primary cause of land-cover change around the world. Thus, land use and land management increasingly represent a fundamental source of change in the global environment. Despite their global importance, however, many decisions about the management and use of land are made with scant attention to ecological impacts. Thus, ecologists' knowledge of the functioning of Earth's ecosystems is needed to broaden the scientific basis of decisions on land use and management. In response to this need, the Ecological Society of America established a committee to examine the ways that land-use decisions are made and the ways that ecologists could help inform those decisions. This paper reports the scientific findings of that committee. Five principles of ecological science have particular implications for land use and can assure that fundamental processes of Earth's ecosystems are sustained. These ecological principles deal with time, species, place, disturbance, and the landscape. The recognition that ecological processes occur within a temporal setting and change over time is fundamental to analyzing the effects of land use. In addition, individual species and networks of interacting species have strong and far-reaching effects on ecological processes. Furthermore, each site or region has a unique set of organisms and abiotic conditions influencing and constraining ecological processes. Disturbances are important and ubiquitous ecological events whose effects may strongly influence population, community, and ecosystem dynamics. Finally, the size, shape, and spatial relationships of habitat patches on the landscape affect the structure and function of ecosystems. The responses of the land to changes in use and management by people depend on expressions of these fundamental principles in nature. These principles dictate several guidelines for land use. The guidelines give practical…

  19. Microscopic Description of Le Chatelier's Principle

    ERIC Educational Resources Information Center

    Novak, Igor

    2005-01-01

    A simple approach that "demystifies" Le Chatelier's principle (LCP) and stimulates students to think about the fundamental physical background behind the well-known principle is presented. The approach uses microscopic descriptors of matter like energy levels and populations and does not require any assumption about the fixed amount of substance being…

  20. Minimizing Classroom Interruptions.

    ERIC Educational Resources Information Center

    Partin, Ronald L.

    1987-01-01

    Offers suggestions for minimizing classroom interruptions, such as suggesting to the principal that announcements not be read over the intercom during class time and arranging desks and chairs so as to minimize visual distractions. Contains a school interruption survey form. (JC)

  1. Controlling molecular transport in minimal emulsions

    PubMed Central

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of 'minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions. PMID:26797564

  2. Controlling molecular transport in minimal emulsions

    NASA Astrophysics Data System (ADS)

    Gruner, Philipp; Riechers, Birte; Semin, Benoît; Lim, Jiseok; Johnston, Abigail; Short, Kathleen; Baret, Jean-Christophe

    2016-01-01

    Emulsions are metastable dispersions in which molecular transport is a major mechanism driving the system towards its state of minimal energy. Determining the underlying mechanisms of molecular transport between droplets is challenging due to the complexity of a typical emulsion system. Here we introduce the concept of 'minimal emulsions', which are controlled emulsions produced using microfluidic tools, simplifying an emulsion down to its minimal set of relevant parameters. We use these minimal emulsions to unravel the fundamentals of transport of small organic molecules in water-in-fluorinated-oil emulsions, a system of great interest for biotechnological applications. Our results are of practical relevance to guarantee a sustainable compartmentalization of compounds in droplet microreactors and to design new strategies for the dynamic control of droplet compositions.

  3. Radar principles

    NASA Technical Reports Server (NTRS)

    Sato, Toru

    1989-01-01

    Discussed here is a kind of radar called atmospheric radar, which has as its target clear air echoes from the earth's atmosphere produced by fluctuations of the atmospheric index of refraction. Topics reviewed include the vertical structure of the atmosphere, the radio refractive index and its fluctuations, the radar equation (a relation between transmitted and received power), radar equations for distributed targets and spectral echoes, near field correction, pulsed waveforms, the Doppler principle, and velocity field measurements.
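
    For reference, the point-target (monostatic) form of the radar equation mentioned here, in common textbook notation (P_t transmitted power, G antenna gain, λ wavelength, σ radar cross section, R range), is:

        P_r = \frac{P_t\, G^2 \lambda^2 \sigma}{(4\pi)^3 R^4}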

  4. Information physics fundamentals of nanophotonics.

    PubMed

    Naruse, Makoto; Tate, Naoya; Aono, Masashi; Ohtsu, Motoichi

    2013-05-01

    Nanophotonics has been extensively studied with the aim of unveiling and exploiting light-matter interactions that occur at a scale below the diffraction limit of light, and recent progress made in experimental technologies--both in nanomaterial fabrication and characterization--is driving further advancements in the field. From the viewpoint of information, on the other hand, novel architectures, design and analysis principles, and even novel computing paradigms should be considered so that we can fully benefit from the potential of nanophotonics. This paper examines the information physics aspects of nanophotonics. More specifically, we present some fundamental and emergent information properties that stem from optical excitation transfer mediated by optical near-field interactions and the hierarchical properties inherent in optical near-fields. We theoretically and experimentally investigate aspects such as unidirectional signal transfer, energy efficiency and networking effects, among others, and we present their basic theoretical formalisms and describe demonstrations of practical applications. A stochastic analysis of light-assisted material formation is also presented, where an information-based approach provides a deeper understanding of the phenomena involved, such as self-organization. Furthermore, the spatio-temporal dynamics of optical excitation transfer and its inherent stochastic attributes are utilized for solution searching, paving the way to a novel computing paradigm that exploits coherent and dissipative processes in nanophotonics.

  5. Fundamental experiments in velocimetry

    SciTech Connect

    Briggs, Matthew Ellsworth; Hull, Larry; Shinas, Michael

    2009-01-01

    One can understand what velocimetry does and does not measure by understanding a few fundamental experiments. Photon Doppler Velocimetry (PDV) is an interferometer that will produce fringe shifts when the length of one of the legs changes, so we might expect the fringes to change whenever the distance from the probe to the target changes. However, by making PDV measurements of tilted moving surfaces, we have shown that fringe shifts from diffuse surfaces are actually measured only from the changes caused by the component of velocity along the beam. This is an important simplification in the interpretation of PDV results, arising because surface roughness randomizes the scattered phases.
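
    The simplification described here is captured by the standard PDV beat-frequency relation, with v_∥ the velocity component along the beam and λ_0 the laser wavelength:

        f_b = \frac{2 v_{\parallel}}{\lambda_0}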

  6. Fundamentals of satellite navigation

    NASA Astrophysics Data System (ADS)

    Stiller, A. H.

    The basic operating principles and capabilities of conventional and satellite-based navigation systems for air, sea, and land vehicles are reviewed and illustrated with diagrams. Consideration is given to autonomous onboard systems; systems based on visible or radio beacons; the Transit, Cicada, Navstar-GPS, and Glonass satellite systems; the physical laws and parameters of satellite motion; the definition of time in satellite systems; and the content of the demodulated GPS data signal. The GPS and Glonass data format frames are presented graphically, and tables listing the GPS and Glonass satellites, their technical characteristics, and the (past or scheduled) launch dates are provided.
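
    For reference, the basic satellite-navigation observable is the pseudorange; in standard notation (r_i satellite position, r user position, c speed of light, δt receiver clock offset, ε_i measurement error), four such measurements suffice to solve for the three position coordinates and δt:

        \rho_i = \lVert \mathbf{r}_i - \mathbf{r} \rVert + c\,\delta t + \varepsilon_i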

  7. Fundamentals of Space Systems

    NASA Astrophysics Data System (ADS)

    Pisacane, Vincent L.

    2005-06-01

    Fundamentals of Space Systems was developed to satisfy two objectives: the first is to provide a text suitable for use in an advanced undergraduate or beginning graduate course in both space systems engineering and space system design. The second is to be a primer and reference book for space professionals wishing to broaden their capabilities to develop, manage the development, or operate space systems. The authors of the individual chapters are practicing engineers who have had extensive experience in developing sophisticated experimental and operational spacecraft systems, in addition to having experience teaching the subject material. The text presents the fundamentals of all the subsystems of a spacecraft mission and includes illustrative examples drawn from actual experience to enhance the learning experience. It includes a chapter on each of the relevant major disciplines and subsystems, including space systems engineering, space environment, astrodynamics, propulsion and flight mechanics, attitude determination and control, power systems, thermal control, configuration management and structures, communications, command and telemetry, data processing, embedded flight software, survivability and reliability, integration and test, mission operations, and the initial conceptual design of a typical small spacecraft mission.

  8. Semi-analytical formulation of modal dispersion parameter of an optical fiber with Kerr nonlinearity and using a novel fundamental modal field approximation

    NASA Astrophysics Data System (ADS)

    Choudhury, Raja Roy; Choudhury, Arundhati Roy; Ghose, Mrinal Kanti

    2013-09-01

    To characterize a nonlinear optical fiber, a semi-analytical formulation using the variational principle and the Nelder-Mead simplex method for nonlinear unconstrained minimization is proposed. The number of optimizing parameters used to optimize the core parameter U has been increased to incorporate more flexibility in the formulation of an innovative form of the fundamental modal field. This formulation provides accurate analytical expressions for the modal dispersion parameter (g) of an optical fiber with Kerr nonlinearity. The minimization of the core parameter (U), which involves the Kerr nonlinearity through the nonstationary expression of the propagation constant, is carried out by the Nelder-Mead simplex method of nonlinear unconstrained minimization, which is suitable for problems with nonsmooth functions since it does not require any derivative information. This formulation carries less computational burden for the calculation of modal parameters than full numerical methods.
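
    A minimal sketch of derivative-free Nelder-Mead minimization as used here; the paper's functional for U is not reproduced, so a generic stand-in objective (the Rosenbrock function) is minimized instead:

        # Nelder-Mead simplex minimization: no gradients required, only
        # function evaluations, which suits nonsmooth objectives.
        from scipy.optimize import minimize

        def objective(p):
            x, y = p
            return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2  # stand-in objective

        result = minimize(objective, x0=[-1.2, 1.0], method="Nelder-Mead",
                          options={"xatol": 1e-8, "fatol": 1e-8})
        print(result.x)  # ~[1.0, 1.0], found without derivative information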

  9. The Role of Fisher Information Theory in the Development of Fundamental Laws in Physical Chemistry

    ERIC Educational Resources Information Center

    Honig, J. M.

    2009-01-01

    The unifying principle that involves rendering the Fisher information measure an extremum is reviewed. It is shown that with this principle, in conjunction with appropriate constraints, a large number of fundamental laws can be derived from a common source in a unified manner. The resulting economy of thought pertaining to fundamental principles…
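
    For reference, the Fisher information measure being rendered an extremum is, for a single parameter θ of a probability density p(x|θ) (standard definition, not quoted from the article):

        I(\theta) = \int p(x \mid \theta)\left(\frac{\partial \ln p(x \mid \theta)}{\partial \theta}\right)^{2} dx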

  10. Quantum mechanics and the generalized uncertainty principle

    SciTech Connect

    Bang, Jang Young; Berger, Micheal S.

    2006-12-15

    The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.
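
    For reference, a widely used parametrization of the generalized uncertainty principle (the Kempf-Mangano-Mann form; the paper's conventions may differ) and the minimal length it implies are:

        \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}\left[1 + \beta\,(\Delta p)^2\right],
        \qquad \Delta x_{\min} = \hbar\sqrt{\beta}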

  11. Fundamentals of zoological scaling

    NASA Astrophysics Data System (ADS)

    Lin, Herbert

    1982-01-01

    Most introductory physics courses emphasize highly idealized problems with unique well-defined answers. Though many textbooks complement these problems with estimation problems, few books present anything more than an elementary discussion of scaling. This paper presents some fundamentals of scaling in the zoological domain—a domain complex by any standard, but one also well suited to illustrate the power of very simple physical ideas. We consider the following animal characteristics: skeletal weight, speed of running, height and range of jumping, food consumption, heart rate, lifetime, locomotive efficiency, frequency of wing flapping, and maximum sizes of animals that fly and hover. These relationships are compared to zoological data and everyday experience, and match reasonably well.
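
    A small sketch of the power-law reasoning involved; the exponent -1/4 for resting heart rate is the classic allometric approximation, used here purely for illustration:

        # Allometric scaling: many zoological quantities follow power laws
        # y = a * M**b in body mass M (Kleiber's 3/4 law for metabolic rate
        # and its -1/4 corollary for heart rate are the textbook examples).
        def scale(y_ref, m_ref, m, exponent):
            """Scale a reference value y_ref at mass m_ref to mass m."""
            return y_ref * (m / m_ref) ** exponent

        # Resting heart rate ~ M**(-1/4), anchored to a 70 kg human at 60 bpm:
        print(scale(60.0, 70.0, 0.03, -0.25))    # 30 g mouse: ~420 bpm
        print(scale(60.0, 70.0, 4000.0, -0.25))  # 4 t elephant: ~22 bpm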

  12. Nuclei and Fundamental Symmetries

    NASA Astrophysics Data System (ADS)

    Haxton, Wick

    2016-09-01

    Nuclei provide marvelous laboratories for testing fundamental interactions, often enhancing weak processes through accidental degeneracies among states, and providing selection rules that can be exploited to isolate selected interactions. I will give an overview of current work, including the use of parity violation to probe unknown aspects of the hadronic weak interaction; nuclear electric dipole moment searches that may shed light on new sources of CP violation; and tests of lepton number violation made possible by the fact that many nuclei can only decay by rare second-order weak interactions. I will point to opportunities in both theory and experiment to advance the field. Based upon work supported in part by the US Department of Energy, Office of Science, Office of Nuclear Physics and SciDAC under Awards DE-SC00046548 (Berkeley), DE-AC02-05CH11231 (LBNL), and KB0301052 (LBNL).

  13. Wall of fundamental constants

    SciTech Connect

    Olive, Keith A.; Peloso, Marco; Uzan, Jean-Philippe

    2011-02-15

    We consider the signatures of a domain wall produced in the spontaneous symmetry breaking involving a dilatonlike scalar field coupled to electromagnetism. Domains on either side of the wall exhibit slight differences in their respective values of the fine-structure constant, α. If such a wall is present within our Hubble volume, absorption spectra at large redshifts may or may not provide a variation in α relative to the terrestrial value, depending on our relative position with respect to the wall. This wall could resolve the contradiction between claims of a variation of α based on Keck/HIRES data and of the constancy of α based on Very Large Telescope data. We derive the properties of the wall and the parameters of the underlying microscopic model required to reproduce the possible spatial variation of α. We discuss the constraints on the existence of the low-energy domain wall and describe its observational implications concerning the variation of the fundamental constants.

  14. Fundamentals of gel dosimeters

    NASA Astrophysics Data System (ADS)

    McAuley, K. B.; Nasr, A. T.

    2013-06-01

    Fundamental chemical and physical phenomena that occur in Fricke gel dosimeters, polymer gel dosimeters, micelle gel dosimeters and genipin gel dosimeters are discussed. Fricke gel dosimeters are effective even though their radiation sensitivity depends on oxygen concentration. Oxygen contamination can cause severe problems in polymer gel dosimeters, even when THPC is used. Oxygen leakage must be prevented between manufacturing and irradiation of polymer gels, and internal calibration methods should be used so that contamination problems can be detected. Micelle gel dosimeters are promising due to their favourable diffusion properties. The introduction of micelles to gel dosimetry may open up new areas of dosimetry research wherein a range of water-insoluble radiochromic materials can be explored as reporter molecules.

  15. Minimal Orderings Revisited

    SciTech Connect

    Peyton, B.W.

    1999-07-01

    When minimum orderings proved too difficult to deal with, Rose, Tarjan, and Lueker instead studied minimal orderings and how to compute them (Algorithmic aspects of vertex elimination on graphs, SIAM J. Comput., 5:266-283, 1976). This paper introduces an algorithm that is capable of computing much better minimal orderings much more efficiently than the algorithm in Rose et al. The new insight is a way to use certain structures and concepts from modern sparse Cholesky solvers to re-express one of the basic results in Rose et al. The new algorithm begins with any initial ordering and then refines it until a minimal ordering is obtained. It is simple to obtain high-quality, low-cost minimal orderings by using fill-reducing heuristic orderings as initial orderings for the algorithm. We examine several such initial orderings in some detail.
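
    A sketch of the underlying notion of fill, not the paper's refinement algorithm: eliminating a vertex pairwise-connects its remaining neighbors, and an ordering is minimal when no other ordering produces a proper subset of its fill edges. The graph below is invented for illustration:

        # Fill-in produced by eliminating vertices in a given order.
        from itertools import combinations

        def fill_in(adj, order):
            adj = {v: set(ns) for v, ns in adj.items()}  # work on a copy
            fill, eliminated = set(), set()
            for v in order:
                nbrs = adj[v] - eliminated
                for a, b in combinations(sorted(nbrs), 2):
                    if b not in adj[a]:                  # new (fill) edge
                        adj[a].add(b); adj[b].add(a)
                        fill.add((a, b))
                eliminated.add(v)
            return fill

        # Path graph 1-2-3-4: eliminating an interior vertex first creates fill.
        path = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
        print(fill_in(path, [2, 1, 3, 4]))  # {(1, 3)}
        print(fill_in(path, [1, 2, 3, 4]))  # set(): a zero-fill (minimal) order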

  16. Minimally invasive stomas.

    PubMed

    Hellinger, Michael D; Al Haddad, Abdullah

    2008-02-01

    Traditionally, stoma creation and end-stoma reversal have been performed via a laparotomy incision. However, in many situations, stoma construction may be performed safely in a minimally invasive manner. This may include a trephine, laparoscopic, or combined approach. Furthermore, Hartmann's colostomy reversal, a procedure traditionally associated with substantial morbidity, may also be performed laparoscopically. The authors briefly review patient selection, preparation, and indications, and focus primarily on surgical techniques and results of minimally invasive stoma creation and Hartmann's reversal.

  17. Minimally invasive lumbar foraminotomy.

    PubMed

    Deutsch, Harel

    2013-07-01

    Lumbar radiculopathy is a common problem. Nerve root compression can occur at different places along a nerve root's course, including in the foramina. Minimally invasive approaches allow easier exposure of the lateral foramina and decompression of the nerve root in the foramina. This video demonstrates a minimally invasive approach to decompress the lumbar nerve root in the foramina with a lateral-to-medial decompression. The video can be found here: http://youtu.be/jqa61HSpzIA.

  18. Achieving sustainable plant disease management through evolutionary principles.

    PubMed

    Zhan, Jiasui; Thrall, Peter H; Burdon, Jeremy J

    2014-09-01

    Plants and their pathogens are engaged in continuous evolutionary battles and sustainable disease management requires novel systems to create environments conducive for short-term and long-term disease control. In this opinion article, we argue that knowledge of the fundamental factors that drive host-pathogen coevolution in wild systems can provide new insights into disease development in agriculture. Such evolutionary principles can be used to guide the formulation of sustainable disease management strategies which can minimize disease epidemics while simultaneously reducing pressure on pathogens to evolve increased infectivity and aggressiveness. To ensure agricultural sustainability, disease management programs that reflect the dynamism of pathogen population structure are essential and evolutionary biologists should play an increasing role in their design.

  19. 40 year retrospective of fundamental mechanisms

    NASA Astrophysics Data System (ADS)

    Soileau, M. J.

    2008-10-01

    Fundamental mechanisms of laser-induced damage (LID) have been one of the most controversial topics during the forty years of the Boulder Damage Symposium (Ref. 1). LID is fundamentally a very nonlinear process and sensitive to a variety of parameters including wavelength, pulse width, spot size, focal conditions, material band gap, thermal-mechanical properties, and component design considerations. The complex interplay of many of these parameters and sample-to-sample material variations combine to make detailed, first-principles models very problematic at best. The phenomenon of self-focusing, the multiple spatial and temporal mode structure of most lasers, and the fact that samples are 'consumed' in testing complicate experimental results. This paper presents a retrospective of the work presented at this meeting.

  20. Going from principles to rules in research ethics.

    PubMed

    Sachs, Benjamin

    2011-01-01

    In research ethics there is a canon regarding what ethical rules ought to be followed by investigators vis-à-vis their treatment of subjects and a canon regarding what fundamental ethical principles apply to the endeavor. What I aim to demonstrate here is that several of the rules find no support in the principles. This leaves anyone who would insist that we not abandon those rules in the difficult position of needing to establish that we are nevertheless justified in believing in the validity of the rules. I conclude by arguing that this is not likely to be accomplished. The rules I call into question are those requiring: that studies be designed in a scientifically valid way; that risks to subjects be minimized; that subjects be afforded post-trial access to experimental interventions; that inducements paid to subjects not be counted as a benefit to them; that inducements paid to subjects not be 'undue'; and that subjects remain free to withdraw from the study at any time for any reason without penalty. Both canons, the canon on principles and the canon on rules, are found in the overlap among ethical pronouncements that are themselves canonical: the Nuremberg Code, the Declaration of Helsinki, the Belmont Report, CIOMS's International Ethical Guidelines for Biomedical Research Involving Human Subjects, and NBAC's 2001 report, Ethical Issues in International Research: Clinical Trials in Developing Countries.

  1. DOE Fundamentals Handbook: Instrumentation and Control, Volume 1

    SciTech Connect

    Not Available

    1992-06-01

    The Instrumentation and Control Fundamentals Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the necessary fundamentals training to ensure a basic understanding of instrumentation and control systems. The handbook includes information on temperature, pressure, flow, and level detection systems; position indication systems; process control systems; and radiation detection principles. This information will provide personnel with an understanding of the basic operation of various types of DOE nuclear facility instrumentation and control systems.

  2. Interfaces at equilibrium: A guide to fundamentals.

    PubMed

    Marmur, Abraham

    2016-05-20

    The fundamentals of the thermodynamics of interfaces are reviewed and concisely presented. The discussion starts with a short review of the elements of bulk thermodynamics that are also relevant to interfaces. It continues with the interfacial thermodynamics of two-phase systems, including the definition of interfacial tension and adsorption. Finally, the interfacial thermodynamics of three-phase (wetting) systems is discussed, including the topic of non-wettable surfaces. A clear distinction is made between equilibrium conditions, in terms of minimizing energies (internal, Gibbs or Helmholtz), and equilibrium indicators, in terms of measurable, intrinsic properties (temperature, chemical potential, pressure). It is emphasized that the equilibrium indicators are the same whatever energy is minimized, if the boundary conditions are properly chosen. Also, to avoid a common confusion, a distinction is made between systems of constant volume and systems with drops of constant volume.
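
    For the three-phase (wetting) systems discussed here, the standard equilibrium condition at the contact line is Young's equation, relating the contact angle θ_Y to the three interfacial tensions:

        \gamma_{sv} = \gamma_{sl} + \gamma_{lv}\cos\theta_Y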

  3. Overlay accuracy fundamentals

    NASA Astrophysics Data System (ADS)

    Kandel, Daniel; Levinski, Vladimir; Sapiens, Noam; Cohen, Guy; Amit, Eran; Klein, Dana; Vakshtein, Irina

    2012-03-01

    Currently, the performance of overlay metrology is evaluated mainly based on random error contributions such as precision and TIS variability. With the expected shrinkage of the overlay metrology budget to < 0.5nm, it becomes crucial to include also systematic error contributions which affect the accuracy of the metrology. Here we discuss fundamental aspects of overlay accuracy and a methodology to improve accuracy significantly. We identify overlay mark imperfections and their interaction with the metrology technology, as the main source of overlay inaccuracy. The most important type of mark imperfection is mark asymmetry. Overlay mark asymmetry leads to a geometrical ambiguity in the definition of overlay, which can be ~1nm or less. It is shown theoretically and in simulations that the metrology may enhance the effect of overlay mark asymmetry significantly and lead to metrology inaccuracy ~10nm, much larger than the geometrical ambiguity. The analysis is carried out for two different overlay metrology technologies: Imaging overlay and DBO (1st order diffraction based overlay). It is demonstrated that the sensitivity of DBO to overlay mark asymmetry is larger than the sensitivity of imaging overlay. Finally, we show that a recently developed measurement quality metric serves as a valuable tool for improving overlay metrology accuracy. Simulation results demonstrate that the accuracy of imaging overlay can be improved significantly by recipe setup optimized using the quality metric. We conclude that imaging overlay metrology, complemented by appropriate use of measurement quality metric, results in optimal overlay accuracy.

  4. A systems approach to theoretical fluid mechanics: Fundamentals

    NASA Technical Reports Server (NTRS)

    Anyiwo, J. C.

    1978-01-01

    A preliminary application of the underlying principles of the investigator's general system theory to the description and analysis of the fluid flow system is presented. An attempt is made to establish practical models, or elements, of the general fluid flow system from the point of view of the fundamental principles of general system theory. Results obtained are applied to a simple experimental fluid flow system as a test case, with particular emphasis on the understanding of fluid flow instability, transition, and turbulence.

  5. Minimally invasive procedures

    PubMed Central

    Baltayiannis, Nikolaos; Michail, Chandrinos; Lazaridis, George; Anagnostopoulos, Dimitrios; Baka, Sofia; Mpoukovinas, Ioannis; Karavasilis, Vasilis; Lampaki, Sofia; Papaiwannou, Antonis; Karavergou, Anastasia; Kioumis, Ioannis; Pitsiou, Georgia; Katsikogiannis, Nikolaos; Tsakiridis, Kosmas; Rapti, Aggeliki; Trakada, Georgia; Zissimopoulos, Athanasios; Zarogoulidis, Konstantinos

    2015-01-01

    Minimally invasive procedures, which include laparoscopic surgery, use state-of-the-art technology to reduce the damage to human tissue when performing surgery. Minimally invasive procedures require small “ports” from which the surgeon inserts thin tubes called trocars. Carbon dioxide gas may be used to inflate the area, creating a space between the internal organs and the skin. Then a miniature camera (usually a laparoscope or endoscope) is placed through one of the trocars so the surgical team can view the procedure as a magnified image on video monitors in the operating room. Specialized equipment is inserted through the trocars based on the type of surgery. There are some advanced minimally invasive surgical procedures that can be performed almost exclusively through a single point of entry—meaning only one small incision, like the “uniport” video-assisted thoracoscopic surgery (VATS). Not only do these procedures usually provide equivalent outcomes to traditional “open” surgery (which sometimes requires a large incision), but minimally invasive procedures (using small incisions) may offer significant benefits as well: (I) faster recovery; (II) shorter hospital stays; (III) less scarring; and (IV) less pain. In our current mini review we present the minimally invasive procedures for thoracic surgery. PMID:25861610

  6. Role of Fundamental Physics in Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Turyshev, Slava

    2004-01-01

    This talk will discuss the critical role that fundamental physics research plays in human space exploration. In particular, currently available technologies can already provide significant radiation reduction, minimize bone loss, increase crew productivity, and thus uniquely contribute to overall mission success. I will discuss how fundamental physics research and emerging technologies may not only further reduce the risks of space travel but also increase crew mobility, enhance safety, and increase the value of space exploration in the near future.

  7. Fundamentals of neurogastroenterology

    PubMed Central

    Wood, J; Alpers, D; Andrews, P

    1999-01-01

    Current concepts and basic principles of neurogastroenterology in relation to functional gastrointestinal disorders are reviewed. Neurogastroenterology is emphasized as a new and advancing subspecialty of clinical gastroenterology and digestive science. As such, it embraces the investigative sciences dealing with functions, malfunctions, and malformations in the brain and spinal cord, and the sympathetic, parasympathetic and enteric divisions of the autonomic innervation of the digestive tract. Somatomotor systems are included insofar as pharyngeal phases of swallowing and pelvic floor involvement in defecation, continence, and pelvic pain are concerned. Inclusion of basic physiology of smooth muscle, mucosal epithelium, and the enteric immune system in the neurogastroenterologic domain relates to requirements for compatibility with neural control mechanisms. Psychologic and psychiatric relations to functional gastrointestinal disorders are included because they are significant components of neurogastroenterology, especially in relation to projections of discomfort and pain to the digestive tract.


Keywords: enteric nervous system; brain-gut axis; autonomic nervous system; nausea; gut motility; mast cells; gastrointestinal pain; Rome II PMID:10457039

  8. Fundamentals and Techniques of Nonimaging

    SciTech Connect

    O'Gallagher, J. J.; Winston, R.

    2003-07-10

    This is the final report describing a long-term basic research program in nonimaging optics that has led to major advances in important areas, including solar energy, fiber optics, illumination techniques, light detectors, and a great many other applications. The term "nonimaging optics" refers to the optics of extended sources in systems for which image forming is not important, but effective and efficient collection, concentration, transport, and distribution of light energy is. Although some of the most widely known developments of the early concepts have been in the field of solar energy, a broad variety of other uses have emerged. Most important, under the auspices of this program in fundamental research in nonimaging optics established at the University of Chicago with support from the Office of Basic Energy Sciences at the Department of Energy, the field has become very dynamic, with new ideas and concepts continuing to develop, while applications of the early concepts continue to be pursued. While the subject began as part of classical geometrical optics, it has been extended subsequently to the wave optics domain. Particularly relevant to potential new research directions are recent developments in the formalism of statistical and wave optics, which may be important in understanding energy transport on the nanoscale. Nonimaging optics permits the design of optical systems that achieve the maximum possible concentration allowed by physical conservation laws. The earliest designs were constructed by optimizing the collection of the extreme rays from a source to the desired target: the so-called "edge-ray" principle. Later, new concentrator types were generated by placing reflectors along the flow lines of the "vector flux" emanating from lambertian emitters in various geometries. A few years ago, a new development occurred with the discovery that making the design edge-ray a functional of some other system parameter permits the construction of whole…
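
    For reference, the maximum concentration permitted by the conservation laws mentioned here (conservation of étendue), which nonimaging designs such as the CPC approach, is, in common textbook form, for acceptance half-angle θ_a and an absorber immersed in a medium of index n:

        C_{\max}^{\,2D} = \frac{n}{\sin\theta_a}, \qquad
        C_{\max}^{\,3D} = \frac{n^2}{\sin^2\theta_a}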

  9. Principles of nanoscience: an overview.

    PubMed

    Behari, Jitendra

    2010-10-01

    The scientific basis of nanotechnology as envisaged from first principles is compared to bulk behavior. The development of nanoparticles having controllable physical and electronic properties has opened up the possibility of designing artificial solids. Top-down and bottom-up approaches are emphasized. The role of nanoparticle (quantum dot) applications in nanophotonics (photovoltaic cells) and as drug delivery vehicles is discussed. Fundamentals of DNA structure as the prime site in bionanotechnological manipulations are also discussed. A summary of presently available devices and applications is presented.

  10. How not to criticize the precautionary principle.

    PubMed

    Hughes, Jonathan

    2006-10-01

    The precautionary principle has its origins in debates about environmental policy, but is increasingly invoked in bioethical contexts. John Harris and Søren Holm argue that the principle should be rejected as incoherent, irrational, and representing a fundamental threat to scientific advance and technological progress. This article argues that while there are problems with standard formulations of the principle, Harris and Holm's rejection of all its forms is mistaken. In particular, they focus on strong versions of the principle and fail to recognize that weaker forms, which may escape their criticisms, are both possible and advocated in the literature.

  11. Minimal perceptrons for memorizing complex patterns

    NASA Astrophysics Data System (ADS)

    Pastor, Marissa; Song, Juyong; Hoang, Danh-Tai; Jo, Junghyo

    2016-11-01

    Feedforward neural networks have been investigated to understand learning and memory, as well as applied to numerous practical problems in pattern classification. It is a rule of thumb that more complex tasks require larger networks. However, the design of optimal network architectures for specific tasks is still an unsolved fundamental problem. In this study, we consider three-layered neural networks for memorizing binary patterns. We developed a new complexity measure of binary patterns, and estimated the minimal network size for memorizing them as a function of their complexity. We formulated the minimal network size for regular, random, and complex patterns. In particular, the minimal size for complex patterns, which are neither ordered nor disordered, was predicted by measuring their Hamming distances from known ordered patterns. Our predictions agree with simulations based on the back-propagation algorithm.
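
    A sketch of the memorization setup described here, assuming a plain sigmoid network trained by back-propagation; the sizes, learning rate, and random patterns are illustrative, not the authors':

        # Three-layer (one-hidden-layer) network memorizing binary patterns.
        import numpy as np

        rng = np.random.default_rng(1)
        P, N, H = 8, 10, 6                   # patterns, input bits, hidden units
        X = rng.integers(0, 2, (P, N)).astype(float)
        y = rng.integers(0, 2, (P, 1)).astype(float)  # random binary labels

        W1 = rng.normal(0, 0.5, (N, H)); b1 = np.zeros(H)
        W2 = rng.normal(0, 0.5, (H, 1)); b2 = np.zeros(1)
        sig = lambda z: 1.0 / (1.0 + np.exp(-z))

        for _ in range(5000):                # gradient descent on squared error
            h = sig(X @ W1 + b1)
            out = sig(h @ W2 + b2)
            d_out = (out - y) * out * (1 - out)
            d_h = (d_out @ W2.T) * h * (1 - h)
            W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(0)
            W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(0)

        out = sig(sig(X @ W1 + b1) @ W2 + b2)
        print(np.all((out > 0.5) == (y > 0.5)))  # True once memorization succeeds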

  12. Complementary Huygens Principle for Geometrical and Nongeometrical Optics

    ERIC Educational Resources Information Center

    Luis, Alfredo

    2007-01-01

    We develop a fundamental principle depicting the generalized ray formulation of optics provided by the Wigner function. This principle is formally identical to the Huygens-Fresnel principle but in terms of opposite concepts, rays instead of waves, and incoherent superpositions instead of coherent ones. This ray picture naturally includes…
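
    For reference, in one common quantum-mechanical convention the Wigner function is defined as below; optical versions replace ψ by the field amplitude and ħ by an appropriate scale:

        W(x,p) = \frac{1}{\pi\hbar}\int \psi^{*}(x+y)\,\psi(x-y)\,e^{2ipy/\hbar}\,dy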

  13. Minimally invasive valve surgery.

    PubMed

    Woo, Y Joseph

    2009-08-01

    Traditional cardiac valve replacement surgery is being rapidly supplanted by innovative, minimally invasive approaches toward the repair of these valves. Patients are experiencing benefits ranging from less bleeding and pain to faster recovery and greater satisfaction. These operations are proving to be safe, highly effective, and durable, and their use will likely continue to increase and become even more widely applicable.

  14. CONMIN- CONSTRAINED FUNCTION MINIMIZATION

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1994-01-01

    In many mathematical problems, it is necessary to determine the minimum and maximum of a function of several variables, limited by various linear and nonlinear inequality constraints. It is seldom possible, in practical applications, to solve these problems directly. In most cases, an iterative method must be used to numerically obtain a solution. The CONMIN program was developed to numerically perform the minimization of a multi-variable function subject to a set of inequality constraints. The function need not be a simple analytical equation; it may be any function which can be numerically evaluated. The basic analytic technique used by CONMIN is to minimize the function until one or more of the constraints become active. The minimization process then continues by following the constraint boundaries in a direction such that the value of the function continues to decrease. When a point is reached where no further decrease in the function can be obtained, the process is terminated. Function maximization may be achieved by minimizing the negative of the function. This program is written in FORTRAN IV for batch execution and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 43K (octal) of 60 bit words. The CONMIN program was originally developed in 1973 and last updated in 1978.
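
    A sketch of the problem form CONMIN addresses (minimize a function subject to inequality constraints, following the constraint boundary once it becomes active), shown with a modern stand-in optimizer since CONMIN itself is a FORTRAN IV batch program; the objective and constraint below are invented for illustration:

        # Minimize f(x) subject to g(x) <= 0, the CONMIN problem form,
        # using scipy's SLSQP as a stand-in solver.
        from scipy.optimize import minimize

        def f(x):
            return (x[0] - 2) ** 2 + (x[1] - 1) ** 2  # unconstrained min at (2, 1)

        constraints = [
            {"type": "ineq", "fun": lambda x: 1 - x[0] - x[1]},  # x0 + x1 <= 1
        ]

        res = minimize(f, x0=[0.0, 0.0], method="SLSQP", constraints=constraints)
        print(res.x)  # ~[1.0, 0.0]: the minimum lies on the active constraint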

  15. System level electrochemical principles

    NASA Technical Reports Server (NTRS)

    Thaller, L. H.

    1985-01-01

    The traditional electrochemical storage concepts are difficult to translate into high-power, high-voltage multikilowatt storage systems. Battery technology has adopted the increased use of electronics and of electrochemical couples that minimize the difficulties associated with the corrective measures needed to reduce cell-to-cell capacity dispersion. Actively cooled bipolar concepts are described which represent some attractive alternative system concepts. They are projected to have higher energy densities and lower volumes than current concepts. They should be easier to scale from one capacity to another and have a closer cell-to-cell capacity balance. These newer storage system concepts are easier to manage since they are designed to be a fully integrated battery. These ideas are referred to as system-level electrochemistry. The hydrogen-oxygen regenerative fuel cell (RFC) is probably the best example of the integrated use of these principles.

  16. Fundamentals of Space Medicine

    NASA Astrophysics Data System (ADS)

    Clément, Gilles

    2005-03-01

    A total of more than 240 human space flights have been completed to date, involving about 450 astronauts from various countries, for a combined total presence in space of more than 70 years. The seventh long-duration expedition crew is currently in residence aboard the International Space Station, continuing a permanent presence in space that began in October 2000. During that time, investigations have been conducted on both humans and animal models to study the bone demineralization and muscle deconditioning, space motion sickness, the causes and possible treatment of postflight orthostatic intolerance, the changes in immune function, crew and crew-ground interactions, and the medical issues of living in a space environment, such as the effects of radiation or the risk of developing kidney stones. Some results of these investigations have led to fundamental discoveries about the adaptation of the human body to the space environment. Gilles Clément has been active in this research. This readable text presents the findings from the life science experiments conducted during and after space missions. Topics discussed in this book include: adaptation of sensory-motor, cardio-vascular, bone, and muscle systems to the microgravity of spaceflight; psychological and sociological issues of living in a confined, isolated, and stressful environment; operational space medicine, such as crew selection, training and in-flight health monitoring, countermeasures and support; results of space biology experiments on individual cells, plants, and animal models; and the impact of long-duration missions such as the human mission to Mars. The author also provides a detailed description of how to fly a space experiment, based on his own experience with research projects conducted onboard Salyut-7, Mir, Spacelab, and the Space Shuttle. Now is the time to look at the future of human spaceflight and what comes next. The future human exploration of Mars captures the imagination of both the

  17. Fundamentals of Space Medicine

    NASA Astrophysics Data System (ADS)

    Clément, G.

    2003-10-01

    As of today, a total of more than 240 human space flights have been completed, involving about 450 astronauts from various countries, for a combined total presence in space of more than 70 years. The seventh long-duration expedition crew is currently in residence aboard the International Space Station, continuing a permanent presence in space that began in October 2000. During that time, investigations have been conducted on both humans and animal models to study bone demineralization and muscle deconditioning, space motion sickness, the causes and possible treatment of postflight orthostatic intolerance, the changes in immune function, crew and crew-ground interactions, and the medical issues of living in a space environment, such as the effects of radiation or the risk of developing kidney stones. Some results of these investigations have led to fundamental discoveries about the adaptation of the human body to the space environment. Gilles Clément has been active in this research. This book presents, in readable form, the findings from the life science experiments conducted during and after space missions. Topics discussed in this book include: adaptation of sensory-motor, cardiovascular, bone and muscle systems to the microgravity of spaceflight; psychological and sociological issues of living in a confined, isolated and stressful environment; operational space medicine, such as crew selection, training and in-flight health monitoring, countermeasures and support; results of space biology experiments on individual cells, plants, and animal models; and the impact of long-duration missions such as the human mission to Mars. The author also provides a detailed description of how to fly a space experiment, based on his own experience with research projects conducted onboard Salyut-7, Mir, Spacelab, and the Space Shuttle. Now is the time to look at the future of human spaceflight and what comes next. The future human exploration of Mars captures the imagination

  18. [Minimally invasive ENT surgery. Progress due to modern technology].

    PubMed

    Plinkert, P K; Schurr, M O; Kunert, W; Flemming, E; Buess, G; Zenner, H P

    1996-06-01

    Three fundamentals have to be fulfilled to optimize minimally invasive surgery: three-dimensional imaging, free maneuverability of the instruments, and sensory feedback. Projection of two pictures from a stereoendoscope and subsequent separation with an LCD shutter allows three-dimensional videoendoscopy to be performed. A high-frequency shutter technique (100/120 Hz) presents pictures from the two video cameras to the right and left eye, respectively, so that the surgeon has spatial vision of the operative field. Steerable instruments have four components: a control unit, a rigid shaft, steerable multi-joints, and a distal effector. The steerable multi-joints give two additional degrees of freedom compared to conventional rigid instruments in endoscopic surgery. For intuitive movements, however, an electronic control system is necessary that is comparable to the "master-slave" principle in remote technology. A remote manipulator system with six degrees of freedom is now available. Additionally, a multifunctional distal tip permits different surgical steps to be performed without changing the instrument. For better control of the instrument and the operative procedure, tactile feedback can be achieved with appropriate microsensor systems. Recent projects suggest that an artificial sensor system can be established within the foreseeable future.

  19. Revisiting Tversky's diagnosticity principle

    PubMed Central

    Evers, Ellen R. K.; Lakens, Daniël

    2013-01-01

    Similarity is a fundamental concept in cognition. In 1977, Amos Tversky published a highly influential feature-based model of how people judge the similarity between objects. The model highlights the context-dependence of similarity judgments, and challenged geometric models of similarity. One of the context-dependent effects Tversky describes is the diagnosticity principle. The diagnosticity principle determines which features are used to cluster multiple objects into subgroups. Perceived similarity between items within clusters is expected to increase, while similarity between items in different clusters decreases. Here, we present two pre-registered replications of the studies on the diagnosticity effect reported in Tversky (1977). Additionally, one alternative mechanism that has been proposed to play a role in the original studies, an increase in the choice for distractor items (a substitution effect, see Medin et al., 1995), is examined. Our results replicate those found by Tversky (1977), revealing an average diagnosticity effect of 4.75%. However, when we eliminate the possibility of substitution effects confounding the results, a meta-analysis of the data provides no indication of any remaining effect of diagnosticity. PMID:25161638

  20. Li-O2 Kinetic Overpotentials: Tafel Plots from Experiment and First-Principles Theory.

    PubMed

    Viswanathan, V; Nørskov, J K; Speidel, A; Scheffler, R; Gowda, S; Luntz, A C

    2013-02-21

    We report the current dependence of the fundamental kinetic overpotentials for Li-O2 discharge and charge (Tafel plots) that define the optimal cycle efficiency in a Li-air battery. Comparison of the unusual experimental Tafel plots obtained in a bulk electrolysis cell with those obtained by first-principles theory is semiquantitative. The kinetic overpotentials for any practical current density are very small, considerably less than polarization losses due to iR drops from the cell impedance in Li-O2 batteries. If only the kinetic overpotentials were present, then a discharge-charge voltaic cycle efficiency of ∼85% should be possible at ∼10 mA/cm² superficial current density in a battery of ∼0.1 m² total cathode area. We therefore suggest that minimizing the cell impedance is a more important problem than minimizing the kinetic overpotentials to develop higher current Li-air batteries.
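
    For background, a Tafel plot displays overpotential against the logarithm of current density. In its standard textbook form (not the paper's fitted expression), and ignoring iR losses as the abstract does when isolating the kinetic contribution:

        \eta = b \, \log_{10}\!\left(\frac{i}{i_{0}}\right),
        \qquad
        \text{voltaic cycle efficiency} = \frac{U_{0} - \eta_{\mathrm{dis}}}{U_{0} + \eta_{\mathrm{chg}}},

    where b is the Tafel slope, i_0 the exchange current density, U_0 the equilibrium cell voltage, and \eta_{\mathrm{dis}}, \eta_{\mathrm{chg}} the discharge and charge kinetic overpotentials.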

  1. Conservative design minimizes emissions

    SciTech Connect

    Schwieger, B.

    1980-02-01

    The fundamentals of combustion and the different types of firing equipment available for burning hog fuel are reviewed. Pile burning is generally used when high-moisture fuel is burned and combustion systems such as the Dutch oven, refractory-lined fuel cells, the cyclone furnace and the wet-cell burner are described. Semipile combustion systems which burn hog fuel containing 60% moisture or more, particularly the inclined water-cooled pinhole grate, are also discussed.

  2. Promoting patient-centred fundamental care in acute healthcare systems.

    PubMed

    Feo, Rebecca; Kitson, Alison

    2016-05-01

    Meeting patients' fundamental care needs is essential for optimal safety and recovery and positive experiences within any healthcare setting. There is growing international evidence, however, that these fundamentals are often poorly executed in acute care settings, resulting in patient safety threats, poorer and costly care outcomes, and dehumanising experiences for patients and families. Whilst care standards and policy initiatives are attempting to address these issues, their impact has been limited. This discussion paper explores, through a series of propositions, why fundamental care can be overlooked in sophisticated, high technology acute care settings. We argue that the central problem lies in the invisibility and subsequent devaluing of fundamental care. Such care is perceived to involve simple tasks that require little skill to execute and have minimal impact on patient outcomes. The propositions explore the potential origins of this prevailing perception, focusing upon the impact of the biomedical model, the consequences of managerial approaches that drive healthcare cultures, and the devaluing of fundamental care by nurses themselves. These multiple sources of invisibility and devaluing surrounding fundamental care have rendered the concept underdeveloped and misunderstood both conceptually and theoretically. Likewise, there remains minimal role clarification around who should be responsible for and deliver such care, and a dearth of empirical evidence and evidence-based metrics. In explicating these propositions, we argue that key to transforming the delivery of acute healthcare is a substantial shift in the conceptualisation of fundamental care. The propositions present a cogent argument that counters the prevailing perception that fundamental care is basic and does not require systematic investigation. We conclude by calling for the explicit valuing and embedding of fundamental care in healthcare education, research, practice and policy. Without this

  3. Structuring Instruction in Arts Fundamentals for the Elementary School

    ERIC Educational Resources Information Center

    Mittler, Gene A.

    1976-01-01

    Effective instruction in art fundamentals requires a structure which allows students to master basic concepts and skills before progressing to more elaborate ones. A matrix designed to illustrate the variety of design relationships realized by combining elements and principles of art was presented. (Author/RK)

  4. 48 CFR 9904.405-40 - Fundamental requirement.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    48 CFR 9904.405-40, Federal Acquisition Regulations System, Cost Accounting Standards Board, Cost Accounting Standards: Fundamental requirement. (a) Costs expressly... be subject to the same cost accounting principles governing cost allocability as allowable costs....

  5. Correction of harmonic motion and Kepler orbit based on the minimal momentum uncertainty

    NASA Astrophysics Data System (ADS)

    Chung, Won Sang; Hassanabadi, Hassan

    2017-03-01

    In this paper we consider the deformed Heisenberg uncertainty principle with a minimal uncertainty in momentum, which is called the minimal momentum uncertainty principle (MMUP). We consider the MMUP in D dimensions and its classical analogue. Using these, we investigate the MMUP effect for harmonic motion and the Kepler orbit.
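
    A standard way to realize a minimal momentum uncertainty is through a deformed commutator of the following form (illustrative notation; the paper's conventions and its D-dimensional generalization may differ). Minimizing the right-hand side over \Delta x, with \langle x\rangle = 0, leaves a nonzero floor on \Delta p:

        [x, p] = i\hbar\left(1 + \alpha x^{2}\right)
        \;\Rightarrow\;
        \Delta x\,\Delta p \ge \frac{\hbar}{2}\left(1 + \alpha(\Delta x)^{2} + \alpha\langle x\rangle^{2}\right)
        \;\Rightarrow\;
        \Delta p_{\min} = \hbar\sqrt{\alpha}.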

  6. Discrete Minimal Surface Algebras

    NASA Astrophysics Data System (ADS)

    Arnlind, Joakim; Hoppe, Jens

    2010-05-01

    We consider discrete minimal surface algebras (DMSA) as generalized noncommutative analogues of minimal surfaces in higher dimensional spheres. These algebras appear naturally in membrane theory, where sequences of their representations are used as a regularization. After showing that the defining relations of the algebra are consistent, and that one can compute a basis of the enveloping algebra, we give several explicit examples of DMSAs in terms of subsets of sl(n) (any semi-simple Lie algebra providing a trivial example by itself). A special class of DMSAs are Yang-Mills algebras. The representation graph is introduced to study representations of DMSAs of dimension d ≤ 4, and properties of representations are related to properties of graphs. The representation graph of a tensor product is (generically) the Cartesian product of the corresponding graphs. We provide explicit examples of irreducible representations and, for coinciding eigenvalues, classify all the unitary representations of the corresponding algebras.

  7. Minimal hepatic encephalopathy.

    PubMed

    Zamora Nava, Luis Eduardo; Torre Delgadillo, Aldo

    2011-06-01

    The term minimal hepatic encephalopathy (MHE) refers to the subtle changes in cognitive function, electrophysiological parameters, cerebral neurochemical/neurotransmitter homeostasis, cerebral blood flow, metabolism, and fluid homeostasis that can be observed in patients with cirrhosis who have no clinical evidence of hepatic encephalopathy; the prevalence is as high as 84% in patients with hepatic cirrhosis. Physicians generally do not perceive this complication of cirrhosis; the diagnosis can only be made with neuropsychological tests and other special measurements, such as evoked potentials, and imaging studies such as positron emission tomography. Diagnosis of minimal hepatic encephalopathy may have prognostic and therapeutic implications in cirrhotic patients. The present review aims to explore the clinical, therapeutic, diagnostic, and prognostic aspects of this complication.

  8. [Minimally invasive thymus surgery].

    PubMed

    Rückert, J C; Ismail, M; Swierzy, M; Braumann, C; Badakhshi, H; Rogalla, P; Meisel, A; Rückert, R I; Müller, J M

    2008-01-01

    There are absolute and relative indications for complete removal of the thymus gland. In the complex therapy of autoimmune-related myasthenia gravis, thymectomy plays a central role and is performed with relative indication. In the case of thymoma with or without myasthenia, thymectomy is absolutely indicated. Thymus resection is further necessary for cases of hyperparathyroidism with ectopic intrathymic parathyroids or with certain forms of multiple endocrine neoplasia. The transcervical operation technique traditionally reflected the well-founded desire for minimal invasiveness for thymectomy. Due to the requirement of radicality, however, most of these operations were performed using sternotomy. With the evolution of therapeutic thoracoscopy in thoracic surgery, several pure or extended minimally invasive operation techniques for thymectomy have been developed. At present uni- or bilateral, subxiphoid, and modified transcervical single or combination thoracoscopic techniques are in use. Recently a very precise new level of thoracoscopic operation technique was developed using robotic-assisted surgery. There are special advantages of this technique for thymectomy. An overview of the development of and experience with minimally invasive thymectomy is presented, including data from the largest series published so far.

  9. Waste Minimization Crosscut Plan

    SciTech Connect

    Not Available

    1992-05-13

    On November 27, 1991, the Secretary of Energy directed that a Department of Energy (DOE) crosscut plan for waste minimization (WMin) be prepared and submitted by March 1, 1992. This Waste Minimization Crosscut Plan responds to the Secretary's direction and supports the National Energy Strategy (NES) goals of achieving greater energy security, increasing energy and economic efficiency, and enhancing environmental quality. It provides a DOE-wide planning framework for effective coordination of all DOE WMin activities. This Plan was jointly prepared by the following Program Secretarial Officer (PSO) organizations: Civilian Radioactive Waste Management (RW); Conservation and Renewable Energy (CE); Defense Programs (DP); Environmental Restoration and Waste Management (EM), lead; Energy Research (ER); Fossil Energy (FE); Nuclear Energy (NE); and New Production Reactors (NP). Assistance and guidance was provided by the offices of Policy, Planning, and Analysis (PE) and Environment, Safety and Health (EH). Comprehensive application of waste minimization within the Department and in both the public and private sectors will provide significant benefits and support National Energy Strategy goals. These benefits include conservation of a substantial proportion of the energy now used by industry and Government, improved environmental quality, reduced health risks, improved production efficiencies, and longer useful life of disposal capacity. Taken together, these benefits will mean improved US global competitiveness, expanded job opportunities, and a better quality of life for all citizens.

  10. Waste Minimization Crosscut Plan

    SciTech Connect

    Not Available

    1992-05-13

    On November 27, 1991, the Secretary of Energy directed that a Department of Energy (DOE) crosscut plan for waste minimization (WMin) be prepared and submitted by March 1, 1992. This Waste Minimization Crosscut Plan responds to the Secretary's direction and supports the National Energy Strategy (NES) goals of achieving greater energy security, increasing energy and economic efficiency, and enhancing environmental quality. It provides a DOE-wide planning framework for effective coordination of all DOE WMin activities. This Plan was jointly prepared by the following Program Secretarial Officer (PSO) organizations: Civilian Radioactive Waste Management (RW); Conservation and Renewable Energy (CE); Defense Programs (DP); Environmental Restoration and Waste Management (EM), lead; Energy Research (ER); Fossil Energy (FE); Nuclear Energy (NE); and New Production Reactors (NP). Assistance and guidance was provided by the offices of Policy, Planning, and Analysis (PE) and Environment, Safety and Health (EH). Comprehensive application of waste minimization within the Department and in both the public and private sectors will provide significant benefits and support National Energy Strategy goals. These benefits include conservation of a substantial proportion of the energy now used by industry and Government, improved environmental quality, reduced health risks, improved production efficiencies, and longer useful life of disposal capacity. Taken together, these benefits will mean improved US global competitiveness, expanded job opportunities, and a better quality of life for all citizens.

  11. Against Explanatory Minimalism in Psychiatry

    PubMed Central

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell’s criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein’s Zettel. But attention to the context of Wittgenstein’s remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation. PMID:26696908

  12. Against Explanatory Minimalism in Psychiatry.

    PubMed

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell's criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein's Zettel. But attention to the context of Wittgenstein's remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation.

  13. Waste minimization in environmental sampling and analysis

    SciTech Connect

    Brice, D.A.; Nixon, J. (Fernald Environmental Management Project); Lewis, E.T.

    1992-01-01

    Environmental investigations of the extent and effect of contamination, and projects to remediate such contamination, are designed to mitigate perceived threats to human health and the environment. During the course of these investigations, excavations, borings, and monitoring wells are constructed; monitoring wells are developed and purged prior to sampling; samples are collected; equipment is decontaminated; constituents are extracted and analyzed; and personal protective equipment is used to keep workers safe. All of these activities generate waste. A large portion of this waste may be classified as hazardous based on characteristics or constituent components. Waste minimization is defined as reducing the volume and/or toxicity of waste generated by a process. Waste minimization has proven to be an effective means of cost reduction and of improving worker health, safety, and environmental awareness in the industrial workplace through pollution prevention. Building waste minimization goals into a project during the planning phase is both cost effective and consistent with total quality management principles. Application of waste minimization principles should be an integral part of the planning and conduct of environmental investigations. Current regulatory guidance on planning environmental investigations focuses on data quality and risk assessment objectives. Waste minimization should also be a scoping priority, along with meeting worker protection requirements, protection of human health and the environment, and achieving data quality objectives. Waste volume or toxicity can be reduced through the use of smaller sample sizes, less toxic extraction solvents, less hazardous decontamination materials, smaller excavations and borings, smaller diameter monitoring wells, dedicated sampling equipment, well-fitting personal protective equipment, judicious use of screening technologies, and analyzing only for parameters of concern.

  14. Waste minimization in environmental sampling and analysis

    SciTech Connect

    Brice, D.A.; Nixon, J.; Lewis, E.T.

    1992-03-01

    Environmental investigations of the extent and effect of contamination, and projects to remediate such contamination, are designed to mitigate perceived threats to human health and the environment. During the course of these investigations, excavations, borings, and monitoring wells are constructed; monitoring wells are developed and purged prior to sampling; samples are collected; equipment is decontaminated; constituents are extracted and analyzed; and personal protective equipment is used to keep workers safe. All of these activities generate waste. A large portion of this waste may be classified as hazardous based on characteristics or constituent components. Waste minimization is defined as reducing the volume and/or toxicity of waste generated by a process. Waste minimization has proven to be an effective means of cost reduction and of improving worker health, safety, and environmental awareness in the industrial workplace through pollution prevention. Building waste minimization goals into a project during the planning phase is both cost effective and consistent with total quality management principles. Application of waste minimization principles should be an integral part of the planning and conduct of environmental investigations. Current regulatory guidance on planning environmental investigations focuses on data quality and risk assessment objectives. Waste minimization should also be a scoping priority, along with meeting worker protection requirements, protection of human health and the environment, and achieving data quality objectives. Waste volume or toxicity can be reduced through the use of smaller sample sizes, less toxic extraction solvents, less hazardous decontamination materials, smaller excavations and borings, smaller diameter monitoring wells, dedicated sampling equipment, well-fitting personal protective equipment, judicious use of screening technologies, and analyzing only for parameters of concern.

  15. An Evaluation of Fundamental Schools.

    ERIC Educational Resources Information Center

    Weber, Larry J.; And Others

    1984-01-01

    When compared with regular schools in the same district, fundamental school students performed as well as or better than regular school students; fundamental schools rated better on learning climate, discipline, and suspensions; and there were no differences in student self-concept. (Author/BW)

  16. A Matter of Principle: The Principles of Quantum Theory, Dirac's Equation, and Quantum Information

    NASA Astrophysics Data System (ADS)

    Plotnitsky, Arkady

    2015-10-01

    This article is concerned with the role of fundamental principles in theoretical physics, especially quantum theory. The fundamental principles of relativity will be addressed as well, in view of their role in quantum electrodynamics and quantum field theory, specifically in Dirac's work; in particular, Dirac's derivation of his relativistic equation of the electron from the principles of relativity and quantum theory is the main focus of this article. I shall also consider Heisenberg's earlier work leading him to the discovery of quantum mechanics, which inspired Dirac's work. I argue that Heisenberg's and Dirac's work was guided by their adherence to and their confidence in the fundamental principles of quantum theory. The final section of the article discusses the recent work by D'Ariano and coworkers on the principles of quantum information theory, which extend quantum theory and its principles in a new direction. This extension enabled them to offer a new derivation of Dirac's equation from these principles alone, without using the principles of relativity.
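
    For reference, the equation at issue in its familiar covariant form (standard notation, quoted for context rather than from the article itself):

        \left(i\gamma^{\mu}\partial_{\mu} - \frac{mc}{\hbar}\right)\psi = 0,
        \qquad
        \{\gamma^{\mu}, \gamma^{\nu}\} = 2\eta^{\mu\nu},

    where the anticommutation relations of the \gamma^{\mu} matrices are what allow a first-order equation to reproduce the relativistic energy-momentum relation E^2 = p^2 c^2 + m^2 c^4.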

  17. Minimally invasive mediastinal surgery

    PubMed Central

    Melfi, Franca M. A.; Mussi, Alfredo

    2016-01-01

    In the past, mediastinal surgery was associated with the necessity of a maximum exposure, which was accomplished through various approaches. In the early 1990s, many surgical fields, including thoracic surgery, observed the development of minimally invasive techniques. These included video-assisted thoracic surgery (VATS), which confers clear advantages over an open approach, such as less trauma, short hospital stay, improved cosmetic results and preservation of lung function. However, VATS is associated with several disadvantages. For this reason, it is not routinely performed for resection of mediastinal mass lesions, especially those located in the anterior mediastinum, a tiny and remote space that contains vital structures at risk of injury. Robotic systems can overcome the limits of VATS, offering three-dimensional (3D) vision and wristed instruments, and are being increasingly used. With regard to thymectomy for myasthenia gravis (MG), unilateral and bilateral VATS approaches have demonstrated good long-term neurologic results with low complication rates. Nevertheless, some authors still advocate the necessity of maximum exposure, especially when considering the distribution of normal and ectopic thymic tissue. In recent studies, the robotic approach has been shown to provide similar neurological outcomes when compared to transsternal and VATS approaches, and is associated with a low morbidity. Importantly, through a unilateral robotic technique, it is possible to dissect and remove at least the same amount of mediastinal fat tissue. Preliminary results on early-stage thymomatous disease indicated that minimally invasive approaches are safe and feasible, with a low rate of pleural recurrence, underlining the necessity of a “no-touch” technique. However, especially for thymomatous disease characterized by an indolent nature, further studies with long follow-up periods are necessary in order to assess oncologic and neurologic results through minimally

  18. Equivalence principles and electromagnetism

    NASA Technical Reports Server (NTRS)

    Ni, W.-T.

    1977-01-01

    The implications of the weak equivalence principles are investigated in detail for electromagnetic systems in a general framework. In particular, it is shown that the universality of free-fall trajectories (Galileo weak equivalence principle) does not imply the validity of the Einstein equivalence principle. However, the Galileo principle plus the universality of free-fall rotation states does imply the Einstein principle.

  19. Wake Vortex Minimization

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A status report is presented on research directed at reducing the vortex disturbances of aircraft wakes. The objective of such a reduction is to minimize the hazard to smaller aircraft that might encounter these wakes. Inviscid modeling was used to study trailing vortices and viscous effects were investigated. Laser velocimeters were utilized in the measurement of aircraft wakes. Flight and wind tunnel tests were performed on model-scale and full-scale aircraft of various designs. Parameters investigated included the effect of wing span, wing flaps, spoilers, splines and engine thrust on vortex attenuation. Results indicate that vortices may be alleviated through aerodynamic means.

  20. Minimally refined biomass fuel

    DOEpatents

    Pearson, Richard K.; Hirschfeld, Tomas B.

    1984-01-01

    A minimally refined fluid composition, suitable as a fuel mixture and derived from biomass material, is comprised of one or more water-soluble carbohydrates such as sucrose, one or more alcohols having less than four carbons, and water. The carbohydrate provides the fuel source; water solubilizes the carbohydrates; and the alcohol aids in the combustion of the carbohydrate and reduces the viscosity of the carbohydrate/water solution. Because less energy is required to obtain the carbohydrate from the raw biomass than to produce alcohol, an overall energy savings is realized compared to fuels employing alcohol as the primary fuel.

  1. The Higgs boson mass in minimal Technicolor models

    SciTech Connect

    Doff, A.; Natale, A. A.

    2010-11-12

    Recently, Minimal and Ultraminimal Technicolor models were proposed in which the presence of TC fermions in representations other than the fundamental one led to viable models without conflict with the known value of the measured S parameter. In this work we apply the results of [5] to compute the masses of the Higgs boson in the case of the Minimal and Ultraminimal Technicolor models.

  2. Logarithmic superconformal minimal models

    NASA Astrophysics Data System (ADS)

    Pearce, Paul A.; Rasmussen, Jørgen; Tartaglia, Elena

    2014-05-01

    The higher fusion level logarithmic minimal models {\cal LM}(P,P';n) have recently been constructed as the diagonal GKO cosets {(A_1^{(1)})_k \oplus (A_1^{(1)})_n}/{(A_1^{(1)})_{k+n}}, where n ≥ 1 is an integer fusion level and k = nP/(P'-P) - 2 is a fractional level. For n = 1, these are the well-studied logarithmic minimal models {\cal LM}(P,P') ≡ {\cal LM}(P,P';1). For n ≥ 2, we argue that these critical theories are realized on the lattice by n × n fusion of the n = 1 models. We study the critical fused lattice models {\cal LM}(p,p')_{n\times n} within a lattice approach and focus our study on the n = 2 models. We call these logarithmic superconformal minimal models {\cal LSM}(p,p') ≡ {\cal LM}(P,P';2), where P = |2p - p'|, P' = p' and p, p' are coprime. These models share the central charges c = c^{P,P';2} = \frac{3}{2}\big(1 - {2(P'-P)^2}/{PP'}\big) of the rational superconformal minimal models {\cal SM}(P,P'). Lattice realizations of these theories are constructed by fusing 2 × 2 blocks of the elementary face operators of the n = 1 logarithmic minimal models {\cal LM}(p,p'). Algebraically, this entails the fused planar Temperley-Lieb algebra, which is a spin-1 Birman-Murakami-Wenzl tangle algebra with loop fugacity β^2 = [x]_3 = x^2 + 1 + x^{-2} and twist ω = x^4, where x = e^{iλ} and λ = (p'-p)π/p'. The first two members of this n = 2 series are superconformal dense polymers {\cal LSM}(2,3) with c = -\frac{5}{2}, β^2 = 0, and superconformal percolation {\cal LSM}(3,4) with c = 0, β^2 = 1. We calculate the bulk and boundary free energies analytically. By numerically studying finite-size conformal spectra on the strip with appropriate boundary conditions, we argue that, in the continuum scaling limit, these lattice models are associated with the logarithmic superconformal models {\cal LM}(P,P';2). For system size N, we propose finitized Kac character formulae of the form q^{-c^{P,P';2}/24 + \Delta^{P,P';2}_{r
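
    As a quick arithmetic check on the two quoted examples, substituting P = |2p - p'|, P' = p' into the central charge formula c^{P,P';2} = \frac{3}{2}\big(1 - {2(P'-P)^2}/{PP'}\big) gives:

        {\cal LSM}(2,3):\; P = |2\cdot 2 - 3| = 1,\; P' = 3:\quad
        c = \tfrac{3}{2}\left(1 - \tfrac{2(3-1)^{2}}{1\cdot 3}\right) = -\tfrac{5}{2},

        {\cal LSM}(3,4):\; P = |2\cdot 3 - 4| = 2,\; P' = 4:\quad
        c = \tfrac{3}{2}\left(1 - \tfrac{2(4-2)^{2}}{2\cdot 4}\right) = 0,

    in agreement with the values stated for superconformal dense polymers and superconformal percolation.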

  3. Generalized uncertainty principle and analogue of quantum gravity in optics

    NASA Astrophysics Data System (ADS)

    Braidotti, Maria Chiara; Musslimani, Ziad H.; Conti, Claudio

    2017-01-01

    The design of optical systems capable of processing and manipulating ultra-short pulses and ultra-focused beams is highly challenging, with far-reaching fundamental and technological applications. One key obstacle routinely encountered while implementing sub-wavelength optical schemes is how to overcome the limitations set by standard Fourier optics. A strategy to overcome these difficulties is to utilize the concept of a generalized uncertainty principle (G-UP), which was originally developed to study quantum gravity. In this paper we propose to use the concept of G-UP within the framework of optics to show that the generalized Schrödinger equation describing short pulses and ultra-focused beams predicts the existence of a minimal spatial or temporal scale, which in turn implies the existence of maximally localized states. Using a Gaussian wavepacket with complex phase, we derive the corresponding generalized uncertainty relation and its maximally localized states. Furthermore, we numerically show that the presence of nonlinearity helps the system to reach its maximal localization. Our results may trigger further theoretical and experimental tests for practical applications and analogues of fundamental physical theories.

  4. Design of the fundamental power coupler and photocathode inserts for the 112MHz superconducting electron gun

    SciTech Connect

    Xin, T.; Ben-Zvi, I.; Belomestnykh, S.; Chang, X.; Rao, T.; Skaritka, J.; Wu, Q.; Wang, E.; Liang, X.

    2011-07-25

    A 112 MHz superconducting quarter-wave resonator electron gun will be used as the injector of the Coherent Electron Cooling (CEC) proof-of-principle experiment at BNL. Furthermore, this electron gun can serve as a testing cavity for various photocathodes. In this paper, we present the design of the cathode stalks and a Fundamental Power Coupler (FPC) designed for the future experiments. Two types of cathode stalks are discussed. A special shape of the stalk is applied in order to minimize the RF power loss. The location of the cathode plane is also optimized to enable the extraction of a low emittance beam. The coaxial waveguide structure of the FPC provides a tunable coupling factor and small interference with the electron beam output. The optimization of the coupling factor and the location of the FPC are discussed in detail. Based on transmission line theory, we designed a half-wavelength cathode stalk which significantly brings down the voltage drop between the cavity and the stalk, from more than 5.6 kV to 0.1 kV. The transverse field distribution on the cathode has been optimized by carefully choosing the position of the cathode stalk inside the cavity. Moreover, in order to decrease the RF power loss, a variable-diameter design of the cathode stalk has been applied. Compared to a uniform stalk, this design gives much smaller power losses at important locations. Besides that, we also propose a fundamental power coupler based on the designed beam parameters for the future proof-of-principle CEC experiment. This FPC should give a strong enough coupling, with an external Q range from 1.5×10⁷ to 2.6×10⁸.
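
    The half-wavelength choice can be motivated by the standard standing-wave picture of a coaxial stalk shorted at its far end: the voltage is forced to zero at the short and returns to a node one half-wavelength away. A minimal numerical sketch of that argument follows; the vacuum-line assumption, the voltage scale, and the sampled lengths are illustrative, not the authors' actual model:

        import numpy as np

        f = 112e6                 # cavity frequency, Hz
        c = 2.998e8               # speed of light, m/s (vacuum coaxial line assumed)
        lam = c / f               # wavelength, ~2.68 m
        V0 = 5.6e3                # illustrative antinode voltage scale, V

        def gap_voltage(length):
            """Standing-wave voltage at the cavity end of a stalk that is
            shorted at the far end, a distance `length` away."""
            return V0 * abs(np.sin(2 * np.pi * length / lam))

        for frac in (0.25, 0.40, 0.50):   # stalk length as a fraction of lambda
            L = frac * lam
            print(f"L = {frac:4.2f} lambda -> |V_gap| ~ {gap_voltage(L):8.1f} V")

    A quarter-wave stalk sits at a voltage antinode (of order V0), while a half-wave stalk sits at a node, which is the qualitative reason the half-wavelength design suppresses the cavity-to-stalk voltage drop.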

  5. Ablative Thermal Protection System Fundamentals

    NASA Technical Reports Server (NTRS)

    Beck, Robin A. S.

    2013-01-01

    This is the presentation for a short course on the fundamentals of ablative thermal protection systems. It covers the definition of ablation, description of ablative materials, how they work, how to analyze them and how to model them.

  6. Quantum operations: technical or fundamental challenge?

    NASA Astrophysics Data System (ADS)

    Mielnik, Bogdan

    2013-09-01

    A class of unitary operations generated by idealized, semiclassical fields is studied. The operations implemented by sharp potential kicks are revisited and the possibility of performing them by softly varying external fields is examined. The possibility of using the ion traps as ‘operation factories’ transforming quantum states is discussed. The non-perturbative algorithms indicate that the results of abstract δ-pulses of oscillator potentials can become real. Some of them, if empirically achieved, could be essential to examine certain atypical quantum ideas. In particular, simple dynamical manipulations might contribute to the Aharonov-Bohm criticism of the time-energy uncertainty principle, while some others may verify the existence of fundamental precision limits of the position measurements or the reality of ‘non-commutative geometries’.

  7. Fundamentals of bipolar high-frequency surgery.

    PubMed

    Reidenbach, H D

    1993-04-01

    In endoscopic surgery a very precise surgical dissection technique and an efficient hemostasis are of decisive importance. The bipolar technique may be regarded as a method which satisfies both requirements, especially regarding a high safety standard in application. In this context the biophysical and technical fundamentals of this method, which have been known in principle for a long time, are described with regard to the special demands of a newly developed field of modern surgery. After classification of this method into a general and a quasi-bipolar mode, various technological solutions of specific bipolar probes, in a strict and in a generalized sense, are characterized in terms of indication. Experimental results obtained with different bipolar instruments and probes are given. The application of modern microprocessor-controlled high-frequency surgery equipment and, wherever necessary, the integration of additional ancillary technology into the specialized bipolar instruments may result in highly useful and efficient tools for a key technology in endoscopic surgery.

  8. Minimally invasive valve surgery.

    PubMed

    Woo, Y Joseph; Seeburger, Joerg; Mohr, Friedrich W

    2007-01-01

    As alternatives to standard sternotomy, surgeons have developed innovative, minimally invasive approaches to conducting valve surgery. Through very small skin incisions and partial upper sternal division for aortic valve surgery and right minithoracotomy for mitral surgery, surgeons have become adept at performing complex valve procedures. Beyond cosmetic appeal, apparent benefits range from decreased pain and bleeding to improved respiratory function and recovery time. The large retrospective studies and few small prospective randomized studies are herein briefly summarized. The focus is then directed toward describing specific intraoperative technical details in current clinical use, covering anesthetic preparation, incision, mediastinal access, cardiovascular cannulation, valve exposure, and valve reconstruction. Finally, unique situations such as pulmonic valve surgery, reoperations, beating heart surgery, and robotics are discussed.

  9. Transanal Minimally Invasive Surgery

    PubMed Central

    deBeche-Adams, Teresa; Nassif, George

    2015-01-01

    Transanal minimally invasive surgery (TAMIS) was first described in 2010 as a crossover between single-incision laparoscopic surgery and transanal endoscopic microsurgery (TEM) to allow access to the proximal and mid-rectum for resection of benign and early-stage malignant rectal lesions. The TAMIS technique can also be used for noncurative intent surgery of more advanced lesions in patients who are not candidates for radical surgery. Proper workup and staging should be done before surgical decision-making. In addition to the TAMIS port, instrumentation and set up include readily available equipment found in most operating suites. TAMIS has proven its usefulness in a wide range of applications outside of local excision, including repair of rectourethral fistula, removal of rectal foreign body, control of rectal hemorrhage, and as an adjunct in total mesorectal excision for rectal cancer. TAMIS is an easily accessible, technically feasible, and cost-effective alternative to TEM. PMID:26491410

  10. Minimally invasive esophagectomy

    PubMed Central

    Herbella, Fernando A; Patti, Marco G

    2010-01-01

    Esophageal resection is associated with a high morbidity and mortality rate. Minimally invasive esophagectomy (MIE) might theoretically decrease this rate. We reviewed the current literature on MIE, with a focus on the available techniques, outcomes and comparison with open surgery. This review shows that the available literature on MIE is still crowded with heterogeneous studies with different techniques. There are no controlled and randomized trials, and the few retrospective comparative cohort studies are limited by small numbers of patients and biased by historical controls of open surgery. Based on the available literature, there is no evidence that MIE brings clear benefits compared to conventional esophagectomy. Increasing experience and the report of larger series might change this scenario. PMID:20698044

  11. The minimal length and quantum partition functions

    NASA Astrophysics Data System (ADS)

    Abbasiyan-Motlaq, M.; Pedram, P.

    2014-08-01

    We study the thermodynamics of various physical systems in the framework of the generalized uncertainty principle that implies a minimal length uncertainty proportional to the Planck length. We present a general scheme to analytically calculate the quantum partition function of the physical systems to first order in the deformation parameter, based on the behavior of the modified energy spectrum, and compare our results with the classical approach. Also, we find the modified internal energy and heat capacity of the systems for the anti-Snyder framework.
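
    As an illustration of this kind of calculation, the sketch below numerically sums a partition function for a deformed oscillator spectrum; the quadratic form of the correction term is a hypothetical stand-in, not the spectrum derived in the paper:

        import numpy as np

        # Partition function of a 1D harmonic oscillator whose spectrum
        # carries a small first-order correction in a deformation parameter,
        # E_n = E_n^(0) + alpha*g(n).  The g(n) ~ n^2 form is an assumption
        # for illustration only.
        hbar_omega = 1.0      # energy scale (units of hbar*omega)
        alpha = 1e-3          # small deformation parameter (assumed)
        beta = 2.0            # inverse temperature 1/(kT), not the GUP parameter
        n = np.arange(5000)   # enough levels for the sums to converge

        E0 = hbar_omega * (n + 0.5)          # undeformed ladder
        E  = E0 + alpha * hbar_omega * n**2  # hypothetical first-order correction

        Z0 = np.exp(-beta * E0).sum()        # undeformed partition function
        Z  = np.exp(-beta * E).sum()         # deformed partition function

        # Internal energy U = -d(ln Z)/d(beta), evaluated as a Boltzmann average
        U = (E * np.exp(-beta * E)).sum() / Z
        print(f"Z0 = {Z0:.6f}, Z = {Z:.6f}, U = {U:.6f}")

    Comparing Z against Z0 (and the analogous classical integrals) is the numerical counterpart of the first-order analytical scheme described in the abstract.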

  12. Minimally invasive parathyroid surgery

    PubMed Central

    Noureldine, Salem I.; Gooi, Zhen

    2015-01-01

    Traditionally, bilateral cervical exploration for localization of all four parathyroid glands and removal of any that are grossly enlarged has been the standard surgical treatment for primary hyperparathyroidism (PHPT). With the advances in preoperative localization studies and greater public demand for less invasive procedures, novel targeted, minimally invasive techniques to the parathyroid glands have been described and practiced over the past 2 decades. Minimally invasive parathyroidectomy (MIP) can be done either through the standard Kocher incision, a smaller midline incision, with video assistance (purely endoscopic and video-assisted techniques), or through an ectopically placed, extracervical, incision. In current practice, once PHPT is diagnosed, preoperative evaluation using high-resolution radiographic imaging to localize the offending parathyroid gland is essential if MIP is to be considered. The imaging study results suggest where the surgeon should begin the focused procedure and serve as a road map to allow tailoring of an efficient, imaging-guided dissection while eliminating the unnecessary dissection of multiple glands or a bilateral exploration. Intraoperative parathyroid hormone (IOPTH) levels may be measured during the procedure, or a gamma probe used during radioguided parathyroidectomy, to ascertain that the correct gland has been excised and that no other hyperfunctional tissue is present. MIP has many advantages over the traditional bilateral, four-gland exploration. MIP can be performed using local anesthesia, requires less operative time, results in fewer complications, and offers an improved cosmetic result and greater patient satisfaction. Additional advantages of MIP are earlier hospital discharge and decreased overall associated costs. This article aims to address the considerations for accomplishing MIP, including the role of preoperative imaging studies, intraoperative adjuncts, and surgical techniques. PMID:26425454

  13. Water Balance Covers For Waste Containment: Principles and Practice

    EPA Science Inventory

    Water Balance Covers for Waste Containment: Principles and Practices introduces water balance covers and compares them with conventional approaches to waste containment. The authors provide a detailed analysis of the fundamentals of soil physics and design issues, introduce appl...

  14. Principles of Guided Missiles and Nuclear Weapons.

    ERIC Educational Resources Information Center

    Naval Personnel Program Support Activity, Washington, DC.

    Fundamentals of missile and nuclear weapons systems are presented in this book which is primarily prepared as the second text of a three-volume series for students of the Navy Reserve Officers' Training Corps and the Officer Candidate School. Following an introduction to guided missiles and nuclear physics, basic principles and theories are…

  15. The Peter Principle: A Theory of Decline

    ERIC Educational Resources Information Center

    Lazear, Edward P.

    2004-01-01

    Some have observed that individuals perform worse after being promoted. The Peter principle, which states that people are promoted to their level of incompetence, suggests that something is fundamentally misaligned in the promotion process. This view is unnecessary and inconsistent with the data. Below, it is argued that ability appears lower…

  16. Development of Canonical Transformations from Hamilton's Principle.

    ERIC Educational Resources Information Center

    Quade, C. Richard

    1979-01-01

    The theory of canonical transformations and its development are discussed with regard to its application to Hamilton's principle. Included are the derivation of the equations of motion and a lack of symmetry in the formulation with respect to the Lagrangian and the fundamental commutator relations of quantum mechanics. (Author/SA)

  17. Basic principles of variable speed drives

    NASA Technical Reports Server (NTRS)

    Loewenthal, S. H.

    1973-01-01

    An understanding of the principles which govern variable speed drive operation is discussed for successful drive application. The fundamental factors of torque, speed ratio, and power as they relate to drive selection are discussed. The basic types of variable speed drives, their operating characteristics and their applications are also presented.

  18. Devising Principles of Design for Numeracy Tasks

    ERIC Educational Resources Information Center

    Geiger, Vince; Forgasz, Helen; Goos, Merrilyn; Bennison, Anne

    2014-01-01

    Numeracy is a fundamental component of the Australian National Curriculum as a General Capability identified in each F-10 subject. In this paper, we consider the principles of design necessary for the development of numeracy tasks specific to subjects other than mathematics--in this case, the subject of English. We explore the nature of potential…

  19. Reader-Response and the Pathos Principle.

    ERIC Educational Resources Information Center

    Johnson, Nan

    1988-01-01

    Reviews and equates theories of reader-response and rhetorical theories on audience response (the pathos principle). Concludes that the fundamental synonymity between them represents a significant bridge between analysis of literary texts and the dynamics of formal and social discourse and provides a theoretical foundation for teaching reading and…

  20. Beyond the Virtues-Principles Debate.

    ERIC Educational Resources Information Center

    Keat, Marilyn S.

    1992-01-01

    Indicates basic ontological assumptions in the virtues-principles debate in moral philosophy, noting Aristotle's and Kant's fundamental ideas about morality and considering a hermeneutic synthesis of theories. The article discusses what acceptance of the synthesis might mean in the theory and practice of moral pedagogy, offering examples of…

  1. Minimal Models of Multidimensional Computations

    PubMed Central

    Fitzgerald, Jeffrey D.; Sincich, Lawrence C.; Sharpee, Tatyana O.

    2011-01-01

    The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs. PMID:21455284
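
    As a concrete sketch of such a model, the following implements a logistic response whose argument contains first- and second-order terms in a two-dimensional stimulus; all parameter values are arbitrary placeholders rather than the fitted retinal or thalamic models described above:

        import numpy as np

        # Second-order maximum-noise-entropy (logistic) response model for a
        # binary output: P(spike=1 | x) is a logistic function of a quadratic
        # form in the two stimulus dimensions.
        rng = np.random.default_rng(0)

        a = -1.0                      # bias term (sets the mean firing rate)
        b = np.array([1.5, -0.5])     # first-order (linear) coefficients
        C = np.array([[0.8, 0.2],
                      [0.2, -0.4]])   # second-order (quadratic) coefficients

        def p_spike(x):
            """Logistic response whose argument contains first- and
            second-order stimulus terms, as in a second-order model."""
            z = a + b @ x + x @ C @ x
            return 1.0 / (1.0 + np.exp(-z))

        x = rng.standard_normal(2)    # one 2-D stimulus sample
        print(f"stimulus {x} -> P(spike) = {p_spike(x):.3f}")

    Constraining which moments enter z (only the mean output, only first-order, or first- plus second-order input/output correlations) is what distinguishes the family of maximum noise entropy models compared in the study.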

  2. Vedic principles of therapy.

    PubMed

    Boyer, R W

    2012-01-01

    This paper introduces Vedic principles of therapy as a holistic integration of healing and human development. The most integrative aspect is a "consciousness-based" approach in which the bottom line of the mind is consciousness itself, accessed by transcending mental activity to its simplest ground state. This directly contrasts with "unconscious-based" approaches, which hold that the basis of the conscious mind is the unconscious, such as analytic, humanistic, and cognitive-behavioral approaches. Although not presented as a specific therapeutic approach, interventions associated with this Vedic approach have extensive support in the applied research literature. A brief review of experimental research toward a general model of mind, and of cutting-edge developments in quantum physics toward nonlocal mind, shows a convergence on the ancient Vedic model of mind. Comparisons with contemporary therapies further show that the simplicity, subtlety, and holistic nature of the Vedic approach represent a significant advance over approaches which have overlooked the fundamental ground state of the mind.

  3. The minimal autopoietic unit.

    PubMed

    Luisi, Pier Luigi

    2014-12-01

    It is argued that closed, cell-like compartments may have existed in prebiotic time, showing a simplified metabolism which brought about a primitive form of stationary state, a kind of homeostasis. The autopoietic primitive cell can be taken as an example, and there are preliminary experimental data supporting the possible existence of this primitive form of cell activity. The genetic code permits, among other things, the continuous self-reproduction of proteins; enzymic proteins permit the synthesis of nucleic acids, and in this way there is a perfect recycling between the two most important classes of biopolymers in our life. On the other hand, the genetic code is a complex machinery, which cannot be placed at the very early time of the origin of life. And the question then arises whether some form of alternative beginning, prior to the genetic code, would have been possible; this is the core of the question asked. Is something with the flavor of early life conceivable prior to the genetic code? My answer is positive, although I am all too well aware that the term "conceivable" does not mean that this something can easily be realized experimentally. To illustrate my answer, I would first go back to the operational description of cellular life as given by the theory of autopoiesis. Accordingly, a living cell is an open system capable of self-maintenance, due to a process of internal self-regeneration of the components, all within a boundary which is itself a product from within. This is a universal code, valid not only for a cell but for any living macroscopic entity, as no living system exists on Earth which does not obey this principle. In this definition (or better, operational description) there is no mention of DNA or the genetic code. I added to that definition the term "open system", which is not present in the primary literature (Varela, et al., 1974), to make clear that every living system is indeed an open system; without this addition, it may seem that

  4. Minimal Marking: A Success Story

    ERIC Educational Resources Information Center

    McNeilly, Anne

    2014-01-01

    The minimal-marking project conducted in Ryerson's School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The "minimal-marking" concept (Haswell, 1983), which requires…

  5. Superpower nuclear minimalism

    SciTech Connect

    Graben, E.K.

    1992-01-01

    During the Cold War, the United States and the Soviet Union competed in building weapons -- now it seems like America and Russia are competing to get rid of them the fastest. The lengthy process of formal arms control has been replaced by exchanges of unilateral force reductions and proposals for reciprocal reductions not necessarily codified by treaty. Should superpower nuclear strategies change along with force postures? President Bush has yet to make a formal pronouncement on post-Cold War American nuclear strategy, and it is uncertain whether the Soviet/Russian doctrine of reasonable sufficiency formulated in the Gorbachev era actually heralds a change in strategy. Some of the provisions in the most recent round of unilateral proposals put forth by Presidents Bush and Yeltsin in January 1992 are compatible with a change in strategy. Whether such a change has actually occurred remains to be seen. With the end of the Cold War and the breakup of the Soviet Union, the strategic environment has fundamentally changed, so it would seem logical to reexamine strategy as well. There are two main schools of nuclear strategic thought: a maximalist school, mutual assured destruction (MAD), which emphasizes counterforce superiority and nuclear war-fighting capability, and a MAD-plus school, which emphasizes survivability of an assured destruction capability along with the ability to deliver small, limited nuclear attacks in the event that conflict occurs. The MAD-plus strategy is based on an attempt to conventionalize nuclear weapons, which is unrealistic.

  6. Minimally legally invasive dentistry.

    PubMed

    Lam, R

    2014-12-01

    One disadvantage of the rapid advances in modern dentistry is that treatment options have never been more varied or confusing. Compounded by a more educated population greatly assisted by online information in an increasingly litigious society, a major concern in recent times is increased litigation against health practitioners. The manner in which courts handle disputes is ambiguous and what is considered fair or just may not be reflected in the judicial process. Although legal decisions in Australia follow a doctrine of precedent, the law is not static and is often reflected by community sentiment. In medical litigation, this has seen the rejection of the Bolam principle with a preference towards greater patient rights. Recent court decisions may change the practice of dentistry and it is important that the clinician is not caught unaware. The aim of this article is to discuss legal issues that are pertinent to the practice of modern dentistry through an analysis of legal cases that have shaped health law. Through these discussions, the importance of continuing professional development, professional association and informed consent will be realized as a means to limit the legal complications of dental practice.

  7. Basic principles of stability.

    PubMed

    Egan, William; Schofield, Timothy

    2009-11-01

    An understanding of the principles of degradation, as well as the statistical tools for measuring product stability, is essential to the management of product quality. Key to this is management of vaccine potency. Vaccine shelf life is best managed through determination of a minimum potency release requirement, which helps assure adequate potency throughout expiry. Statistical tools such as least squares regression analysis should be employed to model potency decay. The use of such tools provides an incentive to properly design vaccine stability studies, whereas holding stability measurements to specification presents a disincentive for collecting valuable data. The laws of kinetics, such as Arrhenius behavior, help practitioners design effective accelerated stability programs, which can be utilized to manage stability after a process change. The design of stability studies should be carefully considered, with an eye to minimizing the variability of the stability parameter. In the case of measuring the degradation rate, testing at the beginning and the end of the study improves the precision of this estimate. Additional design considerations such as bracketing and matrixing improve the efficiency of stability evaluation of vaccines.
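
    To make the regression step concrete, the following minimal sketch (illustrative only; the data points, the linear decay model, and the 90% release specification are hypothetical assumptions, not taken from the abstract) fits potency decay by least squares and projects shelf life:

      import numpy as np

      # Hypothetical stability data: time on stability (months) vs. measured potency (%)
      months = np.array([0.0, 3.0, 6.0, 9.0, 12.0, 18.0, 24.0])
      potency = np.array([100.0, 98.9, 97.6, 96.8, 95.4, 93.5, 91.2])

      # Ordinary least squares fit of a linear decay model: potency(t) = b + m*t.
      # Placing test points at the start and end of the study tightens the
      # precision of the estimated slope m, as the abstract notes.
      m, b = np.polyfit(months, potency, 1)

      # Projected shelf life: time at which the fitted potency crosses the
      # (hypothetical) 90% minimum potency release requirement.
      spec = 90.0
      shelf_life = (spec - b) / m
      print(f"decay rate {m:.3f} %/month; projected shelf life {shelf_life:.1f} months")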

  8. Minimizing Accidents and Risks in High Adventure Outdoor Pursuits.

    ERIC Educational Resources Information Center

    Meier, Joel

    The fundamental dilemma in adventure programming is eliminating unreasonable risks to participants without also reducing levels of excitement, challenge, and stress. Most accidents are caused by a combination of unsafe conditions, unsafe acts, and errors in judgment. The best and only way to minimize critical human error in adventure programs is…

  9. Generalized uncertainty principle: Approaches and applications

    NASA Astrophysics Data System (ADS)

    Tawfik, A.; Diab, A.

    2014-11-01

    In this paper, we review some highlights from string theory, black hole physics, and doubly special relativity, and some thought experiments which were suggested to probe the shortest distances and/or maximum momentum at the Planck scale. Furthermore, all models developed in order to implement the minimal length scale and/or the maximum momentum in different physical systems are analyzed and compared. They entered the literature as the generalized uncertainty principle (GUP), assuming a modified dispersion relation, and therefore allow for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, Salecker-Wigner inequalities, the entropic nature of gravitational laws, Friedmann equations, minimal time measurement, and the thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty. A second one predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles, including the universality of the gravitational redshift, free fall, and the law of reciprocal action, and on the kinetic energy of composite systems. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. Concerns about compatibility with the equivalence principles, the universality of gravitational redshift and free fall, and the law of reciprocal action should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.
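
    For orientation, a commonly cited quadratic form of the GUP (one of the several variants the review compares; the notation below is a standard convention, not necessarily the authors') is

      \Delta x \, \Delta p \ge \frac{\hbar}{2}\left[1 + \beta \, (\Delta p)^2\right], \qquad \Delta x_{\min} = \hbar \sqrt{\beta},

    where minimizing the bound over \Delta p yields the minimal measurable length; variants with terms linear in \Delta p additionally produce a maximum momentum.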

  10. The Elements and Principles of Design: A Baseline Study

    ERIC Educational Resources Information Center

    Adams, Erin

    2013-01-01

    Critical to the discipline, both professionally and academically, are the fundamentals of interior design. These fundamentals include the elements and principles of interior design: the commonly accepted tools and vocabulary used to create and communicate successful interior environments. Research indicates a lack of consistency in both the…

  11. Astrophysical probes of fundamental physics

    NASA Astrophysics Data System (ADS)

    Martins, C. J. A. P.

    2009-10-01

    I review the motivation for varying fundamental couplings and discuss how these measurements can be used to constrain fundamental physics scenarios that would otherwise be inaccessible to experiment. I highlight the current controversial evidence for varying couplings and present some new results. Finally I focus on the relation between varying couplings and dark energy, and explain how varying coupling measurements might be used to probe the nature of dark energy, with some advantages over standard methods. In particular I discuss what can be achieved with future spectrographs such as ESPRESSO and CODEX.

  12. A review of the generalized uncertainty principle.

    PubMed

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, space noncommutativity, Lorentz invariance violation, and quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, concerns about its compatibility with the equivalence principle, the universality of gravitational redshift and free fall, and the law of reciprocal action are addressed.

  13. Integration of Social Studies Principles in the Home Economics Curriculum.

    ERIC Educational Resources Information Center

    Texas Tech Univ., Lubbock. Home Economics Curriculum Center.

    This document is intended to help secondary home economics teachers incorporate social studies principles into their curriculum. After an introduction, the document is divided into three sections. The first section identifies and explains fundamental principles within social studies and covers the history and current state of the social studies…

  14. Reconstitution of a Minimal Actin Cortex by Coupling Actin Filaments to Reconstituted Membranes.

    PubMed

    Vogel, Sven K

    2016-01-01

    A thin layer of actin filaments drives pivotal aspects of cell morphogenesis in many eukaryotic cell types and is generally termed the actin cortex. Myosin-driven contractility and actin cytoskeleton-membrane interactions form the basis of fundamental cellular processes such as cytokinesis, cell migration, and cortical flows. How the interplay between the actin cytoskeleton, the membrane, and actin-binding proteins drives these processes is far from understood. The complexity of the actin cortex in living cells, and the difficulty of manipulating the key cellular players, namely actin, myosin, and the membrane, make it challenging to gain detailed insights into the underlying mechanisms. Recent progress in developing bottom-up in vitro systems, in which the actin cytoskeleton is combined with reconstituted membranes, may provide a complementary route to revealing general principles underlying actin cortex properties. In this chapter the reconstitution of a minimal actin cortex by coupling actin filaments to a supported membrane is described. This minimal system is well suited to studying, for example, protein interactions on membrane-bound actin filaments in a controlled and quantitative manner that would be difficult to achieve in living systems.

  15. Control principles of complex systems

    NASA Astrophysics Data System (ADS)

    Liu, Yang-Yu; Barabási, Albert-László

    2016-07-01

    A reflection of our ultimate understanding of a complex system is our ability to control its behavior. Typically, control has multiple prerequisites: it requires an accurate map of the network that governs the interactions between the system's components, a quantitative description of the dynamical laws that govern the temporal behavior of each component, and an ability to influence the state and temporal behavior of a selected subset of the components. With deep roots in dynamical systems and control theory, notions of control and controllability have recently taken on new life in the study of complex networks, inspiring several fundamental questions: What are the control principles of complex systems? How do networks organize themselves to balance control with functionality? To address these questions, recent advances on the controllability and control of complex networks are reviewed here, exploring the intricate interplay between network topology and dynamical laws. The pertinent mathematical results are matched with empirical findings and applications. Uncovering the control principles of complex systems can help us explore and ultimately understand the fundamental laws that govern their behavior.
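
    As a concrete point of contact with the control theory invoked above, the following minimal sketch (illustrative only; the three-node chain and the driver-node placement are hypothetical) checks the classical Kalman rank condition on which many network-controllability results build:

      import numpy as np

      def is_controllable(A, B):
          # Kalman rank condition: the linear system x' = Ax + Bu is controllable
          # iff the controllability matrix [B, AB, ..., A^(n-1)B] has full rank n.
          n = A.shape[0]
          blocks = [B]
          for _ in range(n - 1):
              blocks.append(A @ blocks[-1])
          return np.linalg.matrix_rank(np.hstack(blocks)) == n

      # Hypothetical directed chain 1 -> 2 -> 3, driven by a single input at node 1.
      A = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])
      B = np.array([[1.0], [0.0], [0.0]])
      print(is_controllable(A, B))  # True: one driver node controls the whole chain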

  16. Down to the fundamental lower limit of direct time cognition

    NASA Astrophysics Data System (ADS)

    Fanchenko, S. D.; Schelev, Mikhail Y.

    1999-06-01

    The age of high-speed photography exceeds a century and may be divided into three epochs: recording of fast events illuminated with short light bursts; opto-mechanical recording with mechanical shutters, rotating drums, prisms, and mirrors; and high-speed recording with the help of electron imaging tubes. During the third epoch the authors have been enthusiastically involved in pico-femtosecond image-converter physics and technology, establishing fundamental principles and achieving the experimental realization of ultrafast image-tube photography.

  17. Intuitions, principles and consequences

    PubMed Central

    Shaw, A

    2001-01-01

    Some approaches to the assessment of moral intuitions are discussed. The controlled ethical trial isolates a moral issue from confounding factors and thereby clarifies what a person's intuition actually is. Casuistic reasoning from situations, where intuitions are clear, suggests or modifies principles, which can then help to make decisions in situations where intuitions are unclear. When intuitions are defended by a supporting principle, that principle can be tested by finding extreme cases, in which it is counterintuitive to follow the principle. An approach to the resolution of conflict between valid moral principles, specifically the utilitarian and justice principles, is considered. It is argued that even those who justify intuitions by a priori principles are often obliged to modify or support their principles by resort to the consideration of consequences. Key Words: Intuitions • principles • consequences • utilitarianism PMID:11233371

  18. What Metadata Principles Apply to Scientific Data?

    NASA Astrophysics Data System (ADS)

    Mayernik, M. S.

    2014-12-01

    Information researchers and professionals based in the library and information science fields often approach their work through developing and applying defined sets of principles. For example, for over 100 years, the evolution of library cataloging practice has largely been driven by debates (which are still ongoing) about the fundamental principles of cataloging and how those principles should manifest in rules for cataloging. Similarly, the development of archival research and practices over the past century has proceeded hand-in-hand with the emergence of principles of archival arrangement and description, such as maintaining the original order of records and documenting provenance. This project examines principles related to the creation of metadata for scientific data. The presentation will outline: 1) how understandings and implementations of metadata can range broadly depending on the institutional context, and 2) how metadata principles developed by the library and information science community might apply to metadata developments for scientific data. The development and formalization of such principles would contribute to the development of metadata practices and standards in a wide range of institutions, including data repositories, libraries, and research centers. Shared metadata principles would potentially be useful in streamlining data discovery and integration, and would also benefit the growing efforts to formalize data curation education.

  19. Brake Fundamentals. Automotive Articulation Project.

    ERIC Educational Resources Information Center

    Cunningham, Larry; And Others

    Designed for secondary and postsecondary auto mechanics programs, this curriculum guide contains learning exercises in seven areas: (1) brake fundamentals; (2) brake lines, fluid, and hoses; (3) drum brakes; (4) disc brake system and service; (5) master cylinder, power boost, and control valves; (6) parking brakes; and (7) trouble shooting. Each…

  20. Light as a Fundamental Particle

    ERIC Educational Resources Information Center

    Weinberg, Steven

    1975-01-01

    Presents two arguments concerning the role of the photon. One states that the photon is just another particle distinguished by a particular value of charge, spin, mass, lifetime, and interaction properties. The second states that the photon plays a fundamental role with a deep relation to ultimate formulas of physics. (GS)

  1. Environmental Law: Fundamentals for Schools.

    ERIC Educational Resources Information Center

    Day, David R.

    This booklet outlines the environmental problems most likely to arise in schools. An overview provides a fundamental analysis of environmental issues rather than comprehensive analysis and advice. The text examines the concerns that surround superfund cleanups, focusing on the legal framework, and furnishes some practical pointers, such as what to…

  2. Fundamentals of the Slide Library.

    ERIC Educational Resources Information Center

    Boerner, Susan Zee

    This paper is an introduction to the fundamentals of the art (including architecture) slide library, with some emphasis on basic procedures of the science slide library. Information in this paper is particularly relevant to the college, university, and museum slide library. Topics addressed include: (1) history of the slide library; (2) duties of…

  3. Fundamentals of Welding. Teacher Edition.

    ERIC Educational Resources Information Center

    Fortney, Clarence; And Others

    These instructional materials assist teachers in improving instruction on the fundamentals of welding. The following introductory information is included: use of this publication; competency profile; instructional/task analysis; related academic and workplace skills list; tools, materials, and equipment list; and 27 references. Seven units of…

  4. Fundamentals of Microelectronics Processing (VLSI).

    ERIC Educational Resources Information Center

    Takoudis, Christos G.

    1987-01-01

    Describes a 15-week course in the fundamentals of microelectronics processing in chemical engineering, which emphasizes the use of very large scale integration (VLSI). Provides a listing of the topics covered in the course outline, along with a sample of some of the final projects done by students. (TW)

  5. FUNdamental Movement in Early Childhood.

    ERIC Educational Resources Information Center

    Campbell, Linley

    2001-01-01

    Noting that the development of fundamental movement skills is basic to children's motor development, this booklet provides a guide for early childhood educators in planning movement experiences for children between 4 and 8 years. The booklet introduces a wide variety of appropriate practices to promote movement skill acquisition and increased…

  6. Slow magic angle sample spinning: a non- or minimally invasive method for high-resolution 1H nuclear magnetic resonance (NMR) metabolic profiling.

    PubMed

    Hu, Jian Zhi

    2011-01-01

    High-resolution (1)H magic angle spinning nuclear magnetic resonance (NMR), using a sample spinning rate of several kilohertz or more (i.e., high-resolution magic angle spinning (hr-MAS)), is a well-established method for metabolic profiling in intact tissues without the need for sample extraction. The only shortcoming of hr-MAS is that it is invasive and thus unusable for non-destructive detection. Recently, a method called slow MAS, using the concept of two-dimensional NMR spectroscopy, has emerged as an alternative for non- or minimally invasive metabolomics in intact tissues, including live animals, owing to the slow or ultra-slow sample spinning used. Although slow MAS is a powerful method, its applications are hindered by experimental challenges. Correctly designing the experiment and choosing the appropriate slow MAS method both require a fundamental understanding of the operating principles, in particular the details of line narrowing in the presence of molecular diffusion. However, these fundamental principles have not yet been fully disclosed in previous publications. The goal of this chapter is to provide an in-depth evaluation of the principles associated with slow MAS techniques, emphasizing the challenges associated with a phantom sample consisting of glass beads and H(2)O, which exhibits an unusually large magnetic susceptibility field gradient.
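
    The line narrowing exploited by MAS follows from a standard result, quoted here as background: the anisotropic broadening interactions scale with the second Legendre polynomial of the angle \theta between the spinning axis and the static field,

      P_2(\cos\theta) = \tfrac{1}{2}\left(3\cos^2\theta - 1\right) = 0 \;\Rightarrow\; \theta_m = \arccos(1/\sqrt{3}) \approx 54.74^\circ,

    so spinning at the magic angle \theta_m averages these broadenings away; the slow MAS methods described above recover the same information at low spinning rates by separating the residual anisotropic contributions into a second spectral dimension.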

  7. Diesel Fundamentals. Teacher Edition (Revised).

    ERIC Educational Resources Information Center

    Clark, Elton; And Others

    This module is one of a series of teaching guides that cover diesel mechanics. The module contains 4 sections and 19 units. Section A--Orientation includes the following units: introduction to diesel mechanics and shop safety; basic shop tools; test equipment and service tools; fasteners; bearings; and seals. Section B--Engine Principles and…

  8. Fundamentals of Aqueous Microwave Chemistry

    EPA Science Inventory

    The first chemical revolution changed modern life with a host of excellent amenities and services, but created serious problems related to environmental pollution. After 150 years of current chemistry principles and practices, we need a radical change to a new type of chemistry k...

  9. Fundamentals of freeze-drying.

    PubMed

    Nail, Steven L; Jiang, Shan; Chongprasert, Suchart; Knopp, Shawn A

    2002-01-01

    Given the increasing importance of reducing development time for new pharmaceutical products, formulation and process development scientists must continually look for ways to "work smarter, not harder." Within the product development arena, this means reducing the amount of trial and error empiricism in arriving at a formulation and identification of processing conditions which will result in a quality final dosage form. Characterization of the freezing behavior of the intended formulation is necessary for developing processing conditions which will result in the shortest drying time while maintaining all critical quality attributes of the freeze-dried product. Analysis of frozen systems was discussed in detail, particularly with respect to the glass transition as the physical event underlying collapse during freeze-drying, eutectic mixture formation, and crystallization events upon warming of frozen systems. Experiments to determine how freezing and freeze-drying behavior is affected by changes in the composition of the formulation are often useful in establishing the "robustness" of a formulation. It is not uncommon for seemingly subtle changes in composition of the formulation, such as a change in formulation pH, buffer salt, drug concentration, or an additional excipient, to result in striking differences in freezing and freeze-drying behavior. With regard to selecting a formulation, it is wise to keep the formulation as simple as possible. If a buffer is needed, a minimum concentration should be used. The same principle applies to added salts: If used at all, the concentration should be kept to a minimum. For many proteins a combination of an amorphous excipient, such as a disaccharide, and a crystallizing excipient, such as glycine, will result in a suitable combination of chemical stability and physical stability of the freeze-dried solid. Concepts of heat and mass transfer are valuable in rational design of processing conditions. Heat transfer by conduction

  10. Biological Extension of the Action Principle: Endpoint Determination beyond the Quantum Level and the Ultimate Physical Roots of Consciousness

    NASA Astrophysics Data System (ADS)

    Grandpierre, Attila

    2007-12-01

    With the explosive growth of biology, biological data accumulate at an increasing rate. At present, theoretical biology does not have fundamental principles of its own that could offer biological insight. In this situation, it is advisable for biology to learn from its older brother, physics. The most powerful tool of physics is the action principle, from which all the fundamental laws of physics can be derived in their most elegant form. We show that today's physics is far from utilizing the full potential of the action principle. This circumstance is almost inevitable, since it is in the nature of physical problems that the endpoint of the action principle is already fixed by the initial conditions, and physical behavior in most cases corresponds to the minimal form of the action principle. Actually, the mathematical form of the action principle also allows endpoints corresponding to the maximum of the action. We show that when we endow the action principle with this overlooked possibility, it gains an enormous additional power which, perhaps surprisingly, corresponds directly to biological behavior. The biological version of the least action principle is the most action principle. It is characteristically biological to strive for the most action, instead of manifesting inert behavior corresponding to the least action. A falling body in classical physics cannot select its endpoint. How is it possible that a falling bird can select the endpoint of its trajectory? We consider how the photon "selects" its endpoint in the classical and the extended double-slit experiments, and propose a new causal interpretation of quantum physics. We show that the "spontaneous targeting" observed in living organisms is a direct manifestation of causally determined quantum processes. For the first time, we formulate here the first principle of biology in a mathematical form and present some of its applications of primary importance. We indicate that the general phenomenon of
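
    For reference, the textbook statement of the action principle that the abstract generalizes (standard notation, not the author's own formulation) is

      S[q] = \int_{t_1}^{t_2} L(q, \dot{q}, t)\, dt, \qquad \delta S = 0,

    where the stationarity condition admits minima, saddle points, and maxima of S; the "most action principle" proposed here corresponds to selecting the maximum rather than the minimum.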

  11. Systems Biology Perspectives on Minimal and Simpler Cells

    PubMed Central

    Xavier, Joana C.; Patil, Kiran Raosaheb

    2014-01-01

    SUMMARY The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. PMID:25184563

  12. Bacillus subtilis and Escherichia coli essential genes and minimal cell factories after one decade of genome engineering.

    PubMed

    Juhas, Mario; Reuß, Daniel R; Zhu, Bingyao; Commichau, Fabian M

    2014-11-01

    Investigation of essential genes, besides contributing to understanding the fundamental principles of life, has numerous practical applications. Essential genes can be exploited as building blocks of a tightly controlled cell 'chassis'. Bacillus subtilis and Escherichia coli K-12 are both well-characterized model bacteria used as hosts for a plethora of biotechnological applications. Determination of the essential genes that constitute the B. subtilis and E. coli minimal genomes is therefore of the highest importance. Recent advances have led to the modification of the original B. subtilis and E. coli essential gene sets identified 10 years ago. Furthermore, significant progress has been made in the area of genome minimization of both model bacteria. This review provides an update, with particular emphasis on the current essential gene sets and their comparison with the original gene sets identified 10 years ago. Special attention is focused on the genome reduction analyses in B. subtilis and E. coli and the construction of minimal cell factories for industrial applications.

  13. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1973-01-01

    Two topics are discussed: (1) Stomach Upset Caused by Aspirin, illustrating principles of acid-base equilibrium and solubility; (2) Physical Chemistry of the Drinking Duck, illustrating principles of phase equilibria and thermodynamics. (DF)

  14. Principles of project management

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The basic principles of project management as practiced by NASA management personnel are presented. These principles are given as ground rules and guidelines to be used in the performance of research, development, construction or operational assignments.

  15. Astrophysical Probes of Fundamental Physics

    NASA Astrophysics Data System (ADS)

    Martins, C. J. A. P.

    I review the theoretical motivation for varying fundamental couplings and discuss how these measurements can be used to constrain a number of fundamental physics scenarios that would otherwise be inaccessible to experiment. As a case study I focus on the relation between varying couplings and dark energy, and explain how varying-coupling measurements can be used to probe the nature of dark energy, with important advantages over the standard methods. Assuming that the current observational evidence for varying α and μ is correct, a several-sigma detection of dynamical dark energy is feasible within a few years, using currently operational ground-based facilities. With forthcoming instruments like CODEX, a high-accuracy reconstruction of the equation of state may be possible all the way up to redshift z ~ 4.

  16. Fundamental neutron physics at LANSCE

    SciTech Connect

    Greene, G.

    1995-10-01

    Modern neutron sources and science share a common origin in mid-20th-century scientific investigations concerned with the study of the fundamental interactions between elementary particles. Since the time of that common origin, neutron science and the study of elementary particles have evolved into quite disparate disciplines. The neutron became recognized as a powerful tool for studying condensed matter with modern neutron sources being primarily used (and justified) as tools for neutron scattering and materials science research. The study of elementary particles has, of course, led to the development of rather different tools and is now dominated by activities performed at extremely high energies. Notwithstanding this trend, the study of fundamental interactions using neutrons has continued and remains a vigorous activity at many contemporary neutron sources. This research, like neutron scattering research, has benefited enormously by the development of modern high-flux neutron facilities. Future sources, particularly high-power spallation sources, offer exciting possibilities for continuing this research.

  17. DOE Fundamentals Handbook: Classical Physics

    SciTech Connect

    Not Available

    1992-06-01

    The Classical Physics Fundamentals Handbook was developed to assist nuclear facility operating contractors in providing operators, maintenance personnel, and the technical staff with the fundamentals training necessary to ensure a basic understanding of physical forces and their properties. The handbook includes information on the units used to measure physical properties; vectors, and how they are used to show the net effect of various forces; Newton's laws of motion, and how to use these laws in force and motion applications; and the concepts of energy, work, and power, and how to measure and calculate the energy involved in various applications. This information will provide personnel with a foundation for understanding the basic operation of various types of DOE nuclear facility systems and equipment.

  18. Microplasmas: from applications to fundamentals

    NASA Astrophysics Data System (ADS)

    Nguon, Olivier; Huang, Sisi; Gauthier, Mario; Karanassios, Vassili

    2014-05-01

    Microplasmas are receiving increasing attention in the scientific literature and in recent conferences. Yet, few analytical applications of microplasmas for elemental analysis using liquid samples have been described in the literature. To address this, we describe two applications: one involves the determination of Zn in microsamples of the metallo-enzyme Super Oxide Dismutase. The other involves determination of Pd-concentration in microsamples of Pd nanocatalysts. These applications demonstrate the potential of microplasmas and point to the need for future fundamental studies.

  19. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1970-01-01

    This is the first of a new series of brief ancedotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…

  20. TLS from fundamentals to practice

    PubMed Central

    Urzhumtsev, Alexandre; Afonine, Pavel V.; Adams, Paul D.

    2014-01-01

    The Translation-Libration-Screw-rotation (TLS) model of rigid-body harmonic displacements introduced in crystallography by Schomaker & Trueblood (1968) is now a routine tool in macromolecular studies and is a feature of most modern crystallographic structure refinement packages. In this review we consider a number of simple examples that illustrate important features of the TLS model. Based on these examples simplified formulae are given for several special cases that may occur in structure modeling and refinement. The derivation of general TLS formulae from basic principles is also provided. This manuscript describes the principles of TLS modeling, as well as some select algorithmic details for practical application. An extensive list of applications references as examples of TLS in macromolecular crystallography refinement is provided. PMID:25249713

  1. TLS from fundamentals to practice.

    PubMed

    Urzhumtsev, Alexandre; Afonine, Pavel V; Adams, Paul D

    2013-07-01

    The Translation-Libration-Screw-rotation (TLS) model of rigid-body harmonic displacements introduced in crystallography by Schomaker & Trueblood (1968) is now a routine tool in macromolecular studies and is a feature of most modern crystallographic structure refinement packages. In this review we consider a number of simple examples that illustrate important features of the TLS model. Based on these examples simplified formulae are given for several special cases that may occur in structure modeling and refinement. The derivation of general TLS formulae from basic principles is also provided. This manuscript describes the principles of TLS modeling, as well as some select algorithmic details for practical application. An extensive list of applications references as examples of TLS in macromolecular crystallography refinement is provided.

  2. Fundamental Study on Quantum Nanojets

    DTIC Science & Technology

    2004-08-01

    Nanojets operating at high injection energy exhibit classical jet-like behavior, as predicted by molecular dynamics or classical Navier-Stokes-type equations. Analytical formulations of planar and cylindrical nanojet injectors are given in the QDFD formalism, together with the conservation equations of QDFD and a canonical theoretic formulation. Computational schemes for Schrödinger's equation and quantum fluid dynamics are developed, along with principles of quantum mechanical equivalence between the two formalisms.

  3. Physical fundamentals of remote sensing

    NASA Astrophysics Data System (ADS)

    Schanda, E.

    The physical principles describing the propagation of EM waves in the atmosphere and their interactions with matter are discussed as they apply to remote sensing, in an introductory text intended for graduate science students, environmental-science researchers, and remote-sensing practitioners. The emphasis is on basic effects rather than on specific remote-sensing techniques or observational results. Chapters are devoted to basic relations, the spectral lines of atmospheric gases, the spectral properties of condensed matter, and radiative transfer.

  4. Guidelines for mixed waste minimization

    SciTech Connect

    Owens, C.

    1992-02-01

    Currently, no commercial mixed waste disposal is available in the United States, and storage and treatment options for commercial mixed waste are limited. Host state and compact region officials are therefore encouraging their mixed waste generators to minimize their mixed wastes because of these management limitations. This document provides a guide to mixed waste minimization.

  5. Influenza SIRS with Minimal Pneumonitis.

    PubMed

    Erramilli, Shruti; Mannam, Praveen; Manthous, Constantine A

    2016-01-01

    Although systemic inflammatory response syndrome (SIRS) is a known complication of severe influenza pneumonia, it has been reported very rarely in patients with minimal parenchymal lung disease. We report here a case of severe SIRS, anasarca, and marked vascular phenomena with minimal or no pneumonitis. This case highlights that viruses, including influenza, may cause vascular dysregulation leading to SIRS, even without substantial visceral organ involvement.

  6. Waste minimization handbook, Volume 1

    SciTech Connect

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  7. Fluidization technologies: Aerodynamic principles and process engineering.

    PubMed

    Dixit, Rahul; Puthli, Shivanand

    2009-11-01

    The concept of fluidization has been adapted to different unit processes of pharmaceutical product development. To date, many improvements have been made in engineering design to achieve superior process performance. This review focuses on the fundamental principles of aerodynamics and hydrodynamics associated with fluidization technologies. Fluid-bed coating, fluidized-bed granulation, rotor processing, hot-melt granulation, electrostatic coating, and supercritical-fluid-based fluidized-bed technology are highlighted. Developments in the design of processing equipment are explicitly elucidated. The article also discusses processing problems from the operator's perspective, along with the latest developments in the application of these principles.

  8. Principlism and communitarianism.

    PubMed

    Callahan, D

    2003-10-01

    The decline in the interest in ethical theory is first outlined, as a background to the author's discussion of principlism. The author's own stance, that of a communitarian philosopher, is then described, before the subject of principlism itself is addressed. Two problems stand in the way of the author's embracing principlism: its individualistic bias and its capacity to block substantive ethical inquiry. The more serious problem the author finds to be its blocking function. Discussing the four scenarios the author finds that the utility of principlism is shown in the two scenarios about Jehovah's Witnesses but that when it comes to selling kidneys for transplantation and germline enhancement, principlism is of little help.

  9. The mechanical exfoliation mechanism of black phosphorus to phosphorene: A first-principles study

    NASA Astrophysics Data System (ADS)

    Mu, Yunsheng; Si, M. S.

    2015-11-01

    Today, the renaissance of black phosphorus largely depends on the mechanical exfoliation method, which can produce few-layer forms from the bulk counterpart. However, a deep understanding of the exfoliation mechanism has been missing. To this end, we resolve this issue by simulating the sliding processes of bilayer phosphorene based on first-principles calculations. It is found that the interlayer Coulomb interactions dictate the optimal sliding pathway, leading to a minimal energy barrier as low as ∼60 meV, which gives a surface energy of ∼59 mJ/m², comparable to experiment. This means that black phosphorus can be exfoliated by the sliding approach. In addition, considerable bandgap modulations along these sliding pathways are obtained. A study like ours builds up a fundamental understanding of how black phosphorus is exfoliated to few-layer forms, providing a good guide for experimental research.

  10. Driving Toward Guiding Principles

    PubMed Central

    Buckovich, Suzy A.; Rippen, Helga E.; Rozen, Michael J.

    1999-01-01

    As health care moves from paper to electronic data collection, providing easier access and dissemination of health information, the development of guiding privacy, confidentiality, and security principles is necessary to help balance the protection of patients' privacy interests against appropriate information access. A comparative review and analysis was done, based on a compilation of privacy, confidentiality, and security principles from many sources. Principles derived from ten identified sources were compared with each of the compiled principles to assess support level, uniformity, and inconsistencies. Of 28 compiled principles, 23 were supported by at least 50 percent of the sources. Technology could address at least 12 of the principles. Notable consistencies among the principles could provide a basis for consensus for further legislative and organizational work. It is imperative that all participants in our health care system work actively toward a viable resolution of this information privacy debate. PMID:10094065

  11. Weight minimization of a support structure

    NASA Technical Reports Server (NTRS)

    Kluberdanz, Donald J.; Segalman, Helaine J.

    1990-01-01

    This paper addresses the weight minimization of a circular plate-like structure which resulted in a 26 percent weight reduction. The optimization was performed numerically with the COPES/ADS program using the modified method of feasible directions. Design parameters were the inner thickness and outer thickness of the plate with constraints on maximum yield stress and maximum transverse displacement. Also, constraints were specified for the upper and lower bounds of the fundamental frequency and plate thicknesses. The MSC/NASTRAN finite element program was used for the evaluation of response variables. Original and final designs of the plate were tested using an Instron tension-compression machine to compare finite element results to measured strain data. The difference between finite element strain components and measured strain data was within engineering accuracy.
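
    A minimal sketch of a comparable two-variable constrained weight minimization, using SciPy's SLSQP solver as a stand-in for the modified method of feasible directions in COPES/ADS (the geometry, material constants, limits, and closed-form response surrogates below are hypothetical placeholders for the MSC/NASTRAN finite element responses):

      import numpy as np
      from scipy.optimize import minimize

      RHO = 7.85e-6  # density, kg/mm^3 (steel, illustrative)

      def weight(t):
          # Mass of a plate with an inner disk (r <= 100 mm) of thickness t[0]
          # and an outer annulus (100 < r <= 150 mm) of thickness t[1].
          return RHO * np.pi * (100.0**2 * t[0] + (150.0**2 - 100.0**2) * t[1])

      constraints = [
          # Surrogate yield-stress constraint: sigma_max <= 250 MPa
          {"type": "ineq", "fun": lambda t: 250.0 - 1200.0 / t[0]**2},
          # Surrogate transverse-displacement constraint: w_max <= 2 mm
          {"type": "ineq", "fun": lambda t: 2.0 - 50.0 / (t[0]**3 + t[1]**3)},
      ]

      res = minimize(weight, x0=[10.0, 10.0], method="SLSQP",
                     bounds=[(2.0, 20.0), (2.0, 20.0)],  # thickness bounds, mm
                     constraints=constraints)
      print(res.x, weight(res.x))  # optimal thicknesses and minimized weight

    In a real design loop each constraint evaluation would query the finite element model rather than a closed-form surrogate.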

  12. Dynamic sealing principles. [design configurations for fluid leakage control

    NASA Technical Reports Server (NTRS)

    Zuk, J.

    1976-01-01

    The fundamental principles governing dynamic sealing operation are discussed. Different seals are described in terms of these principles. Despite the large variety of detailed construction, there appear to be some basic principles, or combinations of basic principles, by which all seals function. They are: (1) selection and control of seal geometry, (2) control of leakage fluid properties, and (3) control of forces acting on leakage fluids. Theoretical and practical considerations in the application of these principles are discussed. Advantages, disadvantages, limitations, and application examples of various conventional and special seals are presented. Fundamental equations governing liquid and gas flows in thin-film seals, which enable leakage calculations to be made, are also presented.
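
    The abstract cites, without stating, the thin-film flow equations used for leakage estimates; a standard laminar parallel-plate relation for liquid leakage (an assumed textbook form, not necessarily the one presented in the paper) is

      Q = \frac{w\, h^{3}\, \Delta p}{12\, \mu\, L},

    where Q is the volumetric leakage rate, w the breadth of the leakage path, h the film thickness, \Delta p the sealed pressure differential, \mu the dynamic viscosity, and L the leakage path length. The cubic dependence on h illustrates why principle (1), control of seal geometry, dominates leakage control.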

  13. Fundamental Limits to Cellular Sensing

    NASA Astrophysics Data System (ADS)

    ten Wolde, Pieter Rein; Becker, Nils B.; Ouldridge, Thomas E.; Mugler, Andrew

    2016-03-01

    In recent years experiments have demonstrated that living cells can measure low chemical concentrations with high precision, and much progress has been made in understanding what sets the fundamental limit to the precision of chemical sensing. Chemical concentration measurements start with the binding of ligand molecules to receptor proteins, which is an inherently noisy process, especially at low concentrations. The signaling networks that transmit the information on the ligand concentration from the receptors into the cell have to filter this receptor input noise as much as possible. These networks, however, are also intrinsically stochastic in nature, which means that they will also add noise to the transmitted signal. In this review, we first discuss how the diffusive transport and binding of ligand to the receptor set the receptor correlation time, the timescale over which fluctuations in the state of the receptor, arising from stochastic receptor-ligand binding, decay. We then describe how downstream signaling pathways integrate these receptor-state fluctuations, and how the number of receptors, the receptor correlation time, and the effective integration time set by the downstream network together impose a fundamental limit on the precision of sensing. We then discuss how cells can remove the receptor input noise while simultaneously suppressing the intrinsic noise in the signaling network. We describe why this mechanism of time integration requires three classes of resources (receptors and their integration time, readout molecules, and energy) and how each resource class sets a fundamental sensing limit. We also briefly discuss the scheme of maximum-likelihood estimation, the role of receptor cooperativity, and how cellular copy protocols differ from the canonical copy protocols typically considered in the computational literature, explaining why cellular sensing systems can never reach the Landauer limit on the optimal trade…
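
    The scale of this limit is set by the classic Berg-Purcell estimate, a standard result in this literature quoted here as background: for a sensor of linear size a that integrates for a time T, the fractional error in measuring a concentration c of molecules with diffusion constant D obeys

      \frac{\delta c}{c} \sim \frac{1}{\sqrt{D\, a\, c\, T}},

    so precision improves with longer integration times, larger sensors, and higher concentrations, but remains finite for any finite T.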

  14. Fundamental Scientific Problems in Magnetic Recording

    SciTech Connect

    Schulthess, T.C.; Miller, M.K.

    2007-06-27

    Magnetic data storage technology is presently leading the high-tech industry in advancing device integration -- doubling the storage density every 12 months. To continue these advancements and to achieve terabit-per-square-inch recording densities, new approaches to store and access data will be needed in about 3-5 years. In this project, a collaboration between Oak Ridge National Laboratory (ORNL), the Center for Materials for Information Technology (MINT) at the University of Alabama (UA), Imago Scientific Instruments, and Seagate Technologies was undertaken to address the fundamental scientific problems confronting the industry in meeting the upcoming challenges. The areas that were the focus of this study were to: (1) develop atom probe tomography for atomic-scale imaging of magnetic heterostructures used in magnetic data storage technology; (2) develop first-principles-based tools for the study of exchange bias, aimed at finding new antiferromagnetic materials to reduce the thickness of the pinning layer in the read head; and (3) develop high-moment magnetic materials and tools to study magnetic switching in nanostructures, aimed at developing improved writers for high-anisotropy magnetic storage media.

  15. Thermodynamics fundamentals of energy conversion

    NASA Astrophysics Data System (ADS)

    Dan, Nicolae

    The work reported in chapters 1-5 focuses on the fundamentals of heat transfer, fluid dynamics, thermodynamics, and electrical phenomena related to the conversion of one form of energy to another. Chapter 6 is a re-examination of the fundamental heat transfer problem of how to connect a finite-size heat-generating volume to a concentrated sink. Chapter 1 extends to electrical machines the combined thermodynamics and heat transfer optimization approach that has been developed for heat engines. The conversion efficiency at maximum power is 1/2. When, as in specific applications, the operating temperature of the windings must not exceed a specified level, the power output is lower and the efficiency higher. Chapter 2 addresses the fundamental problem of determining the optimal history (regime of operation) of a battery so that the work output is maximum. Chapters 3 and 4 report the energy conversion aspects of an expanding mixture of hot particles, steam, and liquid water. At the elemental level, steam annuli develop around the spherical drops as time increases. At the mixture level, the density decreases while the pressure and velocity increase. Chapter 4 describes numerically, based on the finite element method, the time evolution of the expanding mixture of hot spherical particles, steam, and water. The fluid particles are moved in time in a Lagrangian manner to simulate the change of the domain configuration. Chapter 5 describes the process of thermal interaction between the molten material and water. In the second part of the chapter the model accounts for the irreversibility due to the flow of the mixture through the cracks of the mixing vessel. The approach presented in this chapter is based on exergy analysis and represents a departure from the line of inquiry that was followed in chapters 3-4. Chapter 6 shows that the geometry of the heat flow path between a volume and one point can be optimized in two fundamentally different ways. In the "growth" method the

  16. Short-range Fundamental forces

    SciTech Connect

    Antoniadis, I; Baessler, Stefan; Buechner, M; Fedorov, General Victor; Hoedl, S.; Lambrecht, A; Nesvizhevsky, V.; Pignol, G; Reynaud, S.; Sobolev, Yu.

    2011-01-01

    We consider theoretical motivations to search for extra short-range fundamental forces, as well as experiments constraining their parameters. The forces could be of two types: (1) spin-independent forces; and (2) spin-dependent axion-like forces. Different experimental techniques are sensitive in their respective ranges of characteristic distances. The techniques include measurements of gravity at short distances, searches for extra interactions on top of the Casimir force, and precision atomic and neutron experiments. We focus on neutron constraints; thus the range of characteristic distances considered here corresponds to the range accessible to neutron experiments.

  17. Fundamental Characteristics of Breather Hydrodynamics

    NASA Astrophysics Data System (ADS)

    Chabchoub, Amin

    2014-05-01

    The formation of oceanic rogue waves can be explained by the modulation instability of deep-water Stokes waves. In particular, being doubly-localized and amplifying the background wave amplitude by a factor of three or higher, the class of Peregrine-type breather solutions of the nonlinear Schrödinger equation (NLS) is considered an appropriate model for extreme ocean wave dynamics. Here, we present an experimental validation of fundamental properties of the NLS in the context of Peregrine breather dynamics, and we discuss the long-term behavior of such structures, localized in both time and space.
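
    For reference, in dimensionless form the focusing NLS and its Peregrine solution (standard expressions, included here for context rather than quoted from the abstract) read

      i \psi_t + \tfrac{1}{2} \psi_{xx} + |\psi|^{2} \psi = 0, \qquad \psi_P(x,t) = e^{it}\left[1 - \frac{4(1 + 2it)}{1 + 4x^{2} + 4t^{2}}\right],

    with |\psi_P(0,0)| = 3: the doubly-localized peak is exactly three times the unit background, the amplification factor cited above.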

  18. Solid Lubrication Fundamentals and Applications

    NASA Technical Reports Server (NTRS)

    Miyoshi, Kazuhisa

    2001-01-01

    Solid Lubrication Fundamentals and Applications provides a description of the adhesion, friction, abrasion, and wear behavior of solid-film lubricants and related tribological materials, including diamond and diamond-like solid films. The book details the properties of solid surfaces, clean surfaces, and contaminated surfaces, as well as discussing the structures and mechanical properties of natural and synthetic diamonds; chemical-vapor-deposited diamond film; and surface design and engineering toward wear-resistant, self-lubricating diamond films and coatings. The author provides selection and design criteria as well as applications for synthetic and natural coatings in the commercial, industrial, and aerospace industries.

  19. Quantum correlations require multipartite information principles.

    PubMed

    Gallego, Rodrigo; Würflinger, Lars Erik; Acín, Antonio; Navascués, Miguel

    2011-11-18

    Identifying which correlations among distant observers are possible within our current description of nature, based on quantum mechanics, is a fundamental problem in physics. Recently, information concepts have been proposed as the key ingredient to characterize the set of quantum correlations. Novel information principles, such as information causality or nontrivial communication complexity, have been introduced in this context and successfully applied to some concrete scenarios. We show in this work a fundamental limitation of this approach: no principle based on bipartite information concepts is able to single out the set of quantum correlations for an arbitrary number of parties. Our results reflect the intricate structure of quantum correlations and imply that new and intrinsically multipartite information concepts are needed for their full understanding.

  20. Fundamentals of Clinical Outcomes Assessment for Spinal Disorders: Clinical Outcome Instruments and Applications.

    PubMed

    Vavken, Patrick; Ganal-Antonio, Anne Kathleen B; Quidde, Julia; Shen, Francis H; Chapman, Jens R; Samartzis, Dino

    2015-08-01

    Study Design: A broad narrative review. Objectives: Outcome assessment in spinal disorders is imperative to help monitor the safety and efficacy of the treatment in an effort to change clinical practice and improve patient outcomes. The following article, part two of a two-part series, discusses the various outcome tools and instruments utilized to address spinal disorders and their management. Methods: A thorough review of the peer-reviewed literature was performed, irrespective of language, addressing outcome research, instruments and tools, and applications. Results: Numerous articles addressing the development and implementation of health-related quality-of-life, neck and low back pain, overall pain, spinal deformity, and other condition-specific outcome instruments have been reported. Their applications in the context of clinical trial studies, economic analyses, and overall evidence-based orthopedics have been noted. Additional issues regarding the problems and potential sources of bias in utilizing outcome scales, and the concept of the minimal clinically important difference, were discussed. Conclusion: Continuing research needs to assess the outcome instruments and tools used in clinical outcome assessment for spinal disorders. Understanding the fundamental principles of spinal outcome assessment may also advance the field of "personalized spine care."

  1. Aortic valve surgery - minimally invasive

    MedlinePlus

    ... of the heart is reduced. This is called aortic stenosis. The aortic valve can be replaced using: Minimally ... RN, Wang A. Percutaneous heart valve replacement for aortic stenosis: state of the evidence. Ann Intern Med. 2010; ...

  2. Principles of thermoacoustic energy harvesting

    NASA Astrophysics Data System (ADS)

    Avent, A. W.; Bowen, C. R.

    2015-11-01

    Thermoacoustics exploits a temperature gradient to produce powerful acoustic pressure waves. The technology has a key role to play in energy harvesting systems. A time-line in the development of thermoacoustics is presented, from its earliest recorded example in glass blowing, through the development of the Sondhauss and Rijke tubes, to Stirling engines and pulse-tube cryo-cooling. The review sets the current literature in context, identifies key publications, and highlights promising areas of research. The fundamental principles of thermoacoustic phenomena are explained; design challenges and factors influencing efficiency are explored. Thermoacoustic processes involve complex multi-physical coupling and transient, highly non-linear relationships which are computationally expensive to model; appropriate numerical modelling techniques and options for analyses are presented. Potential methods of harvesting the energy in the acoustic waves are also examined.

  3. Shapes of embedded minimal surfaces.

    PubMed

    Colding, Tobias H; Minicozzi, William P

    2006-07-25

    Surfaces that locally minimize area have been extensively used to model physical phenomena, including soap films, black holes, compound polymers, protein folding, etc. The mathematical field dates to the 1740s but has recently become an area of intense mathematical and scientific study, specifically in the areas of molecular engineering, materials science, and nanotechnology because of their many anticipated applications. In this work, we show that all minimal surfaces are built out of pieces of the surfaces in Figs. 1 and 2.

  4. Minimally invasive treatment options in fixed prosthodontics.

    PubMed

    Edelhoff, Daniel; Liebermann, Anja; Beuer, Florian; Stimmelmayr, Michael; Güth, Jan-Frederik

    2016-03-01

    Minimally invasive treatment options have become increasingly feasible in restorative dentistry, due to the introduction of the adhesive technique in combination with restorative materials featuring translucent properties similar to those of natural teeth. Mechanical anchoring of restorations via conventional cementation represents a predominantly subtractive treatment approach that is gradually being superseded by a primarily defect-oriented additive method in prosthodontics. Modifications of conventional treatment procedures have led to the development of an economical approach to the removal of healthy tooth structure. This is possible because the planned treatment outcome is defined in a wax-up before the treatment is commenced and this wax-up is subsequently used as a reference during tooth preparation. Similarly, resin-bonded FDPs and implants have made it possible to preserve the natural tooth structure of potential abutment teeth. This report describes a number of clinical cases to demonstrate the principles of modern prosthetic treatment strategies and discusses these approaches in the context of minimally invasive prosthetic dentistry.

  5. Instructional Software Design Principles.

    ERIC Educational Resources Information Center

    Hazen, Margret

    1985-01-01

    Discusses learner/computer interaction, learner control, sequencing of instructional events, and graphic screen design as effective principles for the design of instructional software, including tutorials. (MBR)

  6. Quantum repeaters: fundamental and future

    NASA Astrophysics Data System (ADS)

    Li, Yue; Hua, Sha; Liu, Yu; Ye, Jun; Zhou, Quan

    2007-04-01

    An overview of quantum repeater techniques based on entanglement distillation and swapping is provided. Beginning with a brief history and the basic concepts of quantum repeaters, the article primarily focuses on the communication model built on quantum repeater techniques, which mainly consists of two fundamental modules: the entanglement distillation module and the swapping module. The realizations of entanglement distillation are discussed, including Bernstein's Procrustean method, entanglement concentration, and the CNOT-purification method. The schemes for implementing swapping, which include swapping based on Bell-state measurement and swapping in cavity QED, are also introduced. A comparison between these realizations, together with evaluations of them, is then presented. Finally, the article discusses current experimental schemes for quantum repeaters and documents some remaining problems and emerging trends in this field.

  7. Astronomical reach of fundamental physics

    NASA Astrophysics Data System (ADS)

    Burrows, Adam S.; Ostriker, Jeremiah P.

    2014-02-01

    Using basic physical arguments, we derive by dimensional and physical analysis the characteristic masses and sizes of important objects in the universe in terms of just a few fundamental constants. This exercise illustrates the unifying power of physics and the profound connections between the small and the large in the cosmos we inhabit. We focus on the minimum and maximum masses of normal stars, the corresponding quantities for neutron stars, the maximum mass of a rocky planet, the maximum mass of a white dwarf, and the mass of a typical galaxy. To zeroth order, we show that all these masses can be expressed in terms of either the Planck mass or the Chandrasekhar mass, in combination with various dimensionless quantities. With these examples, we expose the deep interrelationships imposed by nature between disparate realms of the universe and the amazing consequences of the unifying character of physical law.

  8. Astronomical reach of fundamental physics.

    PubMed

    Burrows, Adam S; Ostriker, Jeremiah P

    2014-02-18

    Using basic physical arguments, we derive by dimensional and physical analysis the characteristic masses and sizes of important objects in the universe in terms of just a few fundamental constants. This exercise illustrates the unifying power of physics and the profound connections between the small and the large in the cosmos we inhabit. We focus on the minimum and maximum masses of normal stars, the corresponding quantities for neutron stars, the maximum mass of a rocky planet, the maximum mass of a white dwarf, and the mass of a typical galaxy. To zeroth order, we show that all these masses can be expressed in terms of either the Planck mass or the Chandrasekhar mass, in combination with various dimensionless quantities. With these examples, we expose the deep interrelationships imposed by nature between disparate realms of the universe and the amazing consequences of the unifying character of physical law.

  9. Fundamentals of Condensed Matter Physics

    NASA Astrophysics Data System (ADS)

    Cohen, Marvin L.; Louie, Steven G.

    2016-05-01

    Part I. Basic Concepts: Electrons and Phonons: 1. Concept of a solid: qualitative introduction and overview; 2. Electrons in crystals; 3. Electronic energy bands; 4. Lattice vibrations and phonons; Part II. Electron Interactions, Dynamics and Responses: 5. Electron dynamics in crystals; 6. Many-electron interactions: the interacting electron gas and beyond; 7. Density functional theory; 8. The dielectric function for solids; Part III. Optical and Transport Phenomena: 9. Electronic transitions and optical properties of solids; 10. Electron-phonon interactions; 11. Dynamics of crystal electrons in a magnetic field; 12. Fundamentals of transport phenomena in solids; Part IV. Superconductivity, Magnetism, and Lower Dimensional Systems: 13. Using many-body techniques; 14. Superconductivity; 15. Magnetism; 16. Reduced-dimensional systems and nanostructures; Index.

  10. Fundamentals of Acoustic Backscatter Imagery

    DTIC Science & Technology

    2011-09-20

    ... a reference pressure of 1 μPa corresponds to an intensity, I₀, of 0.67 × 10⁻¹⁸ W/m², assuming spherical spreading, the one-meter distance reference frame, and the definition of dB. The total field can then be approximated by an infinite series, Ψ(r) = Ψ_inc(r) + Σ_{j≥0} Ψ_j^sc(r), where Ψ_inc(r) is the incident field. A boundary condition of the form f(x, y, z)|_{z=h(x,y)} = 0 can be expanded as f(x, y, z)|_{z=h(x,y)} = f(x, y, z)|_{z=0} + h ∂f/∂z + (h²/2) ∂²f/∂z² + ...; the function f(x, y, z) can represent, for example, the stress

  11. Fundamental Travel Demand Model Example

    NASA Technical Reports Server (NTRS)

    Hanssen, Joel

    2010-01-01

    Instances of transportation models are abundant and detailed "how to" instruction is available in the form of transportation software help documentation. The purpose of this paper is to look at the fundamental inputs required to build a transportation model by developing an example passenger travel demand model. The example model reduces the scale to a manageable size for the purpose of illustrating the data collection and analysis required before the first step of the model begins. This aspect of the model development would not reasonably be discussed in software help documentation (it is assumed the model developer comes prepared). Recommendations are derived from the example passenger travel demand model to suggest future work regarding the data collection and analysis required for a freight travel demand model.

  12. Fundamental studies of polymer filtration

    SciTech Connect

    Smith, B.F.; Lu, M.T.; Robison, T.W.; Rogers, Y.C.; Wilson, K.V.

    1998-12-31

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). The objectives of this project were (1) to develop an enhanced fundamental understanding of the coordination chemistry of hazardous-metal-ion complexation with water-soluble metal-binding polymers, and (2) to exploit this knowledge to develop improved separations for analytical methods, metals processing, and waste treatment. We investigated features of water-soluble metal-binding polymers that affect their binding constants and selectivity for selected transition metal ions. We evaluated backbone polymers using light scattering and ultrafiltration techniques to determine the effect of pH and ionic strength on the molecular volume of the polymers. The backbone polymers were incrementally functionalized with a metal-binding ligand. A procedure and analytical method to determine the absolute level of functionalization was developed and the results correlated with the elemental analysis, viscosity, and molecular size.

  13. Fundamental concepts of quantum chaos

    NASA Astrophysics Data System (ADS)

    Robnik, M.

    2016-09-01

    We review the fundamental concepts of quantum chaos in Hamiltonian systems. The quantum evolution of bound systems does not possess sensitive dependence on initial conditions, and thus no chaotic behaviour occurs; however, the study of the stationary solutions of the Schrödinger equation in the quantum phase space (Wigner functions) reveals a precise analogy with the structure of the classical phase portrait. We analyze the regular eigenstates associated with invariant tori in the classical phase space, the chaotic eigenstates associated with the classically chaotic regions, and the corresponding energy spectra. The effects of quantum localization of the chaotic eigenstates are treated phenomenologically, resulting in Brody-like level statistics, which can be found even at very high-lying levels, while the tunneling coupling between the regular and the irregular eigenstates, and hence between the corresponding levels, manifests itself only in low-lying levels.

  14. Cognition is … Fundamentally Cultural

    PubMed Central

    Bender, Andrea; Beller, Sieghard

    2013-01-01

    A prevailing concept of cognition in psychology is inspired by the computer metaphor. Its focus on mental states that are generated and altered by information input, processing, storage and transmission invites a disregard for the cultural dimension of cognition, based on three (implicit) assumptions: cognition is internal, processing can be distinguished from content, and processing is independent of cultural background. Arguing against each of these assumptions, we point out how culture may affect cognitive processes in various ways, drawing on instances from numerical cognition, ethnobiological reasoning, and theory of mind. Given the pervasive cultural modulation of cognition—on all of Marr’s levels of description—we conclude that cognition is indeed fundamentally cultural, and that consideration of its cultural dimension is essential for a comprehensive understanding. PMID:25379225

  15. Fundamental reaction pathways during coprocessing

    SciTech Connect

    Stock, L.M.; Gatsis, J.G. (Dept. of Chemistry)

    1992-12-01

    The objective of this research was to investigate the fundamental reaction pathways in coal petroleum residuum coprocessing. Once the reaction pathways are defined, further efforts can be directed at improving those aspects of the chemistry of coprocessing that are responsible for the desired results such as high oil yields, low dihydrogen consumption, and mild reaction conditions. We decided to carry out this investigation by looking at four basic aspects of coprocessing: (1) the effect of fossil fuel materials on promoting reactions essential to coprocessing such as hydrogen atom transfer, carbon-carbon bond scission, and hydrodemethylation; (2) the effect of varied mild conditions on the coprocessing reactions; (3) determination of dihydrogen uptake and utilization under severe conditions as a function of the coal or petroleum residuum employed; and (4) the effect of varied dihydrogen pressure, temperature, and residence time on the uptake and utilization of dihydrogen and on the distribution of the coprocessed products. Accomplishments are described.

  16. Astronomical reach of fundamental physics

    PubMed Central

    Burrows, Adam S.; Ostriker, Jeremiah P.

    2014-01-01

    Using basic physical arguments, we derive by dimensional and physical analysis the characteristic masses and sizes of important objects in the universe in terms of just a few fundamental constants. This exercise illustrates the unifying power of physics and the profound connections between the small and the large in the cosmos we inhabit. We focus on the minimum and maximum masses of normal stars, the corresponding quantities for neutron stars, the maximum mass of a rocky planet, the maximum mass of a white dwarf, and the mass of a typical galaxy. To zeroth order, we show that all these masses can be expressed in terms of either the Planck mass or the Chandrasekhar mass, in combination with various dimensionless quantities. With these examples, we expose the deep interrelationships imposed by nature between disparate realms of the universe and the amazing consequences of the unifying character of physical law. PMID:24477692

  17. Rare Isotopes and Fundamental Symmetries

    NASA Astrophysics Data System (ADS)

    Brown, B. Alex; Engel, Jonathan; Haxton, Wick; Ramsey-Musolf, Michael; Romalis, Michael; Savard, Guy

    2009-01-01

    Experiments searching for new interactions in nuclear beta decay / Klaus P. Jungmann -- The beta-neutrino correlation in sodium-21 and other nuclei / P. A. Vetter ... [et al.] -- Nuclear structure and fundamental symmetries / B. Alex Brown -- Schiff moments and nuclear structure / J. Engel -- Superallowed nuclear beta decay: recent results and their impact on V[symbol] / J. C. Hardy and I. S. Towner -- New calculation of the isospin-symmetry breaking correction to superallowed Fermi beta decay / I. S. Towner and J. C. Hardy -- Precise measurement of the [symbol]H to [symbol]He mass difference / D. E. Pinegar ... [et al.] -- Limits on scalar currents from the 0+ to 0+ decay of [symbol]Ar and isospin breaking in [symbol]Cl and [symbol]Cl / A. Garcia -- Nuclear constraints on the weak nucleon-nucleon interaction / W. C. Haxton -- Atomic PNC theory: current status and future prospects / M. S. Safronova -- Parity-violating nucleon-nucleon interactions: what can we learn from nuclear anapole moments? / B. Desplanques -- Proposed experiment for the measurement of the anapole moment in francium / A. Perez Galvan ... [et al.] -- The Radon-EDM experiment / Tim Chupp for the Radon-EDM collaboration -- The lead radius experiment (PREX) and parity-violating measurements of neutron densities / C. J. Horowitz -- Nuclear structure aspects of the Schiff moment and search for collective enhancements / Naftali Auerbach and Vladimir Zelevinsky -- The interpretation of atomic electric dipole moments: Schiff theorem and its corrections / C.-P. Liu -- T-violation and the search for a permanent electric dipole moment of the mercury atom / M. D. Swallows ... [et al.] -- The new concept for FRIB and its potential for fundamental interactions studies / Guy Savard -- Collinear laser spectroscopy and polarized exotic nuclei at NSCL / K. Minamisono -- Environmental dependence of masses and coupling constants / M. Pospelov.

  18. Quantum electrodynamics and fundamental constants

    NASA Astrophysics Data System (ADS)

    Wundt, Benedikt Johannes Wilhelm

    The unprecedented precision achieved both in experimental measurements and in the theoretical description of atomic bound states makes them an ideal study object for fundamental physics and the determination of fundamental constants. This requires a careful study of the effects of quantum electrodynamics (QED) on the interaction between the electron and the nucleus. The two theoretical approaches for the evaluation of QED corrections are presented and discussed. Due to the presence of two energy scales from the binding potential and the radiation field, an overlapping parameter has to be used in both approaches in order to separate the energy scales. The different choices for the overlapping parameter in the two methods are further illustrated in a model example. With the nonrelativistic theory, relativistic corrections of order (Zα)² to the two-photon decay rate of ionic states are calculated, as well as the leading radiative corrections of order α(Zα)² ln[(Zα)⁻²]. It is shown that the correction is gauge-invariant under a "hybrid" gauge transformation between Coulomb and Yennie gauge. Furthermore, QED corrections for Rydberg states in one-electron ions are investigated. The smallness of the corrections and the absence of nuclear size corrections enable very accurate theoretical predictions. By measuring transition frequencies and comparing them to the theoretical predictions, QED theory can be tested more precisely; in turn, this could yield a more accurate value for the Rydberg constant. Using a transition in a nucleus with a well-determined mass as a reference, a comparison with transitions in other nuclei can even allow nuclear masses to be determined. Finally, in order to avoid an additional uncertainty in nuclei with nonzero nuclear spin, QED self-energy corrections to the hyperfine structure up to order α(Zα)² ΔE_HFS are determined for highly excited Rydberg states.

  19. Fundamental enabling issues in nanotechnology :

    SciTech Connect

    Floro, Jerrold Anthony; Foiles, Stephen Martin; Hearne, Sean Joseph; Hoyt, Jeffrey John; Seel, Steven Craig; Webb III, Edmund Blackburn; Morales, Alfredo Martin; Zimmerman, Jonathan A.

    2007-10-01

    To effectively integrate nanotechnology into functional devices, fundamental aspects of material behavior at the nanometer scale must be understood. Stresses generated during thin film growth strongly influence component lifetime and performance; stress has also been proposed as a mechanism for stabilizing supported nanoscale structures. Yet the intrinsic connections between the evolving morphology of supported nanostructures and stress generation are still a matter of debate. This report presents results from a combined experiment and modeling approach to study stress evolution during thin film growth. Fully atomistic simulations are presented predicting stress generation mechanisms and magnitudes during all growth stages, from island nucleation to coalescence and film thickening. Simulations are validated by electrodeposition growth experiments, which establish the dependence of microstructure and growth stresses on process conditions and deposition geometry. Sandia is one of the few facilities with the resources to combine experiments and modeling/theory in this close a fashion. Experiments revealed an ongoing coalescence process that generates significant tensile stress. Data from deposition experiments also support the existence of a kinetically limited compressive stress generation mechanism. Atomistic simulations explored island coalescence and deposition onto surfaces intersected by grain boundary structures to permit investigation of stress evolution during later growth stages, e.g., continual island coalescence and adatom incorporation into grain boundaries. The predictive capabilities of simulation permit direct determination of fundamental processes active in stress generation at the nanometer scale while connecting those processes, via new theory, to continuum models for much larger island and film structures. Our combined experiment and simulation results reveal the necessary materials science to tailor stress, and therefore performance, in

  20. Fundamental Physics and Precision Measurements

    NASA Astrophysics Data System (ADS)

    Hänsch, T. W.

    2006-11-01

    "Very high precision physics has always appealed to me. The steady improvement in technologies that afford higher and higher precision has been a regular source of excitement and challenge during my career. In science, as in most things, whenever one looks at something more closely, new aspects almost always come into play …" With these word from the book "How the Laser happened", Charles H. Townes expresses a passion for precision that is now shared by many scientists. Masers and lasers have become indispensible tools for precision measurements. During the past few years, the advent of femtosecond laser frequency comb synthesizers has revolutionized the art of directly comparing optical and microwave frequencies. Inspired by the needs of precision laser spectroscopy of the simple hydrogen atom, such frequency combs are now enabling ultra-precise spectroscopy over wide spectral ranges. Recent laboratory experiments are already setting stringent limits for possible slow variations of fundamental constants. Laser frequency combs also provide the long missing clockwork for optical atomic clocks that may ultimately reach a precision of parts in 1018 and beyond. Such tools will open intriguing new opportunities for fundamental experiments including new tests of special and general relativity. In the future, frequency comb techniques may be extended into the extreme ultraviolet and soft xray regime, opening a vast new spectral territory to precision measurements. Frequency combs have also become a key tool for the emerging new field of attosecond science, since they can control the electric field of ultrashort laser pulses on an unprecedented time scale. The biggest surprise in these endeavours would be if we found no surprise.

  1. Minimal but non-minimal inflation and electroweak symmetry breaking

    SciTech Connect

    Marzola, Luca; Racioppi, Antonio

    2016-10-07

    We consider the most minimal scale invariant extension of the standard model that allows for successful radiative electroweak symmetry breaking and inflation. The framework involves an extra scalar singlet that plays the role of the inflaton and is compatible with current experimental bounds owing to its non-minimal coupling to gravity. This inflationary scenario predicts a very low tensor-to-scalar ratio r ≈ 10⁻³, typical of Higgs-inflation models, but in contrast yields a scalar spectral index n_s ≃ 0.97 which departs from the Starobinsky limit. We briefly discuss the collider phenomenology of the framework.

  2. Assessment Principles and Tools

    PubMed Central

    Golnik, Karl C.

    2014-01-01

    The goal of ophthalmology residency training is to produce competent ophthalmologists. Competence can only be determined by appropriately assessing resident performance. There are accepted guiding principles that should be applied to competence assessment methods. These principles are enumerated herein and ophthalmology-specific assessment tools that are available are described. PMID:24791100

  3. How Do Fundamental Christians Deal with Depression?

    ERIC Educational Resources Information Center

    Spinney, Douglas Harvey

    1991-01-01

    Provides explanation of developmental dynamics in experience of fundamental Christians that provoke reactive depression. Describes depressant retardant defenses against depression that have been observed in Christian fundamental subculture. Suggests four counseling strategies for helping fundamentalists. (Author/ABL)

  4. Fundamental limits for cooling of linear quantum refrigerators.

    PubMed

    Freitas, Nahuel; Paz, Juan Pablo

    2017-01-01

    We study the asymptotic dynamics of arbitrary linear quantum open systems that are periodically driven while coupled with generic bosonic reservoirs. We obtain exact results for the heat flowing from each reservoir, and these results are valid beyond the weak-coupling or Markovian approximations. We prove the validity of the dynamical third law of thermodynamics (Nernst unattainability principle), showing that the ultimate limit for cooling is imposed by a fundamental heating mechanism that dominates at low temperatures, namely the nonresonant creation of excitation pairs in the reservoirs induced by the driving field. This quantum effect, which is missed in the weak-coupling approximation, restores the unattainability principle, the validity of which was recently challenged.

  5. Fundamental limits for cooling of linear quantum refrigerators

    NASA Astrophysics Data System (ADS)

    Freitas, Nahuel; Paz, Juan Pablo

    2017-01-01

    We study the asymptotic dynamics of arbitrary linear quantum open systems that are periodically driven while coupled with generic bosonic reservoirs. We obtain exact results for the heat flowing from each reservoir, and these results are valid beyond the weak-coupling or Markovian approximations. We prove the validity of the dynamical third law of thermodynamics (Nernst unattainability principle), showing that the ultimate limit for cooling is imposed by a fundamental heating mechanism that dominates at low temperatures, namely the nonresonant creation of excitation pairs in the reservoirs induced by the driving field. This quantum effect, which is missed in the weak-coupling approximation, restores the unattainability principle, the validity of which was recently challenged.

  6. Communication: Fitting potential energy surfaces with fundamental invariant neural network

    NASA Astrophysics Data System (ADS)

    Shao, Kejie; Chen, Jun; Zhao, Zhiqiang; Zhang, Dong H.

    2016-08-01

    A more flexible neural network (NN) method using the fundamental invariants (FIs) as the input vector is proposed for the construction of potential energy surfaces for molecular systems involving identical atoms. Mathematically, FIs finitely generate the permutation invariant polynomial (PIP) ring. In combination with NN, the fundamental invariant neural network (FI-NN) can approximate any function to arbitrary accuracy. Because FI-NN minimizes the size of the input permutation invariant polynomials, it can efficiently reduce the evaluation time of the potential energy, in particular for polyatomic systems. In this work, we provide the FIs for all possible molecular systems up to five atoms. Potential energy surfaces for OH3 and CH4 were constructed with FI-NN, with the accuracy confirmed by full-dimensional quantum dynamical scattering and bound state calculations.

  7. Lanthanide upconversion luminescence at the nanoscale: fundamentals and optical properties

    NASA Astrophysics Data System (ADS)

    Nadort, Annemarie; Zhao, Jiangbo; Goldys, Ewa M.

    2016-07-01

    Upconversion photoluminescence is a nonlinear effect where multiple lower energy excitation photons produce higher energy emission photons. This fundamentally interesting process has many applications in biomedical imaging, light source and display technology, and solar energy harvesting. In this review we discuss the underlying physical principles and their modelling using rate equations. We discuss how the understanding of photophysical processes enabled a strategic influence over the optical properties of upconversion especially in rationally designed materials. We subsequently present an overview of recent experimental strategies to control and optimize the optical properties of upconversion nanoparticles, focussing on their emission spectral properties and brightness.

  8. Einstein's Boxes: Incompleteness of Quantum Mechanics Without a Separation Principle

    NASA Astrophysics Data System (ADS)

    Held, Carsten

    2015-09-01

    Einstein made several attempts to argue for the incompleteness of quantum mechanics (QM), not all of them using a separation principle. One unpublished example, the box parable, has received increased attention in the recent literature. Though the example is tailor-made for applying a separation principle and Einstein indeed applies one, he begins his discussion without it. An analysis of this first part of the parable naturally leads to an argument for incompleteness not involving a separation principle. I discuss the argument and its systematic import. Though it should be kept in mind that the argument is not the one Einstein intends, I show how it suggests itself and leads to a conflict between QM's completeness and a physical principle more fundamental than the separation principle, i.e. a principle saying that QM should deliver probabilities for physical systems possessing properties at definite times.

  9. Future Fundamental Combustion Research for Aeropropulsion Systems.

    DTIC Science & Technology

    1985-01-01

    Future Fundamental Combustion Research for Aeropropulsion Systems. Edward J. Mularz, Propulsion Laboratory, AVSCOM Research and Technology Laboratories, NASA Lewis Research Center, Cleveland, OH.

  10. Does Minimally Invasive Spine Surgery Minimize Surgical Site Infections?

    PubMed Central

    Patel, Ravish Shammi; Dutta, Shumayou

    2016-01-01

    Study Design Retrospective review of prospectively collected data. Purpose To evaluate the incidence of surgical site infections (SSIs) in minimally invasive spine surgery (MISS) in a cohort of patients and compare with available historical data on SSI in open spinal surgery cohorts, and to evaluate additional direct costs incurred due to SSI. Overview of Literature SSI can lead to prolonged antibiotic therapy, extended hospitalization, repeated operations, and implant removal. Small incisions and minimal dissection intrinsic to MISS may minimize the risk of postoperative infections. However, there is a dearth of literature on infections after MISS and their additional direct financial implications. Methods All patients from January 2007 to January 2015 undergoing posterior spinal surgery with tubular retractor system and microscope in our institution were included. The procedures performed included tubular discectomies, tubular decompressions for spinal stenosis and minimally invasive transforaminal lumbar interbody fusion (TLIF). The incidence of postoperative SSI was calculated and compared to the range of cited SSI rates from published studies. Direct costs were calculated from medical billing for index cases and for patients with SSI. Results A total of 1,043 patients underwent 763 noninstrumented surgeries (discectomies, decompressions) and 280 instrumented (TLIF) procedures. The mean age was 52.2 years, with a male-to-female ratio of 1.08:1. Three infections were encountered with fusion surgeries (mean detection time, 7 days). All three required wound wash and debridement, with one patient requiring unilateral implant removal. The additional direct cost due to infection was $2,678 per 100 MISS-TLIF. SSI increased hospital expenditure per patient 1.5-fold after instrumented MISS. Conclusions The overall infection rate after MISS was 0.29%, with an SSI rate of 0% in non-instrumented MISS and 1.07% with instrumented MISS. MISS can markedly reduce the SSI rate and can be an

  11. Shapes of embedded minimal surfaces

    PubMed Central

    Colding, Tobias H.; Minicozzi, William P.

    2006-01-01

    Surfaces that locally minimize area have been extensively used to model physical phenomena, including soap films, black holes, compound polymers, protein folding, etc. The mathematical field dates to the 1740s but has recently become an area of intense mathematical and scientific study, specifically in the areas of molecular engineering, materials science, and nanotechnology because of their many anticipated applications. In this work, we show that all minimal surfaces are built out of pieces of the surfaces in Figs. 1 and 2. PMID:16847265

  12. Fundamental physics of vacuum electron sources

    NASA Astrophysics Data System (ADS)

    Yamamoto, Shigehiko

    2006-01-01

    The history of electron emission is reviewed from the standpoint of the work function, which determines the electron emission capability, and of applications in the fields of scientific instruments and displays. For years, in thermionic emission, a great deal of effort has been devoted to the search for low work function materials with high melting temperature, while reduction of the local change in time of the work function, rather than the work function itself, has been the main issue of field emission investigations. High brightness and long life are the central targets of emission material investigations for scientific instrument applications, while high current density and low power consumption are the guiding principles for display applications. In most present-day industries, thermionic emission materials are used exclusively in fields requiring high current and high reliability, such as cathode ray tubes, transmission and receiving tubes, x-ray sources and various electron beam machines. Field electron emission sources, however, since being applied to high-resolution electron microscopes in the 1970s, have recently become dominant in research and development in the fields of scientific instruments as well as in the fields of various electron tubes and beam machines. The main issue in this report is to analyse the work function on the atomic scale and thereby to understand the fundamental physics behind the work function, the change in time of the local work function leading to field emission current fluctuation, and the relationship between microscopic (atomic-scale) and macroscopic work functions. Our attempt is presented here, where the work function on the atomic scale is measured by utilizing a scanning tunnelling microscopy technique, and it is made clear how far the local work function extends its influence over neighbouring sites. As a result, a simple relationship is established between microscopic and macroscopic work functions.

  13. Fundamental Scaling Laws in Nanophotonics

    PubMed Central

    Liu, Ke; Sun, Shuai; Majumdar, Arka; Sorger, Volker J.

    2016-01-01

    The success of information technology has clearly demonstrated that miniaturization often leads to unprecedented performance and unanticipated applications. This hypothesis of “smaller-is-better” has motivated optical engineers to build various nanophotonic devices, although an understanding leading to fundamental scaling behavior for this new class of devices is missing. Here we analyze scaling laws for optoelectronic devices operating at micro- and nanometer length scales. We show that optoelectronic device performance scales non-monotonically with device length due to various device tradeoffs, and analyze how both optical and electrical constraints influence device power consumption and operating speed. Specifically, we investigate the direct influence of scaling on the performance of four classes of photonic devices, namely laser sources, electro-optic modulators, photodetectors, and all-optical switches, based on three types of optical resonators: microring, Fabry-Perot cavity, and plasmonic metal nanoparticle. Results show that while microrings and Fabry-Perot cavities can outperform plasmonic cavities at larger length scales, they stop working when the device length drops below 100 nanometers, due to insufficient functionality such as feedback (laser), index modulation (modulator), absorption (detector) or field density (optical switch). Our results provide a detailed understanding of the limits of nanophotonics, towards establishing an opto-electronics roadmap, akin to the International Technology Roadmap for Semiconductors. PMID:27869159

  14. Fundamental Scaling Laws in Nanophotonics

    NASA Astrophysics Data System (ADS)

    Liu, Ke; Sun, Shuai; Majumdar, Arka; Sorger, Volker J.

    2016-11-01

    The success of information technology has clearly demonstrated that miniaturization often leads to unprecedented performance and unanticipated applications. This hypothesis of “smaller-is-better” has motivated optical engineers to build various nanophotonic devices, although an understanding leading to fundamental scaling behavior for this new class of devices is missing. Here we analyze scaling laws for optoelectronic devices operating at micro- and nanometer length scales. We show that optoelectronic device performance scales non-monotonically with device length due to various device tradeoffs, and analyze how both optical and electrical constraints influence device power consumption and operating speed. Specifically, we investigate the direct influence of scaling on the performance of four classes of photonic devices, namely laser sources, electro-optic modulators, photodetectors, and all-optical switches, based on three types of optical resonators: microring, Fabry-Perot cavity, and plasmonic metal nanoparticle. Results show that while microrings and Fabry-Perot cavities can outperform plasmonic cavities at larger length scales, they stop working when the device length drops below 100 nanometers, due to insufficient functionality such as feedback (laser), index modulation (modulator), absorption (detector) or field density (optical switch). Our results provide a detailed understanding of the limits of nanophotonics, towards establishing an opto-electronics roadmap, akin to the International Technology Roadmap for Semiconductors.

  15. Fundamental studies of fusion plasmas

    SciTech Connect

    Aamodt, R.E.; Catto, P.J.; D'Ippolito, D.A.; Myra, J.R.; Russell, D.A.

    1992-05-26

    The major portion of this program is devoted to critical ICH phenomena. The topics include edge physics, fast wave propagation, ICH-induced high frequency instabilities, and a preliminary antenna design for Ignitor. This research was strongly coordinated with the world's experimental and design teams at JET, Culham, ORNL, and Ignitor. The results have been widely publicized at both general scientific meetings and topical workshops, including the specialty workshop on ICRF design and physics sponsored by Lodestar in April 1992. The combination of theory, empirical modeling, and engineering design in this program makes this research particularly important for the design of future devices and for the understanding and performance projections of present tokamak devices. Additionally, the development of a diagnostic for runaway electrons on TEXT has proven particularly useful for the fundamental understanding of energetic electron confinement. This work has led to a better quantitative basis for quasilinear theory and for assessing the role of magnetic vs. electrostatic field fluctuations in electron transport. An APS invited talk was given on this subject and collaboration with PPPL personnel was also initiated. Ongoing research on these topics will continue for the remainder of the contract period, and the strong collaborations are expected to continue, enhancing both the relevance of the work and its immediate impact on areas needing critical understanding.

  16. Fundamental Scaling Laws in Nanophotonics.

    PubMed

    Liu, Ke; Sun, Shuai; Majumdar, Arka; Sorger, Volker J

    2016-11-21

    The success of information technology has clearly demonstrated that miniaturization often leads to unprecedented performance, and unanticipated applications. This hypothesis of "smaller-is-better" has motivated optical engineers to build various nanophotonic devices, although an understanding leading to fundamental scaling behavior for this new class of devices is missing. Here we analyze scaling laws for optoelectronic devices operating at micro and nanometer length-scale. We show that optoelectronic device performance scales non-monotonically with device length due to the various device tradeoffs, and analyze how both optical and electrical constrains influence device power consumption and operating speed. Specifically, we investigate the direct influence of scaling on the performance of four classes of photonic devices, namely laser sources, electro-optic modulators, photodetectors, and all-optical switches based on three types of optical resonators; microring, Fabry-Perot cavity, and plasmonic metal nanoparticle. Results show that while microrings and Fabry-Perot cavities can outperform plasmonic cavities at larger length-scales, they stop working when the device length drops below 100 nanometers, due to insufficient functionality such as feedback (laser), index-modulation (modulator), absorption (detector) or field density (optical switch). Our results provide a detailed understanding of the limits of nanophotonics, towards establishing an opto-electronics roadmap, akin to the International Technology Roadmap for Semiconductors.

  17. Hyperbolic metamaterials: fundamentals and applications.

    PubMed

    Shekhar, Prashant; Atkinson, Jonathan; Jacob, Zubin

    2014-01-01

    Metamaterials are nano-engineered media with designed properties beyond those available in nature with applications in all aspects of materials science. In particular, metamaterials have shown promise for next generation optical materials with electromagnetic responses that cannot be obtained from conventional media. We review the fundamental properties of metamaterials with hyperbolic dispersion and present the various applications where such media offer potential for transformative impact. These artificial materials support unique bulk electromagnetic states which can tailor light-matter interaction at the nanoscale. We present a unified view of practical approaches to achieve hyperbolic dispersion using thin film and nanowire structures. We also review current research in the field of hyperbolic metamaterials such as sub-wavelength imaging and broadband photonic density of states engineering. The review introduces the concepts central to the theory of hyperbolic media as well as nanofabrication and characterization details essential to experimentalists. Finally, we outline the challenges in the area and offer a set of directions for future work.

  18. BOOK REVIEWS: Quantum Mechanics: Fundamentals

    NASA Astrophysics Data System (ADS)

    Whitaker, A.

    2004-02-01

    mechanics, which is assumed, but to examine whether it gives a consistent account of measurement. The conclusion is that after a measurement, interference terms are ‘effectively’ absent; the set of ‘one-to-one correlations between states of the apparatus and the object’ has the same form as that of everyday statistics and is thus a probability distribution. This probability distribution refers to potentialities, only one of which is actually realized in any one trial. Opinions may differ on whether their treatment is any less vulnerable to criticisms such as those of Bell. To sum up, Gottfried and Yan’s book contains a vast amount of knowledge and understanding. As well as explaining the way in which quantum theory works, it attempts to illuminate fundamental aspects of the theory. A typical example is the ‘fable’ elaborated in Gottfried’s article in Nature cited above, that if Newton were shown Maxwell’s equations and the Lorentz force law, he could deduce the meaning of E and B, but if Maxwell were shown Schrödinger’s equation, he could not deduce the meaning of Ψ. For use with a well-constructed course (and, of course, this is the avowed purpose of the book; a useful range of problems is provided for each chapter), or for the relative expert getting to grips with particular aspects of the subject or aiming for a deeper understanding, the book is certainly ideal. It might be suggested, though, that, even compared to the first edition, the isolated learner might find the wide range of topics, and the very large number of mathematical and conceptual techniques, introduced in necessarily limited space, somewhat overwhelming. The second book under consideration, that of Schwabl, contains ‘Advanced’ elements of quantum theory; it is designed for a course following on from one for which Gottfried and Yan, or Schwabl’s own ‘Quantum Mechanics’, might be recommended. It is the second edition in English, and is a translation of the third German edition

  19. Minimally invasive aortic valve surgery

    PubMed Central

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-01-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold standard treatment for symptomatic patients and has proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and report the results from experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity and mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, with either a partial upper sternotomy or a right anterior minithoracotomy, provides early and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  20. Wilson loops in minimal surfaces

    SciTech Connect

    Drukker, Nadav; Gross, David J.; Ooguri, Hirosi

    1999-04-27

    The AdS/CFT correspondence suggests that the Wilson loop of the large N gauge theory with N = 4 supersymmetry in 4 dimensions is described by a minimal surface in AdS₅ × S⁵. The authors examine various aspects of this proposal, comparing gauge theory expectations with computations of minimal surfaces. There is a distinguished class of loops, which the authors call BPS loops, whose expectation values are free from ultraviolet divergence. They formulate the loop equation for such loops. To the extent that they have checked, the minimal surface in AdS₅ × S⁵ gives a solution of the equation. The authors also discuss the zig-zag symmetry of the loop operator. In the N = 4 gauge theory, they expect the zig-zag symmetry to hold when the loop does not couple to the scalar fields in the supermultiplet. They show how this is realized for the minimal surface.

  1. A Defense of Semantic Minimalism

    ERIC Educational Resources Information Center

    Kim, Su

    2012-01-01

    Semantic Minimalism is a position about the semantic content of declarative sentences, i.e., the content that is determined entirely by syntax. It is defined by the following two points: "Point 1": The semantic content is a complete/truth-conditional proposition. "Point 2": The semantic content is useful to a theory of…

  2. LLNL Waste Minimization Program Plan

    SciTech Connect

    Not Available

    1990-02-14

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). The waste minimization policy field has undergone continuous change since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985, and a series of informal revisions was made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance to hazardous waste generators on the elements of a waste minimization program was issued. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction Reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes, final draft January 1990. This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes, and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs.

  3. What is minimally invasive dentistry?

    PubMed

    Ericson, Dan

    2004-01-01

    Minimally Invasive Dentistry is the application of "a systematic respect for the original tissue." This implies that the dental profession recognizes that an artifact is of less biological value than the original healthy tissue. Minimally invasive dentistry is a concept that can embrace all aspects of the profession. The common delineator is tissue preservation, preferably by preventing disease from occurring and intercepting its progress, but also removing and replacing with as little tissue loss as possible. It does not suggest that we make small fillings to restore incipient lesions or surgically remove impacted third molars without symptoms as routine procedures. The introduction of predictable adhesive technologies has led to a giant leap in interest in minimally invasive dentistry. The concept bridges the traditional gap between prevention and surgical procedures, which is just what dentistry needs today. The evidence-base for survival of restorations clearly indicates that restoring teeth is a temporary palliative measure that is doomed to fail if the disease that caused the condition is not addressed properly. Today, the means, motives and opportunities for minimally invasive dentistry are at hand, but incentives are definitely lacking. Patients and third parties seem to be convinced that the only things that count are replacements. Namely, they are prepared to pay for a filling but not for a procedure that can help avoid having one.

  4. Fundamental mechanisms of micromachine reliability

    SciTech Connect

    DE BOER,MAARTEN P.; SNIEGOWSKI,JEFFRY J.; KNAPP,JAMES A.; REDMOND,JAMES M.; MICHALSKE,TERRY A.; MAYER,THOMAS K.

    2000-01-01

    Due to extreme surface to volume ratios, adhesion and friction are critical properties for the reliability of Microelectromechanical Systems (MEMS), but they are not well understood. In this LDRD the authors established test structures, metrology and numerical modeling to conduct studies on adhesion and friction in MEMS. They then concentrated on measuring the effect of environment on MEMS adhesion. Polycrystalline silicon (polysilicon) is the primary material of interest in MEMS because of its integrated circuit process compatibility, low stress, high strength and conformal deposition nature. A plethora of useful micromachined device concepts have been demonstrated using Sandia National Laboratories' sophisticated in-house capabilities. One drawback to polysilicon is that in air the surface oxidizes, is high energy and is hydrophilic (i.e., it wets easily). This can lead to catastrophic failure because surface forces can cause MEMS parts that are brought into contact to adhere rather than perform their intended function. A fundamental concern is how environmental constituents such as water will affect adhesion energies in MEMS. The authors first demonstrated an accurate method to measure adhesion, as reported in Chapter 1. In Chapters 2 through 5, they then studied the effect of water on adhesion depending on the surface condition (hydrophilic or hydrophobic). As described in Chapter 2, they find that the adhesion energy of hydrophilic MEMS surfaces is high and increases exponentially with relative humidity (RH); surface roughness is the controlling mechanism for this relationship. Adhesion can be reduced by several orders of magnitude by silane coupling agents applied via solution processing, which decrease the surface energy and render the surface hydrophobic (i.e., it does not wet easily). However, only a molecular monolayer coats the surface. In Chapters 3-5 the authors map out the extent to which the monolayer reduces adhesion versus RH. They find that adhesion is independent of

  5. Fundamental Studies of Recombinant Hydrogenases

    SciTech Connect

    Adams, Michael W

    2014-01-25

    This research addressed the long-term goals of understanding the assembly and organization of hydrogenase enzymes, of reducing them in size and complexity, of determining structure/function relationships, including energy conservation via charge separation across membranes, and of screening for novel H2 catalysts. A key overall goal of the proposed research was to define and characterize minimal hydrogenases that are produced in high yields and are oxygen-resistant. Remarkably, in spite of decades of research carried out on hydrogenases, it has not been possible to readily manipulate or design the enzyme using molecular biology approaches, since a recombinant form produced in a suitable host was not available. Such resources are essential if we are to understand what constitutes a "minimal" hydrogenase and design such catalysts with certain properties, such as resistance to oxygen, extreme stability and specificity for a given electron donor. The model system for our studies is Pyrococcus furiosus, a hyperthermophile that grows optimally at 100°C and contains three different nickel-iron ([NiFe]) hydrogenases. Hydrogenases I and II are cytoplasmic, while the other, MBH, is an integral membrane protein that functions both to evolve H2 and to pump protons. Three important breakthroughs were made during the funding period with P. furiosus soluble hydrogenase I (SHI). First, we produced an active recombinant form of SHI in E. coli by the co-expression of sixteen genes using anaerobically-induced promoters. Second, we genetically engineered P. furiosus to overexpress SHI by an order of magnitude compared to the wild type strain. Third, we generated the first ‘minimal’ form of SHI, one that contained two rather than four subunits. This dimeric form was stable and active, and directly interacted with a pyruvate-oxidizing enzyme without any intermediate electron carrier. The research resulted in five peer-reviewed publications.

  6. Archimedes' Principle in Action

    ERIC Educational Resources Information Center

    Kires, Marian

    2007-01-01

    The conceptual understanding of Archimedes' principle can be verified in experimental procedures which determine mass and density using a floating object. This is demonstrated by simple experiments using graduated beakers. (Contains 5 figures.)

  7. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1972-01-01

    Collection of two short descriptions of chemical principles seen in life situations: the autocatalytic reaction seen in the bombardier beetle, and molecular potential energy used for quick roasting of beef. Brief reference is also made to methanol lighters. (PS)

  8. Light modular rig for minimal environment impact

    SciTech Connect

    Mehra, S.; Abedrabbo, A.

    1996-12-31

    The first plenary meeting of the United Nations on the Human Environment in 1972 considered the need for a common outlook and for common principles to inspire and guide the people and industries of the world in the preservation and enhancement of the human environment. Since then many countries have enacted, or are now enacting, environmental legislation covering the wide spectrum of environmental protection issues. The petroleum industry has not been immune to such scrutiny; however, little has changed in land-based drilling operations, especially in remote areas. A major aspect of the ongoing program in the design of a light modular land rig has been minimization of the environmental impact. Today, concerns for protection of the environment have spread to many drilling areas: the use of some traditional drilling techniques such as waste pits is now banned. When rethinking rig hardware and design today, environmental protection needs to be considered at an early stage. There are many incentives for implementation of environmental protection programs, in design and in operation, aside from the regulatory/compliance issue. Waste disposal costs have risen dramatically over the last few years and the trend is expected to continue. Improvements in environmental conditions improve morale and image. There is growing public awareness and realization of man-made harm in many regions of the earth: dangerous levels of pollution in water, air, earth and living beings; major and undesirable disturbances to the ecological balance of the biosphere; destruction and depletion of irreplaceable resources; and gross deficiencies harmful to the physical, mental and social health of man in the living and working environment. This paper discusses the steps taken, early on in the design stage and in operations methodology, to minimize the environmental impact.

  9. Astronomia Motivadora no Ensino Fundamental

    NASA Astrophysics Data System (ADS)

    Melo, J.; Voelzke, M. R.

    2008-09-01

    The main objective of this work is to develop students' interest in the sciences through Astronomy. A survey with questions about Astronomy was carried out with 161 elementary school students in order to discover their prior knowledge of the subject. It was found, for example, that 29.3% of 6th graders correctly answered what an eclipse is, 30.0% of 8th graders knew what Astronomy studies, while 42.3% of 5th graders could define the Sun. The intention is to expand the participating classes and to work, mainly in a practical way, with: dimensions and scales in the Solar System, construction of a small telescope, and questions such as day and night, the seasons of the year, and eclipses. The aim is also to address other physics content, such as optics in the construction of the telescope, and mechanics in working with scales and measurements and in using a lamp to represent the Sun in the eclipse activity; and content from other disciplines, such as mathematics in unit conversions and rules of three; arts in modelling or drawing the planets; history in relation to the search for the origin of the universe; and informatics, which enables faster searches for information as well as simulations and visualization of important images. It is believed that Astronomy is important in the teaching-learning process, because it allows the discussion of intriguing topics such as the origin of the universe, space travel, and the existence or not of life on other planets, in addition to current topics such as new technologies.

  10. Anaesthesia for minimally invasive surgery

    PubMed Central

    Dec, Marta

    2015-01-01

    Minimally invasive surgery (MIS) is rising in popularity. It offers well-known benefits to the patient. However, restricted access to the surgical site and gas insufflation into the body cavities may result in severe complications. From the anaesthetic point of view, MIS poses unique challenges associated with the creation of pneumoperitoneum, carbon dioxide absorption, specific positioning, and the monitoring of a patient to whom the anaesthetist often has restricted access, in a poorly lit environment. Moreover, with the refinement of surgical procedures and growing experience, the anaesthetist is presented with patients from high-risk groups (obese, elderly, with advanced cardiac and respiratory disease) who were once deemed unsuitable for the laparoscopic technique. Anaesthetic management is aimed at getting the patient safely through the procedure, minimizing the specific risks arising from laparoscopy and the patient's coexisting medical problems, ensuring quick recovery and a relatively pain-free postoperative course with early return to normal function. PMID:26865885

  11. Minimal universal quantum heat machine.

    PubMed

    Gelbwaser-Klimovsky, D; Alicki, R; Kurizki, G

    2013-01-01

    In traditional thermodynamics the Carnot cycle yields the ideal performance bound of heat engines and refrigerators. We propose and analyze a minimal model of a heat machine that can play a similar role in quantum regimes. The minimal model consists of a single two-level system with periodically modulated energy splitting that is permanently, weakly, coupled to two spectrally separated heat baths at different temperatures. The equation of motion allows us to compute the stationary power and heat currents in the machine consistent with the second law of thermodynamics. This dual-purpose machine can act as either an engine or a refrigerator (heat pump) depending on the modulation rate. In both modes of operation, the maximal Carnot efficiency is reached at zero power. We study the conditions for finite-time optimal performance for several variants of the model. Possible realizations of the model are discussed.
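
    As background for the Carnot bound invoked above (standard thermodynamics, not a result of this paper), the ideal efficiency of an engine operating between a cold bath at temperature $T_c$ and a hot bath at $T_h$, and the corresponding coefficient of performance of a refrigerator, are

        $$\eta_C = 1 - \frac{T_c}{T_h}, \qquad \mathrm{COP}_C = \frac{T_c}{T_h - T_c}.$$

    In the machine described above, $\eta_C$ is approached only in the zero-power limit.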

  12. Minimal Doubling and Point Splitting

    SciTech Connect

    Creutz, M.

    2010-06-14

    Minimally-doubled chiral fermions have the unusual property of a single local field creating two fermionic species. Spreading the field over hypercubes allows construction of combinations that isolate specific modes. Combining these fields into bilinears produces meson fields of specific quantum numbers. Minimally-doubled fermion actions present the possibility of fast simulations while maintaining one exact chiral symmetry. They do, however, introduce some peculiar aspects. An explicit breaking of hyper-cubic symmetry allows additional counter-terms to appear in the renormalization. While a single field creates two different species, spreading this field over nearby sites allows isolation of specific states and the construction of physical meson operators. Finally, lattice artifacts break isospin and give two of the three pseudoscalar mesons an additional contribution to their mass. Depending on the sign of this mass splitting, one can either have a traditional Goldstone pseudoscalar meson or a parity breaking Aoki-like phase.

  13. Outcomes After Minimally Invasive Esophagectomy

    PubMed Central

    Luketich, James D.; Pennathur, Arjun; Awais, Omar; Levy, Ryan M.; Keeley, Samuel; Shende, Manisha; Christie, Neil A.; Weksler, Benny; Landreneau, Rodney J.; Abbas, Ghulam; Schuchert, Matthew J.; Nason, Katie S.

    2014-01-01

    Background Esophagectomy is a complex operation and is associated with significant morbidity and mortality. In an attempt to lower morbidity, we have adopted a minimally invasive approach to esophagectomy. Objectives Our primary objective was to evaluate the outcomes of minimally invasive esophagectomy (MIE) in a large group of patients. Our secondary objective was to compare the modified McKeown minimally invasive approach (videothoracoscopic surgery, laparoscopy, neck anastomosis [MIE-neck]) with our current approach, a modified Ivor Lewis approach (laparoscopy, videothoracoscopic surgery, chest anastomosis [MIE-chest]). Methods We reviewed 1033 consecutive patients undergoing MIE. Elective operation was performed on 1011 patients; 22 patients with nonelective operations were excluded. Patients were stratified by surgical approach and perioperative outcomes analyzed. The primary endpoint studied was 30-day mortality. Results The MIE-neck was performed in 481 patients (48%) and MIE-Ivor Lewis in 530 (52%). Patients undergoing MIE-Ivor Lewis were operated on in the current era. The median number of lymph nodes resected was 21. The operative mortality was 1.68%. Median length of stay (8 days) and ICU stay (2 days) were similar between the 2 approaches. In the MIE-Ivor Lewis group, the mortality rate was 0.9% and recurrent nerve injury was less frequent (P < 0.001). Conclusions MIE in our center resulted in acceptable lymph node resection, postoperative outcomes, and low mortality using either an MIE-neck or an MIE-chest approach. The MIE-Ivor Lewis approach was associated with reduced recurrent laryngeal nerve injury and a mortality of 0.9% and is now our preferred approach. Minimally invasive esophagectomy can be performed safely, with good results in an experienced center. PMID:22668811

  14. Minimal massive 3D gravity

    NASA Astrophysics Data System (ADS)

    Bergshoeff, Eric; Hohm, Olaf; Merbis, Wout; Routh, Alasdair J.; Townsend, Paul K.

    2014-07-01

    We present an alternative to topologically massive gravity (TMG) with the same ‘minimal’ bulk properties; i.e. a single local degree of freedom that is realized as a massive graviton in linearization about an anti-de Sitter (AdS) vacuum. However, in contrast to TMG, the new ‘minimal massive gravity’ has both a positive energy graviton and positive central charges for the asymptotic AdS-boundary conformal algebra.

  15. Perfusion Magnetic Resonance Imaging: A Comprehensive Update on Principles and Techniques

    PubMed Central

    Li, Ka-Loh; Ostergaard, Leif; Calamante, Fernando

    2014-01-01

    Perfusion is a fundamental biological function that refers to the delivery of oxygen and nutrients to tissue by means of blood flow. Perfusion MRI is sensitive to the microvasculature and has been applied in a wide variety of clinical applications, including the classification of tumors, identification of stroke regions, and characterization of other diseases. Perfusion MRI techniques are classified according to whether or not they use an exogenous contrast agent. Bolus methods, with injections of a contrast agent, provide better sensitivity and higher spatial resolution, and are therefore more widely used in clinical applications. However, arterial spin-labeling methods provide a unique opportunity to measure cerebral blood flow without requiring an exogenous contrast agent and have better accuracy for quantification. Importantly, MRI-based perfusion measurements are minimally invasive overall, and do not use any radiation or radioisotopes. This review summarizes comprehensive, updated knowledge of the physical principles and techniques of perfusion MRI. PMID:25246817
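
    A compact relation underlying bolus-tracking perfusion quantification (standard background, not stated in the abstract) is the central volume principle, which links cerebral blood flow (CBF), cerebral blood volume (CBV), and mean transit time (MTT):

        $$\mathrm{CBF} = \frac{\mathrm{CBV}}{\mathrm{MTT}}.$$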

  16. Structural principles for computational and de novo design of 4Fe-4S metalloproteins.

    PubMed

    Nanda, Vikas; Senn, Stefan; Pike, Douglas H; Rodriguez-Granillo, Agustina; Hansen, Will A; Khare, Sagar D; Noy, Dror

    2016-05-01

    Iron-sulfur centers in metalloproteins can access multiple oxidation states over a broad range of potentials, allowing them to participate in a variety of electron transfer reactions and to serve as catalysts for high-energy redox processes. The nitrogenase FeMoco cluster converts dinitrogen to ammonia in an eight-electron transfer step. The 2(Fe4S4)-containing bacterial ferredoxin is an evolutionarily ancient metalloprotein fold and is thought to be a primordial progenitor of extant oxidoreductases. Controlling chemical transformations mediated by iron-sulfur centers, such as nitrogen fixation and hydrogen production, as well as electron transfer reactions involved in photosynthesis, is of tremendous importance for sustainable chemistry and energy production initiatives. As such, there is significant interest in the design of iron-sulfur proteins as minimal models to gain fundamental understanding of complex natural systems and as lead molecules for industrial and energy applications. Herein, we discuss salient structural characteristics of natural iron-sulfur proteins and how they guide principles for design. Model structures of past designs are analyzed in the context of these principles, potential directions for enhanced designs are presented, and new areas of iron-sulfur protein design are proposed. This article is part of a Special Issue entitled Biodesign for Bioenergetics--the design and engineering of electron transfer cofactors, protein networks, edited by Ronald L. Koder and J.L. Ross Anderson.

  17. Structural principles for computational and de novo design of 4Fe-4S metalloproteins

    PubMed Central

    Nanda, Vikas; Senn, Stefan; Pike, Douglas H.; Rodriguez-Granillo, Agustina; Hansen, Will; Khare, Sagar D.; Noy, Dror

    2017-01-01

    Iron-sulfur centers in metalloproteins can access multiple oxidation states over a broad range of potentials, allowing them to participate in a variety of electron transfer reactions and to serve as catalysts for high-energy redox processes. The nitrogenase FeMoco cluster converts dinitrogen to ammonia in an eight-electron transfer step. The 2(Fe4S4)-containing bacterial ferredoxin is an evolutionarily ancient metalloprotein fold and is thought to be a primordial progenitor of extant oxidoreductases. Controlling chemical transformations mediated by iron-sulfur centers, such as nitrogen fixation and hydrogen production, as well as electron transfer reactions involved in photosynthesis, is of tremendous importance for sustainable chemistry and energy production initiatives. As such, there is significant interest in the design of iron-sulfur proteins as minimal models to gain fundamental understanding of complex natural systems and as lead molecules for industrial and energy applications. Herein, we discuss salient structural characteristics of natural iron-sulfur proteins and how they guide principles for design. Model structures of past designs are analyzed in the context of these principles, potential directions for enhanced designs are presented, and new areas of iron-sulfur protein design are proposed. PMID:26449207

  18. Minimally Informative Prior Distributions for PSA

    SciTech Connect

    Dana L. Kelly; Robert W. Youngblood; Kurt G. Vedros

    2010-06-01

    A salient feature of Bayesian inference is its ability to incorporate information from a variety of sources into the inference model, via the prior distribution (hereafter simply “the prior”). However, over-reliance on old information can lead to priors that dominate new data. Some analysts seek to avoid this by trying to work with a minimally informative prior distribution. Another reason for choosing a minimally informative prior is to avoid the often-voiced criticism of subjectivity in the choice of prior. Minimally informative priors fall into two broad classes: 1) so-called noninformative priors, which attempt to be completely objective, in that the posterior distribution is determined as completely as possible by the observed data, the most well known example in this class being the Jeffreys prior, and 2) priors that are diffuse over the region where the likelihood function is nonnegligible, but that incorporate some information about the parameters being estimated, such as a mean value. In this paper, we compare four approaches in the second class, with respect to their practical implications for Bayesian inference in Probabilistic Safety Assessment (PSA). The most commonly used such prior, the so-called constrained noninformative prior, is a special case of the maximum entropy prior. This is formulated as a conjugate distribution for the most commonly encountered aleatory models in PSA, and is correspondingly mathematically convenient; however, it has a relatively light tail and this can cause the posterior mean to be overly influenced by the prior in updates with sparse data. A more informative prior that is capable, in principle, of dealing more effectively with sparse data is a mixture of conjugate priors. A particular diffuse nonconjugate prior, the logistic-normal, is shown to behave similarly for some purposes. Finally, we review the so-called robust prior. Rather than relying on the mathematical abstraction of entropy, as does the constrained
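
    As a minimal sketch of the conjugate updating discussed above (in Python; the priors and data are hypothetical, not taken from the paper), a gamma prior on a Poisson event rate updates in closed form, which makes it easy to see how strongly the choice of prior pulls the posterior mean when data are sparse:

        def gamma_poisson_update(alpha, beta, events, exposure_time):
            """Posterior of a Poisson rate lambda ~ Gamma(alpha, beta) after
            observing `events` events in `exposure_time` hours."""
            return alpha + events, beta + exposure_time

        # Jeffreys-like diffuse prior vs. a more informative conjugate prior
        for alpha0, beta0, label in [(0.5, 0.0, "diffuse (Jeffreys-like)"),
                                     (2.0, 1000.0, "informative, mean 2e-3/h")]:
            a, b = gamma_poisson_update(alpha0, beta0, events=1, exposure_time=500.0)
            print(f"{label}: posterior mean = {a / b:.2e} per hour")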

  19. The uncertainty principle determines the nonlocality of quantum mechanics.

    PubMed

    Oppenheim, Jonathan; Wehner, Stephanie

    2010-11-19

    Two central concepts of quantum mechanics are Heisenberg's uncertainty principle and a subtle form of nonlocality that Einstein famously called "spooky action at a distance." These two fundamental features have thus far been distinct concepts. We show that they are inextricably and quantitatively linked: Quantum mechanics cannot be more nonlocal with measurements that respect the uncertainty principle. In fact, the link between uncertainty and nonlocality holds for all physical theories. More specifically, the degree of nonlocality of any theory is determined by two factors: the strength of the uncertainty principle and the strength of a property called "steering," which determines which states can be prepared at one location given a measurement at another.
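
    One conventional way to make the "degree of nonlocality" concrete (a background convention, not spelled out in the abstract) is the CHSH correlator for two measurement settings per side:

        $$S = \langle A_1 B_1 \rangle + \langle A_1 B_2 \rangle + \langle A_2 B_1 \rangle - \langle A_2 B_2 \rangle.$$

    Local hidden-variable theories obey $|S| \le 2$, quantum mechanics obeys Tsirelson's bound $|S| \le 2\sqrt{2}$, and general no-signalling theories allow up to $|S| = 4$; the result described above explains where a theory falls in this range in terms of its uncertainty and steering properties.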

  20. Confucian Thinking in Traditional Moral Education: Key Ideas and Fundamental Features

    ERIC Educational Resources Information Center

    Fengyan, Wang

    2004-01-01

    Ancient Chinese ideas of moral education could be said to have five main dimensions--philosophical foundations, content, principles, methods and evaluation--which are described in this paper. An analysis of the fundamental features of Confucian thinking on moral education shows that it took the idea that human beings have a good and kind nature…

  1. Evidence for the Fundamental Difference Hypothesis or Not?: Island Constraints Revisited

    ERIC Educational Resources Information Center

    Belikova, Alyona; White, Lydia

    2009-01-01

    This article examines how changes in linguistic theory affect the debate between the fundamental difference hypothesis and the access-to-Universal Grammar (UG) approach to SLA. With a focus on subjacency (Chomsky, 1973), a principle of UG that places constraints on "wh"-movement and that has frequently been taken as a test case for verifying…

  2. Implementing fundamental care in clinical practice.

    PubMed

    Feo, Rebecca; Conroy, Tiffany; Alderman, Jan; Kitson, Alison

    2017-04-05

    Modern healthcare environments are becoming increasingly complex. Delivering high-quality fundamental care in these environments is challenging for nurses and has been the focus of recent media, policy, academic and public scrutiny. Much of this attention arises from evidence that fundamental care is being neglected or delivered inadequately. There are an increasing number of standards and approaches to the delivery of fundamental care, which may result in confusion and additional documentation for nurses to complete. This article provides nurses with an approach to reframe their thinking about fundamental care, to ensure they meet patients' care needs and deliver holistic, person-centred care.

  3. A Principle of Intentionality.

    PubMed

    Turner, Charles K

    2017-01-01

    The mainstream theories and models of the physical sciences, including neuroscience, are all consistent with the principle of causality. Wholly causal explanations make sense of how things go, but are inherently value-neutral, providing no objective basis for true beliefs being better than false beliefs, nor for it being better to intend wisely than foolishly. Dennett (1987) makes a related point in calling the brain a syntactic (procedure-based) engine. He says that you cannot get to a semantic (meaning-based) engine from there. He suggests that folk psychology revolves around an intentional stance that is independent of the causal theories of the brain, and accounts for constructs such as meanings, agency, true belief, and wise desire. Dennett proposes that the intentional stance is so powerful that it can be developed into a valid intentional theory. This article expands Dennett's model into a principle of intentionality that revolves around the construct of objective wisdom. This principle provides a structure that can account for all mental processes, and for the scientific understanding of objective value. It is suggested that science can develop a far more complete worldview with a combination of the principles of causality and intentionality than would be possible with scientific theories that are consistent with the principle of causality alone.

  4. Applying the four principles.

    PubMed

    Macklin, R

    2003-10-01

    Gillon is correct that the four principles provide a sound and useful way of analysing moral dilemmas. As he observes, the approach using these principles does not provide a unique solution to dilemmas. This can be illustrated by alternatives to Gillon's own analysis of the four case scenarios. In the first scenario, a different set of factual assumptions could yield a different conclusion about what is required by the principle of beneficence. In the second scenario, although Gillon's conclusion is correct, what is open to question is his claim that what society regards as the child's best interest determines what really is in the child's best interest. The third scenario shows how it may be reasonable for the principle of beneficence to take precedence over autonomy in certain circumstances, yet like the first scenario, the ethical conclusion relies on a set of empirical assumptions and predictions of what is likely to occur. The fourth scenario illustrates how one can draw different conclusions based on the importance given to the precautionary principle.

  5. A Principle of Intentionality

    PubMed Central

    Turner, Charles K.

    2017-01-01

    The mainstream theories and models of the physical sciences, including neuroscience, are all consistent with the principle of causality. Wholly causal explanations make sense of how things go, but are inherently value-neutral, providing no objective basis for true beliefs being better than false beliefs, nor for it being better to intend wisely than foolishly. Dennett (1987) makes a related point in calling the brain a syntactic (procedure-based) engine. He says that you cannot get to a semantic (meaning-based) engine from there. He suggests that folk psychology revolves around an intentional stance that is independent of the causal theories of the brain, and accounts for constructs such as meanings, agency, true belief, and wise desire. Dennett proposes that the intentional stance is so powerful that it can be developed into a valid intentional theory. This article expands Dennett’s model into a principle of intentionality that revolves around the construct of objective wisdom. This principle provides a structure that can account for all mental processes, and for the scientific understanding of objective value. It is suggested that science can develop a far more complete worldview with a combination of the principles of causality and intentionality than would be possible with scientific theories that are consistent with the principle of causality alone. PMID:28223954

  6. Principles of multisensory behavior.

    PubMed

    Otto, Thomas U; Dassy, Brice; Mamassian, Pascal

    2013-04-24

    The combined use of multisensory signals is often beneficial. Based on neuronal recordings in the superior colliculus of cats, three basic rules were formulated to describe the effectiveness of multisensory signals: the enhancement of neuronal responses to multisensory compared with unisensory signals is largest when signals occur at the same location ("spatial rule"), when signals are presented at the same time ("temporal rule"), and when signals are rather weak ("principle of inverse effectiveness"). These rules are also considered with respect to multisensory benefits as observed with behavioral measures, but do they capture these benefits best? To uncover the principles that rule benefits in multisensory behavior, we here investigated the classical redundant signal effect (RSE; i.e., the speedup of response times in multisensory compared with unisensory conditions) in humans. Based on theoretical considerations using probability summation, we derived two alternative principles to explain the effect. First, the "principle of congruent effectiveness" states that the benefit in multisensory behavior (here the speedup of response times) is largest when behavioral performance in corresponding unisensory conditions is similar. Second, the "variability rule" states that the benefit is largest when performance in corresponding unisensory conditions is unreliable. We then tested these predictions in two experiments, in which we manipulated the relative onset and the physical strength of distinct audiovisual signals. Our results, which are based on a systematic analysis of response time distributions, show that the RSE follows these principles very well, thereby providing compelling evidence in favor of probability summation as the underlying combination rule.
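
    A minimal sketch of the probability-summation baseline used in such analyses (in Python; the reaction-time distributions are hypothetical): for independent channels, the race model predicts the redundant-signals response-time CDF from the two unisensory CDFs as $F_{race}(t) = F_A(t) + F_B(t) - F_A(t)\,F_B(t)$:

        import numpy as np

        def empirical_cdf(samples, t_grid):
            samples = np.sort(np.asarray(samples))
            return np.searchsorted(samples, t_grid, side="right") / len(samples)

        rng = np.random.default_rng(1)
        t = np.linspace(150.0, 600.0, 451)           # time grid in ms
        rt_audio = rng.normal(320.0, 40.0, 1000)     # hypothetical unisensory RTs
        rt_visual = rng.normal(350.0, 60.0, 1000)
        F_a = empirical_cdf(rt_audio, t)
        F_v = empirical_cdf(rt_visual, t)
        F_race = F_a + F_v - F_a * F_v               # independent-race prediction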

  7. Minimizing radiation exposure during percutaneous nephrolithotomy.

    PubMed

    Chen, T T; Preminger, G M; Lipkin, M E

    2015-12-01

    Given recent trends of growing per capita radiation dose from medical sources, there have been increasing concerns over patient radiation exposure. Patients with kidney stones undergoing percutaneous nephrolithotomy (PNL) are at particular risk for high radiation exposure. Several risk factors for increased radiation exposure during PNL exist, including high body mass index, multiple access tracts, and increased stone burden. We herein review recent trends in radiation exposure, radiation exposure during PNL to both patients and urologists, and various approaches to reduce radiation exposure. We discuss incorporating the principles of As Low As Reasonably Achievable (ALARA) into clinical practice and review imaging techniques such as ultrasound and air contrast to guide PNL access. Alternative surgical techniques and approaches to reducing radiation exposure, including retrograde intra-renal surgery, retrograde nephrostomy, endoscopic-guided PNL, and minimally invasive PNL, are also highlighted. It is important for urologists to be aware of these concepts and techniques when treating stone patients with PNL. The discussions outlined will assist urologists in providing patient counseling and a high quality of care.

  8. U.S. Geological Survey Fundamental Science Practices

    USGS Publications Warehouse

    ,

    2011-01-01

    The USGS has a long and proud tradition of objective, unbiased science in service to the Nation. A reputation for impartiality and excellence is one of our most important assets. To help preserve this vital asset, in 2004 the Executive Leadership Team (ELT) of the USGS was charged by the Director to develop a set of fundamental science practices, philosophical premises, and operational principles as the foundation for all USGS research and monitoring activities. In a concept document, 'Fundamental Science Practices of the U.S. Geological Survey', the ELT proposed 'a set of fundamental principles to underlie USGS science practices.' The document noted that protecting the reputation of USGS science for quality and objectivity requires the following key elements:
    - Clearly articulated, Bureau-wide fundamental science practices.
    - A shared understanding at all levels of the organization that the health and future of the USGS depend on following these practices.
    - The investment of budget, time, and people to ensure that the USGS reputation and high-quality standards are maintained.
    The USGS Fundamental Science Practices (FSP) encompass all elements of research investigations, including data collection, experimentation, analysis, writing results, peer review, management review, and Bureau approval and publication of information products. The focus of FSP is on how science is carried out and how products are produced and disseminated. FSP is not designed to address the question of what work the USGS should do; that is addressed in USGS science planning handbooks and other documents. Building from longstanding existing USGS policies and the ELT concept document, in May 2006, FSP policies were developed with input from all parts of the organization and were subsequently incorporated into the Bureau's Survey Manual. In developing an implementation plan for FSP policy, the intent was to recognize and incorporate the best of USGS current practices to obtain the optimum

  9. Atomically Precise Colloidal Metal Nanoclusters and Nanoparticles: Fundamentals and Opportunities.

    PubMed

    Jin, Rongchao; Zeng, Chenjie; Zhou, Meng; Chen, Yuxiang

    2016-09-28

    Colloidal nanoparticles are being intensely pursued in current nanoscience research. Nanochemists are often frustrated by the well-known fact that no two nanoparticles are the same, which precludes the deep understanding of many fundamental properties of colloidal nanoparticles in which the total structures (core plus surface) must be known. Therefore, controlling nanoparticles with atomic precision and solving their total structures have long been major dreams for nanochemists. Recently, these goals are partially fulfilled in the case of gold nanoparticles, at least in the ultrasmall size regime (1-3 nm in diameter, often called nanoclusters). This review summarizes the major progress in the field, including the principles that permit atomically precise synthesis, new types of atomic structures, and unique physical and chemical properties of atomically precise nanoparticles, as well as exciting opportunities for nanochemists to understand very fundamental science of colloidal nanoparticles (such as the stability, metal-ligand interfacial bonding, ligand assembly on particle surfaces, aesthetic structural patterns, periodicities, and emergence of the metallic state) and to develop a range of potential applications such as in catalysis, biomedicine, sensing, imaging, optics, and energy conversion. Although most of the research activity currently focuses on thiolate-protected gold nanoclusters, important progress has also been achieved in other ligand-protected gold, silver, and bimetal (or alloy) nanoclusters. All of these types of unique nanoparticles will bring unprecedented opportunities, not only in understanding the fundamental questions of nanoparticles but also in opening up new horizons for scientific studies of nanoparticles.

  10. Action principles in nature

    NASA Astrophysics Data System (ADS)

    Barrow, John D.; Tipler, Frank J.

    1988-01-01

    Physical theories have their most fundamental expression as action integrals. This suggests that the total action of the universe is the most fundamental physical quantity, and hence finite. In this article it is argued that finite universal action implies that the universe is spatially closed. Further, the possible spatial topologies, the types of matter that can dominate the early universe dynamics, and the form of any quadratic additions to the lagrangian of general relativity are constrained. Initial and final cosmological curvature singularities are required to avoid a universal action singularity.

  11. Fundamental composite electroweak dynamics: Status at the LHC

    NASA Astrophysics Data System (ADS)

    Arbey, Alexandre; Cacciapaglia, Giacomo; Cai, Haiying; Deandrea, Aldo; Le Corre, Solène; Sannino, Francesco

    2017-01-01

    Using the recent joint results from the ATLAS and CMS collaborations on the Higgs boson, we determine the current status of composite electroweak dynamics models based on the expected scalar sector. Our analysis can be used as a minimal template for a wider class of models between the two limiting cases of composite Goldstone Higgs and Technicolor-like ones. This is possible due to the existence of a unified description, both at the effective and fundamental Lagrangian levels, of models of composite Higgs dynamics where the Higgs boson itself can emerge, depending on the way the electroweak symmetry is embedded, either as a pseudo-Goldstone boson or as a massive excitation of the condensate. In our template, a mass term for the fermions in the fundamental theory acts as a stabilizer of the Higgs potential, without the need for partners of the top quark. We constrain the available parameter space at the effective Lagrangian level. We show that a wide class of models of fundamental composite electroweak dynamics are still compatible with the present constraints. The results are relevant for the ongoing and future searches at the Large Hadron Collider.

  12. Spaceborne receivers: Basic principles

    NASA Technical Reports Server (NTRS)

    Stacey, J. M.

    1984-01-01

    The underlying principles of operation of microwave receivers for space observations of planetary surfaces were examined. The design philosophy of the receiver as it is applied to operate functionally as an efficient receiving system, the principles of operation of the key components of the receiver, and the important differences among receiver types are explained. The operating performance and the sensitivity expectations for both the modulated and total power receiver configurations are outlined. The expressions are derived from first principles and are developed through the important intermediate stages to form practical and easily applied equations. The transfer of thermodynamic energy from point to point within the receiver is illustrated. The language of microwave receivers is applied statistics.

  13. Basic Principles of Chromatography

    NASA Astrophysics Data System (ADS)

    Ismail, Baraem; Nielsen, S. Suzanne

    Chromatography has a great impact on all areas of analysis and, therefore, on the progress of science in general. Chromatography differs from other methods of separation in that a wide variety of materials, equipment, and techniques can be used. [Readers are referred to references (1-19) for general and specific information on chromatography.]. This chapter will focus on the principles of chromatography, mainly liquid chromatography (LC). Detailed principles and applications of gas chromatography (GC) will be discussed in Chap. 29. In view of its widespread use and applications, high-performance liquid chromatography (HPLC) will be discussed in a separate chapter (Chap. 28). The general principles of extraction are first described as a basis for understanding chromatography.
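
    Two textbook quantities that anchor these principles (standard definitions, not specific to this chapter) are the retention factor $k$ of a solute and the resolution $R_s$ between two adjacent peaks:

        $$k = \frac{t_R - t_0}{t_0}, \qquad R_s = \frac{2\,(t_{R,2} - t_{R,1})}{w_1 + w_2},$$

    where $t_R$ is the retention time of the solute, $t_0$ the column dead time, and $w_i$ the widths of the peaks at their base.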

  14. In quest of constitutional principles of "neurolaw".

    PubMed

    Pizzetti, Federico Gustavo

    2011-01-01

    The growing use of brain imaging technology and the development of cognitive neuroscience pose unaccustomed challenges to legal systems. Until now, the fields of law most affected are civil and criminal law and procedure, but the constitutional dimension of "neurolaw" should not be underestimated. As the capacity to investigate and to trace brain mechanisms and functional neural activities increases, it becomes urgent to recognize and define the inalienable rights and fundamental values that must be protected and safeguarded at the "constitutional level" of norms in respect of this new techno-scientific power, such as: human dignity, personal identity, authenticity and the pursuit of individual "happiness". As with the law regulating research and experimentation on the human genome adopted in past years, one may also ask whether the above-mentioned fundamental principles of "neurolaw" must be fixed and disciplined at the European and international level as well.

  15. Compression as a Universal Principle of Animal Behavior

    ERIC Educational Resources Information Center

    Ferrer-i-Cancho, Ramon; Hernández-Fernández, Antoni; Lusseau, David; Agoramoorthy, Govindasamy; Hsu, Minna J.; Semple, Stuart

    2013-01-01

    A key aim in biology and psychology is to identify fundamental principles underpinning the behavior of animals, including humans. Analyses of human language and the behavior of a range of non-human animal species have provided evidence for a common pattern underlying diverse behavioral phenomena: Words follow Zipf's law of brevity (the…

  16. Findings in Experimental Psychology as Functioning Principles of Theatrical Design.

    ERIC Educational Resources Information Center

    Caldwell, George

    A gestalt approach to theatrical design seems to provide some ready and stable explanations for a number of issues in the scenic arts. Gestalt serves as the theoretical base for a number of experiments in psychology whose findings appear to delineate the principles of art to be used in scene design. The fundamental notion of gestalt theory…

  17. Chemical principles of single-molecule electronics

    NASA Astrophysics Data System (ADS)

    Su, Timothy A.; Neupane, Madhav; Steigerwald, Michael L.; Venkataraman, Latha; Nuckolls, Colin

    2016-03-01

    The field of single-molecule electronics harnesses expertise from engineering, physics and chemistry to realize circuit elements at the limit of miniaturization; it is a subfield of nanoelectronics in which the electronic components are single molecules. In this Review, we survey the field from a chemical perspective and discuss the structure-property relationships of the three components that form a single-molecule junction: the anchor, the electrode and the molecular bridge. The spatial orientation and electronic coupling between each component profoundly affect the conductance properties and functions of the single-molecule device. We describe the design principles of the anchor group, the influence of the electronic configuration of the electrode and the effect of manipulating the structure of the molecular backbone and of its substituent groups. We discuss single-molecule conductance switches as well as the phenomenon of quantum interference and then trace their fundamental roots back to chemical principles.
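
    For orientation (a standard result in the field rather than a claim of this Review), single-molecule conductance is usually discussed in the Landauer picture, in which the junction conductance is set by the transmission probabilities of its conduction channels:

        $$G = G_0 \sum_n \tau_n, \qquad G_0 = \frac{2e^2}{h} \approx 77.5\ \mu\mathrm{S},$$

    where $\tau_n$ is the transmission of channel $n$; the anchor, electrode, and backbone chemistry described above all enter through their effect on the $\tau_n$.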

  18. Microrover Operates With Minimal Computation

    NASA Technical Reports Server (NTRS)

    Miller, David P.; Loch, John L.; Gat, Erann; Desai, Rajiv S.; Angle, Colin; Bickler, Donald B.

    1992-01-01

    Small, light, highly mobile robotic vehicles called "microrovers" use sensors and artificial intelligence to perform complicated tasks autonomously. Vehicle navigates, avoids obstacles, and picks up objects using reactive control scheme selected from among few preprogrammed behaviors to respond to environment while executing assigned task. Under development for exploration and mining of other planets. Also useful in firefighting, cleaning up chemical spills, and delivering materials in factories. Reactive control scheme and principle of behavior-description language useful in reducing computational loads in prosthetic limbs and automotive collision-avoidance systems.
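
    A minimal sketch of a priority-ordered reactive control loop of the kind described above (in Python; the sensor names and behaviors are hypothetical, not the actual microrover software):

        def avoid_obstacle(sensors):
            if sensors["front_range_m"] < 0.3:
                return ("turn_deg", 45.0)        # steer away from a nearby obstacle
            return None                          # behavior does not fire

        def seek_target(sensors):
            return ("drive_heading_deg", sensors["bearing_to_target_deg"])

        BEHAVIORS = [avoid_obstacle, seek_target]    # highest priority first

        def select_action(sensors):
            for behavior in BEHAVIORS:
                action = behavior(sensors)
                if action is not None:
                    return action                # first applicable behavior wins
            return ("stop", 0.0)

        print(select_action({"front_range_m": 0.2, "bearing_to_target_deg": 90.0}))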

  19. [Ethical principles in electronvulsivotherapy].

    PubMed

    Richa, S; De Carvalho, W

    2016-12-01

    ECT, or electroconvulsive therapy, is a therapeutic technique invented in 1935 but really developed after World War II, after which it spread widely until the mid-1960s. The origins of this technique, and some forms of stigma, including its portrayal in films, have done much to make it suspect from a moral point of view. The ethical principles that support the establishment of treatment by ECT are those relating to any action in psychiatry and are based, on the one hand, on the founding principles of bioethics: autonomy, beneficence, non-maleficence, and justice; and, on the other hand, on information about the technique and consent to this type of care.

  20. Teaching/learning principles

    NASA Technical Reports Server (NTRS)

    Hankins, D. B.; Wake, W. H.

    1981-01-01

    The potential remote sensing user community is enormous, and the teaching and training tasks are even larger; however, some underlying principles may be synthesized and applied at all levels from elementary school children to sophisticated and knowledgeable adults. The basic rules applying to each of the six major elements of any training course and the underlying principle involved in each rule are summarized. The six identified major elements are: (1) field sites for problems and practice; (2) lectures and inside study; (3) learning materials and resources (the kit); (4) the field experience; (5) laboratory sessions; and (6) testing and evaluation.

  1. Minimally invasive PCNL-MIP.

    PubMed

    Zanetti, Stefano Paolo; Boeri, Luca; Gallioli, Andrea; Talso, Michele; Montanari, Emanuele

    2017-01-01

    Miniaturized percutaneous nephrolithotomy (mini-PCNL) has increased in popularity in recent years and is now widely used to close the therapeutic gap between conventional PCNL and less-invasive procedures such as shock wave lithotripsy (SWL) or flexible ureterorenoscopy (URS) for the treatment of renal stones. However, despite its minimally invasive nature, the superiority of mini-PCNL over conventional procedures in terms of safety, as well as its similar efficacy, is still under debate. The aim of this chapter is to present one of the most recent advancements in mini-PCNL: the Karl Storz "minimally invasive PCNL" (MIP). A literature search for original and review articles either published or e-published up to December 2016 was performed using Google and the PubMed database. Keywords included: minimally invasive PCNL; MIP. The retrieved articles were gathered and examined. The complete MIP set is composed of rigid metallic fiber-optic nephroscopes and metallic operating sheaths of different sizes, according to which the MIP is categorized into extra-small (XS), small (S), medium (M) and large (L). Dilation can be performed either in one step or with a progressive technique, as needed. The reusable devices of the MIP set and the vacuum cleaner effect make PCNL with this set an inexpensive procedure. The possibility of shifting from a small to a larger instrument within the same set (Matrioska technique) makes MIP a very versatile technique suitable for the treatment of almost any stone. Studies in the literature have shown that MIP is as effective as conventional PCNL, with comparable rates of post-operative complications, independently of stone size. MIP does not represent a new technique, but rather a combination of the last ten years of PCNL improvements in a single system that can transversally cover all available techniques in the panorama of percutaneous stone treatment.

  2. Omics technologies, data and bioinformatics principles.

    PubMed

    Schneider, Maria V; Orchard, Sandra

    2011-01-01

    We provide an overview of the state of the art of the Omics technologies, the types of Omics data, and the bioinformatics resources relevant and related to Omics. We also illustrate the bioinformatics challenges of dealing with high-throughput data. This overview touches on several fundamental aspects of Omics and bioinformatics: data standardisation, data sharing, appropriate storage of Omics data, and exploration of Omics data in bioinformatics. Though the principles and concepts presented hold for the various different technological fields, we concentrate on three main Omics fields, namely genomics, transcriptomics and proteomics. Finally, we address the integration of Omics data and provide several useful links for bioinformatics and Omics.

  3. The uncertainty principle and quantum chaos

    NASA Technical Reports Server (NTRS)

    Chirikov, Boris V.

    1993-01-01

    The conception of quantum chaos is described in some detail. The most striking feature of this novel phenomenon is that all the properties of classical dynamical chaos persist here but, typically, only on finite and different time scales. The ultimate origin of such universal quantum stability lies in the fundamental uncertainty principle, which makes discrete the phase space and, hence, the spectrum of bounded quantum motion. Reformulation of the ergodic theory, as a part of the general theory of dynamical systems, is briefly discussed.

  4. Minimally invasive therapy in Denmark.

    PubMed

    Schou, I

    1993-01-01

    Minimally invasive therapy (MIT) is beginning to have an impact on health care in Denmark, although its diffusion has been delayed compared to that in other European countries. Policy makers are now beginning to appreciate its potential advantages in terms of closing hospitals and shifting treatment to the out-patient setting, and diffusion will probably be faster in the future. Denmark does not have a system for technology assessment, either central or regional, and there is no early-warning mechanism to survey international developments. This implies a lack of possibilities for planning diffusion, training, and criteria for treatment.

  5. Next step in minimally invasive surgery: hybrid image-guided surgery.

    PubMed

    Marescaux, Jacques; Diana, Michele

    2015-01-01

    Surgery, interventional radiology, and advanced endoscopy have all developed minimally invasive techniques to effectively treat a variety of diseases with positive impact on patients' postoperative outcomes. However, those techniques are challenging and require extensive training. Robotics and computer sciences can help facilitate minimally invasive approaches. Furthermore, surgery, advanced endoscopy, and interventional radiology could converge towards a new hybrid specialty, hybrid image-guided minimally invasive therapies, in which the three fundamental disciplines could complement one another to maximize the positive effects and reduce the iatrogenic footprint on patients. The present manuscript describes the fundamental steps of this new paradigm shift in surgical therapies that, in our opinion, will be the next revolutionary step in minimally invasive approaches.

  6. A New Big Five: Fundamental Principles for an Integrative Science of Personality

    ERIC Educational Resources Information Center

    McAdams, Dan P.; Pals, Jennifer L.

    2006-01-01

    Despite impressive advances in recent years with respect to theory and research, personality psychology has yet to articulate clearly a comprehensive framework for understanding the whole person. In an effort to achieve that aim, the current article draws on the most promising empirical and theoretical trends in personality psychology today to…

  7. Radiological images on personal computers: introduction and fundamental principles of digital images.

    PubMed

    Gillespy, T; Rowberg, A H

    1993-05-01

    This series of articles will explore the issues related to displaying, manipulating, and analyzing radiological images on personal computers (PCs). This first article discusses the digital image data file, standard PC graphic file formats, and various methods for importing radiological images into the PC.
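
    As an illustrative sketch of importing a raw digital image (in Python rather than the 1993-era tools the article discusses; the file name, dimensions, and display window are hypothetical), a 16-bit grayscale image can be read and window/leveled to 8 bits for display:

        import numpy as np

        rows, cols = 512, 512                        # hypothetical image dimensions
        raw = np.fromfile("ct_slice.raw", dtype="<u2").reshape(rows, cols)

        window_center, window_width = 1024, 2048     # hypothetical display window
        lo = window_center - window_width / 2
        img = np.clip((raw.astype(float) - lo) / window_width, 0.0, 1.0)
        img8 = (img * 255).astype(np.uint8)          # 8-bit image ready for display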

  8. A primer on the fundamental principles of light microscopy: Optimizing magnification, resolution, and contrast.

    PubMed

    Goodwin, Paul C

    2015-01-01

    The light microscope is an indispensable tool in the study of living organisms. Most biologists are familiar with microscopes, perhaps being first introduced to the wonders of the world of small things at a very early age. Yet, few fully comprehend the nature of microscopy and the basis of its utility. This review (re)-introduces the concepts of magnification, resolution, and contrast, and explores how they are intimately related and necessary for effective microscopy.
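
    The resolution side of that relationship is usually summarized by the Abbe diffraction limit (a standard formula, not unique to this review): for light of wavelength $\lambda$ and an objective of numerical aperture $\mathrm{NA}$, the smallest resolvable separation is

        $$d = \frac{\lambda}{2\,\mathrm{NA}},$$

    so magnification beyond what is needed to make $d$ visible to the eye ("empty magnification") reveals no additional detail.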

  9. Enhancing Student Learning in Marketing Courses: An Exploration of Fundamental Principles for Website Platforms

    ERIC Educational Resources Information Center

    Hollenbeck, Candice R.; Mason, Charlotte H.; Song, Ji Hee

    2011-01-01

    The design of a course has potential to help marketing students achieve their learning objectives. Marketing courses are increasingly turning to technology to facilitate teaching and learning, and pedagogical tools such as Blackboard, WebCT, and e-Learning Commons are essential to the design of a course. Here, the authors investigate the research…

  10. 75 FR 71317 - Fundamental Principles and Policymaking Criteria for Partnerships With Faith-Based and Other...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... against on the basis of religion or religious belief in the administration or distribution of Federal... prospective beneficiaries of the social service programs on the basis of religion or religious belief... against current or prospective program beneficiaries on the basis of religion, a religious belief,...

  11. Nanotechnology in hyperthermia cancer therapy: From fundamental principles to advanced applications.

    PubMed

    Beik, Jaber; Abed, Ziaeddin; Ghoreishi, Fatemeh S; Hosseini-Nami, Samira; Mehrzadi, Saeed; Shakeri-Zadeh, Ali; Kamrava, S Kamran

    2016-08-10

    In this work, we present an in-depth review of recent breakthroughs in nanotechnology for hyperthermia cancer therapy. Conventional hyperthermia methods do not thermally discriminate between the target and the surrounding normal tissues, and this non-selective tissue heating can lead to serious side effects. Nanotechnology is expected to have great potential to revolutionize current hyperthermia methods. To find an appropriate place in cancer treatment, all nanotechnology-based hyperthermia methods and their risks/benefits must be thoroughly understood. In this review paper, we extensively examine and compare four modern nanotechnology-based hyperthermia methods. For each method, the possible physical mechanisms of heat generation and enhancement due to the presence of nanoparticles are explained, and recent in vitro and in vivo studies are reviewed and discussed. Nano-Photo-Thermal Therapy (NPTT) and Nano-Magnetic Hyperthermia (NMH) are reviewed as the two first exciting approaches for targeted hyperthermia. The third novel hyperthermia method, Nano-Radio-Frequency Ablation (NaRFA) is discussed together with the thermal effects of novel nanoparticles in the presence of radiofrequency waves. Finally, Nano-Ultrasound Hyperthermia (NUH) is described as the fourth modern method for cancer hyperthermia.

  12. Two Essays on Learning Disabilities in the Application of Fundamental Financial Principles

    ERIC Educational Resources Information Center

    Auciello, Daria Joy

    2010-01-01

    This dissertation consists of two essays which examine the relationship between dyslexia and the application and acquisition of financial knowledge. Recent behavioral research has documented that factors such as representativeness, overconfidence, loss aversion, naivete, wealth, age and gender all impact a person's risk perception and asset…

  13. Fundamental Quantum 1/F Noise in Ultrasmall Semi Conductor Devices and Their Optimal Design Principles.

    DTIC Science & Technology

    1986-05-01

    quantum 1/f noise will be derived again in three steps: first we consider just a single mode of the electromagnetic field in a coherent state and... Some suggestions are given at the end of Sec. IV. For devices larger than 10-100 microns coherent state quantum 1/f noise becomes important according to

  14. 3 CFR 13559 - Executive Order 13559 of November 17, 2010. Fundamental Principles and Policymaking Criteria for...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... addition to the Co-Chairs, the Working Group shall consist of a senior official with knowledge of policies... services supported in whole or in part with Federal financial assistance, and in their outreach activities... religious activities (including activities that involve overt religious content such as worship,...

  15. New Design of MHC Class II Tetramers to Accommodate Fundamental Principles of Antigen Presentation

    PubMed Central

    Landais, Elise; Romagnoli, Pablo A.; Corper, Adam L.; Shires, John; Altman, John D.; Wilson, Ian A.; Garcia, K. Christopher; Teyton, Luc

    2009-01-01

    Direct identification and isolation of antigen-specific T cells became possible with the development of “MHC tetramers”, based on fluorescent avidins displaying biotinylated peptide-MHC (pMHC) complexes. This approach, extensively used for MHC class I–restricted T cells, has met with very limited success for MHC class II tetramers (pMHCT-2) in the detection of CD4+ specific T cells. In addition, a very large number of these reagents, while capable of specifically activating T cells after being coated on a solid support, are still unable to stain. In order to understand this puzzle and design usable tetramers, we examined each parameter critical for the production of pMHCT-2, using the I-Ad-OVA system as a model. Through this process, the geometry of pMHC display by avidin tetramers was examined, as well as the stability of recombinant MHC molecules. However, we discovered that the most important factor limiting the reactivity of pMHCT-2 was the display of peptides. Indeed, long peptides, as presented by MHC class II molecules, can be bound to I-A/HLA-DQ molecules in more than one register, as suggested by structural studies. This mode of anchorless peptide binding allows the selection of a broader repertoire on single peptides and should favor anti-infectious immune responses. Thus, beyond the technical improvements that we propose, the redesign of pMHCT-2 will give us the tools to evaluate the real size of the CD4 repertoire and help us in the production and testing of new vaccines. PMID:19923463

  16. Integrating Fundamental Principles Underlying Somatic Practices into the Dance Technique Class

    ERIC Educational Resources Information Center

    Brodie, Julie; Lobel, Elin

    2004-01-01

    Integrating somatic practices into the dance technique class by bringing awareness to the bodily processes of breathing, sensing, connecting, and initiating can help students reconnect the mind with the body within the context of the classroom environment. Dance educators do not always have the resources to implement separate somatics courses…

  17. Precautionary principles: a jurisdiction-free framework for decision-making under risk.

    PubMed

    Ricci, Paolo F; Cox, Louis A; MacDonald, Thomas R

    2004-12-01

    Fundamental principles of precaution are legal maxims that ask for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: How can society manage potentially severe, irreversible or serious environmental outcomes when variability, uncertainty, and limited causal knowledge characterize its decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes and new information, and are consistent and replicable. Rational choice of an action from among various alternatives--defined as a choice that makes preferred consequences more likely--requires accounting for costs, benefits and the change in risks associated with each candidate action. Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of expected value of information (VOI), to show the
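
    A minimal sketch of the value-of-information calculation invoked above (in Python; the probabilities and losses are hypothetical): the expected value of perfect information is the difference between the expected loss of the best action chosen now and the expected loss if the uncertain state were known before acting:

        p_harm = 0.2                       # prior probability the hazard is real
        # losses[action][state]: state 0 = harmless, state 1 = harmful
        losses = {"allow":    [0.0, 100.0],
                  "restrict": [10.0, 10.0]}

        def expected_loss(action, p):
            return (1 - p) * losses[action][0] + p * losses[action][1]

        best_now = min(expected_loss(a, p_harm) for a in losses)
        # With perfect information, the best action is chosen in each state:
        loss_perfect = ((1 - p_harm) * min(l[0] for l in losses.values())
                        + p_harm * min(l[1] for l in losses.values()))
        evpi = best_now - loss_perfect     # what resolving the uncertainty is worth
        print(f"EVPI = {evpi:.1f}")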

  18. The Species Delimitation Uncertainty Principle

    PubMed Central

    Adams, Byron J.

    2001-01-01

    If, as Einstein said, "it is the theory which decides what we can observe," then "the species problem" could be solved by simply improving our theoretical definition of what a species is. However, because delimiting species entails predicting the historical fate of evolutionary lineages, species appear to behave according to the Heisenberg Uncertainty Principle, which states that the most philosophically satisfying definitions of species are the least operational, and as species concepts are modified to become more operational they tend to lose their philosophical integrity. Can species be delimited operationally without losing their philosophical rigor? To mitigate the contingent properties of species that tend to make them difficult for us to delimit, I advocate a set of operations that takes into account the prospective nature of delimiting species. Given the fundamental role of species in studies of evolution and biodiversity, I also suggest that species delimitation proceed within the context of explicit hypothesis testing, like other scientific endeavors. The real challenge is not so much the inherent fallibility of predicting the future but rather adequately sampling and interpreting the evidence available to us in the present. PMID:19265874

  19. Individual differences in fundamental social motives.

    PubMed

    Neel, Rebecca; Kenrick, Douglas T; White, Andrew Edward; Neuberg, Steven L

    2016-06-01

    Motivation has long been recognized as an important component of how people both differ from, and are similar to, each other. The current research applies the biologically grounded fundamental social motives framework, which assumes that human motivational systems are functionally shaped to manage the major costs and benefits of social life, to understand individual differences in social motives. Using the Fundamental Social Motives Inventory, we explore the relations among the different fundamental social motives of Self-Protection, Disease Avoidance, Affiliation, Status, Mate Seeking, Mate Retention, and Kin Care; the relationships of the fundamental social motives to other individual difference and personality measures including the Big Five personality traits; the extent to which fundamental social motives are linked to recent life experiences; and the extent to which life history variables (e.g., age, sex, childhood environment) predict individual differences in the fundamental social motives. Results suggest that the fundamental social motives are a powerful lens through which to examine individual differences: They are grounded in theory, have explanatory value beyond that of the Big Five personality traits, and vary meaningfully with a number of life history variables. A fundamental social motives approach provides a generative framework for considering the meaning and implications of individual differences in social motivation.

  20. Fundamentals of Physics, Problem Supplement No. 1

    NASA Astrophysics Data System (ADS)

    Halliday, David; Resnick, Robert; Walker, Jearl

    2000-05-01

    No other book on the market today can match the success of Halliday, Resnick and Walker's Fundamentals of Physics! In a breezy, easy-to-understand style the book offers a solid understanding of fundamental physics concepts, and helps readers apply this conceptual understanding to quantitative problem solving.

  1. Fundamentals of Physics, Student's Solutions Manual

    NASA Astrophysics Data System (ADS)

    Halliday, David; Resnick, Robert; Walker, Jearl

    2000-07-01

    No other book on the market today can match the success of Halliday, Resnick and Walker's Fundamentals of Physics! In a breezy, easy-to-understand style the book offers a solid understanding of fundamental physics concepts, and helps readers apply this conceptual understanding to quantitative problem solving.

  2. Fundamentals of Physics, 7th Edition

    NASA Astrophysics Data System (ADS)

    Halliday, David; Resnick, Robert; Walker, Jearl

    2004-05-01

    No other book on the market today can match the 30-year success of Halliday, Resnick and Walker's Fundamentals of Physics! In a breezy, easy-to-understand style the book offers a solid understanding of fundamental physics concepts, and helps readers apply this conceptual understanding to quantitative problem solving. This book offers a unique combination of authoritative content and stimulating applications.

  3. Principles of Naval Engineering.

    ERIC Educational Resources Information Center

    Naval Personnel Program Support Activity, Washington, DC.

    Fundamentals of shipboard machinery, equipment, and engineering plants are presented in this text prepared for engineering officers. A general description is included of the development of naval ships, ship design and construction, stability and buoyancy, and damage and casualty control. Engineering theories are explained on the background of ship…

  4. Minimizing travel claims cost with minimal-spanning tree model

    NASA Astrophysics Data System (ADS)

    Jamalluddin, Mohd Helmi; Jaafar, Mohd Azrul; Amran, Mohd Iskandar; Ainul, Mohd Sharizal; Hamid, Aqmar; Mansor, Zafirah Mohd; Nopiah, Zulkifli Mohd

    2014-06-01

    Travel claims involve substantial expenditure, as audits by the National Audit Department (NAD) have shown. Every year the auditing process is carried out throughout the country for official travel claims. This study focuses on the use of the minimal-spanning tree model to determine the shortest paths and thereby minimize the cost of the NAD's official travel claims. The objective is to study the feasibility of a travel network from the Kluang District Health Office to eight rural clinics in Johor state, using the spanning tree model to optimize travelling distances, and to recommend that senior management of the Audit Department analyse travelling details before an audit is conducted. The results of this study reveal savings of up to 47.4% of the original claims over the travel distances involved.
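
    A minimal sketch of the underlying computation, with a toy distance matrix standing in for the NAD's actual travel data (node 0 playing the role of the district health office, nodes 1-4 the clinics; all distances are illustrative):

      # Minimal sketch: Kruskal's minimum-spanning-tree algorithm on a
      # hypothetical distance matrix. Node 0 stands in for the district
      # health office, nodes 1-4 for rural clinics; distances (km) are
      # illustrative only.

      def kruskal(n, edges):
          """edges: (weight, u, v) triples; returns (total_weight, tree)."""
          parent = list(range(n))

          def find(x):
              while parent[x] != x:
                  parent[x] = parent[parent[x]]   # path halving
                  x = parent[x]
              return x

          total, tree = 0.0, []
          for w, u, v in sorted(edges):           # cheapest edges first
              ru, rv = find(u), find(v)
              if ru != rv:                        # no cycle: keep the edge
                  parent[ru] = rv
                  total += w
                  tree.append((u, v, w))
          return total, tree

      edges = [(12, 0, 1), (19, 0, 2), (7, 1, 2), (15, 1, 3),
               (9, 2, 3), (21, 0, 3), (11, 3, 4), (14, 2, 4)]
      print(kruskal(5, edges))   # -> (39.0, [(1, 2, 7), (2, 3, 9), ...])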

  5. The minimal time detection algorithm

    NASA Technical Reports Server (NTRS)

    Kim, Sungwan

    1995-01-01

    An aerospace vehicle may operate throughout a wide range of flight environmental conditions that affect its dynamic characteristics. Even when the control design incorporates a degree of robustness, system parameters may drift enough to cause its performance to degrade below an acceptable level. The object of this paper is to develop a change detection algorithm with which a highly adaptive control system applicable to aircraft can be built. The idea is to detect system changes with minimal time delay. The algorithm developed, called the Minimal Time-Change Detection Algorithm (MT-CDA), detects the instant of change as quickly as possible while keeping the false-alarm probability below a specified level. Simulation results for the aircraft lateral motion with a known or unknown change in control gain matrices, in the presence of a doublet input, indicate that the algorithm works as theory predicts, though in some situations it is difficult to determine the exact magnitude of the change. One distinguishing property of the MT-CDA is that its detection delay is superior to that of the whiteness test.
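
    The abstract does not give the MT-CDA recursions, so the sketch below substitutes the classical CUSUM detector, a standard minimal-delay change detector whose threshold controls the false-alarm rate; all names and numbers are illustrative, not the paper's algorithm:

      # Stand-in sketch: one-sided CUSUM for a mean shift in Gaussian
      # residuals (unit variance). The threshold h trades detection delay
      # against the false-alarm rate; all numbers are illustrative.
      import random

      def cusum(samples, mu0=0.0, shift=1.0, h=5.0):
          """Return the first index at which a shift of `shift` is declared."""
          g = 0.0
          for k, x in enumerate(samples):
              # log-likelihood-ratio increment for mean mu0 vs. mu0 + shift
              g = max(0.0, g + shift * (x - mu0 - shift / 2.0))
              if g > h:
                  return k            # change declared
          return None                 # no change detected

      random.seed(1)
      data = ([random.gauss(0.0, 1.0) for _ in range(50)]
              + [random.gauss(1.5, 1.0) for _ in range(50)])
      print("alarm at sample", cusum(data))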

  6. Less minimal supersymmetric standard model

    SciTech Connect

    de Gouvea, Andre; Friedland, Alexander; Murayama, Hitoshi

    1998-03-28

    Most of the phenomenological studies of supersymmetry have been carried out using the so-called minimal supergravity scenario, where one assumes a universal scalar mass, gaugino mass, and trilinear coupling at M_GUT. Even though this is a useful simplifying assumption for phenomenological analyses, it is rather too restrictive to accommodate a large variety of phenomenological possibilities. It predicts, among other things, that the lightest supersymmetric particle (LSP) is an almost pure B-ino, and that the μ-parameter is larger than the masses of the SU(2)_L and U(1)_Y gauginos. We extend the minimal supergravity framework by introducing one extra parameter: the Fayet-Iliopoulos D-term for the hypercharge U(1), D_Y. Allowing for this extra parameter, we find a much more diverse phenomenology, where the LSP is the tau sneutrino ν̃_τ, the stau τ̃, or a neutralino with a large higgsino content. We discuss the relevance of the different possibilities to collider signatures. The same type of extension can be done to models with the gauge mediation of supersymmetry breaking. We argue that it is not wise to impose cosmological constraints on the parameter space.

  7. Green tribology: principles, research areas and challenges.

    PubMed

    Nosonovsky, Michael; Bhushan, Bharat

    2010-10-28

    In this introductory paper for the Theme Issue on green tribology, we discuss the concept of green tribology and its relation to other areas of tribology as well as other 'green' disciplines, namely, green engineering and green chemistry. We formulate the 12 principles of green tribology: the minimization of (i) friction and (ii) wear, (iii) the reduction or complete elimination of lubrication, including self-lubrication, (iv) natural and (v) biodegradable lubrication, (vi) using sustainable chemistry and engineering principles, (vii) biomimetic approaches, (viii) surface texturing, (ix) environmental implications of coatings, (x) real-time monitoring, (xi) design for degradation, and (xii) sustainable energy applications. We further define three areas of green tribology: (i) biomimetics for tribological applications, (ii) environment-friendly lubrication, and (iii) the tribology of renewable-energy application. The integration of these areas remains a primary challenge for this novel area of research. We also discuss the challenges of green tribology and future directions of research.

  8. Next-to-minimal SOFTSUSY

    NASA Astrophysics Data System (ADS)

    Allanach, B. C.; Athron, P.; Tunstall, Lewis C.; Voigt, A.; Williams, A. G.

    2014-09-01

    We describe an extension to the SOFTSUSY program that provides for the calculation of the sparticle spectrum in the Next-to-Minimal Supersymmetric Standard Model (NMSSM), where a chiral superfield that is a singlet of the Standard Model gauge group is added to the Minimal Supersymmetric Standard Model (MSSM) fields. Often, a Z3 symmetry is imposed upon the model. SOFTSUSY can calculate the spectrum in this case as well as the case where general Z3-violating terms are added to the soft supersymmetry breaking terms and the superpotential. The user provides a theoretical boundary condition for the couplings and mass terms of the singlet. Radiative electroweak symmetry breaking data along with electroweak and CKM matrix data are used as weak-scale boundary conditions. The renormalisation group equations are solved numerically between the weak scale and a high energy scale using a nested iterative algorithm. This paper serves as a manual to the NMSSM mode of the program, detailing the approximations and conventions used. Catalogue identifier: ADPM_v4_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADPM_v4_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 154886 No. of bytes in distributed program, including test data, etc.: 1870890 Distribution format: tar.gz Programming language: C++, Fortran. Computer: Personal computer. Operating system: Tested on Linux 3.x. Word size: 64 bits Classification: 11.1, 11.6. Does the new version supersede the previous version?: Yes Catalogue identifier of previous version: ADPM_v3_0 Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 785 Nature of problem: Calculating supersymmetric particle spectrum and mixing parameters in the next-to-minimal supersymmetric standard model. The solution to the

  9. Fermat's Principle Revisited.

    ERIC Educational Resources Information Center

    Kamat, R. V.

    1991-01-01

    A principle is presented to show that, if the time of passage of light is expressible as a function of discrete variables, one may dispense with the more general method of the calculus of variations. The calculus of variations and the alternative are described. The phenomenon of mirage is discussed. (Author/KR)
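
    A worked instance of the shortcut described above (a standard textbook computation, not quoted from the article): for refraction at a plane interface the travel time depends on a single crossing coordinate x, so ordinary differentiation replaces the calculus of variations and yields Snell's law.

      t(x) = \frac{\sqrt{a^2 + x^2}}{v_1} + \frac{\sqrt{b^2 + (d-x)^2}}{v_2},
      \qquad
      \frac{dt}{dx} = \frac{x}{v_1\sqrt{a^2 + x^2}} - \frac{d-x}{v_2\sqrt{b^2 + (d-x)^2}} = 0
      \;\Longrightarrow\;
      \frac{\sin\theta_1}{v_1} = \frac{\sin\theta_2}{v_2}.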

  10. The Idiom Principle Revisited

    ERIC Educational Resources Information Center

    Siyanova-Chanturia, Anna; Martinez, Ron

    2015-01-01

    John Sinclair's Idiom Principle famously posited that most texts are largely composed of multi-word expressions that "constitute single choices" in the mental lexicon. At the time that assertion was made, little actual psycholinguistic evidence existed in support of that holistic, "single choice," view of formulaic language. In…

  11. Principles of Cancer Screening.

    PubMed

    Pinsky, Paul F

    2015-10-01

    Cancer screening has long been an important component of the struggle to reduce the burden of morbidity and mortality from cancer. Notwithstanding this history, many aspects of cancer screening remain poorly understood. This article presents a summary of basic principles of cancer screening that are relevant for researchers, clinicians, and public health officials alike.

  12. Principles of Biomedical Ethics

    PubMed Central

    Athar, Shahid

    2012-01-01

    In this presentation, I will discuss the principles of biomedical and Islamic medical ethics and an interfaith perspective on end-of-life issues. I will also discuss three cases to exemplify some of the conflicts in ethical decision-making. PMID:23610498

  13. Principles of sound ecotoxicology.

    PubMed

    Harris, Catherine A; Scott, Alexander P; Johnson, Andrew C; Panter, Grace H; Sheahan, Dave; Roberts, Mike; Sumpter, John P

    2014-03-18

    We have become progressively more concerned about the quality of some published ecotoxicology research. Others have also expressed concern. It is not uncommon for basic, but extremely important, factors to apparently be ignored. For example, exposure concentrations in laboratory experiments are sometimes not measured, and hence there is no evidence that the test organisms were actually exposed to the test substance, let alone at the stated concentrations. To try to improve the quality of ecotoxicology research, we suggest 12 basic principles that should be considered, not at the point of publication of the results, but during the experimental design. These principles range from carefully considering essential aspects of experimental design through to accurately defining the exposure, as well as unbiased analysis and reporting of the results. Although not all principles will apply to all studies, we offer these principles in the hope that they will improve the quality of the science that is available to regulators. Science is an evidence-based discipline and it is important that we and the regulators can trust the evidence presented to us. Significant resources often have to be devoted to refuting the results of poor research when those resources could be utilized more effectively.

  14. Minimal model of a heat engine: information theory approach.

    PubMed

    Zhou, Yun; Segal, Dvira

    2010-07-01

    We construct a generic model for a heat engine using information theory concepts, attributing irreversible energy dissipation to the information transmission channels. Using several forms for the channel capacity, classical and quantum, we demonstrate that our model recovers both the Carnot principle in the reversible limit, and the universal maximum power efficiency expression of nonreversible thermodynamics in the linear response regime. We expect the model to be very useful as a testbed for studying fundamental topics in thermodynamics, and for providing new insights into the relationship between information theory and actual thermal devices.
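
    For reference, the two limits said to be recovered are standard results (stated here from the general literature, not from the paper's text): the Carnot efficiency in the reversible limit, and an efficiency at maximum power tending to half the Carnot value in linear response, the leading term of the Curzon-Ahlborn expression.

      \eta_{\mathrm{C}} = 1 - \frac{T_c}{T_h},
      \qquad
      \eta_{\mathrm{CA}} = 1 - \sqrt{\frac{T_c}{T_h}}
      = \frac{\eta_{\mathrm{C}}}{2} + \frac{\eta_{\mathrm{C}}^{2}}{8} + \cdots

    so in the linear-response (small η_C) regime the efficiency at maximum power reduces to η_C/2.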

  15. Quantum theory of the generalised uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bruneton, Jean-Philippe; Larena, Julien

    2017-04-01

    We significantly extend previous works on the Hilbert space representations of the generalized uncertainty principle (GUP) in 3 + 1 dimensions of the form [X_i,P_j] = i F_{ij} where F_{ij} = f(P^2) δ_{ij} + g(P^2) P_i P_j for arbitrary functions f and g. However, we restrict our study to the case of commuting X's. We focus in particular on the symmetries of the theory, and the minimal length that emerges in some cases. We first show that, at the algebraic level, there exists an unambiguous mapping between the GUP with a deformed quantum algebra and a quadratic Hamiltonian into a standard, Heisenberg algebra of operators and an aquadratic Hamiltonian, provided the boost sector of the symmetries is modified accordingly. The theory can also be mapped to a completely standard Quantum Mechanics with standard symmetries, but with momentum dependent position operators. Next, we investigate the Hilbert space representations of these algebraically equivalent models, and focus specifically on whether they exhibit a minimal length. We carry out the functional analysis of the various operators involved, and show that the appearance of a minimal length critically depends on the relationship between the generators of translations and the physical momenta. In particular, because this relationship is preserved by the algebraic mapping presented in this paper, when a minimal length is present in the standard GUP, it is also present in the corresponding aquadratic Hamiltonian formulation, despite the perfectly standard algebra of this model. In general, a minimal length requires bounded generators of translations, i.e. a specific kind of quantization of space, and this depends on the precise shape of the function f defined previously. This result provides an elegant and unambiguous classification of which universal quantum gravity corrections lead to the emergence of a minimal length.
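
    For the simplest member of this family, taking f(P^2) = 1 + βP^2 and g = 0 (the familiar Kempf-Mangano-Mann case, used here only as an illustration), the minimal length follows in one line from the uncertainty relation:

      [X, P] = i\hbar\,(1 + \beta P^2)
      \;\Rightarrow\;
      \Delta X\,\Delta P \ge \frac{\hbar}{2}\left(1 + \beta(\Delta P)^2 + \beta\langle P\rangle^2\right)
      \;\Rightarrow\;
      \Delta X_{\min} = \hbar\sqrt{\beta} \quad (\text{at } \langle P\rangle = 0).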

  16. Crystal Structure Prediction for Cyclotrimethylene Trinitramine (RDX) from First Principles

    DTIC Science & Technology

    2009-04-01

    Crystal structure prediction and molecular dynamics methods were applied to the cyclotrimethylene trinitramine (RDX) crystal to explore the stability rankings... 500 high-density structures resulting from molecular packing were minimized and the 14 lowest-energy structures were subjected to isothermal...

  17. Transport Processes from Mechanics: Minimal and Simplest Models

    NASA Astrophysics Data System (ADS)

    Bunimovich, Leonid A.; Grigo, Alexander

    2017-02-01

    We review the current state of a fundamental problem of rigorous derivation of transport processes in classical statistical mechanics from classical mechanics. Such derivations for diffusion and momentum transport (viscosities) were obtained for minimal models of these processes involving one and two particles respectively. However, a minimal model which demonstrates heat conductivity contains three particles. Its rigorous analysis is currently out of reach for existing mathematical techniques. The gas of localized balls is widely accepted as a basis for a simplest model for derivation of Fourier's law. We suggest a modification of the localized balls gas and argue that this gas of localized activated balls is a good candidate to rigorously prove Fourier's law. In particular, hyperbolicity is derived for a reduced version of this model.

  18. Transport Processes from Mechanics: Minimal and Simplest Models

    NASA Astrophysics Data System (ADS)

    Bunimovich, Leonid A.; Grigo, Alexander

    2016-12-01

    We review the current state of a fundamental problem of rigorous derivation of transport processes in classical statistical mechanics from classical mechanics. Such derivations for diffusion and momentum transport (viscosities) were obtained for minimal models of these processes involving one and two particles respectively. However, a minimal model which demonstrates heat conductivity contains three particles. Its rigorous analysis is currently out of reach for existing mathematical techniques. The gas of localized balls is widely accepted as a basis for a simplest model for derivation of Fourier's law. We suggest a modification of the localized balls gas and argue that this gas of localized activated balls is a good candidate to rigorously prove Fourier's law. In particular, hyperbolicity is derived for a reduced version of this model.

  19. Tensorial Minkowski functionals of triply periodic minimal surfaces

    PubMed Central

    Mickel, Walter; Schröder-Turk, Gerd E.; Mecke, Klaus

    2012-01-01

    A fundamental understanding of the formation and properties of a complex spatial structure relies on robust quantitative tools to characterize morphology. A systematic approach to the characterization of average properties of anisotropic complex interfacial geometries is provided by integral geometry which furnishes a family of morphological descriptors known as tensorial Minkowski functionals. These functionals are curvature-weighted integrals of tensor products of position vectors and surface normal vectors over the interfacial surface. We here demonstrate their use by application to non-cubic triply periodic minimal surface model geometries, whose Weierstrass parametrizations allow for accurate numerical computation of the Minkowski tensors. PMID:24098847
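
    Schematically, and in the notation common to this literature rather than quoted from the paper, the tensorial Minkowski functionals of a body K with interface ∂K are curvature-weighted surface integrals of tensor products of position vectors and unit normals:

      W_\nu^{r,s}(K) = \int_{\partial K} G_\nu\,
      \underbrace{\vec{x}\otimes\cdots\otimes\vec{x}}_{r}\otimes
      \underbrace{\vec{n}\otimes\cdots\otimes\vec{n}}_{s}\;\mathrm{d}A,
      \qquad G_1 = 1,\quad G_2 = H,\quad G_3 = K_G,

    with H the mean curvature and K_G the Gaussian curvature of the interface.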

  20. A minimal living system and the origin of a protocell

    NASA Technical Reports Server (NTRS)

    Oro, J.; Lazcano, A.

    1984-01-01

    The essential molecular attributes of a minimal living system are discussed, and the evolution of a protocell from such a system is considered. Present thought on the emergence and evolution of life is summarized, and the complexity of biological systems is reviewed. The five fundamental molecular attributes considered are: informational molecules, catalytic peptides, a decoding and translation system, protoribosomes, and protomembranes. Their functions in a primitive cell are discussed. Positive feedback interaction between proto-RNA, proto-AA-tRNA, and protoenzyme are identified as the three major steps to the formation of a primitive living cell.

  1. Electrosurgery: principles and practice to reduce risk and maximize efficacy.

    PubMed

    Brill, Andrew I

    2011-12-01

    Science becomes art and art becomes function when fundamental principles are utilized to dictate surgical practice. Most important, the risk for inadvertent thermal injury during electrosurgery can be minimized by a sound comprehension of the predictable behaviors of electricity in living tissue. Guided by the Hippocratic charge of primum non nocere, the ultimate aim of energy-assisted surgery is the attainment of anatomic dissection and hemostasis with the least amount of collateral damage and subsequent scar tissue formation. Ideally, the surgeon’s final view of the operative field should accurately approximate the topography discoverable after postoperative healing. Despite the continued innovation of products borne to reduce thermal damage and then marketed as being comparatively safer, it is the hands and mind of the surgeon that serve to preserve tissue integrity by reducing the burden of delayed thermal necrosis and taking steps to prevent excessive devitalization of tissue. Regardless of the chosen modality, the inseparable and exponentially linked elements of time and the quantity of delivered energy must be integrated while purposefully moderating to attain the desired tissue effect. Ultimately, the reduction of unwanted thermal injury is inherently linked to good surgical judgment and technique, a sound comprehension of the applied energy modality, and the surgeon’s ability to recognize anatomic structures within the field of surgical dissection as well as those within the zone of significant thermal change. During the use of any energy-based device for hemostasis, out of sight must never mean out of mind. If the bowel, bladder, or ureter is in close proximity to a bleeder, they should be sufficiently mobilized before applying energy. Thermal energy should always be withheld until an orderly sequence of anatomic triage is carried out. Whenever a vital structure cannot be adequately mobilized, hemorrhage is preferentially controlled by using mechanical

  2. Universal Principles in the Repair of Communication Problems

    PubMed Central

    Dingemanse, Mark; Roberts, Seán G.; Baranova, Julija; Blythe, Joe; Drew, Paul; Floyd, Simeon; Gisladottir, Rosa S.; Kendrick, Kobin H.; Levinson, Stephen C.; Manrique, Elizabeth; Rossi, Giovanni; Enfield, N. J.

    2015-01-01

    There would be little adaptive value in a complex communication system like human language if there were no ways to detect and correct problems. A systematic comparison of conversation in a broad sample of the world’s languages reveals a universal system for the real-time resolution of frequent breakdowns in communication. In a sample of 12 languages of 8 language families of varied typological profiles we find a system of ‘other-initiated repair’, where the recipient of an unclear message can signal trouble and the sender can repair the original message. We find that this system is frequently used (on average about once per 1.4 minutes in any language), and that it has detailed common properties, contrary to assumptions of radical cultural variation. Unrelated languages share the same three functionally distinct types of repair initiator for signalling problems and use them in the same kinds of contexts. People prefer to choose the type that is the most specific possible, a principle that minimizes cost both for the sender being asked to fix the problem and for the dyad as a social unit. Disruption to the conversation is kept to a minimum, with the two-utterance repair sequence being on average no longer than the single utterance which is being fixed. The findings, controlled for historical relationships, situation types and other dependencies, reveal the fundamentally cooperative nature of human communication and offer support for the pragmatic universals hypothesis: while languages may vary in the organization of grammar and meaning, key systems of language use may be largely similar across cultural groups. They also provide a fresh perspective on controversies about the core properties of language, by revealing a common infrastructure for social interaction which may be the universal bedrock upon which linguistic diversity rests. PMID:26375483

  3. Universal Principles in the Repair of Communication Problems.

    PubMed

    Dingemanse, Mark; Roberts, Seán G; Baranova, Julija; Blythe, Joe; Drew, Paul; Floyd, Simeon; Gisladottir, Rosa S; Kendrick, Kobin H; Levinson, Stephen C; Manrique, Elizabeth; Rossi, Giovanni; Enfield, N J

    2015-01-01

    There would be little adaptive value in a complex communication system like human language if there were no ways to detect and correct problems. A systematic comparison of conversation in a broad sample of the world's languages reveals a universal system for the real-time resolution of frequent breakdowns in communication. In a sample of 12 languages of 8 language families of varied typological profiles we find a system of 'other-initiated repair', where the recipient of an unclear message can signal trouble and the sender can repair the original message. We find that this system is frequently used (on average about once per 1.4 minutes in any language), and that it has detailed common properties, contrary to assumptions of radical cultural variation. Unrelated languages share the same three functionally distinct types of repair initiator for signalling problems and use them in the same kinds of contexts. People prefer to choose the type that is the most specific possible, a principle that minimizes cost both for the sender being asked to fix the problem and for the dyad as a social unit. Disruption to the conversation is kept to a minimum, with the two-utterance repair sequence being on average no longer than the single utterance which is being fixed. The findings, controlled for historical relationships, situation types and other dependencies, reveal the fundamentally cooperative nature of human communication and offer support for the pragmatic universals hypothesis: while languages may vary in the organization of grammar and meaning, key systems of language use may be largely similar across cultural groups. They also provide a fresh perspective on controversies about the core properties of language, by revealing a common infrastructure for social interaction which may be the universal bedrock upon which linguistic diversity rests.

  4. On the Minimal Length Uncertainty Relation and the Foundations of String Theory

    DOE PAGES

    Chang, Lay Nam; Lewis, Zachary; Minic, Djordje; ...

    2011-01-01

    We review our work on the minimal length uncertainty relation as suggested by perturbative string theory. We discuss simple phenomenological implications of the minimal length uncertainty relation and then argue that the combination of the principles of quantum theory and general relativity allow for a dynamical energy-momentum space. We discuss the implication of this for the problem of vacuum energy and the foundations of nonperturbative string theory.

  5. Minimizing communication cost among distributed controllers in software defined networks

    NASA Astrophysics Data System (ADS)

    Arlimatti, Shivaleela; Elbreiki, Walid; Hassan, Suhaidi; Habbal, Adib; Elshaikh, Mohamed

    2016-08-01

    Software Defined Networking (SDN) is a new paradigm that increases the flexibility of today's networks by promising a programmable network. The fundamental idea behind this new architecture is to simplify network complexity by decoupling the control plane and the data plane of the network devices, and by making the control plane centralized. Recently, controllers have been distributed to solve the problem of a single point of failure, and to increase scalability and flexibility during workload distribution. Even though controllers are flexible and scalable enough to accommodate more network switches, the intercommunication cost between distributed controllers remains a challenging issue in the Software Defined Network environment. This paper aims to fill the gap by proposing a new mechanism that minimizes intercommunication cost with a graph-partitioning algorithm, an NP-hard problem. The methodology proposed in this paper is the swapping of network elements between controller domains to minimize communication cost by calculating the communication gain. The swapping of elements minimizes inter- and intra-domain communication cost. We validate our work with the OMNeT++ simulation environment. Simulation results show that the proposed mechanism minimizes the inter-domain communication cost among controllers compared to traditional distributed controllers.
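
    A minimal sketch of the swapping idea on a hypothetical two-domain topology (the paper's cost model and OMNeT++ setup are not reproduced here): exchanging switches pays off when cross-domain traffic exceeds in-domain traffic, the classic Kernighan-Lin gain.

      # Minimal sketch: Kernighan-Lin style swapping of switches between
      # two controller domains. traffic[u][v] is a hypothetical flow
      # volume; the topology and numbers are illustrative only.
      from itertools import product

      def gain(u, domain_of, traffic):
          """External-minus-internal traffic of switch u."""
          ext = sum(w for v, w in traffic[u].items()
                    if domain_of[v] != domain_of[u])
          intr = sum(w for v, w in traffic[u].items()
                     if domain_of[v] == domain_of[u])
          return ext - intr

      def swap_gain(u, v, domain_of, traffic):
          """Classic KL gain of exchanging u (domain 0) with v (domain 1)."""
          return (gain(u, domain_of, traffic) + gain(v, domain_of, traffic)
                  - 2 * traffic[u].get(v, 0))

      def refine(domain_of, traffic, rounds=10):
          """Perform the best positive-gain swap until none remains."""
          for _ in range(rounds):
              pairs = [(u, v) for u, v in product(traffic, traffic)
                       if domain_of[u] == 0 and domain_of[v] == 1]
              u, v = max(pairs,
                         key=lambda p: swap_gain(p[0], p[1], domain_of, traffic))
              if swap_gain(u, v, domain_of, traffic) <= 0:
                  break               # local optimum: no swap reduces the cut
              domain_of[u], domain_of[v] = 1, 0
          return domain_of

      traffic = {0: {1: 5, 2: 1}, 1: {0: 5, 3: 1},
                 2: {0: 1, 3: 4}, 3: {1: 1, 2: 4}}
      print(refine({0: 0, 1: 1, 2: 0, 3: 1}, traffic))  # cut drops from 9 to 2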

  6. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information

    PubMed Central

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automata (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity O(n) than a naive comparison of transitions O(n²). Few states need to be refined by the hash table, because most states have been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft’s algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms. PMID:27806102
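
    The backward-depth construction is given only at this level of detail in the abstract, so the sketch below shows just the generic hash-keyed refinement step that such minimizers share (Moore-style partition refinement on a hypothetical four-state DFA):

      # Minimal sketch: Moore-style partition refinement with a hash table
      # keyed on transition signatures. delta[s][c] is the successor of
      # state s on symbol c; the 4-state DFA below is hypothetical. The
      # paper's backward-depth pre-partition would replace the initial
      # accepting/non-accepting split.

      def minimize(n, alphabet, delta, accepting):
          block = {s: int(s in accepting) for s in range(n)}  # coarse split
          while True:
              # Signature: own block plus the blocks of all successors.
              sig = {s: (block[s],) + tuple(block[delta[s][c]] for c in alphabet)
                     for s in range(n)}
              ids = {}                          # hash table: signature -> id
              for s in range(n):
                  ids.setdefault(sig[s], len(ids))
              new_block = {s: ids[sig[s]] for s in range(n)}
              if new_block == block:            # stable partition reached
                  return block                  # states sharing an id merge
              block = new_block

      delta = {0: {'a': 1, 'b': 2}, 1: {'a': 3, 'b': 3},
               2: {'a': 3, 'b': 3}, 3: {'a': 3, 'b': 3}}
      print(minimize(4, 'ab', delta, accepting={3}))  # states 1 and 2 merge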

  7. Update on designing and building minimal cells

    PubMed Central

    Jewett, Michael C.; Forster, Anthony C.

    2010-01-01

    Minimal cells comprise only the genes and biomolecular machinery necessary for basic life. Synthesizing minimal and minimized cells will improve understanding of core biology, enhance development of biotechnology strains of bacteria, and enable evolutionary optimization of natural and unnatural biopolymers. Design and construction of minimal cells is proceeding in two different directions: “top-down” reduction of bacterial genomes in vivo and “bottom-up” integration of DNA/RNA/protein/membrane syntheses in vitro. Major progress in the last 5 years has occurred in synthetic genomics, minimization of the Escherichia coli genome, sequencing of minimal bacterial endosymbionts, identification of essential genes, and integration of biochemical systems. PMID:20638265

  8. Passively minimizing structural sound radiation using shunted piezoelectric materials

    NASA Astrophysics Data System (ADS)

    Bulent Ozer, M.; Royston, Thomas J.

    2003-10-01

    Two methods are presented to determine optimal inductance and resistance values of the shunt circuit across a piezoceramic material, which is bonded to a simply supported plate in order to minimize sound radiation from the plate. The first method (DH) makes use of den Hartog's damped vibration absorber principle. The second method (SM) uses the Sherman Morrison matrix inversion theorem. The effectiveness of each method is compared with regard to minimizing total acoustic sound-power radiation and acoustic pressure at a point. Optimization algorithms and case studies are presented using a linearized model for the piezoceramic and using a nonlinear model for the piezoceramic that accounts for the inherent dielectric hysteresis. Case studies demonstrate that the second method (SM) results in superior performance, under both linear and nonlinear system assumptions. Studies also illustrate that, if the nonlinearity in the system is significant, it must be incorporated in the optimization process.
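
    For context, the den Hartog (DH) route rests on the classical fixed-points tuning of a damped vibration absorber. With μ the absorber-to-primary mass ratio, the textbook optima are as follows (standard results, which the electrical analogue maps onto the shunt inductance and resistance):

      f_{\mathrm{opt}} = \frac{\omega_{\mathrm{abs}}}{\omega_{\mathrm{prim}}} = \frac{1}{1+\mu},
      \qquad
      \zeta_{\mathrm{opt}}^{2} = \frac{3\mu}{8\,(1+\mu)^{3}}.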

  9. Strategies to Minimize Antibiotic Resistance

    PubMed Central

    Lee, Chang-Ro; Cho, Ill Hwan; Jeong, Byeong Chul; Lee, Sang Hee

    2013-01-01

    Antibiotic resistance can be reduced by using antibiotics prudently based on guidelines of antimicrobial stewardship programs (ASPs) and various data such as pharmacokinetic (PK) and pharmacodynamic (PD) properties of antibiotics, diagnostic testing, antimicrobial susceptibility testing (AST), clinical response, and effects on the microbiota, as well as by new antibiotic developments. The controlled use of antibiotics in food animals is another cornerstone among efforts to reduce antibiotic resistance. All major resistance-control strategies recommend education for patients, children (e.g., through schools and day care), the public, and relevant healthcare professionals (e.g., primary-care physicians, pharmacists, and medical students) regarding unique features of bacterial infections and antibiotics, prudent antibiotic prescribing as a positive construct, and personal hygiene (e.g., handwashing). The problem of antibiotic resistance can be minimized only by concerted efforts of all members of society for ensuring the continued efficiency of antibiotics. PMID:24036486

  10. Minimally packed phases in holography

    NASA Astrophysics Data System (ADS)

    Donos, Aristomenis; Gauntlett, Jerome P.

    2016-03-01

    We numerically construct asymptotically AdS black brane solutions of D = 4 Einstein-Maxwell theory coupled to a pseudoscalar. The solutions are holographically dual to d = 3 CFTs at finite chemical potential and in a constant magnetic field, which spontaneously break translation invariance leading to the spontaneous formation of abelian and momentum magnetisation currents flowing around the plaquettes of a periodic Bravais lattice. We analyse the three-dimensional moduli space of lattice solutions, which are generically oblique, and show, for a specific value of the magnetic field, that the free energy is minimised by the triangular lattice, associated with minimal packing of circles in the plane. We show that the average stress tensor for the thermodynamically preferred phase is that of a perfect fluid and that this result applies more generally to spontaneously generated periodic phases. The triangular structure persists at low temperatures indicating the existence of novel crystalline ground states.

  11. [MINIMALLY INVASIVE AORTIC VALVE REPLACEMENT].

    PubMed

    Tabata, Minoru

    2016-03-01

    Minimally invasive aortic valve replacement (MIAVR) is defined as aortic valve replacement avoiding full sternotomy. Common approaches include a partial sternotomy, right thoracotomy, and a parasternal approach. MIAVR has been shown to have advantages over conventional AVR, such as shorter length of stay, less blood transfusion, and better cosmesis. However, it is also known to have disadvantages, such as longer cardiopulmonary bypass and aortic cross-clamp times and potential complications related to peripheral cannulation. Appropriate patient selection is very important. Since the procedure is more complex than conventional AVR, more intensive teamwork in the operating room is essential. Additionally, a team approach during postoperative management is critical to maximize the benefits of MIAVR.

  12. Minimally Invasive Spigelian Hernia Repair

    PubMed Central

    Baucom, Catherine; Nguyen, Quan D.; Hidalgo, Marco

    2009-01-01

    Introduction: Spigelian hernia is an uncommon ventral hernia characterized by a defect in the linea semilunaris. Repair of spigelian hernia has traditionally been accomplished via an open transverse incision and primary repair. The purpose of this article is to present 2 case reports of incarcerated spigelian hernia that were successfully repaired laparoscopically using Gortex mesh and to present a review of the literature regarding laparoscopic repair of spigelian hernias. Methods: Retrospective chart review and Medline literature search. Results: Two patients underwent laparoscopic mesh repair of incarcerated spigelian hernias. Both were started on a regular diet on postoperative day 1 and discharged on postoperative days 2 and 3. One patient developed a seroma that resolved without intervention. There was complete resolution of preoperative symptoms at the 12-month follow-up. Conclusion: Minimally invasive repair of spigelian hernias is an alternative to the traditional open surgical technique. Further studies are needed to directly compare the open and the laparoscopic repair. PMID:19660230

  13. Strategies to minimize antibiotic resistance.

    PubMed

    Lee, Chang-Ro; Cho, Ill Hwan; Jeong, Byeong Chul; Lee, Sang Hee

    2013-09-12

    Antibiotic resistance can be reduced by using antibiotics prudently based on guidelines of antimicrobial stewardship programs (ASPs) and various data such as pharmacokinetic (PK) and pharmacodynamic (PD) properties of antibiotics, diagnostic testing, antimicrobial susceptibility testing (AST), clinical response, and effects on the microbiota, as well as by new antibiotic developments. The controlled use of antibiotics in food animals is another cornerstone among efforts to reduce antibiotic resistance. All major resistance-control strategies recommend education for patients, children (e.g., through schools and day care), the public, and relevant healthcare professionals (e.g., primary-care physicians, pharmacists, and medical students) regarding unique features of bacterial infections and antibiotics, prudent antibiotic prescribing as a positive construct, and personal hygiene (e.g., handwashing). The problem of antibiotic resistance can be minimized only by concerted efforts of all members of society for ensuring the continued efficiency of antibiotics.

  14. The Minimal Cost of Life in Space

    NASA Astrophysics Data System (ADS)

    Drysdale, A.; Rutkze, C.; Albright, L.; Ladue, R.

    Life in space requires protection from the external environment, provision of a suitable internal environment, provision of consumables to maintain life, and removal of wastes. Protection from the external environment will mainly require shielding from radiation and meteoroids. Provision of a suitable environment inside the spacecraft will require provision of suitable air pressure and composition, temperature, and protection from environmental toxins (trace contaminants) and pathogenic micro-organisms. Gravity may be needed for longer missions to avoid excessive changes such as decalcification and muscle degeneration. Similarly, the volume required per crewmember will increase as the mission duration increases. Consumables required include oxygen, food, and water. Nitrogen might be required, depending on the total pressure and non-metabolic losses. We normally provide these consumables from the Earth, with a greater or lesser degree of regeneration. In principle, all consumables can be regenerated. Water and air are easiest to regenerate. At the present time, food can only be regenerated by using plants, and higher plants at that. Waste must be removed, including carbon dioxide and other metabolic waste as well as trash such as food packaging, filters, and expended spare parts. This can be done by dumping or regeneration. The minimal cost of life in space would be achieved by using a synthesis process or system to regenerate all consumables from wastes. As the efficiency of the various processes rises, the minimal cost of life support will fall. However, real-world regeneration requires significant equipment, power, and crew time. Make-up will be required for those items that cannot be economically regenerated. For very inefficient processes, it might be cheaper to ship all or part of the consumables. We are currently far down the development curve, and for short missions it is cheaper to ship consumables. For longer duration missions, greater closure is cost-effective.

  15. Ergonomic T-Handle for Minimally Invasive Surgical Instruments

    PubMed Central

    Parekh, J; Shepherd, DET; Hukins, DWL; Maffulli, N

    2016-01-01

    A T-handle has been designed to be used for minimally invasive implantation of a dynamic hip screw to repair fractures of the proximal femur. It is capable of being used in two actions: (i) push and hold (while using an angle guide) and (ii) application of torque when using the insertion wrench and lag screw tap. The T-handle can be held in a power or precision grip. It is suitable for either single (sterilised by γ-irradiation) or multiple (sterilised by autoclaving) use. The principles developed here are applicable to handles for a wide range of surgical instruments. PMID:27326394

  16. On the quantum mechanical solutions with minimal length uncertainty

    NASA Astrophysics Data System (ADS)

    Shababi, Homa; Pedram, Pouria; Chung, Won Sang

    2016-06-01

    In this paper, we study two generalized uncertainty principles (GUPs) including [X,P] = iℏ(1 + βP^{2j}) and [X,P] = iℏ(1 + βP^2 + kβ^2P^4) which imply minimal measurable lengths. Using two momentum representations, for the former GUP, we find eigenvalues and eigenfunctions of the free particle and the harmonic oscillator in terms of generalized trigonometric functions. Also, for the latter GUP, we obtain quantum mechanical solutions of a particle in a box and harmonic oscillator. Finally we investigate the statistical properties of the harmonic oscillator including partition function, internal energy, and heat capacity in the context of the first GUP.
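
    For the first commutator with j = 1, a standard route to such solutions (the usual momentum-space representation for this class of GUPs, sketched here as an illustration rather than the paper's own construction) deforms the position operator while leaving P multiplicative:

      P\,\psi(p) = p\,\psi(p),
      \qquad
      X\,\psi(p) = i\hbar\,(1 + \beta p^{2})\,\partial_p\,\psi(p),
      \qquad\text{so}\quad
      [X, P]\,\psi = i\hbar\,(1 + \beta p^{2})\,\psi.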

  17. Gauge theories under incorporation of a generalized uncertainty principle

    SciTech Connect

    Kober, Martin

    2010-10-15

    We consider an extension of gauge theories based on the assumption of a generalized uncertainty principle that implies a minimal length scale. A modification of the usual uncertainty principle implies an extended form of matter field equations such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale seems to be a necessary assumption of any consistent quantum theory of gravity, since the gauge principle is a constitutive ingredient of the standard model, and since even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories appears to be a very important consideration.

  18. Fundamentals of preparative and nonlinear chromatography

    SciTech Connect

    Guiochon, Georges A; Felinger, Attila; Katti, Anita; Shirazi, Dean G

    2006-02-01

    The second edition of Fundamentals of Preparative and Nonlinear Chromatography is devoted to the fundamentals of preparative chromatography, a purification and extraction process for chemicals and proteins that is widely used in the pharmaceutical industry. This process permits the preparation of extremely pure compounds satisfying the requirements of the US Food and Drug Administration. The book describes the fundamentals of thermodynamics, mass transfer kinetics, and flow through porous media that are relevant to chromatography. It presents the models used in chromatography and their solutions, describes the different processes used and their numerous applications, and discusses the methods of optimization of the experimental conditions of the process.

  19. Precautionary Principles: General Definitions and Specific Applications to Genetically Modified Organisms

    ERIC Educational Resources Information Center

    Lofstedt, Ragnar E.; Fischhoff, Baruch; Fischhoff, Ilya R.

    2002-01-01

    Precautionary principles have been proposed as a fundamental element of sound risk management. Their advocates see them as guiding action in the face of uncertainty, encouraging the adoption of measures that reduce serious risks to health, safety, and the environment. Their opponents may reject the very idea of precautionary principles, find…

  20. The Principles of Language-Study. Language and Language Learning [Series], Number 5.

    ERIC Educational Resources Information Center

    Palmer, Harold E.

    As a reissue of a popular book written in the 1920's on the principles of language study, this work is included in a series of publications devoted to language and language learning. The methodology prescribed centers upon nine fundamental principles: (1) initial preparation, (2) habit forming, (3) accuracy, (4) gradation, (5) proportion, (6)…

  1. AUTOMOTIVE DIESEL MAINTENANCE 2. UNIT XVI, LEARNING ABOUT AC GENERATOR (ALTERNATOR) PRINCIPLES (PART I).

    ERIC Educational Resources Information Center

    Human Engineering Inst., Cleveland, OH.

    THIS MODULE OF A 25-MODULE COURSE IS DESIGNED TO DEVELOP AN UNDERSTANDING OF THE OPERATING PRINCIPLES OF ALTERNATING CURRENT GENERATORS USED ON DIESEL POWERED EQUIPMENT. TOPICS ARE REVIEWING ELECTRICAL FUNDAMENTALS, AND OPERATING PRINCIPLES OF ALTERNATORS. THE MODULE CONSISTS OF A SELF-INSTRUCTIONAL PROGRAMED TRAINING FILM "AC GENERATORS…

  2. The Principles of Progress

    NASA Astrophysics Data System (ADS)

    Khalisi, E.

    2012-09-01

    The achievements of mankind are based on the interaction of discovery, invention, and innovation. Once man learned how to utilize the laws of nature, he advanced to a being with the greatest power over other creatures. An analogy can be drawn for civilisations: those conducting fundamental research will gain strategic power. Among the sciences, astronomy and astrophysics provide the largest potential for discoveries that reach far beyond our intellectual limits. They trigger technology and have a decisive impact on society.

  3. Principles of Glacier Mechanics

    NASA Astrophysics Data System (ADS)

    Waddington, Edwin D.

    Glaciers are awesome in size and move at a majestic pace, and they frequently occupy spectacular mountainous terrain. Naturally, many Earth scientists are attracted to glaciers. Some of us are even fortunate enough to make a career of studying glacier flow. Many others work on the large, flat polar ice sheets where there is no scenery. As a leader of one of the foremost research projects now studying the flow of mountain glaciers (Storglaciären, Sweden), Roger Hooke is well qualified to describe the principles of glacier mechanics. Principles of Glacier Mechanics is written for upper-level undergraduate students and graduate students with an interest in glaciers and the landforms that glaciers produce. While most of the examples in the text are drawn from valley glacier studies, much of the material is also relevant to “glacier flatland” on the polar ice sheets.

  4. Principles of plasma diagnostics

    NASA Astrophysics Data System (ADS)

    Hutchinson, Ian H.

    The physical principles, techniques, and instrumentation of plasma diagnostics are examined in an introduction and reference work for students and practicing scientists. Topics addressed include basic plasma properties, magnetic diagnostics, plasma particle flux, and refractive-index measurements. Consideration is given to EM emission by free and bound electrons, the scattering of EM radiation, and ion processes. Diagrams, drawings, graphs, sample problems, and a glossary of symbols are provided.

  5. Pauli Exclusion Principle

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    A principle of quantum theory, devised in 1925 by Wolfgang Pauli (1900-58), which states that no two fermions may exist in the same quantum state. The quantum state of a particle is defined by a set of numbers that describe quantities such as energy, angular momentum and spin. Fermions are particles such as quarks, protons, neutrons and electrons, that have spin = ½ (in units of h/2π, where h is ...

  6. Strategic Information Resources Management: Fundamental Practices.

    ERIC Educational Resources Information Center

    Caudle, Sharon L.

    1996-01-01

    Discusses six fundamental information resources management (IRM) practices in successful organizations that can improve government service delivery performance. Highlights include directing changes, integrating IRM decision making into a strategic management process, performance management, maintaining an investment philosophy, using business…

  7. Instructor Special Report: RIF (Reading Is FUNdamental)

    ERIC Educational Resources Information Center

    Instructor, 1976

    1976-01-01

    At a time when innovative programs of the sixties are quickly falling out of the picture, Reading Is FUNdamental, after ten years and five million free paperbacks, continues to expand and show results. (Editor)

  8. Language Policy and Planning: Fundamental Issues.

    ERIC Educational Resources Information Center

    Kaplan, Robert B.

    1994-01-01

    Fundamental issues in language policy and planning are discussed: language death, language survival, language change, language revival, language shift and expansion, language contact and pidginization or creolization, and literacy development. (Contains 21 references.) (LB)

  9. Accounting Fundamentals for Non-Accountants

    EPA Pesticide Factsheets

    The purpose of this module is to provide an introduction and overview of accounting fundamentals for non-accountants. The module also covers important topics such as communication, internal controls, documentation and recordkeeping.

  10. Fundamentals of Indoor Air Quality in Buildings

    EPA Pesticide Factsheets

    This module provides the fundamentals to understanding indoor air quality. It provides a rudimentary framework for understanding how indoor and outdoor sources of pollution affect the indoor air quality of buildings.

  11. Heisenberg's observability principle

    NASA Astrophysics Data System (ADS)

    Wolff, Johanna

    2014-02-01

    Werner Heisenberg's 1925 paper 'Quantum-theoretical re-interpretation of kinematic and mechanical relations' marks the beginning of quantum mechanics. Heisenberg famously claims that the paper is based on the idea that the new quantum mechanics should be 'founded exclusively upon relationships between quantities which in principle are observable'. My paper is an attempt to understand this observability principle, and to see whether its employment is philosophically defensible. Against interpretations of 'observability' along empiricist or positivist lines I argue that such readings are philosophically unsatisfying. Moreover, a careful comparison of Heisenberg's reinterpretation of classical kinematics with Einstein's argument against absolute simultaneity reveals that the positivist reading does not fit with Heisenberg's strategy in the paper. Instead the appeal to observability should be understood as a specific criticism of the causal inefficacy of orbital electron motion in Bohr's atomic model. I conclude that the tacit philosophical principle behind Heisenberg's argument is not a positivistic connection between observability and meaning, but the idea that a theory should not contain causally idle wheels.

  12. Probing Mach's principle

    NASA Astrophysics Data System (ADS)

    Annila, Arto

    2012-06-01

    The principle of least action in its original form à la Maupertuis is used to explain geodetic and frame-dragging precessions which are customarily accounted for by a curved space-time in general relativity. The least-time equations of motion agree with observations and are also in concert with general relativity. Yet according to the least-time principle, gravitation does not relate to the mathematical metric of space-time, but to a tangible energy density embodied by photons. The density of free space is in balance with the total mass of the Universe in accord with the Planck law. Likewise, a local photon density and its phase distribution are in balance with the mass and charge distribution of a local body. Here gravitational force is understood as an energy density difference that will diminish when the oppositely polarized pairs of photons co-propagate from the energy-dense system of bodies to the energy-sparse system of the surrounding free space. Thus when the body changes its state of motion, the surrounding energy density must accommodate the change. The concurrent resistance in restructuring the surroundings, ultimately involving the entire Universe, is known as inertia. The all-around propagating energy density couples everything with everything else in accord with Mach’s principle.

  13. Principled Missing Data Treatments.

    PubMed

    Lang, Kyle M; Little, Todd D

    2016-04-04

    We review a number of issues regarding missing data treatments for intervention and prevention researchers. Many of the common missing data practices in prevention research are still, unfortunately, ill-advised (e.g., use of listwise and pairwise deletion, insufficient use of auxiliary variables). Our goal is to promote better practice in the handling of missing data. We review the current state of missing data methodology and recent missing data reporting in prevention research. We describe antiquated, ad hoc missing data treatments and discuss their limitations. We discuss two modern, principled missing data treatments: multiple imputation and full information maximum likelihood, and we offer practical tips on how to best employ these methods in prevention research. The principled missing data treatments that we discuss are couched in terms of how they improve causal and statistical inference in the prevention sciences. Our recommendations are firmly grounded in missing data theory and well-validated statistical principles for handling the missing data issues that are ubiquitous in biosocial and prevention research. We augment our broad survey of missing data analysis with references to more exhaustive resources.
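
    As a concrete illustration of the first of the two modern treatments, the sketch below performs multiple imputation with scikit-learn's IterativeImputer and pools a point estimate across the completed data sets per Rubin's rules; the synthetic data and the choice of library are assumptions of convenience, not the article's.

      # Sketch: multiple imputation by chained equations, pooling a point
      # estimate across m completed data sets (Rubin's rules for a mean).
      # The data are synthetic; IterativeImputer is one convenient tool.
      import numpy as np
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 3))
      X[:, 2] += X[:, 0]                       # make column 2 predictable
      X[rng.random(200) < 0.3, 2] = np.nan     # 30% missing at random

      m = 20
      estimates = []
      for i in range(m):
          imputer = IterativeImputer(sample_posterior=True, random_state=i)
          completed = imputer.fit_transform(X)
          estimates.append(completed[:, 2].mean())

      print("pooled mean of column 2:", np.mean(estimates))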

  14. Decoherence as a Fundamental Phenomenon in Quantum Dynamics

    NASA Astrophysics Data System (ADS)

    Mensky, Michael B.

    The phenomenon of decoherence of a quantum system caused by the entanglement of the system with its environment is discussed from different points of view, particularly in the framework of quantum theory of measurements. The selective presentation of decoherence (taking into account the state of the environment) by restricted path integrals or by effective Schrödinger equation is shown to follow from the first principles or from models. Fundamental character of this phenomenon is demonstrated, particularly the role played in it by information is underlined. It is argued that quantum mechanics becomes logically closed and contains no paradoxes if it is formulated as a theory of open systems with decoherence taken into account. If one insist on considering a completely closed system (the whole Universe), the observer's consciousness has to be included in the theory explicitly. Such a theory is not motivated by physics, but may be interesting as a metaphysical theory clarifying the concept of consciousness.

  15. Some Fundamental Molecular Mechanisms of Contractility in Fibrous Macromolecules

    PubMed Central

    Mandelkern, L.

    1967-01-01

    The fundamental molecular mechanisms of contractility and tension development in fibrous macromolecules are developed from the point of view of the principles of polymer physical chemistry. The problem is treated in a general manner to encompass the behavior of all macromolecular systems irrespective of their detailed chemical structure and particular function, if any. Primary attention is given to the contractile process which accompanies the crystal-liquid transition in axially oriented macromolecular systems. The theoretical nature of the process is discussed, and many experimental examples are given from the literature which demonstrate the expected behavior. Experimental attention is focused on the contraction of fibrous proteins, and the same underlying molecular mechanism is shown to be operative for a variety of different systems. PMID:6050598

  16. Does logic moderate the fundamental attribution error?

    PubMed

    Stalder, D R

    2000-06-01

    The fundamental attribution error was investigated from an individual difference perspective. Mathematicians were compared with nonmathematicians (Exp. 1; n = 84), and undergraduates who scored high on a test of logical reasoning ability were compared with those who scored low (Exp. 2; n = 62). The mathematicians and those participants scoring higher on logic appeared less prone to the fundamental attribution error, primarily as assessed by a measure of confidence in attributions.

  17. Context Effects in Western Herbal Medicine: Fundamental to Effectiveness?

    PubMed

    Snow, James

    2016-01-01

    Western herbal medicine (WHM) is a complex healthcare system that uses traditional plant-based medicines in patient care. Typical preparations are individualized polyherbal formulae that, unlike herbal pills, retain the odor and taste of whole herbs. Qualitative studies in WHM show patient-practitioner relationships to be collaborative. Health narratives are co-constructed, leading to assessments, and treatments with personal significance for participants. It is hypothesized that the distinct characteristics of traditional herbal preparations and patient-herbalist interactions, in conjunction with the WHM physical healthcare environment, evoke context (placebo) effects that are fundamental to the overall effectiveness of herbal treatment. These context effects may need to be minimized to demonstrate pharmacological efficacy of herbal formulae in randomized, placebo-controlled trials, optimized to demonstrate effectiveness of WHM in pragmatic trials, and consciously harnessed to enhance outcomes in clinical practice.

  18. Minimally Invasive Mitral Valve Surgery II

    PubMed Central

    Wolfe, J. Alan; Malaisrie, S. Chris; Farivar, R. Saeid; Khan, Junaid H.; Hargrove, W. Clark; Moront, Michael G.; Ryan, William H.; Ailawadi, Gorav; Agnihotri, Arvind K.; Hummel, Brian W.; Fayers, Trevor M.; Grossi, Eugene A.; Guy, T. Sloane; Lehr, Eric J.; Mehall, John R.; Murphy, Douglas A.; Rodriguez, Evelio; Salemi, Arash; Segurola, Romualdo J.; Shemin, Richard J.; Smith, J. Michael; Smith, Robert L.; Weldner, Paul W.; Lewis, Clifton T. P.; Barnhart, Glenn R.; Goldman, Scott M.

    2016-01-01

    Abstract Techniques for minimally invasive mitral valve repair and replacement continue to evolve. This expert opinion, the second of a 3-part series, outlines current best practices for nonrobotic, minimally invasive mitral valve procedures, and for postoperative care after minimally invasive mitral valve surgery. PMID:27654406

  19. Mini-Med School Planning Guide

    ERIC Educational Resources Information Center

    National Institutes of Health, Office of Science Education, 2008

    2008-01-01

    Mini-Med Schools are public education programs now offered by more than 70 medical schools, universities, research institutions, and hospitals across the nation. There are even Mini-Med Schools in Ireland, Malta, and Canada! The program is typically a lecture series that meets once a week and provides "mini-med students" information on some of the…

  20. Thomas Reid's Fundamental Rules of Eloquence.

    ERIC Educational Resources Information Center

    Skopec, Eric W.

    1978-01-01

    Examines Thomas Reid's philosophy of rhetoric and identifies the principles to which he was committed, including his classification of knowledge, emphasis on artistic expression, and theory of natural signification. (JMF)

  1. Differentially Private Empirical Risk Minimization.

    PubMed

    Chaudhuri, Kamalika; Monteleoni, Claire; Sarwate, Anand D

    2011-03-01

    Privacy-preserving machine learning algorithms are crucial for the increasingly common setting in which personal data, such as medical or financial records, are analyzed. We provide general techniques to produce privacy-preserving approximations of classifiers learned via (regularized) empirical risk minimization (ERM). These algorithms are private under the ε-differential privacy definition due to Dwork et al. (2006). First we apply the output perturbation ideas of Dwork et al. (2006) to ERM classification. Then we propose a new method, objective perturbation, for privacy-preserving machine learning algorithm design. This method entails perturbing the objective function before optimizing over classifiers. If the loss and regularizer satisfy certain convexity and differentiability criteria, we prove theoretical results showing that our algorithms preserve privacy, and provide generalization bounds for linear and nonlinear kernels. We further present a privacy-preserving technique for tuning the parameters in general machine learning algorithms, thereby providing end-to-end privacy guarantees for the training process. We apply these results to produce privacy-preserving analogues of regularized logistic regression and support vector machines. We obtain encouraging results from evaluating their performance on real demographic and benchmark data sets. Our results show that both theoretically and empirically, objective perturbation is superior to the previous state-of-the-art, output perturbation, in managing the inherent tradeoff between privacy and learning performance.
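
    The output-perturbation idea lends itself to a compact sketch. The following is a minimal illustration under stated assumptions (1-Lipschitz logistic loss, feature vectors scaled to norm at most 1, objective (1/n)·Σ loss + (λ/2)||w||²); it is not the authors' released code, and the function name is invented:

      # Minimal sketch of output perturbation for regularized logistic
      # regression (illustrative; assumes ||x|| <= 1 and a 1-Lipschitz loss).
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def private_logreg_output_perturbation(X, y, eps, lam, seed=None):
          """Return eps-differentially private weights."""
          rng = np.random.default_rng(seed)
          n, d = X.shape
          # Non-private ERM; sklearn's C = 1/(n*lam) matches this objective.
          clf = LogisticRegression(C=1.0 / (n * lam), fit_intercept=False)
          w = clf.fit(X, y).coef_.ravel()
          # The minimizer's L2-sensitivity is 2/(n*lam), so add noise b with
          # density proportional to exp(-(n*lam*eps/2) * ||b||): a uniformly
          # random direction with a Gamma(d, 2/(n*lam*eps))-distributed norm.
          beta = n * lam * eps / 2.0
          direction = rng.standard_normal(d)
          direction /= np.linalg.norm(direction)
          return w + rng.gamma(shape=d, scale=1.0 / beta) * direction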

  3. Minimal hepatic encephalopathy: A review.

    PubMed

    Nardone, Raffaele; Taylor, Alexandra C; Höller, Yvonne; Brigo, Francesco; Lochner, Piergiorgio; Trinka, Eugen

    2016-10-01

    Minimal hepatic encephalopathy (MHE) is the earliest form of hepatic encephalopathy and can affect up to 80% of patients with liver cirrhosis. By definition, MHE is characterized by cognitive function impairment in the domains of attention, vigilance and integrative function, but obvious clinical manifestations are lacking. MHE has been shown to affect daily functioning, quality of life, driving and overall mortality. The diagnosis can be achieved through neuropsychological testing, recently developed computerized psychometric tests, such as the critical flicker frequency and the inhibitory control tests, as well as neurophysiological procedures. Event-related potentials can reveal subtle changes in patients with normal neuropsychological performance. Spectral analysis of electroencephalography (EEG) and quantitative analysis of sleep EEG provide early markers of cerebral dysfunction in cirrhotic patients with MHE. Neuroimaging, in particular MRI, also increasingly reveals diffuse abnormalities in intrinsic brain activity and altered organization of functional connectivity networks. Medical treatment for MHE to date has focused on reducing serum ammonia levels and includes non-absorbable disaccharides, probiotics or rifaximin. Liver transplantation may not reverse the cognitive deficits associated with MHE. Here we present an updated review of the epidemiology, burden and quality of life, neuropsychological testing, neuroimaging, neurophysiology and therapy in subjects with MHE.
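
    The spectral-EEG marker mentioned in the abstract amounts to comparing power in frequency bands. A minimal illustrative sketch (the sampling rate, window length, and band edges are assumptions, not values from the review):

      # Relative theta-band power from a single EEG channel, a conventional
      # marker of EEG slowing (illustrative parameter choices).
      import numpy as np
      from scipy.signal import welch

      def relative_band_power(eeg, fs=256.0, band=(4.0, 8.0)):
          """Fraction of 1-40 Hz spectral power falling in `band`."""
          f, pxx = welch(eeg, fs=fs, nperseg=int(4 * fs))  # 4 s segments
          total = pxx[(f >= 1.0) & (f <= 40.0)].sum()
          in_band = pxx[(f >= band[0]) & (f <= band[1])].sum()
          return in_band / total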

  4. Principle of Spacetime and Black Hole Equivalence

    NASA Astrophysics Data System (ADS)

    Zhang, Tianxi

    2016-06-01

    Modelling the universe without relying on a set of hypothetical entities (HEs) to explain observations and overcome problems and difficulties is essential to developing a physical cosmology. The well-known big bang cosmology, widely accepted as the standard model, stands on two fundamentals: Einstein’s general relativity (GR), which describes the effect of matter on spacetime, and the cosmological principle (CP) of spacetime isotropy and homogeneity. The field equation of GR, along with the Friedmann-Lemaitre-Robertson-Walker (FLRW) metric of spacetime derived from the CP, generates the Friedmann equation (FE) that governs the development and dynamics of the universe. The big bang theory has had impressive successes in explaining the universe, but it still has problems, whose solutions rely on an increasing number of HEs such as inflation, dark matter, dark energy, and so on. Recently, the author developed a new cosmological model called the black hole universe, which, instead of making those many hypotheses, adds only a single new postulate (or principle) to cosmology - the Principle of Spacetime and Black Hole Equivalence (SBHEP) - to explain all the existing observations of the universe and overcome all the existing problems of conventional cosmologies. This study thoroughly demonstrates how this newly developed black hole universe model, which therefore stands on three fundamentals (GR, CP, and SBHEP), can fully explain the universe and resolve the existing difficulties within well-developed physics, thus neither requiring any other hypotheses nor leaving any unsolved difficulties. This work was supported by NSF/REU (Grant #: PHY-1263253) at Alabama A & M University.
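
    For reference, the Friedmann equation the abstract refers to has the standard form (stated here for completeness, not quoted from the record)

      \[
      \left(\frac{\dot{a}}{a}\right)^{2}
        = \frac{8\pi G}{3}\,\rho - \frac{kc^{2}}{a^{2}} + \frac{\Lambda c^{2}}{3},
      \]

    where \(a(t)\) is the FLRW scale factor, \(\rho\) the energy density, \(k\) the spatial curvature, and \(\Lambda\) the cosmological constant.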

  5. Representations in Dynamical Embodied Agents: Re-Analyzing a Minimally Cognitive Model Agent

    ERIC Educational Resources Information Center

    Mirolli, Marco

    2012-01-01

    Understanding the role of "representations" in cognitive science is a fundamental problem facing the emerging framework of embodied, situated, dynamical cognition. To make progress, I follow the approach proposed by an influential representational skeptic, Randall Beer: building artificial agents capable of minimally cognitive behaviors and…

  6. Catalyst design for enhanced sustainability through fundamental surface chemistry.

    PubMed

    Personick, Michelle L; Montemore, Matthew M; Kaxiras, Efthimios; Madix, Robert J; Biener, Juergen; Friend, Cynthia M

    2016-02-28

    Decreasing energy consumption in the production of platform chemicals is necessary to improve the sustainability of the chemical industry, which is the largest consumer of delivered energy. The majority of industrial chemical transformations rely on catalysts, and therefore designing new materials that catalyse the production of important chemicals via more selective and energy-efficient processes is a promising pathway to reducing energy use by the chemical industry. Efficiently designing new catalysts benefits from an integrated approach involving fundamental experimental studies and theoretical modelling in addition to evaluation of materials under working catalytic conditions. In this review, we outline this approach in the context of a particular catalyst - nanoporous gold (npAu) - which is an unsupported, dilute AgAu alloy catalyst that is highly active for the selective oxidative transformation of alcohols. Fundamental surface science studies on Au single crystals and AgAu thin-film alloys in combination with theoretical modelling were used to identify the principles which define the reactivity of npAu and subsequently enabled prediction of new reactive pathways on this material. Specifically, weak van der Waals interactions are key to the selectivity of Au materials, including npAu. We also briefly describe other systems in which this integrated approach was applied.

  7. Fundamentals of microfluidic cell culture in controlled microenvironments

    PubMed Central

    Young, Edmond W. K.; Beebe, David J.

    2010-01-01

    Microfluidics has the potential to revolutionize the way we approach cell biology research. The dimensions of microfluidic channels are well suited to the physical scale of biological cells, and the many advantages of microfluidics make it an attractive platform for new techniques in biology. One of the key benefits of microfluidics for basic biology is the ability to control parameters of the cell microenvironment at relevant length and time scales. Considerable progress has been made in the design and use of novel microfluidic devices for culturing cells and for subsequent treatment and analysis. With the recent pace of scientific discovery, it is becoming increasingly important to evaluate existing tools and techniques, and to synthesize fundamental concepts that would further improve the efficiency of biological research at the microscale. This tutorial review integrates fundamental principles from cell biology and local microenvironments with cell culture techniques and concepts in microfluidics. Culturing cells in microscale environments requires knowledge of multiple disciplines including physics, biochemistry, and engineering. We discuss basic concepts related to the physical and biochemical microenvironments of the cell, physicochemical properties of that microenvironment, cell culture techniques, and practical knowledge of microfluidic device design and operation. We also discuss the most recent advances in microfluidic cell culture and their implications on the future of the field. The goal is to guide new and interested researchers to the important areas and challenges facing the scientific community as we strive toward full integration of microfluidics with biology. PMID:20179823
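
    To make "relevant length and time scales" concrete: at typical microchannel dimensions flow is strongly laminar, which is what permits such precise control of the cell microenvironment. A quick estimate (all values are illustrative assumptions, not from the review):

      # Reynolds number in a microfluidic channel (illustrative values).
      rho = 1000.0       # water density, kg/m^3
      mu = 1.0e-3        # water dynamic viscosity, Pa*s
      velocity = 1.0e-3  # mean flow speed, m/s (1 mm/s)
      height = 100e-6    # channel height, m (100 micrometres)

      reynolds = rho * velocity * height / mu
      print(f"Re = {reynolds:.3f}")  # ~0.1, far below the ~2000 turbulence onset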

  8. Principle of non-interaction of waves

    NASA Astrophysics Data System (ADS)

    Roychoudhuri, Chandrasekhar

    2010-07-01

    Non-interaction of waves (NIW) in the linear domain is an unappreciated but general principle of nature. Explicit recognition of this NIW principle will add renewed momentum to the progress of fundamental physics and related technologies like spectrometry, coherence, polarization, laser mode-locking, etc. This principle helps us appreciate that the mathematical correctness of a theorem and its capability to predict certain groups of measured data do not necessarily imply that the theorem is always capable of mapping real interaction processes in nature. The time-frequency Fourier theorem (TF-FT) is an example, since superposed light beams, by themselves, cannot reorganize or sum their energies. Quantum Mechanics (QM) correctly discovered that photons (light beams) are non-interacting bosons. Yet, to accommodate (i) the classical belief that light beams interfere (interact) by themselves, and (ii) Einstein's heuristic hypothesis that discrete packets of energy emitted by molecules travel as indivisible quanta (contradicting spontaneous diffractive spreading), QM has been forced to hypothesize that a photon interferes only with itself. In reality, it is the quantized detecting material media that make the superposition effects manifest as their physical transformations, from bound electrons to released photoelectrons, after absorbing energy from all the beams due to induced simultaneous stimulations by the beams.

  9. Classification: Purposes, Principles, Progress, Prospects

    ERIC Educational Resources Information Center

    Sokal, Robert R.

    1974-01-01

    Clustering and other new techniques have changed classificatory principles and practice in many sciences. Discussed are definitions, purposes of classification, principles of classification, and recent trends. (Author/RH)

  10. Fundamentals of green chemistry: efficiency in reaction design.

    PubMed

    Sheldon, Roger A

    2012-02-21

    In this tutorial review, the fundamental concepts underlying the principles of green and sustainable chemistry--atom and step economy and the E factor--are presented, within the general context of efficiency in organic synthesis. The importance of waste minimisation through the widespread application of catalysis in all its forms--homogeneous, heterogeneous, organocatalysis and biocatalysis--is discussed. These general principles are illustrated with simple practical examples, such as alcohol oxidation and carbonylation and the asymmetric reduction of ketones. The latter reaction is exemplified by a three enzyme process for the production of a key intermediate in the synthesis of the cholesterol lowering agent, atorvastatin. The immobilisation of enzymes as cross-linked enzyme aggregates (CLEAs) as a means of optimizing operational performance is presented. The use of immobilised enzymes in catalytic cascade processes is illustrated with a trienzymatic process for the conversion of benzaldehyde to (S)-mandelic acid using a combi-CLEA containing three enzymes. Finally, the transition from fossil-based chemicals manufacture to a more sustainable biomass-based production is discussed.
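
    The two efficiency metrics named above have simple standard definitions (stated here for clarity, not quoted from the review):

      \[
      E\ \text{factor} = \frac{\text{kg of total waste}}{\text{kg of product}},
      \qquad
      \text{atom economy} = \frac{M_{\text{product}}}{\sum_i \nu_i\, M_{\text{reactant},i}} \times 100\%,
      \]

    where \(M\) denotes molar mass and \(\nu_i\) the stoichiometric coefficients; a lower E factor and a higher atom economy both indicate a more efficient reaction design.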

  11. Fundamental role of bistability in optimal homeostatic control

    NASA Astrophysics Data System (ADS)

    Wang, Guanyu

    2013-03-01

    Bistability is a fundamental phenomenon in nature and has a number of fine properties. However, these properties are consequences of bistability at the physiological level, which do not explain why it had to emerge during evolution. Using optimal homeostasis as the first principle and Pontryagin's Maximum Principle as the optimization approach, I find that bistability emerges as an indispensable control mechanism. Because the mathematical model is general and the result is independent of parameters, it is likely that most biological systems use bistability to control homeostasis. Glucose homeostasis represents a good example. It turns out that bistability is the only solution to a dilemma in glucose homeostasis: high insulin efficiency is required for rapid plasma glucose clearance, whereas an insulin-sparing state is required to guarantee the brain's safety during fasting. This new perspective can illuminate studies on the twin epidemics of obesity and diabetes and the corresponding intervention strategies. For example, overnutrition and a sedentary lifestyle may represent sudden environmental changes that cause the loss of optimality, which may contribute to the marked rise of obesity and diabetes in our generation.
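
    As a generic illustration of the control property invoked here (this is not the paper's model), a scalar bistable system retains whichever of its two stable states history has selected:

      # dx/dt = -x**3 + x + u has two stable fixed points (near -1 and +1)
      # for small |u|, so the steady state depends on the initial condition.
      def settle(x0, u=0.0, dt=0.01, steps=5000):
          x = x0
          for _ in range(steps):
              x += dt * (-x**3 + x + u)
          return x

      print(settle(-1.5))  # converges near -1
      print(settle(+1.5))  # converges near +1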

  12. Minimal breast cancer: a clinical appraisal.

    PubMed Central

    Peters, T G; Donegan, W L; Burg, E A

    1977-01-01

    Eighty-five patients with a diagnosis of minimal breast cancer were evaluated. The predominant lesion was intraductal carcinoma, and axillary metastases occurred in association with minimal breast cancer in seven of 96 cases. One death occurred due to minimal breast cancer. Bilateral mammary carcinoma was evident in 24% and bilateral minimal breast cancer in 13% of the patients. The component lesions of minimal breast cancer have varied biologic activity, but prognosis is good with a variety of operations. The multifocal nature of minimal breast cancer and the potential for metastases should be recognized. Therapy should include removal of the entire mammary parenchyma and low axillary nodes. The high incidence of bilateral malignancy supports elective contralateral biopsy at the time of therapy for minimal breast cancer. PMID:203233

  13. A minimal model for the structural energetics of VO2

    NASA Astrophysics Data System (ADS)

    Kim, Chanul; Marianetti, Chris; The Marianetti Group Team

    Resolving the structural, magnetic, and electronic structure of VO2 from the first principles of quantum mechanics is still a forefront problem despite decades of attention. Hybrid functionals have been shown to qualitatively ruin the structural energetics. While density functional theory (DFT) combined with cluster extensions of dynamical mean-field theory (DMFT) has demonstrated promising results in terms of the electronic properties, structural phase stability has not yet been addressed. In order to capture the basic physics of the structural transition, we propose a minimal model of VO2 based on the one-dimensional Peierls-Hubbard model and parameterize it based on DFT calculations of VO2. The total energy versus dimerization in the minimal model is then solved numerically exactly using the density matrix renormalization group (DMRG) and compared to the Hartree-Fock solution. We demonstrate that the Hartree-Fock solution exhibits the same pathologies as DFT+U, and spin density functional theory for that matter, while the DMRG solution is consistent with experimental observation. Our results demonstrate the critical role of non-locality in the total energy, and this will need to be accounted for to obtain a complete description of VO2 from first principles. The authors acknowledge support from FAME, one of six centers of STARnet, a Semiconductor Research Corporation program sponsored by MARCO and DARPA.
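
    For orientation, the one-dimensional Peierls-Hubbard Hamiltonian underlying the minimal model has the standard form (the paper's DFT-fitted parameter values are not reproduced here):

      \[
      H = -\sum_{i,\sigma}\bigl[t - \alpha\,(u_{i+1}-u_{i})\bigr]
          \bigl(c^{\dagger}_{i\sigma}c_{i+1\sigma} + \mathrm{h.c.}\bigr)
        + U\sum_{i} n_{i\uparrow}n_{i\downarrow}
        + \frac{K}{2}\sum_{i}(u_{i+1}-u_{i})^{2},
      \]

    where \(t\) is the bare hopping, \(\alpha\) the electron-lattice coupling, \(u_i\) the ionic displacements that encode dimerization, \(U\) the on-site repulsion, and \(K\) the lattice stiffness.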

  14. Optical chiral metamaterials: a review of the fundamentals, fabrication methods and applications

    NASA Astrophysics Data System (ADS)

    Wang, Zuojia; Cheng, Feng; Winsor, Thomas; Liu, Yongmin

    2016-10-01

    Optical chiral metamaterials have recently attracted considerable attention because they offer new and exciting opportunities for fundamental research and practical applications. Through pragmatic designs, the chiroptical response of chiral metamaterials can be several orders of magnitude higher than that of natural chiral materials. Meanwhile, the local chiral fields can be enhanced by plasmonic resonances to drive a wide range of physical and chemical processes in both linear and nonlinear regimes. In this review, we will discuss the fundamental principles of chiral metamaterials, various optical chiral metamaterials realized by different nanofabrication approaches, and the applications and future prospects of this emerging field.

  15. Supramolecular chemistry and chemical warfare agents: from fundamentals of recognition to catalysis and sensing.

    PubMed

    Sambrook, M R; Notman, S

    2013-12-21

    Supramolecular chemistry presents many possible avenues for the mitigation of the effects of chemical warfare agents (CWAs), including sensing, catalysis and sequestration. To date, efforts in this field both to study fundamental interactions between CWAs and to design and exploit host systems remain sporadic. In this tutorial review the non-covalent recognition of CWAs is considered from first principles, including taking inspiration from enzymatic systems, and gaps in fundamental knowledge are indicated. Examples of synthetic systems developed for the recognition of CWAs are discussed with a focus on the supramolecular complexation behaviour and non-covalent approaches rather than on the proposed applications.

  16. Principles of Environmental Chemistry

    NASA Astrophysics Data System (ADS)

    Hathaway, Ruth A.

    2007-07-01

    Roy M. Harrison, Editor; RSC Publishing; ISBN 0854043713; x + 363 pp.; 2006; $69.95. Environmental chemistry is an interdisciplinary science that includes chemistry of the air, water, and soil. Although it may be confused with green chemistry, which deals with potential pollution reduction, environmental chemistry is the scientific study of the chemical and biochemical principles that occur in nature. Therefore, it is the study of the sources, reactions, transport, effects, and fates of chemical species in the air, water, and soil environments, and the effect of human activity on them. Environmental chemistry not only explores each of these environments, but also closely examines the interfaces and boundaries where the environments intersect.

  17. Complex Correspondence Principle

    SciTech Connect

    Bender, Carl M.; Meisinger, Peter N.; Hook, Daniel W.; Wang Qinghai

    2010-02-12

    Quantum mechanics and classical mechanics are distinctly different theories, but the correspondence principle states that quantum particles behave classically in the limit of high quantum number. In recent years much research has been done on extending both quantum and classical mechanics into the complex domain. These complex extensions continue to exhibit a correspondence, and this correspondence becomes more pronounced in the complex domain. The association between complex quantum mechanics and complex classical mechanics is subtle and demonstrating this relationship requires the use of asymptotics beyond all orders.

  18. Protection - Principles and practice.

    NASA Technical Reports Server (NTRS)

    Graham, G. S.; Denning, P. J.

    1972-01-01

    The protection mechanisms of computer systems control the access to objects, especially information objects. The principles of protection system design are formalized as a model (theory) of protection. Each process has a unique identification number which is attached by the system to each access attempted by the process. Details of system implementation are discussed, taking into account the storing of the access matrix, aspects of efficiency, and the selection of subjects and objects. Two systems which have protection features incorporating all the elements of the model are described.
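
    The access-matrix model described above reduces to a lookup keyed on (subject, object) pairs. A schematic sketch (illustrative; the names and rights are invented, and real systems store the sparse matrix as capability lists or access-control lists):

      # Access matrix as a sparse mapping from (subject, object) to rights.
      access_matrix = {
          ("proc_1", "file_a"): {"read", "write"},
          ("proc_1", "file_b"): {"read"},
          ("proc_2", "file_b"): {"read", "write"},
      }

      def check_access(subject, obj, right):
          """The monitor attaches the subject's ID to each attempted access."""
          return right in access_matrix.get((subject, obj), set())

      assert check_access("proc_1", "file_a", "write")
      assert not check_access("proc_2", "file_a", "read")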

  19. [Catastrophe medicine. General principles].

    PubMed

    Linde, H

    1980-10-17

    Catastrophe medicine is the organization of mass-casualty care under difficult conditions. The present article is concerned with competence and organization in the event of a catastrophe and describes the phasic course of a catastrophe situation. The most important elements are dealt with: effective first-aid measures, screening, shock and pain treatment, initial surgical treatment, and ballistic factors. Particular attention is given to the evacuation of emergency patients from the scene of the catastrophe. A request is made for "catastrophe medicine" to be included by medical faculties and educational institutes in their courses of study for paramedical personnel.

  20. Principles of Liquid Chromatography

    NASA Astrophysics Data System (ADS)

    Bakalyar, Stephen R.

    This article reviews the basic principles of high performance liquid chromatography (HPLC). The introductory section provides an overview of the HPLC technique, placing it in historical context and discussing the elementary facts of the separation mechanism. The next section discusses the nature of resolution, describing the two principal aspects, zone center separation and zone spreading. The third section takes a detailed look at how HPLC is used in practice to achieve a separation. It discusses the three key variables that need to be adjusted: retention, efficiency, and selectivity. A fourth section is concerned with various relationships of practical importance: flow rate, temperature, and pressure. A final section discusses future trends in HPLC.
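
    The three variables named above combine in the standard resolution equation (given here for reference; it is not part of the original article description):

      \[
      R_s = \frac{\sqrt{N}}{4}\left(\frac{\alpha-1}{\alpha}\right)\left(\frac{k}{1+k}\right),
      \]

    where \(N\) is the plate number (efficiency), \(\alpha\) the selectivity factor, and \(k\) the retention factor, so resolution can be improved through any of the three routes the article discusses.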